Over the last few months I’ve been having a lot of fun with the Google AIY Projects Kits. First with the Voice Kit, putting together a retro-rotary phone Voice Assistant and a voice-controlled Magic Mirror, and more recently with the new Vision Kit, which I integrated with the Voice Kit so that my magic mirror could recognise when you were looking at it, and respond.
However, after last week’s release of the newly updated Voice and Vision Kits, I thought I should sit down and take another look at the Vision Kit.
I’ve been playing around with face detection since the early iPhone era. But the idea that the camera is now the best sensor we have, that we can process imagery using machine learning models at the edge—potentially even on non-networked enabled devices—and close the loop in real time, is still new.
One of the things the Vision Bonnet makes simple is controlling servos, so I decided to make use of that, close the loop, and build a cyborg dinosaur.
Gathering your Tools and Materials
For this project you’ll need a craft knife, scissors, a set of small wire snips, a soldering iron, solder, jumper wires, electrical tape, a cable tie or two, a needle and thread, some Sugru, heat shrink tubing, some small diameter wooden dowelling (or possibly plastic straws) and, of course, a plushie dinosaur.
We’re going to make use of most of the AIY Projects Vision Kit contents, along with a Lithium Ion battery, an Adafruit PowerBoost 1000C, and a small servo. Optionally you might also need an SPDT on/off switch.
Assembling and Testing the Vision Kit
For now, just assemble the core part of the kit, the bonnet and camera, and set up the Raspberry Pi installation as normal — enabling wireless networking, secure shell, and OTG access.
Plug the kit into power and wait for it to boot; remember that booting for the first time may take a couple of minutes. Then log in, stop the Joy Detection service, which starts automatically when the kit boots, and then disable it permanently as below:
$ ssh email@example.com
$ sudo systemctl stop joy_detection_demo.service
$ sudo systemctl disable joy_detection_demo.service
then go ahead and set up the development environment by running the new AIY-projects-shell.sh script, which replaces the activate script from previous releases of the kit.
~/AIY-projects-python $ cd src/examples/vision
From here we can run the simple face_detection_camera.py example, which runs continuous face detection using the Vision Bonnet and prints the number of detected faces in the camera image.
$ python ./face_detection_camera.py --num_frames 50
Iteration #0: num_faces=0
Iteration #1: num_faces=0
Iteration #2: num_faces=0
Iteration #3: num_faces=0
Iteration #4: num_faces=1
Iteration #5: num_faces=1
Iteration #6: num_faces=1
Iteration #7: num_faces=1
Iteration #8: num_faces=0
Iteration #9: num_faces=0
Iteration #10: num_faces=1
Iteration #11: num_faces=1
...
Iteration #49: num_faces=0
Running the script should give a count of the number of faces in each frame. It may take some time for the script to initialise before it starts, so patience is needed.
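Under the hood, the example is a thin loop over the Vision Bonnet’s camera inference API. Here’s a minimal sketch of roughly what the script does, assuming the aiy.vision libraries that ship on the kit’s SD card image (the imports live inside the function since they only resolve on the kit itself):

```python
def format_iteration(index, num_faces):
    """Render one line of output in the style the example script prints."""
    return 'Iteration #%d: num_faces=%d' % (index, num_faces)


def count_faces(num_frames=50):
    # These imports only resolve on the Vision Kit's SD card image.
    from picamera import PiCamera
    from aiy.vision.inference import CameraInference
    from aiy.vision.models import face_detection

    # Sensor mode 4 gives the camera's full field of view.
    with PiCamera(sensor_mode=4, resolution=(1640, 1232)) as camera:
        with CameraInference(face_detection.model()) as inference:
            for index, result in enumerate(inference.run()):
                if index == num_frames:
                    break
                faces = face_detection.get_faces(result)
                print(format_iteration(index, len(faces)))
```

On the kit, calling `count_faces()` should print output much like the listing above.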
Adding and Controlling a Servo
Connecting a servo to the Vision Kit is fairly easy. Grab a small micro servo, and then you can either snip the connector wires to separate them and solder connectors directly to the ends of the wires, or solder together a connector cable. Since I had the parts available, I went down the second route and built a break-out cable.
The Vision Bonnet comes with four GPIO pins broken out on the left-hand side of the bonnet. The custom Arcade Button connector, located to the right, is also a useful source of GPIO if you’re thinking of adding hardware to the Bonnet, although it’s a bit less accessible.
Go ahead and plug the black wire from your servo into the GND connector on the Bonnet, the red wire into the +5V pin, and finally the signal wire (usually coloured white or yellow) into the GPIO A pin.
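With the servo wired up you can drive it from Python using gpiozero, which comes pre-installed on the kit image, together with the bonnet’s pin definitions from aiy.pins. A minimal sketch to check the wiring; the ±90° angle range and half-second settle time are my assumptions, not values from the kit:

```python
import time


def angle_to_value(angle, min_angle=-90.0, max_angle=90.0):
    """Map an angle in degrees onto gpiozero's Servo value range of -1 to 1."""
    if not min_angle <= angle <= max_angle:
        raise ValueError('angle out of range: %r' % angle)
    return 2.0 * (angle - min_angle) / (max_angle - min_angle) - 1.0


def sweep():
    # These imports only resolve on the Vision Kit's SD card image.
    from gpiozero import Servo
    from aiy.pins import PIN_A

    servo = Servo(PIN_A)  # the bonnet pin our signal wire is plugged into
    for angle in (-90, 0, 90, 0):
        servo.value = angle_to_value(angle)
        time.sleep(0.5)   # give the servo horn time to travel
```

Running `sweep()` on the kit should swing the horn from one end of its travel to the other and back to center.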
Face tracking is a more-or-less out-of-the-box experience with the AIY Projects software: we just have to grab the bounding box of the first detected face in the camera image and decide whether it is to the left, or to the right, of the center of the frame.
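That decision boils down to comparing the center of the face’s bounding box against the center of the frame, ignoring a dead zone in the middle. A sketch of the logic; the 1640-pixel frame width and 200-pixel dead zone are my assumptions, and the bounding box is the (x, y, width, height) tuple the face detection model returns:

```python
FRAME_WIDTH = 1640  # width of the camera frame in sensor mode 4
DEAD_ZONE = 200     # pixels around the center where the servo stays put


def track_direction(bounding_box, frame_width=FRAME_WIDTH, dead_zone=DEAD_ZONE):
    """Return -1 to pan left, 1 to pan right, or 0 to stay put."""
    x, y, width, height = bounding_box
    face_center = x + width / 2.0
    offset = face_center - frame_width / 2.0
    if abs(offset) <= dead_zone / 2.0:
        return 0
    return -1 if offset < 0 else 1
```

On each frame you’d feed in the first detected face’s bounding box and nudge the servo a step in the returned direction.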
As you can see from the code, we leave a couple of hundred pixels’ gap in the center of the frame so that the servo doesn’t constantly seek back and forth. If you’re more or less in the middle of the frame, the servo doesn’t move.
If all goes well the servo should move so that it points towards you, moving left when you’re left of frame, and right when you’re right of frame.
Powering Things with a Battery
It’s not going to be a particularly mobile dinosaur if it’s hooked up to a power adaptor, so it’s time to break out the batteries.
I went ahead and attached the USB connector and an on/off switch to the PowerBoost board, and hooked it up to a LiPo battery. Putting everything together is as simple as finding a really short USB lead to connect the PowerBoost board to our Raspberry Pi.
Casting around for a good plushie, I came across a small blue Triceratops.
At just 15 cm (just under 6 inches) from tail to nose, the plushie is a little bit on the small side. However, it was so adorable I decided to go with it anyway.
Building a Cyborg Dinosaur
The first step is to make our dinosaur a lot less plush. Carefully cut around the neck seam, and remove all the stuffing. There will be far more than you might expect from such a small toy.
You need to build a skeleton to replace the stuffing. I used some 2 mm wooden dowelling, but alternatively you could use some plastic straws. I managed, with some difficulty, to build it inside the plushie. But you might want to think about opening the dinosaur up by cutting along one of the seams along the belly and then wrapping the skin around a pre-assembled frame.
Then go ahead and stuff the electronics inside the frame, through the neck hole. They’ll just about fit. Try to keep the boards edge-on, with the micro-USB charging port of the PowerBoost board and the on/off switch facing outwards towards the neck hole.
Next we want to replace the right eye with the Raspberry Pi Camera Module. It’s probably going to be easier to unclip the ribbon cable from the camera, mount the camera inside the dinosaur head, and then reattach the ribbon cable than to try and mount it without detaching the camera.
Take some snips and cut one of the eye stalks out. This should leave a camera-sized hole where the right eye used to be; go ahead and poke the lens through the hole.
Then grab a needle and thread, loop the thread through the mounting holes on the board, and stitch the camera into the head. Then mount the head on the end of the servo’s lever arm, and we’re done.
The code looks pretty similar to our previous example. However, I’ve added a timeout: instead of immediately looking away when a face moves out of the left or right field, the dinosaur will pause for a while before looking back.
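A sketch of that timeout, as a small wrapper around the per-frame direction: the dinosaur holds its last look direction and only recenters once no face has been seen for a few seconds. The three-second default is my choice, and the injectable clock is purely so the logic can be tested off the kit:

```python
import time


class LazyTracker:
    """Hold the last look direction until no face is seen for `timeout` seconds."""

    def __init__(self, timeout=3.0, clock=time.monotonic):
        self.timeout = timeout
        self.clock = clock    # injectable so the logic can be tested anywhere
        self.direction = 0    # -1 left, 0 center, 1 right
        self.last_seen = None

    def update(self, direction):
        """Feed the per-frame direction, or None when no face was detected."""
        now = self.clock()
        if direction is not None:
            self.direction = direction
            self.last_seen = now
        elif self.last_seen is not None and now - self.last_seen > self.timeout:
            self.direction = 0  # paused long enough, look back to center
        return self.direction
```

In the main loop you’d call `update()` once per frame, passing None on frames with no detected face, and point the servo wherever it says.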
I christened the cyborg dinosaur the ‘Do-you-think-he-saurs.’
The Completed ‘Do-you-think-he-saurs’
Hopefully the answer to whether our cyborg dinosaur saw us will be, “Yes!”
If everything goes to plan, the head of the Do-you-think-he-saurs should track left and right as you move across his field of vision. Guess he saw you!
This Dinosaur Will Catch Fire 🔥
Now obviously, all those electronics stuffed inside a very small dinosaur are going to generate a lot of heat. As it stands, the build is a bit of a fire hazard, and you should definitely not even think about charging the battery in-situ. That’s almost certainly going to be a step too far. But, I wanted something small, portable, and cute that I could use for five minutes at a time during conference talks. So for me, this was perfect. But, just this time, you should not follow along. If you do, you’re probably going to burn your house down.
If you do want to cyborg a plushie, find a larger one, put the electronics inside a proper enclosure inside the toy, and provide some decent ventilation.
You have been warned. Do not replicate this build, instead make a better one.
Where Can I Buy it?
The AIY Projects Vision Kit is for sale at Target, both in store and online. The Voice Kit costs $49.95, while the Vision Kit costs $89.95, and both kits now come with everything you need to get started — including a Raspberry Pi Zero WH and, for the Vision Kit, the Raspberry Pi Camera Module.
Unfortunately the updated kits aren’t available outside the United States quite yet, but Google tells me they’re working on that, and you can sign up to their mailing list to be notified when they become available.
If you like this build I’ve also written other posts on building a retro-rotary phone Voice Assistant with the Raspberry Pi and the AIY Projects Voice Kit, along with a series of three posts on building a voice-controlled Magic Mirror which integrates both the Voice and Vision Kits.
This post was sponsored by Google.