by drum303
This Instructable will show you how I built "Twitch". Twitch is a virtual pet that is able to express itself through its eye, its movements, and the sounds it makes, which happen to be sound clips from various movies. It can be happy, sad, angry, or neutral depending on how much you interact with it.
Twitch evolved from a project I saw on the Adafruit site called Adafriend the Virtual Pet Cube (https://learn.adafruit.com/adafriend/overview) by John Wall (https://learn.adafruit.com/users/WALLTECH). The original project consisted of an Adafruit Pro Trinket, an LED matrix and backpack, and a 3D-printed cube body, among other things. I already had an Arduino Uno and wasn't interested in getting a body 3D printed, so the project evolved from there.
I used much of the code from the original project, as I liked the animated eye and the different emotions it was able to portray. From there I added a pan/tilt base, a capacitive touch sensor, and an Adafruit Wave Shield to give my pet the voice I was looking for.
In order to successfully complete this project, you should have decent soldering skills. It would be helpful if you
have completed at least one Arduino project before, so that you have gotten through setting up the Arduino
software and are able to upload sketches to it successfully. You should also have some experience with
debugging code, as mine is far from perfect.
Arduino Uno
Adafruit Wave Shield 1.1 (https://www.adafruit.com/product/94)
Mini Pan Tilt kit with 2 servos (https://www.adafruit.com/products/1967)
1.2" LED Matrix and backpack (https://www.adafruit.com/products/1632)
Capacitive touch pad sensor (https://www.adafruit.com/products/1362) - I used a 5 pad, but a single
pad would work just fine.
A speaker to play the audio files
An SD card to hold your sounds
Power supply for the Arduino. The servos draw too much power to run from 5V USB
A breadboard or proto board
A box to hold the electronics - I used a 4X4X3 wooden box I got from Michael's
Wires, solder, and the usual
The finished wave shield
The project uses wave files played through the Wave Shield to help convey its four emotional states (sad, happy, angry, and neutral), as well as sounds for when the touch sensor is activated. I used about 10 different sound clips from different movies to represent each state, but you can choose songs, animal noises, or whatever you want. I got mine from a site called www.moviewavs.com (http://www.moviewavs.com/), which I have been using since the late 1990s. Although it is not updated as frequently as it once was, I would highly recommend it.
Although the Wave Shield supports 8.3 file names, I named all the files with two-digit numbers to save memory. I am sure there is a more efficient way to do this, but I couldn't work it out.
Make sure the files are mono, with a sample rate of 22 kHz (or less) and a bit depth of 16 bits (or less), or they will not play. I used the open source software Audacity (https://sourceforge.net/projects/audacity/) to convert them to the correct format, although there are several other ways to do it, as described in this guide (https://learn.adafruit.com/adafruit-wave-shield-audio-shield-for-arduino/convert-files).
I collected several files I wanted to use, organized them into the appropriate emotional states, and renamed them to match the names referenced in the code.
Touch sounds are named
1.WAV - 8.WAV
Neutral sounds
10.WAV - 19.WAV
Angry sounds
30.WAV - 39.WAV
Sad sounds
50.WAV - 59.WAV
Happy sounds
70.WAV - 79.WAV
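The numbering scheme above can be expressed as a small lookup; here is a minimal sketch of the idea (the enum and function names are my own, not from the original code):

```cpp
#include <cassert>

// Hypothetical mapping of each state to the first file number in its range.
enum Emotion { TOUCH, NEUTRAL, ANGRY, SAD, HAPPY };

int baseNumber(Emotion e) {
    switch (e) {
        case TOUCH:   return 1;   // 1.WAV - 8.WAV
        case NEUTRAL: return 10;  // 10.WAV - 19.WAV
        case ANGRY:   return 30;  // 30.WAV - 39.WAV
        case SAD:     return 50;  // 50.WAV - 59.WAV
        case HAPPY:   return 70;  // 70.WAV - 79.WAV
    }
    return 10; // fall back to neutral
}
```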
Copy these sounds to the root directory of your SD card, then insert it into the Wave shield.
Follow the image to help you wire it all up. The Fritzing library (the software used to create the drawing) does not include a part for the Wave Shield, so just use the same pins. The Wave Shield uses pins 10, 11, 12, and 13 to communicate with the SD card, and pins 2, 3, 4, and 5 for the DAC (digital-to-analog converter), so we cannot use those pins for anything else.
Pins A4 and A5 on the Arduino are the I2C lines that communicate with the LED matrix. A4 connects to SDA, labeled "D" (data) on the board, and A5 to SCL, labeled "C" (clock) on the board.
Digital pins 6 and 9 are used to control the servos. 6 is connected to the "X" servo (up and down) and 9 is
connected to the "Y" servo (left and right).
Pin A0 is connected to both outputs of the capacitive touch sensor board. If you use a single touch sensor, you
would just connect both probes (or antennas) to the input line on the breakout board, and the output would connect
to A0.
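As a quick summary of the wiring described above, expressed as constants (the names are hypothetical, not taken from the actual sketch):

```cpp
#include <cassert>

// Pin assignments as described in the wiring section (names are mine).
const int SD_PINS[]   = {10, 11, 12, 13}; // Wave Shield <-> SD card
const int DAC_PINS[]  = {2, 3, 4, 5};     // Wave Shield DAC
const int SERVO_X_PIN = 6;                // "X" servo
const int SERVO_Y_PIN = 9;                // "Y" servo
const int TOUCH_PIN   = 14;               // A0 on the Uno (analog 0 = digital 14)
// I2C to the LED matrix: A4 -> SDA ("D"), A5 -> SCL ("C")
```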
I am fully aware that some of my code is pretty mangled, and that it could really use some optimization. I would really like to fix the hard-coded wave file names and use the random number directly to create the filename to play, rather than having to code the filename for each action. Unfortunately, I haven't had much luck with that.
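For what it's worth, here is a rough sketch of how a filename could be built directly from a state's base number and a random offset, assuming the two-digit naming scheme described earlier (the helper name is my own, not from the actual sketch):

```cpp
#include <cassert>
#include <cstdio>
#include <cstring>

// Hypothetical helper: build the 8.3 filename from a base number plus an
// offset, instead of hard-coding every name. On the Arduino, the offset
// could come from random(10) (or random(8) for the touch sounds).
void wavName(char *buf, size_t len, int base, int offset) {
    snprintf(buf, len, "%d.WAV", base + offset);
}
```

The resulting string could then be handed straight to the Wave Shield's playback call in place of a literal filename.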
I would also like to smooth out the servo movements so they follow the eye movements more closely, rather than moving after the fact. I have made decent progress on that front, although it is not quite there yet. If anybody has some proper coding skills, drop some ideas in the comments.
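One common approach to that kind of smoothing is to ease the servo partway toward its target on every pass through loop(), so the head glides instead of jumping; a rough sketch under that assumption (function name and factor are my own):

```cpp
#include <cassert>
#include <cmath>

// Hypothetical easing step: move a fixed fraction of the remaining distance
// from the current angle toward the target angle each tick.
int smoothStep(int current, int target, float factor = 0.2f) {
    return (int)std::lroundf(current + (target - current) * factor);
}
```

Calling this once per loop() iteration with the eye's target angle would make the servos drift toward where the eye is looking rather than snapping there afterward.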
From here you can certainly customize your project even further. You could add different sensors or create different actions or emotions. If you dig into the code a bit, you can probably figure out how to change how often your pet speaks, or when its emotions change.
In previous versions of this project, it had an IR emitter and receiver, so you could interact with it that way, and it was able to turn off the TV if you ignored it for too long. Quite honestly, that ended up not being too much fun for anyone, and a timer conflict caused a significant number of problems when adding other features.