Eye-To-Eye Part 4: Moving the Servos

The four pieces required for the Eye-To-Eye project are now in place: a laptop running Ubuntu with Opengazer fully functional, a router, an Arduino with an Ethernet Shield, and two servos (for one eye). The glue for all the pieces (especially the opengazer-to-network part) has also been figured out. However, putting the pieces together still required some work I didn't anticipate, and provided some interesting challenges I have yet to solve.

But the good news is, it works! Here’s the video:

How It Works

The rig works as previously described in parts 1, 2, and 3, with the notable exception that I’m using the mouse coordinates to control two servos — vertical mouse position for servo 1 and horizontal position for servo 2.

I chose to use mouse coordinates instead of eye coordinates for testing the initial rig because mouse coordinates are essentially the same thing as eye coordinates on a screen: x between 0 and 1024 and y between 0 and 600 (my netbook's resolution). The mouse provided much smoother input, which resulted in predictable servo behavior. Also, I didn't have to keep my eyes on the screen the whole time, so I could actually watch the servos move.
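To make that mapping concrete, here is a minimal sketch of the Arduino-side idea. The pin numbers and the full 0-180 degree sweep are assumptions for illustration; only the screen resolution comes from my setup:

#include <Servo.h>

Servo servo1; // vertical:   driven by mouse y
Servo servo2; // horizontal: driven by mouse x

void setup() {
  servo1.attach(9);  // pin choices are assumptions
  servo2.attach(10);
}

// Map screen coordinates onto servo angles.
void moveEyes(int x, int y) {
  servo2.write(map(x, 0, 1024, 0, 180));
  servo1.write(map(y, 0, 600, 0, 180));
}

void loop() {
  moveEyes(512, 300); // e.g. stare at the center of the screen
  delay(15);          // give the servos time to settle
}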

To continuously print the mouse's screen coordinates in Ubuntu, I used xdotool:

while [ 1 ]; do
    xdotool getmouselocation
done

It happened to work perfectly: each line of output carries exactly three labeled numbers (something like x:512 y:300 screen:0), which is what my Arduino "API" needs. The Arduino, for now, will ignore the last number. With some sed magic, I stripped the labels down to the naked numbers and sent them out to the network:

# open fd 3 as a TCP connection to the Arduino
exec 3<> /dev/tcp/192.168.0.102/3333
while [ 1 ]; do
    # strip the x:/y:/screen: labels, leaving the naked numbers
    xdotool getmouselocation | sed 's/x:\|y:\|screen://g' >&3
done
exec 3>&-
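I haven't shown the Arduino sketch in this post, but the receiving end could look roughly like the sketch below: listen on port 3333, read the three numbers with parseInt(), ignore the third, and feed the first two into the same mapping as above. The MAC address is a placeholder and the whole thing is an illustration of the idea, not my actual code:

#include <SPI.h>
#include <Ethernet.h>
#include <Servo.h>

byte mac[] = { 0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED }; // placeholder MAC
IPAddress ip(192, 168, 0, 102); // matches the address in the shell snippets
EthernetServer server(3333);

Servo servo1; // vertical
Servo servo2; // horizontal

void setup() {
  Ethernet.begin(mac, ip);
  server.begin();
  servo1.attach(9);  // assumed pins, as before
  servo2.attach(10);
}

void loop() {
  EthernetClient client = server.available();
  if (client && client.available()) {
    // Each line carries three naked numbers: x, y, and screen.
    int x = client.parseInt();
    int y = client.parseInt();
    client.parseInt(); // the screen number, ignored for now
    servo2.write(map(x, 0, 1024, 0, 180));
    servo1.write(map(y, 0, 600, 0, 180));
  }
}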

Challenges

Unfiltered Input

As set up in Part 3 of the Eye-To-Eye project, the output from Opengazer is spit out directly onto the network. However, due to imperfections in the eye tracking (undoubtedly a function of my low-quality webcam and bad lighting conditions), the x-y coordinates jump around quite a bit, causing the servos to act erratically. I couldn't really tell whether the rig was working or not, because the motors were continuously moving in different directions.

One obvious solution is to low-pass filter the input, but I wanted to eliminate Arduino-side numerical processing as much as possible. The Arduino would have to filter both coordinates, and I felt that was simply too processor-intensive.
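For the record, the kind of filtering I mean is a simple exponential moving average. A fragment like this, dropped into the sketch before the servo writes, is the general shape (the smoothing factor is an arbitrary pick):

float smoothX = 0, smoothY = 0;
const float alpha = 0.2; // smoothing factor; lower = smoother

// Each raw reading only nudges the smoothed value part of the way
// toward it, so single-frame jumps get damped out.
void lowPass(int rawX, int rawY) {
  smoothX += alpha * (rawX - smoothX);
  smoothY += alpha * (rawY - smoothY);
}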

Anchors and Buffering

Fortunately, Opengazer also spits out the "anchor" point, i.e. whichever of the 12 (configurable!) calibration points the eyes are closest to. I decided to use this instead and rewrote the Arduino software, but again had less-than-optimal results. The interesting thing is that this time, for some reason, the coordinates weren't pushed to the network until the program finished running, at which point everything arrived at once. I'm still not sure why, but it seemed like some sort of buffering was occurring. At this point, I decided to put the soldering iron down and go to sleep 🙂 The next day, I settled on using the mouse until I can figure out how to smooth the coordinates better.
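The appeal of the anchor approach is that the Arduino doesn't have to crunch noisy coordinates at all; it can just look up preset angles per anchor. Something like the snippet below, assuming the servo objects from the sketches above (every angle value here is made up):

// One (horizontal, vertical) angle pair per calibration anchor,
// laid out as a hypothetical 4x3 grid. All values are made up.
const byte anchorAngles[12][2] = {
  { 30,  45}, { 70,  45}, {110,  45}, {150,  45},
  { 30,  90}, { 70,  90}, {110,  90}, {150,  90},
  { 30, 135}, { 70, 135}, {110, 135}, {150, 135},
};

void lookAtAnchor(int anchor) {
  if (anchor < 0 || anchor >= 12) return; // ignore garbage input
  servo2.write(anchorAngles[anchor][0]);
  servo1.write(anchorAngles[anchor][1]);
}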

Side Notes

To test network connectivity quickly, in environments where I didn't have an Arduino handy (i.e. at work, during lunch hours), I used netcat. Netcat allows you to do pretty much anything on the network; to open a quick little server listening on port 3333:

nc -l 3333

To write to this server, you can use the redirection trick I talked about in Part 3:

exec 3<> /dev/tcp/192.168.0.102/3333
echo "123" >&3
exec 3>&-

“123” should be displayed on the computer running netcat.

Beyond Mouse-Servo Control

One of my former coworkers, Mike Senna, built a functional R2D2 and is in the process of building a Wall-E. He is light-years beyond what I can imagine doing with hardware, and he is very inspirational and encouraging, so I thought maybe I should aim a little higher with this project.

Check out his work here:

R2D2

Wall-E

All this got me thinking: it would be pretty lame just to have two sets of eyes moving around. As I was wearing my awesome Homer slippers:

…it struck me: I should construct a Homer Simpson animatronic head with movable eyes, and maybe a mouth. Maybe I'm getting ahead of myself, but I can't wait to see where the learning experiences from this project take me.
