Audrey - Distributed Robot using MyRobotLab

All right, she looks a little like HAL 9000 and isn't nearly as bright.  But it's a start.  This simple setup will also be the basis of a video tutorial on how to use MyRobotLab (MRL).  I've been working very hard on it recently, and MRL has changed fairly dramatically.  The biggest part is a "prototyping GUI editor".  Ever since seeing MATLAB, my goal has been to create a graphical editor for quick prototyping of services.  It's finally come to a pre-alpha release (thanks, TinHead).  It's got lots of bugs, but lots of potential too.

I was hoping to get some feedback as I worked on a first tutorial.  I also like the idea of turning the starter robot shown here into a more capable device with the small addition of a radio module - combining the processing and other resources of a computer with a little bitty bot.

I wanted the face tracking to be fast - fast and jerky instead of slow and smooth.  I had done face tracking before, but the tracking positions were incremented by a very small constant.  I wanted to try a different method where the position was "calculated" and proportional.  Since I have started to learn about PID, I wanted to incorporate the P into the control.

In order to do this I needed to find out about how much change of position was generated by a single position change of the servo.  If I remember correctly, servos typically move in increments of about 1 degree. Now, to do face tracking quickly with OpenCV, it's better if you shrink the image.  TinHead mentioned I should change the resolution of the camera before filtering.  Unfortunately, camera drivers are not all standardized.  OpenCV does its best to work with most of them, but when I attempted to change resolutions nothing happened.  It's a good thing OpenCV has pyramid up and pyramid down.  These functions proportionally scale an image up or down.
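To give a feel for what a pyramid-down step does, here is a rough NumPy-only sketch: each step smooths the image and halves both dimensions, so two steps take a 640 x 480 capture down to 160 x 120.  (OpenCV's real pyrDown uses a Gaussian kernel; the 2x2 box average here is a simplified stand-in, not MRL's actual code.)

```python
# Simplified sketch of OpenCV's pyramid-down: smooth, then halve each
# dimension.  A 2x2 box average stands in for OpenCV's Gaussian kernel.
import numpy as np

def pyr_down(img):
    """Halve an image's width and height (simplified pyramid step)."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    img = img[:h, :w].astype(float)
    return ((img[0::2, 0::2] + img[1::2, 0::2] +
             img[0::2, 1::2] + img[1::2, 1::2]) / 4).astype(np.uint8)

frame = np.zeros((480, 640), dtype=np.uint8)   # stand-in for a 640 x 480 capture
half = pyr_down(frame)                         # -> 320 x 240
quarter = pyr_down(half)                       # -> 160 x 120
print(half.shape, quarter.shape)
```

Shrinking the frame this way is what makes the fast, jerky tracking feasible: the Haar face detector runs on far fewer pixels at 160 x 120.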

My camera's default resolution is 640 x 480.  I needed to see the effect of moving the servos slightly at two scaled-down resolutions, 320 x 240 and 160 x 120.

Here are the measurements as the pan (x) and tilt (y) servos were changed.  The result was about 2 pixels per servo increment at 160 x 120.

Measurements


320 x 240, x 89 y 89 - (151, 102)
320 x 240, x 90 y 90 - (154, 103)
320 x 240, x 91 y 91 - (157, 108)

160 x 120, x 89 y 89 - (76, 52)
160 x 120, x 90 y 90 - (79, 54)
160 x 120, x 91 y 91 - (79, 55)

160 x 120, x 90 y 90 - (79, 55)
160 x 120, x 80 y 90 - (56, 55)
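The last pair of 160 x 120 readings gives the cleanest estimate: a 10-degree pan sweep moved the tracked point 23 pixels.  Checking the arithmetic:

```python
# Back-of-the-envelope check of the 160 x 120 measurements above:
# panning from x 90 to x 80 moved the tracked point from 79 to 56 pixels.
degrees = 90 - 80
pixels = 79 - 56
px_per_degree = pixels / degrees
print(px_per_degree)  # 2.3 pixels per servo degree, i.e. roughly 2
```

That "roughly 2" is the constant the proportional step below divides by.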
The next step is putting this into the FaceTracking Service.  A point will be published by the OpenCV Service and sent to the FaceTracking Service.  The FaceTracking Service will divide the X and Y differences from the center by 2, then tell the pan and tilt servos where to go.
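The proportional step can be sketched like this, assuming a 160 x 120 frame with center (80, 60) and the measured ~2 pixels per servo degree.  The names here are illustrative, not the actual FaceTracking Service code:

```python
# Sketch of the proportional tracking step: convert pixel error into
# servo degrees by dividing by the measured ~2 pixels per degree.
# Illustrative only - not MRL's actual FaceTracking implementation.
PIXELS_PER_DEGREE = 2
CENTER_X, CENTER_Y = 80, 60   # center of a 160 x 120 frame

def track(face_x, face_y, pan_pos, tilt_pos):
    """Return new (pan, tilt) servo positions from a published face point."""
    error_x = face_x - CENTER_X
    error_y = face_y - CENTER_Y
    # Divide the pixel error by 2 to turn it into servo degrees.
    # Depending on how the servos are mounted, either error may need
    # to be multiplied by -1 first.
    new_pan = pan_pos + error_x // PIXELS_PER_DEGREE
    new_tilt = tilt_pos + error_y // PIXELS_PER_DEGREE
    return new_pan, new_tilt

print(track(100, 60, 90, 90))  # face 20 px right of center -> pan moves 10 degrees
```

Because the correction is proportional to the error rather than a fixed small constant, a face far from center produces a big jump and a face near center barely moves the servos - exactly the fast, jerky behavior described earlier.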

First I am going to try sending the messages to the Logging Service.  As a safety measure, I also put software limits in the servos, as OddBot had suggested.  This will prevent the servos from moving too far in any direction.

Sending the messages to the Logging Service is very helpful.  If I did not do that, I'd have the servos flying all over the place.  It's possible to hook the servos up left handed or right handed, and with the logging I could quickly tell whether I needed to multiply the position error by -1.  If not, the servo would be really scared of my face and always zoom away.  Yikes!
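The two safety measures can be sketched together: clamp every commanded position to software limits, and use a direction flag to flip the error when the servo is mounted the "wrong" way.  The constants and names below are assumptions for illustration, not values from my setup:

```python
# Sketch of the safety measures described above: software limits on
# servo travel, plus a sign flag for left/right-handed mounting.
# Limits and names are illustrative assumptions, not MRL's API.
PAN_MIN, PAN_MAX = 30, 150   # software limits - tune for your rig
PAN_DIRECTION = 1            # set to -1 if the servo "runs away" from your face

def clamp(pos, lo, hi):
    """Keep a commanded position inside the software limits."""
    return max(lo, min(hi, pos))

def command_pan(current, error_pixels):
    """Apply the proportional correction, respecting direction and limits."""
    new_pos = current + PAN_DIRECTION * error_pixels // 2
    return clamp(new_pos, PAN_MIN, PAN_MAX)

print(command_pan(145, 30))  # 145 + 15 = 160, clamped back to 150
```

With the log output, one wrong-signed correction shows up immediately as the position marching toward a limit instead of toward center.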

It just takes 2 servos, an Arduino (possibly another board, if we write a driver for it), and a webcam.


Distributed Audrey

I have created a few robots from laptops.  Recently, I fell into a conversation with my wife, who asked me why anyone would want to tie up their laptop to have a robot around.  I was dumbfounded.  She can be very insightful.

One nasty bug: you have to reload the arduinoSerial.pde driver if you restart MRL.

It's true - why would you want to?

No one really would want to tie up their computer.

Usually the most powerful computer is the one you are using right now.  Maybe you're even doing work on it: a spreadsheet open, a few emails, maybe even a movie playing.  Well, instead of making a robot out of your computer, what if your robot could "share" your computer's resources?

It could also share the power of internet resources.  For example, FreeTTS is a great speech generator, but it sounds a bit mechanical.  Audrey can use her FreeTTS voice or, if configured, use resources from the internet: http://www2.research.att.com/~ttsweb/tts/demo.php

Since it runs on the computer, it can share all the resources of the internet too - Google's image database, for example, might be a promising resource.

Audrey's Bug Toy

Audrey's toy is a very simple and inexpensive little robot.  I might be able to make one even simpler and cheaper, but this one was built with parts I had conveniently lying around at the time.
Here is a simple parts list:

  • 2 x Micro-controllers - nearly any micro-controller would work.  I used a Bare Bones Board kit, an Arduino clone; it has a considerable amount of power and is pretty inexpensive.  If you get the BBB, you'll need to solder it together.  They have a great guide for doing so, and for me it's worth the $16 in savings.
  • Radio Receiver Module - I used this one.  It's not as robust nor as quick as a Bluetooth modem, but it is amazingly inexpensive.
  • Radio Transmitter Module