Machine of Loving Grace


got the IP stream thing working too!

here's my code quilt:

# file : tracking.py
# a minimal tracking script - this will start all peer
# services and attach everything appropriately
# change parameters depending on your pan/tilt, pins and
# Arduino details
# all commented code is not necessary but allows custom
# options
port = "/dev/ttyACM0"
xServoPin = 10
yServoPin = 9
tracker = Runtime.createAndStart("tracker", "Tracking")

# set specifics on each Servo
servoX = tracker.getX()
servoX.setMinMax(30, 150)
servoY = tracker.getY()
servoY.setMinMax(30, 150)
servoY.map(0, 180, 180, 0)  # invert the axis

# optional filter settings
opencv = tracker.getOpenCV()
# opencv.setCameraIndex(1)  # setting camera index to 1, default is 0

streamer = Runtime.createAndStart("streamer", "VideoStreamer")  # easy as this!
# attach them / connect to the Arduino - method names as in the MRL
# Tracking examples of this vintage; check them against your MRL version
streamer.attach(opencv)
tracker.connect(port, xServoPin, yServoPin)

# Gray & PyramidDown make face tracking
# faster - if you don't like these filters you
# may remove them, before you select a tracking type, with
# the following command:
# tracker.clearPreFilters()

# different types of tracking:
# simple face detection and tracking
tracker.faceDetect()
# lkpoint - click in the video stream with the
# mouse and it should track:
# tracker.startLKTracking()
# scans for faces - tracks if found:
# tracker.findFace()

Here's the two servos and camera taking a picture of itself - I got it working with

I set up as advised by GroG, by using and playing with one axis at a time (via the tracker PID interfaces). This is easy to do as I now have a separate power supply with a power switch for the servos - I can find the right voltage and test different-value smoothing capacitors. Once I had the pan (X) axis going in the right direction and at a workable speed, I moved on to the tilt (Y) axis.

Next step is to integrate the PID (Kp, Ki, Kd) settings into the script, find out how to trigger events, and remix the hardware to allow a Raspberry Pi + camera to be installed on the moving head.
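To make the Kp/Ki/Kd tuning less mysterious, here is a plain-Python sketch of the PID update the tracker performs on each axis. This is an illustration only, not the MRL implementation, and the gain values are placeholders, not tuned settings.

```python
# Sketch of a per-axis PID update: given where the face is in the
# frame (measured) vs. where we want it (setpoint, frame centre),
# compute a servo correction. Gains here are illustrative only.
class Pid:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured, dt):
        error = setpoint - measured          # how far off-centre the face is
        self.integral += error * dt          # accumulated error (Ki term)
        derivative = (error - self.prev_error) / dt  # rate of change (Kd term)
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

xpid = Pid(kp=0.1, ki=0.01, kd=0.05)
# face at x=120 in a 320px-wide frame, centre at 160, 25 fps => dt=0.04
correction = xpid.update(setpoint=160, measured=120, dt=0.04)
print(correction)
```

Raising Kp makes the head chase the face harder (and overshoot sooner); Kd damps the "jumpyness"; Ki mops up steady offset.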



New servos have arrived. I need the extra strength because I may be fitting the Pi in the head section, due to the flimsy nature of the Raspberry Pi camera cable. From now on I will be wiring all moving parts with separate power supplies. This from the Adafruit forum:

"There are three basic ways noise can move from one part of a circuit to another: shared resistance, mutual capacitance, and mutual inductance. The capacitive and inductive versions happen between wires that are close to each other, but the resistive version relates to how the connections are arranged.

By definition, all current in a circuit travels in a loop. If you don't have a loop connecting two points, no current can flow between them. If the current from two loops flows through the same wire or component, it becomes possible for one signal to interfere with the other.

Powering your servos from a separate battery pack moves all that current off to its own loop. If you only have a single GND connection between the servo loop and the rest of the circuit, no current can flow through that point. That means noise can't get from the servos to the rest of the circuit through a resistive route."
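The "shared resistance" route above can be put in rough numbers. All figures here are assumptions chosen for illustration (a small servo pulling around an amp, a few tens of milliohms of shared ground wire), not measurements:

```python
# Resistive (shared-ground) coupling, back of the envelope:
# a servo pulling current I through shared ground resistance R
# superimposes V = I * R on everything referenced to that ground.
servo_current_a = 1.0      # assumed stall-ish current for a small servo
shared_ground_ohms = 0.05  # assumed resistance of the shared ground run
noise_v = servo_current_a * shared_ground_ohms
print("noise on shared ground: %.0f mV" % (noise_v * 1000))
```

Fifty millivolts is a 1% wobble on a 5 V rail, and servo current spikes at stall can be several times higher, which is why moving it all onto a separate loop helps so much.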

I'm also wondering about sticking a piezo buzzer or small speaker across the servos to augment the motor sound - and to help as an extra diagnostic source....?






Hi, I'm posting about my project here for myself and others as a reference. I'll update it as often as possible.

The aim is to have a little robot that can tell when someone is close to it, track their face, and tweet the image with some ELIZA-like text accompanying it. A side possibility is to recognize faces and to use speech.
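The ELIZA-like captions can be as simple as canned templates with the odd substitution. A tiny sketch of the idea - the phrases and the function name are made up for illustration:

```python
import random

# Minimal ELIZA-flavoured caption generator for the tweeted images.
# TEMPLATES and caption() are illustrative placeholders.
TEMPLATES = [
    "I see you, %s. Tell me more about yourself.",
    "Why do you look at me like that, %s?",
    "How does being watched make you feel, %s?",
]

def caption(name="stranger"):
    # pick a template at random and drop the (detected or default) name in
    return random.choice(TEMPLATES) % name

print(caption("visitor"))
```

A fuller ELIZA would match keywords in recognized speech and reflect them back, but even this gives each tweet a bit of personality.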

I'm using an ODROID-U3 with a Raspberry Pi - the ODROID to cover tracking and tweeting, the Pi to cover recognition. Arduinos will control the pan and tilt, and also send input from ultrasonic proximity sensors.


Sooo, my first problem came when running face-tracking scripts - my Logitech C120 was thrown off and the whole USB bus shut down. I thought initially it was to do with OpenCV on my OS. I'll save you the pain of going through various different OSs and versions of OpenCV - all my attempts at compiling failed. So I'm using this OS:

because it already has OpenCV 2.4.6 installed.

I am running MRL from the terminal, i.e. cd to the folder and then ./myrobotlab.1987 or whatever version. This is good because it shows you a bit of what is happening behind the scenes - when my camera and USB bus were going bonkers it was giving

HIGHGUI ERROR: V4L:Property<unknown property string>(16)not supported

which led me down several avenues - but what threw me was that the whole USB bus would stop. I found that the ODROID has a USB over-current trip circuit

and the behaviour that caused a shutdown was 2 servos and a webcam - move the servos a bit, OK.. move them a bit more and pfffffffftt, no USB. So far I had been powering the servos off the Arduino, in turn powered off the ODROID (I know, I know..), so I used a powered USB hub... same behaviour. So I dug out my trusty Farnell L30 PSU (which I'm thinking of using as a design source - like Huey and Dewey in the film Silent Running)

and powered the hub + Arduino + servos off that, at 5 V..

which showed nearly 200 mA and then a crash. So it was an electrical power problem. I powered the servos off their own 5 V supply, with ground and signal connected to the Arduino - 50 to 60 mA and NO CRASH :)
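A quick current budget shows why the bus tripped. The servo and camera figures below are assumed for illustration (typical small-servo and webcam numbers), not measurements from this rig:

```python
# Back-of-envelope USB current budget, with assumed component figures.
usb_port_budget_ma = 500   # USB 2.0 high-power port limit
camera_ma = 150            # assumed draw for a small webcam
servo_idle_ma = 10         # assumed per-servo idle draw
servo_moving_ma = 300      # assumed per-servo draw under load

idle_total = camera_ma + 2 * servo_idle_ma
moving_total = camera_ma + 2 * servo_moving_ma
print("idle: %d mA, moving: %d mA" % (idle_total, moving_total))
```

Idle fits comfortably in the budget; two servos moving blows well past it, which matches the "move them a bit more and pfffffffftt, no USB" behaviour - and why a separate servo supply fixed it.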

Loaded up and changed connection settings for my Arduino (which is now on /dev/ttyACM0 as opposed to /dev/ttyUSB0) and my camera (device 0 as opposed to 1), inverted the X PID and...

successful tracking! 25-31 fps - the tracking lasts for about 10 seconds before it goes a bit crazy, but the webcam stays on and the USB bus is happy.

So in conclusion the U3 works, but make sure you don't throw much current at it or it won't. :)

Now, to refine the tracking and have a (re)think about my deadline....

GroG wrote:

WoooHooo !

Sounds like a great idea!  I really like your odroid + raspi combination; DJUltis mentioned doing something similar.  It makes me think about the left & right brain hemispheres.  MRL was built to be distributed, so the two halves should be able to talk to each other.

Here are some reference links

Of course things have changed since then; the setup is a bit different and would require a few "tweaks".  One large architectural change which has happened: previously a single RemoteAdapter was needed, but now one running in each process is required.  This was done for security reasons.  Additionally there is now a (partially implemented) Security service, which is in the process of being integrated.

Gael & Borsaci are using wide-angled PIR sensors to determine if someone is "nearby" - what are your plans for initial detection?

SpenMarsden wrote:

I'm going with 4 ultrasonic

I'm going with 4 ultrasonic sensors and an Arduino to turn a waist to face whatever point is nearest (or rather, whatever is showing the biggest increase in reading). I was going to have this independent of MRL, and then when someone is at a very close distance (measured by a fifth ultrasonic sensor mounted above the waist) this service will be shut off, so the robot is left facing them. The idea is to give the face tracking as much of a chance as possible.
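The waist-turning logic above can be sketched in a few lines. This interprets "biggest increase in reading" as the biggest drop in measured distance (i.e. someone approaching); the threshold and function names are placeholders, not the actual Arduino code:

```python
# Sketch of the waist-selection logic: compare successive distance
# readings (cm) from the 4 waist sensors, face the sensor with the
# biggest drop, and hand over to face tracking once the fifth
# (top) sensor sees something close. All values are placeholders.
HANDOVER_CM = 40  # assumed "very close" distance for the fifth sensor

def pick_direction(previous, current):
    # previous/current: distance readings from the 4 waist sensors
    drops = [p - c for p, c in zip(previous, current)]
    best = max(range(len(drops)), key=lambda i: drops[i])
    return best if drops[best] > 0 else None  # None: nobody approaching

def should_hand_over(top_sensor_cm):
    # shut this service off and let face tracking take over
    return top_sensor_cm < HANDOVER_CM

prev = [200, 200, 200, 200]
curr = [200, 120, 190, 200]   # someone approaching sensor 1
print(pick_direction(prev, curr))  # -> 1
```

Comparing *changes* rather than raw distances also sidesteps fixed obstacles like walls, which always read as "near".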


Thanks for the links - I will probably get a chance to look at them when I'm installing OpenCV!

SpenMarsden wrote:

Ok so this is with the LK filter - he'll follow a point for 10 secs, then he loses it and goes into a weird movement loop.

My next steps are to try all the different scripts and see which work best and to try and work out why.

GroG wrote:

I embedded the video in your

I embedded the video in your post.  It's pretty easy to do - "How to embed Video".  Additionally, I've added a divider line with the date.  Just thought I'd give you some ideas, but you're welcome to continue to post via comments if that's what you'd prefer.

Please don't be concerned that it goes to the top on each edit.  That's what we want.  Even if you don't think it's an interesting change, someone else might.  Also, it gets replaced quickly enough - so there's no reason to be modest.

As for the LK filter - yep, points will disappear because the threshold of matching in the local area of that point is not achieved.  When it loses lock, there were a few scripts which would scan (the weird movements), looking for a face or something to reset a point...
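That lose-lock-then-scan behaviour is essentially a two-state machine. A minimal sketch, with the threshold and service calls as placeholders rather than real MRL code:

```python
# Track the LK point while the local match score holds; otherwise
# scan (the "weird movements") until a face re-seeds the point.
# MATCH_THRESHOLD is a placeholder value.
MATCH_THRESHOLD = 0.6

def step(state, match_score, face_found):
    if state == "tracking":
        # stay locked while the LK match in the local area is good enough
        return "tracking" if match_score >= MATCH_THRESHOLD else "scanning"
    # scanning: sweep the servos until a face gives us a new point
    return "tracking" if face_found else "scanning"

state = "tracking"
state = step(state, match_score=0.3, face_found=False)  # lock lost
print(state)  # -> scanning
state = step(state, match_score=0.0, face_found=True)   # face re-acquired
print(state)  # -> tracking
```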

GroG wrote:


Nice that you're doing so many preventative measures beforehand...  Underpowered components are probably #1 on the "pulling your hair out" issues list.  A close second would be noise.

"Im also wondering about sticking a piezo buzzer or small speaker across the servos to augment the motor sound- and to help as an extra diagnostic source....?"

I like that idea..  you'll only be "seeing" noise at audible frequencies, but that covers a whole lot!

One thing that I do sometimes is use MRL's oscope...

I can tell the difference between my Arduinos just by the signatures they make.  The BBB Arduino clone is much cleaner, because it has a better regulator and more filtering circuitry.

versus something like this 

from my old duemilanove

When you move a servo with a trace on you can see the trace change, and something I did not expect: when you're sampling a trace, sometimes you can see the servo jitter.. i.e. the act of "reading" data from an analog line will itself cause some noise...  very Heisenberg...  read 2 analog lines and you can see the overall voltage drop....   no one told me this would happen!
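One cheap way to tame that sampling jitter when you look at a trace is an exponential moving average. A sketch, with the smoothing factor as a placeholder (smaller alpha means heavier smoothing but more lag):

```python
# Exponential moving average over raw ADC-style samples - smooths
# a jittery analog trace. alpha is an illustrative placeholder.
def smooth(samples, alpha=0.2):
    out = []
    ema = samples[0]  # seed with the first reading
    for s in samples:
        ema = alpha * s + (1 - alpha) * ema
        out.append(ema)
    return out

noisy = [512, 530, 498, 540, 505]   # made-up 10-bit ADC readings
print([round(v) for v in smooth(noisy)])
```

The same idea works on the Arduino side before the values ever reach the oscope, at the cost of a little responsiveness.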

SpenMarsden wrote:

scoping it out

Yes, I was thinking that by having a sonic output from the motors you would kind of get used to the pitch and timbre they have - so with any problems, e.g. strain, you'd be able to tell, like when a CNC has been set with an incorrect feed speed. It's a bit moot till I try it out though.

GroG wrote:

Video streaming & face detect

Video streaming & face detect streaming on web ... nice !

Looks like you made a lot of progress in a very short time..  Saymon was having a very difficult time getting the video streamer to work.. don't know why.  Anyway, good to see such progress, and I look forward to your tutorial videos ;)

GroG wrote:

I remember a shout of yours

I remember a shout of yours .... heh .. the nice thing about shoutbox ...
you said something like "Oh noooo...   cannot find -lopenal " :)

how'd you resolve it ?

or did you just keep tweaking the CMakeCache.txt until you dropped all the dependencies the odroid doesn't have in its repo?  (inquiring minds want to know)
SpenMarsden wrote:

hmmm.. that sounds like one

hmmm.. that sounds like one of the error messages thrown up when the USB camera was being thrown off the U3.. Oh no, hang on, it's the OpenCV compilation stuff - as in download, cd to folder, config and make.. I tried installing all the dependencies to no avail, so I chose to use the U3 distro that came with OpenCV pre-built:

I feel bad for not working out how to install OpenCV, because it was an opportunity to understand Linux better, but I feel good because I could check that the U3 was going to work at all (as in USB cam and servos). There is a Raspberry Pi walkthrough on installing OpenCV that might be good as a model, but I would recommend using the Linaro distro - it's bare-bones anyway and works great. Modding the CMakeCache.txt is at the moment way out of my comfort zone :)

Is that any help?

GroG wrote:

Ya, it's helpful to know the

Ya, it's helpful to know the background on things which are in a worky state.

I managed to get it built on DJ's odroid..  the trick (and it's often this way) is to drop the FFMPEG dependencies. FFMPEG is an AMAZING project.. but it's incredibly complex, since it's about encoding and decoding video, and every software company has decided to go in their own proprietary direction :P

MRL's OpenCV does not require FFMPEG as a dependency.... so I dropped it, but I'll have a more detailed writeup on how to compile on an odroid later...

Now I can watch DJ's fish online with an HSV filter... Yay !

Next step is to put all the pieces together which have been built and put that in the repo - so other MRL'ians don't need to go through more steps ... ONE SOFTWARE  TO RULE THEM ALL !!