Javadoc link

The Tracking service is used to find and track objects. It typically runs as a closed loop with feedback.


The video above shows a Tracking service being started by a Python script. Copy and paste the script below, adjust the values for your hardware, then execute it. You may set tracking points through the GUI by clicking on the video, or set them programmatically in the script.

A common setup might include input such as LKOptical points from the OpenCV service. The x and y position of the point would be sent to two PID controllers. The output of the PID controllers would be used to adjust the two Servos in a pan/tilt kit, which in turn would move the camera to follow the input.
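The loop just described can be sketched in a few lines of plain Python. This is an illustrative toy, not MRL's actual Pid or Servo API: it only shows how a point's pixel error is turned into a bounded servo correction.

```python
# Toy closed-loop tracking step - NOT the MRL API, just the idea.
class Pid:
    """Minimal PID controller (illustrative stand-in for MRL's Pid service)."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.last_error = 0.0

    def compute(self, error, dt=1.0):
        self.integral += error * dt
        derivative = (error - self.last_error) / dt
        self.last_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def track_step(pid, target_px, current_px, servo_angle, gain=0.1):
    """One iteration of the loop: pixel error -> PID -> clamped servo angle."""
    correction = pid.compute(target_px - current_px) * gain
    # keep the command inside a safe servo window, like setMinMax(30, 150)
    return max(30, min(150, servo_angle + correction))

# the tracked point sits at x=200 while the frame centre is x=160,
# so the pan servo is nudged away from 90 to re-centre the point
pan_pid = Pid(kp=5.0, ki=0.0, kd=0.1)
angle = track_step(pan_pid, 160, 200, 90)
```

In the real service the OpenCV service supplies the point, the Pid service computes the correction, and the Servo service executes it; the sketch just compresses that pipeline into one function.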



Example code (from branch develop):
# a minimal tracking script - this will start all peer
# services and attach everything appropriately
# change parameters depending on your pan tilt, pins and
# Arduino details
# all commented code is not necessary but allows custom
# options
port = "COM19"   #change COM port to your own port
xServoPin = 13   #change this to the right servo pin if needed, for inmoov this is right
yServoPin = 12   #change this to the right servo pin if needed, for inmoov this is right
# create a servo controller and a servo
arduino = runtime.start("arduino","Arduino")
xServo = runtime.start("xServo","Servo")
yServo = runtime.start("yServo","Servo")
# start optional virtual arduino service, used for testing
if ('virtual' in globals() and virtual):
    virtualArduino = runtime.start("virtualArduino", "VirtualArduino")
    virtualArduino.connect(port)
# connect the real (or virtual) Arduino to the serial port
arduino.connect(port)
xServo.attach(arduino.getName(), xServoPin)
yServo.attach(arduino.getName(), yServoPin)
tracker = runtime.start("tracker", "Tracking")
# set specifics on each Servo
xServo.setMinMax(30, 150)  #minimum and maximum settings for the X servo
# servoX.setInverted(True) # invert if necessary
yServo.setMinMax(30, 150)  #minimum and maximum settings for the Y servo
# servoY.setInverted(True) # invert if necessary
# changing PID values changes the
# speed and "jumpiness" of the Servos
pid = tracker.getPID()
# these are the default settings
# adjust to make movement smoother
# or faster
pid.setPID("x",5.0, 5.0, 0.1)
pid.setPID("y",5.0, 5.0, 0.1)
# start the camera and attach everything to the tracker
opencv = runtime.start("opencv", "OpenCV")
tracker.connect(opencv, xServo, yServo)
# Gray & PyramidDown make face tracking
# faster - if you don't want these filters, you
# may remove them before you select a tracking type with
# the following command
# tracker.clearPreFilters()
# different types of tracking
# lkpoint - click in video stream with 
# mouse and it should track
# simple point detection and tracking
# tracker.startLKTracking()
# scans for faces - tracks if found
# tracker.findFace()
# tracker + facedetect : tracker.faceDetect(True)
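The script's comments note that changing PID values changes the speed and "jumpiness" of the servos. A toy proportional-only simulation (plain Python, not MRL's Pid service) gives a feel for that trade-off: a small gain converges smoothly, a large gain overshoots and oscillates.

```python
def simulate(kp, steps=20):
    """Drive a toy position toward a target with a proportional-only controller."""
    position, target = 0.0, 100.0
    history = []
    for _ in range(steps):
        error = target - position
        position += kp * error   # with kp > 1 each correction overshoots
        history.append(position)
    return history

gentle = simulate(kp=0.3)   # creeps up toward 100 without overshooting
jumpy  = simulate(kp=1.5)   # first step lands at 150, then oscillates around 100
```

Tuning the real `pid.setPID("x", ...)` values is the same game: a larger proportional term buys speed at the cost of jumpiness, while the derivative term damps it.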


11 years 2 months ago

Here is my "NIGHT TRACKING"... using a torch to shine light on R2D2 :D

R2D2 will not escape !
I wonder if a laser would set a good tracking point, I imagine it would depend on camera and laser details...


11 years 2 months ago

In reply to by GroG

Tested them both on incubator ..  they worked but there was one "funny" part..

The incubator's hardware is horrible - very slow, over burdened, not optimized at all - very old....

Anyway I had some problems where the Servos would not be attached after running the script - I believe the issue was opening the serial port takes too long... 

I solved it with a sleep(3) after the tracking.attach(arduino) .. but there should be a better way to do this..

After I did that it all worked, but it is very WONKY when you glue the rest position at 5 instead of 90 :(

Tested the second one now .. It works (with the latest MRL but you probably already Auto-Updated :) ..
Hmm.. this should probably be a warning not an error..   Look what gluing it wonky gets you !  It's interesting to see the whole thing rotate - so that it tries to get the point centered ..  PID IS AMAZING !


11 years 2 months ago

Ok Alessandruino see if you can break them !  
Or actually if you can get them to work with "only" changing configuration values.
Deleting the .myrobotlab directory is perfectly acceptable - it should work whether you delete it or not.

I tested both on my Win64 laptop..  and as soon as the build is done, I'll be testing on the incubator too..

I was really surprised how much "more" work this took on the Tracking service (not the script) .. but there is much good from it.

The Tracking service is a "Composite type Service"  in that it does not control or produce data from hardware...  instead it manipulates other common services to produce the desired function.  I have for a long time been trying to find a pattern of how to successfully initialize a Composite Service.  And now, I'm pretty sure I have figured out a general pattern to follow.  

Other composite services like (InMoov & Cortex) should benefit greatly from this....
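That composite pattern can be sketched in plain Python (class names here are illustrative only, not MRL's): the composite never touches hardware itself - it creates its peers and only ever sends them high-level commands.

```python
class Servo:
    """Stand-in peer service: owns the 'hardware' detail (an angle)."""
    def __init__(self):
        self.angle = 90.0

    def move_to(self, angle):
        self.angle = float(angle)

class Tracking:
    """Stand-in composite service: creates its peers and coordinates them."""
    def __init__(self):
        # initialization pattern: the composite builds its own peers ...
        self.pan = Servo()
        self.tilt = Servo()

    def track_point(self, x, y):
        # ... and manipulates them to produce the desired function;
        # x and y are normalized 0..1 screen coordinates
        self.pan.move_to(x * 180)
        self.tilt.move_to(y * 180)

tracker = Tracking()
tracker.track_point(0.5, 0.5)   # centre of frame -> both servos at 90
```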

Try the 2 scripts - adjust the minimal amount of config - and let me know if they finally WORKY FOR YOU !

Thanks !  Making it (HARDER, BETTER, FASTER, STRONGER) :)


11 years 2 months ago

Alright, I added an invert() method to pid to flip it .. and direct() to make it direct again...
Did some tweaks in the PID & Tracking services, enabled MinMax to really work, changed name of second script to "manual" versus "safe"... and more stuff..

AND NOW I FIND THE BUILD SERVER IS DOWN ..  I need a robot to kick it - so it will start and I can get a build out :P

until then I'll have to wait until I get home


11 years 2 months ago

Here are some pics of the safety ranges

# make safe with

tracker.setXMinMax(60, 120)
tracker.setYMinMax(60, 120)

# save servos !
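What a safety range like the one above boils down to is a clamp: every commanded angle is forced into the allowed window before it reaches the servo. A toy stand-in, not the MRL implementation:

```python
def clamp(angle, lo=60, hi=120):
    """Force a commanded angle into the safe window, like setXMinMax(60, 120)."""
    return max(lo, min(hi, angle))

safe_low  = clamp(45)    # below the window, pulled up to 60
safe_mid  = clamp(90)    # inside the window, passes through unchanged
safe_high = clamp(170)   # above the window, pulled down to 120
```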


11 years 2 months ago

Hey Alessandruino,
Your No Worky is No Worky !

The 2 No-Worky log files you sent this morning contain no Servo errors. I suspect this is because you have -logToConsole in the myrobotlab.bat?

You need to remove it or you can add a file appender (file log) by selecting System->Logging->Type (and checking File) before you do the test.


11 years 2 months ago

This doesn't work right yet :)  but I wanted to post it to show you can work with the eye that's created by default from the Tracker directly .. just another way..  

this adds the face detect & an extra pyramid down after the lk .. interesting results :P



tracker = Runtime.create("tracker","Tracking")
tracker.setRestPosition(90, 90)
tracker.trackPoint(0.5, 0.5)
tracker.eye.addFilter("pd1", "PyramidDown")
tracker.eye.addFilter("FaceDetect1", "FaceDetect")




Jaykumar Vaidya

8 years 2 months ago

Hello Respected Sir,

I am new to myrobotlab; it is a sincere request to please help with my 'Quality detection of bottle' project.

I am a UG Electronics Engg student, working on a project 'Quality detection plant'. It is a PLC based control plant which fills, holds and caps the bottle accordingly. I want to check quality parameters such as:

1) Whether one or more bottles have slipped down on the belt

2) Whether the cap is fixed or the bottle went uncapped

3) Level of liquid

There are more parameters to be done but these are urgent.

I tried a lot with Visual Studio 10/12 plus OpenCV, which never opens my internal/external webcamera.

I want to do this with myrobotlab, seeing its various functionalities. I tried the Tracking service in 1.0.119, but it contains a lot of stuff such as servos, PID, etc. Even though my Arduino Uno is connected (version number 21 displayed), I am still not able to work out what the connection should be between tracking.opencv, tracking.arduino and the Python code. I don't want to track, so there is no need for servos, as my cam is fixed in one position. I just want to test whether the cap is fixed or not (the cap has a different colour from its vicinity and the bottle), and the level of liquid, as the bottle is white and transparent. These will contribute to 3 output signals from the Arduino to the PLC CB280, conveying: 1) whether bottles slipped, 2) whether the cap is fixed, 3) whether the level is below the threshold mark.



Please guide us as soon as possible.

Thanks in advance

Jaykumar Vaidya
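For what it's worth, the two checks described in the comment above (cap present, liquid at the fill line) don't need tracking at all - they reduce to looking at pixel regions. Below is a toy sketch using plain Python lists as a stand-in for a camera frame; all coordinates, colours and thresholds are invented for illustration and would need calibrating against a real image.

```python
def make_frame(w=60, h=100):
    """Fake 100x60 RGB frame: white bottle, red cap on top, dark liquid below."""
    frame = [[(255, 255, 255)] * w for _ in range(h)]
    for y in range(0, 10):
        frame[y] = [(200, 40, 40)] * w      # red cap rows
    for y in range(60, h):
        frame[y] = [(80, 80, 80)] * w       # liquid-filled rows
    return frame

def has_cap(frame, cap_rows=range(0, 10), cap_color=(200, 40, 40), tol=60):
    """True if most pixels in the cap region are close to the cap colour."""
    pixels = [p for y in cap_rows for p in frame[y]]
    close = [p for p in pixels
             if all(abs(c - t) < tol for c, t in zip(p, cap_color))]
    return len(close) > len(pixels) / 2

def liquid_at_level(frame, row=70, dark_threshold=100):
    """True if the row at the fill line is darker than the empty bottle."""
    avg = sum(sum(p) / 3.0 for p in frame[row]) / len(frame[row])
    return avg < dark_threshold

frame = make_frame()
cap_ok = has_cap(frame)            # True for this fake frame
level_ok = liquid_at_level(frame)  # True for this fake frame
```

The same region checks could be run on frames from the OpenCV service, with the boolean results driving the three Arduino output pins; the sketch only shows the decision logic.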

I'm new to myrobotlab. I downloaded the latest MRL release, 1.0.1758, and ran the entire tracking service code above, but I get errors like this:

File "&lt;string&gt;", line 16, in &lt;module&gt;
AttributeError: org.myrobotlab.service.Servo object has no attribute 'setPin'

Could you help me understand the problem? I have to start a project for tracking objects, but I cannot proceed because of these problems!


4 years ago

Hello Guys!

so the corona situation made the purchase of a webcam pretty hard; in Germany basically every webcam below 100€ was sold out - lol.

I ended up buying one which has a longer PCB and I had to mount it flipped 90 degrees into the InMoov head with custom printed eye parts. It's ok, it only took me 2 days and I raged a lot.

As I'd like to get more into scripting, I try to build my own scripts instead of using the ones that come with the InMoov. However, I already struggle to flip my webcam image to use your script above. As I understand it, the PyramidDown and Gray filters always come with the start of the tracker.

How can I add a 90 degree flip (Transpose?) filter in the script, and where is it best placed? Does it have to be before starting face detection/recognition? I can't get it to work when adding it afterwards manually.

thank you!
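Not a definitive answer, but the maths behind a 90 degree flip is worth seeing: a transpose alone mirrors the image, while transpose plus a row reversal gives a true rotation. The sketch below is plain Python on a 2-D list; the filter name and its placement in MRL are assumptions to verify - the point is that any rotation has to happen before filters like FaceDetect that expect an upright face.

```python
def rotate90_cw(image):
    """Rotate a 2-D image 90 degrees clockwise: transpose, then reverse rows."""
    return [list(row)[::-1] for row in zip(*image)]

image = [[1, 2, 3],
         [4, 5, 6]]
rotated = rotate90_cw(image)   # [[4, 1], [5, 2], [6, 3]]
```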


Example configuration (from branch develop):
enabled: false
listeners: null
lostTrackingDelayMs: 1000
peers:
  controller:
    autoStart: true
    name: tracking.controller
    type: Arduino
  cv:
    autoStart: true
    type: OpenCV
  pan:
    autoStart: true
    name: tracking.pan
    type: Servo
  pid:
    autoStart: true
    type: Pid
  tilt:
    autoStart: true
    name: tracking.tilt
    type: Servo
type: Tracking