Tracking Service is used to find and track objects. This service typically runs as a closed loop with feedback.
The video above shows a Tracking service being started by a Python script. Copy and paste the script below, adjust the values for your setup, then execute it. You may set tracking points through the GUI by clicking on the video, or set them programmatically in the script.
A common setup might include input such as LKOpticalTrack points from the OpenCV service. The x and y position of the point would be sent to two PID controllers. The output of the PID controllers would be used to adjust the two Servos in a pan/tilt kit, which in turn would move the camera to follow the input.
[[service/Tracking.py]]
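To see just the shape of the closed loop described above, here is a tiny self-contained sketch in plain Python (not an MRL script - the frame size, gain and angle limits are made-up values for illustration only). The tracked point's distance from the center of the frame drives two proportional corrections that nudge the pan and tilt angles:

# Conceptual sketch only: tracked point -> two controllers -> pan/tilt angles.
# In the real loop, moving the camera changes where the point appears, closing the feedback loop.
def clamp(value, low, high):
    return max(low, min(high, value))

def update_pan_tilt(point_x, point_y, pan, tilt, frame_w=320, frame_h=240, kp=0.05):
    # error = how far the tracked point is from the center of the frame
    error_x = point_x - frame_w / 2.0
    error_y = point_y - frame_h / 2.0
    # proportional correction only; a real PID adds integral and derivative terms
    # (the signs depend on how the servos are mounted)
    pan = clamp(pan - kp * error_x, 0, 180)
    tilt = clamp(tilt + kp * error_y, 0, 180)
    return pan, tilt

pan, tilt = 90.0, 90.0
pan, tilt = update_pan_tilt(point_x=240, point_y=200, pan=pan, tilt=tilt)
print(pan, tilt)   # 86.0 94.0 - the camera gets nudged toward the point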
Here is my "NIGHT
Here is my "NIGHT TRACKING"...using a torch to make light on R2D2 :D
Nice !
R2D2 will not escape !
I wonder if a laser would make a good tracking point - I imagine it would depend on camera and laser details...
Tested them both on incubator .. they worked but there was one "funny" part..
The incubator's hardware is horrible - very slow, overburdened, not optimized at all - very old....
Anyway I had some problems where the Servos would not be attached after running the script - I believe the issue was opening the serial port takes too long...
I solved it with a sleep(3) after the tracking.attach(arduino) .. but there should be a better way to do this..
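For reference, the workaround looks roughly like this (whether you need it, and how long to sleep, depends on your hardware - this just shows where the delay goes):

from time import sleep

tracking.attach(arduino)   # attach can return before the serial port is actually ready
sleep(3)                   # crude workaround: give the slow serial port time to open
# only configure / move the servos after this point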
After I did that it all worked, but it is very WONKY when you glue the rest position at 5 instead of 90 :(
And Now With Error Messages !!!
Tested the second one now .. It works (with the latest MRL but you probably already Auto-Updated :) ..
Hmm.. this should probably be a warning not an error.. Look what gluing it wonky gets you ! It's interesting to see the whole thing rotate - so that it tries to get the point centered .. PID IS AMAZING !
Strong Worky !
Ok Alessandruino see if you can break them !
Or actually if you can get them to work with "only" changing configuration values.
Deleting the .myrobotlab is perfectly acceptable - it should work whether you delete it or not.
I tested both on my Win64 laptop.. and as soon as the build is done, I'll be testing on the incubator too..
I was really surprised how much "more" work this took on the Tracking service (not the script) .. but there is much good from it.
The Tracking service is a "Composite type Service" in that it does not control or produce data from hardware... instead it manipulates other common services to produce the desired function. I have for a long time been trying to find a pattern of how to successfully initialize a Composite Service. And now, I'm pretty sure I have figured out a general pattern to follow.
Other composite services like (InMoov & Cortex) should benefit greatly from this....
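Very roughly, the pattern is create first, configure, then start - something like this sketch (apart from Runtime.create and startService, which exist in MRL, treat the details as assumptions that vary by version):

tracking = Runtime.create("tracking", "Tracking")   # create the composite without starting it
# configure its peers here (serial port, servo pins, PID gains, rest positions, ...)
# before anything tries to talk to the hardware
tracking.startService()   # starting the composite starts and attaches its peers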
Try the 2 scripts - adjust the minimal amount of config - and let me know if they finally WORKY FOR YOU !
Thanks ! Making it (HARDER, BETTER, FASTER, STRONGER) :)
Alright, I added an invert() method to pid to flip it .. and direct() to make it direct again...
Did some tweaks in the PID & Tracking services, enabled MinMax to really work, changed the name of the second script to "manual" versus "safe"... and more stuff..
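In a script that looks something like this - invert()/direct() are the new methods mentioned above, but how you reach the y-axis PID (the "tracking.ypid" peer name here) is an assumption, so check the Tracking/PID service pages for your version:

ypid = Runtime.getService("tracking.ypid")   # hypothetical peer name for the tilt-axis PID
ypid.invert()   # flip the sign of the correction (e.g. servo mounted "backwards")
ypid.direct()   # back to normal, direct-acting behaviour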
AND NOW I FIND THE BUILD SERVER IS DOWN .. I need a robot to kick it - so it will start and I can get a build out :P
until then I'll have to wait until I get home
GAEL'S SERVOS ARE FINALLY SAFE !!! :D
Here are some pics of the safety ranges
# make safe with
# save servos !
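A minimal sketch of what limiting the range in the script might look like - the setMinMax name, the peer names and the 30-150 numbers are assumptions, use whatever actually keeps your servos safe:

# keep the servos inside a safe range so a bad correction can't slam them into their limits
x = Runtime.getService("tracking.x")   # hypothetical name of the pan servo peer
y = Runtime.getService("tracking.y")   # hypothetical name of the tilt servo peer
x.setMinMax(30, 150)
y.setMinMax(30, 150)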
No Worky <- is No Worky !
Hey Alessandruino,
Your No Worky is No Worky !
The 2 No-Worky log files you sent this morning contain no Servo errors. I suspect this is because you have a -logToConsole in the myrobotlab.bat?
You need to remove it or you can add a file appender (file log) by selecting System->Logging->Type (and checking File) before you do the test.
OK...thanks GroG..
Ya.. I have a -logToConsole in my myrobotlab.bat
This doesn't work right yet :) but I wanted to post it to show you can work with the eye that's created by default from the Tracker directly .. just another way..
this adds the face detect & an extra pyramid down after the lk .. interesting results :P
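Along the lines of this sketch - peer and filter names are assumptions, see the OpenCV service page for the exact filter names:

eye = Runtime.getService("tracking.opencv")   # the "eye" the Tracker creates by default
eye.addFilter("PyramidDown2", "PyramidDown")  # extra pyramid down after the LK filter
eye.addFilter("FaceDetect", "FaceDetect")     # face detection on top of the LK tracking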
Object detection problem. Please help URGENTLY!!!
Hello Respected Sir,
I am new to myrobotlab; it is a sincere request to please help with the 'Quality detection of bottle' project.
I am a UG Electronics Engineering student, working on a project 'Quality detection plant'. It is a PLC based control plant which fills, holds and caps the bottle accordingly. I want to check quality parameters such as:
1) Whether one or more bottles have slipped down on the belt
2) Whether the cap is fixed or the bottle went uncapped
3) Level of liquid
There are more parameters to be done but these are urgent.
I tried a lot with Visual Studio 10/12 plus OpenCV, which never opens my internal/external webcam.
I want to do this with myrobotlab, seeing its various functionalities. I tried the Tracking service in 1.0.119, but as it contains a lot of stuff such as servos, PID, etc., even though the Arduino Uno is connected and version number 21 is displayed, I am still not able to work out what the connection between tracking.opencv, tracking.arduino and the Python code should be. I don't want to track, so there is no need for servos, as my cam is fixed in one position. I just want to test whether the cap is fixed or not (the cap having a different color from the vicinity and the bottle), and the level of the liquid, as the bottle is white transparent. These will contribute to 3 output signals from the Arduino to the PLC CB280, conveying 1) whether bottles slipped, 2) cap fixed, 3) level below threshold mark.
Please guide us as soon as possible.
Thanks in advance
Jaykumar Vaidya
vaidyajaykumar@gmail.com
Also, a replica of the hardware as a preview to the software, a SCADA like thing, is needed. I saw WebGui - I think that's web SCADA? Or will I have to use the GUI tab in myrobotlab?
Hardware is ready, to be inspected by OpenCV image processing.
Thanks
JV
I'd recommend you look at the OpenCV service page
http://myrobotlab.org/service/OpenCV
If you're really interested, I'd recommend you start a new post, "clearly" describe your goals, and post some sample data (pictures of what you are trying to identify).
Then maybe you can generate interest, and elicit more help from people.
Tracking test..no work
I'm new to myrobotlab. I downloaded the latest MRL release, 1.0.1758, and loaded the entire code for the Tracking service above, but I get errors like these:
File "<string>", line 16, in <module>
AttributeError: org.myrobotlab.service.Servo object has no attribute 'setPin'
at org.python.core.Py.AttributeError(Py.java:205)....
Could you help me understand the problem? I have to start a project for tracking objects, but I cannot proceed because of these problems!
90 degree flip of cam + face tracking/recognition - damn u covid
Hello Guys!
So the corona situation made the purchase of a webcam pretty hard; in Germany basically every webcam below 100€ was sold out - lol.
I ended up buying one which has a longer PCB, and I had to mount it flipped 90 degrees into the InMoov head with custom eye parts printed. It's ok, it only took me 2 days and I raged a lot.
As I'd like to get more into scripting, I try to build my own scripts instead of using the ones that come with the InMoov. However, I already struggle to flip my webcam image to use your above script. As I understand it, the pyramid and grey filters always come with the start of the tracker.
How can I add a 90 degree flip (Transpose?) filter in the script, and where do I best place it? Does it have to come before starting face detection/recognition? I can't get it to work when adding it afterwards manually.
thank you!
easy :)
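Something like this sketch should be close - the Transpose filter name comes from the question above, but the peer name and whether the filter ends up ahead of the tracker's own filters are assumptions, so verify against your MRL version:

eye = Runtime.getService("tracking.opencv")   # hypothetical handle to the tracker's OpenCV peer
eye.addFilter("rotate", "Transpose")          # Transpose rotates the frame 90 degrees
# it needs to run before the face detect / LK filters so they see the rotated image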