The next version, 'Nixie', is coming soon!

Help us get ready for MyRobotLab version Nixie: try it!

I finally took the time to finish making the Wall Lamp from the Red Bull 2013 Creation Contest.

Hi,

I have an HC-05 Bluetooth module, an Arduino, a Bluetooth dongle, a chassis with DC motors and a motor driver, and a DC supply. Now I want to track my robot with a camera placed on the ceiling of the room. As we discussed in the shoutbox, the robot first needs to be mapped.

Javadoc link

Attempts to find humans through OpenCV face detection. Uses a pan/tilt kit and LK optical-flow track points to follow the human after detection.


So far you have to use Eclipse and edit the FindHuman.java file in the service directory to set your Arduino pin and COM port settings, etc. Then start it like you would any other MRL service.

I bought a couple of relay modules for the Arduino to Borg more of my house together. The first objective will be for the Incubator to drive the sprinklers with the Cron and Arduino services. I'll also be looking into a web interface so my wife can manually override the sprinkler system.
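The override-plus-schedule logic could be sketched in plain Python like this. It is only a sketch of the decision rule, not the MRL Cron or Arduino service API; the zone names, times, and `manual_override` mechanism are all made up for illustration:

```python
from datetime import datetime, time, timedelta

# Hypothetical watering schedule: start time and duration per zone.
WINDOWS = {
    "front_lawn": (time(6, 0), timedelta(minutes=15)),
    "back_lawn":  (time(6, 20), timedelta(minutes=15)),
}

manual_override = {}  # zone -> True/False, set from a web UI

def relay_should_be_on(zone, now):
    """Decide a zone's relay state: a manual override wins, else the cron window."""
    if zone in manual_override:
        return manual_override[zone]
    start, duration = WINDOWS[zone]
    begin = datetime.combine(now.date(), start)
    return begin <= now < begin + duration

now = datetime(2013, 7, 1, 6, 5)
print(relay_should_be_on("front_lawn", now))  # True: inside the 06:00-06:15 window
manual_override["front_lawn"] = False
print(relay_should_be_on("front_lawn", now))  # False: the manual override wins
```

The relay pin itself would then just be set high or low from this boolean via the Arduino service.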

I feel like this is a dumb question, but I am going to ask it anyway.

I have been playing around with the vision examples that come with MRL. These are very cool and work well. The OpenCV GUI is very nice and friendly; it is what brought me to MRL. Playing with faceTracking.py, I see that the location and size of the box are printed to the Python window. My question is this: how would one send this data across a serial port?
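One common approach is to pack the box into a short newline-terminated ASCII line and write the bytes to the port. The message format below is made up for illustration, and the port name and baud rate are assumptions; within MRL you would typically use its Serial or Arduino service to do the writing, but the same bytes work with plain pyserial:

```python
def encode_box(x, y, w, h):
    """Pack the face box into one ASCII line, e.g. b'FACE,120,80,40,40\n'."""
    return ("FACE,%d,%d,%d,%d\n" % (x, y, w, h)).encode("ascii")

# With pyserial installed (pip install pyserial) you could then do, e.g.:
#   import serial
#   ser = serial.Serial("/dev/ttyUSB0", 57600, timeout=1)  # port/baud are guesses
#   ser.write(encode_box(120, 80, 40, 40))
# On the receiving Arduino, Serial.readStringUntil('\n') and parse the CSV fields.

print(encode_box(120, 80, 40, 40))
```

Terminating each message with a newline makes it easy for the receiver to frame messages even if bytes arrive in fragments.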

On many robots, when they move their arm, it tends to sway back and forth as the motion stops. Do you use some sort of sensor system and active control to compensate for this swaying?
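To make the idea concrete, here is a toy simulation, not any particular robot's controller, of an arm modeled as a mass on a spring, where the "active control" is simple velocity feedback (the derivative term of a PD loop). The gains and model are invented for illustration:

```python
def peak_sway(damping_gain, steps=2000, dt=0.01, stiffness=10.0):
    """Simulate x'' = -k*x - d*v and return the largest |x| in the last quarter."""
    x, v = 1.0, 0.0                              # start deflected 1 unit, at rest
    peak = 0.0
    for i in range(steps):
        a = -stiffness * x - damping_gain * v    # spring force + damping torque
        v += a * dt                              # semi-implicit Euler step
        x += v * dt
        if i >= steps * 3 // 4:
            peak = max(peak, abs(x))
    return peak

print(peak_sway(0.0))  # no feedback: still swinging near the initial amplitude
print(peak_sway(2.0))  # with velocity feedback the sway dies out
```

In practice the velocity signal would come from a sensor (encoder derivative, IMU gyro), and the correction would be applied as motor torque; the point of the sketch is just that feeding velocity back with a negative sign removes the oscillation.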

Joe Dunfee