Robots for Good Demo

Coming up next week is the Red Hat Summit.  This is a conference about open source and the open source operating system Linux.  You may be familiar with Red Hat because of the Red Hat Linux distribution, which played a large role in the growth of the open source community.

At the Red Hat Summit, Richard Hulskes was asked to talk about WeVolver and the Robots for Good project.  The goal of Robots for Good is to let children who are confined to a hospital bed tele-operate an InMoov, using a remotely connected Oculus Rift so they can experience fun places like the zoo.

So, to help them out, I'm planning on traveling to San Francisco next Monday with Harry to join Richard on stage.  I'm happy to travel with Harry, but it did mean a bit of polish needed to be added to Harry so he can give a nice demo for the folks while he's on stage.

I've been busy hammering away at a distributed MyRobotLab setup to support teleoperation of the InMoov robot with a remote Oculus Rift and joystick control.  The basic parts are in place to make this worky.  I have checked the two scripts into GitHub here:

 https://github.com/MyRobotLab/pyrobotlab/tree/master/home/kwatters/robotsforgood

There are two scripts there:

RobotsForGood.py and

RobotsForGoodInMoov.py

The system is set up with an InMoov connected to a Raspberry Pi 2.  There are two webcams connected to the RasPi, and I've used the program "mjpg-streamer" to stream their video.  It's not a fancy streaming protocol, but it does work.  The Raspberry Pi also runs the RobotsForGoodInMoov.py MyRobotLab script.  This script starts up the InMoov and the RemoteAdapter.  (RemoteAdapter is not worky yet as of the writing of this post.)
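For reference, launching two mjpg-streamer instances on the RasPi (one per eye camera) looks roughly like this; the device paths, resolution, and port numbers here are assumptions for illustration, not necessarily what the demo uses:

```shell
# Left eye camera, served over HTTP on port 8080
mjpg_streamer -i "input_uvc.so -d /dev/video0 -r 640x480 -f 30" \
              -o "output_http.so -p 8080" &

# Right eye camera on port 8081
mjpg_streamer -i "input_uvc.so -d /dev/video1 -r 640x480 -f 30" \
              -o "output_http.so -p 8081" &
```

Each camera is then reachable from the remote machine at a URL of the form http://<raspi-ip>:8080/?action=stream.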

The second MyRobotLab script starts up the OculusRift service and configures the left and right OpenCV instances to read from the remote MJPEG streams coming from the RasPi.  The video from the left- and right-eye cameras on the RasPi is streamed to the left and right eyes of the Oculus Rift.  The Oculus Rift provides head-tracking information that is then transformed into servo angles, and those are broadcast back to the remote InMoov.
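The head-tracking-to-servo transform is basically a linear remap with clamping.  Here's a minimal sketch of the idea; the angle ranges and the servo names are assumptions for illustration, not the exact values in the actual scripts:

```python
def angle_to_servo(angle, in_min=-90.0, in_max=90.0, out_min=0.0, out_max=180.0):
    """Linearly remap a head-tracking angle (degrees) to a servo position,
    clamping to the servo's travel limits."""
    angle = max(in_min, min(in_max, angle))  # clamp to the tracked range
    scale = (out_max - out_min) / (in_max - in_min)
    return out_min + (angle - in_min) * scale

# Yaw drives the head's rotation servo, pitch drives the neck tilt servo.
yaw, pitch = 30.0, -15.0           # sample Rift head-tracking angles
rothead = angle_to_servo(yaw)      # 120.0
neck = angle_to_servo(pitch)       # 75.0
```

The resulting positions are what gets broadcast over the network to the InMoov side, where the servos are actually moved.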

For now, I've only got the remote video and head tracking working.  I'm going to try to layer in the arm control this week, and if I have time this weekend, add some batteries to Harry.  If I'm really ambitious, I'll try to assemble a mobile base.

Harry is my second InMoov.  Lloyd, Harry's older brother, first met Richard at the NYC World Maker Faire nearly two years ago.  Lloyd was the first InMoov with Oculus Rift support.  Though it was very crude, the basic concepts were there and in place.  Now it's time for Harry to show off the continuing progress.  I'm looking forward to it!