This is a first test to see whether we can get Markus's HelloRobyn script to work with InMoov 2.0.
[[InMoov2.full3.byGael.Langevin.1.py]]
[[InMoov.Hello.Robyn.py]]
[[InMoov2.RobertoPreatoni.py]]
Hi Grog & Everyone,
I have been working on getting OpenNI to track my skeleton. The OpenNI service is up and running, and the display shows the grey depth-map image, so I know it's receiving data from the Kinect. However, when I stand in front of the Kinect, I'd expect to see a skeleton drawn over the image in the GUI, and nothing appears. Is there some initialization step the service needs before it will start tracking and rendering the skeleton?
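For reference, in MyRobotLab scripts that use OpenNI, skeleton tracking is typically switched on explicitly rather than starting automatically with the service. A minimal Jython sketch along those lines (this runs only inside MyRobotLab's script editor, not as standalone Python, and the `startUserTracking()` call is an assumption to verify against your MRL version):

```python
# Standard MyRobotLab pattern: create and start the OpenNI service.
openni = Runtime.createAndStart("openni", "OpenNI")

# Receiving depth data alone does not enable skeleton tracking;
# user tracking has to be turned on explicitly before the skeleton
# overlay can appear. (Assumed method name -- check your MRL build.)
openni.startUserTracking()
```

Note also that some OpenNI versions require a calibration pose (standing facing the sensor with arms raised) before a user's skeleton is calibrated and drawn, so it may be worth trying that pose once tracking is enabled.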
A few days ago there was talk in the Shoutbox about running InMoov over WiFi or Bluetooth. I would like to pursue this; one problem, though: I don't know how to do it. I am willing to buy the components needed, do the testing, and write a tutorial once it is working, but I need someone to advise me on which components to get and to help with the programming.
Does anyone want to help me out with this?
This is a video for Gael and for anyone looking into fingertip-sensor possibilities.