Finally... Yay!

This morning I finally got this working end to end with a streaming 3D point cloud. It was slow, but it worked! The Kinect is fast, but the display is painfully slow. If a robot were using the 3D cloud it wouldn't need a display anyway, and there is probably display-acceleration software I could be using, like JOGL - but it's a start!
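For anyone curious, this is roughly the shape of the Java 3D side - a depth frame packed into a PointArray. It's only a sketch with faked depth values (the real ones would come from OpenNI's depth generator), and it skips every few pixels, since pushing all 640x480 points at the display is part of what makes it so slow.

import javax.media.j3d.GeometryArray;
import javax.media.j3d.PointArray;
import javax.vecmath.Point3f;

// Sketch: pack a depth frame into a Java 3D PointArray.
// Depth values are faked - a real version would read them from the
// OpenNI depth generator every frame.
public class DepthToPoints {

    // step > 1 decimates the cloud so the display isn't swamped
    public static PointArray toPointArray(short[] depth, int width, int height, int step) {
        int cols = width / step, rows = height / step;
        PointArray points = new PointArray(cols * rows, GeometryArray.COORDINATES);

        int i = 0;
        for (int y = 0; y < rows * step; y += step) {
            for (int x = 0; x < cols * step; x += step) {
                float z = depth[y * width + x] / 1000f;  // millimetres -> metres
                float px = (x - width / 2f) / width;     // crude projection, fine for a sketch
                float py = (height / 2f - y) / height;
                points.setCoordinate(i++, new Point3f(px, py, -z));
            }
        }
        return points;
    }

    public static void main(String[] args) {
        short[] fake = new short[640 * 480];             // stand-in for one Kinect frame
        java.util.Arrays.fill(fake, (short) 1500);       // pretend everything is 1.5 m away
        PointArray cloud = toPointArray(fake, 640, 480, 4);
        // wrap in new Shape3D(cloud), add to a BranchGroup and a SimpleUniverse to display
        System.out.println("points: " + cloud.getVertexCount());
    }
}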

Figuring out how to install all the pieces was not trivial, but Mitch's tutorial was the best guide. There were two "gotchas":

The trunk of the SensorKinect project is not compatible with the trunk of OpenNI - you have to check out a specific commit (at the moment):

cd SensorKinect
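# check out the specific SensorKinect commit that works against the OpenNI trunk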
git checkout faf4994fceba82e6fbd3dad16f79e4399be0c184

The other gotcha had to do with getting the Java 3D world to display - I didn't get any errors or other helpful info, it just would not display. It turns out that (on Linux) the drawing of the tabs adversely affects the Java 3D world view. If you simply detach the tab - "Ta Daa" - it will display.
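My guess is that this is the usual heavyweight/lightweight mixing problem - Java 3D's Canvas3D is a heavyweight AWT component and the Swing tabs are lightweight. Giving the canvas its own top-level window (which is effectively what detaching the tab does) avoids it. A bare-bones sanity check along those lines - not MRL's actual GUI code:

import java.awt.BorderLayout;
import javax.media.j3d.BranchGroup;
import javax.media.j3d.Canvas3D;
import javax.swing.JFrame;
import com.sun.j3d.utils.geometry.ColorCube;
import com.sun.j3d.utils.universe.SimpleUniverse;

// Heavyweight Canvas3D in its own frame - the code equivalent of "detach the tab".
public class DetachedCanvasCheck {
    public static void main(String[] args) {
        Canvas3D canvas = new Canvas3D(SimpleUniverse.getPreferredConfiguration());

        SimpleUniverse universe = new SimpleUniverse(canvas);
        universe.getViewingPlatform().setNominalViewingTransform();

        BranchGroup scene = new BranchGroup();
        scene.addChild(new ColorCube(0.3)); // any geometry will do for the sanity check
        scene.compile();
        universe.addBranchGraph(scene);

        JFrame frame = new JFrame("Java 3D sanity check");
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.getContentPane().add(canvas, BorderLayout.CENTER);
        frame.setSize(640, 480);
        frame.setVisible(true);
    }
}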

Todo:

  • Get the Xtion sensor to work with all the other software components.
  • Package up all the parts so that "installing a service" works on any of the 6 operating system targets - Linux 32, Linux 64, Windows 32, Windows 64, Mac 32, Mac 64 (oh boy). A rough sketch of picking the right natives per platform is below.
  • This was all for a 3D point cloud, which will be useful for SLAM at some point - however, NITE comes with some great gesture-recognition software, which should become another "mode" of MRL's OpenNI service.
  • Find some graphics-acceleration techniques for a faster display - at the moment, the Java display gets swamped by the data coming from the Kinect.
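For the packaging item, something along these lines is what I have in mind for picking the right natives at runtime - only a sketch, and the folder names are made up:

// Rough sketch of choosing a native-library folder per platform.
// The folder names ("linux-32", "windows-64", ...) are hypothetical, not MRL's actual layout.
public class PlatformCheck {

    public static String nativeLibDir() {
        String os = System.getProperty("os.name").toLowerCase();
        String arch = System.getProperty("os.arch").toLowerCase();
        boolean is64 = arch.contains("64");

        String prefix;
        if (os.contains("linux")) {
            prefix = "linux";
        } else if (os.contains("windows")) {
            prefix = "windows";
        } else if (os.contains("mac")) {
            prefix = "mac";
        } else {
            throw new UnsupportedOperationException("unsupported OS: " + os);
        }
        return prefix + (is64 ? "-64" : "-32");
    }

    public static void main(String[] args) {
        // e.g. prepend this directory to java.library.path before loading the OpenNI natives
        System.out.println("native libs would come from: " + nativeLibDir());
    }
}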