First credit must be given to the SimpleOpenNI project, which created a clean and manageable interface to OpenNI & NITE.
The SimpleOpenNI project describes itself this way:
"This project is a simple OpenNI and NITE wrapper for Processing. Therefore not all functions of OpenNI are supported, it's meant more to deliver a simple access to the functionality of this library."
They did a great job. And although they created it specifically for Processing, it was such a great project that we decided to Borg it in.
Regrettably, OpenNI currently has some repo/pathing problems. They will be fixed, but before that happens the system in the repo/libraries path needs to be removed - and before that can happen, the repo has to be fixed.
This link describes a work-around
References
- Reference For SimpleOpenNI and Kinect
- SkelTrak - less Jitter
- FingerTIp Extraction Processing OpenCV
- Drawing Depth with Kinect
- 3D Vector & Matrix Math
- Kinect Microphone Array in Java - Andrew Davison
- Kinect Open Source Programming Secrets
- Vibration Motors for High Accuracy
- Listening with Kinect
- Common Bridge - Cinder
- Finger Tracker
- SimpleOpenNI Home
- Reference for Simple-OpenNI and the Kinect
- Another Finger Tracker
- How to Install OpenNI 2 + Nite 2 + Kinect SDK 1.6 + windows 7 32/64
- ODROID binaries
- How to Install OpenNI 2 + Nite 2 + Kinect SDK 1.6 + windows 7 32
- Simple-OpenNI
- OpenNI downloads
- DWR's foray into getting OpenNI working on MRL
- My experiments into OpenNI
- How-to: Successfully Install Kinect on Windows (OpenNI and NITE)
- Examples from SONI
- OpenNI API
- Open Source Alternatives to MS SDK
Historical references
This is the service page for OpenNI.
Currently, this service is in dire need of development & stabilization. OpenNI itself seems to be in dire need of development and stabilization ;) DWR has tested the SimpleOpenNI & OpenNI "unstable" releases.
To Do
- fix packages in repo
- matrix of working, non-working, Kinect, Xtion, and un-tested combinations
- images back from camera
- gesture capture & recognition
- distribute 1 binary for status check
- split service into two - GestureRecognition & PointCloud
OS | Bitness | Xtion | Kinect | Repo
---|---|---|---|---
Linux (Ubuntu) | 32 | Works - needs repo bundle & driver info | Works - needs work | org.OpenNI.jar; native: libnimCodecs.so, libnimMockNodes.so, libnimRecorder.so, libOpenNI.jni.so, libOpenNI.so
Linux (Ubuntu) 12.10? | 64 | ? | ? | org.OpenNI.jar - missing native files!
Windows (XP) | 32 | ? | ? | org.OpenNI.jar - missing native files!
Windows (7) | 64 | Did work - needs verification | ? | org.OpenNI.jar; native: OpenNI.jni64.dll, OpenNI64.dll
Block Diagram of Modules
Python
[[service/OpenNI.py]]
Questions
SimpleNI vs OpenNI
I'm speaking off the cuff here, and I might get a detail or two wrong. But here's the state of the technology as I understand it.
First of all, a definition: "Natural Interaction" How often did you see Dr Smith, or Will Robinson actually PROGRAM Robot B9? Or use any sort of remote control device? Or see any TV/Movie robot programmed or controlled by the use of some console or handheld device? Let's see. . .I can think of Gigantor, and. . .the only time I've ever seen robots programmed was when Huey, Louie, and Dewey were taught to play poker. (If you're familiar with the scene--people had a different concept of computers in the early 70's, and once upon a time computers WERE programmed by physically changing (soldering/unsoldering) internal electrical connections between logic gates and registers. But I digress.)
Back to Natural Interaction.
Whenever Will Robinson wanted something out of Robot B9, they spoke to each other as if Robot B9 were just another person. In other words, they "interacted naturally".
OpenNI is an organization founded in 2010, composed of a number of companies that explored naturally interactive controls for computers. One of the early devices supported was a depth-sensing camera with its open source drivers from PrimeSense. The Kinect came out of development between PrimeSense and Microsoft. The Asus Xtion devices came from a similar project between PrimeSense and Asus.
The OpenNI software is programmed in C and C++. The problem with C and C++ is that they take a serious commitment from an unnaturally orderly brain to understand and use effectively.
There are a few projects out there in the wild web that attempt to interface OpenNI to more approachable languages like Java and Python. The one that seems to have the most success is SimpleOpenNI, which is a wrapper for OpenNI. Basically, you don't have SimpleOpenNI without OpenNI.
And it seems to be working. When I first discovered SimpleOpenNI, it was basically a little known effort to make OpenNI work with the Processing computer language. A few months later, a Google search will yield quite a few tutorials and projects that depend on SimpleOpenNI. I don't see as much activity on the Java and Python wrappers out there.
To get SimpleOpenNI, you start with a working OpenNI installation (files available from the OpenNI website). Install Processing (which requires Java), download the SimpleOpenNI wrapper from the SimpleOpenNI website, and drop it into the Processing/Sketchbook/Libraries folder.
So. . .does SimpleOpenNI = OpenNI? I guess the best answer is as much as MRL OpenNI = OpenNI.
MRL does not require SimpleOpenNI to run, but Both MRL and SimpleOpenNI require the files from the OpenNI website.
To restate
To restate (cause my mutant brain needs extra help sometimes):
Simple-OpenNI is a big wrapper - the folks there took the effort to bundle the correct (working) versions of all the bits & pieces.
Some of the bits & pieces are the following:
I believe it also comes with a Jar to interface with Java programs, with Processing specifically in mind.
Problems :
MRL is designed to be as fully self-contained as possible. It will not dump files into system directories, nor does it expect immediate dependencies to be in shared or system directories.
Additionally, we all like the concept of push-the-button-and-it-works. I like that concept; however, as a developer, the concept usually means "more work". With Simple-OpenNI, some developer has taken the time to put all the working bits together into an easy (push-button) installer. Regrettably, this makes it more difficult for me to figure out exactly which files are needed / bundled in that installer. A bit of a catch-22 - making it easier for users makes it more difficult for developers, and vice versa.
The challenge now is to determine exactly what the 4 installers put where, and whether those pieces are worth decomposing and putting into MRL's repo - or if the complexity of the install makes this intractable.
That's pretty much how it goes
But a couple of clarifications. NITE is actually the middleware, and the largest part of the package. I THINK it's also the source of the example software for OpenNI.
The two sensor packages are drivers--one for the Xtion, and one for the Kinect.
An OpenNI installation is OpenNI, Nite, and a driver.
The wrapper, SimpleOpenNI, goes into the Sketches/libraries folder of Processing. It seems to be the wrapper plus some examples in Processing.
I've taken a closer look at the OpenNI website. Their stable binaries seem to have version numbers 1.5.2.x. Unstable have version numbers of 1.5.4.x. The Simple OpenNI website apparently gathers up packages from both releases. They don't seem to have hardware binaries for Microsoft Kinect.
Typically, I install the unstable releases from the OpenNI website by running their installers, and then install the wrapper by copying and pasting the unzipped simpleopenni file into Processing's libraries folder.
NITE is the largest part
NITE is the largest part because it's responsible for the skeleton detection and tracking.
OpenNI is a lower-level framework which is supposed to offer pluggable module implementations of all sorts of NI stuff - even sound localization, etc. This forum thread seems to make a good attempt at sorting out the confusion.
For the drivers, I'm a bit confused why they label them this way...
Back to where I left off....
It took a little bit to get back to where I left off..
This is the (new?) ... really just a renamed PointCloud service. Initially stuff was not working for me because the Java3D I had was 64 bit only, but I was running a 32 bit JVM (OpenJDK)... switched back to an Oracle 64 bit JVM and got it working (again).
Here's my wall & phone in the 3D point cloud using a Kinect. It does take a little while for the video stream to display. Then trying to move the 3D cloud with a mouse was sluggish - I also saw the buffer overrun as the Kinect data piled up waiting to be displayed...
The BIG BOTTLENECK - Java3D and displaying all those data points in a 3D representation...
Possible solutions ... JOGL
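Another cheap mitigation, independent of which renderer wins: decimate the cloud before handing it to Java3D/JOGL. A rough sketch (the packing of points as x,y,z triples and the frame size are assumptions, not MRL's actual buffer layout):

```java
// Keep every Nth point of a point cloud packed as x,y,z float triples.
// Dropping points before rendering cuts the draw load by roughly a
// factor of N, with little visual loss for a wall-and-phone scene.
public class CloudDecimator {
    static float[] keepEveryNth(float[] xyz, int n) {
        int points = xyz.length / 3;
        int kept = (points + n - 1) / n;   // ceiling division
        float[] out = new float[kept * 3];
        for (int i = 0, o = 0; i < points; i += n, o++) {
            out[o * 3]     = xyz[i * 3];       // x
            out[o * 3 + 1] = xyz[i * 3 + 1];   // y
            out[o * 3 + 2] = xyz[i * 3 + 2];   // z
        }
        return out;
    }

    public static void main(String[] args) {
        float[] cloud = new float[640 * 480 * 3]; // one Kinect-sized frame
        float[] small = keepEveryNth(cloud, 4);
        System.out.println(cloud.length / 3 + " -> " + small.length / 3 + " points");
    }
}
```

Keeping every 4th point turns a 307,200-point Kinect frame into ~76,800 points, which may be enough to get the mouse interaction smooth again.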
But at this point I'd like to change focus to the NITE stuff, to get DWR's Houston capable of being commanded by gestures..
Downloading Simple-OpenNI
Downloading Simple-OpenNI Linux64 to my VirtualBox to see if I can decompose what they put together... Linux typically being developer-friendly (and not as user-friendly).
Would OpenNI by any other name smell as sweet?
Alright I'm going on a lot of conjecture, but you have to start somewhere...
OpenNI is the name of an organization, a web site, a relatively lightweight framework, and - "incorrectly" - a bundle of the aforementioned framework, NITE & a sensor driver.
To get "OpenNI" (the incorrect bundle-name reference) to "work", you also need, in addition to the 3 previous parts, an actual hardware driver. This can be a Kinect or Xtion driver - but of course it must be compatible with the "Sensor Driver" from OpenNI...
That's my guess.. and I'm sticking to it... (until proven otherwise) :D
Oh as an addendum :
OpenNI (the organization) provides other language bindings. They provide OpenNI.jar (and a 64 bit jar for 64 bit JVMs) - this is a general-purpose interface for Java, made by the same organization which created all the other pieces (except for the hardware drivers).
Simple-OpenNI provides its own interface jar. This one was created to interface Processing with OpenNI.
So my preference, given the limited info I currently have, would be to interface with OpenNI (the organization)'s general-purpose Java binding, because "usually" (not always) the manufacturers of the libraries provide more and better access... but I don't know enough of the details to decide at the moment.
Yay - decisions are made !
We will be going with OpenNI's (the organization's) "unstable" releases. I'll download all the binaries and attempt to decompose them into MRL repo packages.
This is preferable if the OpenNI.jar allows more/better access than the simple-openni one (still don't know at this point).
After the packages are created, there will still be missing PrimeSense & Kinect drivers... Any thoughts, DWR?
Update.
The PrimeSense driver comes with OpenNI - hmmm, so does this work with Xtion AND Kinect, or ???
Its all so clear ... except...
The unstable OpenNI looks all good - I've downloaded the Mac & ARM versions as well, to be as complete as possible. The files in all of them are what I would expect, and I forgot to mention the very conveniently supplied test executables... even test .jars! Very nice...
One (not so nice) thing is that they have a different jar for each platform :P
This is just bad Java design. Understandably you have to recompile all the native code for each platform, therefore you get N different sets of native binaries, BUT... you SHOULD only need one jar! I looked, and each jar is different. I'm guessing it's because the jar loads the native library, and the native library names are different for each platform. Grrr.. Good design would be if the jar itself looked at what platform it was running on and loaded the appropriate libraries... Oh well..
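For what it's worth, the "one jar" approach is only a few lines of platform sniffing. A minimal sketch - the library base names below are illustrative, loosely based on the files seen in the releases (libOpenNI.jni.so, OpenNI.jni64.dll), not a verified mapping:

```java
// Pick the native library name at runtime instead of shipping one jar
// per platform. System.loadLibrary() adds the lib prefix / .so / .dll
// suffix itself, so only the base name varies here.
public class NativeLoader {
    // osName/osArch come from the os.name / os.arch system properties.
    static String pickLibrary(String osName, String osArch) {
        boolean is64 = osArch.contains("64");
        if (osName.toLowerCase().startsWith("windows")) {
            // Windows releases ship a 64-suffixed JNI dll
            return is64 ? "OpenNI.jni64" : "OpenNI.jni";
        }
        // Linux/Mac use the same base name for both bitnesses
        return "OpenNI.jni";
    }

    public static void main(String[] args) {
        String lib = pickLibrary(System.getProperty("os.name"),
                                 System.getProperty("os.arch"));
        System.out.println("would call System.loadLibrary(\"" + lib + "\")");
    }
}
```

One jar with a loader like this (plus per-platform native bundles) is the design the OpenNI releases could have used.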
The other thing which is still a mystery is what sensors are out there and how they are different.
Here is a good question, to which I have not found a good answer: what is provided with OpenNI (the package), and how does it support the 3? flavors of sensors?
I've seen three sensors. . .
I've seen four 3D imaging devices related to OpenNI. The oldest was a PrimeSense reference design, which is hard to even find mention of any more. Then there was the Kinect. And then PrimeSense teamed up with Asus to come up with the Xtion Pro and Xtion Pro Live.
The driver for Xtion comes direct from the OpenNI website. There used to be a sensorkinect file for Kinect on the same site, lately the installation instructions I've seen for Kinect refer to Avin2.
As for bad java design. . .heh. . .That's for you big-headed programmer types to grumble over. Us shaved-apes are just looking for the right button.
Found An Easy Button for Point Cloud !
So this is with nothing downloaded on a windows 7 64 bit machine.
I'm starting to update the repo with the (bad design) platform-specific jars... I think that was part of my confusion as to why it did not work on other platforms... I assumed the jar was the same...
Initially I was not sure how the repo mechanism would handle jars with the same names but different platforms... Turns out it supports this very easily...
Now, just got to update the other platforms in the repo...
(My driver was previously installed & the repo does not include the functionality of installing the driver)
but it's progress...
Just some Driver details ..
Just some Driver details .. this is what happens when you just plug the kinect in on Windows 7 64 bit..
My system is so dirty I couldn't tell you whose driver this is... but it works. I'll do more driver details later...
Woohooo !!!
OpenNI working from INSIDE MRL GestureRecognition service !!!
Next will be packaging up the config and putting it in a MRL GUI Frame.
Split Display From Service.
One of MRL's design mantras is:
The Service is not the Display, and
The Display is just another Service
What this means is that a Service such as GestureRecognition is not responsible for displaying the results of gestures. Instead it publishes data about gesture recognition so other services can consume it. One of the consumers could be a display. There are several advantages to the strategy.
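A minimal sketch of that publish/consume split (this is an illustration of the pattern, not MRL's actual messaging API - class and method names here are made up):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// The recognizer publishes gesture data and knows nothing about displays.
// Any number of consumers - a display, a logger, a motor controller -
// subscribe and react independently.
public class GesturePublisher {
    private final List<Consumer<String>> subscribers = new ArrayList<>();

    public void subscribe(Consumer<String> s) {
        subscribers.add(s);
    }

    // Called by the recognition loop whenever a gesture is detected.
    public void publishGesture(String gesture) {
        for (Consumer<String> s : subscribers) {
            s.accept(gesture);
        }
    }

    public static void main(String[] args) {
        GesturePublisher gr = new GesturePublisher();
        gr.subscribe(g -> System.out.println("display draws: " + g));
        gr.subscribe(g -> System.out.println("robot reacts to: " + g));
        gr.publishGesture("wave");
    }
}
```

The service stays headless: drop the display subscriber and gesture recognition still runs, which is exactly the point of the mantra.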
So here it is split & put into MRL - I've lost the User tracking :P ... but ya gotta break eggs to make cake.
capture / stop capture are the only supported buttons at the moment
SCORE!!!
Works like a charm with my Xtion Pro Live. Had a bit of an update issue, but yeah, here it is.
Floor shot
Nice clean floor gradient.. rug is on the left - frame rate is really great (30 fps?) - so now I'm thinking about what OpenCV filters this should go through to do navigation & obstacle avoidance - it's still got black blobs where depth info is not available - they appear and disappear - since the rug is textured, it always has some black in it (reflected IR doesn't get back)
Now, to work out an algorithm: the camera is mounted at a known height above the ground, pointed at the floor at a known angle. We "expect" the floor to be a gradient plane. I wonder if OpenCV's findContours would work well to segment this image...
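The "expected gradient plane" idea can be sketched with some back-of-the-envelope trigonometry: a ray tilted an angle theta below horizontal, from a camera at height h, should hit the floor at range h / sin(theta). A measured depth much shorter than that suggests an obstacle. The numbers below are made-up examples, not calibration data:

```java
// Expected floor depth vs. measured depth for obstacle detection.
// Assumes a flat floor and a camera at a known height and pitch.
public class FloorModel {
    // Range at which a ray tilted angleBelowHoriz radians below
    // horizontal intersects the floor plane.
    static double expectedFloorRange(double cameraHeightM, double angleBelowHorizRad) {
        return cameraHeightM / Math.sin(angleBelowHorizRad);
    }

    // A reading noticeably shorter than the floor prediction means
    // something is sticking up out of the floor plane.
    static boolean looksLikeObstacle(double measuredM, double expectedM, double toleranceM) {
        return measuredM < expectedM - toleranceM;
    }

    public static void main(String[] args) {
        double expected = expectedFloorRange(0.5, Math.toRadians(30)); // 1.0 m
        System.out.printf("expected floor range: %.2f m%n", expected);
        System.out.println("0.6 m reading -> obstacle? "
                + looksLikeObstacle(0.6, expected, 0.1));
    }
}
```

Per-pixel, each image row maps to its own angle below horizontal, so the same test produces the smooth top-to-bottom gradient seen in the floor shot - and the black no-data blobs would simply be skipped.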
Once segmented, we could begin to find important objects, like:
Brekel & Skanect
Thought I would add these 2 to your exhaustive list.
Brekel Kinect .... (OpenNI & NITE based) & free
Yes that is me and the GUI is pretty impressive.
Skanect (standard Microsoft Kinect drivers) is not free; however, there is a demo version.
It produces from its point cloud data a 3D mesh that can be saved as an STL, .OBJ, .PLY, or .VRML file (the demo version gives a low-res mesh).
It also does a good job of mapping colours to your mesh.
Wow .. that's impressive ..
Wow .. that's impressive .. Nice reconnaissance..
Impressive products...
Nice that they are free, but still have not seen an open source point cloud generator... but maybe my Google-Fu is off