Hello everyone!
It's been almost two years now since I bought an Orbbec Astra to replace the Kinect 360 on my InMoov robot.
Why? Because it is smaller, it can "see" at a closer range and, most of all, it is powered via the USB 2 plug.
No more huge power cords and a dedicated power supply just to run a Kinect.
Now that being said, how do I make this thing work in MyRobotLab?
So what I did was copy the Astra drivers directly into two directories:
C:\myrobotlab\myrobotlab.1.1.74\libraries\native\OpenNI2\Drivers
C:\myrobotlab\myrobotlab.1.1.74\libraries\native\x86.32.windows\win32\OpenNI2\Drivers
I am guessing only one directory should be enough, but since I don't know precisely which one is used, I copied into both.
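For anyone who wants to script this step, here is a minimal sketch in plain Python that does the same copy (the source path is hypothetical, point it at wherever you unzipped the Astra driver package):

```python
import os
import shutil

# Hypothetical source folder: wherever you unzipped the Astra OpenNI2 driver package
src = r"C:\Downloads\AstraOpenNI2\Drivers"

# The two MRL native driver folders from above (adjust the version number to your install)
targets = [
    r"C:\myrobotlab\myrobotlab.1.1.74\libraries\native\OpenNI2\Drivers",
    r"C:\myrobotlab\myrobotlab.1.1.74\libraries\native\x86.32.windows\win32\OpenNI2\Drivers",
]

for target in targets:
    os.makedirs(target, exist_ok=True)  # create the folder if it is missing
    for name in os.listdir(src):
        path = os.path.join(src, name)
        if os.path.isfile(path):  # copy only the driver files, skip subfolders
            shutil.copy2(path, os.path.join(target, name))
            print("copied %s -> %s" % (name, target))
```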
Oh magic!! The camera is recognized by MRL and captures the depth image.
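For reference, starting the capture in MRL is only a couple of lines in the Python service. This is a sketch from my notes; the "OpenNi" service name and the capture() method should be double-checked against your MRL version:

```python
# Run inside MyRobotLab's Python (Jython) service, where Runtime is already available.
# Service/method names are from my notes; verify them for your MRL version.
openni = Runtime.createAndStart("openni", "OpenNi")

# start streaming the depth image
openni.capture()
```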
Now the next step is to get the camera to track the body skeleton.
And at this point I am stuck. I tried adding some of the ASTRA SDK jar files into the jar directory, but it is like looking for a needle in a haystack, especially when you really don't know what should be done.
Does it need its own body tracking, or can it use the same body tracking as the Kinect 360? I have no idea.
Here are the links to the ASTRA body track SDK and drivers:
https://orbbec3d.com/bodytracking-sdk/
https://www.dropbox.com/s/oyw9pcfrf0zgck1/Driver_Windows.zip?dl=0
Some GitHub code I found: https://github.com/shinselrobots/astra_body_tracker
UPDATE: some more code I found: https://github.com/robotBing/skeleton_follow/blob/master/src/skeleton_follow.cpp
Currently, to get the Orbbec Astra or Kinect One working as a depth sensor, I just copy/paste the drivers into the directory libraries\native\OpenNI2\Drivers.
Any good help or suggestions are FULLY welcome.
:)
UPDATE 18/06/19: The Intel RealSense seems to be a good choice as well. I do not have one, but many users are talking about it and wondering whether it might work with the next MRL release.
Here is a link to a post on the InMoov site about some questions and tests recently done on version Nixie 1.1.173.
UPDATE 05/10/2019: found something on GitHub to get the skeleton on the Kinect V2, but not the Orbbec Astra: https://github.com/pierrep/ofxNI2
Wow Gael .. nice, the depth works!
kwatters was looking into OpenPose, which looked very promising (and open source :)
I might upgrade from my 10-year-old Kinects ...
Curious why you chose the Orbbec Astra vs the Intel RealSense?
If the depth is represented in a recognizable way (which appears to be the case), the same NiTE software used for the Kinect "should" work for the Astra.
Are you still not able to get a skeleton to appear?
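If NiTE does pick it up, starting the skeleton from an MRL Python script should look the same as for the Kinect. A sketch, with the method name taken from the old InMoov scripts (so treat it as an assumption):

```python
# Sketch only: assumes the OpenNi service exposes user tracking for the Astra
# the same way it does for the Kinect (method name from old InMoov scripts).
openni = Runtime.createAndStart("openni", "OpenNi")
openni.startUserTracking()  # should overlay the NiTE skeleton on the depth image
```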
Nope, I do not get the skeleton overlaid on the body with the Orbbec Astra. Mmmh, at this point I seriously do not know what to do.
But I noticed something when using the Kinect 360 in version 1.1.77: the first time you start skeleton tracking, it works properly. But if you stop it and restart skeleton tracking, it doesn't work. It just won't start the skeleton again.
Do you have a similar behavior?
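One workaround I am thinking of trying (untested, just an idea): instead of stopping and restarting tracking on the same service, release the whole OpenNi service and recreate it between captures:

```python
# Untested idea for the "skeleton won't restart" behavior in 1.1.77:
# tear the service down completely and start a fresh instance,
# rather than calling stop/start on the same one.
openni = Runtime.createAndStart("openni", "OpenNi")
openni.startUserTracking()

# ... capture the first gesture here ...

Runtime.release("openni")                            # fully release the service
openni = Runtime.createAndStart("openni", "OpenNi")  # fresh instance
openni.startUserTracking()                           # try the skeleton again
```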
OpenPose (or the Kinect skeleton) is for sure the best current solution for teaching gestures to the robot.
Especially if you want to create human-like gestures. Ideally, the capture would also record velocity.
I found some more code:
https://github.com/robotBing/skeleton_follow/blob/master/src/skeleton_follow.cpp
In the meantime, because the Orbbec is not doing the skeleton,
I tried installing the Kinect One (v2 / NUI) drivers into MyRobotLab's native OpenNI directory, with success!
The quality of the depth image is better than with the Xbox 360 Kinect, but it requires more resources from the PC.
But it behaves the same as the Orbbec: I can see the depth very clearly, but the skeleton will not start.
No luck.
:(
I'm curious whether this could also work with the Intel RealSense F200?