Hi All,

I am part of a team building an InMoov for a school project. We are in the process of creating a service that uses the JMonkey (which uses LWJGL) game engine to provide a GUI for InMoov. Right now we have a crude 3D model of an InMoov robot. You can move the position of InMoov's limbs using key presses, which also allows you to control InMoov in real time. You can then take snapshots of InMoov in different poses and play them back to create a "movie" for InMoov to follow. We are also in the process of taking the Kinect data and using it to set the positions of our 3D model's limbs.

I am not a great Java programmer, so there are definitely a few bugs that we know about, and probably more that we haven't found. Hope our service can find a home in MyRobotLab and be of use to InMoovers everywhere.
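For the technically curious, the snapshot/"movie" part boils down to something like this (a simplified sketch, not our actual service code; the names like PoseRecorder are just illustrative):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class PoseRecorder {
    // Current angle of every servo, updated as key presses move the joints.
    private final Map<String, Double> currentAngles = new HashMap<>();
    // Each frame of the "movie" is one snapshot of all servo angles.
    private final List<Map<String, Double>> frames = new ArrayList<>();

    // Called whenever a key press moves a joint.
    public void setAngle(String servo, double degrees) {
        currentAngles.put(servo, degrees);
    }

    // Snapshot the current pose as one frame.
    public void snapshot() {
        frames.add(new HashMap<>(currentAngles));
    }

    // Replay all recorded frames in order, pausing between them.
    public void play(long frameDelayMs) throws InterruptedException {
        for (Map<String, Double> frame : frames) {
            frame.forEach((servo, angle) -> {
                // here the real service would command the servo / rotate the bone
                System.out.println(servo + " -> " + angle);
            });
            Thread.sleep(frameDelayMs);
        }
    }
}
```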

GroG

10 years 4 months ago

Hello And Welcome CamosunBot !
That looks Great !  Maybe you can post some video too sometime ;)
I'm very impressed.  It's great to see something this well developed just pop up .. I had no idea.  Let me know if you need any help.

So you know, there is another completed GUI project which uses JMonkey, called SEAR, although it was not integrated with InMoov. The person who wrote it did his thesis on it, and I believe it worked very well. The premise behind it was to connect middleware (MRL or ROS) to the simulator, then run a variety of simulations to evolve control systems, then simply switch the control system over to "real" hardware.

I am interested in a fully immersive brain state for InMoov, where its representation of itself is similar to your 3D model. Data would come in through sensors like the Kinect, PIR, and camera - and the simulator/display would become populated with "objects" and their correct spatial locations. That way InMoov could run through his own "thought" simulations to achieve goals .. like picking up a glass of water.

When I get done with this laborious move from Google to GitHub - I will merge in SEAR and maybe it will be useful to you.

Keep us posted, and Thanks !!!

hairygael

10 years 4 months ago

Very cool 3D GUI! This seems like a great solution for pre-creating gestures before launching them on the real InMoov.

I would love to test that!  I love new toys.

Are the joints using the actual joint positions of InMoov, or do you still need to do that? If yes, did you use the joint measurements posted on MRL?

CamosunBot

10 years 4 months ago

Thank you both as well. InMoov and MRL are great projects, and we are hoping to give back.

Right now the movements of the 3D model are based on the rotation of bones in the model, and the rotations of the bones correspond to the angles of the servos.
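In JMonkey (jME3) terms, that mapping looks roughly like this (a simplified sketch; the bone name and axis here are illustrative, and the bone has to be taken under user control before you can set its transforms):

```java
import com.jme3.animation.Bone;
import com.jme3.animation.Skeleton;
import com.jme3.math.FastMath;
import com.jme3.math.Quaternion;
import com.jme3.math.Vector3f;

public class BoneMapper {

    // Rotate one bone about a single axis to match a servo angle in degrees.
    // The transforms are applied as offsets relative to the bone's bind pose.
    public static void applyServoAngle(Skeleton skeleton, String boneName,
                                       Vector3f axis, float degrees) {
        Bone bone = skeleton.getBone(boneName);
        bone.setUserControl(true); // take the bone away from the animation system
        Quaternion rot = new Quaternion().fromAngleAxis(
                degrees * FastMath.DEG_TO_RAD, axis);
        bone.setUserTransforms(Vector3f.ZERO, rot, Vector3f.UNIT_XYZ);
    }
}
```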

The lengths of the limbs don't correspond to the real-world lengths of InMoov's limbs, but I would like to increase the realism of the service by getting a better model of InMoov. Working in Blender is not one of my stronger skills.

I would definitely be interested in seeing how my service can work with SEAR.

The idea for my service came partly from thinking about a different way to create gestures, and partly from seeing if I could take a step toward an immersive representation of InMoov.

I have embedded a little video of the service in action. It has a long way to go, but let me know what you think. 

Thanks again for making our group project possible.

Alessandruino

10 years 4 months ago

In reply to CamosunBot

I would like to play with it... Would it be possible to have the dependencies and the service, so I can play with it in the debugger? :D

CamosunBot

10 years 4 months ago

JMonkey has a SkeletonDebugger option that shows the bone structure of a model; that is what the green lines are.
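If anyone wants the same overlay, it is the standard jME3 pattern, roughly like this (assuming the loaded model carries an AnimControl, and assetManager comes from the SimpleApplication):

```java
import com.jme3.animation.AnimControl;
import com.jme3.asset.AssetManager;
import com.jme3.material.Material;
import com.jme3.math.ColorRGBA;
import com.jme3.scene.Node;
import com.jme3.scene.debug.SkeletonDebugger;

public class SkeletonOverlay {

    // Attach the green wireframe skeleton on top of a loaded model.
    public static void show(Node model, AssetManager assetManager) {
        AnimControl control = model.getControl(AnimControl.class);
        SkeletonDebugger debugger =
                new SkeletonDebugger("skeleton", control.getSkeleton());
        Material mat = new Material(assetManager,
                "Common/MatDefs/Misc/Unshaded.j3md");
        mat.setColor("Color", ColorRGBA.Green);
        mat.getAdditionalRenderState().setDepthTest(false); // draw through the mesh
        debugger.setMaterial(mat);
        model.attachChild(debugger);
    }
}
```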

Instead of having one bone from the torso to the shoulder (like how it is drawn in OpenNI), I modeled it using a couple of smaller bones.

The first bone after the torso represents the omoplate, and it has only one axis of rotation, which corresponds to the omoplate servo.

The shoulder comes next, and rotates about one axis corresponding to the shoulder servo. And so on down the line... 
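So the arm ends up as a chain with one bone per servo, something like this (the axes here are placeholders, not measured values; the real ones depend on how the model was rigged in Blender):

```java
import com.jme3.math.Vector3f;
import java.util.LinkedHashMap;
import java.util.Map;

public class ArmChain {
    // One model bone per InMoov servo, each constrained to a single axis.
    static class BoneAxis {
        final String boneName;
        final Vector3f axis;
        BoneAxis(String boneName, Vector3f axis) {
            this.boneName = boneName;
            this.axis = axis;
        }
    }

    // Ordered from the torso outward; bone names and axes are illustrative.
    static final Map<String, BoneAxis> RIGHT_ARM = new LinkedHashMap<>();
    static {
        RIGHT_ARM.put("omoplate", new BoneAxis("omoplateBone", Vector3f.UNIT_Z));
        RIGHT_ARM.put("shoulder", new BoneAxis("shoulderBone", Vector3f.UNIT_X));
        RIGHT_ARM.put("rotate",   new BoneAxis("rotateBone",   Vector3f.UNIT_Y));
        RIGHT_ARM.put("bicep",    new BoneAxis("bicepBone",    Vector3f.UNIT_X));
    }
}
```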

I don't think my model represents the bicep accurately, and I am having trouble getting the thumb right. 

Hello Again CamosunBot,

It seems like it has been a long time; I was very curious about the state of your project. I am interested in starting a simulator and was wondering if you would be interested in collaborating.

Hope things are well with you,

Regards,

GroG