Position InMoov In Front Of Door

Using a marker, Processing, Kinect4WinSDK, nyar4psg and java.net.URL on my PC, I wrote some code that positions the InMoov at a defined spot in front of a door.

As the Kinect's field of view is a bit limited, I had to fix the marker 20 cm above the door handle axis.
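Since the marker sits a fixed 20 cm above the handle axis, the handle position follows from the detected marker position by a constant vertical offset. A minimal sketch of that idea (function and axis names are my own assumptions, only the 20 cm offset comes from the setup above):

```python
# Derive the door-handle position from the detected marker position.
# The marker is mounted 20 cm above the handle axis, so the handle is
# simply the marker position shifted down along the y axis (y up).

MARKER_ABOVE_HANDLE_CM = 20.0  # vertical offset between marker and handle

def handle_position(marker_x_cm, marker_y_cm, marker_z_cm):
    """Return (x, y, z) of the handle axis in the same camera frame
    as the marker coordinates."""
    return (marker_x_cm, marker_y_cm - MARKER_ABOVE_HANDLE_CM, marker_z_cm)
```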

Unfortunately, in my case I had to do the movements on my own (an omniwheel base like Robyn's is still on the wishlist for my Marvin). The distance to the door is 55 to 60 cm and could be more precise with a nicely positionable base.

The intention is that with the MRL IK service the bot will be able to open the door afterwards (be patient, calamity will first have to work on that).

Might be of interest to some that the positioning commands you hear in the video come from the Processing app running on my PC and are picked up by the acapela speech service on my bot-controlling odroid-xu4 running MRL, by means of: "do this now"
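This kind of relay from a PC app to the bot can be done with a plain HTTP GET against MRL's REST interface, where services are reachable as /api/service/&lt;name&gt;/&lt;method&gt;/&lt;args&gt;. A hedged Python sketch of the idea; the host, port and mouth-service name are placeholders, not the actual setup:

```python
import urllib.parse
import urllib.request

def mrl_speak_url(host, port, service, text):
    """Build the MRL REST URL that asks a speech service to speak `text`.
    Assumes MRL's generic /api/service/<name>/<method>/<args> URL scheme."""
    return "http://%s:%d/api/service/%s/speak/%s" % (
        host, port, service, urllib.parse.quote(text))

def say_on_bot(text, host="odroid-xu4", port=8888, service="i01.mouth"):
    # Hypothetical host/port/service names; adjust to your MRL instance.
    urllib.request.urlopen(mrl_speak_url(host, port, service, text))

# e.g. say_on_bot("do this now")
```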

It is also possible to call an MRL python function from a local or external browser with e.g.
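Along the same lines, a function defined in MRL's python service can be triggered by a URL that any browser can open. A small helper to build such a URL; the exact path layout is an assumption based on MRL's generic REST scheme:

```python
import urllib.parse

def mrl_exec_url(host, port, function_call):
    """URL that asks MRL's python service to execute `function_call`.
    Opening this URL in a browser runs the function on the bot."""
    return "http://%s:%d/api/service/python/exec/%s" % (
        host, port, urllib.parse.quote(function_call, safe=""))

# e.g. paste mrl_exec_url("odroid-xu4", 8888, "openDoor()") into a browser
```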

You find the video here


and the code here



calamity:

Nice work juerg!!!


If I understand correctly, Marvin wants to have the marker centered in the Kinect's field of view, is that right?

From that position, I think Marvin can open the door with where I'm at on the IK service.

Next weekend I will probably be able to test the IK service physically, as it seems to work well virtually. But my computer running my InMoov is away for a few days.

I'm sure it won't be hard to have a mobile base position itself the right way from the data you get from the Kinect.

I think we are not far from seeing Marvin open that door :)

juerg:

Thanks calamity for your response

Right, currently InMoov would command me to position him with the Kinect camera x-centered in front of the door at a distance of 55 to 60 cm.
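The positioning commands can be derived from just two numbers: the marker's horizontal offset from the image center and the measured distance. A hypothetical sketch of that decision logic; the tolerance values are illustrative, only the 55 to 60 cm window comes from the setup described above:

```python
def positioning_command(marker_x_cm, distance_cm,
                        x_tol_cm=3.0, min_dist_cm=55.0, max_dist_cm=60.0):
    """Turn the marker's horizontal offset and measured distance into a
    spoken movement command. First center on the marker, then close in
    until the distance falls into the 55..60 cm window."""
    if marker_x_cm < -x_tol_cm:
        return "move left"
    if marker_x_cm > x_tol_cm:
        return "move right"
    if distance_cm > max_dist_cm:
        return "move forward"
    if distance_cm < min_dist_cm:
        return "move backward"
    return "stop, position reached"
```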

I thought about my approach and I think it would be worth giving the head camera a try, as the Kinect V1 depth info is not very good at close range and the cam resolution is only 640*480 in an unmovable setup. Also, the nyar4psg library is mostly documented in Japanese, which makes it hard to make full use of it.

I will try to use OpenCV and the ArUco library. This will also allow the marker to tell whether the handle is to the left or the right and whether the door opens toward the bot or away from it. It would also allow printing markers for different doors.
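Since each ArUco marker carries a numeric ID (returned by OpenCV's cv2.aruco detector), the door properties could be encoded in the ID itself, e.g. two low bits for handle side and opening direction and the remaining bits for the door number. A hypothetical encoding scheme, decoded in plain Python:

```python
# Hypothetical scheme: bit 0 = handle side, bit 1 = opening direction,
# remaining bits = door number. The ID would come from OpenCV's ArUco
# detection; the decoding itself needs no OpenCV.

def decode_door_marker(marker_id):
    return {
        "door": marker_id >> 2,
        "handle": "right" if marker_id & 0b01 else "left",
        "opens": "away" if marker_id & 0b10 else "toward",
    }
```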

I imagine it will be a challenge to get the fingers in between the door and the handle and to get feedback about the actual position. Maybe a mechanism like the one shown in Alan Timm's InMoov blog could help: as I understand it, the Arduino would know its current position, and that could be compared with the wanted position to maybe give it a "second try"?

Nice to hear that you are still so optimistic about this task!

hairygael:

Great work Juerg!

Didn't know InMoov could be SO stressing when he needs to get to the door. A bit like an old man that needs help to get to the toilet!


I posted your video on my Google+ page.