Extract Blender python gestures to be used with virtual InMoov

I’m wondering how Gareth, or anyone else, would proceed to use the Blender file below along with virtual InMoov to create human-like gestures.

The C3D add-on for Blender can be enabled in the preferences.
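For scripting, the add-on can presumably also be enabled from Blender's Python console rather than clicking through the preferences. A minimal sketch, assuming the add-on module is named "io_anim_c3d" (check the Add-ons panel for the real name):

```python
# Sketch: enabling a C3D importer from Blender's Python console.
# The module name "io_anim_c3d" is an assumption; the actual name
# depends on which C3D add-on you installed.
import importlib.util

def enable_c3d_addon():
    if importlib.util.find_spec("bpy") is None:
        return False  # not running inside Blender
    import bpy
    bpy.ops.preferences.addon_enable(module="io_anim_c3d")
    return True

enabled = enable_c3d_addon()
```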


If we could use these kinds of files along with the rigs of virtual InMoov, it would really help to create the first walking gestures, and to see which points are most important to respect when designing the mechanics.


GroG's picture

It's a binary file, although

It's a binary file, although it looks very well documented, open source, and they have many tools -


Is this someone's motion capture of a real person, Gael?

This is what the file looks like in one of the viewer tools

Which looks like coordinates mapped over time/frame
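That "coordinates over time/frame" layout starts with a small fixed header in the binary file. A sketch following the published c3d.org header layout; the marker count, frame range, and rate below are invented values, and this is nowhere near a full parser:

```python
# Minimal sketch of the C3D header layout (per the public c3d.org spec):
# 16-bit words, little-endian for Intel-format files. We build a fake
# header for a capture of 41 markers over frames 1..500 at 120 Hz,
# then parse it back the way a viewer tool would.
import struct

def make_header(n_points, first_frame, last_frame, rate):
    return struct.pack(
        "<BBHHHHHfHHf",
        2,            # parameter section starts at block 2
        0x50,         # magic byte identifying a C3D file
        n_points,     # number of 3D points (markers) per frame
        0,            # analog measurements per frame
        first_frame,  # first frame of data
        last_frame,   # last frame of data
        10,           # max interpolation gap
        -1.0,         # scale factor (negative => floating-point data)
        11,           # block where 3D data starts
        0,            # analog samples per frame
        rate,         # frame rate in Hz
    )

def parse_header(raw):
    (_, magic, n_points, _, first, last,
     _, scale, _, _, rate) = struct.unpack_from("<BBHHHHHfHHf", raw)
    assert magic == 0x50, "not a C3D file"
    return {"points": n_points, "frames": (first, last),
            "float_data": scale < 0, "rate": rate}

info = parse_header(make_header(41, 1, 500, 120.0))
print(info)  # {'points': 41, 'frames': (1, 500), 'float_data': True, 'rate': 120.0}
```

After the header come the parameter blocks (marker names, units) and then the per-frame XYZ coordinates that the viewer plots.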

hairygael's picture

Hello Grog, I found these

Hello Grog,

I found these files in the Open Source directory of the iCub robot; there is a whole bunch for studying human gestures, walking, and climbing stairs.

Have you tried playing it in Blender? I'm pretty sure we could use the rigging in some way along with these gestures.

Mats's picture

Some useful links


The C3D file contains motion capture data. You can find 4 GB of motion data and a lot more information here:


I found a paper describing how to use this type of data to create animations in Blender.


The C3D files contain the "raw" data of the motion capture. For the InMoov robot I think other file formats could be even better, like AMC files, which contain joint angles instead of the raw data. Joint angles correspond better to the servo angles used in InMoov.
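The conversion Mats describes, from raw marker positions to an AMC-style joint angle, is essentially vector geometry: the angle at a joint is the angle between the two limb segments meeting there. A sketch with made-up marker coordinates, not taken from any real capture or exporter:

```python
# Sketch: turning "raw" marker positions into a joint angle, which is
# what an AMC-style file stores and what maps onto an InMoov servo.
# The hip/knee/ankle coordinates below are hypothetical (millimetres).
import math

def joint_angle(a, b, c):
    """Angle at b (degrees) formed by segments b->a and b->c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

hip, knee, ankle = (0, 0, 900), (0, 60, 450), (0, 0, 0)
print(round(joint_angle(hip, knee, ankle), 1))  # a nearly straight knee
```

Run per frame over the three markers of each joint, this gives exactly the angle-versus-time curves a servo sequence needs.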


kwatters's picture

Walking Gaits and Inverted Pendulums

I think to get started making the InMoov walk, we should focus on 2 things.

1. InverseKinematics for the foot & the sequence of positions that the foot needs to be in (we should be able to do this now). https://en.wikipedia.org/wiki/Gait_(human)

2. The inverted pendulum problem to model the InMoov balancing. The idea is that if you can model the InMoov as an inverted pendulum, you can provide balance. https://en.wikipedia.org/wiki/Inverted_pendulum
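For point 1, the leg in the sagittal plane is a classic two-link IK problem solvable in closed form with the law of cosines. A sketch; the link lengths and frame convention are assumptions, not InMoov dimensions:

```python
# Sketch: planar two-link IK for a leg (thigh length l1, shin length l2),
# solving the hip and knee angles that place the ankle at (x, y) relative
# to the hip. Lengths here are made-up numbers, not InMoov measurements.
import math

def leg_ik(x, y, l1, l2):
    d2 = x * x + y * y
    if math.sqrt(d2) > l1 + l2:
        raise ValueError("target out of reach")
    # law of cosines gives the knee bend
    cos_knee = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    knee = math.acos(max(-1.0, min(1.0, cos_knee)))
    # hip angle = direction to target minus the offset from the bent knee
    hip = math.atan2(y, x) - math.atan2(l2 * math.sin(knee),
                                        l1 + l2 * math.cos(knee))
    return math.degrees(hip), math.degrees(knee)

# ankle 0.75 m straight below the hip, with 0.4 m thigh and shin
hip_deg, knee_deg = leg_ik(0.0, -0.75, 0.4, 0.4)
```

Feeding a gait sequence of foot positions through this gives the hip/knee angle trajectories for each step.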
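For point 2, the balance idea can be sketched as an inverted pendulum with a stabilizing controller at the "ankle". Everything below is illustrative: unit mass, a guessed hip height, and PD gains that are not tuned for any real InMoov:

```python
# Sketch: inverted pendulum balance with a PD controller at the ankle.
# State is (angle from vertical, angular velocity); gains and dimensions
# are hypothetical, not measured from InMoov.
import math

G, L, M = 9.81, 0.9, 1.0   # gravity, hip height (m), mass (kg) - assumed
KP, KD = 60.0, 12.0        # hypothetical PD gains
DT = 0.001                 # integration step (s)

theta, omega = 0.1, 0.0    # start 0.1 rad away from vertical
for _ in range(5000):      # simulate 5 seconds, semi-implicit Euler
    torque = -KP * theta - KD * omega
    alpha = (G / L) * math.sin(theta) + torque / (M * L * L)
    omega += alpha * DT
    theta += omega * DT

print(abs(theta) < 0.01)   # controller has pulled it back near vertical
```

Without the torque term the pendulum falls; the PD feedback is the simplest version of the balance controller kwatters describes.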



Gareth's picture

Let's see what we can do...

I am only familiar with the motion capture BVH (Biovision Hierarchical Data) format ... I used it once for a project using the old "Poser" software (now known as DAZ3D). It may be worth me checking its BVH system again.

C3D is a standard optical marker-based motion capture format (targeted at open source systems).

I loaded up your test file above and, after scaling the markers .... indeed a walking biped comes to life.

.... data is very detailed down to the pendulum action of the arms... nice...very nice..

By selecting the graph editor in Blender, the data at each joint can be seen. In this case I have selected the right knee data.

We have data and VinMoov so .....(Gareth taps chin)..... let's see if we can join the two...