Integrated Movement Demo using VinMoov

This is the script used for the demo; run it after my InMoov script.


# file: home/Calamity/imdemo.py
# remove leftovers from a previous run before starting the demo
i01.integratedMovement.removeObject("pole")
i01.integratedMovement.removeAi("kinect",i01.integratedMovement.Ai.AVOID_COLLISION)
i01.rest()
sleep(3)
# move the right hand to a point in space (x,y,z) using inverse kinematics
i01.integratedMovement.moveTo("rightArm",-300,500,400)
mouth.speakBlocking("Hello, I am vinmoov")
sleep(3)
i01.rest()
mouth.speakBlocking("I want to talk to you about a new myrobotlab service called Integrated Movement")
mouth.speakBlocking("With this service, I can move my hand to a point in space by using inverse kinematics")
i01.integratedMovement.moveTo("leftArm",500,500,600)
mouth.speakBlocking("you don't have to specify how I move each of my joints. You just tell me where to move")
i01.rest()
# add a virtual pole to the collision environment
i01.integratedMovement.addObject(-200,500, -1000,-200,500,1000,"pole",30,True)
mouth.speakBlocking("Look! Something appeared in my field of view. It's a pole, so I can do some pole dancing.")
mouth.speakBlocking("But that's for another private video called Sexy Dancing Robots.")
mouth.speakBlocking("The Integrated Movement service allows me to add objects to my surroundings so I can interact with them")
# move the right hand to the side of the known "pole" object
i01.integratedMovement.moveTo("rightArm","pole",i01.integratedMovement.ObjectPointLocation.CENTER_SIDE)
mouth.speakBlocking("I can move my hand close to that item. I can also see objects in my surroundings using the kinect I have in my belly.")
i01.integratedMovement.moveTo("rightArm",-600,600,400)
sleep(3)
# slowly rotate the torso so the arm comes into contact with the pole
i01.torso.midStom.setVelocity(3)
i01.torso.midStom.moveTo(30)
mouth.speakBlocking("I can also react if my arm collides with an object I know of")
sleep(10)
i01.rest()
mouth.speakBlocking("I can also try to control where my center of gravity is. When I am in this position, my center of gravity is right in the middle of my belly")
mouth.speakBlocking("So I'm standing in a stable position. But if I raise my arm, my center of gravity will shift away, and if it gets too far from my center point")
mouth.speakBlocking("I may tip over and fall if my base is not anchored strongly enough")
mouth.speakBlocking("If one of my arms is set to keep balance in the Integrated Movement service, I can adjust my position to keep my center of gravity close to my center point")
mouth.speakBlocking("that way, I will be able to stand in a more stable position")
# enable balance keeping on the right arm, then shift the left arm and torso;
# the right arm will adjust to compensate for the center-of-gravity shift
i01.integratedMovement.setAi("rightArm",i01.integratedMovement.Ai.KEEP_BALANCE)
i01.leftArm.omoplate.moveTo(70)
i01.leftArm.rotate.moveTo(180)
i01.leftArm.bicep.moveTo(40)
i01.torso.topStom.moveTo(80)
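The balance part of the demo keeps the robot's center of gravity near a center point. As a rough illustration of the underlying idea (a toy sketch, not the IntegratedMovement service's actual code — the part names, masses, and positions below are made up), the center of gravity is just the mass-weighted average of the part positions:

```python
# Toy sketch of a center-of-gravity computation, NOT the IntegratedMovement code.
# Parts are (name, mass_kg, (x, y, z)) tuples with made-up values.
parts = [
    ("torso",    4.0, (0.0,    0.0, 500.0)),
    ("leftArm",  1.5, (-300.0, 0.0, 600.0)),
    ("rightArm", 1.5, (300.0,  0.0, 600.0)),
]

def center_of_gravity(parts):
    total_mass = sum(m for _, m, _ in parts)
    return tuple(sum(m * p[axis] for _, m, p in parts) / total_mass
                 for axis in range(3))

cog = center_of_gravity(parts)
# with the arms placed symmetrically, the x component stays at 0,
# i.e. the center of gravity is directly over the center point
print(cog)
```

Raising one arm moves its position entry, which shifts the weighted average away from x = 0; a KEEP_BALANCE arm would then move to pull the average back.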
GroG:

Ahhaah  !

Brilliant Calamity !

It's great to see it all come together!

Smooth ~120 fps ... you have IK & IM working great!

What about an official VirtualInMoov service ? 

Your script looks interesting - but it's missing the details of the InMoov creation. I think it would help many people if we packaged all the pyrobotlab/service/ scripts during the build and had them as examples.

VirtualInMoov could start the VirtualArduinos and attach JMonkey - and this could be the core example.  

Great work Calamity !  I'm excited to see the pole dancing too.

calamity:

Thanks Grog

I don't think we actually need a VirtualInMoov service. The magic of going purely virtual or not can be done by using the VirtualArduino or Arduino service.

I did not add my full InMoov script because I think it can work with any InMoov script. Launching IntegratedMovement and VinMoov has been added to the InMoov service; they can be started with inMoov.startIntegratedMovement() and inMoov.startVinMoov(). Both services will use the limits and settings from the InMoov service.

IntegratedMovement (and the jme3 app) can also be used standalone for other robot models that don't use the InMoov service, but that requires setting up all the needed parameters (DH parameters, collision objects, the mass of each part, the jme3 3D model, etc.). I still have to write an example script for that.
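To give an idea of what the DH parameters describe (this is a generic textbook sketch with a made-up two-link arm, not InMoov's actual DH table or the service's code): each link is a (d, theta, a, alpha) row, each row maps to a 4x4 transform, and chaining the transforms gives the hand position that the IK solver works backwards from.

```python
import math

# Hedged sketch of forward kinematics from Denavit-Hartenberg parameters.
# The arm below is a made-up 2-link planar arm, not InMoov's real values.

def dh_matrix(d, theta, a, alpha):
    """Standard DH transform for one link."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [[ct, -st * ca,  st * sa, a * ct],
            [st,  ct * ca, -ct * sa, a * st],
            [0.0,      sa,       ca,      d],
            [0.0,     0.0,      0.0,    1.0]]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def end_effector(dh_rows):
    T = [[float(i == j) for j in range(4)] for i in range(4)]  # identity
    for row in dh_rows:
        T = mat_mul(T, dh_matrix(*row))
    return T[0][3], T[1][3], T[2][3]  # x, y, z of the hand

# two links of length 1; first joint at 90 degrees, second at 0
arm = [(0.0, math.pi / 2, 1.0, 0.0),
       (0.0, 0.0,         1.0, 0.0)]
print(end_effector(arm))  # hand ends up at roughly (0, 2, 0)
```

Inverse kinematics is the reverse problem: given a target (x, y, z), find joint angles whose chained transforms land there.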

But it is beginning to require a lot of settings, so I'm thinking about creating a config file for each part, similar to what ROS uses.

GroG:

> But it is beginning to require a lot of settings, so I'm thinking about creating a config file for each part, similar to what ROS uses.

This is the reason I was thinking of a VirtualInMoov service: to manage all those settings. MRL already has a configuration management system; it can be loaded or saved at any time, and it's clear and easy to manage in the code.
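For illustration, a per-part config saved in a json style might look something like this. The field names and values here are entirely invented to show the kind of settings being discussed (DH row, mass, collision shape, 3D model); this is not MRL's actual serialized format.

```json
{
  "name": "rightArm",
  "mass": 1.5,
  "dhParameters": [
    { "d": 0, "theta": 90, "r": 280, "alpha": 0 }
  ],
  "collisionShape": "cylinder",
  "model3d": "rightArm.j3o"
}
```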

calamity:

Interesting, about the configuration management system: are you talking about the .json files that are saved?

I kinda feel like I did not explain well why I built VinMoov. I did not want to build an "on screen" robot that we can control (like the first virtual InMoov that used Blender), but a way to "visualize" what happens to the robot after we use a control to move it. By control, I mean anything that can modify the position of the robot (a slider, a servo.moveTo() command, or more advanced services like Tracker, InverseKinematic3D, or IntegratedMovement). All the different controls ultimately modify the servo positions, and the best way I could find to listen to all those controls is to use servo events.
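The listening pattern described here can be sketched in plain Python (a toy stand-in for the idea, not MRL's actual Servo API; the class and method names below are invented):

```python
# Toy publish/subscribe sketch of "listen to servo events".
# In MRL, the VinMoov visualization would be the listener; every control
# path (slider, moveTo, IK service...) ends up changing the servo, so one
# listener sees all of them.
class ToyServo(object):
    def __init__(self, name):
        self.name = name
        self.pos = 90.0
        self.listeners = []

    def add_servo_event_listener(self, callback):
        self.listeners.append(callback)

    def move_to(self, target):
        self.pos = target
        for cb in self.listeners:      # notify every registered listener
            cb(self.name, self.pos)

positions = {}
shoulder = ToyServo("shoulder")
shoulder.add_servo_event_listener(lambda name, pos: positions.update({name: pos}))
shoulder.move_to(45.0)  # any control that moves the servo fires the event
print(positions)        # -> {'shoulder': 45.0}
```

The visualization never needs to know *which* control moved the servo; it only reacts to the resulting events.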

So VinMoov is not a virtual device, but a virtualization of a device (in our case an InMoov) that can be virtual or not. You are not actually controlling the VinMoov; you are controlling the InMoov and getting a video-like representation of the movement the InMoov is actually doing.

The InMoov service can be made essentially virtual by using VirtualArduino. The only things that cannot be virtual in the InMoov service are sensors like the Kinect or OpenCV. So I don't really see the need for a VirtualInMoov service.

In my previous comment, what requires a lot of settings is the IntegratedMovement service, not VinMoov.

About the jme3 app, I have added two apps to MRL.

One is a generic version, and one is a version that hardcodes what is needed for InMoov.

With the generic version (TestJmeIMModel.java), you need to specify the 3D models to use and how they attach to each other. But that app can be used for any "visualization" of a robot, not just InMoov.

For the InMoov-specific version (InMoov3DApp.java), the only thing required is to link each servo to a part and enable the servo events (and listen to them). I think using name binding like you suggested in another post would probably make things simpler to use, but I'm not there yet.

I hope this makes clearer what VinMoov really is.

That said, if you want a pure VirtualInMoov that directly controls an InMoov representation in a jme3 app, it's possible to do. But that was not the goal I had in mind with the VinMoov app I did.


GroG:

Yes, the json files. They are saved, and at startup they load automatically. There is a little detail in how variables need to be initialized if you always want the file's value of the variable on startup, but there is no limit to the amount of information that can be saved. Additionally, the variable names are always consistent :D

It would be concerning if another configuration routine were created, and this is the primary reason I suggested the VirtualInMoov service.

For the rest of the points I agree Calamity.

I completely and strongly agree that the control script should need a very minimal amount of change (perhaps a single line) to switch from controlling a "real" InMoov to a virtual InMoov, or to utilize both. This is the idea of 'one-clicky': a single line or button which causes a great amount of desired change.

The idea of having a pluggable 'back-end' which is either real or virtual has been around in MRL for quite a while. SEAR was this concept, and Morrows_End's thesis.

At first I thought it wasn't a good idea to have the InMoov service reference JME directly, or any part of the virtual InMoov - because I thought (and still do) that the JME service needs to be generalized, and I was afraid of InMoov-specific stuff being put into JME. But I think we are on the same page with that. You mention several times that you are working on generic stuff!!! That's great!

Your Integrated Movement service is awesome !  And I hope that can be generalized too so all robot makers can utilize it !

You're right about the things which really need virtualization - it's the serial connections, the points where data flows into or out of the computer. This includes serial com ports, Kinect USB data, and camera data. The VirtualArduino is working well, I believe; the virtualization of the Kinect & camera will come.

I find it interesting too that you've chosen to use events as commands for controlling VinMoov. You could argue this is more representational of "How things really Are" vs. "How I want things to Be" - but servo events are not really truth, and I'm a little concerned that this adds complexity with little benefit. For example, it's a requirement that servo events be on in order to drive VinMoov. Should this be a requirement?

Your work and comments have started me thinking about multiple VinMoovs - what if there was a "How things really Are" VinMoov and a "How I want things to Be" one? Don't we do this? Our concept of reality is a collision of many realities in this multiverse. So why limit ourselves to just one virtual reality?

Some of this is starting to remind me of that inspirational starfish robot, with its virtualized representation of itself, and the way it ran through many virtualizations when it found it had a missing leg, to figure out how to walk again..

http://www.news.cornell.edu/stories/2006/11/cornell-robot-discovers-itse...

It's a pleasure seeing your work develop Calamity, and if I ever get out of refactoring I'm looking forward to playing with it :D

calamity:

I'm totally aware that servo events may not be the truth about the real position. They are echoes of the commands issued; they can be viewed more as reflecting the commands asked for than as real position feedback. And the worst drawback I have found using servo events so far is that they are always a few ms late relative to what really happens.

But still, the servo events give me a much better result than if I used the control commands. The reason is that when a control asks to move a part, it only sends the endpoint (shoulder.moveTo(180)), while the servo events return a flow of data for each step the controller decides to use. So if I used the controls to move VinMoov, the part I'm controlling would jump to the end position unless I added more code to regulate its speed. And I think it's always better to use already-working code than to add new code that does almost the same thing.
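The endpoint-vs-stream difference can be shown with a toy stepping loop (the step logic below is invented for illustration; the real controller decides its own stepping):

```python
# Toy illustration: a moveTo() call carries only the endpoint, while the
# controller emits an event at every intermediate step on the way there.
def step_servo(current, target, step_size, on_event):
    """Step from current toward target, firing on_event at each step."""
    events = []
    def record(pos):
        events.append(pos)
        on_event(pos)
    while abs(target - current) > step_size:
        current += step_size if target > current else -step_size
        record(current)
    record(target)  # final event lands exactly on the endpoint
    return events

seen = []
events = step_servo(0.0, 10.0, 3.0, seen.append)
print(events)  # -> [3.0, 6.0, 9.0, 10.0]
```

A listener driven by these events animates the motion smoothly for free, whereas a listener that only saw moveTo(10.0) would have to re-implement the interpolation itself.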

The things I have coded that use servo events all activate the servo events by calling servo.addServoEventListener (or servo.addIKServoEventListener), so the user never has to worry about that.

And more importantly, it's worky, without changing any code, between a real InMoov and a virtual one, just by swapping the controller (Arduino or VirtualArduino).


I always take care to implement new things as generically as possible so they can be reused for other projects. There is no other way to go :)