Almost cooked! This is very much inspired by the InMoov service, heading towards a universal service for the next InMoov service...
Kwatters pointed out the interest of using a tree to represent the skeleton. I have no more coffee, but I learned cool things!
What is the goal?
Attach any number of servos to a "body part".
Attach any "body part" to another "body part", or to the main "central nervous system", the root; it can be the InMoov service, for example.
Final goal: use this magic universal command without hardcoding anything:
myService.moveTo("rightHand", 90, 90, 90, 90, 90, 90)
What was the problem with this command?
- I need a right hand named "rightHand", and I cannot declare a new body part, like a leg.
- If we need to add a new servo to the hand, we cannot.
- The general logic of the InMoov service imposed controllers (Arduino), pin numbers and classic servos.
How this works now
First, a script sample for 2 fingers, 1 hand and 1 arm, so the logical root here is the right arm:
# create the leaf servos
thumb = Runtime.start("rightHand.thumb", "Servo")
index = Runtime.start("rightHand.index", "Servo")
# create the body parts (the nodes of the tree)
rightHand = Runtime.start("rightArm.rightHand", "BodyPart")
rightArm = Runtime.start("rightArm", "BodyPart")
# build the tree: fingers under the hand, hand under the arm
rightHand.attach(thumb, index)
rightArm.attach(rightHand)
arm.moveTo("rightHand",90,90) will do the job.. How:
The skeleton had learned every nodes, and actuators are identified as leaf for now, for simplicity.
We can "search" for a specific node based on his name, if we are looking for a right hand, we have the output parent: rightArm.rightHand and every leafs or attached nodes.
I added 2 new methods inside index.java:
- getLeafs
- findNode (not satisfied by crawlForDataStartingWith)
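To make the idea concrete, here is a minimal Python analogue of that tree lookup. It is not the actual index.java code, just a sketch assuming a simple parent/children structure where servos are the leaves:

# minimal Python analogue of the body part tree (not the real index.java code)
class Node:
    def __init__(self, name):
        self.name = name
        self.children = []
    def attach(self, *nodes):
        # attach servos or other body parts under this node
        self.children.extend(nodes)
    def findNode(self, name):
        # depth-first search for a node by name
        if self.name == name:
            return self
        for child in self.children:
            found = child.findNode(name)
            if found:
                return found
        return None
    def getLeafs(self):
        # leaves are the actuators (servos) for now
        if not self.children:
            return [self]
        leafs = []
        for child in self.children:
            leafs.extend(child.getLeafs())
        return leafs

# same structure as the script above
rightArm = Node("rightArm")
rightHand = Node("rightHand")
rightHand.attach(Node("thumb"), Node("index"))
rightArm.attach(rightHand)
print([leaf.name for leaf in rightArm.findNode("rightHand").getLeafs()])  # ['thumb', 'index']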
Now I need to take care of a few backward-compatible commands like moveHand("left").
The moveTo method is generic and can parse any number of parameters.
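Roughly, and only as an illustration (the names below are hypothetical, not the InMoov code), a generic moveTo can take the target part plus any number of angles and send them to that part's leaf servos in order, while an old command like moveHand("left") becomes a thin wrapper:

# illustration only: dispatch any number of angles to the leaf servos of a named part
class FakeServo:
    def __init__(self, name):
        self.name = name
    def moveTo(self, angle):
        print("%s -> %s" % (self.name, angle))

def moveTo(skeleton, partName, *angles):
    # skeleton maps a part name to its ordered list of leaf servos
    for servo, angle in zip(skeleton[partName], angles):
        servo.moveTo(angle)

def moveHand(skeleton, side, *angles):
    # backward-compatible wrapper: moveHand("left", ...) just targets "leftHand"
    moveTo(skeleton, side + "Hand", *angles)

skeleton = {"rightHand": [FakeServo("thumb"), FakeServo("index")]}
moveTo(skeleton, "rightHand", 90, 90)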
Bones! Skeletons need bones!
Great progress, I think these are all steps towards having a structural model to represent any robot. It really needs to be defined by the bones that connect the links.
In that way we are creating "kinematic chains" ...
And these kinematic chains should represent our robot, regardless of how many servos are connected or how long each limb is.
So, if we define a root node, we can describe a humanoid robot with respect to that root node, which means describing the geometry of each of the links in the tree representation of the robot.
(Ok, I know, some robots have things like 4 bar linkages, let's leave that out of scope for the moment.)
There are a few methods for representing a robot; one such is URDF. Another, more academic sort of representation is raw DH parameters. We can currently construct a "kinematic chain" with the DHLink and DHRobotArm classes in MyRobotLab. These classes are designed to represent some "bones" and their "joints".
Let's see if we can build a skeleton to support our forward and inverse kinematics in MyRobotLab.
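As a reminder of what DH parameters buy us, here is a small sketch (plain math in Python, kept to the standard library so it would also run under Jython; it does not use the DHLink/DHRobotArm API): each link contributes one 4x4 transform, and chaining the transforms gives the end-effector pose relative to the root.

import math

def dh_transform(theta, d, a, alpha):
    # standard Denavit-Hartenberg transform for one link (4x4 homogeneous matrix)
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)] for i in range(4)]

def forward_kinematics(links):
    # chain the per-link transforms from the root to the end effector
    T = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
    for (theta, d, a, alpha) in links:
        T = mat_mul(T, dh_transform(theta, d, a, alpha))
    return T

# two-link planar arm example: joint angles 90 and 0 degrees, link lengths 0.3 m
links = [(math.radians(90), 0.0, 0.3, 0.0), (math.radians(0), 0.0, 0.3, 0.0)]
T = forward_kinematics(links)
print("end effector x=%.3f y=%.3f" % (T[0][3], T[1][3]))  # x=0.000 y=0.600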
Hello
Hello Anthony,
Fantastic!!
This seems to be a very interesting way to go. No limit.
I like the principle of daisy chaining the various parts of a robot, and as Kevin says it will be great for kinematic purposes.
I'm wondering how kinematics works in a situation like the InMoov head, which uses three or four servos (depending on the build) to get its movements.
Can we imagine a similar framework for sensors?