Some time ago, I wanted to add legs to my test model (InMoov) in the IntegratedMovement (IM) service, but found that something was missing for it to work properly. For example, in the jMonkey simulator, bending the right knee works fine and raises the foot. But if I also bend the left knee, it raises the left foot and the robot appears to be flying, which is nothing like what would happen with the physical robot. So I needed to find a way to make the IM model in jMonkey act more like a real robot would.

The IM service was also in need of a major overhaul in how it works (its relation with Servo, misplaced data, etc.).

Previously, we used a virtual robot in jMonkey with file(s) containing a j3o model linked with nodes. Servos would rotate the nodes and make the parts move. But that structure will not work if I want the leg to sometimes move in the normal way (upper leg anchored to the origin of the robot, moving the foot) and sometimes in the reverse way (anchored at the foot, moving the rest of the body). That can be done and computed much more easily with the DH parameters and the inverse kinematics (IK) computation.

So I decided to change the data structure to be based on the DH parameters.

Each part of the robot (i.e. foot, ankle, bicep, neck, etc.) is defined with its DH parameters, its 3D model, and the control (Servo) that moves that part:

IMPart partRightHipY = ik.createPart("harlRightHipY", 0.01);
partRightHipY.setDHParameters(ArmConfig.DEFAULT, 0.0415, 0, 0, -90); // DH parameters for the normal (origin-to-foot) direction
partRightHipY.setDHParameters(ArmConfig.REVERSE, 0, 0, 0.0, 90);     // DH parameters for the reverse (foot-to-origin) direction
partRightHipY.setControl(ArmConfig.DEFAULT, "rightHipY");            // the Servo that moves this part
partRightHipY.set3DModel("Models/harlRHipY.j3o", scale, new Point(0, 0, 0, 0, 0, 0)); // 3D model displayed in jMonkey
ik.attach(partRightHipY);
Parts are linked together with an "Arm" that defines the order of the parts (similar to the DHRobotArm class).
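To give an idea of what I mean by an Arm, here is a minimal sketch of an ordered chain of parts, in the spirit of the DHRobotArm class. The class and part names below (`ArmSketch`, `harlRightKnee`, `harlRightAnkle`) are illustrative only, not the actual IntegratedMovement API:

```java
import java.util.ArrayList;
import java.util.List;

class ArmSketch {
  static class Arm {
    final String name;
    final List<String> partNames = new ArrayList<>();
    Arm(String name) { this.name = name; }
    // Order matters: the first part attaches to the origin (or to the
    // end of a parent arm when arms are linked together).
    Arm add(String partName) { partNames.add(partName); return this; }
  }

  public static void main(String[] args) {
    Arm rightLeg = new Arm("rightLeg")
        .add("harlRightHipY").add("harlRightKnee").add("harlRightAnkle");
    System.out.println(rightLeg.partNames);
  }
}
```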
 
With that, each transform matrix (translation + rotation) can be computed and passed to jMonkey, which displays the part in the right location and orientation. (The fun part was that jMonkey does not use the same coordinate convention as the IK we use.)
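For reference, a single DH transform matrix (translation + rotation) can be built like this. This is the textbook Denavit-Hartenberg matrix, not the actual IM code, and the (d, theta, r, alpha) parameter ordering here is an assumption on my part:

```java
class DH {
  // theta and alpha in radians; d and r in the model's length units.
  // Standard DH convention: rotate theta about z, translate d along z,
  // translate r along x, rotate alpha about x.
  static double[][] dhMatrix(double d, double theta, double r, double alpha) {
    double ct = Math.cos(theta), st = Math.sin(theta);
    double ca = Math.cos(alpha), sa = Math.sin(alpha);
    return new double[][] {
      { ct, -st * ca,  st * sa, r * ct },
      { st,  ct * ca, -ct * sa, r * st },
      {  0,       sa,       ca,      d },
      {  0,        0,        0,      1 }
    };
  }

  public static void main(String[] args) {
    // All parameters zero gives the identity transform
    double[][] m = dhMatrix(0, 0, 0, 0);
    System.out.println(m[0][0] + " " + m[3][3]); // 1.0 1.0
  }
}
```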
 
The good thing about this structure is that the Arms can also be linked to each other easily and permuted as needed:
 
  • Origin
    • Torso
      • leftArm
      • RightArm
        • thumb
        • index
        • major
      • head
    • LeftLeg
    • RightLeg
 
or
 
  • Origin
    • RightLeg
      • Torso
        • leftArm
        • RightArm
      • LeftLeg
or even
 
  • Origin
    • rightArm
      • Torso
        • rightLeg
        • leftLeg
      • leftArm

 

and it is easy to know the end point of each arm.
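The end point falls out of multiplying the transforms of an arm's parts in order; the translation column of the product is the end position. A minimal sketch (hypothetical helper, not the IM API), using pure z-translations in place of full DH transforms to keep the numbers obvious:

```java
class EndPoint {
  // Multiply two 4x4 homogeneous transforms
  static double[][] multiply(double[][] a, double[][] b) {
    double[][] c = new double[4][4];
    for (int i = 0; i < 4; i++)
      for (int j = 0; j < 4; j++)
        for (int k = 0; k < 4; k++)
          c[i][j] += a[i][k] * b[k][j];
    return c;
  }

  // Translate d along z, as a degenerate DH transform
  static double[][] translateZ(double d) {
    return new double[][] {
      { 1, 0, 0, 0 }, { 0, 1, 0, 0 }, { 0, 0, 1, d }, { 0, 0, 0, 1 }
    };
  }

  public static void main(String[] args) {
    // Two "parts" stacked along z: the chained end point is at z = 3
    double[][] t = multiply(translateZ(1.0), translateZ(2.0));
    // The end point is the translation column (t[0][3], t[1][3], t[2][3])
    System.out.println(t[0][3] + " " + t[1][3] + " " + t[2][3]); // 0.0 0.0 3.0
  }
}
```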

 

So far it works well when moving one arm at a time (e.g. moving leftArm, using rightLeg + torso + leftArm). But the IK engine currently can't resolve a request to move leftArm to (x, y, z) and leftLeg to (x1, y1, z1) at the same time.

The reason is that the IK engine handles one group of parts at a time. So when moving two arms, the common parts move one way for one arm, then another way for the other arm, and both targets are never reached. With only the Torso as the common part, which does not move much, the IK can manage to reach both targets, but with a longer, more complex common chain like the leg, it does not work. So I will need to make the IK engine able to compute all the moves toward their targets at the same time.
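One way to compute everything at the same time is to stack both end-effector errors into a single error vector and update all joints together, so the shared joints are pulled by both targets in the same iteration instead of alternating. Here is a toy planar sketch of that idea (one shared joint feeding two one-joint branches with unit-length links, solved with the Jacobian-transpose method); it is an illustration of the approach, not the actual IM/IK3D solver:

```java
class TwoTargetIK {
  // Positions of the two "hands" for joints q = {q0 shared, q1, q2}:
  // a shared unit link, then one unit link per branch.
  static double[] hands(double[] q) {
    double bx = Math.cos(q[0]), by = Math.sin(q[0]); // end of the shared link
    return new double[] {
      bx + Math.cos(q[0] + q[1]), by + Math.sin(q[0] + q[1]), // hand A (x, y)
      bx + Math.cos(q[0] + q[2]), by + Math.sin(q[0] + q[2])  // hand B (x, y)
    };
  }

  // Combined distance of both hands to their targets
  static double error(double[] q, double[] target) {
    double[] p = hands(q);
    double s = 0;
    for (int i = 0; i < 4; i++) s += (target[i] - p[i]) * (target[i] - p[i]);
    return Math.sqrt(s);
  }

  static double[] solve(double[] target, int iters, double alpha) {
    double[] q = new double[3];
    double h = 1e-6; // finite-difference step for the numeric Jacobian
    for (int it = 0; it < iters; it++) {
      double[] p = hands(q);
      double[] e = new double[4]; // both target errors stacked in one vector
      for (int i = 0; i < 4; i++) e[i] = target[i] - p[i];
      double[] dq = new double[3];
      for (int j = 0; j < 3; j++) {
        double[] qh = q.clone();
        qh[j] += h;
        double[] ph = hands(qh);
        double g = 0; // j-th component of J^T * e
        for (int i = 0; i < 4; i++) g += ((ph[i] - p[i]) / h) * e[i];
        dq[j] = alpha * g;
      }
      for (int j = 0; j < 3; j++) q[j] += dq[j]; // all joints updated together
    }
    return q;
  }

  public static void main(String[] args) {
    // Targets reachable by construction
    double[] target = hands(new double[] { 0.3, 0.5, -0.4 });
    double[] q = solve(target, 20000, 0.05);
    System.out.println("residual error: " + error(q, target));
  }
}
```

The shared joint q0 receives contributions from both hands' errors in every step, which is exactly what the current one-arm-at-a-time loop cannot do.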

kwatters

4 years 8 months ago

The current IK3D service (which, I think, you are using?) I think only supports solving a single arm... Not that long ago, we added support for the Drupp neck... This had an IK solver associated with it also. It seems to work reasonably well, but there are definitely some quirks with it.

I think being able to solve for multiple limbs attached to a single torso is going to be required going forward.  Right now the IK service really only solves a robot arm position assuming a fixed base.  In reality, that base is the torso of the robot, floating about in space.  The base can translate and rotate (that translation and rotation could be / should be measured with some IMU or other gyroscope).

I'm happy to see your progress!

so.. reviewing the code for the IK3D service.. it does look like it was designed to support any number of robot arms.  

I'm not using the IK3D service, but my code branched from it and is still very similar. I can explain the differences another time if you wish.

Yes, IK3D and what I am using now can 'chain' arms by using the result of the first arm as the 'base' of the next one. That works very well.

I have two problems that I need to resolve.

The first is that with a much longer combined arm (like right leg + torso + right arm), moving a part near the base (the foot) has much more impact on the final position than moving a part near the hand. What often happens is that the arm oscillates around the target and is never able to reach it. If I lower the delta (in the jInverse method) it improves things, but it also makes the arm barely move (compared to the leg). So I will probably need to weight the delta by the amplitude of movement so each part contributes more equally to moving toward the goal.
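One simple way to weight the delta would be to scale each joint's step by the inverse of its influence on the end effector (e.g. its Jacobian column norm), so a hip at the base of a long chain does not drown out a wrist. A sketch of that idea, illustrative only and not the actual jInverse code:

```java
class StepWeights {
  // columnNorms[i] = norm of the i-th Jacobian column, i.e. how strongly
  // joint i moves the end effector. Joints with more influence get a
  // proportionally smaller step so every part contributes more equally.
  static double[] weights(double[] columnNorms) {
    double[] w = new double[columnNorms.length];
    for (int i = 0; i < w.length; i++)
      w[i] = 1.0 / (1e-6 + columnNorms[i]); // small epsilon avoids division by zero
    return w;
  }

  public static void main(String[] args) {
    // A base joint moving the hand ~10x more than a wrist joint
    // ends up with a ~10x smaller step.
    double[] w = weights(new double[] { 10.0, 1.0 });
    System.out.println(w[0] + " " + w[1]);
  }
}
```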

The second problem I'm seeing, maybe related to the first one, is when moving two arms at the same time that share some common parts (like moving both hands to different points, using right leg + torso as the common part). It never gets close to solving, probably because the 'common parts' move too much compared to the arms.

By the way, there is a bug in the IK3D service here:

https://github.com/MyRobotLab/myrobotlab/blob/develop/src/main/java/org/myrobotlab/kinematics/Matrix.java#L271-L275

when ak equals 0.0, R_plus is set with the wrong dimensions (it should be 1, numRows). I have fixed it on my branch, but I don't think I will merge anytime soon.