After being able to come up with my own calculation of the palm position, I felt the challenge to do the same for the move to a new position. I also found some nice tutorials that can bring a dummy closer to the subject (e.g. the SVD tutorial by Kirk Baker).

I am looking at the MRL code in DHRobotArm to try to understand how Kevin has done it.

Unfortunately my own experiments so far (VS with Anaconda and numpy) have not succeeded, and I wanted to step through Kevin's code with the Eclipse debugger to find the differences.

However, I haven't been able to figure out how to do that, such as what to launch and where to set a breakpoint to be able to step through the moveToGoal method. I have Eclipse running with the cloned git repo from https://github.com/MyRobotLab/myrobotlab

kwatters

8 years ago

Hi Juerg!  

  In a round of a lot of refactoring, I realized that InverseKinematics3DTest.java is suffering from a "dimensions bad in multiply" error.  (This happens sometimes when using this sort of gradient descent for IK.)  That unit test has an "@Ignore" annotation at the top of the class, which tells our test runner (JUnit) to skip it during the build.

  Anyway, long story short: I'm going to fix that unit test this morning.  Once it's working, I'll check it in and try to do a video tutorial on using JUnit to debug a service in MRL.

I'm excited that you're looking into this code and if there's anything I can do to help, don't hesitate to reach out!

-Kevin

@Kevin

I am all set now to get a bit more insight. To start understanding it, I began with your existing InverseKinematics3DTest class.

In testIK3D I am missing a connection to the real servo settings for InMoov. I see that you create an InMoovArm and connect the ik3d arm to it.

Then your comment is:

// start from a centered joint configuration so we can iterate without losing rank
// in our jacobian!
which I can't understand. What could cause a loss of rank?
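My guess so far: in a stretched-out pose two columns of the Jacobian become linearly dependent, the matrix drops rank, and the pseudo-inverse step blows up. A little numpy sketch with a planar 2-link arm (my own toy example, not MRL code):

import numpy as np

# Analytic Jacobian of the end-effector (x, y) of a planar 2-link arm.
# t1, t2 are the joint angles in radians; l1, l2 the link lengths.
def jacobian(t1, t2, l1=1.0, l2=1.0):
    j11 = -l1 * np.sin(t1) - l2 * np.sin(t1 + t2)
    j12 = -l2 * np.sin(t1 + t2)
    j21 =  l1 * np.cos(t1) + l2 * np.cos(t1 + t2)
    j22 =  l2 * np.cos(t1 + t2)
    return np.array([[j11, j12], [j21, j22]])

print(np.linalg.matrix_rank(jacobian(0.3, 0.9)))  # centered pose: rank 2
print(np.linalg.matrix_rank(jacobian(0.3, 0.0)))  # arm stretched out: rank 1

Is that the kind of configuration centerAllJoints is meant to avoid?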
 
Next you do a "centerAllJoints" which, as far as I can understand, only impacts the ik3d arm but not the arm's omoplate etc. servo settings?
 
And then I see an ik3d.moveTo(100,0,50), which calls DHRobotArm.moveToGoal.
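As far as I understand the general recipe: compute a numerical Jacobian, pseudo-invert it, and step the thetas toward the goal until the error is small. My own numpy paraphrase of that idea (hypothetical, not Kevin's actual Java):

import numpy as np

# My numpy paraphrase of gradient-descent IK, not the actual
# DHRobotArm.moveToGoal code. f(thetas) returns the end-effector
# position for joint angles given in radians.
def move_to_goal(f, thetas, goal, delta=1e-4, step=0.1, tol=0.5, max_iter=1000):
    for _ in range(max_iter):
        p = f(thetas)
        err = goal - p
        if np.linalg.norm(err) < tol:
            break
        # finite-difference Jacobian, one column per joint
        J = np.zeros((len(p), len(thetas)))
        for i in range(len(thetas)):
            t = thetas.copy()
            t[i] += delta
            J[:, i] = (f(t) - p) / delta
        thetas = thetas + step * (np.linalg.pinv(J) @ err)
    return thetas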
 
It looks to me like it does all the calcs in the ik3d world only, and not with the real InMoov?
 
 

@Kevin

I found the problem with my own attempts at mimicking your moveToGoal function. I am using degrees in my DH table, primarily for easy comparison with the MRPT settings.

Since you use the same delta variable both to perturb theta and as the divisor in the Jacobian steps, my degree-based calcs didn't work.
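In numpy terms the pitfall looks like this (a made-up one-joint example of mine): the delta that perturbs theta must also be the divisor, and both must be in the same unit theta is stored in.

import numpy as np

# Made-up one-joint example: the DH table keeps theta in DEGREES.
f = lambda t: np.array([np.cos(np.radians(t[0]))])
t0 = np.array([30.0])
delta = 1e-4  # degrees, same unit as t0

d_ok  = (f(t0 + delta) - f(t0)) / delta              # correct: d/d(degree)
d_bad = (f(t0 + delta) - f(t0)) / np.radians(delta)  # mixed units: off by 180/pi
print(d_ok, d_bad)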

So I am ready to go for the next steps :)

1) Your current implementation is bound to the arm. To bring it to a more general level, would it require starting from an x,y,z position plus the pitch, roll, and yaw caused by stomach movements? Or should it be moved out of the arm into a more general instance?

2) I have already asked Gael about progress on the "global calibration system" for all InMoovs. That looks to be paused and without a common agreement. But to get further we should have an InMoov 0,0,0 point and then set everything in relation to it. I have no idea whether the stomach rotation point is a good candidate or whether we should put it at ground level. I will continue to use the stomach point unless a better idea comes along.

3) I will first try to add pitch, roll, and yaw to my own calcs and also extend the DH table with the fingers.
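For reference, the single-link DH transform I am working with (standard DH convention, my own numpy sketch):

import numpy as np

# Standard DH transform for a single link: rotate theta about z,
# translate d along z, translate a along x, rotate alpha about x.
# Angles in radians here; a table kept in degrees must be converted.
def dh_transform(theta, d, a, alpha):
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [ 0,       sa,       ca,      d],
                     [ 0,        0,        0,      1]])

# Chaining the rows of the DH table gives the palm pose: T[:3, 3] is the
# position, T[:3, :3] the orientation (from which roll/pitch/yaw follow).
def forward(dh_rows):
    T = np.eye(4)
    for theta, d, a, alpha in dh_rows:
        T = T @ dh_transform(theta, d, a, alpha)
    return T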

I see in the shoutbox that you are busy creating a stable InMoov MRL version with improved speech for Gael. Seeing Gael on stage struggle with voice commands made me feel a bit bad, so I think improvements there should have priority. Keep on helping Gael, as I would REALLY like to be able to have Marvin talk and listen in German, and I will busy myself with the IK.

@Kevin 

Lost orientation!!!

Wikipedia defines (for robots):

yaw = counterclockwise rotation about z-axis

pitch = counterclockwise rotation about y-axis (unfortunately my brain likes to think of this as roll ...)

roll = counterclockwise rotation about x-axis
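Written out, those three rotations are (my own numpy sketch, angles in radians):

import numpy as np

# Elementary counterclockwise (right-handed) rotations.
def Rx(roll):
    c, s = np.cos(roll), np.sin(roll)
    return np.array([[1, 0, 0],
                     [0, c, -s],
                     [0, s,  c]])

def Ry(pitch):
    c, s = np.cos(pitch), np.sin(pitch)
    return np.array([[ c, 0, s],
                     [ 0, 1, 0],
                     [-s, 0, c]])

def Rz(yaw):
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0],
                     [s,  c, 0],
                     [0,  0, 1]])

# One common composition order: yaw about z, then pitch, then roll.
R = Rz(0.1) @ Ry(0.2) @ Rx(0.3)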

If I set my 0,0,0 position on the stomach rotation point:

  • head is on the positive z-axis
  • shoulders are on the x-axis (left shoulder on the + or - side???)
  • InMoov is looking along the positive or negative y-axis??

It's more or less a question of vantage point:

1) If I am inside InMoov, then the left shoulder has a negative x-position and I am looking in the direction of the positive y-axis.

2) If I stand in front of InMoov, its left shoulder is on the positive x-axis and InMoov looks in the direction of the negative y-axis.

I would vote for 1 (although I had based my calcs so far on 2).

Another option is to have the shoulders on the y-axis and InMoov looking towards the positive x-axis; actually I would prefer this view over the above. The left shoulder would then be on the positive y-axis.

Any recommendation, or did I raise a question that has already been answered?

Hi Juerg,

  I think most computer graphics systems use, as a convention, z as moving away from you.  I know, we generally think of z as going up, but when it comes to video game programming, programming the Oculus Rift, and dealing with OpenNI data, z is away from you.

  X is left and right.

  Y is up and down.

  Z is forward and backwards.

This leaves the definitions for Roll, Pitch, and Yaw as the following:

Roll = rotation about the Z axis.

Pitch = rotation about the X axis.

Yaw = rotation about the Y axis.

I know it's a bit unnatural to think of Z as going away from you rather than up and down.  But when it comes to rendering video game graphics, the coordinates on your screen are x and y, and your screen is usually not lying flat on your desk.

Of course, it's easy to translate coordinate systems with a simple translation/rotation matrix multiplication.
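For example (a hypothetical numpy snippet on my part, not MRL code), going from a z-up robotics frame to this y-up, z-forward convention is just one fixed matrix:

import numpy as np

# Hypothetical mapping (not MRL code): a point from a robotics-style
# frame (x forward, y left, z up) into the graphics-style frame above
# (x right, y up, z away from the viewer).
ROBOT_TO_GFX = np.array([[0, -1, 0],   # gfx x (right)   <- robot -y
                         [0,  0, 1],   # gfx y (up)      <- robot z
                         [1,  0, 0]])  # gfx z (forward) <- robot x

p_robot = np.array([0.5, 0.2, 1.0])
p_gfx = ROBOT_TO_GFX @ p_robot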

With my original DH param set that I added to the InMoov arm model, it was configured with the omoplate rotation point (the shoulder) as the origin: Z pointing forward, X pointing left/right, and Y pointing up and down.

Hope this helps a bit with what I'm thinking and why I started out this way.

 

From the Oculus SDK documentation: