Inverse Kinematics update...

Here's an update on my progress with the InMoov arm Inverse Kinematics.
This demo shows MyRobotLab with the new AngularJS-based WebGUI rendering the InMoov arm using ThreeJS.  An Xbox 360 controller was used to move the position of the InMoov hand around in the x,y,z coordinate system.  The corresponding servo angles are computed and published to the InMoov arm.

This is a work in progress, as there's still some calibration to be done, but the movements definitely seem to be generally in the expected directions.  There are some places where the IK algorithm throws a divide-by-zero or other math-domain errors, but with some nudging it starts working again.
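Those divide-by-zero and math-domain errors typically come from feeding `acos` a value just outside [-1, 1] or hitting an unreachable target. A hedged sketch of the usual guards, shown on a generic two-link planar solver (the function name and link lengths are mine, not from the MRL code):

```python
import math

def two_link_ik(x, y, l1=100.0, l2=100.0):
    """Planar 2-link IK with guards against the usual math-domain errors."""
    r2 = x * x + y * y
    r = math.sqrt(r2)
    # unreachable: target outside the annulus the arm can cover
    if r > l1 + l2 or r < abs(l1 - l2):
        return None
    # guard the singularity at the origin before any atan2 on the target
    if r == 0.0:
        return None
    # law of cosines; clamp to [-1, 1] so rounding can't blow up acos
    c2 = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    c2 = max(-1.0, min(1.0, c2))
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return math.degrees(theta1), math.degrees(theta2)
```

Returning `None` instead of raising keeps the solver "nudgeable": the caller can skip the bad step and continue from the next reachable point.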
GroG's picture


Very Cool Kwatters !

I wanna see the 'real' arm too!  As a challenge, you should touch something using the 2 joysticks with the real arm!

Awesome !

kwatters's picture

point and speak

I was hoping to get it to sign its name on the whiteboard, but I don't think the motor control is accurate enough..  (and without 6D IK it'll be difficult to keep the marker on the board properly..)

I need to address some of the erratic robot arm movement with the ik3d solver so the video will be easier/better...  I also want to get the servos to provide feedback and update the current position of the DH model.  Hope to do something with it either later today or maybe tomorrow before the Super Bowl :)

Alessandruino's picture


Cool demo Kevin! Wow... your servos seem very quiet! Are you using different motors/potentiometers?

Ale

juerg's picture


Hi Kevin

As I got stuck because I could not get the 3D point from the InverseKinematics3D service, I tried to read up on the DH parameters and came up with my own calculations for the head and left/right palm. I would still like to try positioning the neck and rothead based on the 3D coordinates.

As this looks to be working in my PC environment, I wanted to add it into my MRL script to see the real action. However, my script uses numpy, and despite having numpy set up on my Odroid InMoov controller, the MRL Python window is not seeing it.

Any help? Or can I use the latest MRL to get the 3D point of the palm and head?

kwatters's picture

X,Y,Z coordinates are not what you want for the head.

Hi Juerg,

I think you need/want roll, pitch, and yaw, not X,Y,Z, for the head position.

The IK service can definitely return the x,y,z of the hand (it does not provide orientation data for roll, pitch, yaw (yet)).

Let's talk about roll, pitch, and yaw.. The default InMoov head does not support "roll" (unless you have Bob Houston's updated neck printed and an extra servo.)

So that leaves us simply with pitch and yaw. Pitch is controlled by the "neck" servo; yaw is controlled by the "rothead" servo.

What you want to know is what yaw and pitch settings will make the head look at the 3D x,y,z position of the InMoov hand...

So, you can get the current x,y,z position of the InMoov arm with the IK3D service... then the calculation is to translate/rotate that coordinate system into the one for the InMoov head. Once you've done that, it should be straightforward to compute the yaw and pitch to look at that point in space...
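The translate-then-atan2 step can be sketched in a few lines. The function name, the axis convention (x forward, y left, z up), and the example head offset below are my assumptions, not the MRL API:

```python
import math

def look_at(hand, head_origin):
    """Yaw/pitch (degrees) for the head to look at the hand position.
    Both points are (x, y, z) tuples in the same base frame."""
    # translate the hand position into the head's frame
    dx = hand[0] - head_origin[0]
    dy = hand[1] - head_origin[1]
    dz = hand[2] - head_origin[2]
    # yaw (rothead) is the horizontal bearing to the point
    yaw = math.degrees(math.atan2(dy, dx))
    # pitch (neck) is the elevation above the horizontal plane
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return yaw, pitch

# hand 300 mm straight ahead, level with the head -> yaw 0, pitch 0
print(look_at((300, 0, 500), (0, 0, 500)))
```

If the head frame is also rotated relative to the base (not just translated), the dx,dy,dz step becomes a full rotate-and-translate with the homogeneous transform, but the atan2 part stays the same.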

As for numpy.. sorry. jython doesn't support numpy.

-Kevin

 

juerg's picture


Thanks for your reply. I agree that I will need to set pitch and yaw of the head (I only have the standard head controls in place), but I think I can calculate this from the two x,y,z positions.

For matrix multiplication in jython I found a reference to a matrix toolbox, but do you think I can make that available within the MRL jython script?

Is it correct then that with your previous release I wasn't able to get the x,y,z position of the hand? There was some talk about problems with latest - should I use latest anyway?

kwatters's picture

Palm Position

Hi Juerg,

There is a matrix class in Java that you should be able to use from jython.  I'll check if there are other options that are jython-friendly.  The palm position has been available for a while, but what I recently added was the encoder offset and calibration from the InMoov arm to the DHRobotArm model.  I think where you'll run into difficulty is that the IK3D service publishes the servo angles: you can tell the IK3D service to move the InMoov hand to a particular x,y,z position (assuming the arm can get there) and it publishes the angles to the InMoovArm.  However, if you move the servos on the InMoov arm directly, the IK3D service does not currently update the DHRobotArm links to those new angle values...
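Failing a jython-friendly library, a plain-Python multiply of the 4x4 homogeneous transforms is small enough to inline in a script. A sketch (the function name is mine; MRL's own Matrix class may already cover this):

```python
def mat_mul(a, b):
    """Multiply two matrices given as lists of rows (no numpy required)."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

# identity times a translation leaves the translation unchanged
I = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
T = [[1, 0, 0, 5], [0, 1, 0, 2], [0, 0, 1, 0], [0, 0, 0, 1]]
print(mat_mul(I, T))
```

Chaining `mat_mul` over the per-link DH transforms gives the base-to-palm transform, which is all the forward-kinematics side needs.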

At its core, the DHRobotArm class does all the good stuff.  It consists of a list of DHLinks.  You can compute the position of any arbitrary robot arm (or leg) with this class by calling getPalmPosition().

Here's an example Python script that shows how to build a DH robot arm in Python and move it around in a straight line:

 


# file: /home/kwatters/DHParamsIK3D.py
##################################################
# Inverse Kinematics
# This is an example to build up a robot arm
# using (modified?) DH parameters
# then use the InverseKinematics3D service to
# compute the forward and inverse kinematics for 
# the arm.
##################################################
from time import sleep
from org.myrobotlab.service import InMoovArm
from org.myrobotlab.kinematics import DHRobotArm
from org.myrobotlab.kinematics import DHLink
from org.myrobotlab.kinematics import DHLinkType

# Create a robot arm
myRobotArm = DHRobotArm()

# Let's create a 2-link robot arm
# Create the first link in the arm specified by 100,100,0,90
# additionally specify a name for the link which can be used elsewhere. 
d1 = 100
r1 = 100
theta = 0
alpha = 90
link0 = DHLink("base", d1, r1, theta, alpha)

# Create the second link (same as the first link.)
d1 = 100
r1 = 100
theta = 0
alpha = 90
link1 = DHLink("link1", d1, r1, theta, alpha)

# Add the links to the robot arm
myRobotArm.addLink(link0)
myRobotArm.addLink(link1)


# create the IK3D service.
ik3d = Runtime.createAndStart("ik3d", "InverseKinematics3D")

# assign our custom DH robot arm to the IK service.
ik3d.setCurrentArm(myRobotArm)


# print out the current position of the arm.
print ik3d.getCurrentArm().getPalmPosition()


# Now, pick a start/end point
# and begin moving along a straight line as specified by the start/stop points.

# starting point
# x , y , z
x1 = 100
y1 = 100
z1 = 100

# ending point
# x , y , z
x2 = 200
y2 = -400
z2 = 100

# how many steps will we take to get there
numSteps = 100
# delay between steps (in seconds); this controls the velocity of the end effector.
delay = 0.1 

# lets compute how long the path is.
# this is the change in x,y,z between the two points
# divided up into numSteps, so we know how much to
# move in the x,y,z direction for each step.
dx = 1.0*(x2 - x1)/numSteps
dy = 1.0*(y2 - y1)/numSteps
dz = 1.0*(z2 - z1)/numSteps

# our starting point for the iteration
# set that to the current x,y,z position 
curX = x1
curY = y1
curZ = z1

# tell the arm to configure to that point
ik3d.moveTo(curX,curY,curZ)

# iterate over the steps
for i in range(0, numSteps):
  # Increment our desired current position by dx,dy,dz
  curX += dx
  curY += dy
  curZ += dz
  # tell the ik engine to move to that new point
  ik3d.moveTo(curX, curY, curZ)
  # pause for a moment to let the arm arrive at its destination
  # smaller delay = faster movement.
  sleep(delay)

juerg's picture


Kevin, great stuff, will play around with it. It was also very interesting to create my own set of DH parameters (starting at the base of the body) and matrix-multiply my way to the palm and head.

I also found formulas that should allow calculating roll, pitch, and yaw from the matrices on the way to the palm:

http://robotics.stackexchange.com/questions/8516/getting-pitch-yaw-and-roll-from-rotation-matrix-in-dh-parameter
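The extraction those formulas describe, in ZYX (yaw-pitch-roll) convention, can be sketched as below. Conventions vary, so this is a hedged version to check against your own DH setup, not a drop-in for MRL:

```python
import math

def rpy_from_rotation(R):
    """Roll, pitch, yaw (degrees) from a 3x3 rotation matrix, ZYX convention.
    Degenerate when cos(pitch) ~ 0 (gimbal lock), guarded crudely here."""
    pitch = math.atan2(-R[2][0], math.hypot(R[0][0], R[1][0]))
    if abs(math.cos(pitch)) < 1e-9:
        # gimbal lock: roll and yaw are coupled, so fix roll = 0
        roll = 0.0
        yaw = math.atan2(-R[0][1], R[1][1])
    else:
        roll = math.atan2(R[2][1], R[2][2])
        yaw = math.atan2(R[1][0], R[0][0])
    return tuple(math.degrees(v) for v in (roll, pitch, yaw))

# 90-degree yaw about z
Rz = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]
print(rpy_from_rotation(Rz))
```

The input is the upper-left 3x3 block of the base-to-palm homogeneous transform, i.e. the same matrix chain that already produces the palm x,y,z.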

I will give this a try too as I am curious about the results and whether I can verify it on my Marvin.

The thing I couldn't grasp so far is the methods for finding the best path to a new position. I know that a few methods exist, each with its pros and cons, but the math is a bit out of my scope :-(. And that's not the end of the line: there are mechanical restrictions, there is also another arm that could be in the way of a movement, and of course I would like to avoid Marvin banging his hand into my windows or my desk. It would be nice to enhance InMoov to be a bit more aware of its environment and add a little bit of the - for humans normal - self-control.

But there are many places on the web that might help me understand or even implement some of the magic. I will keep trying. Thanks again for your time and responses.

Juerg

kwatters's picture

Trajectory and Motion Planning

Hi Juerg,

Thanks for the pointer to the calculations for roll/pitch/yaw.  I suspect they should be pretty straightforward to get into the IK3D service to make it an IK6D service :)  I had plans on making it roll/pitch/yaw aware from the start, so it should be easy to extend.  I believe we can compute the roll/pitch/yaw from the "homogeneous transform matrix" that we're currently using to compute the robot hand position.  When it comes to solving the IK for it, I suspect we'll start getting a lot of failed solutions, because the InMoov arm isn't "fully compliant" in that it only has 5 degrees of freedom, not 6 (x, y, z, roll, pitch, yaw).

I think you have hit on some of the most difficult tasks in robotics at the moment: making the robot aware of its body and its surroundings, and also choosing a good path to get to the destination.  As you point out, a straight line isn't always the best way to move, depending on how the robot arm is configured.

I have only attempted the straight-line path to the destination points, but I suspect there are some much fancier ways to plot the final path, including ways to minimize the energy spent in getting there and pre-computed waypoints.  I'm sure there are many more; I'm just hoping that the DH stuff I've done already can make it easier for others to get started...

juerg's picture

jnumeric

Hi Kevin

I found jnumeric, downloaded it, and copied jnumeric-0.1a3.jar into the MRL folder. I set CLASSPATH to point to the jnumeric jar, but I get an error trying to import jnumeric in my MRL Python script. I wasn't able to find a comprehensive description of how to make jnumeric work. Maybe you can help?

Juerg

juerg's picture


Besides having no success with JNumeric so far (my best attempt throws an error when importing the newly compiled jar, complaining about trying to overwrite __findattr__),

I also tried the script you posted. I thought I could get the palm position of InMoov by running this script:

ikLeft = Runtime.createAndStart("ikLeft", "InverseKinematics3D")
leftArm=Runtime.start("leftArm", "InMoovArm")
leftArm.connect("/dev/ttyACM0")
ikLeft.setCurrentArm(leftArm.getDHRobotArm())
ikLeft.addListener("publishJoinAnglesLeft", leftArm.getName(), "onJointAngles")
 
# print out the current position of the arm.
print "omoplate: ", leftArm.omoplate.getPos()
print "shoulder: ", leftArm.shoulder.getPos()
print "rotate:   ", leftArm.rotate.getPos()
print "bicep:    ", leftArm.bicep.getPos()
# 10,30,90,5
print ikLeft.getCurrentArm().getPalmPosition()
# (x=320.000, y=80.000, z=280.000 roll=0.000 , pitch=0.000, yaw=0.000)
 
ikLeft.moveTo(300,80,250)
# this throws an error from Matrix.java.221