In these videos, I used the IK service to ask the InMoov to move his hand to a position up in the air. The positions of the servos are determined by the IK service; I did not control or script any of the movements.
The obstacle (the tape) is not detected by the robot yet, but I told him by script that there is an obstacle at that position.
So the robot starts to move in a straight line toward the point that I asked for. When it reaches the obstacle, it stops, moves away a bit, then tries a new way to reach the target point.
In both videos, the robot is trying to move to the same point and uses the same script/service settings. I have shown both videos to show that it can make different choices to move to the position and avoid the obstacle.
It still needs some improvement, like better object definitions and improving the choices it can make to avoid the obstacle, but I'm really happy with the progress made.
This is the test script that I used. In the video, the script is integrated into my InMoov script, but it looks the same:
https://github.com/MyRobotLab/pyrobotlab/blob/develop/home/Calamity/IK_…
The settings for the IK service are the DH parameters, which describe the angle and length of each joint used in the model, and the setup of the objects that will be used in the collision detection method.
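For readers unfamiliar with DH parameters: each joint is described by four numbers (theta, d, a, alpha), and chaining the standard transform matrices gives the hand position. This is a generic illustration with made-up link values, not the IK service's actual code:

```python
import math

def dh_matrix(theta, d, a, alpha):
    """Standard Denavit-Hartenberg transform for one joint:
    rotate theta about z, translate d along z,
    translate a along x, rotate alpha about x."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def mat_mul(m1, m2):
    return [[sum(m1[i][k] * m2[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# hypothetical two-link planar arm: (theta, d, a, alpha) per joint, lengths in mm
links = [(math.radians(30), 0.0, 100.0, 0.0),
         (math.radians(45), 0.0, 80.0, 0.0)]

# accumulate the transforms starting from the identity matrix
t = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
for params in links:
    t = mat_mul(t, dh_matrix(*params))

# the last column of the accumulated transform is the end-effector position
x, y, z = t[0][3], t[1][3], t[2][3]
```

For this planar example the result matches the textbook formula x = a1·cos(θ1) + a2·cos(θ1+θ2), which is a quick sanity check for a DH table.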
Currently, the only object model used is the cylinder (start point, end point and radius). I will need to improve that, because the cylinder model doesn't describe some of the body parts well, like the hand (in the video, the thumb is not described; that's why it touches the tape at times).
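A cylinder-vs-cylinder check like the one described reduces to the closest distance between the two axis segments, compared against the sum of the radii. Here is a generic sketch of that test (the classic clamped-projection algorithm, not the service's actual code; it assumes non-degenerate segments):

```python
def seg_seg_distance(p1, q1, p2, q2):
    """Closest distance between 3D segments p1-q1 and p2-q2.
    Assumes both segments have nonzero length."""
    sub = lambda a, b: tuple(a[i] - b[i] for i in range(3))
    dot = lambda a, b: sum(a[i] * b[i] for i in range(3))
    clamp = lambda v: max(0.0, min(1.0, v))
    d1, d2, r = sub(q1, p1), sub(q2, p2), sub(p1, p2)
    a, e = dot(d1, d1), dot(d2, d2)
    c, f, b = dot(d1, r), dot(d2, r), dot(d1, d2)
    denom = a * e - b * b              # zero when the segments are parallel
    s = clamp((b * f - c * e) / denom) if denom > 1e-12 else 0.0
    t = (b * s + f) / e
    if t < 0.0:                        # clamp t, then recompute s
        t, s = 0.0, clamp(-c / a)
    elif t > 1.0:
        t, s = 1.0, clamp((b - c) / a)
    c1 = tuple(p1[i] + s * d1[i] for i in range(3))
    c2 = tuple(p2[i] + t * d2[i] for i in range(3))
    return dot(sub(c1, c2), sub(c1, c2)) ** 0.5

def cylinders_collide(p1, q1, r1, p2, q2, r2):
    """Two cylinders overlap when their axes come closer
    than the sum of their radii."""
    return seg_seg_distance(p1, q1, p2, q2) < r1 + r2
```

For example, two perpendicular axes passing 3 mm apart collide when the radii sum to more than 3 mm, and not otherwise.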
The moveTo(x, y, z) method makes the IK service determine the best positions of the servos to reach the point, checking at each step whether objects are colliding. If it finds a collision, it stops, moves away from the collision point and tries to find a new way to reach the point. Rinse and repeat until it reaches the target point (or gives up).
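The loop described above (step toward the target, stop on collision, back off, try another way) can be sketched as a toy 2D planner. The detour heuristic below is made up for illustration and is much simpler than whatever the IK service actually does:

```python
import math

def plan_path(start, target, obstacle_center, obstacle_radius,
              step=0.5, max_iters=500):
    """Toy 2D version of the moveTo loop: step straight toward the target;
    on a predicted collision, move away a bit and insert a detour waypoint."""
    pos = list(start)
    path = [tuple(pos)]
    waypoints = [target]
    for _ in range(max_iters):
        goal = waypoints[0]
        dx, dy = goal[0] - pos[0], goal[1] - pos[1]
        dist = math.hypot(dx, dy)
        if dist < step:                      # close enough: snap to the goal
            pos = list(goal)
            path.append(tuple(pos))
            waypoints.pop(0)
            if not waypoints:
                return path                  # target reached
            continue
        nxt = (pos[0] + dx / dist * step, pos[1] + dy / dist * step)
        if math.hypot(nxt[0] - obstacle_center[0],
                      nxt[1] - obstacle_center[1]) < obstacle_radius:
            # collision ahead: back off one step, then aim for a point
            # offset sideways from the obstacle before resuming
            pos = [pos[0] - dx / dist * step, pos[1] - dy / dist * step]
            detour = (obstacle_center[0] - dy / dist * 2.5 * obstacle_radius,
                      obstacle_center[1] + dx / dist * 2.5 * obstacle_radius)
            waypoints.insert(0, detour)
        else:
            pos = list(nxt)
        path.append(tuple(pos))
    return None                              # gave up
```

Because the sideways offset is fixed, this toy version always detours the same way; the behavior in the videos, where the robot picks different routes on different runs, needs something richer than this.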
Great !
Does this work with OpenCV? I have to learn to use IK, to try to have my robot avoid its own body!
one more thing on my todo list ....
So many things on my todo list lol !
congratulations!!
great stuff, calamity!
However, you left out the information on how the band gets recognized?
You can add text into the video via the YouTube-provided tools. It's rather easy to do, and you can also cut out parts of the video that don't look like what you want to show.
However - your shaking hand might be too much to correct - maybe have another InMoov built that will hold the camera for you more steadily ;-)?
Will make some more tests with the BNO055 as soon as time allows ...
Greetings
Juerg
Hi Juerg
The band is not detected by the robot yet. By script, I told him that there is an object at that position, and I put the tape at those coordinates to make the object visible to the camera.
Sorry about the poor video quality. I only had my phone for it and was doing too many things at the same time, like being ready to pull the plug if the testing went wrong.
The next thing will be to have the robot see the band and add it to his computation so he can avoid it (or grab it).
Great Work Calamity..
Excellent demonstration ...
I did not know you were so far along with your build. Thanks for the video; I'm really impressed by what you've done with the IK service.
Could you post or link a reference to your script in pyrobotlab ?
It is my intention to post the details of the script. Yesterday it was late and I was so excited that it worked that I posted the video right away. Will give more details when I'm back from work.
no worries - take your time - I also am rather restless to post new things I am excited about!
wahoo impressive !!!
This is super exciting! Thanks for posting this very advanced work!
How precise do you think it could be? Knowing InMoov joints aren't so precise.
If you set an end point, how do we know it is correctly reached in the real world? (I'm guessing we still need the calibration solution to be solved.)
I had no idea the IK service was already working; I thought it was still only in beta form.
Great job!
I will have to do more tests to know how precise it can be. But I really don't expect 0.01 mm precision :D
The 180-position resolution of the servos makes it not very precise. Maybe using writeMicroseconds will give better resolution, but more needs to be done to have writeMicroseconds work with the IK service.
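To see why microsecond pulses help: with write(degrees) the smallest step is one degree, while the pulse width behind each degree spans roughly ten microseconds. The numbers below are the Arduino Servo library defaults (544-2400 µs over 180°), which are assumptions for illustration; real servos and InMoov calibrations vary:

```python
# Assumed pulse range: Arduino Servo library defaults, not InMoov specifics.
MIN_US, MAX_US = 544, 2400
RANGE_DEG = 180.0

def angle_to_us(angle_deg):
    """Map an angle in degrees to a pulse width in microseconds."""
    return MIN_US + (MAX_US - MIN_US) * angle_deg / RANGE_DEG

# write(degrees) steps in 1-degree increments; writeMicroseconds can step
# in 1-microsecond increments, i.e. roughly ten times finer
us_per_degree = (MAX_US - MIN_US) / RANGE_DEG
```

So under these assumed endpoints, a one-degree command step corresponds to about a 10 µs pulse change, and writeMicroseconds could in principle subdivide it about tenfold, limited by the servo's own deadband.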
The IK computation that I use gives a good answer for reaching a position, but not always the best answer. The IK output is usually within 10 mm of the target position. On hard-to-reach positions it can be up to 50 mm off.
There is currently no feedback from the servos. So right now it's relying on telling a servo to go to a position and hoping it really goes there. So the precision has to do with how precise your calibration is.
And as you mention, some InMoov joints are not so precise. If there is some play in the joints, it adds to the imprecision.
Calamity, maybe we can read the (extracted) potentiometer value, sent to an analog input, to read the exact position.
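The feedback idea above boils down to mapping a raw ADC reading from the servo's potentiometer back to an angle. The calibration endpoints below are made-up values for illustration; a real setup would first record the ADC reading at two known servo positions:

```python
# Hypothetical calibration values for one servo's extracted potentiometer,
# read on a 10-bit analog input (0-1023). These are NOT measured values.
ADC_AT_0_DEG = 120     # assumed reading with the servo at 0 degrees
ADC_AT_180_DEG = 910   # assumed reading with the servo at 180 degrees

def adc_to_angle(raw):
    """Linear interpolation between the two calibrated endpoints."""
    span = ADC_AT_180_DEG - ADC_AT_0_DEG
    return (raw - ADC_AT_0_DEG) * 180.0 / span
```

With real feedback like this, the IK service could compare the commanded position with the measured one instead of just hoping the servo got there.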
obstacle avoidance
Hi Christian
Not trying to discourage you in your work on MRL and obstacle avoidance, but maybe you should spend some time looking into ROS and MoveIt?
Alan Timm has created a ROS package that also visualizes the InMoov, so movements can be simulated, executed physically, or both.
I have just started to look into ROS and have only worked through the basic tutorials - but it looks very promising to me.
I am building a mecanum wheel base for Marvin at the moment (did not print the parts, but used Chinese wheels and aluminum profiles) and would like to have it controlled by Marvin, not manually by me.
Made no progress with my leveling hand (besides wiring); could not find the script part you mentioned that would allow for repeated execution of my Python function.
Have fun
Juerg