
Hello!

I've been a bit out of the picture, but now I'm back!

At the moment I'm trying to use the Leap Motion to control MyRobotLab. Does anyone have a clue how to do it? I know it has been done in the past!

Even if it doesn't work anymore with the changes in MRL, what is the best way to feed information from an outside source (the Leap) into, say, the Python interface? Could someone throw me a hint? Thanks a lot!

 

Pedro
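
For reference, the generic MRL pattern for feeding an outside device's data into a Python script is a subscribe/callback pair. A rough sketch, assuming your build includes a LeapMotion service that publishes through a method named publishLeapData (both names should be verified against your MRL version):

# start the Leap Motion service (the service type name is an assumption)
leap = Runtime.createAndStart("leap", "LeapMotion")

# route every publishLeapData event from "leap" into this Python script
python.subscribe("leap", "publishLeapData")

# MRL delivers subscribed publishX events to a matching onX callback
def onLeapData(data):
    print(data)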

I tried to make things run, but I do not understand how I can add another voice besides the "cmu-slt-hsmm" that is available. Trying to select "dfki-pavoque-neutral" as the voice throws the exception "no such voice".
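
For reference, voice selection on the MarySpeech service usually looks like the sketch below; the service name "mouth" is only an example, and "dfki-pavoque-neutral" will only be accepted if that voice's files are actually installed alongside MaryTTS:

# start a MarySpeech instance (the name "mouth" is only an example)
mouth = Runtime.createAndStart("mouth", "MarySpeech")

# list the voices MaryTTS actually knows about; if "dfki-pavoque-neutral"
# is missing from this list, selecting it will throw "no such voice"
print(mouth.getVoices())

# select one of the listed voices
mouth.setVoice("cmu-slt-hsmm")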

Here is a quick guide for extracting satellite mapping information via a simple ESP32 web server.

Requirements:

(1) Espressif's ESP-WROOM-32

Setup

Using the InMoov service

The IntegratedMovement service has been added to the InMoov service and will load default settings.

The command to start the service is

InMoov.startIntegratedMovement()

If the range of your servos does not match the default settings, you can adjust the limits using

InMoov.IntegratedMovement.setMinMaxAngle(partName, minAngle, maxAngle), where partName is the same name as the servo that moves that part. A short example follows.
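
A minimal Jython sketch of this setup. The instance name "i01", the part name "mtorso", and the angle values are only examples; it also assumes startIntegratedMovement() returns the started service, as most InMoov start* methods do:

# start an InMoov instance (the name "i01" is only an example)
i01 = Runtime.createAndStart("i01", "InMoov")

# start IntegratedMovement with its default settings; we assume here that
# the start* method returns the service handle
im = i01.startIntegratedMovement()

# adjust the limits for one part if your servo's range differs from the
# defaults ("mtorso", -10 and 10 are placeholder values)
im.setMinMaxAngle("mtorso", -10, 10)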

 

Using IntegratedMovement as a standalone service

IntegratedMovement can be used with any robot. You will need to define the DH parameters for each joint before you can use it.

The first thing you must do is declare a new DHRobotArm with setNewDHRobotArm(armName). This creates a new thread that will command the arm or the robot.

Then you must set up the DHLinks for each joint of the arm. The DHLinks define how a joint moves relative to the previous joint. The DH parameters are explained in this video:

https://www.youtube.com/watch?v=rA9tm0gTln8

If you want the same coordinate system for all of your arms, they must all start with a common link originating at the same point.

To set the DHLinks, you must use one of these methods:

  • setDHLink(armName, linkName, d, theta, r, alpha)
  • setDHLink(armName, servo, d, theta, r, alpha)
    • d, theta, r, and alpha are the DH parameters (see the video)
    • the first version uses a string as linkName. It makes a static link that does not rotate, but is still taken into account by the inverse kinematics algorithm
    • the second version initializes the link with information found in the Servo service, such as the min/max limits. It also establishes communication between the DHRobotArm and the Servo, so that when the servo moves, the DHRobotArm is updated about the movement

Finally, you start the DHRobotArm thread with startEngine(armName). A sketch of the whole sequence follows.
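
A minimal Jython sketch of this sequence; the names "ik", "myArm", "base", and "shoulder" are only examples, and the DH values are placeholders you must replace with measurements from your own robot:

# start the standalone service
ik = Runtime.createAndStart("ik", "IntegratedMovement")

# declare the arm; this creates the thread that will command it
ik.setNewDHRobotArm("myArm")

# a static link named by a string: it does not rotate, but the inverse
# kinematics and collision code still account for it (placeholder DH values)
ik.setDHLink("myArm", "base", 100, 90, 0, -90)

# a link driven by a servo: min/max limits come from the Servo service, and
# the arm is updated whenever the servo moves (placeholder DH values)
shoulder = Runtime.createAndStart("shoulder", "Servo")
ik.setDHLink("myArm", shoulder, 80, 0, 200, 90)

# start the arm's thread
ik.startEngine("myArm")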

 

You can also set up physical objects that the inverse kinematics engine will take into account to avoid collisions between the defined objects.

To add a virtual object to the environment of your robot, use

  • addObject(OriginX, OriginY, OriginZ, EndX, EndY, EndZ, objectName, radius, jmeRender)
    • objects are all modeled as cylinders, so you must define an origin point and an end point (both relative to your first DHLink's origin) and a radius. Units must be the same as those used to define the DH parameters
    • jmeRender is a boolean (True/False) that makes the object appear (render) in the JmeApp if you are using it
    • objects set this way are static and cannot be moved

To register your DHLinks themselves as objects, use

  • addObject(objectName, radius, jmeRender)
    • objectName must be the same name as the DHLink it is linked to
    • jmeRender is a boolean (True/False) that makes the object appear (render) in the JmeApp if you are using it
    • the inverse kinematics engine updates the position of the object, so you don't have to give origin and end points

The method objectAddIgnored(objectName1, objectName2) makes the inverse kinematics engine ignore collisions between the two named objects. This may be necessary if two objects are so close that the engine detects them as colliding. A short example follows.
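
Continuing the sketch above, with placeholder names, coordinates, and radii (all in the same units as the DH parameters):

# a static obstacle, modeled as a cylinder in front of the robot
ik.addObject(-200.0, 400.0, 0.0, 200.0, 400.0, 0.0, "table", 50.0, True)

# register the "shoulder" DHLink as a moving object; the name must match
# the DHLink, and the engine will keep its position up to date
ik.addObject("shoulder", 40.0, True)

# ignore collisions between two objects the engine would otherwise flag
ik.objectAddIgnored("base", "shoulder")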

The Kinect can be used to 'see' the objects in front of the robot. You can start the Kinect with startOpenNI(). The Kinect does not read the environment in real time yet; you must ask it to take a snapshot with processKinectData(), as sketched below.
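
Continuing the sketch above, assuming an OpenNI-compatible Kinect is attached:

# start the Kinect
ik.startOpenNI()

# take a snapshot of the environment (it is not updated in real time yet)
ik.processKinectData()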

 

Example

[[/service/IntegratedMovement.InMoov.py]]

Javadoc link

IntegratedMovement is a service that uses inverse kinematics to control the position of a robot's arms. Instead of moving each joint separately, you give a position in 3D space and the arm will take the pose it needs to reach that position.
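
Such a move is then a single call. A hypothetical sketch; the moveTo(armName, x, y, z) signature is an assumption here, so check the Javadoc linked above for the actual method:

# ask the arm to reach a point in 3D space; the coordinates are placeholders
# in the same units as the DH parameters
ik.moveTo("myArm", 300, 150, 200)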

Hi everybody,

The service is out of order.

Adding copyright and an empty sound file.

 

 

 


Hi !

I've just updated my Eclipse MRL sources and built a fresh myrobotlab.jar. I've changed the start argument to SwingGui and all works well. But I see some errors in the log at the bottom of the Python tab. There is no red band at the bottom of the GUI's window.

I don't know if it's important, but here is what I saw (the script is sweety.py with a virtual Arduino):

_______________________________________________________________