ProgramAB Gestures

Posted by rekabuk

I've put aside MouthControl for the moment and tried to get my InMoov bot to respond to my voice commands.

I have the "gestures.aiml" file in the aiml directory and MRL seems to be loading it. But when I speak the command, my words are correctly recognised but the bot does not respond. With either the action or the template speech. I found a forum message saying  I need to say "ENTER GESTURES" to get into the right mode, but this didn't work either.

Servo pins and configuration

Posted by astro

I was trying to connect the servos, but apparently I cannot set pins other than the InMoov defaults for some parts; they seem to be hardcoded.

For example, the right arm does not take the changes:

I set the right bicep to pin 38, but it stays on pin 8.
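As a workaround while the config handling is looked at, the pin can usually be set from a script instead of the InMoov config. A minimal sketch with a plain Arduino + Servo setup (the port, service names and pin are examples, and the exact attach() call differs between MRL versions):

    # minimal sketch; port, service names and pin number are examples
    arduino = Runtime.start("arduino", "Arduino")
    arduino.connect("COM3")

    bicep = Runtime.start("bicep", "Servo")
    bicep.attach(arduino, 38)   # pin 38 instead of the InMoov default 8
    bicep.moveTo(90)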

Servo pins fixed

 

IplImage stream from Java

Posted by MikeTalmage.mt

I've got a working attention system that I'm trying to merge with MyRobotLab. It needs IplImages from OpenCV, and the whole setup is written in C++. I need to stream the left and right eyes, so I need to know how to stream to a C++ program.
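Not MRL-specific, but one common way to get frames from one process into a C++ program is to push the raw pixel buffers over a TCP socket and rebuild them as cv::Mat / IplImage on the other side. A standalone sketch of the sending side (host, port and camera index are made-up examples):

    # standalone sketch: send raw BGR frames over TCP;
    # host, port and camera index are placeholders
    import socket
    import struct

    import cv2

    cap = cv2.VideoCapture(0)                    # e.g. the left-eye camera
    sock = socket.create_connection(("127.0.0.1", 5005))

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        h, w, c = frame.shape
        payload = frame.tobytes()                # 8-bit BGR, row-major
        # fixed header: height, width, channels, payload length
        sock.sendall(struct.pack("!IIII", h, w, c, len(payload)) + payload)

On the C++ side, the reader parses the same header, receives the payload, and wraps the bytes with cv::Mat(h, w, CV_8UC3, buffer), from which an IplImage can be obtained if the attention system needs that type.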

sync multiple movements

Posted by juerg

I remote control MRL from a Python task using MRL's REST API.

I've seen that "moveToBlocking" also works over the REST API, so my program waits for the movement to finish - GREAT!!

But how can I initiate multiple movements and wait for the last one to finish?
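One approach that relies only on what already works is to issue each moveToBlocking call from its own thread and join them all: each request returns when its servo arrives, so once every thread has joined, the last movement has finished. A minimal sketch, assuming the usual http://host:8888/api/service/<name>/<method>/<args> URL pattern and example servo names; adjust both to your setup:

    # minimal sketch; the REST URL pattern and servo names are assumptions
    import threading

    import requests

    BASE = "http://127.0.0.1:8888/api/service"

    def move_blocking(servo, position):
        # each call returns only when that servo has reached its target
        requests.get("%s/%s/moveToBlocking/%s" % (BASE, servo, position))

    moves = [("i01.head.rothead", 30), ("i01.rightArm.bicep", 90)]
    threads = [threading.Thread(target=move_blocking, args=m) for m in moves]

    for t in threads:
        t.start()
    for t in threads:
        t.join()          # returns once every movement has finished
    print("all movements done")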

32Bit Lenovo Pad Issues

Posted by SBorne

I have MRL running on my laptop and my pad; both are 64-bit and both work. I found a Lenovo pad that is 32-bit and can't seem to get it to work with MRL. It says it needs a 64-bit jm3 file. Any help would be great.

javax.net.ssl.SSLHandshakeException In Java

Posted by Kakadu31

To get the sraix functionality working again, I was playing with the metasearch service searx's API. I made a Maven project (yeah, I guess) in Eclipse, managed to get the right HTTP libraries to do the GET request, and got an OK from the test link that comes with the documentation of org.apache.http.HttpEntity. But when I try to hit the searx API, I get a javax.net.ssl.SSLHandshakeException.
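An SSLHandshakeException from the JVM usually means the Java truststore does not contain the certificate authority of the searx instance, or there is a TLS version mismatch, rather than anything being wrong with the request itself. To rule out the endpoint, the same query can be tried outside the JVM; a sketch, assuming a searx instance that has the JSON output format enabled (the instance URL is a placeholder):

    # sketch: query a searx instance's JSON API outside the JVM;
    # the URL is a placeholder and the instance must allow format=json
    import requests

    resp = requests.get(
        "https://searx.example.org/search",
        params={"q": "myrobotlab", "format": "json"},
    )
    resp.raise_for_status()
    for result in resp.json()["results"][:3]:
        print(result["title"], "-", result["url"])

If this succeeds but the Java call still fails, importing the instance's certificate chain into the JVM truststore (or pointing the JVM at a truststore that has it) is the usual fix.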


Yolo With Myrobotlab

Posted by Bretzel_59

Thanks to Dom and Anthony, here is what I've done:

 

 

Ask the robot what he can see: capture a picture with OpenCV, launch a YOLO detection on the picture, and export the results to a text file. The robot can now just read the text to say what he can see :D
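For anyone wanting to reproduce the idea, here is a rough standalone sketch of that pipeline using OpenCV's DNN module (the model files, camera index, 0.5 confidence threshold and output path are all example choices, not necessarily what Bretzel_59 used):

    # rough sketch of the capture -> YOLO -> text file pipeline;
    # model files, camera index, threshold and output path are examples
    import cv2
    import numpy as np

    net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
    with open("coco.names") as f:
        classes = [line.strip() for line in f]

    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    cap.release()

    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                                 swapRB=True, crop=False)
    net.setInput(blob)
    outputs = net.forward(net.getUnconnectedOutLayersNames())

    # collect the class names of all detections above the threshold
    seen = set()
    for output in outputs:
        for detection in output:
            scores = detection[5:]
            class_id = int(np.argmax(scores))
            if scores[class_id] > 0.5:
                seen.add(classes[class_id])

    with open("yolo_result.txt", "w") as f:
        f.write("I can see " + ", ".join(sorted(seen)))

The robot side then only has to read yolo_result.txt and hand the line to its speech service.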


MrlComm arduino compatibility

Posted by Bretzel_59

I've just tried to compile MrlComm for an Arduino Leonardo and here is the result: