Get 'er Done

Posted by GroG

Distributed Teleoperated MRL Begins .... (sort of)

Posted by GroG

This was a first, so I wanted to share it! I connected from a local instance of MRL to the online demo .. BOOYA! - it connected! I wanted to see if I could create a Joystick service .. it worked! And the demo site gave me a select option listing my local hardware .. unfortunately I forgot my Joystick :(


Posted by Avi


I've managed to run the WebGui as mentioned in the GitHub examples for the WebGui.

But I don't understand how to launch the Angular WebGui as seen in this video and this tutorial (it only shows how to run it). In it I've found "api-type must be (services | messages), please refer to for details", yet the link is down...
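
For reference, a minimal sketch of starting the WebGui from an MRL Python script (the port and the API URLs in the comments are assumptions based on the default setup and may differ between builds):

webgui = runtime.createAndStart("webgui", "WebGui")

# The Angular UI should then be reachable at http://localhost:8888/
# and the HTTP API is selected by the api-type part of the URL, e.g.
#   http://localhost:8888/api/services/runtime   <- "services" api-type (REST-style calls)
#   http://localhost:8888/api/messages            <- "messages" api-type (message framing)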

Headless head....

Posted by Mats

Today I tested using the RemoteAdapter service for the first time. I have a Raspberry Pi mounted in the InMoov robot head and an Arduino UNO connected to it. It also has a USB WiFi dongle so that I can communicate with it.


I installed MRL on the Raspberry Pi and created a Python script.

The script first starts all the services that I want to use, like the Arduino and several servos. Then, as the last line, I added:

HeadRPI = runtime.createAndStart("HeadRPI","RemoteAdapter")
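
For context, a rough sketch of what such a script could look like (the serial port, pin number and service names are placeholders, and the attach call may differ between MRL builds); only the last line is the one quoted above:

arduino = runtime.createAndStart("HeadArduino", "Arduino")
arduino.connect("/dev/ttyACM0")            # Arduino UNO plugged into the Raspberry Pi

jaw = runtime.createAndStart("jaw", "Servo")
jaw.attach(arduino, 26)                    # attach the servo to a pin on the UNO

# ... more servos / services ...

# last line: make this instance reachable from other MRL instances on the network
HeadRPI = runtime.createAndStart("HeadRPI", "RemoteAdapter")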


hand / arm graphic control

Posted by Avi


I'm VERY new to MRL.

What I would like is to see an image of a hand/arm and manipulate it in the image (2D or 3D, both are OK for me) so that the robot moves in the same way.

Does such a thing exist?

Thank you.

ProgramAB in webgui appears different to Swing interface

Posted by bensonofjohn

I'm probably doing something wrong, but when I run the script below the responses from ProgramAB in the WebGui are different from the ones in the Swing interface. The Swing interface is my bot Ozy; the WebGui one just gives me default responses.
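
Since the script isn't reproduced here, below is only a minimal sketch of that kind of setup (the (userName, botName) arguments to startSession are assumptions); one thing worth comparing is whether both GUIs end up talking to the same named session:

ozy = runtime.createAndStart("ozy", "ProgramAB")
ozy.startSession("default", "Ozy")         # (userName, botName): load the Ozy bot, not the default one

webgui = runtime.createAndStart("webgui", "WebGui")

# quick sanity check straight from the script, independent of either GUI
print ozy.getResponse("Hello")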

MaryTTS - multi language support

Posted by MaVo

1. MaryTTS and its support for languages

1.1. General

MaryTTS is an open-source, multilingual Text-to-Speech Synthesis platform written in Java. (BTW: Does it have a service page?)

It supports many languages, each with its own set of voices.

Important: A language is NOT the same thing as a voice. To function properly, both the language and the voice files are required! A language may have several associated voices.

A language file is just a .jar, e.g. "lib/marytts-lang-de-5.1.2.jar".
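
As a small illustration, a sketch of selecting a German voice from an MRL Python script ("bits1-hsmm" is one of the German MaryTTS voices; the service and method names are assumptions and may differ between builds):

mouth = runtime.createAndStart("mouth", "MaryTTS")
mouth.setVoice("bits1-hsmm")               # a German voice: needs both marytts-lang-de and the voice files installed
mouth.speakBlocking("Hallo, ich spreche Deutsch.")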

The "new" Joystick Service

Posted by Kakadu31

I am writing here because I think it's longer-lasting than the shoutbox and could be helpful for others:

The script with the new build:
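
(The script itself isn't reproduced here; the lines below are only a rough sketch of how the new Joystick service is typically wired up from Python, and the controller index, subscription and callback names are assumptions that may vary per build.)

joystick = runtime.createAndStart("joystick", "Joystick")
print joystick.getControllers()            # list the controllers MRL can see
joystick.setController(0)                  # pick the first one (index is a guess)

# route joystick events into this Python script
python.subscribe("joystick", "publishJoystickInput")

def onJoystickInput(data):
    # data carries the id of the axis/button and its current value
    print data.id, data.value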

eyes following hand

Posted by juerg

Given that MRL knows all my movement commands and servo angles - it should theoretically be possible to set the eyes/head to follow or point to the left or right hand position? Or is this already available?
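
As a toy illustration of the idea only: if the hand and the head were known in a common coordinate frame, the pointing angles could be derived with basic trigonometry (all coordinates, offsets and servo names below are hypothetical; the real InMoov geometry would need the actual joint kinematics):

import math

# hypothetical hand and head positions in a shared frame (cm): x right, y up, z forward
hand = (25.0, -10.0, 40.0)
head = (0.0, 30.0, 0.0)

dx = hand[0] - head[0]
dy = hand[1] - head[1]
dz = hand[2] - head[2]

yaw   = math.degrees(math.atan2(dx, dz))                        # left / right
pitch = math.degrees(math.atan2(dy, math.sqrt(dx*dx + dz*dz)))  # up / down

# hypothetical servos, assuming 90 degrees means "straight ahead"
headYaw   = runtime.createAndStart("headYaw", "Servo")
headPitch = runtime.createAndStart("headPitch", "Servo")
headYaw.moveTo(int(round(90 + yaw)))
headPitch.moveTo(int(round(90 + pitch)))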