Merry Christmas, Happy Holidays, Goodbye & Good Luck Manticore !


Manticore Released !

Merry Christmas & Happy Holidays ...  

Thanks to all the many good elves who made this happen.

The community here is a wealth of learning, fun and friendship.

With a boatload of fixes and features, we send Manticore on its way.  Goodbye & Good Luck, Manticore !

And now that we've trained the auto-testing elves, we can start working on big changes for the better in the new release !

Some of the upcoming "big" changes are:

  • Maven & Ivy dependency Management (WOOHOO!!!)
  • Agent managed upgrades with rollback capability !
  • More AI !
  • More Virtual Thingies !
  • Autonomous Navigation !
  • And lots more stuffs !


P.S. - Heh, it's going to take me a bit to get it all shipshape - but the merges have taken place. The master branch is the same as develop .. the planets have aligned, just some bookkeeping, posts, and Jenkins shenanigans which still have to take place .. oh, and some testing on the "master" branch .. :D

As soon as that is all figured out ... On to Nixie release !!!

hairygael:

Brrrr, exciting!!!

To enhance version 1.0.107, which had the above short list (posted by GroG) working, I hope to get the joy of using:

-voice control with programAB (presently working with various Speech options)

-webkitrecognition (presently working with Chrome)

-face recognition (presently working, slows down the fps a lot, a bug with populating filters in OpenCV)

-neopixelring (presently working with USB, not with RX/TX as far as I tried, and I really tried)

-sensors for fingers?

-kinematics? (I tried with the joystick from kwatters' script, but the result was not efficient; old joystick maybe?)

calamity:

The RX/TX was working the last time I tried it, but it's been a while. I currently use the neopixel for another project and have not used it for a while. I will replug it soon to test it.

About kinematics: kwatters did an awesome job with the InverseKinematics3D service. I used most of his code for the IntegratedMovement service. When I tried it, I had some difficulty too. I think it's mostly because it has some hardcoded settings to convert the IK angle to servo position, and in some positions it can generate a null pointer exception.

I have worked a lot on the IntegratedMovement service recently, and I'm beginning to be quite happy with the result. I will do a demonstration with the InMoov soon, but I need to fix a problem with one of the arms first.



kwatters:

nice short list.

So, a couple of comments.

1. We need, at a minimum, a set of example scripts that create an arduino, attach a servo, and move it.  This script needs to run as a unit test that will break the build if the script doesn't run (on a virtual arduino).

2. We need an example script that starts up a minimal InMoov (minus the WebGui and Swing GUI) in a unit test using the virtual arduino.  This unit test must fail if the syntax of the Python script is invalid.  We should also try to exercise some of the InMoov code itself in this unit test.

3. I'm not sure about being able to test the neopixel, and I think the IK and integrated movement services need more work, so we shouldn't officially support those in this release.  Perhaps next release?

4. Webkit, ProgramAB, and InMoov should all be no problem.  Even face recognition - I'm pretty confident that we can get all of that officially supported in this release.
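The syntax-validation part of point 2 can be sketched as a tiny plain-Python helper (a minimal illustration, not the actual MRL test harness; the script body and its service calls are assumptions for the example - we only check that the script parses, we don't execute it):

```python
def check_script_syntax(source, name="script.py"):
    # compile() parses the source without executing it; it raises
    # SyntaxError on invalid code, which would fail the unit test
    # and therefore break the build
    compile(source, name, "exec")
    return True

# A minimal script in the spirit of point 1 (service names are
# illustrative, not a guaranteed MRL API)
MINIMAL_SCRIPT = """
arduino = Runtime.start("arduino", "Arduino")
servo = Runtime.start("servo", "Servo")
arduino.connect("COM99")
servo.attach(arduino, 3)
servo.moveTo(90)
"""

assert check_script_syntax(MINIMAL_SCRIPT, "minimal_inmoov.py")
```

Wrapped in a JUnit or unittest case, this gives the "break the build on bad syntax" behavior without needing hardware.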

...  Now, the question is, where do those scripts come from?  We have the InMoov repo, which contains a lot of stuff (probably more stuff than we need to / care to test with each build).

So, I recommend we get a couple of minimal Python scripts together, add them as unit tests, and make sure they run with every build and validate their syntax.  Once that test infrastructure is in place (and the scripts are worky), then I think it'd be ok to release.

I've been working on updating the unit tests a bit.  The ArduinoTest (and the Arduino2Test) are good examples of how to use the virtual arduino in a unit test.

Finally, as part of the release we need to / should publish both the javadocs and the JaCoCo code coverage reports.  The javadocs will help with documentation, and they will be in sync with the actual code if they're bundled with the release.  The JaCoCo coverage reports will let people know what is tested with each release by the auto-testing elves.


Then...  Unleash the Manticore!


calamity:

I believe the Neopixel service is working fine and can be officially supported. What seems to be broken now is the ability to daisy-chain Arduinos. That is not a necessity for the Neopixel service; it just saved a USB port. I will look at and test what is not working.

For the IntegratedMovement service: it is not ready for release. While it's in a working state, it needs more polishing and documentation.



kwatters:

Virtual arduino testing with the neopixel

If we can unit test the neopixel using the virtual arduino .. then ok, we can support it. That means a unit test to make sure the service is worky (at least with the virtual arduino service).

GroG:

WOOT ! Bring the Beast !

I'll be working on OpenCV, which includes unit tests, as soon as the interfaces are defined (working on that too) - then I'll switch to Tracking's interfaces and tests ..

- oh and a variety of backend infrastructure updates in the copious spare time which remains ;)

hairygael:

I can test the minimal script for head, arm, hand, torso with the latest release.


I need to add velocity and maybe other things to make it worky, and switch detach to disable...

GroG:

So I have to ask :)

On the short list is:

  • packaged self-contained InMoov scripts inside of myrobotlab.jar (thanks Moz4r!)

It would be a really big improvement to auto-package worky InMoov scripts with every build and every release...

What scripts are we going to package ?

Are they going to be the ones in the InMoov repo, or ones in your home directory?

Also, just to be clear - the scripts selected will forever be auto-tested every build, but future changes need to go on the "develop" branch, and we can have a release process which auto-moves the latest to the "master" branch.

Make sense ?  I want to make very sure we all know where the scripts which will be auto-packaged and guaranteed worky come from.


Ash:

Hi All,

Short but very exciting list ;-p

Thanks a lot for your work !

moz4r:

Impressive beast !

The service scripts will be a good thing to test, I think. They are a great raw reference.

harland:

Be happy to help

Azul would be happy to help with testing on a physical InMoov.  I have the time and a complete shop for rebuilding Azul if needed (ready to give its plastic body to advance MRL).  I am much better at building than at software.  I agree with Gael that we need code to support finger sensors to make the hands more useful.  Since some InMoovs have wheels, maybe some thought toward commands for “driving” them around.

GroG:

Thanks Harland,
It's great to have your and Azul's expertise !!!

This is an initial post to get all the elves heading in the right direction ... there is much work to do, but I and the rest of the elves would greatly appreciate any robot-test-time when we are in "pre"-release mode ...

It's still a bit early.. stay tuned !

And yes I totally agree on the platform service !

juerg:

Looking through some of the files, I see e.g.:

ear.addCommand(u"disconnect head", "i01.head", "detach")

Does this have to be changed to "disable" instead of "detach"?
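One way to handle such a rename in existing voice-command tables, sketched in plain Python (the helper and rename table are hypothetical; the tuples mirror the ear.addCommand(phrase, service, method) arguments, and only the detach → disable rename mentioned in this thread is assumed):

```python
# Hypothetical migration helper: rewrite old method names in
# (phrase, service, method) command tuples.
RENAMED_METHODS = {"detach": "disable"}

def migrate_commands(commands):
    """Return a new command list with renamed methods applied."""
    return [(phrase, service, RENAMED_METHODS.get(method, method))
            for phrase, service, method in commands]

old_commands = [(u"disconnect head", "i01.head", "detach")]
new_commands = migrate_commands(old_commands)
# new_commands[0] == (u"disconnect head", "i01.head", "disable")
```

Commands whose method is not in the table pass through unchanged, so the same sweep can run safely over a whole script.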

If I want to add German ("ge"), can I start to work on that based on the English version?

And can you add to your list that I get a chance to get my German MarySpeech working?


The voice gets loaded, but it looks like NaturalReaderSpeech kicks in and ignores my German voice.


juerg:

Another small issue (2132): after "install all", the restart does not terminate MRL (Win10/64).

Also, trying to close the window does not terminate it;

I have to use Task Manager to kill the Java process.

juerg:

I understand that we now have an inmoov folder within the MRL version which has a lot of definitions, some also related to the user's individual environment, e.g.


which lets me start a vinmoov, set the language, and specify the COM ports for the Arduinos.

So each time I download a new MRL version, will I need to merge my own version of that file with the newly downloaded version?
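One way to avoid re-merging by hand each release, sketched under the assumption that the personal settings (COM ports, language) could live in a small override applied on top of the freshly downloaded defaults (the field names and structure here are made up for illustration, not the actual InMoov config format):

```python
def apply_overrides(defaults, overrides):
    # keep everything from the newly downloaded defaults,
    # then layer the user's personal values on top
    merged = dict(defaults)
    merged.update(overrides)
    return merged

shipped = {"language": "en", "leftPort": "COM1", "rightPort": "COM2"}
mine = {"language": "de", "leftPort": "COM7"}   # user-specific values
config = apply_overrides(shipped, mine)
# config == {"language": "de", "leftPort": "COM7", "rightPort": "COM2"}
```

The user file then only contains what differs from the defaults, so new settings added in a release show up automatically without a manual merge.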

I am on Windows, so case in folder/file names does not matter, but I see 3 different cases used to refer to InMoov, e.g.:

inmoov... (all lower case)

Won't this cause trouble on Linux systems?

moz4r:

Call relayed to French friends for a report.



GroG:

Thanks Moz4r ! ..

It looks like tracking/OpenCV is messed up .. I'm looking into that now..
Also, Gael, DiyServo is noWorky.

Do you think it's worth syncing up the InMoov service file with the InMoov repo file inmoov/InMoov/ ??

hairygael:

Related to the InMoov forum:!topic/inmoov/DfG-pOJhZAg

GroG:

That's great, Thanks

GroG:

Now that I have it running in the debugger I can't seem to get it to fail.

I alternate between "FACE DETECTION" and "STOP SEARCH" and still get 28~34 fps; I did this 8 or so times.

The CPU is around 26~30% running face detect.

When the filters are unloaded, the video freezes, but this is expected behavior.  I'll try the jar again to see if I can get it to misbehave.  But in Eclipse at the moment, no problems :P

This is running with the latest jar.  All worky, 30 fps.

GroG:

Running with jar develop 1.0.2644 - VideoInput frame grabber - no filters .. 32 fps

moz4r:

diy servo ( motordualpwm )

Time to test diy servo

edit: test board ready; blackout on arduino pin output, motorPowerMapper is guilty

edit: WORKY ! Polished the SwingGui map things / setInverted for servo & diyservo. Mappings are now saved inside the json as well (maybe too many things for small screens). Disabled things are not yet implemented.

Guys, about velocity: a potentiometer will do the job - any ideas? That is great because we will have a real velocity, with or without load.



GroG:

Cool-looking UI, moz4r.

I'm not sure what you're asking, but in the general case, relative & absolute encoders could both be used for speed control/velocity.

A rotating potentiometer could work by measuring the difference between successive samples, which could describe velocity.
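That sampling idea can be sketched as a simple finite difference over successive potentiometer readings (a minimal illustration; the helper name and all numbers are made up, and a real servo loop would also need to filter ADC noise):

```python
def estimate_velocity(samples):
    """samples: list of (timestamp_s, position_deg) tuples.
    Returns the velocity (deg/s) between each pair of
    consecutive samples: delta position / delta time."""
    return [(p1 - p0) / (t1 - t0)
            for (t0, p0), (t1, p1) in zip(samples, samples[1:])]

# three pot readings taken one second apart
readings = [(0.0, 10.0), (1.0, 15.0), (2.0, 21.0)]
print(estimate_velocity(readings))  # → [5.0, 6.0]
```

Because the estimate comes from the measured position, it reflects the real velocity with or without load, which is the point moz4r raised above.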

hairygael:

I don't know if others are encountering this issue with Virtual InMoov (MRL version 1.0.2658).

Occasionally, when launching MRL, VirtualInMoov will throw this error:

the_Z0MBiE:

MRL 1.0.2667

I too get a similar error. I then shut down MRL and restart it; Virtual InMoov works fine the second time.

Ash:

Merry Christmas

Thanks to all the team: Manticore is my favorite Christmas present ! ;-))

moz4r:

Nixie is my new wallpaper

the_Z0MBiE:

Autonomous Navigation

Thanks for the hard work you put into MRL. I have a question.

Will Autonomous Navigation work with a monocular camera, or will it require systems like the Kinect?

Thank You and Merry Christmas.