First try with Nvidia Jetson Nano and Manticore

Hello community,
First of all, I am an absolute newcomer to Linux, which doesn't make things any easier. But at some point you have to face new challenges.

I have a little (big) problem. Since Saturday I've been trying to get InMoov to run on a Jetson Nano. I've at least got it to the point where the Swing GUI and Web GUI start and the chatbot works. But when the connection to the Arduinos is supposed to be made, only the error message below appears. If I start MyRobotLab on its own, without all the InMoov runtimes, no Arduinos are found when I start an Arduino runtime. In the Arduino IDE, the Arduinos are found and displayed correctly.


Traceback (most recent call last):
  File "<string>", line 41, in <module>
  File "InMoov/system/", line 49, in <module>
    execfile(RuningFolder+"services/"+filename.encode("utf8"))
  File "/home/nvidia/Dokumente/myrobotlab/myrobotlab.1.0.2693.16/InMoov/services/", line 23, in <module>
    right.serial.usedByInmoov=True
AttributeError: 'NoneType' object has no attribute 'usedByInmoov'
    at org.python.core.Py.AttributeError(
    at org.python.core.PyObject.noAttributeError(
    at org.python.core.PyObject.object___setattr__(
    at org.python.core.PyObject.__setattr__(
    at org.python.pycode._pyx51.f$0(/home/nvidia/Dokumente/myrobotlab/myrobotlab.1.0.2693.16/InMoov/services/
    at org.python.pycode._pyx51.call_function(/home/nvidia/Dokumente/myrobotlab/myrobotlab.1.0.2693.16/InMoov/services/
    at org.python.core.Py.runCode(
    at org.python.core.__builtin__.execfile_flags(
    at org.python.core.__builtin__.execfile(
    at org.python.core.__builtin__.execfile(
    at org.python.core.BuiltinFunctions.__call__(
    at org.python.core.PyObject.__call__(
    at org.python.pycode._pyx4.f$0(InMoov/system/
    at org.python.pycode._pyx4.call_function(InMoov/system/
    at org.python.core.Py.runCode(
    at org.python.core.__builtin__.execfile_flags(
    at org.python.core.__builtin__.execfile(
    at org.python.core.__builtin__.execfile(
    at org.python.core.BuiltinFunctions.__call__(
    at org.python.core.PyObject.__call__(
    at org.python.pycode._pyx2.f$0(<string>:76)
    at org.python.pycode._pyx2.call_function(<string>)
    at org.python.core.Py.runCode(
    at org.python.core.Py.exec(
    at org.python.util.PythonInterpreter.exec(
    at org.myrobotlab.service.Python$
I'm using the latest Manticore build, with Java 1.8.0_282.
I have already tested Nixie: if I open an Arduino runtime there, the Arduino is recognized. But if I load the InMoov runtimes, the Arduino is not displayed for selection there either.
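On Linux, two quick checks often explain a board that the Arduino IDE sees but another program doesn't: whether the serial device node exists at all, and whether the current user is allowed to open it. A minimal sketch, assuming Ubuntu-style device names (`/dev/ttyACM*`, `/dev/ttyUSB*`) and the `dialout` group convention:

```shell
# 1) Is the board visible as a serial device at all?
ports=$(ls /dev/ttyACM* /dev/ttyUSB* 2>/dev/null)
echo "${ports:-no serial devices found}"

# 2) Can the current user open serial ports? On Ubuntu-based systems
#    serial devices belong to the 'dialout' group.
if id -nG | grep -qw dialout; then
  perm="user is in dialout group"
else
  perm="user is NOT in dialout group"
  # typical fix (log out and back in afterwards):
  # sudo usermod -aG dialout "$USER"
fi
echo "$perm"
```

If the device exists and permissions are fine, the problem is more likely inside the application than on the OS side.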


kwatters

nixie for the nano.

Manticore has little chance of working on the Jetson Nano.

I have tested it, and Nixie works on the Jetson Nano. Download and try the "latest" build of Nixie.

pepper

That sounds great and gives me hope :-)

I already have the latest version of Nixie installed. What would the rest of my setup have to look like for Nixie to work with the Jetson? When I start Nixie, it shows me a long red list of things that obviously cause errors or cannot be found, all of which seem to be related to Java. Which version of Java should I use? At the moment I use JDK 11.0.10.

webgui : InvocationTargetException - null : java.lang.reflect.InvocationTargetException
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(
    at java.base/java.lang.reflect.Method.invoke(
    at org.myrobotlab.service.WebGui.handle(
    at org.atmosphere.nettosphere.Config$Builder$1.onRequest(
    at org.atmosphere.cpr.AsynchronousProcessor.action(
    at org.atmosphere.cpr.AsynchronousProcessor.suspended(
    at org.atmosphere.nettosphere.BridgeRuntime$3.suspended(
    at org.atmosphere.container.NettyCometSupport.service(
    at org.atmosphere.cpr.AtmosphereFramework.doCometSupport(
    at org.atmosphere.websocket.DefaultWebSocketProcessor.dispatch(
    at org.atmosphere.websocket.DefaultWebSocketProcessor$
    at org.atmosphere.util.VoidExecutorService.execute(
    at org.atmosphere.websocket.DefaultWebSocketProcessor.dispatch(
    at org.atmosphere.websocket.DefaultWebSocketProcessor.invokeWebSocketProtocol(
    at org.atmosphere.nettosphere.BridgeRuntime.handleWebSocketFrame(
    at org.atmosphere.nettosphere.BridgeRuntime.handleMessageEvent(
    at org.atmosphere.nettosphere.BridgeRuntime.channelRead(
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(
    at io.netty.handler.codec.http.websocketx.extensions.WebSocketServerExtensionHandler.channelRead(
    at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(
    at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(
    at io.netty.handler.codec.ByteToMessageDecoder.callDecode(
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(
    at$HeadContext.channelRead(
    at$
    at io.netty.util.concurrent.SingleThreadEventExecutor$
    at io.netty.util.internal.ThreadExecutorMap$
    at java.base/
Caused by: java.lang.NoClassDefFoundError: org/datavec/image/transform/ImageTransform
    at java.base/java.lang.Class.forName0(Native Method)
    at java.base/java.lang.Class.forName(
    at org.myrobotlab.service.Runtime.getServiceTypeNamesFromInterface(
    ... 51 more
Caused by: java.lang.ClassNotFoundException: org.datavec.image.transform.ImageTransform
    at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(
    at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(
    at java.base/java.lang.ClassLoader.loadClass(
    ... 54 more

kwatters

did you install everything first?

I wonder if you're seeing an error because some libraries weren't installed as expected. I see the following in that stack trace (and no worky):

Caused by: java.lang.NoClassDefFoundError: org/datavec/image/transform/ImageTransform

This is a library used by deeplearning4j ... it should have been found if it was installed.

To install services on Nixie, make sure you run the following command first.

java -jar myrobotlab.jar --install

How are you starting Nixie from the command line? What command are you using to start it? You might need to set a larger-than-default heap size, to give MyRobotLab some extra memory to use, by adding "-m 3g" to the startup command:

java -jar myrobotlab.jar -m 3g

pepper


Hello kwatters,
unfortunately it took me a little longer to answer.
I reinstalled Nixie with
 java -jar myrobotlab.jar --install
The installation ran through to the end.
I now start Nixie from the console with
java -jar myrobotlab.jar
java -jar myrobotlab.jar -m 3g
also starts the WebGui, but I get the following error message in the console:
Error: Can't initialize nvrm channel
nvdc: open: Operation not permitted
nvdc: failed to open '/dev/tegra_dc_ctrl'.
Could this have something to do with the Kinect?
Nixie mostly works so far. I can start a service for an Arduino and also control a servo. OpenCV also works, although very slowly, but you can see a picture ;-)
However, if I start the i01 runtime, then start for example the runtime for the right arm and try to control the bicep with the Arduino, Nixie complains about a wrong MrlComm.ino:
MrlComm.ino responded with version 63 expected version is 67
There is also no USB port to choose from in the setup for the bicep, for example, although there is one in the previously created runtime for the Arduino.
I tried both the MrlComm version from the resource folder and a version downloaded again via Nixie. Same result.
kwatters

upload the MrlComm.ino that comes with the latest build

Your "mrlcomm" that is loaded onto the Arduino is old. You need to upload the latest version of MrlComm.ino to the Arduinos if you want it to work (as mentioned in your post: version 67).

It comes with the installation of MyRobotLab, in the resources/Arduino/MrlComm folder. (I don't know if the Arduino IDE works on the Nano, so I recommend uploading MrlComm from a Windows or Mac machine.)
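If a separate Windows or Mac machine isn't handy, the upload can also be done from the command line with arduino-cli. A minimal sketch, assuming an ATmega2560-based board on /dev/ttyACM0 (the FQBN and port are assumptions; adjust them to your hardware; the sketch path is the resources folder mentioned above):

```shell
# Assumed values - change to match your board and wiring.
SKETCH="resources/Arduino/MrlComm"
PORT="/dev/ttyACM0"
FQBN="arduino:avr:mega"

if command -v arduino-cli >/dev/null 2>&1; then
  # compile the bundled MrlComm sketch, then flash it to the board
  arduino-cli compile --fqbn "$FQBN" "$SKETCH"
  arduino-cli upload -p "$PORT" --fqbn "$FQBN" "$SKETCH"
  status="upload attempted via arduino-cli"
else
  status="arduino-cli not found; upload MrlComm.ino with the Arduino IDE instead"
fi
echo "$status"
```

After a successful upload, the version MrlComm reports should match the version the MRL build expects (67 in this thread).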

As for the "slowness": I'd recommend getting a separate USB SSD drive to run MRL from. Using the same filesystem as the root operating system gets really slow if there are many disk I/O writes to the device.

On my Jetson Nano,  I have a high speed USB thumbdrive plugged into one of the USB 3.0 ports on the Nano.  This is where I install MyRobotLab.  I found that it made a really big improvement in performance.

Glad to hear you're making progress!

pepper

Waiting for the USB Drive

OK, then I'll make sure I get the correct MrlComm onto the Arduinos. I'll load MrlComm onto the Arduinos from my Windows PC. The external SSD won't arrive until tomorrow; I hope a Samsung T7 500GB is the right choice.
I will drop the SD card completely and boot the entire system from USB. I already tried it yesterday with a Seagate Expansion STJD500400, but unfortunately that went wrong. As the saying goes: buy cheap, buy twice.
I also bought an IMX219-160 camera for the Jetson, which unfortunately gives nothing more than a green picture under OpenCV. Not so easy if you are a spoiled Windows user.
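For what it's worth, a green image from a CSI camera like the IMX219 usually points at the raw sensor stream bypassing the Jetson's image processor rather than a broken camera. A quick way to sanity-check the sensor from the command line, assuming a standard JetPack install (the resolution and framerate caps are assumptions for the IMX219):

```shell
# nvarguscamerasrc routes the CSI camera through the Jetson ISP, which
# debayers the image; reading the sensor as a plain V4L2 device does not.
PIPELINE="nvarguscamerasrc num-buffers=120 ! video/x-raw(memory:NVMM),width=1280,height=720,framerate=30/1 ! nvvidconv ! nvoverlaysink"

if command -v gst-launch-1.0 >/dev/null 2>&1; then
  # shows ~4 seconds of live video on the Jetson's display
  gst-launch-1.0 $PIPELINE
else
  echo "gst-launch-1.0 not found (it is expected on a Jetson with JetPack installed)"
fi
```

If this shows correct colors, the camera is fine and the green picture is a capture-pipeline problem on the application side.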
I also have to say a big thank you for your patience with me. In most forums a complete newbie is more likely to be put down than helped; that is absolutely not the case here. At the beginning the InMoov project is still quite easy to handle, but once you go deeper, you quickly reach your limits as a beginner. So a big fat thumbs up for your support. Keep up the good work.
rv8flyboy

Having the same error, which prevents the mic from working in Chromium

This error:

Error: Can't initialize nvrm channel
nvdc: open: Operation not permitted
nvdc: failed to open '/dev/tegra_dc_ctrl'.
As I don't have a Kinect, I don't think it's Kinect related.
Running Nixie on Java 11 on Jetson 4.5.