
Ahoy Kwatters!
I was wondering about the differences between these three, and the plan going forward:

OpenCV DL4J filter - Worky!

OpenCV Yolo filter - Worky!

Deeplearning4j service - has no UI in Swing or the web, but I can see the filter spawns the service.
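If it helps anyone trying these, this is roughly how I load the two filters from a Python script. The filter names ("Yolo", "DL4J") are from memory, so check them against the filter list in Nixie's OpenCV GUI:

    # start the OpenCV service and begin capturing
    opencv = Runtime.start("opencv", "OpenCV")
    opencv.capture()

    # add the deep-learning filters by name
    opencv.addFilter("Yolo")
    opencv.addFilter("DL4J")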

This is not so much a question as a passing on of a tip.

I have been working with Nixie on Windows 10 and have on various occasions needed a running tally of the log file. In Linux/Unix this is easy: just use the tail command and Bob's yer uncle, as they say.

That quest wasn't as straightforward in Windows, but Windows PowerShell (PS) comes to the rescue. PS is very powerful, but out of the box it comes with restrictions: to be able to run scripts, you have to tell PS that it is OK to trust local scripts. To do this, run PS in admin mode and execute the following command:
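The usual choice here is RemoteSigned, which lets local scripts run while still requiring anything downloaded from the internet to be signed:

    Set-ExecutionPolicy RemoteSigned

With that in place, PS can give you a running tail of the log. Assuming the log file is myrobotlab.log in the MRL install directory, this behaves like tail -f and keeps printing as new lines arrive:

    Get-Content .\myrobotlab.log -Wait -Tail 10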

Hi Folks

A quick introduction and some questions. My robot is called Kryten, after the humanoid robot in the Red Dwarf series. He is pretty much a standard Gael build, with the addition of the articulating stomach.

He stands on a mannequin leg set liberated from a local department store, which makes him, ooh, about 6 foot 8 inches tall.

For those of you who implemented Bob Houston's articulating stomach, how are you driving it from MRL?

Getting ready to hook up this servo pair, scratching head...
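In case a concrete starting point helps, here is a minimal sketch of how I'd expect the pair to be driven from an MRL Python script. The COM port, pin numbers and service names are placeholders, and the opposed angles are just a guess at how the stomach linkage moves:

    # connect an Arduino, then start one Servo service per stomach servo
    arduino = Runtime.start("arduino", "Arduino")
    arduino.connect("COM3")

    stomachLeft = Runtime.start("stomachLeft", "Servo")
    stomachRight = Runtime.start("stomachRight", "Servo")

    # attach each servo to its Arduino pin
    stomachLeft.attach(arduino, 26)
    stomachRight.attach(arduino, 27)

    # lean forward - the pair moves in opposition
    stomachLeft.moveTo(120)
    stomachRight.moveTo(60)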

Hi Folks

I have a standard HC-SR04 sensor (actually I have two), but looking at ultrasonic.py in the Nixie service scripts, I don't see a way of actually activating two individual sensors.

What is the correct way of testing this sensor to make sure it works?

In ultrasonic.py there is a print statement, print "ultrasonicSensor test is ", i01.getUltrasonicSensorDistance(), but I have not found where this is printed.
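For what it's worth, since each sensor runs as its own service, I would expect two of them to be set up like this. The pin numbers are placeholders, and the attach/startRanging calls are my reading of the UltrasonicSensor service, so worth verifying against Nixie:

    arduino = Runtime.start("arduino", "Arduino")
    arduino.connect("COM3")

    # one UltrasonicSensor service per physical HC-SR04
    front = Runtime.start("front", "UltrasonicSensor")
    rear = Runtime.start("rear", "UltrasonicSensor")

    # attach(controller, trigPin, echoPin)
    front.attach(arduino, 7, 8)
    rear.attach(arduino, 9, 10)

    # begin continuous ranging on both sensors
    front.startRanging()
    rear.startRanging()

As for the print statement, output from a Python script normally shows up in the Python service's console tab and in myrobotlab.log, as far as I know.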

 

I played a little with the Jetson Nano Hello AI World demo. It's impressive how fast it all goes. Now I'm wondering whether it would be possible to use the Jetson as a brain for object recognition, face recognition, etc. for InMoov. Instead of using OpenCV in MyRobotLab, the various data from e.g. detectNet would be fed into MyRobotLab by the Jetson.
Is that generally possible, and how would something like this have to be set up?
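One plausible wiring is to run detectNet on the Jetson and push each frame's detections to MyRobotLab over HTTP. In the sketch below, the jetson.inference calls come from the Hello AI World examples, but the MRL host/port and the onDetections endpoint are assumptions: you would need to expose a matching handler on the MRL side (for example in the Python service) and check the WebGui's actual API path.

    import json
    import urllib.request

    import jetson.inference
    import jetson.utils

    # hypothetical endpoint: a handler you expose on the MRL machine
    MRL_URL = "http://192.168.1.50:8888/api/service/python/onDetections"

    net = jetson.inference.detectNet("ssd-mobilenet-v2", threshold=0.5)
    camera = jetson.utils.videoSource("csi://0")  # or "/dev/video0" for USB

    while camera.IsStreaming():
        img = camera.Capture()
        detections = net.Detect(img)

        # flatten each detection into plain JSON for MRL
        payload = [{
            "label": net.GetClassDesc(d.ClassID),
            "confidence": d.Confidence,
            "box": [d.Left, d.Top, d.Right, d.Bottom],
        } for d in detections]

        # fire-and-forget POST; add error handling for a real setup
        req = urllib.request.Request(
            MRL_URL,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req).close()

The MRL-side handler could then publish the detections to whatever services need them.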