Next version 'Nixie' is coming soon!

Help us get ready for MyRobotLab version Nixie: try it!

I adjusted the initial rotation of the virtual InMoov jaw so that it's no longer screaming, and in the same pull request I fixed the unbridled update. What's this about?

JMonkeyEngine gives us a thread so we can update the virtual InMoov, moving arms, head, etc. But if we don't slow this thread down, the frame rate will rocket to 300+ frames per second, which makes our computer not so happy :(
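The real fix lives on the Java/JMonkeyEngine side of MRL, but the idea of "bridling" the update is just to cap the loop. Here is a minimal sketch of that idea in Python; the 30 FPS target and the function name are my own placeholders, not what MRL actually uses:

import time

TARGET_FPS = 30                    # assumed cap, not necessarily MRL's value
FRAME_BUDGET = 1.0 / TARGET_FPS

def update_virtual_inmoov():
    # placeholder for the real work: apply servo positions to arms, head, jaw...
    pass

while True:
    start = time.time()
    update_virtual_inmoov()
    # sleep off the rest of the frame budget so the loop can't spin at 300+ FPS
    leftover = FRAME_BUDGET - (time.time() - start)
    if leftover > 0:
        time.sleep(leftover)

JMonkeyEngine itself can also cap the frame rate through its application settings, which amounts to the same thing.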

Time of Flight meets Bluetooth...

Extensive testing showed that WiFi mesh networks were not the way to go for this project... timing issues killed the progress.

Exit the WiFi mesh network goblins...... YAY

Enter the BlueTeethed Elves ....... YAY

The ESP32 has Bluetooth capabilities up its sleeve, so it was a simple job to rework the data packets.

Hi, this is not directly about MyRobotLab, but the goal of the project is the creation of new material for our beloved robots, so I hope to get some help here. My last attempt ended with a big cloud of expensive blue smoke.


Hi MaVo, this was an Arduino Uno - I had some problems with subscribing to publishPin, but this worked for my setup. I just enabled pin 14, which is an analog pin, and a stream of values became accessible.


You should be able to add to or remove from the streaming array of PinData with more enables or disables.
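A minimal sketch of that setup in an MRL Python (Jython) script - the port name is just an example, and method and field names may differ a little between MRL versions:

arduino = Runtime.start("arduino", "Arduino")   # start the Arduino service
arduino.connect("COM3")                         # example port - use your own

# let this Python service receive the published pin data
python.subscribe("arduino", "publishPin")

def onPin(pindata):
    # called for each PinData object the Arduino streams back
    print(pindata)

# enable pin 14 (an analog pin); more enablePin/disablePin calls add to
# or remove from the streamed array of PinData
arduino.enablePin(14)
# arduino.disablePin(14)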

Service Life Cycle

Name / Action            | Type   | Description
Runtime.install(service) | Method | framework will install if not installed
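As a rough usage sketch from a Python script (the "OpenCV" service type is only an example, and the exact install signature can vary between MRL versions):

Runtime.install("OpenCV")                   # framework installs it if not installed
opencv = Runtime.start("opencv", "OpenCV")  # then the service can be started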


One step closer to work-e == gert-e

This is a preview of the Emoji service ... cuz what is better to express the nuances of emotion than emojis?
Things to fix: (oh god .. the list goes on ...)

I have been building the robot and playing around with MRL for a while now, and every time I attempted to set up face recognition, the program would crash: the live video would pause and no mouse clicks would do anything. The speech still worked, but it would otherwise just hang.
I initially thought the computer I was using was too slow, but I have just upgraded to a new computer and have the same problem.

Is there something I am doing wrong, or do I just need a massively fast computer, or what?

Please, any help would be appreciated.

I've added an "OpticalFlow" filter to OpenCV (worke branch), and here, seeing through work-e's Kinect eye, you can see work-e turning left ... or was it right? :)

So to make this useful for collision avoidance, some work will need to be put into it: describing the direction vector, and some semi-intelligent pruning of the data.

It starts by getting good features to track, which itself has a few parameters to tweak.
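The filter itself lives on the Java/OpenCV side of MRL, but as a rough outline of the underlying idea, here is a sketch using the standard OpenCV Python bindings on two consecutive grayscale frames (the parameter values are just common defaults, not what the filter ships with):

import cv2

def flow_vectors(prev_gray, next_gray):
    # pick strong corners to track; maxCorners / qualityLevel / minDistance are
    # the "parameters to tweak" mentioned above
    prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=100,
                                       qualityLevel=0.3, minDistance=7)
    if prev_pts is None:
        return []
    # Lucas-Kanade optical flow: where did those corners end up in the next frame?
    next_pts, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray,
                                                     prev_pts, None)
    vectors = []
    for p0, p1, ok in zip(prev_pts.reshape(-1, 2),
                          next_pts.reshape(-1, 2),
                          status.ravel()):
        if ok:  # keep only the points that were actually tracked
            vectors.append((p0, p1 - p0))  # (start point, direction vector)
    return vectors

Averaging the surviving vectors gives a crude overall direction (the left-or-right turn seen above); pruning would mean dropping short or inconsistent vectors before taking that average.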