kwatters's blog

Updated joystick example


It's come to my attention that the joystick example scripts are pretty far out of date and definitely don't work very well with Manticore.

I have updated an example joystick script that will work with Manticore:

https://github.com/MyRobotLab/pyrobotlab/blob/develop/home/kwatters/joys...

This is a very simple example of the joystick's analog stick x axis controlling the speed and direction of a servo motor.  This is a much simpler approach than having to deal with the "sweep" function on the servo.
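As a rough sketch of the idea (the linked script above is the authoritative version), it looks something like the following.  The port, pin, and controller index are assumptions, and method names can differ between MRL builds:

# rough sketch only -- port, pin, and controller index are assumptions,
# and method names can vary between MRL builds
arduino  = Runtime.start("arduino", "Arduino")
servo    = Runtime.start("servo", "Servo")
joystick = Runtime.start("joystick", "Joystick")
python   = Runtime.start("python", "Python")

arduino.connect("COM3")           # your Arduino's serial port
servo.attach(arduino, 7)          # your servo pin
joystick.setController(0)         # first game controller found

# route joystick events to the onJoystickInput callback below
python.subscribe("joystick", "publishJoystickInput")

def onJoystickInput(data):
    # data.id is the axis name, data.value ranges from -1.0 to 1.0
    if data.id == "x":
        servo.setVelocity(abs(data.value) * 60)   # stick deflection -> speed
        if data.value > 0.1:
            servo.moveTo(180)                     # stick right -> toward max
        elif data.value < -0.1:
            servo.moveTo(0)                       # stick left  -> toward min
        else:
            servo.stop()                          # centered -> stop moving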

 


Crossing the Rift into the Yolo valley.


Oculus Rift with Yolo filter

The OculusRift service has been updated to support the CV1 version of the Oculus.  The service is still pretty rough and needs some polishing, but it's functional if nothing else.


New style Config for MRL, custom robots, and InMoov2


I wanted to start a conversation about how to structure the configuration for a robot running MyRobotLab.

Having a consistent, flexible configuration that defines which services are started and how they attach to each other would be a great starting place.
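As a straw-man for that conversation (this is not an existing MRL config format, just an illustration of the information such a config would need to capture), a plain data structure covering the two basics, which services to start and how they attach, might look like this:

# straw-man only: not an existing MRL config format
robot_config = {
    "services": {
        "arduino":  {"type": "Arduino", "port": "/dev/ttyACM0"},
        "head_pan": {"type": "Servo",   "pin": 7},
        "ear":      {"type": "WebkitSpeechRecognition"},
        "brain":    {"type": "ProgramAB"},
        "mouth":    {"type": "MarySpeech"},
    },
    "attachments": [
        ("head_pan", "arduino"),   # the servo is driven by the arduino
        ("ear",      "brain"),     # recognized speech feeds the chatbot
        ("brain",    "mouth"),     # chatbot responses get spoken
    ],
}

# a loader would walk the structure, e.g.
#   for name, spec in robot_config["services"].items():
#       Runtime.start(name, spec["type"])
#   and then wire up each pair listed in "attachments"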


Robot Memories...


So the general question is:

What should your robot remember?  What should your robot recall?  How do you want to access their memories?

More details to follow...

 

Teaser:  We will embed a "core" to store robot memories.  I think these memories probably include things like:

1. what did I hear?

2. what did I say?

3. what did I see?

4. what did I feel/taste?

5. what did I smell?

 

We can establish a common storage format for all of this and it can be shared!
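Purely as an illustration of what a common record might look like (the format is exactly what's up for discussion), each memory could be a timestamped record tagged with its sense:

import time

# illustration only -- the actual storage format is the open question
def make_memory(modality, content, source=None):
    # modality: "heard", "said", "seen", "felt", or "smelled"
    # content:  recognized text, an image reference, a sensor reading, ...
    # source:   the service that produced it, e.g. "ear" or "opencv"
    return {
        "timestamp": time.time(),
        "modality":  modality,
        "content":   content,
        "source":    source,
    }

# e.g. remember something the speech recognizer heard
memory = make_memory("heard", "hello robot", source="ear")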


The uncanny Mr. Turing and his deep learning with LSTM neural networks

 
 
I recently attended some training classes on a deep learning framework called Deeplearning4j.  The training was provided by SkyMind.IO, the company that created, maintains, and supports the dl4j open source project.
 

This script can save your life...


So, sometimes an Arduino running MrlComm will get away from you.  The best way to reset it is to toggle the serial port, which resets the connected Arduino and clears the device list in MrlComm.

Here's a small snippet of Python code that will toggle the DTR pin and reset the Arduino.

pi@raspberrypi:~ $ cat reset.py
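The full script continues in the post; a minimal sketch of the same idea using pyserial (the port name here is an assumption) looks like this:

# minimal sketch using pyserial -- the port name is an assumption
import time
import serial

PORT = "/dev/ttyUSB0"    # or /dev/ttyACM0, COM3, etc.

ser = serial.Serial(PORT)
ser.dtr = False          # pull DTR low ...
time.sleep(0.5)
ser.dtr = True           # ... then high again, resetting the Arduino
ser.close()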


Campaign for MyRobotLab on Wikipedia


Update: This document is now open for anyone to edit...

https://docs.google.com/document/d/1HDlE8d--x1-A56Ff5mcQehE5togKtGt4yuuGIHVStgk/edit?usp=sharing  

 

So... I see that there are two pages that point to MyRobotLab on Wikipedia:

1. the InMoov page: https://en.wikipedia.org/wiki/InMoov

2. the open-source robotics page: https://en.wikipedia.org/wiki/Open-source_robotics

 


Raspberry Pi + OpenCV on latest


So, it seems like the upgrade to JavaCV 1.3 has broken the OpenCVFrameGrabber on the Raspberry Pi 3.  While I don't have a fix for this issue yet, I do have a workaround.  Here are the steps to get OpenCV working on the Raspberry Pi 3 on the latest build (until we sort out the JavaCV 1.3 issue).


Deeplearning4j, OpenCV, and real-time object recognition


So... it works! Sweet! Here's the proof!

 

 


The road to deep learning for MRL.


So, one of my New Year's resolutions was to add some deep learning support to MyRobotLab.  The reality is, this is a big topic and covers many aspects, but at the end of the day I boil it down to this.