Doubling up the "Time of Flight" sensors

Gareth's picture

Two is better than one... (or 4 or 8, etc. etc.).

Following successful experiments with a single Time of Flight distance sensor mounted on the stepper, watching the scan process it seems to lend itself to limiting the scan to just 180° and mounting two sensors back to back to extract the whole 360°... (or a 90° scan with 4 sensors; hope you see where I am going with this :-)
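For what it's worth, a minimal sketch of how the two half-scans could be merged, in plain Python. The move_to, read_front and read_back callbacks are hypothetical stand-ins for whatever stepper and ToF driver is actually used:

def scan_360(move_to, read_front, read_back, step_deg=1):
    # Sweep the stepper through 180 degrees only; the two back-to-back sensors
    # cover 'angle' and 'angle + 180' at the same time, filling a full 360 degree scan.
    points = [None] * 360
    for angle in range(0, 180, step_deg):
        move_to(angle)                                 # hypothetical: rotate the stepper to 'angle' degrees
        points[angle] = read_front()                   # distance from the front-facing ToF sensor
        points[(angle + 180) % 360] = read_back()      # distance from the rear-facing ToF sensor
    return points

With 4 sensors the same idea shrinks the sweep to 90° and fills four offsets per step.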


What to say if not "Oh my God"

Spawn32's picture

Alex, they beat you to the dancing bot :)


Finished head bob mech

Alexinator40's picture

Hi all, the head bob mech is finished and here is the result. I wanted the head to move forward a bit more; however, I had to sacrifice some movement for the strength of the parts.


Using the light-desk way of programming the moves...

MdG_NL's picture

Hello MRL InMoovers,

During the Maker Faire in Eindhoven (in the Netherlands) I was talking with Wilco about this option.
We both have experience with light consoles and know that this way of programming would make things easier.

I've no idea if there is an option to use the same way of programming in MRL.
Right now it's all done with scripts, and it's hard to make most things smooth.
I mean, it's harder to combine several moves at the same time.
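To make the idea concrete, here is a minimal sketch (plain Python, not an existing MRL feature) of the light-desk style: a cue holds a target position per servo plus a fade time, and every servo in the cue fades to its target at the same time. It assumes servo objects with an MRL-style moveTo(degrees) method:

import time

def run_cue(servos, cue, fade_s=2.0, steps=50):
    # cue: {servo_name: (current_deg, target_deg)} -- like a lighting cue,
    # every channel fades from its current value to its target over fade_s seconds.
    for i in range(1, steps + 1):
        f = i / float(steps)
        for name, (cur, target) in cue.items():
            servos[name].moveTo(cur + (target - cur) * f)   # moveTo() as on the MRL Servo service
        time.sleep(float(fade_s) / steps)

A list of such cues played back in order would then behave much like a cue stack on a light console.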


Fun with OpenCV

GroG's picture

If you're gonna refactor the service - might as well make it "better", "stronger", "faster"...

So I made it capable of taking input from YouTube videos - cuz our robots will want unsupervised training of neural networks by watching tonnes of YouTube!!

The above stream was done with a single line:

opencv.capture("https://www.youtube.com/watch?v=2STTNYNF4lk");
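For context, in an MRL Python script that line would sit right after starting the service; a minimal sketch (the service name "opencv" is just a label):

opencv = Runtime.start("opencv", "OpenCV")                      # start the OpenCV service from MRL's Jython environment
opencv.capture("https://www.youtube.com/watch?v=2STTNYNF4lk")   # grab frames straight from the YouTube video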



Advanced head bob mech Update

Alexinator40's picture

Hi all. The movements have been finalized on the head bobbing mech. Click MORE below to see the new video. Spawn says that running a script on the servos using the "servo" service makes the servos jitter during movement unless you use the -1 speed setting. Due to the weight of the skull (which will be added soon), he didn't want to burn out his servos using that setting. Here is a look at the full movements before the rest of the skull is attached. Any suggestions on fixing the servo jitter would be appreciated.
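For readers who haven't used it, the setting being discussed looks roughly like this in an MRL Python script; a sketch only, assuming the "-1 speed setting" refers to the Servo service's velocity control (the service name "neck" and the 90° target are made up):

neck = Runtime.start("neck", "Servo")   # MRL Servo service
neck.setVelocity(-1)                    # -1 = no speed limit: the servo moves at full hardware speed
neck.moveTo(90)                         # with a limited velocity instead (e.g. neck.setVelocity(60)) the move is
                                        # interpolated in small steps, which is where the jitter shows up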


Yolo in conjunction with the Ultrasonic sensor

hairygael's picture

So, I was working on a script to get Yolo to start only if an object is presented at the correct distance (1.0 meter), using an Ultrasonic sensor.

This works nicely.
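Roughly, the gate looks like the sketch below (an illustration, not the actual script): get_distance_m is a hypothetical stand-in for however the Ultrasonic sensor service reports its range, and adding the Yolo filter to the OpenCV service by name is an assumption:

import time

opencv = Runtime.start("opencv", "OpenCV")   # MRL OpenCV service
opencv.capture()                             # start grabbing frames from the camera

def wait_and_start_yolo(get_distance_m, target_m=1.0, tolerance_m=0.1):
    # Poll the ultrasonic range and switch Yolo on only once an object
    # sits at roughly the target distance (1.0 m here).
    while True:
        d = get_distance_m()                 # hypothetical callback: range in meters
        if abs(d - target_m) <= tolerance_m:
            opencv.addFilter("yolo")         # assumption: the Yolo filter is added by name
            return
        time.sleep(0.2)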

Now is there a filter I could use to eliminate the background before Yolo analyses the picture?

@moz4r, I solved the neopixel issue by adding an extra sleep(4) at the beginning of my script.