Description

Getting started with myrobotlab - Having problems? Ask the nice people here.

Hello, so far I've been developing my InMoov arm with a Raspberry Pi and an Adafruit16CServoDriver. I've basically found myself reimplementing a lot of the functionality already provided by the InMoovArm service. However, in the master branch (Manticore release) it seems to be hardcoded to use an Arduino. I've noticed the develop branch seems to be more flexible about the servo controller, but I'm struggling to find the proper initialization procedure to create the arm with the I2C servo controller.
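Separate from the MRL service wiring, the register math that a PCA9685-based board like the Adafruit16CServoDriver performs is easy to sketch in plain Python. This is only an illustration: the 50 Hz frame rate and the 500–2500 µs pulse range are assumptions, not values taken from the MRL service.

```python
# Sketch of the pulse math a PCA9685-based servo driver performs.
# Assumptions: 50 Hz PWM frame (20 ms) and a 500-2500 us pulse range;
# real servos and the MRL driver may use different limits.

FREQ_HZ = 50    # standard servo frame rate
TICKS = 4096    # PCA9685 12-bit counter

def angle_to_ticks(angle, min_us=500, max_us=2500):
    """Map a 0-180 degree angle to the PCA9685 'off' tick count."""
    if not 0 <= angle <= 180:
        raise ValueError("angle out of range")
    pulse_us = min_us + (max_us - min_us) * angle / 180.0
    frame_us = 1_000_000 / FREQ_HZ     # 20000 us per frame at 50 Hz
    return round(pulse_us / frame_us * TICKS)

# 90 degrees -> 1500 us pulse -> 1500/20000 * 4096 -> 307 ticks
```

Whatever the develop-branch initialization turns out to be, this is the conversion happening underneath when a servo angle is written over I2C.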

Hello,

Can anyone point me to some help for hooking up dual ultrasonic sensors in my InMoov? I can't really seem to find a place that details it. Looking to hook up two of them. Cheers
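Not the InMoov wiring itself, but the echo-to-distance conversion an HC-SR04-style sensor needs is simple enough to sketch. The speed of sound (343 m/s, roughly 20 °C air) is an assumption here, and the dual-sensor note in the comments is general advice rather than anything InMoov-specific.

```python
# Convert an HC-SR04 echo pulse width to distance.
# Assumption: speed of sound ~343 m/s (about 20 C). The ping travels
# to the obstacle and back, hence the division by two.

SPEED_OF_SOUND_CM_PER_US = 0.0343

def echo_to_cm(echo_us):
    """Distance in cm for an echo pulse width given in microseconds."""
    return echo_us * SPEED_OF_SOUND_CM_PER_US / 2

# With two sensors, each needs its own trigger/echo pin pair; pinging
# them in turn rather than simultaneously avoids ultrasonic crosstalk.
```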

I flashed the sketch from \resource\Esp8266_01 onto an ESP8266-12. I received a response to a request to http://esp8266-01.local/ with a map of connected devices.

I ran the script:

esp = Runtime.start("esp", "Esp8266_01")

ada = Runtime.start("ada", "Adafruit16CServoDriver")

Hi guys. How do we enable face detection plus tracking in MRL through INMOOV.bat? Tracking.py only moves the eye x and y servos.

Is an 11.1 V LiPo battery too high a voltage for externally powering the upper part of the InMoov (the head and neck, 6 servos in total)?

11:08:06.046 [main] ERROR c.m.f.Service [Service.java:2056] runtime error could not create service python Python

 

Anyone know how to fix this issue?

https://www.youtube.com/watch?v=Dqa-3N8VZbw

Hi guys. I would like to implement the function shown in the URL attached, which is an emotion detection function. I wish to add this to the face detection Python script run in MyRobotLab, so that the emotion status is also displayed when a face is detected. Does anyone have an idea how I should do this?

The tracking script used is Tracking.py.

What is the maximum number of NeoPixel rings I can run off the InMoov breakout boards? I believe it states 3. Just wondering how, since there is one connection per InMoov board on a full body. How do you connect the third?

This might not be related much to MyRobotLab, but I'm interested in how a face is tracked after detection with OpenCV. Does anyone have fundamental knowledge of this service? I mean, how are the servos controlled so that the detected face always stays at the center of the frame?
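On that last question: the usual approach is a feedback loop where each servo is nudged by an amount related to how far the face center sits from the frame center. MRL's Tracking service uses a full PID controller; below is only a minimal proportional sketch, with the frame size, gain, and correction sign all being assumptions that depend on your camera and servo mounting.

```python
# Minimal proportional tracking step: nudge pan/tilt toward the face.
# Assumptions: 320x240 frame, gain of 0.05 deg/pixel, and a mounting
# where a positive pixel error calls for a negative angle correction.
# MRL's actual Tracking service uses PID, not bare proportional control.

FRAME_W, FRAME_H = 320, 240   # assumed camera resolution
KP = 0.05                     # proportional gain (degrees per pixel)

def track_step(pan, tilt, face_x, face_y):
    """Return updated (pan, tilt) angles for one detection frame."""
    err_x = face_x - FRAME_W / 2   # positive: face right of center
    err_y = face_y - FRAME_H / 2   # positive: face below center
    pan = min(180, max(0, pan - KP * err_x))
    tilt = min(180, max(0, tilt - KP * err_y))
    return pan, tilt

# A face at the exact frame center produces zero correction, so the
# servos hold position; off-center faces pull the angles back toward it.
```

Run repeatedly, the correction shrinks as the error shrinks, which is what keeps the face settled near the center instead of overshooting.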