InMoov build plan in Nixie

GroG
Here is the current service build plan for InMoov in Nixie ... it looks better, but I need to compare it with what Manticore did.
 
i01=InMoov
i01.ear=WebkitSpeechRecognition
i01.eyelids=InMoovEyelids
i01.eyelids.eyelidleft=Servo
i01.eyelids.eyelidright=Servo
i01.eyesTracking=Tracking
i01.eyesTracking.opencv=OpenCV
i01.eyesTracking.pid=Pid
i01.head=InMoovHead
i01.head.arduino=Arduino

myrobotlab release Nixie on Raspberry Pi 3b

Tricia279

Hi, yesterday I installed Nixie 1.1 on my Raspberry and it looks more or less successful. The problem is that I do not know which functionalities I should see, because I do not know what Manticore has. Please tell me...


UltrasonicSensor

alexstarter
Friends!
Which function returns millisecond values in the UltrasonicSensor service?
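I'm not sure which UltrasonicSensor method exposes the raw timing, but for reference, the standard math for an ultrasonic ranger (e.g. an HC-SR04) converts the round-trip echo time into distance like this. This is a plain-physics sketch, not the service's actual API; the function name is hypothetical:

```python
SPEED_OF_SOUND_M_S = 343.0  # dry air at roughly 20 °C

def echo_us_to_cm(echo_us):
    """Convert a round-trip echo time (microseconds) to distance in cm.

    The pulse travels out to the obstacle and back, so the one-way
    distance is half of (time * speed of sound).
    """
    echo_s = echo_us / 1_000_000.0          # microseconds -> seconds
    round_trip_cm = echo_s * SPEED_OF_SOUND_M_S * 100.0
    return round_trip_cm / 2.0

# A 1000 µs echo corresponds to roughly 17 cm.
print(echo_us_to_cm(1000))
```

So if the service hands you a time in milliseconds instead of microseconds, multiply by 1000 first and the same formula applies.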

Astro is awesome

GroG

A quick shout-out to Astro. Even though it's been around for a while, the (new to me) speech recognition web UI is "out of this world" :)

It even has a recognition color gradient.

I'll fix the region problem - and I think the wake word "unset" needs fixing too.

 


Stepper motor Hall-Effect Position (circa 90° position detection)

Gareth

It's been bugging me for a while now to find a way to detect the position of a stepper motor, because it's good to know its position before powering up !!! ...Yes....

So in a Nutshell :-
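One way to use such a sensor, sketched below under my own assumptions (the post doesn't spell out the wiring): step the motor slowly until the Hall-effect sensor at a known reference position triggers, and treat that as home. `read_hall` and `step_once` are hypothetical placeholders for the real GPIO calls:

```python
def home_stepper(read_hall, step_once, max_steps=400):
    """Step until the Hall sensor triggers; return steps taken, or -1.

    read_hall -- callable returning True when the magnet is over the sensor
    step_once -- callable advancing the motor by one step
    max_steps -- give up after one full revolution (e.g. 400 half-steps)
    """
    for steps in range(max_steps):
        if read_hall():
            return steps  # sensor triggered: motor is at the reference mark
        step_once()
    return -1  # no trigger within a revolution: check sensor or wiring

# Simulated example: a fake motor whose sensor trips at step 25.
pos = {"n": 0}
def fake_step():
    pos["n"] += 1
def fake_hall():
    return pos["n"] == 25

print(home_stepper(fake_hall, fake_step))
```

With the home position known at startup, absolute position can then be tracked by counting steps from there.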


Sequence or Gesture GUI

astro

I leave this screen as an idea.
I don't know if there is a service for this or whether it has to be part of ServoMixer. I'm not sure if this counts as a gesture service.