A service that generates appropriate mouth movements from textual data. It is typically used in conjunction with a Speech service, so that the audio speech coming from the speakers matches the movement of a servo-driven jaw.
Example code (from branch develop):
#file : MouthControl.py (github)
# start the service
mouthcontrol = runtime.start("mouthcontrol","MouthControl")
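A fuller wiring sketch may help make the pairing with a Speech service concrete. The following is illustrative rather than taken from the service page: the service names, the choice of LocalSpeech, the serial port, and the pin number are all assumptions, and the attach() calls follow the generic MyRobotLab attach pattern rather than a verified MouthControl API.

# illustrative wiring sketch -- names, port, and pin are placeholders
mouthcontrol = runtime.start("mouthcontrol", "MouthControl")
mouth = runtime.start("mouth", "LocalSpeech")   # any speech service should do
arduino = runtime.start("arduino", "Arduino")
arduino.connect("/dev/ttyUSB0")                 # adjust for your board
jaw = runtime.start("jaw", "Servo")
jaw.attach(arduino, 26)                         # example pin
mouthcontrol.attach(jaw)                        # jaw servo (assumed attach pattern)
mouthcontrol.attach(mouth)                      # listen to the speech service (assumed)
mouth.speak("hello world")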
Example configuration (from branch develop):
#file : MouthControl.py (github)
!!org.myrobotlab.service.config.MouthControlConfig
delaytime: 75
delaytimeletter: 45
delaytimestop: 150
jaw: null
listeners: null
mouth: null
mouthClosedPos: 0
mouthOpenedPos: 180
neoPixel: null
peers: null
type: MouthControl
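Reading the three delay values together: a plausible interpretation, offered as a sketch of the idea and not the service's actual algorithm, is that each spoken word holds the jaw open for a base delaytime plus delaytimeletter per letter, with delaytimestop as the closed pause between movements. The helper below is hypothetical and exists only to show how the values interact.

# hypothetical illustration of the delay parameters (not the real algorithm)
delaytime = 75        # base jaw-open time per word, ms
delaytimeletter = 45  # additional open time per letter, ms
delaytimestop = 150   # pause with the jaw closed between words, ms

def estimate_timing(text):
    for word in text.split():
        open_ms = delaytime + delaytimeletter * len(word)
        print("open %sms for '%s', then close for %sms" % (open_ms, word, delaytimestop))

estimate_timing("hello world")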
Mouth control
So if the mouth movement is off, how does one control and sync it?
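One answer, hedged because details vary across MyRobotLab versions: sync is usually a matter of narrowing the servo range and tuning the three delay values until the jaw tracks the audio. The setmouth() call appears in InMoov scripts; the direct field assignments mirror the configuration keys above but are an assumption about the Jython API.

# illustrative tuning -- values are starting points to experiment with
mouthcontrol.setmouth(10, 60)       # closed/opened positions, narrower than 0..180
mouthcontrol.delaytime = 75         # assumed writable public fields,
mouthcontrol.delaytimeletter = 45   # matching the configuration keys above
mouthcontrol.delaytimestop = 150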