I'm sure I'm missing something simple, but I've linked ProgramAB to MarySpeech to build a personality for my InMoov. Once this was working I started filling out my Python script to start up the servos etc. Thanks to kwatters' excellent work with OOB tags, I've got various phrases in ProgramAB calling Python functions. Where I'm stuck is that the responses generated by ProgramAB are passed directly to the speech service, so the InMoov mouth doesn't move.

I can see various options, such as getting the OOB tags to call speakBlocking(), etc., but is there a simpler method?

kwatters

8 years 11 months ago

Ahoy BenSonOfJohn! 

That's awesome that you're able to use ProgramAB to start building a personality for your InMoov. I'm hoping to add a whole lot of new "data" access to MRL. But as a side note...

Not sure exactly what is going on. If your AIML/Python script was checked in, I could have a look (maybe add it to the pyrobotlab repo?):

https://github.com/MyRobotLab/pyrobotlab/tree/master/home

That being said, I suspect speakBlocking is being called directly on the MaryTTS service, rather than the MaryTTS service being set up as the mouth for the InMoov.

There might be a little more refactoring needed to make this work as you'd expect, but in the short term I'm sure we could update your script to make sure the MouthControl service gets the text so it can animate properly...
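The routing being described here is MRL's text-listener chain: ProgramAB publishes text, HtmlFilter cleans it, and the InMoov mouth both speaks it and animates the jaw. The sketch below imitates that chain in plain Python with made-up stand-in classes (TextPublisher, HtmlFilter, Mouth are illustrative, not the real MyRobotLab API) just to show why wiring the listeners matters, versus calling the TTS service directly:

```python
import re

class TextPublisher:
    """Stand-in for any MRL service that publishes text to listeners."""
    def __init__(self):
        self.listeners = []

    def addTextListener(self, listener):
        self.listeners.append(listener)

    def publish(self, text):
        for listener in self.listeners:
            listener.onText(text)

class HtmlFilter(TextPublisher):
    """Strips HTML tags from incoming text, then republishes it."""
    def onText(self, text):
        self.publish(re.sub(r"<[^>]+>", "", text))

class Mouth(TextPublisher):
    """Stand-in for the InMoov mouth: receiving text here is what
    would trigger both speech and jaw animation in MRL."""
    def __init__(self):
        TextPublisher.__init__(self)
        self.spoken = []

    def onText(self, text):
        self.spoken.append(text)

brain = TextPublisher()      # stand-in for ProgramAB
htmlfilter = HtmlFilter()
mouth = Mouth()

# The wiring that makes the jaw move: route the brain's output through
# the filter to the mouth, instead of speaking on the TTS service directly.
brain.addTextListener(htmlfilter)
htmlfilter.addTextListener(mouth)

brain.publish("<b>Hello</b> Ben")
print(mouth.spoken[0])  # Hello Ben
```

If the script instead calls the TTS service directly, nothing ever reaches the mouth listener, which matches the symptom described at the top of the thread.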

Resistance is futile!


bensonofjohn

8 years 11 months ago

In reply to by kwatters

I think I've now checked in my Python file to start up InMoov. At the rate you guys work, I suspect that resistance is extremely futile :)

I'm not sure if my check-in worked or not; GitHub wanted me to create a fork of the project before I could commit anything. The relevant part of the script is below.
 
import random
leftPort = "COM6"
rightPort = "COM8"
 
ozy = Runtime.createAndStart("ozy", "InMoov")
 
webgui = Runtime.createAndStart("webgui","WebGui")
ozyBrain = Runtime.createAndStart("ozyBrain", "ProgramAB")
 
ozyBrain.startSession(r"D:\MyRobotLab\ProgramAB", "Ben", "Ozy2")
htmlfilter = Runtime.createAndStart("htmlfilter", "HtmlFilter")
# start the InMoov mouth (speech service)
ozy.startMouth()
# route ProgramAB responses through the HTML filter to the mouth
ozyBrain.addTextListener(htmlfilter)
htmlfilter.addTextListener(ozy.mouth)
 
#Start Head
ozy.startHead(leftPort)
#Set default positions
##############
# tweaking default settings of jaw
#i01.head.jaw.map(0,180,10,35)
#ozy.mouthControl.setmouth(65,90)
ozy.head.jaw.setMinMax(105,146)
ozy.head.jaw.map(0,180,105,146)
ozy.head.jaw.setRest(105)
ozy.head.jaw.moveTo(105)
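The script above wires ProgramAB's text output to the mouth; the OOB tags mentioned at the top of the thread work in the other direction, letting an AIML response call back into Python. A rough illustration of that dispatch idea (a simplified sketch, not ProgramAB's actual parser; the <oob> payload format and the restJaw handler are made up for illustration):

```python
import re

def rest_jaw():
    """Hypothetical handler: in the real script this would move a servo."""
    return "jaw moved to rest"

# hypothetical mapping from OOB payloads to script functions
handlers = {"restJaw": rest_jaw}

def handle_response(response):
    """Run any <oob>...</oob> payload as a side effect, and return
    the remaining text, which is what should go to the mouth."""
    m = re.search(r"<oob>(.*?)</oob>", response)
    if m:
        handlers[m.group(1)]()   # dispatch to the mapped Python function
        response = response[:m.start()] + response[m.end():]
    return response.strip()

print(handle_response("OK, resting.<oob>restJaw</oob>"))  # OK, resting.
```

The point of the split is that the spoken text and the out-of-band command travel in the same AIML response but go to different places: the command to Python, the text onward to the speech/mouth chain.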

 

Super, guys - I am lost as to what to do to improve my InMoov's understanding of commands.

Maybe someone could make an example of what we have to change in Gaël's example scripts to make use of ProgramAB? (And how we can switch to another language?)

Juerg

Hi Juerg,
I'm super happy you are interested in going down this road. I admit that I am generally pretty poor with documentation. What we need is a reference implementation that uses a small amount of Python to bootstrap the InMoov and ProgramAB. Then we can build out the AIML for the InMoov brain.
I added this service to MRL with exactly this intention and when Masterblasta pointed me at the OOB tags I finally saw how to integrate it all.
I started putting together a reference implementation using AIML to drive the InMoov. I wanted to make something functionally complete compared to Gaël's full InMoov scripts, but got sidetracked and actually discovered some things about my OOB tags that I didn't like. Luckily, I think those are fixed now, so hopefully that will make them easier to extend (and explain how to use).
Hopefully over the next few days I will have a chance to get back to the AIML stuff. I got seriously sidetracked with the inverse kinematics of the arm and the Oculus Rift integration.

juerg

8 years 11 months ago

Hi Kevin

I would really appreciate a tutorial on using ProgramAB with InMoov, but I understand that you got sidetracked.

Mentioning inverse kinematics - I asked Alessandro once whether we could have a head command for InMoov saying "follow right hand" or "follow left hand". As MRL knows all the servo positions, it should be possible (with knowledge of the geometry and some math) to add a bit of "self control" to it?

I remember seeing a video of a virtual InMoov playing bowling, so it should theoretically be possible? But again, that's another open path?
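The "follow the hand" idea can be sketched in two steps: since MRL knows the joint angles, forward kinematics gives the hand position, and the head yaw is then just the angle from the head to that point. A 2-D toy version, with made-up link lengths rather than real InMoov geometry:

```python
import math

# hypothetical link lengths in metres, not InMoov measurements
UPPER_ARM = 0.30
FOREARM   = 0.25

def hand_position(shoulder_deg, elbow_deg):
    """Planar forward kinematics, shoulder at the origin:
    sum the two link vectors to get the hand position."""
    a = math.radians(shoulder_deg)
    b = a + math.radians(elbow_deg)   # elbow angle is relative to the upper arm
    x = UPPER_ARM * math.cos(a) + FOREARM * math.cos(b)
    y = UPPER_ARM * math.sin(a) + FOREARM * math.sin(b)
    return x, y

def head_yaw_towards(x, y, head_x=0.0, head_y=0.5):
    """Angle (degrees) the head must turn to look at the point (x, y)."""
    return math.degrees(math.atan2(y - head_y, x - head_x))

x, y = hand_position(0, 0)            # arm straight out along +x
print(round(x, 2), round(y, 2))       # 0.55 0.0
print(round(head_yaw_towards(x, y), 1))
```

The real robot needs the 3-D version with the actual InMoov link lengths and servo-to-angle mappings, but the principle is the same: forward kinematics for the arm, then one atan2 per head axis.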

Regards

Juerg