I launched the Azure translator service. Based on the example code, it works!
But how can I run the translator to work with InMoov?
When I change the language in the ear service, the phrase is correctly recognized in the given language, but it is not translated into English.
I would like the phrase to be translated into English through the Azure service, so it can then be further processed by AIML...
Is it possible?
Just in case, I'm sending you the "no worky" log.



any service in MRL can be used with InMoov.. it's just a question of how you want to use it. For AzureTranslate..

AzureTranslate is a TextPublisher and a TextListener.. that means if you pass text into it, it will translate it and publish the result.
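The TextPublisher / TextListener contract described above can be sketched in plain Python (this is a mock, not the actual MRL classes or API.. `MockTranslator` and its `translate` function are hypothetical stand-ins):

```python
# Minimal sketch of the pattern: a service receives text via onText()
# (its TextListener side), processes it, and publishes the result to
# every attached listener (its TextPublisher side).

class MockTranslator:
    """Stand-in for AzureTranslate: translates incoming text, republishes it."""

    def __init__(self, translate):
        self.translate = translate   # hypothetical translation function
        self.listeners = []          # attached text listeners

    def addTextListener(self, listener):
        self.listeners.append(listener)

    def onText(self, text):
        # text comes in.. gets translated.. and goes out to listeners
        self.publishText(self.translate(text))

    def publishText(self, text):
        for listener in self.listeners:
            listener.onText(text)
```

Any object with an `onText` method can be attached downstream, which is what makes these services easy to chain.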

So.. you would want to wire it in between speech recognition and ProgramAB. Then when ProgramAB publishes a response, instead of wiring it directly to the speech synthesis, you will want to pass it to Azure first and then to the speech synthesis.
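The wiring order above, end to end, looks something like this toy sketch (plain Python functions standing in for the services.. the dictionaries are hypothetical translations and AIML replies, not real Azure or ProgramAB output):

```python
def translate_to_en(text):
    # stand-in for AzureTranslate: user's language -> English
    return {"cześć": "hello"}.get(text, text)

def bot_reply(text):
    # stand-in for ProgramAB: match an AIML category in English
    return {"hello": "hi there"}.get(text, "I do not understand")

def translate_from_en(text):
    # stand-in for AzureTranslate again: English -> user's language
    return {"hi there": "witaj"}.get(text, text)

def pipeline(heard):
    # ear -> azure -> programab -> azure -> speech synthesis
    english = translate_to_en(heard)
    reply = bot_reply(english)
    return translate_from_en(reply)

print(pipeline("cześć"))  # -> witaj
```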

In a way the HtmlFilter does this already on the output of ProgramAB to strip away HTML tags.. All it does is take the text in.. process it to strip the HTML tags.. and then pass it along to speech synthesis.
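The processing step of such a filter is tiny.. something like this (a sketch, not the actual HtmlFilter implementation):

```python
import re

def strip_html(text):
    # drop anything that looks like a tag, keep the text content
    return re.sub(r"<[^>]+>", "", text)

print(strip_html("<b>hello</b> world"))  # -> hello world
```

Swapping that function for a translation call gives you exactly the AzureTranslate stage described above.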


I suspect you'd want to have some control over what language it expects on the input.. and what language you want to translate it to..

All fun stuff and all very do-able with the existing code in MRL :)