
Kryten the InMoov robot

Hi Folks

A quick introduction and some questions. My robot is called Kryten, after the humanoid robot in the Red Dwarf series. He is pretty much built to Gael's standard design, with the addition of the articulating stomach.

He stands on a mannequin leg set liberated from a local department store, which makes him, ooh, about 6 foot 8 inches tall.

Currently the host computer runs Windows 10, with Nixie 1.1.16178093 driving two Arduino Megas.

I am at the point where I am doing test runs and am running into a few anomalies.

If I set ScriptType=Virtual in the _InMoov.config file, the chatbot remembers the robot's name and my name. Every time I set ScriptType=Full, it insists it is the first time it has run and asks for the robot's name, to name one example. Set it back from Full to Virtual, reboot, and it knows all these facts again.

Question: why should it matter whether the robot is in Full or Virtual mode, and where is the flag that signals a first-time run?

When I start in Full with isHeadActivated=True in skeleton_head.config, the head services start and head.neck and head.rothead show as attached, but I can't move them with the sliders. If I detach and re-attach, then stop and start head.timer, these servos spring to life. Sometimes I have to stop and start the head timer, sometimes just stop it, but so far I always have to do a detach/re-attach sequence for those two servos. Yes, the speed is set high enough.

Does anybody have a suggestion as to why it is always just these two servos?

I am looking for a way to test the attached ultrasonic sensor(s), maybe with a single line of Python code or so. May I suggest an enhancement to the ultrasonic service in the web GUI: a Test button that fires off a single measurement and displays the result on the page?
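In the meantime, the sensor math itself can be bench-checked without MRL at all. A minimal sketch in plain Python (not MRL code; the function name and the default speed of sound are my own choices) converting an HC-SR04-style echo pulse width into centimeters:

```python
def echo_to_cm(echo_us, speed_of_sound_m_s=343.0):
    """Convert an ultrasonic echo pulse width (microseconds) to centimeters.

    The echo covers the round trip to the obstacle and back, so the
    one-way distance is half the total. 343 m/s is the speed of sound
    in air at roughly 20 degrees C.
    """
    round_trip_m = echo_us * 1e-6 * speed_of_sound_m_s
    return round_trip_m * 100.0 / 2.0

print(echo_to_cm(583))  # roughly 10 cm
```

Useful for sanity-checking whether a reading from the service is plausible for the pulse widths the Arduino reports.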

Even with isNeopixelActivated=False, Nixie seems to want to see something on that particular COM port; mine is via USB, so it has a NeopixelMasterPort=COM10 entry. Shouldn't it just ignore the port when activated is set to False?

Lastly, for those who have gone before me with the articulating stomach: how have you hooked the servo up? A suggestion was made to use lowStom, but currently the torso config file only has entries for topStom and midStom, although torso.py in the skeleton subdirectory does have a single small entry to set the lowStom speed and nothing more. At this point I was going to add lines for lowStom in torso.py, matching the lines already there for topStom and midStom.

Any pitfalls in doing this?
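For what it's worth, the mirroring idea can be dry-run with a stub before touching the real file. The sketch below is illustrative only: StubServo and its method names (setSpeed, map) are stand-ins, and the real torso.py lines for topStom/midStom should be copied verbatim with just the name changed, since the actual MRL servo API may differ:

```python
class StubServo(object):
    """Stand-in for an MRL servo peer, used only to dry-run the idea of
    duplicating the topStom/midStom setup lines for lowStom."""

    def __init__(self, name):
        self.name = name
        self.speed = None
        self.limits = None

    def setSpeed(self, speed):
        # assumed method name; check against the real topStom lines
        self.speed = speed

    def map(self, in_min, in_max, out_min, out_max):
        # assumed method name; clamp the input range onto a safe output range
        self.limits = (in_min, in_max, out_min, out_max)


# Mirror whatever topStom/midStom do, with lowStom's own safe range:
lowStom = StubServo("lowStom")
lowStom.setSpeed(45)
lowStom.map(0, 180, 60, 120)
```

The main pitfall this guards against is copying topStom's min/max limits unchanged: the lower stomach joint almost certainly has a different safe travel range.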

 


GroG's picture


Hi rv8flyboy,
When you set ScriptType=Virtual, does a window appear with a virtual InMoov in it?

rv8flyboy's picture

virtual screen

Hi GroG

No, no screen appears with a virtual InMoov.

 

rv8flyboy's picture

small error in START_INMOOV.bat

In the startup script there is a small error preventing the log file from being rotated automatically:

echo ------------------------------------------------------
echo Rotate log files for clean no worky
 
del myrobotlab.log.1 > NUL
mv myrobotlab.log myrobotlab.log.1
 
 
The last line, on Windows anyway, should be:
move /y myrobotlab.log myrobotlab.log.1
 
 

 

GroG's picture


mv is for Linux/Mac
move is for Windows
Can you show me the link into GitHub where this is?

rv8flyboy's picture


Not sure what you are asking?

The InMoov files show up after running java -jar myrobotlab.jar --install. I don't know where the install script pulls them from; they appear after clicking the "install all services" button.

rv8flyboy's picture

ultrasonic error?

Trying to test the ultrasonic sensor. Someone suggested the phrase "measuring the distance".

I tried this and it did not work; I found the following in the log file:

08:58:38.655 [i01.chatBot] INFO o.a.a.MagicBooleans [MagicBooleans.java:47] input: measuring the distance, that: What s it like to be that way, topic: firstinit, chatSession: org.alicebot.ab.Chat@4e1a154a, srCnt: 0
08:58:38.656 [i01.chatBot] INFO o.a.a.Graphmaster [Graphmaster.java:307] Matched: MEASURING THE DISTANCE <THAT> * <TOPIC> * _inmoovGestures.aiml
08:58:38.660 [i01.chatBot] INFO o.a.ab.Bot [Bot.java:415] writeCertainIFCaegories learnf.aiml size= 72
08:58:38.661 [i01.chatBot] ERROR c.m.f.Service [Service.java:2038] i01.chatBot error could not find method ProgramAB.publishOOBText(String)
08:58:38.662 [python] INFO c.m.s.Python [Python.java:462] exec(String) 
ultraSonic("in centimeters I measure ")
08:58:38.663 [i01] INFO c.m.s.a.AbstractSpeechSynthesis [AbstractSpeechSynthesis.java:555] i01.mouth processing 
08:58:38.667 [python] ERROR c.m.f.Service [Service.java:2038] python error Traceback (most recent call last):
  File "<string>", line 1, in <module>
NameError: name 'ultraSonic' is not defined
 
08:58:38.667 [python] INFO c.m.s.Python [Python.java:557] finishedExecutingScript
 
 
 
Edit:
Running this one-liner in the Python window does produce a measurement result:
print "ultrasonicSensor test is ", i01.getUltrasonicSensorDistance()
rv8flyboy's picture

Error when starting torso

Hi Folks

 

Just sent a NoWorky. I noticed that whenever I set isTorsoActivated=True I get the following error in the log file:

12:26:59.503 [python] ERROR c.m.f.Service [Service.java:2038] python error Traceback (most recent call last):
  File "<string>", line 42, in <module>
  File "InMoov/system/InitCheckup.py", line 63, in <module>
    if os.path.splitext(filename)[1] == ".py":execfile(RuningFolder+'skeleton/'+filename.encode('utf8'))
  File "C:/nixie/InMoov/skeleton/torso.py", line 42, in <module>
    torso.topStom.load()
AttributeError: 'NoneType' object has no attribute 'load'
 
This is with 'virgin' files; the only change made is setting isTorsoActivated=True.
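The traceback says torso.topStom is None when .load() is called, i.e. the servo peer never started. As a general Python pattern (illustrative only, not the actual MRL fix), a None guard avoids the AttributeError and makes the real failure visible:

```python
class Torso(object):
    """Stand-in with a peer that failed to start (None), mimicking the
    state the traceback above reports for torso.topStom."""
    topStom = None


torso = Torso()

# Guard before calling methods on a peer that may not have started:
servo = getattr(torso, "topStom", None)
if servo is not None:
    servo.load()
else:
    print("torso.topStom is not started; skipping load()")
```

The guard only hides the symptom, of course; the real question is why the peer was never created in the first place.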
 
 

 

GroG's picture


Yeah, the torso explodes; this is in the NoWorky you posted ...
 

runtime cannot callback to listener subconsciousMouth does not exist for onStarted 
12:57:07.549 [python] ERROR c.m.f.Service [Service.java:1930] copy failed
java.lang.NullPointerException: null
	at org.myrobotlab.service.InMoovTorso.broadcastState(InMoovTorso.java:144) ~[myrobotlab.jar:1.1.300]
	at org.myrobotlab.framework.Service.startPeer(Service.java:1824) ~[myrobotlab.jar:1.1.300]
	at org.myrobotlab.framework.Service.startPeers(Service.java:1924) ~[myrobotlab.jar:1.1.300]
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:na]
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:na]
	at java.base/java.lang.reflect.Method.invoke(Method.java:566) ~[na:na]

 

So .. I go to the InMoov folder, and how exactly do I replicate what you did? You can reply with a video if you want.

rv8flyboy's picture


Hey GroG

All I do is switch ON the torso in the torso config (I believe in /inmoov/config/torso.config, but I'm not home right now) by setting isTorsoActivated=True,

then starting with START_INMOOV.bat.  

I have not found a correlation with any of the other services, i.e. it doesn't seem to matter whether I activate the arms, or the hands, or not, in any combination. I have not tried with the head switched off, but I suspect it won't matter. All other services appear to start correctly.

GroG's picture


Cool, this is helpful ..
This is what you do and what you observe ..

Now what is it that you want? I'm sure having no errors and control over the torso is part of it.
But what goals or ideas do you have ?
How do you want to control Kryten? How is working with the WebGui going? And anything else you can think of ...

BTW - we appreciate the fact you volunteered as a guinea pig ...  :)

Also, to let you know: since my fix for the simulator/JMonkeyEngine service, I'm now working on kwatters' ServoMixer, which should provide an easier way to make poses/gestures/sequences.

rv8flyboy's picture


lol, I would like the basic functionality to work

I am happy I switched from the Jetson Nano back to Windows 10; otherwise I would have thought these errors were Jetson-induced and would have been pulling my hair out.

My goals for Kryten? I don't have a specific end result in mind; it is something to learn and tinker with. I think an interactive robot in the end, animatronics. For now, new things to learn. I would love to learn more about the programming aspect. I started working in Python on the Jetson Nano to learn about AI.

My daytime job is to find out why things break (forensics), so I don't mind being a guinea pig(let) to test things out; bring it on. I'll analyze it and report back as well as I can. I can't tell you how many times in the last weeks I have switched Java and MRL versions to figure out the differences.

On my wish list? A good chatbot. I think the more interactive the robot can become, the better. One of the reasons I switched to Nixie is that in Manticore I could not get rid of calls to askpannous, which would cause the chatbot to freeze in its tracks.

On a sooner timescale: IF you are working on the torso bit anyway, how about incorporating the upper, mid AND lower stomach? Right now it looks like only the upper and mid stomach are in the config file and skeleton_torso.

 

Anyway, easy is good; making gesture creation easier would be a very nice addition.

 

Oh, and I learned a new trick today. I had been looking for a replacement for the Unix/Linux tail command to follow myrobotlab.log on a Windows machine. One can use PowerShell with the following one-liner: "Get-Content /nixie/myrobotlab.log -Wait"
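A plain-Python alternative, in case someone wants the same thing inside a script: a small polling "tail" that returns whatever was appended to the file since the last read (the function name and offsets are my own invention; call it in a loop with a short sleep to follow the log):

```python
def read_new_lines(path, offset):
    """Return (new_lines, new_offset): everything appended to the file
    since position `offset`. Poll this in a loop to emulate tail -f."""
    with open(path) as f:
        f.seek(offset)
        data = f.read()
    return data.splitlines(), offset + len(data)
```

Usage would look like: `lines, pos = read_new_lines("myrobotlab.log", pos)` inside a `while True:` loop with `time.sleep(0.5)` between calls, printing any lines returned.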

Tally ho, guys, time for bed here. Let me know when the stomach bit starts working so I can move on with my project.

 

 

rv8flyboy's picture


Being a tinkerer at heart and in body, mechanics, electronics, the works, I like to 'improve' on things. My latest quest is to incorporate better servo control for the articulating stomach. My idea is to use a BNO055 orientation sensor to measure tilt in two directions, sideways and front-to-back, and use this signal to control the servos. I would also like to use the ACME lead screws used in 3D printers to replace the plastic threads used in various locations, as they are prone to breaking. However, that means the servos will need external feedback, as they now have to be multi-turn. I want to capture the servo signal from the Arduino Mega for the stomach control and mix it with the tilt and roll info to drive the servo (the capture part is working, the BNO055 part is working; now I am figuring out driving the servos).
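The tilt-to-servo mapping can be prototyped as pure math before wiring anything. A minimal sketch (my own function name and range choices, not Kryten's actual calibration) that clamps a BNO055 tilt angle to a mechanical limit and maps it onto a 0-180 servo position:

```python
def tilt_to_servo(tilt_deg, max_tilt_deg=45.0):
    """Map a tilt angle (degrees, negative = lean the other way) onto a
    0-180 servo command, centered at 90. Tilts beyond +/-max_tilt_deg
    are clamped so the servo never exceeds its mechanical range."""
    t = max(-max_tilt_deg, min(max_tilt_deg, tilt_deg))
    return 90.0 + (t / max_tilt_deg) * 90.0

print(tilt_to_servo(0.0))    # 90.0 (upright)
print(tilt_to_servo(45.0))   # 180.0 (full lean one way)
print(tilt_to_servo(-60.0))  # 0.0 (clamped full lean the other way)
```

The same function applied twice, once to pitch and once to roll, gives the two mixed channels for the two-axis stomach.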

I was racking my brains because I need to take the pot out of the servo to make it multi-rotational. There are some ideas on Thingiverse for mounting these pots externally, but because the movement is in two axes (rather than Gael's single axis) those solutions look ungainly. So I was going to drive the servo with a separate H-bridge and remove the existing electronics.

Then it struck me that I could remove the existing pot and use a two-resistor voltage divider: basically a pot fixed in the middle position. Then use a regular PWM servo signal to drive the existing servo controller: less than 90 degrees and the servo drives one way, more than 90 degrees and it drives the other way. Ha, so now I can take the new servo signal directly from an Arduino Nano output if I have to... simple...
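With the pot replaced by a fixed divider, the board behaves like a continuous-rotation servo: it always "sees" center, so any command away from center drives the motor, with speed proportional to the error. A sketch of that control logic (the pulse widths and deadband are my assumptions, typical hobby-servo values, not measurements from the actual board):

```python
def drive_from_pulse(pulse_us, center_us=1500, span_us=500, deadband_us=20):
    """Return a signed drive value in [-1, 1] for a servo whose feedback
    pot is replaced by a fixed mid-position divider. Pulses near center
    (within the deadband) give 0; otherwise drive grows with the error
    and saturates at full speed."""
    error = pulse_us - center_us
    if abs(error) <= deadband_us:
        return 0.0
    value = error / float(span_us)
    return max(-1.0, min(1.0, value))

print(drive_from_pulse(1500))  # 0.0  (hold still)
print(drive_from_pulse(2000))  # 1.0  (full speed one way)
print(drive_from_pulse(1250))  # -0.5 (half speed the other way)
```

The deadband matters in practice: without it, jitter on the incoming pulse makes the motor hunt around the stop position.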

 

 

GroG's picture

Thanks for all the info ..

Thanks for all the info .. I'll use it as a reference ...

FYI - your exploding torso should be fixed in this PR - https://github.com/MyRobotLab/myrobotlab/pull/808 
Ask kwatters to review.