Using the Unix/Linux tail command in Windows 10


This is not so much a question as a passing on of a tip.

I have been working with Nixie on Windows 10 and had on various occasions a need for a running tail of the log file. In Linux/Unix this is easy: just use the tail command and Bob's yer uncle, as they say.


That quest isn't as straightforward in Windows, but Windows PowerShell (PS) comes to the rescue. PS is very powerful, but out of the box it comes with restrictions: to be able to run scripts you have to tell PS that it is OK to trust local scripts. To do this, run PS in admin mode, then execute the following command:
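    Set-ExecutionPolicy RemoteSigned

(RemoteSigned trusts scripts you wrote locally while still requiring downloaded scripts to be signed; Unrestricted also works but is looser.) With that out of the way, the PS equivalent of tail -f is Get-Content with the -Wait switch. Assuming the MRL log is myrobotlab.log in the current directory:

    # show the last 20 lines, then keep printing new lines as they arrive
    Get-Content .\myrobotlab.log -Wait -Tail 20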


Kryten the InMoov robot


Hi Folks

A quick introduction and some questions. My robot is called Kryten, after the humanoid robot in the Red Dwarf series. He is pretty much a standard Gael build, with the addition of the articulating stomach.

He stands on a mannequin leg set liberated from a local department store, which makes him, oh, about 6 feet 8 inches tall.


Articulating stomach


For those of you who implemented Bob Houston's Articulating Stomach, how are you driving this from MRL?

Getting ready to hook up this servo pair; scratching my head...
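For what it's worth, the generic MRL pattern is two Servo services attached to the same Arduino. A minimal Python sketch, where the port, the pins, and the mirrored pairing are assumptions for illustration, not necessarily how Bob drives his:

    arduino = Runtime.start("arduino", "Arduino")
    arduino.connect("COM3")               # assumed port

    stomachL = Runtime.start("stomachL", "Servo")
    stomachR = Runtime.start("stomachR", "Servo")
    stomachL.attach(arduino, 9)           # assumed pins
    stomachR.attach(arduino, 10)

    # if the two servos face each other, one must run mirrored
    stomachR.setInverted(True)

    stomachL.moveTo(90)
    stomachR.moveTo(90)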

Pauses after audio file played



Here is an example of what I'm trying to fix ..


Ultrasonic sensor(s) in Nixie: questions and a suggestion


Hi Folks

I have a standard HC-SR04 sensor - actually I have two - but looking at ultrasonic.py in Nixie's services, I don't see a way of actually activating two individual sensors.

What is the correct way of testing this sensor to make sure it works?


In ultrasonic.py there is a print statement - print "ultrasonicSensor test is ", i01.getUltrasonicSensorDistance() - however, I have not found where this output is printed.
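On the two-sensor question, nothing seems to stop you from starting two UltrasonicSensor services side by side instead of going through the i01 wrapper. A rough sketch - the port, the pins, and the exact attach/range method names are assumptions to check against the service page for your build:

    arduino = Runtime.start("arduino", "Arduino")
    arduino.connect("COM3")                    # assumed port

    leftSr04  = Runtime.start("leftSr04",  "UltrasonicSensor")
    rightSr04 = Runtime.start("rightSr04", "UltrasonicSensor")
    leftSr04.attach(arduino, 7, 8)             # assumed trigger, echo pins
    rightSr04.attach(arduino, 9, 10)

    leftSr04.startRanging()
    rightSr04.startRanging()

    print "left cm ",  leftSr04.range()
    print "right cm ", rightSr04.range()

As for the print statement, output from Jython scripts normally lands in the Python service's console tab in the web UI and in myrobotlab.log, not in the main command window.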


ServoMixer Service

javadoc


The user interface is worky in this PR - https://github.com/MyRobotLab/myrobotlab/pull/801
"sub panels" how to get service page parts showing in other service pages is worky, as you might be able to tell by the set of servos here.

Example for develop:

[[ServoMixer.py]]

Can't get Manticore (or Nixie) to work on Win10


Reloaded a clean/new install of Win10 64-bit.

Downloaded Java: java version "1.8.0_281", Java HotSpot(TM) 64-Bit Server VM (build 25.281-b09, mixed mode)

Downloaded Manticore: version 1.0.2693

Created an MRL directory and ran java -jar myrobotlab.jar -install
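For anyone retracing these steps, the full sequence from a fresh command prompt looks like this - assuming the jar keeps its download name and lives in the new directory:

    java -version
    cd \mrl
    java -jar myrobotlab.jar -install
    java -jar myrobotlab.jar

The -install pass pulls down the service dependencies; the last line actually starts MRL.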

Use Jetson Nano as a visual brain

I played a little with the Jetson Nano Hello AI World demo. It's impressive how fast it all runs. Now I ask myself whether it is possible to use the Jetson as a brain for object recognition, face recognition, etc. for InMoov. Instead of using OpenCV in myrobotlab, the various data from e.g. detectnet would be fed into myrobotlab by the Jetson.
Is that generally possible, and how would something like this have to be set up?
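Generally, yes. One way that leaves myrobotlab's OpenCV path untouched is to publish detectnet's results from the Jetson over MQTT and let MRL subscribe to them (MRL has an Mqtt service - see the post below). A sketch of the Jetson side using the jetson-inference Python bindings and paho-mqtt; the broker host, topic, and model name are placeholders:

    import json
    import jetson.inference
    import jetson.utils
    import paho.mqtt.client as mqtt

    # load the Hello AI World detection model and open the camera
    net = jetson.inference.detectNet("ssd-mobilenet-v2", threshold=0.5)
    camera = jetson.utils.videoSource("csi://0")   # "/dev/video0" for USB cams

    client = mqtt.Client()
    client.connect("mrl-host.local", 1883)         # placeholder broker address

    while True:
        img = camera.Capture()
        detections = net.Detect(img)
        payload = [{"label": net.GetClassDesc(d.ClassID),
                    "confidence": round(d.Confidence, 2)}
                   for d in detections]
        client.publish("inmoov/vision/detections", json.dumps(payload))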

MQTT !


Have worky (N) Mqtt Instances...

It may look like duplicates, but it's not: there are 4 MRL instances connected together, and all (so far) appear to be fully functional in the WebGui.

There are 3 which have MQTT connectivity, and another that is connected through the WebGui service.
WOOHOO ! - will continue looking for bugs and doing more refactoring ...