I am working on a new robotic head which should operate with MRL... it has a 3-DOF neck mechanism for realistic head movements, 2 sound-sensor ears and a 2-eye mechanism for image processing... the eye mechanism has 2 eyes: one is a webcam with 2 DOF (pan-tilt) and the second is a PIR sensor which can move sideways with the other eye... this post is just to announce my new project and to get some help from the MRL community... I am new to MRL and Python and working hard to learn about them... I have been dealing with robotics for years and am familiar with Arduinos... this robotic head has 2 Arduino Pro Minis on it... one dedicated to MRL (running MRLComm.ino) and the other taking care of other jobs such as neck movements and ears... the ears serve to find the direction of incoming sound and turn the head that way, but I am planning to integrate the webcam mic with MRL's sound services, for things such as responding to comments... what do you think?... Any help will be appreciated....

[[Tracking.PIRTriggered.borsaci06.py]]

 

GroG

10 years 11 months ago

I'd love to see video of it in action; I'm still having difficulty figuring out how the servos move in relation to the head.  I'd be very excited to see PID work with this.  My experience with PID is that it works along a single axis - here you have a system where one coordinate on one axis affects the coordinates on the other axes - complex !   But I'm wondering if PID will work well regardless of the complexity, since it's always doing a relative error correction - or maybe there are more efficient combinations of PIDs .. exciting !
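For anyone following along, the single-axis idea discussed above can be pictured with a textbook PID loop in a few lines of Python. To be clear, this is a generic illustration, not MRL's PID implementation, and the gains and time step are made-up values:

```python
class PID:
    """Minimal single-axis PID controller (textbook form)."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured, dt):
        """Return a correction from the current error along one axis."""
        error = setpoint - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# drive a simulated servo angle toward a 90-degree setpoint
pid = PID(kp=0.2, ki=0.05, kd=0.01)
angle = 30.0
for _ in range(100):
    angle += pid.update(90.0, angle, dt=0.1)
print("final angle: %.1f" % angle)  # settles near the 90-degree setpoint
```

Because each controller only ever corrects its own relative error, coupling between axes shows up as disturbance that the loop keeps chipping away at - which is why PID often works acceptably even on mechanically coupled necks.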

I'm also wondering how you do sound location?  Is it 2 small mics and a differencing + trig algorithm?
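The "differencing + trig" approach mentioned here would estimate the bearing from the time difference of arrival between the two mics. A rough sketch of that idea, assuming a made-up ear spacing of 15 cm and sound at roughly 343 m/s (this is not the code used on the head, which counts threshold crossings instead):

```python
import math

SPEED_OF_SOUND = 343.0   # m/s at room temperature
EAR_SPACING = 0.15       # metres between the two mics (assumed head width)

def sound_angle(dt_seconds):
    """Estimate the bearing of a sound from the inter-mic time difference.

    Positive dt means the sound reached the right mic first.
    Returns degrees off centre: 0 = straight ahead, +90 = hard right.
    """
    # extra path length the sound travelled to the far ear
    path_diff = SPEED_OF_SOUND * dt_seconds
    # clamp in case noise pushes |path_diff| past the ear spacing
    ratio = max(-1.0, min(1.0, path_diff / EAR_SPACING))
    return math.degrees(math.asin(ratio))

print(round(sound_angle(0.0), 1))          # 0.0 - sound dead ahead
print(round(sound_angle(0.075 / 343.0), 1))  # 30.0 - sound off to the right
```

The catch in practice is measuring dt: with a ~15 cm baseline the full left-to-right sweep spans well under half a millisecond, so the mics need fast, synchronized sampling.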

thx GroG... I haven't integrated the neck movements into MRL yet... a dedicated Arduino takes care of them... the single back servo turns the head sideways on a roller-bearing turntable, and the 2 servos on each side pull or push the neck for side-to-side and back-front motion... the trick is the ball joint just above the bearing in the middle of the turntable... no PIDs for now... just pre-programmed head movements or serial commands from the Arduino IDE... "look up"... "look front-left"... kinda things... sound location is done via 2 sound sensors I made myself... I can post the circuit and the program for locating the direction of sound here if you want... or if anyone is interested...

 

Hi, I like this project. It would help me to understand what exactly the code does - when there is a sound on the left side, does the head turn to the left, and vice versa? I want to do this for my project... could you please publish more information about the circuit? Thank you very much.

borsaci06

10 years 11 months ago

Here is the code to locate sound direction... and the circuit I made as the sound sensors... there are 2 of them, one on each side of the head... the mics are simple condenser mics... the transistor is a 2N3904 but any NPN will work... the two pots are for sensitivity adjustment...
 
 
/*
 * Dincer Hepguler
 * Mic01 on A0 as right sound sensor
 * Mic02 on A1 as left sound sensor
 */
#include <Servo.h>
 
// these constants won't change:
const int ledPin = 13;     // LED connected to digital pin 13
const int mic01 = A0;      // right amplifier output on analog pin A0
const int mic02 = A1;      // left amplifier output on analog pin A1
Servo myservo;             // create servo object to control a servo
 
// these variables will change:
int sensorReading01 = 0;   // values read from the sensor pins
int sensorReading02 = 0;
int threshold01 = 160;     // change this threshold according to your environmental sound level
int threshold02 = 160;     // and this one too; similar ears should give similar values
int right = 0;             // threshold-crossing counters for each ear
int left = 0;
int ServoDir = 90;         // target servo position
int ServoPos = 90;         // current servo position
 
void setup() {
  pinMode(ledPin, OUTPUT);   // declare the ledPin as OUTPUT (analog inputs need no pinMode)
  Serial.begin(57600);       // use the serial port
  myservo.attach(9);         // attach the servo on pin 9 to the servo object
  myservo.write(ServoPos);
  delay(100);
}
 
void loop() {
  // sample both mics 10 times and count threshold crossings per ear
  for (int x = 0; x < 10; x++) {
    sensorReading01 = analogRead(mic01);
    sensorReading02 = analogRead(mic02);
    Serial.print(sensorReading01); Serial.print(" ");
    Serial.print(sensorReading02); Serial.println();
    delay(10);
 
    if (sensorReading01 > threshold01) {
      Serial.print("right "); Serial.print(sensorReading01);
      Serial.print(" "); Serial.println(ServoPos);
      right++;
    }
    if (sensorReading02 > threshold02) {
      Serial.print("left  "); Serial.print(sensorReading02);
      Serial.print(" "); Serial.println(ServoPos);
      left++;
    }
  }
 
  // turn toward whichever side heard more; a louder reading means a bigger turn
  if (left > right) {
    ServoDir = map(sensorReading02, 0, 1023, 90, 30);
    for (; ServoPos > ServoDir; ServoPos--) {
      myservo.write(ServoPos);
      delay(10);
    }
  } else if (right > left) {
    ServoDir = map(sensorReading01, 0, 1023, 90, 150);
    for (; ServoPos < ServoDir; ServoPos++) {
      myservo.write(ServoPos);
      delay(10);
    }
  }
  right = 0;
  left = 0;
  delay(10);  // better for monitoring data
}
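For anyone who wants to prototype the idea on a PC before flashing an Arduino, the decision logic of the sketch above - count threshold crossings per ear over a sampling window, then turn toward the side with more hits - can be mirrored in plain Python. The sample values below are made-up stand-ins for analogRead readings:

```python
THRESHOLD = 160  # same threshold as in the Arduino sketch

def pick_direction(right_samples, left_samples):
    """Count threshold crossings per ear and return which way to turn."""
    right_hits = sum(1 for v in right_samples if v > THRESHOLD)
    left_hits = sum(1 for v in left_samples if v > THRESHOLD)
    if left_hits > right_hits:
        return "left"
    if right_hits > left_hits:
        return "right"
    return "center"  # tie or silence: stay put

# a loud burst picked up mostly by the left ear
print(pick_direction(right_samples=[100, 120, 90],
                     left_samples=[100, 400, 380]))  # left
```

Counting crossings rather than comparing single readings is what makes the scheme tolerant of one-off noise spikes on either mic.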
 

borsaci06

10 years 11 months ago

@GroG... I tried both scripts but neither worked for me... :(  Ale's script gives a Python error and doesn't create a valid OpenCV service... the other one only follows the sensor triggering on the Arduino oscope but doesn't supply valid triggering data... I am posting my oscope output as you requested....

Just in case you didn't see my "shoutbox" messages, it looks like your signal is coming in as D3 (yellow line same color as D3 on the left legend) and you are also tracking a flat signal on some of the green pins (D5?).

The oscope tab isn't exactly intuitive to me either but I think what I'm describing above is correct.

Thx KMC... I also read your shouts... I already realized that the yellow line on top is coming from my PIR sensor, because it triggers as I move in front of it... the green line on D3 is there because I tried to set D3 to "in" and "on" from the pins tab on the right, under the oscope tab... I did that because the triggered data on the yellow line doesn't have any effect on the MRL Python script and the tracking service doesn't start tracking... if you take a look at the script that Aless wrote, it should check whether the PIR state is 1 or 0, but the "if" statement there is never executed... it cannot read the publishPin data and gives an error message... so it doesn't activate face tracking... I am desperately trying to make MRL read an input and execute branching with a simple if statement...
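The branching being described - "if the PIR pin reads 1, start tracking, otherwise stop" - is easy to sketch on its own. The exact MRL wiring (registering a listener for the Arduino service's publishPin events) varies between MRL versions, so the stand-in PinEvent class and the pin number below are assumptions; only the if-statement logic is the point:

```python
class PinEvent:
    """Stand-in for the pin data an Arduino service publishes (pin + value)."""
    def __init__(self, pin, value):
        self.pin = pin
        self.value = value

PIR_PIN = 3            # assumed PIR input pin
tracking_active = False

def on_pin(event):
    """Start tracking when the PIR pin goes high, stop when it goes low."""
    global tracking_active
    if event.pin != PIR_PIN:
        return tracking_active        # ignore events from other pins
    if event.value == 1:
        tracking_active = True        # in MRL you would call e.g. tracker.faceDetect() here
    else:
        tracking_active = False       # ...and stop or idle the tracker here
    return tracking_active

print(on_pin(PinEvent(3, 1)))  # True - PIR triggered
print(on_pin(PinEvent(3, 0)))  # False - PIR released
```

In an actual MRL script the function would be invoked by the framework as pin events arrive rather than called by hand, but the two if branches are the same.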

borsaci06

10 years 11 months ago

I am adding a video I made during my tests of simple face tracking with the "facetrackingminimal" Python script... as you can see it can successfully find and track faces... but I am still unable to integrate the PIR sensor into the system to activate detection of human presence...

Nice to see you ! and thanks for the video post :D

What is "facetrackingminimal"  ?  I see a Tracking.minimal.py and a facetrackinglight.py in the Python/examples ..

In order to successfully integrate, we need to know what exactly we are integrating !

Nice demo !

@GroG... the script I am using in this test is the following one, the tracking minimal script slightly modified to my needs...

#file : Tracking.minimal.py
# a minimal tracking script - this will start all peer
# services and attach everything appropriately
# change parameters depending on your pan tilt, pins and
# Arduino details
# all commented code is not necessary but allows custom
# options
 
port = "COM3"
xServoPin = 9
yServoPin = 10
 
tracker = Runtime.createAndStart("tracker", "Tracking")
 
# set specifics on each Servo
servoX = tracker.getX()
servoX.setPin(xServoPin)
servoX.setMinMax(30, 150)
servoX.setInverted(True) 
 
servoY = tracker.getY()
servoY.setPin(yServoPin)
servoY.setMinMax(30, 150)
 
# optional filter settings
opencv = tracker.getOpenCV()
 
# setting the camera index - default is 0
opencv.setCameraIndex(2) 
 
# connect to the Arduino
tracker.connect(port)
 
# Gray & PyramidDown make face tracking
# faster - if you dont like these filters - you
# may remove them before you select a tracking type with
# the following command
#tracker.clearPreFilters()
 
 
# different types of tracking
# --------------------------
# simple face detection and tracking
tracker.faceDetect()
 
# lkpoint - click in video stream with 
# mouse and it should track
#tracker.startLKTracking()
 
# scans for faces - tracks if found
#tracker.findFace() 

BTW, please tell me how I can paste a program block as a separate element in the post body...

@Aless... yes I could, but I did not want to bother you... GroG was online and was working on the issue... many thanks for your help so far... you were always there to help... in the end I see that the new script in my post is working flawlessly, and it is a modified version of your old script... you are a guru...

GroG

10 years 8 months ago

With some of the new features of MRL - you can now debounce signals at a configurable interval - 

So the signal will allow changes of states only after the debounce time has expired.

The code is :

arduino.digitalDebounceOn(1000)

After the signal changes, this setting will prevent it from changing again until 1 second has gone by.

It looks just the opposite of how I would expect, but it is working as I would expect - the funny thing is that the trace slows down.  If the trace ran at constant speed you would see the fluctuations stretch out... weird, but only a display issue.
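The debounce behaviour described above - allow a state change through, then ignore further changes until the interval expires - can be pictured in plain Python. This is only an illustration of the idea, not MRL's internal code:

```python
class Debouncer:
    """Pass a state change through only if the debounce window has expired."""
    def __init__(self, interval_ms):
        self.interval_ms = interval_ms
        self.state = 0
        self.last_change_ms = -interval_ms  # allow the very first change

    def update(self, new_state, now_ms):
        """Report the debounced state for a raw reading at time now_ms."""
        if new_state != self.state and now_ms - self.last_change_ms >= self.interval_ms:
            self.state = new_state
            self.last_change_ms = now_ms
        return self.state

d = Debouncer(1000)               # like arduino.digitalDebounceOn(1000)
print(d.update(1, now_ms=0))      # 1 - first change accepted
print(d.update(0, now_ms=200))    # 1 - bounce within 1 s is ignored
print(d.update(0, now_ms=1200))   # 0 - change accepted after 1 s has passed
```

This is exactly the property that matters for a PIR sensor: a single trigger holds the "person present" state high long enough for the tracking script to react, instead of flickering with every sensor bounce.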