For those too young to have seen the James Bond movie Goldfinger: meet Odd Job, the evil henchman.

His hat had a deadly, razor-sharp blade in the brim. This was one frisbee you did not want to catch.

Recently a customer asked me to make a robot chassis that could carry a 50 kg payload. The chassis had to be able to travel across mud, sand and snow, be waterproof to an IP rating of 65 or better, and weigh less than 17 kg.

Unfortunately, due to COVID-19 the project has been shelved. This left me with a nice robot base that needed a purpose.

The chassis weighs 16 kg and has two 450 W brushless hub motors from one of those self-balancing platforms that you stand on and steer by tilting your feet.

These are low-speed, high-torque motors that drive the wheels directly, without any gearbox; the low speed and high torque come from their large number of poles.

Maximum speed depends on the battery voltage. I'm using a 36 V, 10 Ah battery, so top speed should be about 10 km/h according to the website I bought them from.

Because they run at low speed, the back EMF is too weak for a sensorless motor driver. The green PCB holds three Hall effect sensors that are used to generate the commutation pattern for the motor driver.

These are 5 V sensors with digital outputs. The output pattern also works as an encoder, allowing your MCU to measure speed, direction and distance.
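To illustrate how the same signals double as an encoder, here is a minimal Python sketch. The six-step sequence, pole-pair count and wheel circumference are assumptions for the example, not measured values from these motors:

```python
# Minimal sketch: decoding the three Hall-sensor outputs as a coarse encoder.
# Valid six-step commutation sequence (H1, H2, H3 packed into 3 bits).
HALL_SEQUENCE = [0b001, 0b011, 0b010, 0b110, 0b100, 0b101]

def hall_step(prev_state, new_state):
    """Return +1, -1 or 0 depending on how the Hall state changed."""
    if prev_state not in HALL_SEQUENCE or new_state not in HALL_SEQUENCE:
        return 0  # invalid reading, e.g. caught mid-transition
    diff = (HALL_SEQUENCE.index(new_state) - HALL_SEQUENCE.index(prev_state)) % 6
    if diff == 1:
        return +1  # one electrical step forwards
    if diff == 5:
        return -1  # one electrical step backwards
    return 0       # unchanged, or a missed step

# Hypothetical conversion to distance: a 15-pole-pair motor would give
# 6 * 15 = 90 Hall steps per wheel revolution.
STEPS_PER_REV = 90
WHEEL_CIRCUMFERENCE_M = 0.55  # assumed for the treaded drive wheel

def distance_m(step_count):
    return step_count / STEPS_PER_REV * WHEEL_CIRCUMFERENCE_M
```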

I removed the rubber tires and 3D printed adaptor rings so that the motors could drive some snowmobile treads. The treads are 12 cm wide.

Time to build another BoozeBot!

Like all robots, if it can't get you a beer from the fridge then what good is it?

Until now my robot programming experience has been limited to PICAXE and Arduino. I have had nothing to do with AI, Linux or Python, so this project represents a big learning curve for me.

After a bit of research I bought myself a Jetson Nano with 4 GB of memory and dual 4K HD cameras for binocular vision. As a bonus, the camera module includes a 9-axis motion tracker, which is great for collision detection (using the 3-axis accelerometer) and navigation (using the 3-axis compass).

See the attached datasheet: http://myrobotlab.org/sites/default/files/9 axis sensor ICM-20948-v1.3.pdf
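To sketch what I mean by those two uses, here is a rough Python illustration; the threshold and the level-head assumption are mine, not anything from the datasheet:

```python
# Rough sketch of the two IMU uses: collision detection and compass heading.
import math

def collided(ax_g, ay_g, az_g, threshold_g=2.0):
    """Flag a collision when the acceleration magnitude spikes well above 1 g."""
    return math.sqrt(ax_g**2 + ay_g**2 + az_g**2) > threshold_g

def compass_heading_deg(mx, my):
    """Heading from the magnetometer's X/Y axes, valid while the head is level."""
    return math.degrees(math.atan2(my, mx)) % 360
```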

I'm now 3D printing a head to put it in.

I've got NeoPixel rings around the cameras, although later I might replace them with IR LEDs for night vision.


I noticed that the Jetson Nano has I2S headers for audio, so I bought some MEMS I2S microphones for the ears. See the attached datasheet: http://myrobotlab.org/sites/default/files/INMP441 I2S microphone.pdf
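Once I2S capture is working on the Nano, a first microphone test might be as simple as this Python sketch; the device index, sample rate and channel mapping are guesses to be adjusted:

```python
# Quick capture test, assuming the I2S mics show up as an ALSA device.
import sounddevice as sd
import soundfile as sf

SAMPLE_RATE = 48000          # within the INMP441's supported range
SECONDS = 5

print(sd.query_devices())    # list devices, find the I2S capture entry
sd.default.device = 1        # assumed index; adjust to match the list

audio = sd.rec(int(SECONDS * SAMPLE_RATE), samplerate=SAMPLE_RATE,
               channels=2,     # one mic per ear, selected via the L/R pin
               dtype="int32")  # 24-bit samples arrive in 32-bit frames
sd.wait()
sf.write("ears_test.wav", audio, SAMPLE_RATE)
```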


I am very happy with the neck: it can rotate, bob up and down, and tilt left and right as well as forwards and backwards.

There are 6 stud-mounted ball bearings that ride inside the lower yellow ring. These support the neck so that the servo does not take all of the load.

The grippers are intentionally large so that the robot can grasp larger objects such as coffee tins. I have vague hopes of getting the robot to make coffee in the morning.

Unfortunately the wrist can only rotate for now, but that is enough for the robot to use a broom or mop.

The shoulders and elbows use high-torque servos for now.

I'm not confident that the gearboxes can really handle the 380 kg·cm the company boasts, but they should be good enough to open the fridge and pick up a can of beer.
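As a rough sanity check, with my own guesses for the numbers (not the manufacturer's figures):

```python
# Static torque to hold a can at the end of a horizontal arm, in the same
# kg·cm units the servo vendors use. Can mass and arm length are guesses.
can_mass_kg = 0.5       # a full 500 ml can
arm_length_cm = 50      # shoulder-to-gripper lever arm, held horizontal
required = can_mass_kg * arm_length_cm  # static torque in kg·cm
print(f"{required} kg·cm needed vs 380 kg·cm claimed")  # -> 25.0 kg·cm
```

Even if the gearbox only delivers a fraction of the claimed torque, and the arm's own weight eats into the margin, a can of beer should be well within reach.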


My service droid evil henchman, "Odd Job", in the raised and lowered positions.
In the raised position the grippers are high enough that the robot can work at a table or kitchen bench.
In the lowered position the robot can pick up items from the floor.

Over time the design has evolved slightly. I had to mount some electronics on the chest and had tried to give the upper torso a triangular shape for a "buff" look.

Unfortunately, once I added a cover the shoulder movement became restricted. The current design has a bit of a beer-belly appearance (ironic, huh?) but allows the shoulders a better range of movement.


As you can see from the latest photo there is still work to be done. The aluminium parts need painting, some of the covers are missing and there is no wiring.

The preliminary assembly was just to check that the parts fitted and to look for problems. For example, the elbow timing-belt tensioning assembly needs to be improved.

I also need to rearrange the layout of the chassis, as it was originally designed for a different project. The current design has the hub motors diagonally opposite each other to spread the weight evenly. I want to have both motors at the back to counter the weight of anything the robot picks up.

More photos to come in the following weeks.


GroG

2 years 11 months ago

If you stock your fridge with all beer, the YOLO filter has a ~70% chance of getting it right ;)

If only it was that easy! I'd need to buy a separate fridge (not impossible).

I do not think it will be a big challenge.

  1. I was planning to keep the beer in a set location.
  2. The beer I like comes in a gold can.

Even if I get a different beer, the set location (in the door) should prevent me from getting a vegetable by mistake.

God forbid!

Once a target is identified with YOLO, the published image and coordinates can be consumed...
but I think to be effective you need localization data. We don't currently have a localization service, but I suspect there are libraries that would quickly remedy this.
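Consuming a detection might look something like this hypothetical snippet; the ultralytics package, model file and "bottle" class are my assumptions, not a MyRobotLab API:

```python
# Hypothetical example: pull pixel coordinates out of a YOLO detection.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")       # small pretrained COCO model
results = model("fridge.jpg")    # assumed snapshot from the robot's camera

for box in results[0].boxes:
    if model.names[int(box.cls)] == "bottle":  # COCO has no "beer can" class
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        print(f"target centre at pixel ({(x1 + x2) / 2:.0f}, {(y1 + y2) / 2:.0f})")
```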

How will your bot know where it is in space?
The camera itself can be used as a localization sensor, augmented with hints if you want, like TopCodes, or single-camera OpenCV localization: https://kapernikov.com/object-localization-with-a-single-camera-and--dimensions/

And DH parameters might be of use for the kinematics:

http://myrobotlab.org/content/calculating-denavit-hartenberg-parameters-inmoov
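For reference, the standard Denavit-Hartenberg transform for a single joint looks like this (textbook convention only; the actual InMoov parameter values are on the linked page):

```python
# Standard DH transform: homogeneous 4x4 matrix for one joint.
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Transform for DH parameters (theta, d, a, alpha), standard convention."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [ 0,       sa,       ca,      d],
        [ 0,        0,        0,      1],
    ])

# Chain the per-joint transforms to get the gripper pose:
# pose = dh_transform(...) @ dh_transform(...) @ ...
```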

As I mentioned when describing the hub motors, the commutation signals double as an encoder that can measure direction as well as distance and speed.

Admittedly, dead reckoning using the encoders and the 3-axis compass in the head would still result in accumulated errors.
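The pose update I have in mind is nothing fancier than this sketch, which is also why the errors accumulate; every noisy increment is integrated forever:

```python
# Minimal dead-reckoning update: integrate encoder distance along the
# compass heading. Sensor noise in either input drifts the estimate over time.
import math

x_m, y_m = 0.0, 0.0

def update_pose(delta_distance_m, heading_deg):
    """Advance the position estimate by one encoder increment."""
    global x_m, y_m
    x_m += delta_distance_m * math.cos(math.radians(heading_deg))
    y_m += delta_distance_m * math.sin(math.radians(heading_deg))
```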

My brother suggested a few QR codes about the room.

Previously I have used an IR LED transmitting a SIRC (Sony IR Code) signal as a beacon, similar to a lighthouse. The robot was able to identify the beacon by the code it was transmitting.

The IR receiver was shielded so it only detected signals directly ahead, within 10 degrees. Rotating the IR receiver and measuring the angle allowed the robot to determine not only the beacon's location but also its distance.

My plan is to mount a number of these beacons around the apartment, four to a room, up high near the ceiling, with each beacon transmitting a different identification code.

Identifying the beacons will tell the robot which room it is in. Measuring the angles between the beacons will allow the robot to triangulate its position within the room.

This is essentially a local GPS system using IR beacons instead of satellites. Having the beacons up high reduces the chances of a person blocking the signal when they are standing between the beacon and the robot.

Just like GPS, the more beacons the robot can detect, the more accurately it can determine its location.
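To make the idea concrete, here is a minimal two-beacon sketch; the room coordinates and bearings are made-up example numbers, and a real implementation would average over every visible beacon:

```python
# Two-beacon fix: the robot sits at the intersection of the sight lines to
# two beacons at known positions, using compass-referenced bearings.
import math

def triangulate(b1, theta1, b2, theta2):
    """b1, b2: beacon (x, y) positions; theta1, theta2: absolute bearings
    (radians) from the robot to each beacon. Returns the robot's (x, y)."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # The robot r satisfies r + t_i * d_i = b_i, so solve
    # b1 - b2 = t1*d1 - t2*d2 for t1 (Cramer's rule).
    det = d2[0] * d1[1] - d1[0] * d2[1]
    if abs(det) < 1e-9:
        raise ValueError("bearings are parallel; no unique fix")
    dx, dy = b1[0] - b2[0], b1[1] - b2[1]
    t1 = (d2[0] * dy - d2[1] * dx) / det
    return (b1[0] - t1 * d1[0], b1[1] - t1 * d1[1])

# Beacons in two corners of a room; robot sees them at 135° and 45°.
print(triangulate((0.0, 0.0), math.radians(135),
                  (10.0, 0.0), math.radians(45)))   # -> (5.0, -5.0)
```

A third beacon would over-determine the fix and let the robot average out bearing errors, which is where the GPS analogy really pays off.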

Previously I had only used one beacon, with a small four-legged robot, so triangulation was not used. This project will be the first time that I have had a use for a constellation of beacons.

Oculus Rift works this way ...
constellation sensors (2 to 4 cameras) are set about the room at the corners ...

(image: setting up Oculus Rift constellation sensors)

The helmet of a Rift has a pattern of IR LEDs.

Tracking and localization, including pitch/yaw/roll, is done by processing the camera data ...
A more advanced concept reverses this, so the cameras are onboard the helmet (Oculus Rift).

These multiple cameras on the device itself do the tracking from the incoming images.
The cameras basically "pick points" to track...
like the edge of your desk, or your door's corners, or a window frame ...

If you have multiple cameras tracking points, you can use all that information to accurately calculate where you are in your room and your current pitch, yaw, and roll, without any external constellation sensors.
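A rough illustration of that "picking points" step, using OpenCV's stock corner detector (the file name is a stand-in for a live camera frame):

```python
# Find trackable features (desk edges, door corners, window frames) in a frame.
import cv2

frame = cv2.imread("room.jpg", cv2.IMREAD_GRAYSCALE)   # assumed saved frame
corners = cv2.goodFeaturesToTrack(frame, maxCorners=100,
                                  qualityLevel=0.01, minDistance=10)
print(f"{len(corners)} candidate points to track")
```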