Orbous Mundus is the first of hopefully many Plantoid Robots created by GroG and me. I build the hardware and ask lots of questions, and GroG has been providing his creatorship of the custom MRL Plantoid service.

This particular robot is a bog-brained mind model that represents some key functions of the concepts we are working to express. Using the many services of MRL, we hope to add some exciting higher-level functions and controllers to our existing arsenal of plantoids.

Currently Orbous can report its own uptime, twitch its legs, stream video, and pan/tilt. It's undergoing maintenance right now (new feet), and many small details are still being ironed out.


So... WHAT IS A PLANTOID?! - A plantoid robot is defined on Wikipedia as:

A plantoid is a hypothetical robot or synthetic organism designed to look, act and grow like a plant. The concept was first scientifically published in 2010[1] (although models of comparable systems controlled by neural networks date back to 2003[2]) and has so far remained largely theoretical. A prototype for the European Space Agency is now in development.[3]

....and to be perfectly honest, I took the words "hypothetical" and "largely theoretical" personally after noticing that they were largely, and disappointingly, true. We need REAL robots that let us observe real experiments and show us a completely novel manifestation of what a Plantoid Robot is!

When Curiosity landed on Mars, I realized what the next real achievements in taking life to new planets would be, and started making tangible models that don't just model a plant, but are actually living experimental platforms that use democratized technology to achieve real goals: keeping a plant alive in your house, gardening a Mars base, and more than anything, BEING a living thing!

In order to achieve a true concept of plantoid, a certain existential exchange of intelligence must be achieved between the plant and the robot elements. In creations such as Orbous, the sensory brain stem of the plant will dictate the movement and motivations of the rest of the robot, while the robot will use spatial intelligence and programming to steer the plant brain away from dangers and hostile environments.

We have been using carnivorous plants in our studies thus far, because they tend to be a little bit more physically active and robust for experiments, though we don't let that get in the way of the fact that we are making a robotic swarm of carnivorous cyborgs, which in itself is awesome.


Orbous carries an Arduino Mega with a Grove shield from Seeed Studio. Sensors include ambient light, soil moisture, humidity, air quality, and ultrasonic, and may be swapped in and out as needed; experimentation is continuous. Each type of data taken in is used to affect and motivate the robotic system in a way that promotes survival and growth for the plantoid experiment. Thresholds are set in code to trigger adjustments in movement and location, and give us readings to add into the total sensory set alongside the more advanced NDVI camera rigs on the Plantoid Server page.
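As a rough illustration of the kind of threshold logic described here, the sketch below maps raw sensor readings to simple behavioral triggers. The sensor names and threshold values are made up for the example; they are not Orbous's actual configuration.

```python
# Hypothetical threshold table: raw ADC counts / percent RH.
# Values are illustrative, not Orbous's real setpoints.
THRESHOLDS = {
    "soil_moisture": {"low": 300, "high": 800},
    "ambient_light": {"low": 100, "high": 900},
    "humidity":      {"low": 30,  "high": 90},
}

def evaluate(readings):
    """Map raw sensor readings to simple behavioral triggers."""
    actions = []
    for sensor, limits in THRESHOLDS.items():
        value = readings.get(sensor)
        if value is None:
            continue  # sensor not installed right now; they get swapped often
        if value < limits["low"]:
            actions.append((sensor, "below_min"))
        elif value > limits["high"]:
            actions.append((sensor, "above_max"))
    return actions

print(evaluate({"soil_moisture": 250, "ambient_light": 500}))
```

A higher-level service could then translate each `(sensor, condition)` pair into a movement or relocation behavior.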

Each plant system that is put into the central brain stem carries its own health profile, such as luminosity and duration of light, desired pH of the soil substrate, and humidity. These setpoints are compared against the data derived from the internal terrarium sensor structure; together they form the information from our plantoid brain stem by which Orbous's lower functions are decided.
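One minimal way to sketch the setpoint comparison described above: a health profile per plant, checked against live readings, returning the fractional deviations that exceed a tolerance. The profile keys and numbers here are invented for illustration.

```python
def compare_profile(profile, readings, tolerance=0.10):
    """Return deviations where a reading strays more than `tolerance`
    (as a fraction of the setpoint) from the plant's preferred value."""
    deviations = {}
    for key, setpoint in profile.items():
        actual = readings.get(key)
        if actual is None:
            continue  # no sensor data for this setpoint right now
        error = (actual - setpoint) / setpoint
        if abs(error) > tolerance:
            deviations[key] = round(error, 3)
    return deviations

# Hypothetical profile for a carnivorous plant vs. current terrarium readings:
venus_flytrap = {"light_lux": 15000, "soil_ph": 4.5, "humidity_pct": 60}
now = {"light_lux": 9000, "soil_ph": 4.6, "humidity_pct": 75}
print(compare_profile(venus_flytrap, now))
# light is 40% low and humidity 25% high; pH is within tolerance
```

The deviation dictionary is what the "lower functions" would act on, e.g. moving toward brighter light when `light_lux` is negative.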



20.11.2013 Orbous gets feet v2 & upgraded power systems...

Finally got the redesigned feet onto Orbous: no more thin ankles cracking. In hindsight I could have thought a bit more about omni feet, but the new version is a nice fix that I made instead of using bought parts. All in all, a huge plus. :)

I added two new UBECs into the power system to isolate loads and prevent brownouts. New servos replaced some that may not have been too weak, but were definitely surging at stall torque and causing issues with function. The new servos are rated at about the top end (20 kg) of anything I've seen in a standard-size package.

I also reorganized the wiring in Orbous's nethers so noise won't hit sensitive spots; some UBEC outputs were in risky proximity to data lines left over from the first functional installation of all of Orbous's tech.


Still on the to-do list:

-Leg servo installation.

-Plantoid viewer page, published to web. (Live demos!)

-Anything else that comes up...






UPDATE 20.2.2014 Orbous Mundus, now with 50% more walking, and video proof to show!!



Update 27.2.2014 -

This is a rough-draft mockup of the targeted page of controls and fun stuff for a WebGUI tab to play with Orbous.

GroG found some nice ways to do dials and we finessed the GUI a little bit, but I think this was before the MRL WebGUI had come about... I forget.

In his first moments of life, GroG and I monitored Orbous through the pin oscilloscope and several sensor dials.

Now with the WebGUI, the actual camera and pilot controls could be made private or secured; maybe this mockup becomes a customized web page that gathers the OpenCV display frame and the sensor dials into one place. This would be easy to make on the robot's various hosted locations, which would just need to tune into the robot's publishing point or local IP address.

Not knowing how to code any of that, or having any intuition about where to start, I've been falling back on a VNC viewer to load Orbous's desktop for displaying OpenCV, and treating the MRL WebGui as my controller.

Notes and Pseudo-Code, XMPP Experiment Ideas:


-Home pan servo to 0

-Increment X degrees (user selectable) and take a still capture.

-Sync to server
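The pan-and-capture steps above could be sketched roughly as below. Only the angle math is real here; the servo and camera calls are placeholders for whatever MRL services end up driving them.

```python
def scan_angles(step_deg, max_deg=180):
    """Angles the pan servo would visit for one panoramic sweep,
    from 0 up to max_deg inclusive."""
    if step_deg <= 0:
        raise ValueError("step must be positive")
    return list(range(0, max_deg + 1, step_deg))

# Hypothetical sweep with a user-selected step of 45 degrees:
for angle in scan_angles(45):
    # placeholders: pan.moveTo(angle); capture_still(); sync_to_server()
    print("capture at", angle)
```

A step of 45 degrees over a 180-degree pan yields five stills, which is about the minimum for a stitchable panorama.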



-Start recording video from the NIR camera

-Upload to Infrapix and download the NDVI result



-Capture a single still image from the NIR and RGB cameras

-Load to the Infrapix site and download the NDVI composite result.



-Start tracking and motion; set tracking via user-selected OpenCV filter



-Start tracking via OpenCV filter, capturing stills or video along the way



-Capture either the NIR or RGB camera at a user-selected interval



-Capture on startup. The internal static NIR camera takes one picture daily; display it on the desktop for cool plantoid-ness.

-Process to NDVI through the Infrapix Python script

-Analyse the NDVI with OpenCV; render an average-color data variable by counting colors.
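For reference, the math behind these NDVI notes is NDVI = (NIR - Red) / (NIR + Red), and an average over the frame can stand in for the "average color" variable. This sketch is pure Python so it runs anywhere; a real pipeline would operate on numpy/OpenCV image arrays instead of flat pixel lists.

```python
def ndvi(nir, red):
    """Per-pixel normalized difference vegetation index, in [-1, 1]."""
    denom = nir + red
    return 0.0 if denom == 0 else (nir - red) / denom

def average_ndvi(nir_pixels, red_pixels):
    """Frame-wide average NDVI from paired NIR and red channel values."""
    values = [ndvi(n, r) for n, r in zip(nir_pixels, red_pixels)]
    return sum(values) / len(values)

# Healthy vegetation reflects strongly in NIR relative to red,
# so these made-up pixel values should land well above zero:
print(average_ndvi([200, 180, 220], [50, 60, 40]))
```

Thresholding the average (say, above 0.3 for healthy foliage) would give the brain stem a single health number to react to.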



-Start streaming MJPEG to port 9090 and embed it on a web page




9 years 4 months ago

Hi DJUltis,

I am very interested in your work on plantoid robots... being a nature fan and a good gardener and bonsaist, I would like to see more of your work and contribute to this plantoid robot concept... I always wanted to do some experiments on this subject... I've done some tinkering and made some Instructables on automated gardening (http://www.instructables.com/id/Intelligent-watering-system-with-arduino/ )... I even worked on getting feedback from plants about their situation (like measuring resistance readouts from live leaves), trying to integrate them into automation... I will be following your work and would like to have some detailed info about the concept of the robot... BTW, loved the Theo Jansen-type legs...


9 years 4 months ago

I'll be working hard on making these robots better in the future. Maybe we can both search for better ways to derive signalling from plants.  Have fun and by all means, we can share and share alike.

As I told you before, your work inspired me to give some time to the Plantoid Robot concept, and I decided to repeat an experiment I made years ago. I tried to communicate with my Hoya carnosa today and wanted to share it here with you... The simple setup was a multimeter reading resistance in ohms, connected to one of the leaves of my Hoya, which has broad and stiff leaves. I used two electrodes, one on each side of the leaf, and ran the experiment for a few hours... The first reading I got after connection was a steady 2.6 Mohms in the room at steady warm conditions... After half an hour I covered the plant with a dark cloth and left it there for an hour or so; during this time the readings began to rise towards 4.3-4.4 Mohms... After that I uncovered the plant again and let the sunshine from the window fall on it, and the readings dropped down to 3.7-3.8 Mohms within half an hour... Later I opened the door in front of her and let cold air come inside for half an hour; the readings rose immediately towards 4.8-4.9 Mohms... Then I closed the door and saw the readings begin to drop, and at that point I took the pictures below to show you the test environment... Tell me what you think... do you think this is a way to communicate with a plant and let her tell us about the level of stress she is under, so that we can integrate an Arduino to read the plant's condition and take the necessary action?
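One plausible way an Arduino could take the multimeter's place here: put the leaf in series with a known resistor as a voltage divider, read the midpoint on an analog pin, and solve for the leaf's resistance. The sketch below is just the math side of that idea in Python; the supply voltage, reference resistor, and ADC counts are assumptions, not measured values.

```python
VCC = 5.0          # assumed supply voltage
R_KNOWN = 1.0e6    # 1 Mohm reference resistor, chosen near the leaf's range
ADC_MAX = 1023     # 10-bit Arduino ADC full scale

def leaf_resistance(adc_reading):
    """Resistance of the leaf (top of the divider) given the ADC count
    measured across the known resistor (bottom of the divider)."""
    v_out = VCC * adc_reading / ADC_MAX
    if v_out <= 0:
        return float("inf")  # open circuit / electrodes not making contact
    return R_KNOWN * (VCC - v_out) / v_out

# e.g. an ADC count of 284 works out to roughly 2.6 Mohm,
# in the same range as the multimeter readings above:
print(leaf_resistance(284) / 1e6, "Mohm")
```

With a mapping like this on the Arduino, the dark/light and warm/cold shifts you measured (2.6 up to ~4.9 Mohms) would show up as distinct ADC ranges that thresholds could act on.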


9 years 4 months ago

I tried to restart MRL for some more debugging... and this was the last recorded transmission, Sir!

Looks dark on the Idohesian Plateau... hope no damage occurred, Sir...


9 years 4 months ago

There are new features in the Plantoid service, one of which is auto-scan, which starts the pan servo scanning back and forth; enough to make a manual panoramic stitch of the Idohesian Plateau!

GroG, I like using Hugin for making panoramas if you want something a bit more seamless in the future.


The autopano stuff works well, but if you're doing manual alignment it's still really easy. I started using it manually on Fedora because licensing issues disabled the autopano stuff, but eventually I found a repo with the auto stuff built in.

Here are my panos that I uploaded to Flickr.



9 years 4 months ago

The natives of the Idohesian Plateau appear to be trying to send us some sort of communication Sir.
We've run it over to the royal linguists, and they say it's some cryptic runes describing the importance of education.

Additionally, the team's biologist has discovered a dangerous predatory animal, whose camouflage has rendered it nearly invisible!