Hello guys!

I'm currently working on a brain-wave controlled prosthetic arm system. I'm basing the arm on the InMoov design with some adjustments and reworking (I'm planning to release the alternative models later on). The plan for the Alpha stage is a simple hand grip, a relaxed hand, a rotating wrist and a moving elbow.

The EEG headset I'm using is the Emotiv EPOC, Research Edition. I picked the EPOC for the sake of price, so people can build their own projects on top of mine (the EPOC currently costs around $750).

A simple action chain would look like this:

Raw data from the EPOC >>> OpenViBE (learning the signal, recognising the trained patterns, and filtering everything with my algorithms) >>> MRL >>> Arduino.
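To make the hand-off at the end of that chain concrete, here is a tiny illustrative mapping from trained pattern labels to the Alpha-stage actions. The label names, servo names and angles are placeholders I made up for the example, not anything fixed by OpenViBE or MRL:

```
# Illustrative only: map the pattern labels trained in OpenViBE to the
# Alpha-stage arm actions. Labels, servo names and angles are placeholders.
ACTION_MAP = {
    "grip":         ("hand",  150),  # close the hand
    "relax":        ("hand",   30),  # open / relax the hand
    "wrist_rotate": ("wrist", 120),  # rotate the wrist
    "elbow_move":   ("elbow",  90),  # move the elbow
}
```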

The last arrow is the one I marked in red for a reason: I'm missing the link between the two programs (OpenViBE and MRL). As long as it's neurobiology or maths algorithms it's my field, but I'm not so sure how to build a link like that :D It would be amazing if someone could figure that out.

For example, when OpenViBE recognises a pattern in the brain signal that has been labelled as wrist rotation, it should send that event to MRL in real time so the arm reacts appropriately.
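To give an idea of what that link could look like: if the scenario uses OpenViBE's Button VRPN Server box (as in the tutorial linked below), a small standalone Python script can subscribe to those events. This is only a sketch under assumptions: it uses the Python bindings that ship with VRPN, the server name "openvibe_vrpn_button@localhost" is just whatever you configure in the box, and the exact callback payload can differ between bindings versions.

```
# Rough sketch of the OpenViBE -> outside-world link over VRPN.
# Assumes the Python bindings that ship with VRPN ("import vrpn") and a
# Button VRPN Server box named "openvibe_vrpn_button" in the scenario.
import vrpn

def on_button(userdata, data):
    # Payload keys depend on the bindings version; assumed here: a dict
    # with the button index and an on/off state.
    if data.get("state") == 1:
        print("OpenViBE event on button %s -> forward to MRL" % data.get("button"))

button = vrpn.receiver.Button("openvibe_vrpn_button@localhost")
button.register_change_handler(None, on_button)

while True:
    button.mainloop()  # pump VRPN messages continuously for low latency
```

From there the script could push each event to MRL over a plain TCP socket; MRL's Python service runs on Jython, so the native VRPN bindings are unlikely to load inside MRL itself, which is why a small bridge script like this in regular CPython looks like the path of least resistance.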

I did some research on the possibilities, and the furthest I got was VRPN. Here's a link to examples and a workflow:

http://openvibe.inria.fr/vrpn-tutorial-sending-data-from-openvibe-to-an…
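On the MRL side, something along these lines could sit in the Python (Jython) service, take the command strings from the bridge script above over TCP, and drive the servos. Again just a sketch: the COM port, pin number, TCP port and the exact Servo attach call are assumptions (the attach API has changed between MRL versions).

```
# Sketch of the receiving end inside MRL's Python (Jython) service.
# Runtime is provided by the Python service; adjust the COM port and pins.
import socket

arduino = Runtime.createAndStart("arduino", "Arduino")
arduino.connect("COM3")

wrist = Runtime.createAndStart("wrist", "Servo")
wrist.attach(arduino, 7)              # attach call may differ per MRL version

# Minimal TCP listener: the VRPN bridge script connects here and sends
# one command per line, e.g. "wrist_rotate\n".
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 5555))      # arbitrary local port
server.listen(1)
conn, addr = server.accept()

buf = ""
while True:
    chunk = conn.recv(1024)
    if not chunk:
        break
    buf += chunk
    while "\n" in buf:
        command, buf = buf.split("\n", 1)
        if command.strip() == "wrist_rotate":
            wrist.moveTo(120)         # example wrist rotation target
```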

 

I will gladly answer any questions :)