here is a java file that calls some pointcloudish Processing code that I can run in eclipse.
the frame rate is appalling, but it would be interesting (to me at least) if you could make it run in MRL.
regardless, it's an example of how the original sample was used, which may help you.
the java build path will need to point at c:/path to processing/core/all things windowsy
and the unnerfed simpleOpenNI.jar that's in the zip.
move about a bit and it should recognise you as a human?
then get another one... max 6?
I know it ain't got the 3D-ness going on, but framerate would have to come first.
more: http://youtu.be/QOwxYH8ZqBk
There are some "standard" ways people are using Kinect scans with existing free software,
and the puppetry @50sec in this OSCeleton demo is Animata, but why not InMoov?
BVH is a mocap file format with lots of free mocaps on the web http://www.brekel.com/kinect-3d-scanner/
So you did the top video?
Is it a Processing script? If so, which one, 3DUser.pde?
10 FPS :( better than 3 I guess
I was getting about that, and I color-coded the points for depth. I was using Java3D; the fact that you can pivot around an axis makes me believe Processing is doing the same.
My latency was pretty high too. What about this demo?
Thanks for the update Cup ;)
Arrgh, thought I replied yesterday... musta forgot to press send.
Just seen the vid... Nice!
The code is Processing... it was based on 3DUser.pde from version 1 of simpleOpenNI. When V2 came out there were no examples or docs of "the new way", so I hacked it. Got some bits right, other bits not so much. The mouse control of the 3D is from a BVH viewer I wrote years ago.
I think maybe you have misunderstood Processing: it's not an interpreter, it is pre-processed and compiled with a Java compiler, like Arduino code is pre-processed and compiled C for AVR. If it was used in MRL it would have to be as a source of functionality compiled into MRL, not a scripting language to control MRL functionality like Python is. I know I'm not understanding MRL well, but you get the drift.
I'm keen to pursue using Processing in MRL even if it never becomes part of MRL. I think I can create useful services (for me, anyway) using code I already have, so I only have to debug my Java/Python/MRL knowledge.
I have created a service using the templates, but from there it gets tricky figuring out who's talking what to whom, and when (to draw an analogy from the shoutbox). Is there some documentation I haven't found yet, or can you recommend some services to study to get to grips with how MRL's flow and communication work?
Documentation !
I wish I had better documentation.. but wishing doesn't seem to make it appear.
In the interim, the Clock service is a very basic service which might be helpful to understand the structure.
One thing to keep in mind is the statement "MRL IS NOT THE GUI AND THE GUI IS NOT MRL".
MRL is a framework service and the GUI is only one of many services. With that said, it becomes a little easier to understand by viewing it. The clock functionality is pretty straightforward - it does what you'd expect a clock to do. The Clock service knows nothing about the GUIService, but the GUIService has a tab component (ServiceGUI) which knows what a clock is. This keeps the gui and clock decoupled nicely... cuz sometimes "robots don't need no stinking gui".
Messaging: each service has an InBox and an OutBox, and a thread to go with each. The InBox processes inbound messages and the OutBox handles outbound ones. A subscription can be created on any public method in any service.
If you look in Clock you'll see this line:
invoke("pulse", new Date());
it invokes the pulse method with a Date as its input parameter.
Other services might want to subscribe to a clock pulse. You can subscribe with "subscribe" or "addListener":
myservice.subscribe("clock", "pulse", "pulseMe");
the above line subscribes to the "pulse" method of the Clock service named "clock" and routes it to its own "pulseMe" method - myservice will need a public pulseMe method which accepts the Date of the pulse.
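To make that routing concrete, here is a toy, self-contained model of the invoke/subscribe pattern in plain Java. This is NOT MRL's actual implementation (real MRL queues messages through the InBox/OutBox threads); every class and field name below is made up purely to illustrate how a publisher's method call gets routed to a subscriber's method:

```java
import java.lang.reflect.Method;
import java.util.*;

// Toy model of MRL-style invoke/subscribe routing (illustrative only).
class TinyService {
    final String name;
    // "publisherName.method" -> list of {subscriber, subscriberMethod}
    static final Map<String, List<Object[]>> routes = new HashMap<>();

    TinyService(String name) { this.name = name; }

    // route the publisher's outMethod to our own inMethod
    void subscribe(TinyService publisher, String outMethod, String inMethod) {
        routes.computeIfAbsent(publisher.name + "." + outMethod,
                k -> new ArrayList<>()).add(new Object[]{this, inMethod});
    }

    // call our own public method, then forward the parameter to subscribers
    void invoke(String method, Object param) {
        try {
            Method m = getClass().getMethod(method, param.getClass());
            m.invoke(this, param);
        } catch (Exception e) { throw new RuntimeException(e); }
        for (Object[] r : routes.getOrDefault(name + "." + method,
                Collections.emptyList())) {
            ((TinyService) r[0]).invoke((String) r[1], param);
        }
    }
}

class Clock extends TinyService {
    Clock(String name) { super(name); }
    public void pulse(Date d) { /* a real clock would publish periodically */ }
}

class MyService extends TinyService {
    Date lastPulse;
    MyService(String name) { super(name); }
    public void pulseMe(Date d) { lastPulse = d; } // receives the routed Date
}
```

With `myservice.subscribe(clock, "pulse", "pulseMe")` in place, calling `clock.invoke("pulse", new Date())` lands the Date in `myservice.lastPulse` - the same decoupling the Clock/GUIService description above relies on.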
Hiya
Yep, thanks, that's helped... am getting there (? somewhere) slowly... tooooo many things to do. I now have an InMoov head almost ready to connect, so priorities are heading back to software... although right now I need a power supply...
Been looking at http://cnmat.berkeley.edu/library/oscuino/max_msp_control and am interested in your thoughts on OSC in general, and OSC on Arduinos. There are examples for PWM and driving servos too. It can run over serial or UDP... the examples include a simple serial-to-UDP router in Processing. I wonder if an addressing scheme like /myRobotName/head/eye/left might work, maybe aliased to /myRobotName/Arduino2/1. Full-blown OSC does TCP, multicast, etc.
I know you are focused on other things, but at the least I reckon there's some gold in there for when you're looking at MRLComm.
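As a rough sketch of what that aliasing idea could look like (everything here is hypothetical - it is not real OSC library code, just the address-mapping concept from the post above):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: map friendly OSC-style addresses to board/pin targets,
// e.g. /myRobotName/head/eye/left -> /myRobotName/Arduino2/1.
// Not an OSC implementation - only the lookup and path-splitting idea.
class OscAliasTable {
    private final Map<String, String> aliases = new HashMap<>();

    // register a friendly address for a concrete target address
    void alias(String friendly, String target) {
        aliases.put(friendly, target);
    }

    // resolve a friendly address, falling back to the address itself
    String resolve(String address) {
        return aliases.getOrDefault(address, address);
    }

    // split an OSC-style address pattern into its path segments
    static String[] segments(String address) {
        return address.replaceFirst("^/", "").split("/");
    }
}
```

So `resolve("/myRobotName/head/eye/left")` would hand back the wire address for the second Arduino, pin 1, while unaliased addresses pass through untouched.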
Hi Cup,
Nice to hear from you.
Ya, OSC looks cool - I would be very interested in an OSC service.
Right now... just so you know... the WebGUI allows REST(ish) commands, much like OSC's protocol.
For example, /01.leftHand.thumb/moveTo/90 does what you would expect it to... if you start checking out the WebGUI this will give you some ideas: http://myrobotlab.org/content/webgui-updates
This GUI is two in one: it has a web interface, but you'll see the REST API too. The REST API link goes to a reflectively generated screen - pressing any of these buttons will give the URI which directly activates any and all MRL methods...
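A minimal sketch of how such a REST(ish) URI decomposes into service, method, and arguments (a hypothetical parser for illustration, not the WebGUI's actual code):

```java
import java.util.Arrays;

// Hypothetical parser for the REST(ish) form /service/method/arg1/arg2...
// e.g. /01.leftHand.thumb/moveTo/90 -> service "01.leftHand.thumb",
// method "moveTo", args ["90"]. Not the WebGUI's real implementation.
class RestishUri {
    final String service;
    final String method;
    final String[] args;

    RestishUri(String uri) {
        String[] parts = uri.replaceFirst("^/", "").split("/");
        if (parts.length < 2)
            throw new IllegalArgumentException("expected /service/method[/args...]");
        service = parts[0];   // named service instance
        method = parts[1];    // public method to invoke on it
        args = Arrays.copyOfRange(parts, 2, parts.length); // remaining path = arguments
    }
}
```

From there a dispatcher would look the service up by name and invoke the method reflectively, which is presumably why every public MRL method can be reached this way.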
The other WebGUI is a more customized and traditional one which uses websockets - https://www.youtube.com/watch?v=FCRgimpKM5c