Is 3D vision possible with StereoPi from https://www.buyapi.ca/?

Has anyone thought of installing 3D vision on InMoov?

 
buyapi.ca has developed a 3D camera called StereoPi. Could it be possible to replace the Kinect and the PIR sensor with 3D eyes?
 

StereoPi Starter Kit - This kit has everything you need to get started right away. 

INCLUDES

  • one StereoPi Standard Edition board
  • two V1 cameras (w/ ~20 cm ribbon cables)
  • one Raspberry Pi Compute Module 3+ Lite
  • two short ribbon cables
  • one USB power cable
  • two power cables
  • one V1/V2 dual-camera mounting plate
  • one wide-angle dual-camera mounting plate
  • a microSD card pre-imaged with Raspbian and all the stereoscopic video and image demos

FROM SPATIALLY AWARE ROBOTS TO DRONE LIVESTREAMING

StereoPi is an open source stereoscopic camera based on Raspberry Pi. It can capture, save, livestream, and process real-time stereoscopic video and images. StereoPi opens up countless possibilities in robotics, AR/VR, computer vision, drone instrumentation, panoramic video, and more.

STEREOPI IS EASY TO USE WITH OTHER FAMILIAR TOOLS.

We’ve already made many demos of StereoPi integrating with other tools, including:

 

kwatters

5 years ago

So, currently, MyRobotLab supports OpenCV to handle cameras. There is also an Oculus Rift service that uses two cameras to display the images in the Rift. The OpenCV service has an MJPEG streamer in it, so you can stream the video data over the Internet.

I think the really neat thing on that list is building a real-time depth map. We have OpenCV, so it's just a matter of integrating those algorithms into MyRobotLab.
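
For illustration, here is a minimal sketch of a real-time depth (disparity) map with OpenCV's StereoSGBM matcher, roughly the kind of algorithm mentioned above. The camera indices (0 and 1) and the matcher parameters are assumptions, and the images would need to be calibrated and rectified before the disparity corresponds to real depth.

```python
# Minimal sketch: live disparity map from two cameras with OpenCV.
# Camera indices and SGBM parameters are assumptions for illustration.
import cv2

left_cam = cv2.VideoCapture(0)   # assumed index of the left camera
right_cam = cv2.VideoCapture(1)  # assumed index of the right camera

stereo = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=64,   # must be divisible by 16
    blockSize=9,
)

while True:
    ok_l, left = left_cam.read()
    ok_r, right = right_cam.read()
    if not (ok_l and ok_r):
        break
    gray_l = cv2.cvtColor(left, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(right, cv2.COLOR_BGR2GRAY)
    # compute() returns disparity scaled by 16 as 16-bit integers
    disparity = stereo.compute(gray_l, gray_r).astype('float32') / 16.0
    # normalize for display only
    disp_vis = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX).astype('uint8')
    cv2.imshow('disparity', disp_vis)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

left_cam.release()
right_cam.release()
cv2.destroyAllWindows()
```

As noted below, nothing in this ties it to the Raspberry Pi; any pair of cameras OpenCV can open would work.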

From my perspective, all the building blocks are there to support this already, and it doesn't need to be tied to the Raspberry Pi. It can use any set of cameras.

 

Thanks for the answer. I am very enthusiastic that in the near future InMoov will see in 3D, so he will be able to grab objects on his own, and when he walks he will be able to detect his environment and distances. Unfortunately I am no developer, so I will have to wait until it is incorporated in MRL.

 

Mats

5 years ago

In reply to by martoys1

I did some experiments about a year ago using BoofCV to create a 3D point cloud and applying the colour to each point. The result was that I could move around my living room and see it displayed in 3D in real time. The idea was to make a SLAM map based on that, but for several reasons that never happened. My math skills limited what I could do.
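
Mats' experiment used BoofCV; as a rough OpenCV equivalent of the same idea, a disparity map can be reprojected into a coloured point cloud with cv2.reprojectImageTo3D. The Q matrix is assumed to come from stereo calibration (cv2.stereoRectify), and the disparity from a matcher like the one sketched above.

```python
# Hedged sketch: colour each 3D point with the pixel from the left image.
# Assumes a calibrated Q matrix and a float disparity map are available.
import cv2

def colored_point_cloud(disparity, left_bgr, Q):
    """Return (N, 3) 3D points and matching (N, 3) BGR colours."""
    points = cv2.reprojectImageTo3D(disparity, Q)  # H x W x 3 array of (X, Y, Z)
    mask = disparity > disparity.min()             # drop invalid/unmatched pixels
    return points[mask], left_bgr[mask]

# usage (assumed inputs from the earlier sketch):
#   disparity = stereo.compute(gray_l, gray_r).astype('float32') / 16.0
#   pts, colors = colored_point_cloud(disparity, left, Q)
```

Feeding such a cloud into a SLAM pipeline is the harder part, which matches Mats' experience.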