NASA uses Oculus Rift and Kinect 2 to control robotic arm


Using joysticks to control robots is so unimaginative. NASA JPL (Jet Propulsion Laboratory) has a fantastic new idea that makes use of today's latest gaming accessories. JPL has figured out how to pair an Oculus Rift virtual reality dev kit headset with a Kinect 2 (the one for Xbox One) to construct "the most immersive interface" it's ever created. The first task on its to-do list: control a robotic arm with high-precision gestures.

As you can see in the video below, the robotic arm can mimic the exact movements of the operator's real arm, performing simple actions such as picking up blocks. Kinect 2's powerful body-tracking technology tracks the operator's hand in real time with low latency, while the Oculus Rift lets the user see what is happening in three-dimensional space.
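The article doesn't describe JPL's actual control pipeline, but the core idea of gesture control — turning a tracked hand position into joint commands — can be sketched with textbook inverse kinematics. The snippet below is a minimal, hypothetical illustration for a simplified planar two-link arm, not JPL's code: each frame, a hand position from a skeleton-tracking sensor would be fed to the solver, and the joints driven toward the resulting angles.

```python
import math

def two_link_ik(x, y, l1=0.3, l2=0.25):
    """Analytic inverse kinematics for a planar 2-link arm.

    Returns (shoulder, elbow) angles in radians that place the end
    effector at (x, y), or None if the point is out of reach.
    Link lengths l1, l2 are illustrative values in meters.
    """
    r2 = x * x + y * y
    # Law of cosines gives the elbow angle from the target distance.
    c = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c <= 1.0:
        return None  # target lies outside the arm's reachable annulus
    elbow = math.acos(c)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def forward(shoulder, elbow, l1=0.3, l2=0.25):
    """Forward kinematics, used here to check the IK solution."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y

# A tracked hand position (hypothetical sensor reading, in meters)
# becomes a pair of joint targets for the robot's controller.
hand = (0.35, 0.20)
angles = two_link_ik(*hand)
```

A real arm like Robonaut 2's has many more degrees of freedom and would use a full 3D solver plus orientation tracking, but the per-frame loop — read pose, solve, command joints — is the same shape.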

Speaking with Engadget, Alex Menzies, a NASA JPL Human Interfaces engineer, said:

We're able for the first time, with [a] consumer-grade sensor, [to] control the entire orientation rotation of a robotic limb. Plus we're able to really immerse someone in the environment so that it feels like an extension of your own body -- you're able to look at the scene from a human-like perspective with full stereo vision. All the visual input is properly mapped to where your limbs are in the real world. It feels very natural and immersive. I felt like you have a much better awareness of where objects are in the world.

With more research and time, NASA JPL hopes to eventually use the Oculus Rift and Kinect 2 combo to control, with absolute precision, the arms (and possibly legs) of Robonaut 2, a robot currently residing on the International Space Station.

OPSLabJPL (YouTube) and Engadget, via Geekosystem
