When the first boots of a human being land on Mars, imagine being able to RSVP and attend the event from your living room, feeling like you're standing right next to the astronauts. With versions of the augmented and virtual reality technologies currently under exploration at NASA, this is a very real possibility.

NASA’s International Space Apps Challenge launched yesterday. While teams all over the world gather in their hacker dens this weekend, using Leap Motion and the power of code to build open-source solutions to global conundrums, engineers at NASA’s Jet Propulsion Laboratory (JPL) are using our technology to test the boundaries of their ATHLETE rover, short for “All-Terrain Hex-Limbed Extra-Terrestrial Explorer.”

It’s a graceful, rugged, spindly-legged, all-terrain lunar test bed. It drives when it needs to drive. It walks when it needs to walk. It looks like a giant spider, or a hand, or (for all you Ghost in the Shell fans out there) a Tachikoma.


“If you’re trying to control a hand, or something that’s hand-like, then it’s natural to look at technologies that allow you to use your hand as the controller. That’s where we started our work with the Leap Motion,” says Jeff Norris, Mission Operations Innovation Lead at JPL. By mapping their hands onto a model of the robot, NASA engineers can control six degrees of freedom on the ATHLETE with a single gesture, covering the position, rotation, and articulation of its legs.
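Norris doesn't spell out how the hand-to-robot mapping is implemented, but the core idea of reading a single hand's position and orientation every frame and forwarding all six values as one command can be sketched with the Leap Motion SDK's Python bindings. This is a minimal sketch under those assumptions, not JPL's code; send_to_rover is a hypothetical stand-in for whatever command link the operators actually use.

```python
# A minimal sketch, not JPL's actual control code: poll the Leap Motion
# for one hand's pose and forward all six degrees of freedom at once.
import time

import Leap  # official Leap Motion Python bindings


def send_to_rover(x, y, z, pitch, yaw, roll):
    """Hypothetical stand-in for the rover's real command interface."""
    print("pose: pos=(%.1f, %.1f, %.1f) mm  pitch=%.2f yaw=%.2f roll=%.2f rad"
          % (x, y, z, pitch, yaw, roll))


def main():
    controller = Leap.Controller()
    while True:
        frame = controller.frame()        # most recent tracking frame
        if not frame.hands.is_empty:
            hand = frame.hands[0]
            pos = hand.palm_position      # millimeters, relative to the device
            # Three translational plus three rotational degrees of freedom,
            # all derived from a single hand pose.
            send_to_rover(pos.x, pos.y, pos.z,
                          hand.direction.pitch,
                          hand.direction.yaw,
                          hand.palm_normal.roll)
        time.sleep(0.02)                  # avoid spinning the CPU


if __name__ == "__main__":
    main()
```

Polling the controller in a loop keeps the example short; the SDK also offers an event-driven Leap.Listener interface that production code would more likely use.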

“The reason that 3D gesture control is a promising avenue for our work is that we’re all natural experts at this from birth. We’re trained to interact with things in a 3D environment,” Norris explains. “We’re not born knowing how to use a keyboard or a mouse. How can we build devices and interfaces that make it intuitive and natural for people: first to understand the state of a complicated robot, and then to control it?”