NASA wants to change how you think about exploring the universe, and it begins with natural user interfaces. In a surprise demonstration at the Game Developers Conference in San Francisco on Wednesday, NASA scientists Jeff Norris and Victor Luo used the Leap Motion Controller to remotely control a one-ton, six-legged ATHLETE rover located at the Jet Propulsion Laboratory (JPL) in Pasadena.

The ATHLETE (All-Terrain Hex-Limbed Extra-Terrestrial Explorer) is a heavy-lift utility vehicle prototype. Developed at JPL, it was designed to support future human space exploration – on the moon, Mars, and even asteroids. Towering 15 feet tall, its legs double as tool-using arms, and it can carry an entire habitat on its back. More than anything, it resembles a prototypical Tachikoma.

“How are you going to control something that is so decidedly not human in a natural way?” Norris asked the crowd. By using the latest natural user interfaces to translate human actions into robotic movements.

“ATHLETE has six limbs, and each limb has six degrees of freedom – six joints. So you have a lot of degrees of freedom. What part of our body has that much manipulation power? Well, it turns out our hands have similar dexterity. Each hand has five fingers, and each finger has a series of joints.

“So we mapped our hands to the robot.”

The crowd watched as Leap Motion-controlled software appeared on the screen, Norris’ fingertips swirling in 3D space. Using the Leap Motion Controller’s hand-tracking data, JPL engineers built a Unity 3D simulation of the hangar housing the ATHLETE, complete with a virtual model of the rover that responded to hand motions.
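The mapping Norris describes – each finger driving one multi-jointed leg – can be sketched in a few lines. This is purely illustrative: the function name, the five-finger-to-five-leg assignment, the angle range, and the zero-padding of unused joints are all assumptions, not JPL's actual control code.

```python
# Hypothetical sketch of a hand-to-rover joint mapping (not JPL's code).
# Each tracked finger drives one leg; a finger has fewer joints than a
# leg's six degrees of freedom, so unused joints are held at neutral.

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

def map_hand_to_legs(finger_joint_angles):
    """Map per-finger joint angles (degrees) to six-joint leg commands.

    finger_joint_angles: dict of finger name -> list of joint angles
    reported by the hand tracker. Returns a dict of leg name -> list of
    six joint angles, clamped to an assumed safe range of +/-90 degrees.
    """
    commands = {}
    for leg_index, finger in enumerate(FINGERS):
        angles = finger_joint_angles.get(finger, [])
        # Clamp each tracked angle, then pad to the leg's six joints.
        clamped = [max(-90.0, min(90.0, a)) for a in angles]
        padded = (clamped + [0.0] * 6)[:6]
        commands[f"leg_{leg_index}"] = padded
    return commands

# One illustrative tracker frame (angles in degrees):
frame = {"thumb": [10, 20], "index": [5, 15, 30], "middle": [0, 0, 0]}
print(map_hand_to_legs(frame))
```

In the real system the output would feed the Unity simulation first, letting an operator preview a motion on the virtual rover before committing it to hardware.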

But they didn’t stop there. With live video of the JPL hangar housing the ATHLETE, Luo took control of the rover using the Leap Motion Controller – raising one of the legs of a machine nearly 400 miles away.

But Wednesday’s demonstration was about more than showing off the possibilities of human-robot interaction with the Leap Motion Controller. It was part of a larger conversation, starting with NASA’s recent efforts to reach new audiences with gaming, and ending with some fascinating insights into the future of human space exploration.

As Norris told an enraptured audience: “In the 1960s, the landing of Apollo 11 was the most-watched television broadcast in history at the time. It happened here. It happened in this hallowed ground of the living room, a place we’d like to be again.”

Norris envisions a future in which human beings explore the universe through remotely controlled robots, crossing the cosmos alongside astronauts. “I want us to build a future of shared immersive tele-exploration – everyone exploring the universe through robotic avatars, not just peering at numbers or pictures on a screen, but stepping inside a holodeck and standing on those distant worlds.”