Hand tracking and virtual reality are both emerging technologies, and combining the two into a fluid and seamless experience can be a real challenge. This month, we’re exploring the bleeding edge of VR design with a closer look at our VR Best Practices Guidelines.

Locomotion is one of the greatest challenges in VR, and there are no truly seamless solutions beyond actually walking around in a Holodeck-style space. Generally, the best VR applications that use Leap Motion for navigation aren’t centered on users “walking” around in a non-physical way, but on transitioning between different states. With that in mind, here are 5 interesting experiments on moving around in VR:

1. Running on Rails

[Image: Aboard the Lookinglass]

The classic rail shooter mechanic has a lot of clear benefits for a hand tracking platform – and judging by Aboard the Lookinglass and Blue Estate, it can be very successful. Because locomotion is handled for them, users are free to play and explore in more subtle (or more spectacular) ways.

Of course, there are also drawbacks. Moving on rails doesn’t work for open-world exploration, and overusing it in VR can cause simulation sickness. It’s crucial to keep acceleration as short and infrequent as possible, and to make sure users can anticipate when movements will start and stop. (More on that in Part 2: Interaction Design of our VR Best Practices.)
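Below is a minimal sketch of this kind of eased rail movement between waypoints, assuming a simple Vec3 type and per-frame updates. The smoothstep easing confines acceleration to the ends of each segment, so starts and stops are brief and predictable; the class and constants are illustrative, not code from either game.

```typescript
// Hypothetical rail mover: eases between waypoints with smoothstep so that
// acceleration is brief and predictable. Vec3 and the waypoint list stand in
// for whatever math types and path data your engine provides.

type Vec3 = { x: number; y: number; z: number };

const lerp = (a: Vec3, b: Vec3, t: number): Vec3 => ({
  x: a.x + (b.x - a.x) * t,
  y: a.y + (b.y - a.y) * t,
  z: a.z + (b.z - a.z) * t,
});

// Smoothstep keeps acceleration confined to the ends of each segment,
// so each ride segment starts and stops gently instead of jerking.
const smoothstep = (t: number): number => t * t * (3 - 2 * t);

class RailMover {
  private segment = 0;
  private elapsed = 0;

  constructor(
    private waypoints: Vec3[],
    private secondsPerSegment: number,
  ) {}

  /** Advance the ride; call once per frame with the frame's delta time. */
  update(dt: number): Vec3 {
    if (this.segment >= this.waypoints.length - 1) {
      return this.waypoints[this.waypoints.length - 1]; // ride finished
    }
    this.elapsed += dt;
    const t = Math.min(this.elapsed / this.secondsPerSegment, 1);
    const pos = lerp(
      this.waypoints[this.segment],
      this.waypoints[this.segment + 1],
      smoothstep(t),
    );
    if (t >= 1) {
      this.segment += 1;
      this.elapsed = 0;
    }
    return pos;
  }
}
```

Pausing briefly at each waypoint, rather than chaining segments back to back, is one way to telegraph the next movement before it begins.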

2. Superman Flight

Two-handed flight has been a cultural touchstone since Superman jumped from comics and radio onto our television screens in a single bound. With this model, users can fly around by extending both hands with palms outward. This is a great way to give your users superpowers, but it can get tiring unless used in a short demo, or alongside other interactive elements.
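Here’s a rough sketch of how the two-handed trigger might be detected, assuming a hand-tracking API that reports palm positions and palm normals in head-relative space. The thresholds are placeholder values, not tuning from any shipping demo.

```typescript
// Minimal sketch of the two-handed "Superman" trigger. The Hand interface is
// a stand-in for whatever your hand-tracking API reports; the thresholds are
// illustrative, not tuned values from any shipping demo.

type Vec3 = { x: number; y: number; z: number };

interface Hand {
  palmPosition: Vec3; // meters, in head-relative space
  palmNormal: Vec3;   // unit vector pointing out of the palm
}

const dot = (a: Vec3, b: Vec3) => a.x * b.x + a.y * b.y + a.z * b.z;

const FORWARD: Vec3 = { x: 0, y: 0, z: -1 }; // assumed camera-forward axis
const PALM_ALIGNMENT = 0.7;                  // both palms roughly facing forward
const MIN_REACH = 0.35;                      // hands extended ~35 cm in front of the head

/** Returns a flight direction when both palms are extended forward, else null. */
function supermanFlight(hands: Hand[]): Vec3 | null {
  if (hands.length < 2) return null;
  const extended = hands.every(
    (h) =>
      dot(h.palmNormal, FORWARD) > PALM_ALIGNMENT &&
      -h.palmPosition.z > MIN_REACH,
  );
  if (!extended) return null;
  // Fly along the average palm normal so users can steer by tilting their hands.
  const avg: Vec3 = {
    x: (hands[0].palmNormal.x + hands[1].palmNormal.x) / 2,
    y: (hands[0].palmNormal.y + hands[1].palmNormal.y) / 2,
    z: (hands[0].palmNormal.z + hands[1].palmNormal.z) / 2,
  };
  const len = Math.hypot(avg.x, avg.y, avg.z) || 1;
  return { x: avg.x / len, y: avg.y / len, z: avg.z / len };
}
```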

[Image: Flight in VR Intro]

The flight stage in VR Intro is fairly short with no negative repercussions for navigating the wrong way, making users feel exhilarated and powerful.

[Image: Locomotion in Weightless]

Weightless is a slow meditative experience that includes other forms of interaction, like pressing buttons and sorting space debris.

Users also run the risk of extending their hands beyond optimal tracking range, or triggering false positive motions. Visual or auditory cues can help users keep their hands within tracking range.
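One possible shape for that feedback, assuming a roughly spherical interaction volume around the sensor: fade the hand model and fire a one-shot audio cue as the palm approaches the edge of the volume. The radius and warning margin below are illustrative.

```typescript
// Sketch of "you're leaving the tracking zone" feedback: fade the hand model
// and trigger an audio cue as the palm nears the edge of an assumed
// interaction volume. The ranges here are illustrative, not measured limits.

type Vec3 = { x: number; y: number; z: number };

const TRACKING_RADIUS = 0.6; // assumed comfortable range from the sensor, meters
const WARNING_MARGIN = 0.15; // start warning 15 cm before the edge

interface RangeFeedback {
  handOpacity: number;     // 1 = fully visible, fades as the hand drifts out
  playWarningCue: boolean; // true on the frame the warning zone is entered
}

let wasInWarningZone = false;

function trackingRangeFeedback(palmPosition: Vec3): RangeFeedback {
  const distance = Math.hypot(palmPosition.x, palmPosition.y, palmPosition.z);
  const overshoot = distance - (TRACKING_RADIUS - WARNING_MARGIN);
  const fade = Math.min(Math.max(overshoot / WARNING_MARGIN, 0), 1);
  const inWarningZone = fade > 0;
  const playWarningCue = inWarningZone && !wasInWarningZone;
  wasInWarningZone = inWarningZone;
  return { handOpacity: 1 - fade, playWarningCue };
}
```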

3. Teleportation

[Image: World of Comenius]

World of Comenius features glowing orbs that can be tapped to teleport from place to place. At any time, you can also return to the main menu to explore different scenes. This is a compelling exploration mechanic that works well for an educational demo.
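Here’s a sketch of how tap-to-teleport might be wired up, with a brief fade around the jump to soften the change of viewpoint. The Orb type, tap radius, and fade timing are assumptions for illustration, not details from World of Comenius.

```typescript
// Sketch of orb-based teleportation with a short fade around the jump.
// The Orb type and timing constants are assumptions, not details from
// World of Comenius.

type Vec3 = { x: number; y: number; z: number };

interface Orb {
  position: Vec3;    // where the orb floats in the scene
  destination: Vec3; // where the player ends up after tapping it
}

const TAP_RADIUS = 0.08;   // fingertip must come within 8 cm of the orb
const FADE_SECONDS = 0.25; // quick fade out/in around the jump

function findTappedOrb(fingertip: Vec3, orbs: Orb[]): Orb | null {
  return (
    orbs.find(
      (orb) =>
        Math.hypot(
          fingertip.x - orb.position.x,
          fingertip.y - orb.position.y,
          fingertip.z - orb.position.z,
        ) < TAP_RADIUS,
    ) ?? null
  );
}

async function teleportTo(
  orb: Orb,
  fadeScreen: (opacity: number, seconds: number) => Promise<void>,
  setPlayerPosition: (p: Vec3) => void,
): Promise<void> {
  await fadeScreen(1, FADE_SECONDS);  // fade to black
  setPlayerPosition(orb.destination); // move while the screen is dark
  await fadeScreen(0, FADE_SECONDS);  // fade back in
}
```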

4. Grabbing 3D Space

[Image: Grabbing 3D space in World of Comenius]

World of Comenius also includes a secondary navigation method – by grabbing empty space, you can pull yourself around to see different parts of what you’re building or exploring. When the grab is triggered, a 3D matrix appears to provide visual feedback, so you’re less likely to move around accidentally.

While this approach gives the user direct control over their movements, it should be used sparingly, as prolonged use can feel more like rock climbing than smooth navigation.
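One way to sketch this “pull yourself through space” behavior: anchor the hand’s world position when a grab begins, then offset the player rig each frame so the grabbed point stays under the hand, and drive the grid feedback from the same grab state. The grab-strength threshold here is an assumed value.

```typescript
// Sketch of grab-and-pull navigation: when a grab starts, pin the hand's
// world position; while the grab is held, move the player rig so the hand
// stays at that pinned point. Threshold and grid toggle are illustrative.

type Vec3 = { x: number; y: number; z: number };

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const add = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x + b.x, y: a.y + b.y, z: a.z + b.z });

const GRAB_THRESHOLD = 0.8; // 0..1 grab strength needed to start dragging

class SpaceGrabNavigator {
  private anchor: Vec3 | null = null;

  /** Call once per frame with the hand's current world position and grab strength. */
  update(rigPosition: Vec3, handWorldPos: Vec3, grabStrength: number): {
    rigPosition: Vec3;
    showGrid: boolean; // drive the 3D matrix feedback from the grab state
  } {
    if (grabStrength < GRAB_THRESHOLD) {
      this.anchor = null; // grab released: stop moving and hide the grid
      return { rigPosition, showGrid: false };
    }
    if (!this.anchor) {
      this.anchor = handWorldPos; // grab just started
      return { rigPosition, showGrid: true };
    }
    // Pull the rig so the grabbed point stays under the hand.
    const pull = sub(this.anchor, handWorldPos);
    return { rigPosition: add(rigPosition, pull), showGrid: true };
  }
}
```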

5. Rotating Around a Central Point

[Image: Rotation in Fragmental]

Fragmental, from the developer of the Hovercast VR menu system, lets you revolve around a central axis by moving your hand to the periphery and grabbing. Again, this gives users clear and unequivocal control over their movements. Because the camera orbits the Fragmental puzzle, keeps it in the center of the field of view, and never zooms in or out unexpectedly, users don’t careen out of control.

This is a great mechanic for pivoting around a puzzle, data visualization, or other central point of focus, but wouldn’t work in an open-world setting. (Note that Fragmental is also available for desktop.)
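A simplified sketch of grab-to-orbit around a fixed pivot appears below: horizontal hand motion while grabbing becomes yaw around the focus point, and neither elevation nor zoom ever changes, so the subject stays centered. The sensitivity constant is an assumption, not Fragmental’s actual tuning.

```typescript
// Sketch of grab-to-orbit: hand motion while grabbing rotates the camera
// around a fixed pivot about the vertical axis. Elevation and distance to
// the pivot never change. The sensitivity value is illustrative.

type Vec3 = { x: number; y: number; z: number };

const ORBIT_SENSITIVITY = 2.5; // radians of orbit per meter of hand travel

/** Rotate `point` around `pivot` by `angle` radians about the vertical axis. */
function orbitAroundY(point: Vec3, pivot: Vec3, angle: number): Vec3 {
  const dx = point.x - pivot.x;
  const dz = point.z - pivot.z;
  const cos = Math.cos(angle);
  const sin = Math.sin(angle);
  return {
    x: pivot.x + dx * cos - dz * sin,
    y: point.y, // elevation never changes, so the focus stays level
    z: pivot.z + dx * sin + dz * cos,
  };
}

class OrbitNavigator {
  private lastHandX: number | null = null;

  constructor(private pivot: Vec3) {}

  /** Call per frame with the hand's x position in tracking space (meters). */
  update(cameraPos: Vec3, handX: number | null, grabbing: boolean): Vec3 {
    if (!grabbing || handX === null) {
      this.lastHandX = null;
      return cameraPos;
    }
    if (this.lastHandX === null) {
      this.lastHandX = handX;
      return cameraPos;
    }
    const angle = (handX - this.lastHandX) * ORBIT_SENSITIVITY;
    this.lastHandX = handX;
    return orbitAroundY(cameraPos, this.pivot, angle);
  }
}
```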

[Image: Joyball navigation in Planetarium]

Similarly, Planetarium’s Joyball widget makes it easy to move along the surface of a globe. Since changes of elevation are unnecessary, and there are no obstacles or boundaries, users are free to move in any direction along the surface of the sphere. The interaction works by placing your hand in front of the globe (from your perspective) and grabbing, which activates the Joyball.

This approach provides the user with direct control at all times, navigating a relatively small territory with frequent starts and stops in movement. However, this interaction model and GUI doesn’t necessarily translate to other applications.
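Here’s a rough sketch of a Joyball-style control: grabbing anchors a virtual joystick where the hand closed, and the hand’s offset from that anchor sets heading and speed along the globe’s surface, with a dead zone so the view can hold still. The dead zone, maximum offset, and speed values are placeholders, not Planetarium’s actual parameters.

```typescript
// Sketch of a Joyball-style surface navigator: the grab point becomes a
// virtual joystick center, and hand offset from it drives velocity across
// the globe. All constants are placeholder values.

type Vec2 = { x: number; y: number };

const DEAD_ZONE = 0.02;  // ignore offsets under 2 cm so the view can hold still
const MAX_OFFSET = 0.12; // offsets beyond 12 cm are treated as full speed
const MAX_SPEED = 30;    // degrees of latitude/longitude per second at full tilt

let anchor: Vec2 | null = null;

/** Returns surface velocity (degrees/second) from the hand's x/y position in meters. */
function joyballVelocity(hand: Vec2 | null, grabbing: boolean): Vec2 {
  if (!grabbing || !hand) {
    anchor = null;            // releasing the grab stops movement
    return { x: 0, y: 0 };
  }
  if (!anchor) anchor = hand; // the joyball appears where the grab began
  const dx = hand.x - anchor.x;
  const dy = hand.y - anchor.y;
  const dist = Math.hypot(dx, dy);
  if (dist < DEAD_ZONE) return { x: 0, y: 0 };
  const speed = Math.min(dist / MAX_OFFSET, 1) * MAX_SPEED;
  return { x: (dx / dist) * speed, y: (dy / dist) * speed };
}
```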

Bonus #1: Camera Control Experiments

Rapid prototyping and experimentation are essential to building compelling experiences on the bleeding edge of VR design. To try your hand at a few different ways that users can navigate 3D space, check out Isaac Cohen’s Three.js camera controls for desktop browsers. They’re a lot of fun, and they provide a rough-and-ready way to start thinking about building your own navigation scheme.

Bonus #2: Multi-Modal Input

You can also experiment with combining hand tracking with other forms of input. The maker community has long been mashing up the Leap Motion Controller with just about every other technology you can imagine (see Hackster and ChallengePost for a galaxy of examples). There are some really exciting opportunities for combining classic and unconventional controls in new ways.

[Image: Locomotion in Autonomous]

For example, take Double Fine’s construction-action game Autonomous. With your right hand, you can aim, throw robot parts, and attack, while your left hand uses keyboard controls. Even though your left hand isn’t being tracked, Autonomous anticipates what it might look like, so it’s also represented in front of you. It’s not hard to imagine using a one-handed joystick with the same kind of clear visual feedback for locomotion in VR.
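As an illustration of how the duties might be split, the sketch below pairs a conventional movement axis (keyboard or a one-handed joystick) with a tracked hand for aiming and actions. The input structure and walking speed are assumptions, not Autonomous’s implementation.

```typescript
// Illustrative multi-modal split: a conventional axis drives locomotion
// while the tracked hand handles aiming and gestures. The types and the
// speed constant are placeholders, not code from any shipping game.

type Vec3 = { x: number; y: number; z: number };

interface MultiModalInput {
  moveAxis: { x: number; y: number }; // -1..1 from WASD or a joystick
  aimDirection: Vec3;                 // from the tracked hand, e.g. the palm normal
  triggerGesture: boolean;            // e.g. a pinch or grab to throw or attack
}

const MOVE_SPEED = 2.0; // meters per second, an assumed walking pace

function applyInput(
  position: Vec3,
  input: MultiModalInput,
  dt: number,
): { position: Vec3; firing: boolean; aim: Vec3 } {
  // Locomotion comes from the untracked hand's controller...
  const next: Vec3 = {
    x: position.x + input.moveAxis.x * MOVE_SPEED * dt,
    y: position.y,
    z: position.z + input.moveAxis.y * MOVE_SPEED * dt,
  };
  // ...while aiming and actions come from the tracked hand.
  return { position: next, firing: input.triggerGesture, aim: input.aimDirection };
}
```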

What’s your favorite way to navigate in VR? Let us know in the comments!

Alex is the head writer and blog editor at Leap Motion, where he stands as the final bulwark against bad grammar. Want to share your Leap Motion project? Email acolgan@leapmotion.com or PM leapmotion_alex on Reddit.
