Explorations in VR Design is a journey through the bleeding edge of VR design – from architecting a space, to designing groundbreaking interactions, to making users feel powerful.

Last week, we saw how interactive design centers on human expectations. Of course, it also begins with the hardware and software that drives those interactions. The Leap Motion Orion software opens up two fundamental interactions – pinch and grab. Using our Unity Core Assets detector scripts, it’s also possible to track certain hand poses, such as thumbs-up.
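
For instance, a detector configured for a thumbs-up pose can be wired into your own scripts through its activation events. Here’s a minimal sketch, assuming an ExtendedFingerDetector (or any Core Assets Detector) has been assigned in the Inspector – the handler names are our own, not part of the API:

```csharp
using UnityEngine;
using Leap.Unity;

// Minimal sketch: reacting to a Core Assets detector (e.g. an
// ExtendedFingerDetector configured for a thumbs-up pose).
// OnThumbsUp/OnThumbsUpEnd are our own handler names; the
// OnActivate/OnDeactivate events come from the Detector base class.
public class ThumbsUpListener : MonoBehaviour
{
    // Assign a detector (e.g. ExtendedFingerDetector) in the Inspector.
    public Detector thumbsUpDetector;

    void OnEnable()
    {
        thumbsUpDetector.OnActivate.AddListener(OnThumbsUp);
        thumbsUpDetector.OnDeactivate.AddListener(OnThumbsUpEnd);
    }

    void OnDisable()
    {
        thumbsUpDetector.OnActivate.RemoveListener(OnThumbsUp);
        thumbsUpDetector.OnDeactivate.RemoveListener(OnThumbsUpEnd);
    }

    void OnThumbsUp()    { Debug.Log("Thumbs-up started"); }
    void OnThumbsUpEnd() { Debug.Log("Thumbs-up ended"); }
}
```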

In this exploration, we’ll cover some quick tips on building for the strengths of Leap Motion technology, while avoiding common pitfalls. For a more in-depth look at critically evaluating your project’s interaction design, see our post 6 Principles of Leap Motion Interaction Design.

The Sensor is Always On

As an optical tracking platform, Leap Motion technology exhibits the “live-mic” or “Midas touch” problem. Unlike a touchscreen or game controller, there is no tactile barrier that separates interaction from non-interaction.

This means that your project must include neutral zones and poses, so that users can play and explore without accidentally triggering something. This is fairly easy for physical interactions like pinch and grab. More abstract interactions, such as the thumbs-up and gravity gestures used in Blocks, should have a strictly limited impact and be unlikely to occur as part of casual movement.

At the same time, safety should never come at the expense of speed. Except for drastic changes like locomotion, don’t require a pause to begin an interaction, or your users will get frustrated.
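
One way to keep an abstract gesture from firing accidentally – without adding a pause – is to require two independent detectors to agree before triggering it. Here’s a rough sketch; the detector pairing and the callback are illustrative, and the Core Assets also include a DetectorLogicGate component that can do this wiring in the Inspector:

```csharp
using UnityEngine;
using Leap.Unity;

// Minimal sketch: only fire an abstract gesture (e.g. thumbs-up) when a
// second detector agrees, so casual movement rarely triggers it.
// The detector references and the logged action are assumptions
// for illustration.
public class GatedGestureTrigger : MonoBehaviour
{
    public Detector posture;        // e.g. ExtendedFingerDetector (thumb extended)
    public Detector palmDirection;  // e.g. PalmDirectionDetector (thumb pointing up)

    bool fired;

    void Update()
    {
        bool active = posture.IsActive && palmDirection.IsActive;
        if (active && !fired)
        {
            fired = true;
            Debug.Log("Gesture accepted");  // e.g. toggle gravity, as in Blocks
        }
        else if (!active)
        {
            fired = false;
        }
    }
}
```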

Dynamic Feedback

The absence of binary tactile feedback also means that your experience should eliminate ambiguity wherever possible. All interactions should have a distinct initiation and completion state, reflected through dynamic feedback that responds to the user’s motions. The more ambiguous the start and stop, the more likely users are to perform the interaction incorrectly.
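
A simple way to give an interaction a distinct initiation and completion state is hysteresis: use a higher threshold to begin the interaction than to end it, so it can’t flicker around a single cut-off. Here’s a minimal sketch using pinch strength – the threshold values are illustrative, not recommended constants:

```csharp
using Leap;
using Leap.Unity;
using UnityEngine;

// Minimal sketch: separate initiation and completion thresholds
// (hysteresis) give a pinch a distinct start and end instead of
// flickering around a single cut-off. The threshold values are
// assumptions for illustration.
public class PinchHysteresis : MonoBehaviour
{
    const float PinchBegin = 0.8f;  // must squeeze this hard to start
    const float PinchEnd   = 0.5f;  // must relax below this to stop

    public bool IsPinching { get; private set; }

    void Update()
    {
        Hand hand = Hands.Right;
        if (hand == null) { IsPinching = false; return; }

        if (!IsPinching && hand.PinchStrength > PinchBegin)
            IsPinching = true;
        else if (IsPinching && hand.PinchStrength < PinchEnd)
            IsPinching = false;
    }
}
```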

Our earlier guide to the interaction design in Blocks provides some insights on building interactions that provide continuous dynamic feedback. These principles have also been baked into the UI Input Module, which features a circular cursor that changes color as the user’s finger approaches the interface.
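
The core of that cursor behavior can be sketched in a few lines: tint the cursor from an idle to an active color as the fingertip approaches the interface. The fingertip reference and distance range below are illustrative assumptions, not the module’s actual implementation:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Minimal sketch: a circular cursor tints from an idle to an active
// color as the fingertip nears the interface. The fingertip transform
// and distance range are assumptions; the shipped UI Input Module
// implements its own version of this behavior.
public class ApproachCursor : MonoBehaviour
{
    public Transform fingertip;       // e.g. driven by the tracked index tip
    public Image cursorImage;         // the circular cursor graphic
    public float maxDistance = 0.15f; // metres at which tinting begins

    public Color idleColor   = Color.white;
    public Color activeColor = Color.cyan;

    void Update()
    {
        // Distance from the fingertip to the cursor on the UI plane.
        float d = Vector3.Distance(fingertip.position, transform.position);
        float t = 1f - Mathf.Clamp01(d / maxDistance);
        cursorImage.color = Color.Lerp(idleColor, activeColor, t);
    }
}
```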

[Image: UI Input Module widgets]

In general, clearly describe each intended pose and where the user should hold their hand to perform it. If the intended interaction is a motion, provide a clear indicator of where the user can start and stop that motion. If the interaction is in response to an object, make the object’s size and shape communicate how to start and stop the interaction.

Keeping Hands in Sight

If the user can’t see their hand, they can’t use it. While this might seem obvious to developers, it isn’t always obvious to users – especially when they’re focused on the object they’re trying to manipulate, rather than looking at their hand.

One way to approach this is to use visual and audio cues to create a clear safety zone, indicating where the hands should be placed. You can notify the user when their hands enter (or exit) the zone with a simple change in color or opacity. Another approach is to develop user interfaces that are locked to the user’s hand, wrist, or arm, as these draw user gaze more reliably than interfaces fixed in the world.
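
Here’s a rough sketch of the first approach: fade in a hint whenever the palm leaves a comfortable volume in front of the headset. The volume bounds and fade behavior are illustrative assumptions:

```csharp
using Leap;
using Leap.Unity;
using UnityEngine;

// Minimal sketch: fade in a hint when the palm leaves a comfortable
// tracking volume in front of the headset. The volume bounds and the
// hint renderer are assumptions for illustration.
public class HandSafetyZone : MonoBehaviour
{
    public Transform headset;          // main camera transform
    public CanvasGroup outOfZoneHint;  // "bring your hand back" hint
    public float maxDistance = 0.6f;   // metres from the headset
    public float fadeSpeed = 4f;

    void Update()
    {
        Hand hand = Hands.Right;
        bool inZone = false;

        if (hand != null)
        {
            Vector3 palm = hand.PalmPosition.ToVector3();
            Vector3 local = headset.InverseTransformPoint(palm);
            // In front of the face and within the sensor's comfortable range.
            inZone = local.z > 0.1f && local.magnitude < maxDistance;
        }

        float target = inZone ? 0f : 1f;
        outOfZoneHint.alpha = Mathf.MoveTowards(
            outOfZoneHint.alpha, target, fadeSpeed * Time.deltaTime);
    }
}
```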

Finger Occlusion

As with any optical tracking platform, it’s important to avoid the known unknowns. Before Orion, we recommended encouraging users to keep their fingers splayed and hands perpendicular to the field of view. While this is still one of the most reliable tracking poses, the new pinch/grab interactions that we’ve built with Orion revolve around a different set of standard hand poses – ones that both feel natural and can be reliably tracked.

[Image: Weightless]

Nonetheless, it’s still important to encourage users to keep their hands in view, and to guide them through interactions. Be sure to avoid interactions that depend on the position of fingers when they are out of the device’s line of sight, and reward correct behaviors. This can be achieved through a range of instructions and cues – from sound and visual effects to interactive and object design.
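
One illustrative heuristic for spotting occlusion-prone poses: when the palm faces away from the head-mounted sensor, the fingers tend to curl behind the hand and out of its line of sight. The dot-product threshold below is an assumption for illustration, not a tracking guarantee:

```csharp
using Leap;
using Leap.Unity;
using UnityEngine;

// Minimal sketch: flag poses likely to self-occlude, e.g. when the palm
// faces away from the head-mounted sensor so the fingers are hidden
// behind the hand. The -0.5f threshold is an assumption.
public class OcclusionGuard : MonoBehaviour
{
    public Transform headset;  // main camera / sensor transform

    public bool FingersLikelyOccluded(Hand hand)
    {
        Vector3 palmNormal = hand.PalmNormal.ToVector3();
        Vector3 toSensor = (headset.position -
                            hand.PalmPosition.ToVector3()).normalized;
        // Palm normal pointing away from the sensor: fingers curl behind it.
        return Vector3.Dot(palmNormal, toSensor) < -0.5f;
    }

    void Update()
    {
        Hand hand = Hands.Right;
        if (hand != null && FingersLikelyOccluded(hand))
            Debug.Log("Pose may be unreliable: fingers hidden from the sensor");
    }
}
```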

Now that we’ve looked at optimizing for Orion, what about the human at the center of the experience? Next up, a look at user safety and comfort.