Explorations in VR Design is a journey through the bleeding edge of VR design – from architecting a space, to designing groundbreaking interactions, to making users feel powerful.

In the novel Flatland, a two-dimensional shape’s entire life is disrupted when he encounters a creature from another dimension – a Sphere. The strange newcomer can drop in and out of reality at will, and sees Flatland from an unprecedented vantage point. Adding a new dimension changes everything.

In much the same way, VR completely undermines the digital design philosophies that have been relentlessly flattened out over the past few decades. Early GUIs often relied heavily on skeuomorphic 3D elements, like buttons that appeared to compress when clicked. These faded away in favor of color state changes, reflecting a flat design aesthetic.

Many of those old skeuomorphs meant to represent three-dimensionality – the stark shadows, the compressible behaviors – are gaining new life in this new medium. For developers and designers just breaking into VR, the journey out of flatland will be disorienting but exciting.

Windows users in 1992 needed 3D effects on buttons to understand that they were meant to be pressed, just like the physical buttons on radios, televisions, and VCRs. In 2016, active and passive states in the OS are communicated entirely through color changes – no more drop shadows. All major operating systems and the modern web are now built with a flat, minimalist design language.


But this doesn’t mean that skeuomorphism is the answer – because the flat-skeuomorphic spectrum is just another form of flat thinking. Instead, VR design will converge on essential cues that communicate structure and relationships between different UI elements. “A minimal design in VR will be different from a minimal web or industrial design. It will incorporate the minimum set of cues that fully communicates the key aspects of the environment.”

A common visual language will emerge, much as it did in the early days of the web, and ultimately fade into the background. We won’t even have to think about it.

The interface in Quill by Oculus builds on the physical skeuomorphs of traditional PC design to create a familiar set of cues. As Road to VR’s Ben Lang writes, “the interface looks charmingly like something out of the early days of the first GUI operating systems, but what’s important is the fact that the interface takes known PC affordances and applies them easily and effectively in VR.”

UI Input Module

The design process behind the UI Input Module was driven by many of these insights. In turn, they continue to inform our other bleeding-edge internal projects. The UI Input Module provides a simplified interface for physically interacting with World Space Canvases in Unity’s UI System. This makes it possible for users to reach out and “touch” UI elements to interact with them.

Below is a quick analysis of each UI component included in the UI Input Module. In each case, sound plays a crucial role in the “feel” of the interface.

Button

Each button is easily distinguished as interactive by 3D effects such as drop shadows. The size and spacing of the buttons make triggering them easy. When your hand comes close to the interface, a circle appears that changes color as you approach. When you press the button, it compresses and bounces back, with a color state change suggesting that it’s now active. At the same time, a satisfying “click” sound signals that the interaction was a success.
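
To make the press logic concrete, here’s a minimal Python sketch of that behavior. The thresholds and names are illustrative assumptions, not the module’s actual implementation; the key idea is the hysteresis between the press and release depths, which keeps tracking jitter from firing multiple clicks:

```python
# Illustrative press logic for a touchable VR button (not the UI Input
# Module's actual code). The fingertip must sink past PRESS_DEPTH to
# trigger the button, then retract past the shallower RELEASE_DEPTH
# before it can trigger again.

PRESS_DEPTH = 0.004    # metres past the button face to register a press (assumed)
RELEASE_DEPTH = 0.001  # metres; retract above this to re-arm (assumed)

class Button:
    def __init__(self, on_press, on_release):
        self.on_press = on_press      # e.g. compress visual, change color, play click
        self.on_release = on_release  # e.g. bounce back to resting state
        self.pressed = False

    def update(self, fingertip_depth):
        """fingertip_depth: how far the fingertip has sunk past the button
        face this frame, in metres (negative while hovering above it)."""
        if not self.pressed and fingertip_depth > PRESS_DEPTH:
            self.pressed = True
            self.on_press()
        elif self.pressed and fingertip_depth < RELEASE_DEPTH:
            self.pressed = False
            self.on_release()

button = Button(lambda: print("click"), lambda: print("release"))
for depth in [-0.01, 0.002, 0.005, 0.003, 0.0005]:  # one press/release cycle
    button.update(depth)
```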

Slider

Much like the button, the slider features a large, approachable design. Changing colors, shadows, sound effects, and a subtle cursor all continuously provide feedback on what the user is doing.
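
Under the hood, all of that continuous feedback can hang off a single normalized value. A minimal sketch, assuming a tracked fingertip position along the slider track (the names and units are ours, not the module’s API):

```python
def slider_value(finger_x, track_min_x, track_max_x):
    """Map the fingertip's position along the track to a value in [0, 1],
    clamped so the handle never escapes the track."""
    t = (finger_x - track_min_x) / (track_max_x - track_min_x)
    return max(0.0, min(1.0, t))

# The handle position, cursor, colors, and sounds all follow this one value.
print(slider_value(0.12, 0.0, 0.3))  # fingertip 12 cm along a 30 cm track -> 0.4
```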

Scroll

With the scroller, users can move the content directly instead of targeting a small, mouse-style scrollbar (though they can if they want to). Naturally, the scrollbar within the widget indicates their position within the accessible content. Sound plays a role here as well.
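
Here’s a rough sketch of that grab-to-scroll behavior – the names, units, and clamping policy are our assumptions, not the module’s implementation. The content tracks the finger directly, and the scrollbar indicator is derived from the same clamped offset:

```python
def scroll(offset, finger_delta_y, content_height, viewport_height):
    """Drag the content directly with the finger. `offset` is how far the
    content has been pushed up past the top of the viewport, in metres;
    a positive finger_delta_y means the finger (and content) moved up."""
    max_offset = max(0.0, content_height - viewport_height)
    new_offset = min(max(offset + finger_delta_y, 0.0), max_offset)
    indicator = new_offset / max_offset if max_offset else 0.0  # 0 = top, 1 = bottom
    return new_offset, indicator

# A 1 m column of content in a 0.4 m viewport; the finger drags up 5 cm:
print(scroll(0.0, 0.05, 1.0, 0.4))  # -> (0.05, 0.0833...)
```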

Interactive Element Targeting

Appropriate scaling. Interactive elements should be scaled appropriately for the expected interaction (e.g. full hand or single finger). A single-finger target should be no smaller than 20 mm in real-world size, and preferably larger. This ensures the user can accurately hit the target without accidentally triggering its neighbors.

Limit unintended interactions. Depending on the nature of the interface, the first object of a group to be touched can momentarily lock out all others. Be sure to space out UI elements so that users don’t accidentally trigger multiple elements.

Limit hand interactivity. Make only a single element of the hand – typically the tip of the index finger – able to interact with buttons and other UI elements. Likewise, other nearby objects within the scene should not be interactive.
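
These guidelines can be roughly encoded in code. Below is a hedged sketch (the structure, names, and timings are ours, not the UI Input Module’s): a 20 mm minimum check for single-finger targets, plus a touch group in which the first element touched briefly locks out its siblings – driven, per the last guideline, only by the index fingertip:

```python
import time

MIN_TARGET_M = 0.020   # single-finger targets: no smaller than 20 mm
LOCKOUT_S = 0.2        # how long the first-touched element excludes siblings (assumed)

def valid_finger_target(width_m, height_m):
    """Reject targets smaller than a fingertip can reliably hit."""
    return min(width_m, height_m) >= MIN_TARGET_M

class TouchGroup:
    """The first element touched momentarily locks out all others, so a
    sweeping hand can't trigger several neighboring elements at once."""
    def __init__(self):
        self.active = None
        self.since = 0.0

    def try_touch(self, element, now=None):
        """Call with the element the index fingertip is touching this frame.
        Returns True if that element should receive the touch."""
        now = time.monotonic() if now is None else now
        if self.active is None or now - self.since > LOCKOUT_S:
            self.active, self.since = element, now
            return True
        return element == self.active   # siblings are ignored meanwhile

print(valid_finger_target(0.015, 0.03))  # False – 15 mm is too small
group = TouchGroup()
print(group.try_touch("play", now=0.00))   # True  – first touch wins
print(group.try_touch("stop", now=0.05))   # False – sibling locked out
print(group.try_touch("stop", now=0.40))   # True  – lockout expired
```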

Wearable Interfaces

Fixing the user interface in 3D space is a fast, easy way to create a compelling user experience. Floating buttons and sliders are stable, reliable, and easy for users to understand. However, they can also feel obtrusive, especially when they’re only rarely used.

At Leap Motion, we’ve been experimenting internally with a range of different interfaces that are part of the user. This “wearable device” can be locked to your hand, wrist, or arm, and revealed automatically or through a gesture.

(Interestingly, demos like The Lab, Job Simulator, and Fantastic Contraption use an internalization mechanic – grabbing and “consuming” something in the environment to trigger a change, such as teleporting to a new environment or exiting the game. This is just one of many ways to bring the user’s sense of self deeper into VR.)

A simple form of this interface can be seen in Blocks, which features a three-button menu that allows you to toggle between different shapes. It remains hidden unless your left palm is facing towards you.
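
That “palm facing you” trigger reduces to a dot product between the palm normal and the direction to the user’s head. A minimal sketch, assuming the tracker supplies these vectors in the same world coordinates (the threshold is a tunable assumption):

```python
import numpy as np

FACING_THRESHOLD = 0.7  # roughly cos(45 degrees); tune for comfort

def menu_visible(palm_pos, palm_normal, head_pos):
    """Show the palm menu only while the palm normal points at the head."""
    to_head = head_pos - palm_pos
    to_head = to_head / np.linalg.norm(to_head)
    return float(np.dot(palm_normal, to_head)) > FACING_THRESHOLD

# Palm at chest height with its normal pointing back toward the head:
print(menu_visible(np.array([0.0, 1.2, 0.4]),
                   np.array([0.0, 0.0, -1.0]),
                   np.array([0.0, 1.6, 0.0])))  # True -> reveal the menu
```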

These early experiments point towards wearable interfaces where the user always has instant access to notifications and statuses, such as the time of day. More powerful options may be unlocked through a trigger gesture, such as tapping a virtual wristwatch. By combining our Attachments Module and UI Input Module, it’s possible to build a wearable interface in just a few minutes.
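
Conceptually, a wearable interface is just a UI transform that re-follows a hand or wrist anchor every frame – the kind of parenting the Attachments Module provides in Unity. A minimal sketch of that idea, with an assumed offset and matrix convention:

```python
import numpy as np

LOCAL_OFFSET = np.array([0.0, 0.06, 0.0])  # hypothetical: menu rides 6 cm off the wrist

def wearable_position(wrist_pos, wrist_rot):
    """wrist_rot: 3x3 world-space rotation of the wrist. The menu keeps a
    fixed local offset, so it stays locked to the arm as it moves."""
    return wrist_pos + wrist_rot @ LOCAL_OFFSET

# With an identity rotation, the menu simply floats 6 cm above the wrist:
print(wearable_position(np.array([0.1, 1.0, 0.3]), np.eye(3)))  # [0.1  1.06 0.3]
```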

Zach Kinstner’s Hover UI Kit is another approach to wearable interface design. With it you can quickly create a beautiful, customizable, dynamic UI. Tap buttons on your palm to summon or dismiss the menu, or go back. Select menu items beyond your hand to access and configure options.

The design features dynamic feedback and a finger cursor that continually suggests how the interface can be used, and what it’s currently doing. The Hover UI Kit is available from Zach Kinstner’s GitHub page. Try the basic menu demo from our gallery, or the new Force-Directed Graph to see how you could interact with data in VR.

Experimental UI

The UI Input Module also includes some experimental features that extend beyond the physical metaphor of direct interactions. One of these features is Projective Interaction Mode. By raising your hand, you can summon a cursor over a faraway menu, then interact with it using the pinch gesture. Another mode gives users telekinetic powers so they can interact with objects at a distance.
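
One common way to compute such a projective cursor – and a sketch only, since the anchor point, names, and plane-shaped menu here are our assumptions – is to cast a ray from a body anchor such as the shoulder through the hand and intersect it with the menu’s plane; a pinch then acts as the click:

```python
import numpy as np

def project_cursor(anchor, hand, plane_point, plane_normal):
    """Cast a ray from a body anchor (e.g. the shoulder) through the hand
    and return where it hits the menu plane, or None if it misses."""
    direction = hand - anchor
    direction = direction / np.linalg.norm(direction)
    denom = float(np.dot(direction, plane_normal))
    if abs(denom) < 1e-6:
        return None               # ray runs parallel to the menu plane
    t = float(np.dot(plane_point - anchor, plane_normal)) / denom
    return None if t < 0 else anchor + t * direction

# Shoulder behind the raised hand; the menu is a vertical plane 2 m ahead:
cursor = project_cursor(np.array([0.0, 1.5, 0.0]),
                        np.array([0.2, 1.4, 0.5]),
                        np.array([0.0, 1.0, 2.0]),
                        np.array([0.0, 0.0, -1.0]))
print(cursor)  # the menu-plane point the hand is pointing at: ~[0.8, 1.1, 2.0]
```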

We describe these features as “experimental” because, unlike the buttons and sliders that you can instantly reach out and press, it’s not always obvious to a new user how these more abstract modes work. Once the user understands the basic concept, the interactions tend to be smooth and fluid. But it’s the first step that’s the hardest. For this reason, we strongly encourage including tutorials, text cues, and other guides when developing with these modes.

All design is a form of storytelling. To take your users out of flatland, you need the right narrative to drive their interactions and help them make sense of their new universe. Next week: Storytelling and Narrative in VR.