We at Thomas Street have been eyeing the Oculus Rift for quite some time, paying particular attention to demos featuring novel interfaces. We all grew tired of talking about how we wanted to explore VR development, so we allocated several weeks to tinkering with the Oculus Rift and Leap Motion – staffing one full-time developer and a few designers part-time. The goal was to build a cohesive VR interface that plays nicely with the hardware's current limitations. The Minority Report interface ain't gonna build itself.

It turns out that the current limitations are considerable. Some of the challenges rendered our original ideas impossible – for example, we weren't able to use WebVR + Three.js for easy distribution as we had hoped. Others made us radically rethink what sorts of interactions were possible, and which ones were plausible. Those constraints helped us build an interaction that's fun, unique, and comfortable. More importantly, they helped us rethink fundamental interactions in an environment far removed from most of our day-to-day work.

Outcome aside, working with the Oculus Rift is, you know, working with the Oculus Rift, so the experience was a ton of fun. We learned a lot in a short amount of time, and we hope that this post inspires anyone else looking to explore design/development with the Oculus Rift and Leap Motion.

Initial Research

We got things started by downloading the most popular Oculus Rift / Leap Motion demos and giving them a try. We played through the VR demos with a purpose: find interfaces with novel interactions. Two apps in particular offered what we were looking for.

Inspiration

Hauhet. VRARlab.

In Hauhet, players solve puzzles using their eyes and hands. Players select blocks by looking at them, and then move them based upon changes in hand position. The end result feels fluid and intuitive. Blocks don’t track the player’s hand directly – instead, the player triggers discrete motions. The decision to manipulate blocks with large-scale gestures was brilliant, as it greatly reduced the game’s dependency on precise hand-tracking.
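To make the discrete-motion pattern concrete, here's a minimal sketch of one way to implement it with the Leap Motion JavaScript library: accumulate hand displacement and fire a single step once it crosses a threshold, rather than binding the object to the hand. The threshold value and the moveBlock() handler are our own illustration, not Hauhet's actual code.

```javascript
// Sketch: turning continuous hand motion into discrete "move" events.
// THRESHOLD_MM and moveBlock() are illustrative, not from Hauhet.
var THRESHOLD_MM = 80; // how far the palm must travel to trigger one step
var anchor = null;     // palm position when the gesture began

Leap.loop(function (frame) {
  var hand = frame.hands[0];
  if (!hand) { anchor = null; return; }

  if (!anchor) anchor = hand.palmPosition.slice();

  var dx = hand.palmPosition[0] - anchor[0]; // lateral displacement in mm
  if (Math.abs(dx) > THRESHOLD_MM) {
    moveBlock(dx > 0 ? 'right' : 'left'); // one discrete step
    anchor = hand.palmPosition.slice();   // re-anchor for the next step
  }
});
```

Because the block only ever moves in whole steps, momentary tracking glitches nudge the accumulated displacement rather than the block itself.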

Leap Garden. Ksiva.

LeapGarden features a straightforward button and slider interface. The slider works surprisingly well, and the buttons do what you’d expect. The menu is positioned permanently to the left of the starting field of vision, meaning that users have to rotate their heads whenever they want access to the menu. This leaves the default field of vision uncluttered, at the cost of frequent head-turning. The trade-off seems reasonable, considering that users won’t be navigating the menu too often.

Riffing Off Existing Research

While trying out existing demos, we also read Leap Motion’s excellent articles on NUI (Natural User Interface) design. One of the Thomas Street designers, Ronald, put together a concise summary of the key points.

Leap Motion UI summary. Ronald Viernes, Thomas Street.

With ideas brewing, we decided that it was time to start making things.

Adventures in WebVR

We initially targeted Chrome and Firefox’s experimental WebVR builds. The prospect of distributing VR apps on the web, simply by pointing people to a URL, was too good to ignore.

The first version of our Planet Editor ran in the browser with WebVR.

We experimented with voice commands via Chrome's built-in Web Speech API. It sounded like a good idea, considering that voice-controlled interfaces are so common in sci-fi movies. However, despite the futuristic context, voice commands still suffered from I'm-talking-to-my-computer-and-it-feels-awkward syndrome, so they were vetoed. People in the office were also getting annoyed by the constant streams of "ROTATE. ROTATE."
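For reference, Chrome exposes speech recognition through a webkitSpeechRecognition constructor. A rough sketch of this kind of voice hook – the rotatePlanet() handler is a hypothetical stand-in:

```javascript
// Sketch: continuous voice commands via Chrome's webkitSpeechRecognition.
// rotatePlanet() is a hypothetical handler for illustration.
var recognition = new webkitSpeechRecognition();
recognition.continuous = true;      // keep listening between commands
recognition.interimResults = false; // only act on finalized transcripts

recognition.onresult = function (event) {
  var result = event.results[event.results.length - 1];
  var transcript = result[0].transcript.trim().toLowerCase();
  if (transcript.indexOf('rotate') !== -1) rotatePlanet();
};

recognition.start();
```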

Development for WebVR ended up being slow and error-prone. We used Three.js along with the official Leap Motion JavaScript library and an open-source Oculus–WebVR adapter. The trio worked together, but we spent a lot of time dealing with low-level concerns on all fronts.
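To give a sense of the moving parts, here is a stripped-down sketch of that kind of setup – a Three.js render loop plus a Leap controller feeding hand frames. The scene contents are elided, and updateHands() is a placeholder name:

```javascript
// Sketch: Three.js rendering plus Leap Motion hand tracking.
// updateHands() is a placeholder; scene setup is elided.
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
var renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

var controller = new Leap.Controller({ optimizeHMD: true }); // head-mounted mode
controller.on('frame', function (frame) {
  updateHands(frame.hands); // map Leap hand data onto hand meshes
});
controller.connect();

(function animate() {
  requestAnimationFrame(animate); // capped at the display's refresh rate
  renderer.render(scene, camera);
})();
```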

The browsers also imposed a 60 fps cap. It was possible to get around, but not in a way that was convenient for users. The cap resulted in occasional judder when moving the head, and an overall experience inferior to the 75 fps achievable in desktop apps.

Because of these challenges, we decided to abandon WebVR and adopt desktop as our platform.

Evolution of the Project

When we first sat down and discussed the project, we envisioned an interface very similar to what you'd see in non-VR games: a 2D HUD overlay.

Planet Editor mock-up. Ronald Viernes, Thomas Street.

However, as the project progressed and we got a better feel for the tools, we gravitated towards an interface that would exist in the same 3D space as the game. This change improved cohesion between the interface and game objects.

The First Implementation

Our initial implementation was heading down the path of multi-level menu navigation. However, the more time we spent with buttons, the more we began to dislike them.

Early Unity experiment using a button menu.

For one, it was difficult to press buttons accurately with the Leap hands. Our virtual hands would contort sporadically, even when we kept our real hands completely still. Actions requiring precise movements – like pressing a button with a single finger – took extreme determination and patience. Buttons in close proximity were similarly troublesome, due to the likelihood of hitting adjacent ones. Finally, the lack of perceived physical contact made pressing buttons unsatisfying. We, inhabitants of the real world, take the physical qualities of buttons for granted.

Beyond the technical challenges of collision-based interfaces, the idea of controlling a virtual reality interface with… buttons felt like a poor utilization of the tools. It felt like we were importing the most boring controls from real life, despite the fact that things can fly around and do crazy stuff in virtual reality. This was the main reason we moved away from button navigation, and instead opted for something truer to how we imagine the interfaces of the future.

The Improved Interface

The new interface is a grid of selectable items in 3D space. Users navigate columns and rows via left-and-right and forwards-and-backwards hand movements. By rotating the hand, users can activate items, which either spawn objects or change a planet's surface.
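A minimal sketch of that mapping, using the Leap JavaScript API for illustration – the grid dimensions, coordinate ranges, and activation threshold here are made up for the example, not our production values:

```javascript
// Sketch: map palm position to a grid cell, and hand roll to activation.
// COLS, ROWS, the ±120 mm ranges, and the 60° threshold are illustrative.
var COLS = 4, ROWS = 3;

function handToCell(hand) {
  // palmPosition is [x, y, z] in millimeters relative to the sensor:
  // x runs left/right, z runs toward/away from the user.
  var col = clamp(Math.floor((hand.palmPosition[0] + 120) / 240 * COLS), 0, COLS - 1);
  var row = clamp(Math.floor((hand.palmPosition[2] + 120) / 240 * ROWS), 0, ROWS - 1);
  return { row: row, col: col };
}

function isActivating(hand) {
  return Math.abs(hand.roll()) > Math.PI / 3; // rotate the hand ~60° to activate
}

function clamp(v, lo, hi) { return Math.min(hi, Math.max(lo, v)); }
```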

Inspired by Hauhet, our interface allows users to target a planet simply by looking at it. This mechanic, coupled with visual indications of the target planet, makes selection feel natural.
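Gaze targeting boils down to casting a ray from the center of the camera's view and taking the nearest hit. A minimal Three.js sketch, where planets is assumed to be an array of meshes:

```javascript
// Sketch: gaze-based targeting via a ray through the center of the view.
var raycaster = new THREE.Raycaster();
var center = new THREE.Vector2(0, 0); // normalized device coords: middle of the screen

function gazeTarget(camera, planets) {
  raycaster.setFromCamera(center, camera);
  var hits = raycaster.intersectObjects(planets);
  return hits.length ? hits[0].object : null; // nearest planet under the gaze, if any
}
```

Highlighting whatever gazeTarget() returns provides the visual indication that makes the selection feel natural.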

We also learned a lot from watching a handful of testers. Testers would discover the interface by accident, and then slowly learn the controls by trial and error. To speed up the learning process, we added a title card with visual cues.

The title card shown at the start of the demo.

Our Thoughts on the Finished Interface

Overall, we feel good about the interaction design for the interface. We feel that it makes use of hand tracking in a non-vanilla way. But this uniqueness comes at the cost of a small learning curve.

First, users need to learn that only the right hand controls the interface. It seems like a silly point, but most users raise both hands up immediately when starting the demo. Second, users need to discover that the right hand’s position — relative to the Leap’s sensor range — is mapped to the interface’s rows/columns.
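In code, restricting control to one hand is straightforward, since the Leap v2 skeletal API labels each tracked hand:

```javascript
// Sketch: ignore everything except the right hand.
function controllingHand(frame) {
  for (var i = 0; i < frame.hands.length; i++) {
    if (frame.hands[i].type === 'right') return frame.hands[i];
  }
  return null; // no right hand in view this frame
}
```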

Users inexperienced with the Leap Motion also needed to learn the limitations of the technology. Their movements were often expressive, which can be hard for the Leap to translate due to the sensor's limited range. Extending beyond the ideal range resulted in their virtual hands disappearing or flapping wildly. These tracking issues made the experience frustrating at times.

It is also worth noting that the Leap provides pretty consistent data for hand rotation under normal circumstances. However, when the Leap is head-mounted on the Oculus, rotation tracking can get a little spotty. This fact is listed in the “Known Issues” section on the Leap Motion FAQ page. As you can imagine, we weren’t aware of this when we decided on the interaction. Regardless, our rotation gesture works reasonably well, as long as you stay away from surfaces problematic to the infrared sensor — such as white walls and glossy screens (also listed in the “Known Issues”). Help your sensor help you.
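One way to take the edge off noisy rotation readings – our suggestion here, not something specific to the original demo – is a simple exponential low-pass filter on the roll value:

```javascript
// Sketch: exponential smoothing of the hand's roll angle.
// ALPHA is illustrative; smaller values smooth more but respond slower.
var ALPHA = 0.2;
var smoothedRoll = 0;

function updateRoll(hand) {
  smoothedRoll = ALPHA * hand.roll() + (1 - ALPHA) * smoothedRoll;
  return smoothedRoll; // use this instead of the raw reading
}
```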

We Had a Good Time

Developing a virtual reality app has been a true privilege. Yes, it was both frustrating and nauseating at times, but it was definitely a worthwhile experience. And as VR technology becomes more popular, this initial exposure will only increase in value. If you have been considering VR development for some time, we urge you to give it a shot. It’s a lot of fun.

We have high hopes for the next generation of VR / AR technology, and look forward to continuing UI experimentation in the future.

Try our Planet Editor Demo

Developed for the Oculus Rift DK2 and Leap Motion v2.

Windows 64-bit
Tested on: Windows 8.1, GeForce GTX 970, i7-5820K
Download

OSX
Tested on: 2013/2014 13″ MacBook Pros
Download

An earlier version of this post originally appeared on Thomas Street’s website.

Andre is a developer with Thomas Street, a small team of designers, developers, and product managers who experiment with VR.
