At Leap Motion, we’re making VR/AR development easier with Widgets: fundamental UI building blocks for Unity. In part 2 of our Planetarium series, we take a look at the development of the Arm HUD Widget.

Hi, Barrett Fox here. As an interaction engineer at Leap Motion, I built the Arm HUD for Planetarium. While we introduced an early version of the Arm HUD in December, I wanted to share what we learned from its evolution and development.

A Rich Stew of Ideas

As Daniel mentioned earlier, Planetarium was designed in part to put our Widgets through the rigors of an authentic production. We needed to validate that they could be placed inside a complex hierarchy, withstand being scaled, work in any orientation, and control a variety of events. While the Arm HUD is itself a new kind of meta-widget for us, we wanted it to spur the creation of new additions to our Widget library. And to be frank, we also wanted to have just a touch of unabashed fun and actually create the kind of interfaces the movies have been teasing us with for so long.

The initial idea to build an Arm HUD started with our co-founder, David Holz, and was given legs by our designer Kyle Hay. Bringing it into focus was a rich and fluid collaboration. Our UX designer Jody Medich would wireframe proposals for functionality and layout, and Kyle would create PDFs of art direction and design details. As I built to these specs, we would rapidly prototype, test, and observe intently. We evolved our design significantly and repeatedly throughout the process.

Flexible Workflows & New Unity Powers

Rapid prototyping of the Arm HUD’s shape and motion was critical for us to be able to deftly explore broad visual ideas without painting ourselves into any corners. I used Maya to rough out a geometric form, animate it, and export to Unity. All of the Arm HUD’s graphics needed to be dynamic and data-driven, yet able to conform to any shape of surface. To do this, I created an orthographic camera, invisible to the user, that would render any graphic layouts I created and project them onto the Arm HUD’s 3D geometry.
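In Unity terms, this projection setup can be sketched roughly as follows. This is a minimal illustration of the render-to-texture approach described above, not the actual Planetarium source; the component and field names are assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch: a hidden orthographic camera renders the flat GUI
// layout into an offscreen texture, which is then mapped onto the curved
// Arm HUD mesh via the mesh's UVs.
public class HudProjection : MonoBehaviour
{
    public Camera layoutCamera;    // sees only the flat, data-driven layout
    public Renderer hudGeometry;   // the 3D Arm HUD surface exported from Maya

    void Start()
    {
        // Render the layout offscreen so the camera is invisible to the user.
        var layoutTexture = new RenderTexture(1024, 512, 0);
        layoutCamera.orthographic = true;
        layoutCamera.targetTexture = layoutTexture;

        // The mesh's UV mapping "projects" the flat layout onto any
        // shape of surface.
        hudGeometry.material.mainTexture = layoutTexture;
    }
}
```

Because the layout is just an ordinary scene rendered by an ordinary camera, the graphics stay fully dynamic while the surface geometry remains free to change shape.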

During the production of Planetarium, the long-awaited Unity 4.6 GUI became available, providing us with a huge and highly relevant new toolset. We used the new World Space UI canvas extensively, not only throughout the Arm HUD but across the rest of Planetarium as well.

Additionally, the Arm HUD uses Unity’s Mecanim animation state machines in conjunction with the Arm HUD’s own C# state machine. Using Unity’s existing systems for GUI and animated events inside our Widgets makes it easier for developers to not only use our hand-enabling toolsets but to easily build upon them as well.
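The division of labor between the two state machines can be sketched like this. The state and parameter names below are hypothetical, assuming a Mecanim controller whose trigger parameters mirror the C# states.

```csharp
using UnityEngine;

// Illustrative sketch: the C# state machine decides *when* to change state;
// Mecanim handles the animated transition itself.
public class ArmHudStateMachine : MonoBehaviour
{
    public enum HudState { Hidden, Summary, Expanded }  // hypothetical states

    public Animator animator;  // Mecanim controller with matching triggers

    public HudState Current { get; private set; } = HudState.Hidden;

    public void TransitionTo(HudState next)
    {
        if (next == Current) return;
        Current = next;
        // Fire the Mecanim trigger named after the target state,
        // letting the animation state machine play the transition.
        animator.SetTrigger(next.ToString());
    }
}
```

Keeping gameplay logic in C# while delegating motion to Mecanim means developers can retarget or restyle the animations without touching the interaction code.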

Iterate, Observe, Rinse & Repeat

The unbroken thread of testing and feedback runs throughout the process of building the Arm HUD. As we iterated, often with Jody proposing new UX approaches, we experimented with several ways to lay out the various widgets and panels. We zeroed in on a layout that kept the hands somewhat separate. And we aligned the panels in a way that would prompt the user to tilt their arm up and in front for optimal tracking. We found that turning the wrist was an elegant and reliable way to use your body to switch contexts.
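One simple way to detect that wrist turn is to compare the palm normal against the direction to the user's head. The sketch below is an assumption about how such a check might look; the threshold value and field names are illustrative, not from the actual implementation.

```csharp
using UnityEngine;

// Hypothetical sketch: treat the HUD as "summoned" when the palm has
// rotated to face the user.
public class WristTurnDetector : MonoBehaviour
{
    public Transform palm;   // palm transform supplied by the hand-tracking rig
    public Transform head;   // the user's head/camera

    [Range(0f, 1f)]
    public float facingThreshold = 0.6f;  // illustrative cutoff

    // True when the palm normal points back toward the head, i.e. the
    // user has turned their wrist to look at the inside of their arm.
    public bool PalmFacingUser()
    {
        Vector3 toHead = (head.position - palm.position).normalized;
        // Assume -palm.up approximates the palm normal.
        return Vector3.Dot(-palm.up, toHead) > facingThreshold;
    }
}
```

A dot-product threshold like this also gives a natural dead zone, so the HUD doesn't flicker when the arm hovers near the switching angle.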

Witness the deep scrutiny and observational rigor of our process!

Next: A Brief History of Time Dial

An interaction engineer at Leap Motion, Barrett has been working at the intersection of game design, information visualization and animation craft for 20 years as a producer, game designer, and animator.