At Leap Motion, we’re making VR/AR development easier with Widgets: fundamental UI building blocks for Unity. In part 2 of our Planetarium series, we take a look at the development of the Arm HUD Widget.
Hi, Barrett Fox here. As an interaction engineer at Leap Motion, I built the Arm HUD for Planetarium. While we introduced an early version of the Arm HUD in December, I wanted to share what we learned from its evolution and development.
A Rich Stew of Ideas
As Daniel mentioned earlier, Planetarium was designed in part to put our Widgets through the rigors of an authentic production. We needed to validate that they could be placed inside a complex hierarchy, withstand being scaled, work in any orientation, and control a variety of events. While the Arm HUD is itself something of a new meta-widget, we wanted it to spur the creation of new additions to our Widget library. And to be frank, we also wanted to have just a touch of unabashed fun and actually create the kind of interfaces we’re so often teased with in the movies.
The initial idea to build an Arm HUD started with our co-founder, David Holz, and was given legs by our designer Kyle Hay. Bringing it into focus and realization was a richly fluid collaboration. Our UX designer Jody Medich would wireframe proposals for functionality and layout, and Kyle would create PDFs of art direction and design details. As I built to these specs, we would rapidly prototype, test, and observe intently. We evolved the design significantly and repeatedly throughout the process.
Flexible Workflows & New Unity Powers
Rapid prototyping of the Arm HUD’s shape and motion was critical, letting us deftly explore broad visual ideas without painting ourselves into a corner. I used Maya to rough out a geometric form, animate it, and export it to Unity. All of the Arm HUD’s graphics needed to be dynamic and data-driven, yet able to conform to any shape of surface. To do this, I created an orthographic camera, invisible to the user, that would render any graphic layouts I created and project them onto the Arm HUD’s 3D geometry.
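To make that concrete, here’s a minimal sketch of the render-to-texture approach in Unity C#. The class name, the “HudLayout” layer, and the field names are illustrative assumptions rather than the actual Widgets source:

```csharp
using UnityEngine;

// Sketch: render flat, data-driven layouts with a hidden orthographic camera
// into a texture that the curved Arm HUD geometry samples via its UVs.
public class ArmHudProjector : MonoBehaviour
{
    public Camera layoutCamera;   // orthographic camera, invisible to the user
    public Renderer hudSurface;   // the curved Arm HUD geometry
    public int textureSize = 1024;

    void Start()
    {
        // Render only objects on the (hypothetical) "HudLayout" layer offscreen.
        var layoutTexture = new RenderTexture(textureSize, textureSize, 16);
        layoutCamera.orthographic = true;
        layoutCamera.cullingMask = LayerMask.GetMask("HudLayout");
        layoutCamera.targetTexture = layoutTexture;

        // The surface's UV mapping conforms the flat layout to the 3D shape.
        hudSurface.material.mainTexture = layoutTexture;
    }
}
```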
During the production of Planetarium, the long-awaited Unity 4.6 GUI became available and gave us a huge, highly relevant new toolset. We use the new World Space UI canvas extensively, not only throughout the Arm HUD but across the rest of Planetarium as well.
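For reference, a world-space canvas only takes a couple of lines to set up. This is a hedged sketch, with the component name and scale value as assumptions rather than the actual Planetarium configuration:

```csharp
using UnityEngine;

// Sketch: a Unity 4.6 UI canvas placed in world space so it can be parented
// to the arm and move with it like any other scene object.
[RequireComponent(typeof(Canvas))]
public class WorldSpacePanel : MonoBehaviour
{
    void Awake()
    {
        var canvas = GetComponent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;

        // Canvases are authored in pixel units, so scale them down until a
        // few hundred "pixels" span a few centimeters on the arm.
        canvas.GetComponent<RectTransform>().localScale = Vector3.one * 0.001f;
    }
}
```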
Additionally, the Arm HUD uses Unity’s Mecanim animation state machines in conjunction with its own C# state machine. Using Unity’s existing systems for GUI and animated events inside our Widgets makes it easier for developers not only to use our hand-enabling toolsets but also to build upon them.
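As a rough illustration of that pairing, a hand-written C# state machine can own the interaction logic while Mecanim handles the visual transitions. The state and trigger names below are hypothetical, not the actual Arm HUD controller:

```csharp
using UnityEngine;

// Sketch: C# decides *what* state the HUD is in; Mecanim animates *how*
// it gets there, via triggers on an Animator controller.
public class ArmHudStateMachine : MonoBehaviour
{
    public enum HudState { Hidden, Summary, Expanded }

    public Animator animator;   // Mecanim controller exposing matching triggers
    public HudState Current { get; private set; }

    public void TransitionTo(HudState next)
    {
        if (next == Current) return;
        Current = next;

        switch (next)
        {
            case HudState.Hidden:   animator.SetTrigger("Hide");   break;
            case HudState.Summary:  animator.SetTrigger("Show");   break;
            case HudState.Expanded: animator.SetTrigger("Expand"); break;
        }
    }
}
```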
Iterate, Observe, Rinse & Repeat
An unbroken thread of testing and feedback ran throughout the process of building the Arm HUD. As we iterated, often with Jody proposing new UX approaches, we experimented with several ways to lay out the various widgets and panels. We zeroed in on a layout that kept the hands somewhat separate, and we aligned the panels in a way that would prompt the user to tilt their arm up and in front for optimal tracking. We found that turning the wrist was an elegant and reliable way to use your body to switch contexts.
Witness the deep scrutiny and observational rigor of our process!
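The wrist-turn context switch boils down to checking how the palm is oriented relative to the viewer. Here’s a hedged sketch of one way to detect it; the transform names and threshold are assumptions, not the actual Planetarium detection code:

```csharp
using UnityEngine;

// Sketch: when the palm normal turns toward the head, switch to the HUD
// context; when it turns away, switch back.
public class WristContextSwitch : MonoBehaviour
{
    public Transform palm;   // assumed: a transform whose up vector tracks the palm normal
    public Transform head;   // the VR camera
    [Range(0f, 1f)] public float threshold = 0.6f;

    public bool HudVisible { get; private set; }

    void Update()
    {
        // Dot product near 1 means the palm faces the viewer.
        Vector3 toHead = (head.position - palm.position).normalized;
        HudVisible = Vector3.Dot(palm.up, toHead) > threshold;
    }
}
```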
Next: A Brief History of Time Dial
Just a suggestion: try to avoid forcing the user to adapt to the sensor (as in “… we aligned the panels in a way that would prompt the user to tilt their arm up and in front for optimal tracking”), and think of a natural way for the user to use the intended interface.
It’s not at all natural to raise your arm up like you show in that last picture. It could be more natural to look down and still use the same system, without having to raise your arm, since everything is relative to sensor orientation in the real world.
It could also be more natural if the panel that extends to the inside of the arm (when the arm is vertical) instead extended upward (when your arm is horizontal relative to the sensor), similar to the natural movement we make when we look at a wrist watch. I suggest you try this, or support it as an option.
You’re right that we should do our best to adapt the design to fit what’s natural for the user, making the application “talk human” if you will. That said, that goal needs to be balanced against providing as reliable an experience as possible. This is a common balance you need to strike when working with new input and sensor technologies. There’s already a ton you can do with optical hand tracking, but there are some pretty defined limits on what can be tracked reliably. The cool thing is that it’s getting better every day.
The original designs for the ArmHUD had both of your suggestions (inside-arm interactions and horizontal-arm interactions), and we had to adjust the design to support more reliable interactions. For example, at the time we started the project, tracking became unreliable when the hand overlapped or touched the upper arm. We raised this limitation with our tracking team, and I believe improvements to this functionality are coming down the line.
Maybe you could just rotate the orientation of the content of the current extended area.
Try forming a natural “T” with your arms (i.e., rotate your right arm until it forms a 90-degree angle with your left arm, without moving the left arm, as if your right arm were going to grab your left arm).
It might feel more natural than trying to use both arms in a parallel setting, since the interaction arm (the right arm in this case) provides a natural hinge at the elbow, making your fingertip’s movement rotational relative to your elbow. Try both settings and check which one makes it easier to trace a line, like you do with those slider widgets. A simple rotation of the extended GUI content might help.
Another idea to activate this “T” GUI could be to simply touch the arm with your finger (instead of raising the arm), making it pop up on whichever arm you feel more comfortable using as a reference position for the GUI (which also helps support right- or left-handed users). It could also provide a distinct menu on the other arm (and still support left-handed users via an application setting to swap the “hands GUI”, like you can define for mouse buttons in most systems).
There’s definitely a lot of interesting design space here, and the ArmHUD is really just one way to scaffold these sorts of UI elements. Take a look at Zach Kinstner’s Unity3D demo (https://www.youtube.com/watch?v=Phn3Ix-YxPA) for some other super cool interactions. This is a big part of why we’re going to open source Widgets and the VR Planetarium, so y’all can jump in and find even better ways to build these.