From gaming to big data, virtual reality gives us the chance to build and explore whole new worlds beyond the screen. As we developed demos and prototypes with the Oculus Rift internally, several UX insights sprang forth. Now that many of you have received your VR Developer Mounts, we thought we’d share.
By Nancy Chen
In the physical world, our hands exert force on other objects. When we grasp something, our fingers and palms exert force (using precision, power, or scissor grips), and the object pushes back. Gravity can also help us hold items in the palms of our hands.
But not in the digital world! In the frictionless space of the virtual, anything can happen. Gravity is optional. Magic is feasible.
Old habits can be hard to break. When I’m building Leap Motion prototypes, I often find myself slipping into designing for cursors and touchscreens – paradigms based on one-handed interactions. By remembering to think outside the mouse, we can open ourselves up to interacting with virtual objects using both hands. But when are two-handed interactions the right approach?
Going from zero to 60 can feel exhilarating – but if you don’t know what you’re doing, it can spell disaster. The same is true for first-time app users. Even with traditional interfaces, a clear and intuitive onboarding experience is important. For new interfaces like the Leap Motion Controller, it can be the difference between joy and frustration. The trick is to build an on-ramp – a starting experience where users can “speed up” to access the full functionality and interaction set.
Art imitates life, but it doesn’t have to be bound by its rules. While natural interactions always begin with real world analogues, it’s our job as designers to craft new experiences geared towards virtual platforms. Building on last week’s game design tips, today we’re thinking about how we can break real-world rules and give people superpowers.
Over the past year, we’ve been iterating on our UX best practices to embrace what we’ve learned from community feedback, internal research, and developer projects. While we’re still exploring what that means for the next generation of Leap Motion apps, today I’d like to highlight four success stories that are currently available in the App Store.
For touch-based input technologies, triggering an action is a simple binary question. Touch the device to engage with it. Release it to disengage. Motion control offers a lot more nuance and power, but unlike with mouse clicks or screen taps, your hand doesn’t have the ability to disappear at will. Instead of designing interactions in black and white, we need to start thinking in shades of gray.
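One way to design in those shades of gray is to treat engagement as a continuous signal with two separate thresholds (hysteresis) rather than a single on/off cutoff. Here is a minimal sketch of that idea, assuming a hypothetical stream of normalized grab-strength samples in [0, 1] – this is illustrative only, not the Leap Motion API:

```python
# Sketch: graded engagement with hysteresis, assuming a hypothetical
# stream of normalized grab-strength samples in [0, 1].
# (Illustrative only; not the Leap Motion API.)

ENGAGE_AT = 0.8    # user must squeeze this hard to start a "grab"
RELEASE_AT = 0.5   # user must relax below this to end it

def track_engagement(samples):
    """Map continuous strength samples to engaged/disengaged states.

    Two separate thresholds prevent flicker near a single cutoff --
    the nuance a binary touch/release model can't express.
    """
    engaged = False
    states = []
    for s in samples:
        if not engaged and s >= ENGAGE_AT:
            engaged = True
        elif engaged and s <= RELEASE_AT:
            engaged = False
        states.append(engaged)
    return states
```

Because the release threshold sits well below the engage threshold, small tremors in hand tracking near either boundary don’t toggle the interaction on and off.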
Mapping legacy interactions like mouse clicks or touchscreen taps to air pokes often results in unhappy users. Let’s think beyond the idea of “touch at a distance” and take a look at what it means when your hand is the interface.
Hack Reactor is a developer bootcamp where people become software engineers through live coding, real-world projects, and meetups. On March 28, Leap Motion’s senior developer Dave Edelhart and I were invited to present on the Leap Motion Controller and Three.js. Together with over 30 developers (both in training and from the wider San Francisco coding community), […]