
// Leap Motion Team

Art imitates life, but it doesn’t have to be bound by its rules. While natural interactions always begin with real-world analogues, it’s our job as designers to craft new experiences geared towards virtual platforms. Building on last week’s game design tips, today we’re thinking about how we can break real-world rules and give people superpowers.

We live in a heavily coded world, where the ability to talk to computers and understand how they “think” is more important than ever. At the same time, however, programming is rarely taught in schools.

What can virtual environments teach us about real-world issues? At last month’s ImagineRIT festival, Team Galacticod’s game Ripple took visitors into an interactive ocean to learn about threats facing coral reefs.

Over the past year, we’ve been iterating on our UX best practices to embrace what we’ve learned from community feedback, internal research, and developer projects. While we’re still exploring what that means for the next generation of Leap Motion apps, today I’d like to highlight four success stories that are currently available in the App Store.

Built in Unity over the course of five weeks, Volcano Salvation lets you change your perspective in the game by turning your head from side to side. This approach to camera control, using webcam-based face tracking, makes it easy for the player to focus on the hand interactions.
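As a rough illustration of that approach, here is a minimal Unity-style sketch. It assumes some face tracker (the webcam side is outside the scope of this sketch) supplies an estimated head yaw in degrees; the field name `headYawDegrees` and the range and smoothing values are illustrative assumptions, not part of the original game.

```csharp
using UnityEngine;

// Sketch: map an estimated head yaw (from any webcam face tracker)
// onto the camera, so turning your head turns the in-game view.
// "headYawDegrees" is a hypothetical input; wire it to your tracker of choice.
public class HeadLookCamera : MonoBehaviour
{
    [Tooltip("Estimated head yaw from the face tracker, in degrees (about -45..45).")]
    public float headYawDegrees;

    [Tooltip("How far the camera swings at a full head turn.")]
    public float maxCameraYaw = 60f;

    [Tooltip("Smoothing factor; higher values respond faster.")]
    public float smoothing = 5f;

    void Update()
    {
        // Normalize head yaw to [-1, 1], then scale to the camera's range.
        float t = Mathf.Clamp(headYawDegrees / 45f, -1f, 1f);
        float targetYaw = t * maxCameraYaw;

        // Smoothly rotate toward the target so webcam jitter doesn't shake the view.
        float currentYaw = transform.localEulerAngles.y;
        float newYaw = Mathf.LerpAngle(currentYaw, targetYaw, smoothing * Time.deltaTime);
        transform.localRotation = Quaternion.Euler(0f, newYaw, 0f);
    }
}
```

Keeping the camera on a smoothed, limited range like this also leaves both hands free for the actual gameplay, which is the point of the technique.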

When combined with auditory cues and other forms of visual feedback, onscreen hands can create a real sense of physical space, as well as complete the illusion created by VR headsets like the Oculus Rift. But rigged hands also involve several intriguing challenges.

For touch-based input technologies, triggering an action is a simple binary question. Touch the device to engage with it. Release it to disengage. Motion control offers a lot more nuance and power, but unlike with mouse clicks or screen taps, your hand doesn’t have the ability to disappear at will. Instead of designing interactions in black and white, we need to start thinking in shades of gray.
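One way to think in those shades of gray is to replace a single on/off threshold with a pair of thresholds (hysteresis) over a continuous signal. The sketch below assumes the tracking API exposes a normalized pinch strength between 0 and 1; the 0.8 and 0.5 values are illustrative, not canonical.

```csharp
// Sketch of "shades of gray" triggering: two thresholds over a continuous
// 0..1 pinch strength, so the interaction doesn't flicker on and off when
// the hand hovers near a single boundary.
public class PinchTrigger
{
    const float EngageThreshold = 0.8f;   // must squeeze this hard to start
    const float ReleaseThreshold = 0.5f;  // must relax this far to stop

    public bool IsEngaged { get; private set; }

    // Call once per tracking frame with the current pinch strength (0..1).
    public void Update(float pinchStrength)
    {
        if (!IsEngaged && pinchStrength > EngageThreshold)
            IsEngaged = true;
        else if (IsEngaged && pinchStrength < ReleaseThreshold)
            IsEngaged = false;
    }
}
```

The gap between the two thresholds is the "gray zone" where the user can hesitate, drift, or tremble without accidentally toggling the action.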

Mapping legacy interactions like mouse clicks or touchscreen taps to air pokes often results in unhappy users. Let’s think beyond the idea of “touch at a distance” and take a look at what it means when your hand is the interface.

We recently received an intriguing call from Puzzle Break, a Seattle-based company that specializes in building mystery rooms: the types of mindbenders you’ve encountered in video games, except rendered in real life.

Anastasiy and Crispy Driven Pixels envision a simple, easy-to-use artistic interface that lets you create almost anything you can imagine with just your hands in the air.