The way we interact with technology is changing, and what we see as resources – wood, water, earth – may one day include digital content. At last week’s API Night at RocketSpace, Leap Motion CTO David Holz discussed our evolution over the past year and what we’re working on. Featured speakers and V2 demos ranged from Unity and creative coding to LeapJS and JavaScript plugins.

Read More ›

By learning how to play an instrument, musicians have the power to channel beauty and emotion through their hands. This makes music theory a ripe playground for 3D motion control experiments. If learning to play a physical instrument is a matter of learning how that object works and building muscle memory, why can’t learning chord progressions happen the same way – but in the air?

Read More ›

Art imitates life, but it doesn’t have to be bound by its rules. While natural interactions always begin with real world analogues, it’s our job as designers to craft new experiences geared towards virtual platforms. Building on last week’s game design tips, today we’re thinking about how we can break real-world rules and give people superpowers.

Read More ›

We live in a heavily coded world, where the ability to talk to computers, and understand how they “think,” is more important than ever. At the same time, however, programming is rarely taught in schools, and it can be hard for young students to get started.

Read More ›

What can virtual environments teach us about real-world issues? At last month’s ImagineRIT festival, Team Galacticod’s game Ripple took visitors into an interactive ocean to learn about threats facing coral reefs. It’s just one of many great projects we’ve seen recently from RIT students.

Read More ›

UX design is a never-ending learning process. Over the past year, we’ve been iterating on our UX best practices to embrace what we’ve learned from community feedback, internal research, and developer projects. While we’re still exploring what that means for the next generation of Leap Motion apps, today I’d like to highlight four success stories that are currently available in the App Store.

Games are boundless “safe zones” where developers and users can try out novel interactions and experiences that defy convention. Unlike utility apps, which are task-oriented, games invite players simply to play and experience something new. This is perhaps why some of the most compelling Leap Motion apps are games with cool interactions.

Read More ›

Each spring, UX aficionados gather in a different part of the world for CHI, an annual summit focused entirely on human factors in computer interfaces. For creatives, professionals, academics, and developers on the forefront of next-gen UX, it’s a chance to come together and rally around two tough questions.

First, which of today’s tools can help us evolve the way people interact with technology? Second, how can we iterate on this arsenal to transform interaction paradigms from pie-in-the-sky sketches into concrete, real-world solutions that better our health, increase our efficiency, and inspire us?

At this year’s conference in Toronto, UX researcher and designer Sheila Christian, a recent graduate of Carnegie Mellon’s Human-Computer Interaction Master’s Program, joined Madeiran students Júlio Alves, André Ferreira, Dinarte Jesus, Rúben Freitas, and Nelson Vieira to build a Leap Motion experience from the ground up for CHI’s student game competition. Their creation: Volcano Salvation, an Aztec-themed god game that combines webcam head tracking with Leap Motion control.

Read More ›

As we saw last week, the Bone API in the V2 skeletal tracking beta makes it possible to build physical interactions around a skeleton-rigged mesh – something we couldn’t accomplish consistently with V1. When combined with auditory and visual feedback, onscreen hands can create a real sense of physical space and complete the illusion created by VR interfaces like the Oculus Rift.

But rigged hands also involve several intriguing challenges, which we encountered while developing our Unity example hand. In this post, we’ll take a look at some of the technical and design challenges that went into building our rigged hand, and how we’re still thinking about them.
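
To make that concrete, here’s a minimal sketch of driving a rigged finger from V2 bone data in Unity. It’s illustrative rather than our actual example hand: the jointTransforms array stands in for a hypothetical rig, the millimeter-to-meter scale factor is an assumption, and a production version would also reconcile Leap’s right-handed coordinates with Unity’s left-handed ones.

    // Illustrative only: align hypothetical joint Transforms to V2 Bone API data.
    using Leap;
    using UnityEngine;

    public class SimpleRiggedFinger : MonoBehaviour
    {
        Controller controller = new Controller();              // Leap Motion service connection
        public Transform[] jointTransforms = new Transform[4]; // hypothetical rig, one per bone
        const float MM_TO_M = 0.001f;                           // Leap reports positions in millimeters

        void Update()
        {
            Frame frame = controller.Frame();
            if (frame.Hands.Count == 0) return;

            Finger finger = frame.Hands[0].Fingers[1];          // index finger
            for (int i = 0; i < 4; i++)
            {
                // Bone types run from metacarpal (0) to distal (3).
                Bone bone = finger.Bone((Bone.BoneType)i);
                Vector c = bone.Center;
                Vector d = bone.Direction;
                jointTransforms[i].position = new Vector3(c.x, c.y, c.z) * MM_TO_M;
                // Point each joint along its tracked bone.
                jointTransforms[i].rotation = Quaternion.LookRotation(new Vector3(d.x, d.y, d.z));
            }
        }
    }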

Read More ›

For touch-based input technologies, triggering an action is a simple binary question: touch the device to engage, release it to disengage. Motion control offers far more nuance and power, but unlike a mouse click or a screen tap, your hand can’t simply disappear at will. Instead of designing interactions in black and white, we need to start thinking in shades of gray.

Let’s say there’s an app with two controls: left swipe and right swipe. When the user swipes right and naturally pulls the hand back to its original position, the left swipe might get triggered accidentally. This is a false positive error, and bad UX. So, with hand data being tracked constantly, how do you stop hand interactions from colliding with one another?
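
One way to attack this, sketched below, is to gate triggers behind a re-arm condition instead of acting on every swipe the tracker reports. The SwipeFilter class and its half-second cooldown are illustrative assumptions, not SDK code: the filter suppresses an opposite-direction swipe that arrives immediately after a trigger, since that motion is usually just the hand travelling back to neutral.

    // Illustrative gesture filter: ignore the "return stroke" after a swipe.
    using Leap;

    public class SwipeFilter
    {
        const float CooldownSeconds = 0.5f;        // tuning assumption
        float lastTriggerTime = float.NegativeInfinity;
        int lastDirection = 0;                     // -1 = left, +1 = right, 0 = none yet

        // Returns -1 (left), +1 (right), or 0 (ignore this motion).
        public int Filter(SwipeGesture swipe, float now)
        {
            if (swipe.State != Gesture.GestureState.STATE_STOP) return 0;
            int dir = swipe.Direction.x > 0 ? 1 : -1;

            // An opposite-direction swipe right after a trigger is almost
            // certainly the hand returning: treat it as a false positive.
            if (dir == -lastDirection && now - lastTriggerTime < CooldownSeconds)
                return 0;

            lastDirection = dir;
            lastTriggerTime = now;
            return dir;
        }
    }

Requiring the hand to return to a neutral zone before re-arming is another variation on the same idea.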

Read More ›

In any 3D virtual environment, selecting objects with a mouse becomes difficult when the scene is densely populated and structures are occluded. In areas such as game development, designers can work around this limitation by creating open environments with large amounts of space between objects.

However, biocommunicators like me aren’t afforded the same luxury, since models of anatomy represent environments in which there is no true empty space. Organs, vessels, and nerves always sit flush with adjacent structures. Consequently, accessing occluded objects requires a user to either distort the point of view or remove obstructions entirely. Although individual structures can be observed, important information about the spatial relationships between them is lost.
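
One generic workaround for dense scenes (not necessarily the approach described in the full post) is depth-cycling selection: gather every structure along the selection ray rather than only the nearest, and let repeated selections step deeper. The DepthSelection class below is a hypothetical Unity sketch.

    // Hypothetical sketch: cycle through all structures under a selection ray.
    using System;
    using UnityEngine;

    public static class DepthSelection
    {
        static int cycleIndex = -1;

        // Each call selects the next structure along the ray, starting with
        // the nearest and stepping deeper through occluded ones.
        public static Transform CycleSelect(Ray selectionRay)
        {
            RaycastHit[] hits = Physics.RaycastAll(selectionRay);
            if (hits.Length == 0) { cycleIndex = -1; return null; }

            // RaycastAll returns hits in no guaranteed order; sort by depth.
            Array.Sort(hits, (a, b) => a.distance.CompareTo(b.distance));

            cycleIndex = (cycleIndex + 1) % hits.Length;
            return hits[cycleIndex].transform;
        }
    }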

Read More ›