

As mainstream VR/AR input continues to evolve – from the early days of gaze-only input to wand-style controllers and fully articulated hand tracking – so too do the virtual user interfaces we interact with. Slowly but surely we’re moving beyond flat UIs ported over from 2D screens and toward a future filled with spatial interface paradigms that take advantage of depth and volume.

Last time, we looked at how an interactive VR sculpture could be created with the Leap Motion Graphic Renderer as part of an experiment in interaction design. With the sculpture’s shapes rendering, we can now craft and code the layout and control of this 3D shape pool and the reactive behaviors of the individual objects.
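As a rough illustration of what that layout step can look like, here’s a minimal sketch in plain Unity C#. The ShapePoolLayout class and its fields are hypothetical stand-ins rather than the project’s actual code; it simply spreads a pool of shape transforms evenly over a sphere around an anchor object:

```csharp
using UnityEngine;

// Hypothetical layout helper: arranges a pool of shape objects on the
// surface of a sphere centered on this transform. Illustrative only –
// one of many ways the "shape pool" layout could be coded.
public class ShapePoolLayout : MonoBehaviour {

  [Tooltip("The shape objects that make up the sculpture.")]
  public Transform[] shapes;

  [Tooltip("Radius of the spherical layout, in meters.")]
  public float radius = 0.25f;

  void Start() {
    LayoutOnSphere();
  }

  public void LayoutOnSphere() {
    int count = shapes.Length;
    for (int i = 0; i < count; i++) {
      // A Fibonacci-sphere distribution gives a roughly even spread of points.
      float t = (i + 0.5f) / count;
      float inclination = Mathf.Acos(1f - 2f * t);
      float azimuth = Mathf.PI * (1f + Mathf.Sqrt(5f)) * i;

      Vector3 dir = new Vector3(
        Mathf.Sin(inclination) * Mathf.Cos(azimuth),
        Mathf.Sin(inclination) * Mathf.Sin(azimuth),
        Mathf.Cos(inclination));

      shapes[i].position = transform.position + dir * radius;
      shapes[i].rotation = Quaternion.LookRotation(dir);
    }
  }
}
```

An even distribution like this keeps neighboring shapes roughly equidistant, which makes the per-object reactions described below easier to read when a hand approaches the sculpture.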

By adding the Interaction Engine to our scene and InteractionBehaviour components to each object, we have the basis for grasping, touching, and other interactions. But for our VR sculpture, we can also use the Interaction Engine’s robust and performant awareness of hand proximity. With this foundation, we can experiment quickly with different reactions to hand presence, pinching, and touching specific objects. Let’s dive in!
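To give a flavor of how those per-object reactions can be wired up, here’s a sketch of a reaction script. It assumes the InteractionBehaviour component exposes parameterless hover and contact callbacks (OnHoverBegin, OnHoverEnd, OnContactBegin), in line with Interaction Engine 1.x; event names and signatures can differ between module versions, and ShapeReaction itself is an illustrative name, not the project’s actual code:

```csharp
using UnityEngine;
using Leap.Unity.Interaction;

// Hypothetical reaction script: attach alongside an InteractionBehaviour to
// make a shape change color and grow while a hand hovers near it, and react
// again when a fingertip actually touches it.
[RequireComponent(typeof(InteractionBehaviour))]
public class ShapeReaction : MonoBehaviour {

  public Color restColor = Color.white;
  public Color hoverColor = Color.cyan;
  public float hoverScale = 1.2f;

  private InteractionBehaviour _intObj;
  private Material _material;
  private Vector3 _restScale;

  void Awake() {
    _intObj = GetComponent<InteractionBehaviour>();
    _material = GetComponent<Renderer>().material;
    _restScale = transform.localScale;

    // Subscribe to hover / contact events raised by the Interaction Engine.
    _intObj.OnHoverBegin += HandleHoverBegin;
    _intObj.OnHoverEnd += HandleHoverEnd;
    _intObj.OnContactBegin += HandleContactBegin;
  }

  private void HandleHoverBegin() {
    // A nearby hand makes the shape light up and swell slightly.
    _material.color = hoverColor;
    transform.localScale = _restScale * hoverScale;
  }

  private void HandleHoverEnd() {
    _material.color = restColor;
    transform.localScale = _restScale;
  }

  private void HandleContactBegin() {
    // A direct touch gets a stronger reaction than hover alone.
    _material.color = Color.Lerp(hoverColor, Color.white, 0.5f);
  }
}
```

Because the Interaction Engine reports hover and contact per object, each shape in the pool can carry its own copy of a script like this and react independently as the hand sweeps across the sculpture.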

With the next generation of mobile VR/AR experiences on the horizon, our team is constantly pushing the boundaries of our VR UX developer toolkit. Recently we created a quick VR sculpture prototype that combines the latest and greatest of these tools.

The Leap Motion Interaction Engine lets developers give their virtual objects the ability to be picked up, thrown, nudged, swatted, smooshed, or poked. With our new Graphic Renderer Module, you also have access to weapons-grade performance optimizations for power-hungry desktop VR and power-ravenous mobile VR.
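In practice the grasping side is usually configured in the Unity inspector, but as a minimal sketch of what an object needs, here is one hypothetical setup helper. It assumes an InteractionManager is already in the scene; GraspableSetup and MakeGraspable are made-up names for illustration:

```csharp
using UnityEngine;
using Leap.Unity.Interaction;

// Minimal setup sketch: given a primitive with a Collider, add the components
// a graspable sculpture piece typically needs. Not the project's actual code.
public static class GraspableSetup {

  public static InteractionBehaviour MakeGraspable(GameObject shape) {
    // A Rigidbody lets the physics engine move the object when it is
    // thrown, nudged, or swatted.
    var body = shape.GetComponent<Rigidbody>();
    if (body == null) body = shape.AddComponent<Rigidbody>();
    body.useGravity = false;   // floating sculpture pieces

    // The InteractionBehaviour registers the object with the Interaction
    // Engine so hands can hover over, touch, and grasp it.
    var intObj = shape.GetComponent<InteractionBehaviour>();
    if (intObj == null) intObj = shape.AddComponent<InteractionBehaviour>();
    return intObj;
  }
}
```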

In this post, we’ll walk through a small project built using these tools. This will provide a technical and workflow overview as one example of what’s possible – plus some VR UX design exploration and performance optimizations along the way.

As humans, we are spatial, physical thinkers. From birth we grow to understand the objects around us by the rules that govern how they move, and how we move them. These rules are so fundamental that we design our digital realities to reflect human expectations about how things work in the real world. At Leap […]

Explorations in VR Design is a journey through the bleeding edge of VR design – from architecting a space, to designing groundbreaking interactions, to making users feel powerful. Virtual reality is a world of specters, filled with sights and sounds that offer no physical resistance when you reach towards them. This means that any interactive […]

Following up on last week’s release of the Leap Motion Interaction Engine, I’m excited to share Weightless: Remastered, a major update to my project that won second place in the first-ever 3D Jam. A lot has changed since then! In this post, we’ll take a deeper look at the incredible power and versatility of the […]

Update (6/8/17): Interaction Engine 1.0 is here! Read more on our release announcement: blog.leapmotion.com/interaction-engine. Game physics engines were never designed for human hands. In fact, when you bring your hands into VR, the results can be dramatic. Grabbing an object in your hand or squishing it against the floor, you send it flying as the physics […]

Many of the assets in this post have been updated since 2016! For the latest, see developer.leapmotion.com/guide. True hand presence in VR is incredibly powerful – and easier than ever. With the Leap Motion Unity Core Assets and Modules, you can start building right away with features like custom-designed hands, user interfaces, and event triggers. […]

In the physical world, our hands exert force on other objects. When we grasp an object, we exert force through our fingers and palms (using precision, power, or scissor grips), and the object pushes back. Gravity can also help us hold items in the palms of our hands.

But not in the digital world! In the frictionless space of the virtual, anything can happen. Gravity is optional. Magic is feasible.

Interacting with your computer is like reaching into another universe – one with an entirely different set of physical laws. Interface design is the art of creating digital rules that sync with our physical intuitions to bring both worlds closer together. We realize that most developers don’t want to spend days fine-tuning hand interactions, so we decided to design a framework that will accelerate development time and ensure more consistent interactions across apps.