As humans, we are spatial, physical thinkers. From birth we grow to understand the objects around us by the rules that govern how they move, and how we move them. These rules are so fundamental that we design our digital realities to reflect human expectations about how things work in the real world.

At Leap Motion, our mission is to empower people to interact seamlessly with the digital landscape. This starts with tracking hands and fingers with such speed and precision that the barrier between the digital and physical worlds begins to blur. But hand tracking alone isn’t enough to capture human intention. In the digital world there are no physical constraints. We make the rules. So we asked ourselves: How should virtual objects feel and behave?

We’ve thought deeply about this question, and in the process we’ve created new paradigms for digital-physical interaction. Last year, we released an early access beta of the Leap Motion Interaction Engine, a layer that exists between the Unity game engine and real-world hand physics. Since then, we’ve worked hard to make the Interaction Engine simpler to use – tuning how interactions feel and behave, and creating new tools to make it performant on mobile processors.

Today, we’re excited to release a major upgrade to this toolkit. It updates the engine’s fundamental physics functionality and makes it easy to create the physical user experiences that work best in VR. Because we see the power in extending VR and AR interaction across both hands and tools, the Interaction Engine now also works seamlessly with PC handheld controllers. We’ve heard from many developers about the challenge of supporting multiple inputs, and this makes it easier to support hand tracking alongside the Oculus Touch or Vive controllers.

Let’s take a deeper look at some of the new features and functions in the Interaction Engine.

Contact, Grasp, Hover

The fundamental purpose of the Interaction Engine is to handle interactions with digital objects. Some of these are straightforward, others more complex. For example, consider:

Contact: what happens when a user passes their hand through an object?
Grasp: what does it mean to naturally grab and release a virtual object?
Hover: how can I be sure that the object I’m reaching for is the one I actually want to interact with?
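In Unity, these three interaction states surface as callbacks on an object managed by the Interaction Engine. The sketch below shows the general shape of subscribing to them; the event names follow the `InteractionBehaviour` API as shipped in Leap Motion’s Unity Modules, but exact names and signatures may differ across versions, so treat this as illustrative rather than definitive.

```csharp
using UnityEngine;
using Leap.Unity.Interaction;

// Illustrative sketch: reacting to contact, grasp, and hover events
// on an object that has an InteractionBehaviour component attached.
public class InteractionLogger : MonoBehaviour {
  void Start() {
    var intObj = GetComponent<InteractionBehaviour>();

    // Contact: the hand is physically touching the object.
    intObj.OnContactBegin += () => Debug.Log("Contact began");

    // Grasp: the engine's heuristics decided the hand grabbed the object.
    intObj.OnGraspBegin += () => Debug.Log("Grasped");
    intObj.OnGraspEnd   += () => Debug.Log("Released");

    // Primary hover: the object the hand is most likely to interact
    // with next -- useful for highlighting before any contact occurs.
    intObj.OnPrimaryHoverBegin += () => Debug.Log("Primary hover began");
  }
}
```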

We want users to have consistent experiences in these cases across applications. And we want you as a developer to be able to focus on the content and experience, rather than getting lost in the weeds creating grabbing heuristics.

Physical User Interfaces

Users expect to interact physically with both objects and interfaces. That’s why we’ve built a powerful user interface module into the Interaction Engine, so developers can create and customize reliable interfaces that are a delight to use. These interfaces are physically inspired, allowing users to understand the system on their first touch.

Widgets and Wearable Interfaces

In addition to traditional user interfaces, we’ve added support for more forward-looking interfaces like wearables and widgets. For example, you can now create an interface that is worn on the hand but expands into freestanding palettes as the user grabs an element off the hand and places it in the world.

Graphic Renderer

Alongside the Interaction Engine, we’re also releasing a beta version of an advanced new tool – the Graphic Renderer. As we push the boundaries of VR hardware, software, and design, we often develop internal tools that would be of great use to the broader VR community.

In building the Interaction Engine, we found it was important to render and interact with curved spaces for human-oriented user interfaces, and wanted to do it in a way that was performant even in very constrained environments. So we created the Graphic Renderer, a general-purpose tool that can curve an entire user interface with ease, and render it all in a single draw call. Designed to unlock new levels of performance in the upcoming generation of mobile and all-in-one headsets, it tightly pairs with the Interaction Engine to bring curved spaces to VR interaction.

Scene built with the Graphic Renderer and rendered in a single draw call.

We see the Interaction Engine and Graphic Renderer as fundamental tools for all of VR that enable you to create natural and compelling interactions in a reliable and performant package. Now it’s in your hands – we can’t wait to see what you build with it.