As humans, we are spatial, physical thinkers. From birth we grow to understand the objects around us by the rules that govern how they move, and how we move them. These rules are so fundamental that we design our digital realities to reflect human expectations about how things work in the real world.
At Leap Motion, our mission is to empower people to interact seamlessly with the digital landscape. This starts with tracking hands and fingers with such speed and precision that the barrier between the digital and physical worlds begins to blur. But hand tracking alone isn’t enough to capture human intention. In the digital world there are no physical constraints. We make the rules. So we asked ourselves: How should virtual objects feel and behave?
We’ve thought deeply about this question, and in the process we’ve created new paradigms for digital-physical interaction. Last year, we released an early access beta of the Leap Motion Interaction Engine, a layer that exists between the Unity game engine and real-world hand physics. Since then, we’ve worked hard to make the Interaction Engine simpler to use – tuning how interactions feel and behave, and creating new tools to make it performant on mobile processors.
Today, we’re excited to release a major upgrade to this toolkit. It updates the engine’s fundamental physics functionality and makes it easy to create the physical user experiences that work best in VR. Because we see the power in extending VR and AR interaction across both hands and tools, we’ve also made the engine work seamlessly with hands and handheld PC controllers. We’ve heard from many developers about the challenge of supporting multiple inputs, and this feature makes it easier to support hand tracking alongside the Oculus Touch or Vive controllers.
Let’s take a deeper look at some of the new features and functions in the Interaction Engine.
Contact, Grasp, Hover
The fundamental purpose of the Interaction Engine is to handle interactions with digital objects. Some of these are straightforward, others more complex. For example, consider:
Contact: what happens when a user passes their hand through an object?
Grasp: what does it mean to naturally grab and release a virtual object?
Hover: how can I be sure that the object I’m contacting is the one I actually want to interact with?
We want users to have consistent experiences in these cases across applications. And we want you as a developer to be able to focus on the content and experience, rather than getting lost in the weeds creating grabbing heuristics.
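Here’s a minimal sketch of what listening for these events looks like in Unity. It assumes the InteractionBehaviour component and its Action-style events (OnHoverBegin, OnContactBegin, OnGraspBegin, OnGraspEnd); check the example scenes that ship with the module for the exact API in your version.

```csharp
using Leap.Unity.Interaction;
using UnityEngine;

// Logs hover, contact, and grasp events on a single interactive object.
// Event names assume the Interaction Engine's Unity package; verify
// against the examples shipped with your version.
[RequireComponent(typeof(InteractionBehaviour))]
public class InteractionLogger : MonoBehaviour {
  void Awake() {
    var intObj = GetComponent<InteractionBehaviour>();

    // Hover: a hand is near the object, but not necessarily touching it.
    intObj.OnHoverBegin += () => Debug.Log("Hover began");

    // Contact: the hand is physically touching the object.
    intObj.OnContactBegin += () => Debug.Log("Contact began");

    // Grasp: the object is held and follows the hand until released.
    intObj.OnGraspBegin += () => Debug.Log("Grasp began");
    intObj.OnGraspEnd   += () => Debug.Log("Grasp ended");
  }
}
```

Hover is what lets an application disambiguate intent: of several nearby objects, only the hovered one needs to respond to an approaching hand.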
Physical User Interfaces
Users expect to interact physically with both objects and interfaces. That’s why we’ve built a powerful user interface module into the Interaction Engine, so developers can create and customize reliable interfaces that are a delight to use. These are physically inspired, allowing users to understand the system on their first touch.
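As a rough illustration, wiring one of these physical elements into game logic might look like the following; the InteractionButton component and its OnPress/OnUnpress events follow the names used in the module’s examples, so treat this as a sketch rather than a recipe.

```csharp
using Leap.Unity.Interaction;
using UnityEngine;

// Toggles a light with a depressible, physics-driven button.
// Component and event names assume the Interaction Engine's UI module.
public class LightSwitch : MonoBehaviour {
  public InteractionButton button; // assign in the inspector
  public Light roomLight;

  void Start() {
    // The button depresses under the finger like a real switch,
    // then fires these events at the press and release thresholds.
    button.OnPress   += () => roomLight.enabled = true;
    button.OnUnpress += () => roomLight.enabled = false;
  }
}
```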
Widgets and Wearable Interfaces
In addition to traditional user interfaces, we’ve added support for more forward-looking user interfaces like wearables and widgets. For example, you now have the ability to create an interface that is worn on the hand, but expands into freestanding palettes as the user grabs an element off the hand and places it in the world.
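A hypothetical sketch of that grab-off-the-hand pattern appears below. The detach-distance threshold and re-parenting logic are our own illustration of one way to build it, not a built-in feature of the module.

```csharp
using Leap.Unity.Interaction;
using UnityEngine;

// A palette element worn on the hand: grab it and release it far from
// the hand to leave it free-standing in the world; release it close
// by and it snaps back onto the palette. Threshold values are arbitrary.
[RequireComponent(typeof(InteractionBehaviour))]
public class DetachablePaletteItem : MonoBehaviour {
  public Transform handAnchor;         // where the item rests on the hand
  public float detachDistance = 0.15f; // meters; beyond this, it detaches

  private InteractionBehaviour _intObj;

  void Awake() {
    _intObj = GetComponent<InteractionBehaviour>();
    _intObj.OnGraspEnd += OnReleased;
  }

  void Update() {
    // While attached and not held, stay pinned to the hand anchor.
    if (!_intObj.isGrasped && transform.parent == handAnchor) {
      transform.localPosition = Vector3.zero;
    }
  }

  private void OnReleased() {
    if (Vector3.Distance(transform.position, handAnchor.position) > detachDistance) {
      transform.parent = null;         // leave it standing in the world
    } else {
      transform.parent = handAnchor;   // snap back onto the palette
      transform.localPosition = Vector3.zero;
    }
  }
}
```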
Graphic Renderer
Alongside the Interaction Engine, we’re also releasing a beta version of an advanced new tool – the Graphic Renderer. As we push the boundaries of VR hardware, software, and design, we often develop internal tools that would be of great use to the broader VR community.
In building the Interaction Engine, we found it was important to render and interact with curved spaces for human-oriented user interfaces, and wanted to do it in a way that was performant even in very constrained environments. So we created the Graphic Renderer, a general-purpose tool that can curve an entire user interface with ease, and render it all in a single draw call. Designed to unlock new levels of performance in the upcoming generation of mobile and all-in-one headsets, it tightly pairs with the Interaction Engine to bring curved spaces to VR interaction.
Scene built with the Graphic Renderer and rendered in a single draw call.
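In script terms, a scene like the one above is built from a small component hierarchy. The names below (LeapGraphicRenderer, LeapCylindricalSpace, LeapMeshGraphic) follow the beta package, and in practice most of this is authored in the Unity editor rather than from code, so read this as a sketch of the structure, not a runtime recipe.

```csharp
using Leap.Unity.GraphicalRenderer;
using UnityEngine;

// Sketch of the hierarchy behind a curved, single-draw-call interface.
public class CurvedUiSetup : MonoBehaviour {
  void Reset() { // editor-time callback, fired when the component is added
    // One renderer on the root batches every child graphic together --
    // this is what lets the whole interface draw in a single call.
    gameObject.AddComponent<LeapGraphicRenderer>();

    // A curved "Leap space" on the same root warps all child graphics
    // onto a cylinder around the user.
    gameObject.AddComponent<LeapCylindricalSpace>();

    // Each UI element is a child carrying a graphic component; the
    // renderer deforms and batches these instead of drawing each one.
    var panel = new GameObject("Panel", typeof(LeapMeshGraphic));
    panel.transform.SetParent(transform, worldPositionStays: false);
  }
}
```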
We see the Interaction Engine and Graphic Renderer as fundamental tools for all of VR that enable you to create natural and compelling interactions in a reliable and performant package. Now it’s in your hands – we can’t wait to see what you build with it.
@leapmotion where is the love for @UnrealEngine
Curious about this as well.
We have a massive upgrade of our Unreal integration in the works right now 😀
Thank you for the info 🙂
I’m glad to hear that because there’s so much more I can do with UE4.
I’m also interested in seeing this functionality in my Mac apps. When can we expect that?
Do you have any information on when that will be released?
Sweet! I’ll be testing soon with Python (MicroPython) and the ESP32-WROOM. Thank you, everyone.
It’s like the TMIN video player interface here: https://www.youtube.com/watch?v=RBScOtZdrNg
[…] Update (June 8, 2017): The UI Input Module has been deprecated, as it is now part of the Leap Motion Interaction Engine. Learn more on our blog. […]
[…] Update (June 8, 2017): Detection Utilities functions are now best handled by the Interaction Engine (e.g. Camera-Facing Open Hand exists in example scripts in the Interaction Manager). Learn more on our blog. […]
[…] (and sometimes breaking a few laws of physics) to create something truly human. With last week’s Leap Motion Interaction Engine 1.0 release, VR developers now have access to unprecedented physical interfaces and interactions – including […]
[…] these new major improvements, the opportunities for interacting with the digital world become more apparent. Rest assured there […]
This is badass
[…] We’ve already talked about its ambient-inspired soundtrack, but you might be surprised to learn the sound effects in Blocks were one of our biggest development challenges – second only to the physical object interactions, an early prototype of the Leap Motion Interaction Engine. […]
I’m trying to do something very similar to that example gif you have for the Graphic Renderer. Is there a chance you can release the unity example scene for that? thanks!
[…] built the Leap Motion Interaction Engine to give developers the power to define the physical laws of the virtual universe. It unlocks […]
[…] Update (6/8/17): Interaction Engine 1.0 is here! Read more on our release announcement: blog.leapmotion.com/interaction-engine […]
[…] Update (June 8, 2017): The UI Input Module has been deprecated, as 3D interfaces are now handled by the Leap Motion Interaction Engine. Learn more on our blog. […]
[…] wants to avoid all of this, and that’s why it has created its own interaction engine, which this week reached version 1.0 […]
How can I pass interaction feedback to a microcontroller like an Arduino to trigger an action? I mean, to control a servo when interacting with a virtual object that has an interaction behavior.
By “interaction feedback”, do you mean a set vibration for the motor when a specific interaction is triggered (or on certain objects)?
Yes sir, when I grasp or hold a virtual object, the action should trigger a motor.
If I release the virtual object, the motor should stop working.
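One way to wire that up (a sketch, not official guidance): subscribe to the object’s grasp events in Unity and write a byte over a serial connection, and let an Arduino sketch on the other end start or stop the servo. The event names assume the Interaction Engine’s InteractionBehaviour; the port name “COM3” and the baud rate are placeholders for your own setup.

```csharp
using System.IO.Ports;            // needs Unity's full .NET API level
using Leap.Unity.Interaction;
using UnityEngine;

// Sends 1 on grasp and 0 on release; the Arduino reads the byte and
// drives (or stops) the servo accordingly.
[RequireComponent(typeof(InteractionBehaviour))]
public class ServoBridge : MonoBehaviour {
  private SerialPort _port;

  void Awake() {
    _port = new SerialPort("COM3", 9600); // placeholder port and baud rate
    _port.Open();

    var intObj = GetComponent<InteractionBehaviour>();
    intObj.OnGraspBegin += () => _port.Write(new byte[] { 1 }, 0, 1); // start motor
    intObj.OnGraspEnd   += () => _port.Write(new byte[] { 0 }, 0, 1); // stop motor
  }

  void OnDestroy() {
    if (_port != null && _port.IsOpen) _port.Close();
  }
}
```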
[…] with your bare hands is an incredibly complex task. This is one of the reasons we developed the Leap Motion Interaction Engine, whose purpose is to make the foundational elements of grabbing and releasing virtual objects feel […]