Update (6/8/17): Interaction Engine 1.0 is here! Read more on our release announcement: blog.leapmotion.com/interaction-engine
Game physics engines were never designed for human hands, so when you bring your hands into VR, the results can be dramatic. Grab an object or squish it against the floor, and you send it flying as the physics engine desperately tries to keep your fingers out of it.
But by exploring the grey areas between real-world and digital physics, we can build a more human experience. One where you can reach out and grab something – a block, a teapot, a planet – and simply pick it up. Your fingers phase through the material, but the object still feels real. Like it has weight.
Beneath the surface, this is an enormously complex challenge. Over the last several months, we’ve been boiling that complexity down to a fundamental tool that Unity developers can rapidly build with. Today we’re excited to share an early access beta of our Interaction Engine, now available as a Module for our Unity Core Assets.
How It Works
The Interaction Engine is a layer that exists between the Unity game engine and real-world hand physics. To make object interactions work in a way that satisfies human expectations, it implements an alternate set of physics rules that take over when your hands are embedded inside a virtual object. The results would be impossible in reality, but they feel more satisfying and easier to use. Our Blocks demo is built with an early prototype of this engine, which has been designed for greater extensibility and customization.
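The core idea can be sketched in a few lines. The snippet below is a conceptual Python illustration only, not Leap Motion's implementation; the function name and tuning constants are invented. A conventional rigid-body solver pushes an object out of anything penetrating it with a response proportional to penetration depth, which is exactly what launches objects across the room; capping that response keeps an embedded hand stable:

```python
# Conceptual sketch of a "soft" penetration response (illustrative only).
# A naive solver ejects an object with velocity proportional to how deeply
# a finger penetrates it; clamping that velocity turns a violent ejection
# into a gentle nudge, so hands can phase through virtual matter.

def corrective_velocity(penetration_depth, stiffness=50.0, max_speed=0.5):
    """Outward velocity (m/s) applied to a penetrated object.

    `stiffness` and `max_speed` are made-up tuning constants.
    """
    naive = stiffness * penetration_depth  # classic penalty response
    return min(naive, max_speed)           # the "soft" clamp

print(corrective_velocity(0.10))   # 0.5 (clamped; the naive response would be 5 m/s)
print(corrective_velocity(0.002))  # ~0.1 (shallow contact behaves normally)
```

Because shallow contacts fall below the clamp, ordinary resting and pushing interactions are unaffected; only deeply embedded hands trigger the alternate behavior.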
The Interaction Engine is designed to handle object behaviors, as well as detect whether an object is being grasped. This makes it possible to pick things up and hold them in a way that feels truly solid. It also uses a secondary real-time physics representation of the hands, opening up more subtle interactions.
Our goal with the Interaction Engine is for integration to be quick and easy. However, it also allows for a high degree of customization across a wide range of features. You can modify the properties of an object interaction, including the desired position when grasped, how the object moves to that position, what happens when tracking is momentarily lost, throwing velocity, and the layer transitions that govern how collisions work. Learn more about building with the Interaction Engine in our Unity documentation.
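As a rough model of the "desired position when grasped" idea, a grasped object can be driven a fraction of the remaining distance toward its target pose each physics step instead of being rigidly parented to the hand. This is a hypothetical Python sketch, not the module's actual API; `follow_strength` is an invented parameter:

```python
# Illustrative grasp-follow: each physics step, move the held object a
# fixed fraction of the remaining distance toward the pose implied by
# the grasping hand. The object converges smoothly instead of snapping.

def follow_step(obj_pos, target_pos, follow_strength=0.3):
    """Return obj_pos moved follow_strength of the way to target_pos."""
    return tuple(o + follow_strength * (t - o)
                 for o, t in zip(obj_pos, target_pos))

pos = (0.0, 0.0, 0.0)      # object position (meters)
target = (1.0, 0.0, 0.0)   # desired grasped position
for _ in range(10):        # ten physics steps
    pos = follow_step(pos, target)
print(round(pos[0], 3))    # 0.972 (nearly at the target, without teleporting)
```

A follow rate below 1.0 also degrades gracefully when tracking drops for a frame: the object simply pauses its convergence rather than jumping.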
Interaction Engine 101
Without the Interaction Engine, hands in VR can feel like one of those late-night infomercials where people can’t tie their own shoes. Now available on GitHub, Interaction Engine 101 is a quick introduction that lets you compare interactions with the Interaction Engine turned on or off:
Grasping and picking up an object is the most fundamental element of the Interaction Engine. With normal game physics, the object springs from your hand and flies around the room. The Interaction Engine makes it feel easy and natural.
The ability to pick up an object also extends to higher-level interactions, like stacking.
Standard rigidbodies will violently try to escape if you compress them into the floor. With the Interaction Engine, they take on new elastic properties, allowing your hands to dynamically phase through virtual matter.
The Interaction Engine also allows you to customize throwing physics. Without it, you could probably throw an object, but it would be extremely difficult.
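One plausible reading of customizable throwing physics (and a common technique in hand-tracked apps generally) is to estimate release velocity from a short window of recent hand positions rather than from a single noisy frame. A minimal Python sketch, with invented names:

```python
# Illustrative throw-velocity estimate: average hand velocity over the
# last few tracking frames, so one noisy frame at release doesn't send
# the object off in a random direction.

from collections import deque

def release_velocity(samples, dt):
    """Average velocity over (x, y, z) samples spaced dt seconds apart."""
    if len(samples) < 2:
        return (0.0, 0.0, 0.0)
    span = (len(samples) - 1) * dt
    first, last = samples[0], samples[-1]
    return tuple((b - a) / span for a, b in zip(first, last))

# 90 Hz tracking, hand moving 0.02 m per frame along x:
window = deque(((i * 0.02, 0.0, 0.0) for i in range(5)), maxlen=5)
print(release_velocity(window, 1 / 90))  # ~(1.8, 0.0, 0.0) m/s
```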
This early beta of the Interaction Engine works well with the types of objects you see in these scenes – namely cubes and spheres around 1-2 inches in size. Game objects of differing shapes, sizes, and physics settings may have different results. We want to hear about your experience with the Interaction Engine so we can continue to make improvements.
Ready to experiment? Download the Unity Core Assets and Interaction Engine Module, check out the documentation, and share your feedback in the comments below or on our community forums!
MAN
I hate you, Leap Motion… I’ve spent days developing ways to grab items in my game.
But I love you for releasing this now <3
We hate love you too!
Great, now I can stop using MagneticPinch from the previous SDK, and continue creating an office simulator 🙂
https://developer.leapmotion.com/documentation/unity/unity/Unity.MagneticPinch.html
https://www.youtube.com/watch?v=n1_y27NCvlY
Ha, nice! You should submit that to the developer gallery when it’s ready: https://developer.leapmotion.com/gallery/
I’m on the hate bus with kesadisan on this one.
I spent the last month developing my own system for picking up and moving objects. One day after I finish my project, you release this.
Thanks so much for it… but screw you too 🙂
[…] with hand-tracked controls and is now available as a Unity module. Leap announced the launch in a new blog post that goes into detail about what the Interaction Engine means for hand-tracked […]
Someone should introduce it into e-book readers. Not that it would really be very useful there, but just for the show of turning paper pages.
Maybe some visual novel players; there was something open source in Python…
As a user, is this for the entire platform or just VR? Will this affect previous releases of content?
Hey developers, I am dyslexic, and something all dyslexics have is dysgraphia. This is a disorder involving the hands that hampers fine control of the fingers. Being able to manipulate objects this way should take a lot of the stress out of the hands. There is a wide market for tools out there beyond games. Thanks for the help and best of luck to you all.
The Interaction Engine is built in the Unity game engine, and is designed for VR. Previous content releases are unaffected because it needs to be built in at the development level. We think that VR/AR has massive potential for everything from games to physiotherapy to social applications to education — all the different ways that people interact with the world around them. Definitely excited to see what applications will unlock new levels of human potential 😀
Any plans for a UE4 version of the Interaction Engine?
[…] Motion released a new beta “Interaction Engine” Unity asset that improves the experience of grabbing objects in […]
[…] for developers to build VR environments can manipulate sophisticatedly with their hands. In a blog post, the company calls the engine “a layer that exists between the Unity game engine and real-world […]
[…] Back in January, Leap Motion announced an all-new physics engine based on gesture-interaction control, the “Interactive Engine.” Recently, this long-anticipated Interaction Engine was officially released. Leap Motion […]
[…] with a sample project that’s compatible with Oculus Rift and HTC Vive. For the rest of us, the announcement blog post features a few example GIFs. Here’s one, without the engine turned […]
[…] an Oculus Rift or HTC Vive — it’s developed a new piece of software called the “Interaction Engine.” Available as an add-on for Unity, it promises a more realistic experience while interacting […]
Any plans for a UE4 version of the Interaction Engine?
We would like to have parity between our Unreal and Unity support, but no ETA currently.
What is the status on Interaction Engine for UE4?
There’s already a major update in the works that’s basically ready as far as LeapC integration, timewarp, etc. but the Interaction Engine portion will take some time after that. Still no ETA, but it’s moved from “we’d like to have it” to “let’s make this happen.”
[…] “To make object interactions work in a way that satisfies human expectations, [the engine] implements an alternate set of physics rules that take over when your hands are embedded inside a virtual object,” said the company in a blog post. […]
[…] a blog post, Leap Motion calls its engine “a layer that exists between the Unity game engine and […]
[…] real-world and digital physics, we can build a more human experience,” Leap Motion said in a blog post. “One where you can reach out and grab something—a block, a teapot, a planet—and […]
[…] considerable, showing up in demos and test kits, but just now a beta of the Interaction Engine has been released, with which they aim to make the integration of hand control and movement in games […]
[…] Motion’s detection of users’ real-world hand movements to enable virtual reality interaction. (Leap Motion, Engadget, […]
[…] Engine” that promises a more realistic experience while interacting with the VR objects. (Leap Motion, Engadget, […]
[…] source: Leap Motion […]
[…] lifelike and easy to use,” the blog says […]
[…] released an early beta version of the Interaction Engine. In that blog post […]
[…] computer hardware sensors that are capable of sensing finger motions as input has introduced an early beta version of their Interaction Engine to make it easier for developers to build VR environments with hand […]
[…] flying as the physics engine desperately tries to keep your fingers out of it,” Leap Motion elaborates on the […]
[…] Motion is well aware of this problem and have announced a beta for what they are calling their Interaction Engine to help fix it. What the Interaction Engine does […]
[…] At present the Unity extension is still being tested and works well with objects 1–2 inches in size. These are fairly serious limitations, but this is only a first experiment; the technology will be refined in the future. For now this is a niche market: very few people own high-end VR headsets, and only a tiny fraction of them have bought a Leap Motion controller. Perhaps in the future the sensor will become part of the standard kit for virtual reality headsets. A demonstration of the technology can be seen in the GIF animations on the official Leap Motion site. […]
[…] The Interaction Engine has just been released as an Early Access Beta. It is the latest evolution of Leap Motion’s hand-tracking technologies, available as a Unity module. The release of this first version of the Interaction Engine was announced in this post. […]
[…] one of its blogs, Leap Motion has revealed a lot of information about its new Interaction Engine, wherein it says […]
[…] to handle object behaviors as well as detect whether an object is being grasped,” reads a recent blog post introducing the Interaction Engine, “This makes it possible to pick things up and hold them […]
[…] up on last week’s release of the Leap Motion Interaction Engine, I’m excited to share Weightless: Remastered, a major update to my project that won second place […]
[…] contributing to virtual interfaces, which recently published an article presenting its Interaction Engine (EN), a way to simplify developers’ work when implementing […]
[…] article from LEAP MOTION official blog […]
[…] To demonstrate the Interaction Engine, Leap Motion has written a small sample application that, using simple geometric […]
[…] Well let’s look at that third function of our physical spoon – being held. There are many ways we could handle grabbing a virtual spoon within a VR experience. From the crudest, touching the spoon snap attaches it to your hand/controller, to an extremely nuanced simulation of real-life grabbing as detailed in building the Leap Motion Interaction Engine. […]
I had this working like a week ago, but today I tried it again in a new project, and every time I try to touch an object it goes flying across the screen for some inexplicable reason. Has anybody had this happen before?
[…] Introducing the Interaction Engine: Early Access Beta […]
[…] for stretching or pulling at a small target, such as a small object or part of a larger object. Grab interactions are broader and allow users to interact directly with a larger […]
[…] this month, we’ll be demoing this system at major VR events with an enhanced version of our Interaction Engine and flagship Blocks […]
[…] Finally, they tell us they have built a reference system that sits on top of the Gear VR, which starting this month they will begin sending to major manufacturers, as well as showing it at major virtual reality events with an improved version of their Interaction Engine. […]
[…] Reference versions of the sensor currently in development, running on Gear VR, have already been sent to headset manufacturers. Starting this month it will be available to try at major VR events, together with the upgraded Interaction Engine and the Blocks demo. […]
[…] Update: This example has been deprecated, as the Leap Motion Interaction Engine makes these types of interactions much easier to design! Learn more in our blog post Introducing the Leap Motion Interaction Engine. […]
[…] is the philosophy behind the Leap Motion Interaction Engine, which is built to handle low-level physics interactions and make them feel familiar. The […]
[…] Introducing the Interaction Engine / Documentation / Interaction Engine […]
[…] Introducing the Interaction Engine […]
[…] Interaction Engine is designed to handle incredibly complex object interactions, making them feel simple and fluid. […]
[…] #MobileVR From the Mobile VR Platform that can be embedded into any headset, to our Interaction Engine that makes virtual worlds react in human ways, we’re continuing to push the boundaries of […]
[…] way in which new technologies are building deeper engagement within VR experiences. Shops like Leap Motion Interaction Engine are beta testing ways to make interacting in VR feel like a “more human experience.” Using […]
[…] year, digital-physical interaction pioneer Leap Motion released an early access beta of the Interaction Engine. In providing developers a way to incorporate scanning technology into Unity, […]
Does it work on Unreal Engine?
Unreal Engine?
[…] don’t move when touched, but it also happens with interactive objects. Two core features of the Leap Motion Interaction Engine, soft contact and grabbing, almost always result in the user’s hand penetrating the geometry of […]
This new update is incredible. Congratulations on the great work!