Game physics engines were never designed for human hands. Bring your hands into VR and the results can be dramatic: grab an object or squish it against the floor, and it goes flying as the physics engine desperately tries to keep your fingers out of it.
But by exploring the grey areas between real-world and digital physics, we can build a more human experience. One where you can reach out and grab something – a block, a teapot, a planet – and simply pick it up. Your fingers phase through the material, but the object still feels real. Like it has weight.
Beneath the surface, this is an enormously complex challenge. Over the last several months, we’ve been boiling that complexity down to a fundamental tool that Unity developers can rapidly build with. Today we’re excited to share an early access beta of our Interaction Engine, now available as a Module for our Unity Core Assets.
How It Works
The Interaction Engine is a layer that exists between the Unity game engine and real-world hand physics. To make object interactions work in a way that satisfies human expectations, it implements an alternate set of physics rules that take over when your hands are embedded inside a virtual object. The results would be impossible in reality, but they feel more satisfying and easy to use. Our Blocks demo was built with an early prototype of this engine; this release has been redesigned for greater extensibility and customization.
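To make the idea of "alternate physics rules taking over" concrete, here is a minimal Python sketch of a mode switch for a single hand-object contact. This is purely illustrative, not Leap Motion's implementation: the function name, thresholds, and force model are all assumptions.

```python
def contact_response(penetration, normal_velocity,
                     soft_threshold=0.005, stiffness=50.0):
    """Choose a physics mode for one hand-object contact.

    Below the threshold, ordinary rigid-body rules apply: a collision
    impulse cancels the approaching velocity along the contact normal.
    Once the hand is embedded deeper than the threshold, an alternate
    'soft' rule takes over: instead of an ejecting impulse, a gentle,
    capped spring force holds the object near the hand.

    penetration      -- overlap depth in metres (illustrative units)
    normal_velocity  -- object velocity along the contact normal
    """
    if penetration < soft_threshold:
        # Standard response: zero-restitution impulse pushing the
        # object out of the hand.
        impulse = -normal_velocity
        return ("rigid", impulse)
    # Alternate response: force grows with penetration but is capped,
    # so a deeply embedded object never gets launched across the room.
    force = min(stiffness * penetration, 1.0)
    return ("soft", force)
```

The key design choice is the cap: a conventional solver scales its response with penetration depth without limit, which is exactly what sends grabbed objects flying.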
The Interaction Engine is designed to handle object behaviors, as well as detect whether an object is being grasped. This makes it possible to pick things up and hold them in a way that feels truly solid. It also uses a secondary real-time physics representation of the hands, opening up more subtle interactions.
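One common way to detect a grasp, shown here as a hedged Python sketch (this heuristic and its threshold are my assumptions, not the engine's actual logic), is to check whether finger contacts press on the object from roughly opposite directions:

```python
def is_grasped(contact_normals, opposition_threshold=-0.7):
    """Heuristic grasp check: the object counts as grasped when at
    least two finger contacts push on it from roughly opposite sides.

    contact_normals -- unit vectors pointing from each finger contact
                       into the object; two nearly antiparallel
                       normals (dot product near -1) mean the fingers
                       are squeezing the object between them.
    """
    for i in range(len(contact_normals)):
        for j in range(i + 1, len(contact_normals)):
            ax, ay, az = contact_normals[i]
            bx, by, bz = contact_normals[j]
            dot = ax * bx + ay * by + az * bz
            if dot < opposition_threshold:
                return True
    return False
```

A single finger resting on an object never triggers a grasp under this rule, which matches the intuition that picking something up requires squeezing it.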
Our goal with the Interaction Engine is for integration to be quick and easy. However, it also allows for a high degree of customization across a wide range of features. You can modify the properties of an object interaction, including the desired position when grasped, how the object moves to that position, what happens when tracking is momentarily lost, throwing velocity, and the layer transitions that govern how collisions work. Learn more about building with the Interaction Engine in our Unity documentation.
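"Moving the object to the desired position" is typically done by driving it with a velocity rather than teleporting it, so it stays inside the physics simulation. The sketch below illustrates that idea in plain Python; the function name, the clamp value, and the tuple-based vectors are mine, not part of the engine's API.

```python
def follow_velocity(obj_pos, target_pos, dt, max_speed=2.0):
    """Velocity that carries a grasped object toward its desired
    grasp position over one physics step of length dt.

    Driving the object by velocity (instead of snapping its position)
    lets it keep colliding and pushing like any other rigidbody while
    held.  Clamping the speed keeps a brief tracking glitch -- a hand
    that jumps across the scene for one frame -- from launching the
    object along with it.
    """
    vx = (target_pos[0] - obj_pos[0]) / dt
    vy = (target_pos[1] - obj_pos[1]) / dt
    vz = (target_pos[2] - obj_pos[2]) / dt
    speed = (vx * vx + vy * vy + vz * vz) ** 0.5
    if speed > max_speed:
        scale = max_speed / speed
        vx, vy, vz = vx * scale, vy * scale, vz * scale
    return (vx, vy, vz)
```

The same clamp doubles as a simple answer to "what happens when tracking is momentarily lost": when the hand reappears far away, the object catches up smoothly instead of teleporting.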
Interaction Engine 101
Without the Interaction Engine, hands in VR can feel like one of those late-night infomercials where people can’t tie their own shoes. Now available on GitHub, Interaction Engine 101 is a quick introduction that lets you compare interactions with the Interaction Engine turned on or off:
Grasping and picking up an object is the most fundamental element of the Interaction Engine. With normal game physics, the object springs from your hand and flies around the room. The Interaction Engine makes it feel easy and natural.
The ability to pick up an object also extends to higher-level interactions, like stacking.
Standard rigidbodies will violently try to escape if you compress them into the floor. With the Interaction Engine, they take on new elastic properties, allowing your hands to dynamically phase through virtual matter.
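The "elastic" feel comes from resolving overlap gradually rather than all at once. A minimal sketch of that idea (illustrative values and naming, not the engine's solver):

```python
def resolve_overlap(penetration, max_correction=0.002):
    """Push an object out of overlap a little at a time.

    A standard solver tries to remove the entire penetration in a
    single step, which is why a rigidbody squeezed against the floor
    shoots away.  Limiting the correction to a couple of millimetres
    per step lets hands phase through the object while it eases back
    out, which reads as elasticity rather than an explosion.

    Returns the list of penetration depths over successive steps.
    """
    depths = [penetration]
    while depths[-1] > 0:
        depths.append(max(0.0, depths[-1] - max_correction))
    return depths
```

Starting 5 mm deep, the object surfaces over a handful of physics steps instead of in one violent jump.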
The Interaction Engine also allows you to customize throwing physics. Without it, you could probably throw an object, but it would be extremely difficult.
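A common technique for believable throwing, sketched here in Python (the class and windowing scheme are my assumptions, not the engine's actual method), is to estimate release velocity from a short history of hand positions rather than the final frame alone:

```python
from collections import deque

class ThrowEstimator:
    """Estimate release velocity from recent hand positions.

    Tracking data is noisy frame to frame; averaging displacement
    over the last few frames means a thrown object leaves the hand
    in the direction the user actually swung, not wherever the final
    jittery frame happened to point.
    """

    def __init__(self, window=5):
        self.samples = deque(maxlen=window)  # (position, dt) pairs

    def record(self, position, dt):
        """Call once per frame while the object is held."""
        self.samples.append((position, dt))

    def release_velocity(self):
        """Average velocity across the stored window."""
        if len(self.samples) < 2:
            return (0.0, 0.0, 0.0)
        (x0, y0, z0), _ = self.samples[0]
        (x1, y1, z1), _ = self.samples[-1]
        total_dt = sum(dt for _, dt in list(self.samples)[1:])
        return ((x1 - x0) / total_dt,
                (y1 - y0) / total_dt,
                (z1 - z0) / total_dt)
```

On release, the estimated velocity is applied to the object as its initial throw velocity, and the window size becomes a tuning knob between responsiveness and smoothness.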
This early beta of the Interaction Engine works well with the types of objects you see in these scenes – namely cubes and spheres around 1-2 inches in size. Game objects of differing shapes, sizes, and physics settings may have different results. We want to hear about your experience with the Interaction Engine so we can continue to make improvements.
Ready to experiment? Download the Unity Core Assets and Interaction Engine Module, check out the documentation, and share your feedback in the comments below or on our community forums!