Game physics engines were never designed for human hands. Bring your hands into VR and the results can be dramatic: grab an object or squish it against the floor, and you’ll send it flying as the physics engine desperately tries to keep your fingers out of it.

But by exploring the grey areas between real-world and digital physics, we can build a more human experience. One where you can reach out and grab something – a block, a teapot, a planet – and simply pick it up. Your fingers phase through the material, but the object still feels real. Like it has weight.

Beneath the surface, this is an enormously complex challenge. Over the last several months, we’ve been boiling that complexity down to a fundamental tool that Unity developers can rapidly build with. Today we’re excited to share an early access beta of our Interaction Engine, now available as a Module for our Unity Core Assets.



We made history on Thursday as 16 dance finalists competed for an NVIDIA GeForce GTX 1070 and eternal glory. Thanks to the magic of AltspaceVR’s FrontRow feature (plus a live DJ), over 380 people were in attendance – and everyone was able to watch the top 16 bring the house down with their mad moves.



Last week, we chose our finalists for the #VRDanceParty in AltspaceVR. (In case you missed it, check out the auditions here.) Now it’s time for a historic spectacle, where 16 finalists will compete for an NVIDIA GeForce GTX 1070 and eternal glory.


Welcome to AltspaceVR – a place that can exist anywhere, and where exciting things are constantly happening. On Thursday, the qualifying round for our #VRDanceParty begins, where everyone can compete for a place in the August 18th finals and a chance to dance for an NVIDIA GeForce GTX 1070 graphics card.

These are still early days for social VR, and AltspaceVR is at the forefront of a whole new way for human beings to connect. Ahead of the competition, we caught up with Bruce Wooden, aka “Cymatic Bruce,” to talk about where the space is headed.

Bruce has been a VR evangelist since the earliest days of the VR resurgence, and is currently Head of Developer and Community Relations at AltspaceVR. We talked about the uncanny valley, the power of hands in VR, and the challenges of building a global community. (For an extended version of the conversation, check out our post on Medium.)



True hand presence in VR is incredibly powerful – and easier than ever to achieve. With the Leap Motion Unity Core Assets and Modules, you can start building right away with features like custom-designed hands, user interfaces, and event triggers. Each Module is designed to unlock new capabilities in your VR project, and to work with the others in more advanced combinations.

In this post, we’ll take a quick look at our Core Assets, followed by the Modules. Each section includes links to more information, including high-level overviews, documentation, and examples. The Core Assets and Modules themselves all include demo scenes, which are often the best way to get started.



Creating new 3D hand assets for your Leap Motion projects can be a real challenge. That’s why, based on your feedback, we’ve massively automated and streamlined the pipeline for connecting 3D models to our Core Assets with Hands Module 2.0 – so what used to take hours only takes a minute or two. You can get the new module and updated assets on our developer portal.

You can now autorig a wide array of FBX hand assets with one or two button presses. This makes it possible to iterate quickly between a modeling package and Unity, where the models are driven by live hand motion. Even if you’re a veteran modeler-rigger-animator, it’s a singular experience to bring hand models that you’ve been sculpting and rigging into VR and see them come to life with your own hand motions.

In this post, we’ll provide a detailed overview of how to use the new autorig pipeline, along with some explanation of what happens under the hood. At the end, we’ll step back and cover some best practices, both for building hand assets from scratch and for choosing hand assets from a 3D asset store.


Our world is on the verge of a radical shift where our physical and digital realities merge and blend together. On Thursday, Leap Motion CTO and co-founder David Holz shared his thoughts on what’s happening behind the scenes of the VR industry, and how we can make our new reality feel more human.



Unity Widgets are back – with a new name and massively streamlined functionality! Just released for our Unity Core Assets, the UI Input Module provides a simplified interface for physically interacting with World Space Canvases within Unity’s UI System. This makes it simple for developers to create one-to-one tactile user interfaces in VR.

The module also provides “CompressibleUI” scripts that enable UI elements to pop up and flatten in response to touch. You can try the new inputs in our latest Developer Gallery demo, or download the Module from our Unity page.



In rebuilding our Unity developer toolset from the ground up, we started by rearchitecting the interfaces that receive data from the Leap Motion device. Moving up the tech stack, we then refactored most of the mid-level Unity scripts that pair the Leap hand data with 3D models and manage those representations. Most recently, we’ve moved another step up our stack to the code that drives the 3D models themselves.

With the release of our new Hands Module, we’ve returned to providing a range of example hands to add onto our new Orion toolset. We’ve started with a small set of examples, ranging from rigged meshes with an improved rigging workflow to abstract geometric hands that are dynamically generated from the real-world proportions of the user’s hand!



With this week’s Unity Core Asset release, we’ve made a few changes to our Pinch Utilities – including some new features that extend their capabilities! These new utilities have been folded into the main Core Assets package, retiring the former Pinch Utilities module.

So what are these new features? We call them Detectors, and they provide a convenient way to tell what a user’s hand is doing. In addition to detecting pinches, you can now detect when the fingers of a hand are curled or extended, whether a finger or palm is pointing in a particular direction, and whether a hand or fingertip is close to one of a set of target objects. (A grab detector is coming soon!)
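As a rough illustration of the pattern, here’s how a script might react to one of these Detectors. This is only a sketch, not a definitive implementation: it assumes a `PinchDetector` component from the Core Assets is attached to the same GameObject, and that Detectors expose `OnActivate`/`OnDeactivate` UnityEvents for the start and end of the detected state.

```csharp
using UnityEngine;
using Leap.Unity; // Leap Motion Unity Core Assets namespace (assumed)

// Hypothetical example component: logs when a pinch begins and ends.
// Assumes a PinchDetector is attached to the same GameObject.
public class PinchLogger : MonoBehaviour
{
    private PinchDetector _pinchDetector;

    void Start()
    {
        _pinchDetector = GetComponent<PinchDetector>();

        // Subscribe to the detector's activation events.
        _pinchDetector.OnActivate.AddListener(OnPinchStart);
        _pinchDetector.OnDeactivate.AddListener(OnPinchEnd);
    }

    void OnPinchStart()
    {
        Debug.Log("Pinch started");
    }

    void OnPinchEnd()
    {
        Debug.Log("Pinch ended");
    }
}
```

Because these events can also be wired up in the Inspector, the same approach should carry over to the other Detectors (extended fingers, finger/palm direction, proximity) without writing any subscription code at all.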
