Martin Schubert is a VR Developer/Designer at Leap Motion and the creator of Weightless and Geometric.

In architecture school, we had many long discussions about things most non-designers probably never give much thought to. These always swung quickly between the absurdly abstract and the philosophically important, and I could never be sure which of the two was currently happening.

One of those discussions was about what makes a spoon a spoon. What is it that distinguishes a spoon from, say, a teapot? Is it the shape, a little bowl with a handle? Is it the size, able to be held in one hand? The material? Would it still be a spoon if it were 10 ft long or had sharp needles all over or if it were made of thin paper? What gives it its ‘spoonyness’?



Breaking into VR development doesn’t need to break the bank. If you have a newer Android phone and a good gaming computer, it’s possible to prototype, test, and share your VR projects with the world using third-party software like RiftCat’s VRidge. In this post, we’ll take a look at what you’ll need to get started with PC VR development for less than $100.


Following up on last week’s release of the Leap Motion Interaction Engine, I’m excited to share Weightless: Remastered, a major update to my project that won second place in the first-ever 3D Jam. A lot has changed since then! In this post, we’ll take a deeper look at the incredible power and versatility of the Interaction Engine, including some of the fundamental problems it’s built to address. Plus, some tips on visual feedback for weightless locomotion and on designing virtual objects that look “grabbable.”

When I made the original Weightless, there wasn’t a stellar system for grasping virtual objects with your bare hands yet. Binary pinching or closing your hand into a fist to grab didn’t seem as satisfying as gently fingertip-tapping objects around. It wasn’t really possible to do both, only one or the other.

The Interaction Engine bridges that gap – letting you boop objects around with any part of your hand, while also allowing you to grab and hold onto floating artifacts without sacrificing fidelity in either. You can now actually grab the floating Oculus DK2 and try to put it on!


Itadakimasu (Japanese for ‘Bon Appetit’) is a therapeutic VR experience that allows users to interact with animals through different hand gestures. The focus of this piece stems from research findings that animal-assisted therapy can help decrease anxiety and reduce blood pressure in patients.

Although the experience is simple in content, my hope is that it can act as a short-term solution for people in places where owning a pet is logistically difficult.



Game physics engines were never designed for human hands. In fact, when you bring your hands into VR, the results can be dramatic. Grab an object in your hand or squish it against the floor, and you’ll send it flying as the physics engine desperately tries to keep your fingers out of it.

But by exploring the grey areas between real-world and digital physics, we can build a more human experience. One where you can reach out and grab something – a block, a teapot, a planet – and simply pick it up. Your fingers phase through the material, but the object still feels real. Like it has weight.

Beneath the surface, this is an enormously complex challenge. Over the last several months, we’ve been boiling that complexity down to a fundamental tool that Unity developers can rapidly build with. Today we’re excited to share an early access beta of our Interaction Engine, now available as a Module for our Unity Core Assets.



We made history on Thursday with 16 dance finalists competing for an NVIDIA GeForce GTX 1070 and eternal glory. Thanks to the magic of AltspaceVR’s FrontRow feature, over 380 people were in attendance with a live DJ, and everyone was able to watch the top 16 bring the house down with their mad moves.



Last week, we chose our finalists for the #VRDanceParty in AltspaceVR. (In case you missed it, check out the auditions here.) Now it’s time for a historic spectacle, where 16 finalists will compete for an NVIDIA GeForce GTX 1070 and eternal glory.


Welcome to AltspaceVR – a place that can exist anywhere, and where exciting things are constantly happening. On Thursday, the qualifying round for our #VRDanceParty will begin, where everyone can compete for a spot in the August 18th finals and dance for an NVIDIA GeForce GTX 1070 graphics card.

These are still early days for social VR, and AltspaceVR is at the forefront of a whole new way for human beings to connect. Ahead of the competition, we caught up with Bruce Wooden, aka “Cymatic Bruce,” to talk about where the space is headed.

Bruce has been a VR evangelist since the earliest days of the new VR resurgence, and is currently Head of Developer and Community Relations at AltspaceVR. We talked about the uncanny valley, the power of hands in VR, and the challenges of building a global community. (For an extended version of the conversation, check out our post on Medium.)



True hand presence in VR is incredibly powerful – and easier than ever. With the Leap Motion Unity Core Assets and Modules, you can start building right away with features like custom-designed hands, user interfaces, and event triggers. Each Module is designed to unlock new capabilities in your VR project, and work with others for more advanced combinations.

In this post, we’ll take a quick look at our Core Assets, followed by the Modules. Each section includes links to more information, including high-level overviews, documentation, and examples. The Core Assets and Modules themselves all include demo scenes, which are often the best way to get started.



Creating new 3D hand assets for your Leap Motion projects can be a real challenge. That’s why, based on your feedback, we’ve massively automated and streamlined the pipeline for connecting 3D models to our Core Assets with Hands Module 2.0 – so what used to take hours only takes a minute or two. You can get the new module and updated assets on our developer portal.

You can now autorig a wide array of FBX hand assets with one or two button presses. This lets you iterate quickly between a modeling package and Unity, where you can see your models driven by live hand motion. Even if you’re a veteran modeler-rigger-animator, it’s a singular experience to bring hand models that you’ve been sculpting and rigging into VR and see them come to life with your own hand motions.

In this post, we’ll provide a detailed overview of how to use the new autorig pipeline, along with some explanation of what happens under the hood. At the end, we’ll step back and cover some best practices for building hand assets from scratch or choosing them from a 3D asset store.
