
Following up on last week’s release of the Leap Motion Interaction Engine, I’m excited to share Weightless: Remastered, a major update to my project that won second place in the first-ever 3D Jam. A lot has changed since then! In this post, we’ll take a deeper look at the incredible power and versatility of the Interaction Engine, including some of the fundamental problems it’s built to address. We’ll also cover some tips on visual feedback for weightless locomotion, and on designing virtual objects that look “grabbable.”

When I made the original Weightless, there wasn’t a stellar system for grasping virtual objects with your bare hands yet. Binary pinching or closing your hand into a fist to grab didn’t seem as satisfying as gently fingertip-tapping objects around. It wasn’t really possible to do both, only one or the other.

The Interaction Engine bridges that gap – letting you boop objects around with any part of your hand, while also allowing you to grab and hold onto floating artifacts without sacrificing fidelity in either. You can now actually grab the floating Oculus DK2 and try to put it on!

Read More ›

Itadakimasu (Japanese for ‘Bon Appetit’) is a therapeutic VR experience that allows users to interact with animals through different hand gestures. The focus of this piece stems from research findings that animal-assisted therapy can help decrease anxiety and reduce blood pressure in patients.

Although the experience is simple in content, my hope is that it can act as a short-term solution for people in places where owning a pet is logistically difficult.

Read More ›

Update (6/8/17): Interaction Engine 1.0 is here! Read more on our release announcement: blog.leapmotion.com/interaction-engine

Game physics engines were never designed for human hands. In fact, when you bring your hands into VR, the results can be dramatic. Grab an object in your hand or squish it against the floor, and you’ll send it flying as the physics engine desperately tries to keep your fingers out of it.

But by exploring the grey areas between real-world and digital physics, we can build a more human experience. One where you can reach out and grab something – a block, a teapot, a planet – and simply pick it up. Your fingers phase through the material, but the object still feels real. Like it has weight.

Beneath the surface, this is an enormously complex challenge. Over the last several months, we’ve been boiling that complexity down to a fundamental tool that Unity developers can rapidly build with. Today we’re excited to share an early access beta of our Interaction Engine, now available as a Module for our Unity Core Assets.
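To give a rough sense of what this looks like on the Unity side, here is a minimal sketch of making an object grabbable with the Interaction Engine Module. Treat the details as assumptions: the component and namespace names (`InteractionManager`, `InteractionBehaviour`, `Leap.Unity.Interaction`) come from the Interaction Engine package, and setup specifics may differ between the beta and later releases.

```csharp
using UnityEngine;
using Leap.Unity.Interaction;

// Sketch only: assumes an InteractionManager already exists in the scene,
// as set up by the Interaction Engine Module's example scenes.
public class MakeGrabbable : MonoBehaviour
{
    void Start()
    {
        // A Rigidbody lets the physics engine move the object normally
        // when hands aren't touching it.
        var rb = GetComponent<Rigidbody>();
        if (rb == null) rb = gameObject.AddComponent<Rigidbody>();
        rb.useGravity = false; // weightless, as in the demo above

        // InteractionBehaviour hands contact and grasping over to the
        // Interaction Engine, so fingers can phase into the object
        // without the solver launching it across the room.
        gameObject.AddComponent<InteractionBehaviour>();
    }
}
```

In practice you would usually add these components in the Unity editor rather than at runtime; the script form is just the most compact way to show which pieces are involved.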

Read More ›

We made history on Thursday with 16 dance finalists competing for an NVIDIA GeForce GTX 1070 and eternal glory. Through the magic of AltspaceVR’s FrontRow feature and a live DJ, over 380 people were in attendance, and everyone was able to watch the top 16 bring the house down with their mad moves.

Read More ›

Last week, we chose our finalists for the #VRDanceParty in AltspaceVR. (In case you missed it, check out the auditions here.) Now it’s time for a historic spectacle, where 16 finalists will compete for an NVIDIA GeForce GTX 1070 and eternal glory.

Read More ›

Welcome to AltspaceVR – a place that can exist anywhere, and where exciting things are constantly happening. These are still early days for social VR, and AltspaceVR is at the forefront of a whole new way for human beings to connect. Earlier this week, we caught up with Bruce Wooden, aka “Cymatic Bruce,” to talk about where the space is headed.

Bruce has been a VR evangelist since the earliest days of the new VR resurgence, and is currently Head of Developer and Community Relations at AltspaceVR. We talked about the uncanny valley, the power of hands in VR, and the challenges of building a global community. (For an extended version of the conversation, check out our post on Medium.)

Read More ›

Many of the assets in this post have been updated since 2016! For the latest, see developer.leapmotion.com/guide.

True hand presence in VR is incredibly powerful – and easier than ever. With the Leap Motion Unity Core Assets and Modules, you can start building right away with features like custom-designed hands, user interfaces, and event triggers. Each Module is designed to unlock new capabilities in your VR project, and to work with the others for more advanced combinations.

In this post, we’ll take a quick look at our Core Assets, followed by the Modules. Each section includes links to more information, including high-level overviews, documentation, and examples. The Core Assets and Modules themselves all include demo scenes, which are often the best way to get started.

Read More ›

Creating new 3D hand assets for your Leap Motion projects can be a real challenge. That’s why, based on your feedback, we’ve massively automated and streamlined the pipeline for connecting 3D models to our Core Assets with Hands Module 2.0 – so what used to take hours only takes a minute or two. You can get the new module and updated assets on our developer portal.

You now have the ability to autorig a wide array of FBX hand assets with one or two button presses. This makes it possible to iterate quickly between a modeling package and Unity, where you can see your models driven by live hand motion. Even if you’re a veteran modeler-rigger-animator, it’s a singular experience to bring hand models that you’ve been sculpting and rigging into VR and see them come to life with your own hand motions.

In this post, we’ll provide a detailed overview of how to use our new autorig pipeline, as well as some explanation of what happens under the hood. At the end, we’ll take a step back with some best practices for both building hand assets from scratch and choosing them from a 3D asset store.

Read More ›

Our world is on the verge of a radical shift where our physical and digital realities merge and blend together. On Thursday, Leap Motion CTO and co-founder David Holz shared his thoughts on what’s happening behind the scenes of the VR industry, and how we can make our new reality feel more human.

Read More ›

Update (June 8, 2017): The UI Input Module has been deprecated, as it is now part of the Leap Motion Interaction Engine. Learn more on our blog.

Unity Widgets are back – with a new name and massively streamlined functionality! Just released for our Unity Core Assets, the UI Input Module provides a simplified interface for physically interacting with World Space Canvases within Unity’s UI System. This makes it simple for developers to create one-to-one tactile user interfaces in VR.

The module also provides “CompressibleUI” scripts that enable UI elements to pop up and flatten in response to touch. You can try the new inputs in our latest Developer Gallery demo, or download the Module from our Unity page.

Read More ›