
We made history on Thursday with 16 dance finalists competing for an NVIDIA GeForce GTX 1070 and eternal glory. Through the magic of AltspaceVR’s FrontRow feature and a live DJ, over 380 people were in attendance, yet everyone got a front-row view as the top 16 brought the house down with their mad moves.

Read More ›

Last week, we chose our finalists for the #VRDanceParty in AltspaceVR. (In case you missed it, check out the auditions here.) Now it’s time for a historic spectacle, where 16 finalists will compete for an NVIDIA GeForce GTX 1070 and eternal glory.

Read More ›

Welcome to AltspaceVR – a place that can exist anywhere, and where exciting things are constantly happening. These are still early days for social VR, and AltspaceVR is at the forefront of a whole new way for human beings to connect. Earlier this week, we caught up with Bruce Wooden, aka “Cymatic Bruce,” to talk about where the space is headed.

Bruce has been a VR evangelist since the earliest days of the new VR resurgence, and is currently Head of Developer and Community Relations at AltspaceVR. We talked about the uncanny valley, the power of hands in VR, and the challenges of building a global community. (For an extended version of the conversation, check out our post on Medium.)

Read More ›

True hand presence in VR is incredibly powerful – and easier than ever. With the Leap Motion Unity Core Assets and Modules, you can start building right away with features like custom-designed hands, user interfaces, and event triggers. Each Module is designed to unlock new capabilities in your VR project, and work with others for more advanced combinations.

Unlock the power of true hand presence in #VR and start building right away.

In this post, we’ll take a quick look at our Core Assets, followed by the Modules. Each section includes links to more information, including high-level overviews, documentation, and examples. The Core Assets and Modules themselves all include demo scenes, which are often the best way to get started.
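
As a rough sketch of what working with the Core Assets looks like (this is not code from the post), the snippet below polls tracked hand data through a LeapProvider. The class and property names reflect the Orion-era Leap.Unity API as we understand it – LeapProvider.CurrentFrame, Hand.PinchStrength, Hand.IsLeft – so check the Core Assets documentation for the exact names in your version.

```csharp
// Minimal sketch: logging a pinch from live hand data via the Core Assets.
using Leap;
using Leap.Unity;
using UnityEngine;

public class HandPresenceLogger : MonoBehaviour
{
    // Assumed: a LeapProvider (e.g. the LeapServiceProvider on your camera rig)
    // assigned in the Inspector.
    public LeapProvider provider;

    void Update()
    {
        Frame frame = provider.CurrentFrame;
        foreach (Hand hand in frame.Hands)
        {
            // PinchStrength runs from 0 to 1 as thumb and index finger close together.
            if (hand.PinchStrength > 0.8f)
            {
                Debug.Log((hand.IsLeft ? "Left" : "Right") + " hand is pinching");
            }
        }
    }
}
```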

Read More ›

Creating new 3D hand assets for your Leap Motion projects can be a real challenge. That’s why, based on your feedback, we’ve massively automated and streamlined the pipeline for connecting 3D models to our Core Assets with Hands Module 2.0 – so what used to take hours only takes a minute or two. You can get the new module and updated assets on our developer portal.

How to bring your #VR hand designs to life in two minutes or less.

You now have the ability to autorig a wide array of FBX hand assets with one or two button presses. This makes it possible to iterate quickly between a modeling package and Unity, where you can see your models driven by live hand motion. Even if you’re a veteran modeler-rigger-animator, it’s a singular experience to bring hand models that you’ve been sculpting and rigging into VR and see them come to life with your own hand motions.

In this post, we’ll provide a detailed overview of how to use our new autorig pipeline, as well as some explanation of what happens under the hood. At the end, we’ll take a step back with some best practices for both building hand assets from scratch and choosing hand assets from a 3D asset store.
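
To give a feel for the kind of work the autorig automates, here is a deliberately naive, purely illustrative sketch – this is not the Hands Module’s actual code, and every name in it (NaiveHandAutoRig, FindFingerJoints, the joint-naming convention) is hypothetical. It just shows the general idea of walking an imported FBX hierarchy and matching joint transforms by name so a rigged hand script can drive them.

```csharp
// Illustrative only: group an FBX rig's joint transforms by finger name.
using System.Collections.Generic;
using UnityEngine;

public static class NaiveHandAutoRig
{
    static readonly string[] fingerNames = { "thumb", "index", "middle", "ring", "pinky" };

    // Assumes joints are named along the lines of "index_01", "thumb_03", etc.
    public static Dictionary<string, List<Transform>> FindFingerJoints(Transform modelRoot)
    {
        var joints = new Dictionary<string, List<Transform>>();
        foreach (string finger in fingerNames) joints[finger] = new List<Transform>();

        foreach (Transform t in modelRoot.GetComponentsInChildren<Transform>())
        {
            string n = t.name.ToLower();
            foreach (string finger in fingerNames)
            {
                if (n.Contains(finger)) { joints[finger].Add(t); break; }
            }
        }
        // A real pipeline would also order joints by hierarchy depth and handle
        // naming conventions this simple keyword match would miss.
        return joints;
    }
}
```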

Read More ›

Our world is on the verge of a radical shift where our physical and digital realities merge and blend together. On Thursday, Leap Motion CTO and co-founder David Holz shared his thoughts on what’s happening behind the scenes of the VR industry, and how we can make our new reality feel more human.

Read More ›

Unity Widgets are back – with a new name and massively streamlined functionality! Just released for our Unity Core Assets, the UI Input Module provides a simplified interface for physically interacting with World Space Canvases within Unity’s UI System. This makes it simple for developers to create one-to-one tactile user interfaces in VR.

Create and customize tactile interfaces and menus for VR.

The module also provides “CompressibleUI” scripts that enable UI elements to pop up and flatten in response to touch. You can try the new inputs in our latest Developer Gallery demo, or download the Module from our Unity page.
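
As a minimal sketch of the kind of UI the module drives – using only standard Unity UI, not code from the module itself – the snippet below sets up a World Space canvas button with an ordinary onClick handler. Per the post, the UI Input Module layers physical, fingertip-driven input on top of canvases like this; the specific components you add to the canvas come from the module, so see its documentation for those names.

```csharp
// Minimal sketch: a World Space canvas button that hands can physically reach.
using UnityEngine;
using UnityEngine.UI;

public class TactileMenuSetup : MonoBehaviour
{
    public Button startButton;   // a Button placed on a Canvas in the scene

    void Awake()
    {
        // The canvas must be in World Space for hands to interact with it directly.
        Canvas canvas = startButton.GetComponentInParent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;

        // Ordinary Unity UI events still fire; the module supplies the "touch".
        startButton.onClick.AddListener(() => Debug.Log("Start pressed"));
    }
}
```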

Read More ›

In rebuilding our Unity developer toolset from the ground up, we started by rearchitecting the interfaces that receive data from the Leap Motion device. Moving up the tech stack, we then refactored most of the mid-level Unity scripts that pair the Leap hand data with 3D models and manage those representations. Most recently, we’ve moved another step up our stack to the code that drives the 3D models themselves.

Leap Motion's geometric hands are generated based on the real-world proportions of your own hands!

With the release of our new Hands Module, we’ve returned to providing a range of example hands to add onto our new Orion toolset. We’ve started with a small set of examples, ranging from rigged meshes with an improved rigging workflow to abstract geometric hands that are dynamically generated from the real-world proportions of the user’s hand!
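
To make the “real-world proportions” idea concrete, here is a minimal sketch – not the Hands Module’s implementation – that draws debug lines along each tracked finger bone, so the resulting wireframe automatically matches the length of the user’s own fingers. It assumes the Orion-era Leap C# API (Finger.Bone, Bone.PrevJoint/NextJoint) and the Leap.Unity ToVector3() extension; verify those names against the documentation.

```csharp
// Minimal sketch: a debug "geometric hand" built from measured bone data.
using Leap;
using Leap.Unity;
using UnityEngine;

public class DebugGeometricHand : MonoBehaviour
{
    public LeapProvider provider;   // assumed: assigned in the Inspector

    void Update()
    {
        foreach (Hand hand in provider.CurrentFrame.Hands)
        {
            foreach (Finger finger in hand.Fingers)
            {
                // One segment per bone; each segment's length comes straight
                // from the user's actual tracked finger bones.
                for (int i = 0; i < 4; i++)
                {
                    Bone bone = finger.Bone((Bone.BoneType)i);
                    Debug.DrawLine(bone.PrevJoint.ToVector3(),
                                   bone.NextJoint.ToVector3(),
                                   Color.cyan);
                }
            }
        }
    }
}
```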

Read More ›

With this week’s Unity Core Asset release, we’ve made a few changes to our Pinch Utilities – including some new features that extend its capabilities! These new utilities have been folded into the main Core Assets package, retiring the former Pinch Utility module.

So what are these new features? We call them Detectors, and they provide a convenient way to detect what a user’s hand is doing. In addition to detecting pinches, you can now detect when the fingers of a hand are curled or extended, whether a finger or palm is pointing in a particular direction, and whether the hand or fingertip is close to one of a set of target objects. (A grab detector is coming soon!)
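
As a rough sketch of wiring a Detector from script (Detectors can also be hooked up entirely in the Inspector), the snippet below responds to a pinch by spawning a prefab. It assumes the Detector base class exposes OnActivate/OnDeactivate UnityEvents and that the pinch detector component is named PinchDetector, as in the Detection Utilities – check the module’s documentation for the exact members in your version.

```csharp
// Minimal sketch: spawn an object whenever a PinchDetector activates.
using Leap.Unity;
using UnityEngine;

public class PinchToSpawn : MonoBehaviour
{
    public PinchDetector pinchDetector;  // attached to a hand in the scene
    public GameObject prefab;

    void OnEnable()
    {
        pinchDetector.OnActivate.AddListener(HandlePinch);
    }

    void OnDisable()
    {
        pinchDetector.OnActivate.RemoveListener(HandlePinch);
    }

    void HandlePinch()
    {
        // For illustration we spawn at the detector's transform; a real script
        // might use the detector's reported pinch position instead.
        Instantiate(prefab, pinchDetector.transform.position, Quaternion.identity);
    }
}
```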

Read More ›

For hundreds of years, dead bodies (cadavers) have taught medical students about human anatomy. In cadaver labs, students dissect, touch, rotate, and explore organs in hands-on experiences that make knowledge stick for a lifetime.

Unfortunately, these experiences are out of reach for most of us. Cadaver labs are expensive to run and cadavers are in limited supply, so non-medical students have to settle for secondary learning experiences like iPad apps and websites. These experiences are good, but not nearly as effective as the hands-on learning experiences students get in the lab.

That’s why we created CadaVR, a “living” virtual reality cadaver lab that emulates a real cadaver lab, minus the crowd (4–8 students per cadaver), the unforgiving smell, and the high cost. Not only does CadaVR let students use their hands and other senses to learn about anatomy, but it also offers experiences that aren’t possible in physical labs, such as a simulation of a beating heart. (If you’re a medical student and you detect a heartbeat in your cadaver, you should probably run!)

Read More ›