This week we’re excited to share a new engine integration with the professional animation community – Leap Motion and iClone 7 Motion LIVE. A full-body motion capture platform designed for performance animation, Motion LIVE aggregates motion data streams from industry-leading mocap devices and drives 3D characters’ faces, hands, and bodies simultaneously. Its easy workflow […]
At Leap Motion, we’re always looking to advance our interactions in ways that push our hardware and software. As one of the lead engineers on Project North Star, I believe that augmented reality can be a truly compelling platform for human-computer interaction. While AR’s true potential comes from dissolving the barriers between humans and computers, […]
VR, AR and hand tracking are often considered to be futuristic technologies, but they also have the potential to be the easiest to use. For the launch of our V4 software, we set ourselves the challenge of designing an application where someone who has never even touched a VR headset could figure out what to […]
There’s something magical about building in VR. Imagine being able to assemble weightless car engines, arrange dynamic virtual workspaces, or create imaginary castles with infinite bricks. Arranging or assembling virtual objects is a common scenario across a range of experiences, particularly in education, enterprise, and industrial training – not to mention tabletop and real-time strategy […]
When you reach out and grab a virtual object or surface, there’s nothing stopping your physical hand in the real world. To make physical interactions in VR feel compelling and natural, we have to play with some fundamental assumptions about how digital objects should behave. The Leap Motion Interaction Engine handles these scenarios by having […]
As mainstream VR/AR input continues to evolve – from the early days of gaze-only input to wand-style controllers and fully articulated hand tracking – the virtual user interfaces we interact with are evolving too. Slowly but surely we’re moving beyond flat UIs ported over from 2D screens and toward a future filled with spatial interface paradigms that take advantage of depth and volume.
Last time, we looked at how an interactive VR sculpture could be created with the Leap Motion Graphic Renderer as part of an experiment in interaction design. With the sculpture’s shapes rendering, we can now craft and code the layout and control of this 3D shape pool and the reactive behaviors of the individual objects.
By adding the Interaction Engine to our scene and InteractionBehavior components to each object, we have the basis for grasping, touching, and other interactions. But for our VR sculpture, we can also use the Interaction Engine’s robust and performant awareness of hand proximity. With this foundation, we can experiment quickly with different reactions to hand presence, pinching, and touching specific objects. Let’s dive in!
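As a rough sketch of the kind of per-object reaction described above, a small Unity component can subscribe to an InteractionBehaviour’s hover events and respond to hand presence. This is an illustrative assumption rather than the project’s actual code – the scaling response and the `HoverReaction` class name are hypothetical, and the event names follow the Interaction Engine’s Unity module:

```csharp
using UnityEngine;
using Leap.Unity.Interaction;

// Hypothetical sketch: grow an object slightly when a hand hovers near it,
// and shrink it back when the hand leaves. Assumes an InteractionBehaviour
// component is attached to the same GameObject.
[RequireComponent(typeof(InteractionBehaviour))]
public class HoverReaction : MonoBehaviour {
  InteractionBehaviour _interaction;
  Vector3 _baseScale;

  void Start() {
    _interaction = GetComponent<InteractionBehaviour>();
    _baseScale = transform.localScale;

    // Hover callbacks fire from the engine's hand-proximity awareness.
    _interaction.OnHoverBegin += () => SetScale(1.2f);
    _interaction.OnHoverEnd   += () => SetScale(1.0f);
  }

  void SetScale(float factor) {
    transform.localScale = _baseScale * factor;
  }
}
```

Because the proximity bookkeeping lives in the Interaction Engine, a reaction like this stays a few lines long, which is what makes it quick to iterate on different behaviors per object.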
With the next generation of mobile VR/AR experiences on the horizon, our team is constantly pushing the boundaries of our VR UX developer toolkit. Recently we created a quick VR sculpture prototype that combines the latest and greatest of these tools.
The Leap Motion Interaction Engine lets developers give their virtual objects the ability to be picked up, thrown, nudged, swatted, smooshed, or poked. With our new Graphic Renderer Module, you also have access to weapons-grade performance optimizations for power-hungry desktop VR and power-ravenous mobile VR.
In this post, we’ll walk through a small project built using these tools. This will provide a technical and workflow overview as one example of what’s possible – plus some VR UX design exploration and performance optimizations along the way.
Explorations in VR Design is a journey through the bleeding edge of VR design – from architecting a space and designing groundbreaking interactions to making users feel powerful. Art takes its inspiration from real life, but it takes imagination (and sometimes breaking a few laws of physics) to create something truly human. With last week’s […]
As humans, we are spatial, physical thinkers. From birth we grow to understand the objects around us by the rules that govern how they move, and how we move them. These rules are so fundamental that we design our digital realities to reflect human expectations about how things work in the real world. At Leap […]