As mainstream VR/AR input continues to evolve – from the early days of gaze-only input to wand-style controllers and fully articulated hand tracking – so too are the virtual user interfaces we interact with evolving. Slowly but surely we’re moving beyond flat UIs ported over from 2D screens and toward a future filled with spatial interface paradigms that take advantage of depth and volume.
Last time, we looked at how an interactive VR sculpture could be created with the Leap Motion Graphic Renderer as part of an experiment in interaction design. With the sculpture’s shapes rendering, we can now craft and code the layout and control of this 3D shape pool and the reactive behaviors of the individual objects.
By adding the Interaction Engine to our scene and InteractionBehavior components to each object, we have the basis for grasping, touching and other interactions. But for our VR sculpture, we can also use the Interaction Engine’s robust and performant awareness of hand proximity. With this foundation, we can experiment quickly with different reactions to hand presence, pinching, and touching specific objects. Let’s dive in!
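The excerpt doesn’t include code, but the proximity-reaction idea it describes can be sketched independently of the engine: each object computes a 0–1 “reaction strength” from its distance to the hand and eases its scale up as the hand approaches. This is a minimal illustrative sketch only – the names `proximity_weight` and `react` are hypothetical, not part of the Leap Motion Interaction Engine API.

```python
import math

def proximity_weight(hand_pos, obj_pos, radius=0.2):
    """Reaction strength in [0, 1]: 1 at the object, falling to 0 beyond `radius` metres."""
    d = math.dist(hand_pos, obj_pos)
    t = max(0.0, min(1.0, 1.0 - d / radius))
    # smoothstep easing gives an organic ramp rather than a linear pop
    return t * t * (3.0 - 2.0 * t)

def react(objects, hand_pos, base_scale=1.0, max_boost=0.5):
    """Return a per-object scale that grows toward base_scale + max_boost as the hand nears."""
    return {name: base_scale + max_boost * proximity_weight(hand_pos, pos)
            for name, pos in objects.items()}
```

For example, with one object under the hand and one a metre away, `react({"near": (0, 0, 0), "far": (1, 0, 0)}, hand_pos=(0, 0, 0))` scales the near object up while leaving the far one untouched – the same pattern scales to pinch or touch triggers by swapping the distance function.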
Today we’re excited to announce the opening of our new design research studio in London with visionary VR/AR filmmaker Keiichi Matsuda, who will lead the new office and assume the role of VP of Design and Global Creative Director.
With the next generation of mobile VR/AR experiences on the horizon, our team is constantly pushing the boundaries of our VR UX developer toolkit. Recently we created a quick VR sculpture prototype that combines the latest and greatest of these tools.
The Leap Motion Interaction Engine lets developers give their virtual objects the ability to be picked up, thrown, nudged, swatted, smooshed, or poked. With our new Graphic Renderer Module, you also have access to weapons-grade performance optimizations for power-hungry desktop VR and power-ravenous mobile VR.
In this post, we’ll walk through a small project built using these tools. This will provide a technical and workflow overview as one example of what’s possible – plus some VR UX design exploration and performance optimizations along the way.
Explorations in VR Design is a journey through the bleeding edge of VR design – from architecting a space and designing groundbreaking interactions to making users feel powerful. Art takes its inspiration from real life, but it takes imagination (and sometimes breaking a few laws of physics) to create something truly human. With last week’s […]
As humans, we are spatial, physical thinkers. From birth we grow to understand the objects around us by the rules that govern how they move, and how we move them. These rules are so fundamental that we design our digital realities to reflect human expectations about how things work in the real world. At Leap […]
Creating a sense of space is one of the most powerful tools in a VR developer’s arsenal. In our Exploration on World Design, we looked at how to create moods and experiences through imaginary environments. In this Exploration, we’ll cover some key spatial relationships in VR, and how you can build on human expectations to create a sense of […]
This week, we’re excited to share our latest milestone on the road to truly compelling mobile VR. We’ve joined forces with Qualcomm Technologies to combine their Snapdragon 835 mobile platform with our embedded hand tracking technology so that people can interact with mobile VR content using their bare hands.
Universities are the earliest adopters of virtual and augmented reality. Even in the third age of VR, when the technology is small, inexpensive, and powerful enough that millions of people can own it, universities are still at the bleeding edge of VR research and development. A recent survey of 553 universities in the VR First Network gives us an […]
“With our thoughts, we make the world.” –The Buddha
Technology has the power to open up new realities. CES 2017 was about making those realities more intelligent, interactive, and human – from imaginary worlds unfolding before our eyes to objects that can talk to you and each other. Here’s a quick review of our CES […]