

At Leap Motion, we’re always looking to advance our interactions in ways that push our hardware and software. As one of the lead engineers on Project North Star, I believe that augmented reality can be a truly compelling platform for human-computer interaction. While AR’s true potential comes from dissolving the barriers between humans and computers, I also believe that it can help improve our abilities in the real world. As we augment our reality, we augment ourselves.

With its best-in-class field of view, refresh rate, and resolution, the North Star headset has proven to be an exceptional platform for representing high-speed motions with small objects. So when David asked me to put together a quick demo to show off its ability to interact with spatial environments, I knew just what to build. That’s right – table tennis.
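To give a feel for what "high-speed motions with small objects" demands, here is a hedged sketch (not code from the actual demo) of one ingredient such a demo needs: estimating where a ball will land from two tracked positions one frame apart, using simple projectile motion. All names are illustrative.

```python
# Hypothetical sketch: predict a ping-pong ball's landing point from two
# tracked positions sampled one frame apart (not the demo's actual code).
GRAVITY = -9.81  # m/s^2, acting along the y axis

def predict_landing(p0, p1, dt, table_height=0.0):
    """Estimate where the ball crosses table_height, given (x, y, z) samples
    p0 and p1 (meters) taken dt seconds apart."""
    # Finite-difference velocity estimate from the two samples.
    vx = (p1[0] - p0[0]) / dt
    vy = (p1[1] - p0[1]) / dt
    vz = (p1[2] - p0[2]) / dt
    # Solve y(t) = y1 + vy*t + 0.5*g*t^2 = table_height.
    a, b, c = 0.5 * GRAVITY, vy, p1[1] - table_height
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ball never reaches table height
    t = (-b - disc ** 0.5) / (2 * a)  # later (descending) root, since a < 0
    return (p1[0] + vx * t, table_height, p1[2] + vz * t)
```

Note how sensitive the velocity estimate is to `dt`: at low tracking rates, one noisy sample moves the predicted landing point by many centimeters, which is exactly why high frame rates matter here.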

Read More ›

VR, AR and hand tracking are often considered to be futuristic technologies, but they also have the potential to be the easiest to use. For the launch of our V4 software, we set ourselves the challenge of designing an application where someone who has never even touched a VR headset could figure out what to do with no instructions whatsoever. That application became Cat Explorer, which you can download now for Oculus Rift and HTC Vive.

On the surface it’s a fun, slightly twisted tech demo. But it also serves as a proof of concept for intuitive interaction in training, education, visualisation and entertainment. Designer Eugene Krivoruchko digs into some of the decisions that went into its creation:

Cat Explorer is an experiment in spatial interaction design that focuses on using hands to explore the structural layout of a relatively complex object – in this case a loose anatomical model of a cat named Whiskey.

Modeling of cat and internal organs by Pablo Lopéz, based on Keiichi’s cat Donna.

Read More ›

This morning, we released the latest generation of Orion tracking alongside major updates to our Unity and Unreal integrations. We’ve also taken several steps to streamline the developer experience, reflecting deeper changes in our SDK over time. Beyond the tracking updates, here’s a quick overview of the latest changes in our SDK.

Read More ›

Today we’re excited to announce the latest milestone in our journey: a major release of our Orion VR tracking software, now available for public beta on Windows. This is the fourth generation of our core software, with improvements across the board:

  • Better finger dexterity and fidelity
  • Significantly smoother hand and finger tracking, with motions that look and feel more natural
  • Faster and more consistent hand initialization
  • Better hand pose stability and reliability
  • Improved tracking fidelity against complex backgrounds and extreme lighting
  • More accurate shape and scale for hands
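"Smoother hand and finger tracking" usually comes down to filtering raw joint positions. As an illustration only (this is not Leap Motion's actual filter), here is the simplest such technique, an exponential moving average, which trades a small amount of latency for jitter reduction:

```python
# Illustrative sketch, not the shipped algorithm: exponentially smooth a
# sequence of tracked (x, y, z) joint positions to reduce frame-to-frame jitter.
def smooth(positions, alpha=0.5):
    """Smooth a list of (x, y, z) samples.
    alpha=1.0 passes input through unchanged; smaller alpha smooths more."""
    out = []
    prev = positions[0]
    for p in positions:
        prev = tuple(alpha * c + (1 - alpha) * q for c, q in zip(p, prev))
        out.append(prev)
    return out
```

Production trackers use more sophisticated, velocity-adaptive filters, because a fixed `alpha` that kills jitter at rest also adds visible lag during fast motion.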

Read More ›

At Leap Motion, we envision a future where the physical and virtual worlds blend together into a single magical experience. At the heart of this experience is hand tracking, which unlocks interactions uniquely suited to virtual and augmented reality. To explore the boundaries of interactive design in AR, we created Project North Star, which drove us to push beyond the limitations of existing systems.

Read More ›

When we embarked on this journey, there were many things we didn’t know.

What does hand tracking need to be like for an augmented reality headset? How fast does it need to be? Is a hundred frames per second enough, or do we need a thousand?

How does the field of view impact the interaction paradigm? How do we interact with things when we only have the central field, or a wider field? At what point does physical interaction become commonplace? How does the comfort of the interactions themselves relate to the headset’s field of view?

What are the artistic aspects that need to be considered in augmented interfaces? Can we simply render content as-is, let our hands occlude it, and call it a day? Or are there fundamentally different styles that emerge when we have a display that can only ‘add light’ but not subtract it?
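The 'add light' constraint can be stated in one line of compositing math: each pixel on an additive display is the real scene's color plus the overlay's, clamped. The sketch below shows the consequence, dark content simply vanishes:

```python
# Additive-display compositing: the overlay can only brighten what's behind it.
def additive_composite(background, overlay):
    """Composite two (r, g, b) pixels with 0-255 channels additively."""
    return tuple(min(b + o, 255) for b, o in zip(background, overlay))

white_wall = (200, 200, 200)
black_text = (0, 0, 0)
print(additive_composite(white_wall, black_text))  # (200, 200, 200): the text vanishes
```

This is why UI designed for opaque screens cannot be thrown onto an additive AR display as-is: black backgrounds, drop shadows, and dark outlines all disappear.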

Read More ›

Leap Motion is a company that has always been focused on human-computer interfaces.

We believe that the fundamental limit in technology is not its size or its cost or its speed, but how we interact with it. These interactions define what we create, how we learn, how we communicate with each other. It would be no stretch of the imagination to say that the way we interact with the world around us is perhaps the very fabric of the human experience.

We believe that this human experience is on the precipice of a great change.

The coming of virtual reality has signaled a great moment in the history of our civilization. We have found in ourselves the ability to break down the very substrate of reality and create ones anew, entirely of our own design and of our own imaginations.

As we explore this newfound ability, it becomes increasingly clear that this power will not be limited to some ‘virtual world’ separate from our own. It will spill out like a great flood, uniting what has been held apart for so long: our digital and physical realities.

In preparation for the coming flood, we at Leap Motion have built a ship, and we call it Project North Star.

Read More ›

With virtual and augmented reality on the rise, so is the number of available platforms, input standards, and design paradigms. To harness the force of this horizontal expansion, we have to fundamentally rethink how we interact with VR/AR, in ways that often violate how the physical world works, but align with human expectations on a more fundamental level.

Read More ›

There’s something magical about building in VR. Imagine being able to assemble weightless car engines, arrange dynamic virtual workspaces, or create imaginary castles with infinite bricks. Arranging or assembling virtual objects is a common scenario across a range of experiences, particularly in education, enterprise, and industrial training – not to mention tabletop and real-time strategy gaming.

For our latest interaction sprint, we explored how building and stacking interactions could feel seamless, responsive, and stable. How could we place, stack, and assemble virtual objects quickly and accurately while preserving the nuance and richness of full physics simulation? Check out our results below or download the example demo from the Leap Motion Gallery.
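One common way to reconcile "quick and accurate" with "full physics" is release-time snapping with a distance threshold: objects move freely under physics while held, and snap to an aligned pose only if released close to one. The sketch below is a hypothetical illustration of that idea, not code from the sprint:

```python
# Hypothetical stacking aid: snap a released block to the nearest grid point,
# but only when it lands close enough; free placement is preserved elsewhere.
def snap_on_release(position, cell=0.1, threshold=0.02):
    """Snap an (x, y, z) position in meters to a grid of spacing `cell`
    if it is within `threshold` of a grid point; otherwise leave it alone."""
    snapped = tuple(round(c / cell) * cell for c in position)
    dist = sum((a - b) ** 2 for a, b in zip(position, snapped)) ** 0.5
    return snapped if dist <= threshold else position
```

The threshold is what keeps the interaction feeling physical: small, clearly intentional alignments are tidied up, while everything else stays exactly where the simulation put it.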

Read More ›

One of the core design philosophies at Leap Motion is that the most intuitive and natural interactions are direct and physical. Manipulating objects with our bare hands lets us leverage a lifetime of physical experience, minimizing the learning curve for users. But there are times when virtual objects will be farther away than arm’s reach, beyond the user’s range of direct manipulation. We can force users to walk over to access those objects – or we could give them superpowers!

For our latest interaction design sprint, we prototyped three ways of summoning distant objects to bring them within arm’s reach. The first is a simple animated summoning technique, well-suited to interacting with single objects. The second gives you telekinetic powers, while the third virtually augments your body’s capabilities.
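The first technique, animated summoning, can be sketched as easing an object from its distant position toward the hand over a fixed duration. The function and easing choice below are illustrative assumptions, not the sprint's actual implementation:

```python
# Minimal sketch of animated summoning: interpolate a selected object from its
# distant position toward the user's hand with smoothstep easing.
def summon_position(start, target, t):
    """Object position at normalized animation time t in [0, 1]."""
    t = max(0.0, min(1.0, t))
    ease = t * t * (3.0 - 2.0 * t)  # smoothstep: gentle start and stop
    return tuple(s + (g - s) * ease for s, g in zip(start, target))
```

Driving `t` from elapsed time each frame gives the object a soft launch and landing, which reads as deliberate rather than teleport-like.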

Read More ›