
Bringing new worlds to life doesn’t end with bleeding-edge software – it’s also a battle with the laws of physics. Project North Star is a compelling glimpse into the future of AR interaction and an exciting engineering challenge, with wide-FOV displays and optics that demanded a whole new calibration and distortion system.

Read More ›

Today we’re excited to share the latest major design update for the Leap Motion North Star headset. North Star Release 3 consolidates several months of research and insight into a new set of 3D files and drawings. Our goal with this release is to make Project North Star more inviting, less hacked together, and more reliable. The design adds adjustments and mechanisms to fit a greater variety of head and facial geometries, and the headset is now lighter, more balanced, stiffer, and more inclusive.

Read More ›

Earlier this week, we shared an experimental build of our LeapUVC API, which gives you a new level of access to the Leap Motion Controller cameras. Today we’re excited to share a second experimental build – multiple device support.
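Since UVC is the same protocol that ordinary webcams speak, a multi-device setup can be sketched with any UVC-capable framework. The snippet below is a minimal illustration using OpenCV, not the official sample code: the device indices are assumptions (the OS assigns them), and the stereo layout of each frame is covered in the LeapUVC examples.

```python
# A rough sketch only: LeapUVC exposes each Leap Motion Controller as a
# standard UVC camera, so a UVC-capable framework such as OpenCV can open it.
# The device indices below are assumptions for illustration, and unpacking
# the stereo pair from each frame is left to the LeapUVC examples.
import cv2

DEVICE_INDICES = [0, 1]  # assumed indices for two attached controllers

captures = [cv2.VideoCapture(i) for i in DEVICE_INDICES]

try:
    while True:
        for idx, cap in zip(DEVICE_INDICES, captures):
            ok, frame = cap.read()  # one raw frame from this controller
            if not ok:
                continue
            cv2.imshow("Leap device %d" % idx, frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
finally:
    for cap in captures:
        cap.release()
    cv2.destroyAllWindows()
```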

Read More ›

In 2014 we released the Leap Motion Image API to unlock the possibilities of using the Leap Motion Controller’s twin infrared cameras. Today we’re releasing an experimental expansion of our Image API called LeapUVC.
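For a sense of what the Image API looks like in practice, here is a minimal sketch using the v2-era Python bindings. It assumes the SDK’s Leap module is importable; buffer handling details vary between binding versions.

```python
# Minimal sketch: requesting and inspecting the raw infrared images with the
# Leap Motion Image API (v2-era Python bindings). Assumes the SDK's Leap
# module is on the Python path.
import time
import Leap

controller = Leap.Controller()
# Image streaming is opt-in, so the client has to request the policy first.
controller.set_policy(Leap.Controller.POLICY_IMAGES)

time.sleep(0.5)  # give the service a moment to connect and start streaming

frame = controller.frame()
left, right = frame.images[0], frame.images[1]  # the twin infrared cameras

if left.is_valid and right.is_valid:
    print("Left image:  %d x %d" % (left.width, left.height))
    print("Right image: %d x %d" % (right.width, right.height))
    # left.data holds the raw 8-bit brightness values, and left.distortion
    # holds the calibration map used to correct the wide-angle lens.
```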

Read More ›

Earlier this summer, we open sourced the design for Project North Star, the world’s most advanced augmented reality R&D platform. Now, like the first chocolate waterfall outside of Willy Wonka’s factory, the first North Star-style headsets built outside our lab have been born – in Japan.

Read More ›

Virtual reality. Augmented reality. Mixed, hyper, modulated, mediated, diminished reality. All of these flavours are really just entry points into a vast world of possibilities where we can navigate between our physical world and limitless virtual spaces. These technologies contain immense raw potential and have advanced to the stage where they are pretty good and pretty accessible. But in terms of what they enable, we’ve only seen a sliver of what’s possible.

Read More ›

This week we’re excited to share a new engine integration with the professional animation community – Leap Motion and iClone 7 Motion LIVE. A full-body motion capture platform designed for performance animation, Motion LIVE aggregates motion data streams from industry-leading mocap devices and drives 3D characters’ faces, hands, and bodies simultaneously. Its easy workflow opens up extraordinary possibilities for virtual production, performance capture, live television, and web broadcasting. This professional-grade package is available now at a special price for a limited time.

Read More ›

This morning, we released an update to the North Star headset assembly. The project CAD files now fit the Leap Motion Controller and add support for alternate headgear and torsion spring hinges.

With these incremental additions, we want to make it easier to put together a North Star headset of your own. These are still works in progress as we grow more confident about what works and what doesn’t in augmented reality – both in terms of industrial design and core user experience.

Read More ›

At Leap Motion, we’re always looking to advance our interactions in ways that push our hardware and software. As one of the lead engineers on Project North Star, I believe that augmented reality can be a truly compelling platform for human-computer interaction. While AR’s true potential comes from dissolving the barriers between humans and computers, I also believe that it can help improve our abilities in the real world. As we augment our reality, we augment ourselves.

With its best-in-class field-of-view, refresh rate, and resolution, the North Star headset has proven to be an exceptional platform for representing high-speed motions with small objects. So when David asked me to put together a quick demo to show off its ability to interact with spatial environments, I knew just what to build. That’s right – table tennis.

Read More ›

VR, AR and hand tracking are often considered to be futuristic technologies, but they also have the potential to be the easiest to use. For the launch of our V4 software, we set ourselves the challenge of designing an application where someone who has never even touched a VR headset could figure out what to do with no instructions whatsoever. That application became Cat Explorer, which you can download now for Oculus Rift and HTC Vive.

On the surface it’s a fun, slightly twisted tech demo. But it also serves as a proof of concept for intuitive interaction in training, education, visualisation and entertainment. Designer Eugene Krivoruchko digs into some of the decisions that went into its creation:

Cat Explorer is an experiment in spatial interaction design that focuses on using hands to explore the structural layout of a relatively complex object – in this case a loose anatomical model of a cat named Whiskey.

Modeling of cat and internal organs by Pablo Lopéz, based on Keiichi’s cat Donna.

Read More ›