Earlier this week, we shared an experimental build of our LeapUVC API, which gives you a new level of access to the Leap Motion Controller cameras. Today we’re excited to share a second experimental build – multiple device support.
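Because LeapUVC exposes the controller's cameras over the standard UVC protocol, frames can be grabbed with ordinary tools such as OpenCV's `cv2.VideoCapture`. A minimal sketch of handling a raw stereo frame follows; note that the alternating-column pixel layout and the helper name `split_stereo_frame` are assumptions for illustration, not part of the official API, so check the LeapUVC examples for the exact frame format.

```python
# Sketch: splitting a raw LeapUVC stereo frame into left/right images.
# Assumption: each row alternates left-camera and right-camera pixels
# (L, R, L, R, ...); verify against the LeapUVC examples before relying on it.

def split_stereo_frame(frame):
    """frame: list of rows, each a list of pixel values.
    Returns (left, right) images as lists of rows."""
    left = [row[0::2] for row in frame]    # even columns -> left camera
    right = [row[1::2] for row in frame]   # odd columns  -> right camera
    return left, right

# In a live capture loop this would be fed from OpenCV, e.g.:
#   cap = cv2.VideoCapture(0)            # Leap Motion enumerated as a UVC camera
#   ok, raw = cap.read()
#   left, right = split_stereo_frame(raw)

if __name__ == "__main__":
    # Tiny synthetic 2x4 frame: 1x values stand in for "left" pixels, 2x for "right".
    frame = [[10, 20, 11, 21],
             [12, 22, 13, 23]]
    left, right = split_stereo_frame(frame)
    print(left)   # [[10, 11], [12, 13]]
    print(right)  # [[20, 21], [22, 23]]
```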
Earlier this summer, we open sourced the design for Project North Star, the world’s most advanced augmented reality R&D platform. Now, like the first chocolate waterfall outside Willy Wonka’s factory, the first North Star-style headsets built outside our lab have appeared – in Japan.
Virtual reality. Augmented reality. Mixed, hyper, modulated, mediated, diminished reality. All of these flavours are really just entry points into a vast world of possibilities where we can navigate between our physical world and limitless virtual spaces. These technologies contain immense raw potential, and have advanced to the stage where they are both capable and accessible. But in terms of what they enable, we’ve only seen a sliver of what’s possible.
This week we’re excited to share a new engine integration with the professional animation community – Leap Motion and iClone 7 Motion LIVE. A full-body motion capture platform designed for performance animation, Motion LIVE aggregates motion data streams from industry-leading mocap devices, and drives 3D characters’ faces, hands, and bodies simultaneously. Its easy workflow opens up extraordinary possibilities for virtual production, performance capture, live television, and web broadcasting. This professional-grade package is available now at a special price for a limited time.
This morning, we released an update to the North Star headset assembly. The project CAD files now fit the Leap Motion Controller and add support for alternate headgear and torsion spring hinges.
With these incremental additions, we want to make it easier to build a North Star headset of your own. These are still works in progress as we grow more confident about what works and what doesn’t in augmented reality – both in terms of industrial design and core user experience.
At Leap Motion, we’re always looking to advance our interactions in ways that push our hardware and software. As one of the lead engineers on Project North Star, I believe that augmented reality can be a truly compelling platform for human-computer interaction. While AR’s true potential comes from dissolving the barriers between humans and computers, I also believe that it can help improve our abilities in the real world. As we augment our reality, we augment ourselves.
With its best-in-class field-of-view, refresh rate, and resolution, the North Star headset has proven to be an exceptional platform for representing high-speed motions with small objects. So when David asked me to put together a quick demo to show off its ability to interact with spatial environments, I knew just what to build. That’s right – table tennis.
VR, AR and hand tracking are often considered to be futuristic technologies, but they also have the potential to be the easiest to use. For the launch of our V4 software, we set ourselves the challenge of designing an application where someone who has never even touched a VR headset could figure out what to do with no instructions whatsoever. That application became Cat Explorer, which you can download now for Oculus Rift and HTC Vive.
On the surface it’s a fun, slightly twisted tech demo. But it also serves as a proof of concept for intuitive interaction in training, education, visualisation and entertainment. Designer Eugene Krivoruchko digs into some of the decisions that went into its creation:
Cat Explorer is an experiment in spatial interaction design that focuses on using hands to explore the structural layout of a relatively complex object – in this case a loose anatomical model of a cat named Whiskey.
This morning, we released the latest generation of Orion tracking alongside major updates to our Unity and Unreal integrations. We’ve also taken several steps to streamline the developer experience, reflecting deeper changes in our SDK over time. Beyond the tracking updates, here’s a quick overview of the latest changes in our SDK.
Today we’re excited to announce the latest milestone of our journey with a major release of our Orion VR tracking software, now available for public beta on Windows. This is the fourth generation of our core software overall, featuring improvements across the board: