
// V2 Tracking

Hi everyone,

I’d like to take a moment to talk about a series of developments we’ve been working on specifically for virtual reality. The first set involves our existing peripheral device and new things developers can do with it starting today, while the second is a look at some of our next-generation hardware and software efforts that we’re currently building from the ground up for this exciting and emerging space.

Around the world, nearly 15,000 animal species are threatened with extinction. These are numbers that stagger the imagination, especially as more species routinely slip into total extinction, never to be seen again. But with digital media, it’s possible to hold huge quantities of data in the palm of your hand – and come to grips with the magnitude of the crisis.

Last week, we took an in-depth look at how the Leap Motion Controller works, from sensor data to the application interface. Today, we’re digging into our API to see how developers can access the data and use it in their own applications. We’ll also review some SDK fundamentals and great resources you can use to get started.

You asked for it, you got it! By popular demand, we’ve just released the Image API, which lets you access raw data from the Leap Motion Controller for the first time. Applications built with version 2.1.0 of our SDK will be able to access our hardware like a webcam. Using this data, you can add video passthrough into your applications.
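Once the image policy is enabled, each frame carries raw 8-bit grayscale buffers that you can process like any other pixel data. As a minimal sketch of that processing step, here is a pure brightness helper run over a fabricated buffer; in a real app the buffer would come from the SDK's image data rather than the mock object below:

```javascript
// Sketch: working with a raw grayscale image buffer.
// In a real v2 app the buffer would arrive from the controller
// each frame (after enabling image access); here it is mocked.

// Compute the mean brightness of an 8-bit grayscale buffer.
function meanBrightness(data) {
  let sum = 0;
  for (let i = 0; i < data.length; i++) sum += data[i];
  return data.length ? sum / data.length : 0;
}

// Mock 4x2 image: left half dark, right half fully lit.
const mockImage = {
  width: 4,
  height: 2,
  data: new Uint8Array([0, 0, 255, 255, 0, 0, 255, 255]),
};

console.log(meanBrightness(mockImage.data)); // 127.5
```

A helper like this could drive simple exposure feedback, for example warning users when the scene is too dark for reliable tracking.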

Visual feedback is hugely important when it comes to motion control, since users can feel lost or frustrated when they’re not sure how their actions are affecting an application. Virtual hands make it much easier for users to identify what’s happening onscreen. Thanks to the new v2 tracking, we’ve been able to create persistent rigged hands for LeapJS that reflect how your hands look and behave in the real world.
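At its core, driving a rigged hand means mapping tracked joint positions onto bone orientations every frame. As a toy illustration (this is not the actual LeapJS rigged-hand plugin, just the underlying geometry), here is how a bone's unit direction vector can be derived from two joint positions:

```javascript
// Toy helper: derive a normalized bone direction from two joints.
// A rigged-hand renderer would use directions like this to orient
// each bone mesh; real joint positions come from the tracker.
function boneDirection(proximal, distal) {
  const d = [
    distal[0] - proximal[0],
    distal[1] - proximal[1],
    distal[2] - proximal[2],
  ];
  const len = Math.hypot(d[0], d[1], d[2]);
  if (len === 0) return [0, 0, 0]; // degenerate bone: no direction
  return d.map(function (c) { return c / len; });
}

// Example: a finger segment pointing straight up the y-axis.
console.log(boneDirection([0, 0, 0], [0, 40, 0])); // [0, 1, 0]
```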

In the physical world, our hands exert force on the objects they touch. When we grasp something, our fingers and palms apply force (using precision, power, or scissor grips) and the object pushes back. Gravity can also help us hold items in the palms of our hands.

But not in the digital world! In the frictionless space of the virtual, anything can happen. Gravity is optional. Magic is feasible.
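Since there are no real forces to rely on, a virtual hand has to *decide* when it is holding something. One simple stand-in is a fingertip-distance threshold between thumb and index, a rough approximation of the pinch strength that v2 tracking reports. A sketch (the threshold and fingertip values below are arbitrary, chosen only for illustration):

```javascript
// Sketch: deciding whether a virtual hand is "pinching" based on
// the distance between thumb and index fingertips (in millimeters).
// Real v2 tracking exposes a pinch strength directly; this just
// illustrates the thresholding idea behind it.
function isPinching(thumbTip, indexTip, thresholdMm) {
  const dx = thumbTip[0] - indexTip[0];
  const dy = thumbTip[1] - indexTip[1];
  const dz = thumbTip[2] - indexTip[2];
  return Math.hypot(dx, dy, dz) < thresholdMm;
}

console.log(isPinching([0, 100, 0], [5, 102, 0], 20));  // true
console.log(isPinching([0, 100, 0], [60, 100, 0], 20)); // false
```

An app could then attach a grabbed object to the palm while `isPinching` stays true, and release it (with or without simulated gravity) once the fingers separate.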

Want to demo your website, or perform multiple tests with the same gesture? With our web-based Playback tool, you can record, crop, play, and save snippets of your own Leap Motion usage in JavaScript.
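Under the hood, a recorder like this boils down to buffering frames and slicing out the clip you want. Here is a minimal sketch of the record/crop/play cycle over mock frame objects (the real tool serializes actual LeapJS frames, and playback is paced to the original frame rate rather than replayed synchronously):

```javascript
// Minimal sketch of record / crop / playback over mock frames.
function Recording() {
  this.frames = [];
}

// Append one frame to the buffer.
Recording.prototype.record = function (frame) {
  this.frames.push(frame);
};

// Keep only frames in [start, end) -- like cropping a clip.
Recording.prototype.crop = function (start, end) {
  this.frames = this.frames.slice(start, end);
};

// Replay by invoking a callback per frame, as a frame loop would.
Recording.prototype.play = function (onFrame) {
  this.frames.forEach(onFrame);
};

const rec = new Recording();
for (let i = 0; i < 5; i++) rec.record({ id: i });
rec.crop(1, 4); // keep frames 1, 2, 3
rec.play(function (f) { console.log(f.id); }); // 1 2 3
```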

Leap Motion is a developer-driven platform, and the beauty of open sourcing development resources is that a single library or wrapper, once released to the world, can be integrated and built out by thousands of other people. And since the future of any new platform depends on what people can do with it, over the past few months we’ve released a steady stream of open source assets and examples to help devs get started with our v2 tracking beta.

Interacting with your computer is like reaching into another universe – one with an entirely different set of physical laws. Interface design is the art of creating digital rules that sync with our physical intuitions to bring both worlds closer together. We realize that most developers don’t want to spend days fine-tuning hand interactions, so we decided to design a framework that will accelerate development time and ensure more consistent interactions across apps.

I have a recurring dream that starts as a nightmare but turns into something else altogether. Imagine the stage of a monumental concert hall. The auditorium is packed, and as the audience notices you, thousands of conversations turn into a deep, imposing silence that sends a chill down your spine. Spotlights on a majestic grand […]