Immersion is everything in a VR experience. Since your hands don’t actually float in space (unless you’re this guy), we created a new Forearm API that tracks your physical arms. This makes it possible to create a more realistic experience with onscreen forearms.

How it works

Although we created Arm as a separate class, it behaves much like a Bone and shares many of the same functions: direction, width, and elbow and wrist positions – everything you need to render an onscreen forearm.
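To make that concrete, here is a minimal Python sketch of the kind of data the Arm class exposes. The Vector and Arm classes below are illustrative stand-ins written from the description above, not the SDK's own types; property names like elbow_position and wrist_position mirror the API's vocabulary but are assumptions for this sketch.

```python
import math

class Vector:
    """Minimal 3D vector stand-in for the SDK's vector type."""
    def __init__(self, x, y, z):
        self.x, self.y, self.z = x, y, z

    def __sub__(self, other):
        return Vector(self.x - other.x, self.y - other.y, self.z - other.z)

    @property
    def magnitude(self):
        return math.sqrt(self.x ** 2 + self.y ** 2 + self.z ** 2)

    def normalized(self):
        m = self.magnitude
        return Vector(self.x / m, self.y / m, self.z / m)

class Arm:
    """Illustrative stand-in: like a Bone, an arm has two endpoints,
    a direction, and a width."""
    def __init__(self, elbow_position, wrist_position, width):
        self.elbow_position = elbow_position
        self.wrist_position = wrist_position
        self.width = width  # approximate forearm width, in millimeters

    @property
    def direction(self):
        # Unit vector pointing from the elbow toward the wrist
        return (self.wrist_position - self.elbow_position).normalized()

# Rendering code might consume the data like this (positions in mm):
arm = Arm(Vector(-80.0, 250.0, 60.0), Vector(20.0, 220.0, 10.0), width=55.0)
length = (arm.wrist_position - arm.elbow_position).magnitude
d = arm.direction  # orients the forearm mesh between the two joints
```

A renderer would stretch a cylinder or mesh of the given width along `d` for `length` millimeters, anchored at the elbow.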

While the capability to track arms has been part of our platform since the beginning, we excluded it from the tracking model because it wasn't reliable enough. With our improved beta tracking algorithms, arm tracking is now a viable option. Currently, the arm position is checked against the hand position, but in the future this will work both ways – hand to arm, and arm to hand. Ultimately, we expect this to improve overall tracking confidence.
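In principle, checking one estimate against the other can be as simple as asking whether the two models agree on a shared joint. The function below is a purely illustrative sketch of that idea – it is not Leap Motion's actual confidence algorithm, and the tolerance value is an assumption.

```python
def positions_consistent(hand_wrist, arm_wrist, tolerance_mm=25.0):
    """Hypothetical cross-check: the wrist position implied by the hand
    model and the one implied by the arm model should roughly agree.
    Positions are (x, y, z) tuples in millimeters. Illustrative only --
    not the SDK's real algorithm."""
    dx = hand_wrist[0] - arm_wrist[0]
    dy = hand_wrist[1] - arm_wrist[1]
    dz = hand_wrist[2] - arm_wrist[2]
    distance = (dx * dx + dy * dy + dz * dz) ** 0.5
    # Agreement within tolerance suggests both estimates are trustworthy.
    return distance <= tolerance_mm
```

A tracking pipeline could raise its confidence score when such a check passes, and fall back to the more reliable of the two estimates when it fails.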

Applications

Beyond VR aesthetics, we also created the Forearm API for developers who want to experiment with multi-modal input. Unlike other motion-tracking platforms, the Leap Motion Controller provides highly precise hand interactions at the submillimeter level. By combining arm tracking with full-body tracking technologies, developers could take advantage of gross and fine motor control.

What do you think?

The Forearm API is still a beta feature, and our tracking team would love to know what you think about it. Is this a useful feature for you? What new functions could we add to the API?

Alex Colgan is the senior director of marketing and developer community at Leap Motion. By transforming how we interact with technology, he believes we can make our world feel more human.
