Visual feedback is hugely important for motion control, since users can feel lost or frustrated when they're not sure how their actions affect an application. Virtual hands can make it much easier for users to identify what's happening onscreen. Thanks to the new v2 tracking, we've been able to create persistent rigged hands for LeapJS that reflect how your hands look and behave in the real world.
When combined with auditory and other forms of visual feedback, onscreen hands can create a real sense of physical space, as well as complete the illusion created by VR interfaces like the Oculus Rift. But rigged hands also involve several intriguing challenges.
At Leap Motion, we want to make interaction with technology as seamless and natural as the real world. V2 skeletal tracking, which we released into public developer beta yesterday, was built to track hands far more robustly and to expose the full degrees of freedom of every moving part of the hand.
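As a rough illustration of what "every moving part" means in practice, here is a minimal sketch of walking the skeletal data a LeapJS v2 frame exposes (hands, their fingers, and each finger's bones with start and end joints). The `jointPositions` helper and the mock frame are our own inventions for this example; only the hands → fingers → bones → `prevJoint`/`nextJoint` layout is assumed from the v2 API.

```javascript
// Collect every joint position in a frame: one pair of endpoints per bone.
function jointPositions(frame) {
  var joints = [];
  frame.hands.forEach(function (hand) {
    hand.fingers.forEach(function (finger) {
      finger.bones.forEach(function (bone) {
        // Each bone reports its start and end joints as [x, y, z] vectors.
        joints.push(bone.prevJoint, bone.nextJoint);
      });
    });
  });
  return joints;
}

// A mock frame standing in for real controller data, with one hand,
// one finger, and two bones (shape invented for illustration).
var mockFrame = {
  hands: [{
    fingers: [{
      bones: [
        { prevJoint: [0, 100, 0], nextJoint: [0, 120, 0] },
        { prevJoint: [0, 120, 0], nextJoint: [0, 135, 0] }
      ]
    }]
  }]
};

console.log(jointPositions(mockFrame).length); // 4 endpoints from 2 bones
```

With a real controller you would feed live frames in instead of a mock, along the lines of `Leap.loop(function (frame) { render(jointPositions(frame)); })`, and drive a rigged hand model from those positions.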