Creating new 3D hand assets for your Leap Motion projects can be a real challenge. That’s why, based on your feedback, we’ve massively automated and streamlined the pipeline for connecting 3D models to our Core Assets with Hands Module 2.0 – so what used to take hours now takes a minute or two.
The choice of hand design can fundamentally make or break the user experience. A hyper-realistic render in your trippy space shooter might be an intergalactic buzzkill. Conversely, if your user is playing a general in a WWII bunker, you might want to lean more human than cyborg. Hand Viewer, a brand-new release in our Examples Gallery, gives you an arsenal of onscreen hands to experiment with as you build new desktop experiences with Leap Motion.
Visual feedback is hugely important in motion control, since users can feel lost or frustrated when they’re not sure how their actions affect an application. Virtual hands make it much easier for users to identify what’s happening onscreen. Thanks to the new v2 tracking, we’ve been able to create persistent rigged hands for LeapJS that reflect how your hands look and behave in the real world.
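In the browser, the rigged hand is typically wired up by chaining plugins onto a LeapJS controller. The sketch below shows that chaining pattern; the `Leap` stub at the top is only there so the snippet runs outside a page (in a real app, leap.js and the leapjs-plugins rigged-hand build provide it), and the `opacity` option is an illustrative assumption rather than a guaranteed parameter name.

```javascript
// Stub of the Leap global so this sketch runs outside the browser;
// on a real page, leap.js supplies Leap.loop and the plugin system.
const Leap = {
  loop(_opts) {
    const chain = {
      plugins: [],
      use(name, _options) {
        // record each plugin as it is chained on
        chain.plugins.push(name);
        return chain;
      }
    };
    return chain;
  }
};

// The chaining pattern used with the LeapJS rigged hand:
// handHold and handEntry come from leapjs-plugins, and riggedHand
// draws the onscreen hand (options here are hypothetical).
const controller = Leap.loop({ background: true })
  .use('handHold')
  .use('handEntry')
  .use('riggedHand', { opacity: 0.9 });

console.log(controller.plugins);
```

In a real page the same chain runs against the live controller, and the rigged hand appears as soon as tracking data arrives.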
Leap Motion is a developer-driven platform, and the beauty of open sourcing development resources is that a single library or wrapper, once released to the world, can be integrated and built out by thousands of other people. And since the future of any new platform depends on what people can do with it, over the past few months we’ve released a steady stream of open source assets and examples to help devs get started with our v2 tracking beta.
When combined with auditory and other forms of visual feedback, onscreen hands can create a real sense of physical space, as well as complete the illusion created by VR interfaces like the Oculus Rift. But rigged hands also involve several intriguing challenges.
At Leap Motion, we want to make interaction with technology as seamless and natural as the real world. V2 skeletal tracking, which we released into public developer beta yesterday, was built to bring a new level of robustness to hand tracking, and to expose full degrees of freedom for every moving part of the hand.
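In LeapJS terms, those degrees of freedom surface as per-finger bone data on each frame: every finger carries four bones (metacarpal, proximal, intermediate, distal). The sketch below walks that structure; the field names follow the v2 LeapJS frame shape, but the sample frame is hand-made mock data standing in for live controller output, and `describeHand` is a hypothetical helper for illustration.

```javascript
// Walk the skeletal data on a v2-style LeapJS hand: one entry per
// finger, reporting its numeric type (0 = thumb … 4 = pinky) and
// how many bones it exposes.
function describeHand(hand) {
  return hand.fingers.map(function (finger) {
    return finger.type + ':' + finger.bones.length + ' bones';
  });
}

// Minimal mock frame in the shape LeapJS produces under v2 tracking.
const frame = {
  hands: [{
    type: 'right',
    fingers: [
      { type: 0, bones: [{}, {}, {}, {}] }, // thumb
      { type: 1, bones: [{}, {}, {}, {}] }  // index finger
    ]
  }]
};

console.log(describeHand(frame.hands[0]));
// → [ '0:4 bones', '1:4 bones' ]
```

With a live controller, the same walk runs inside the frame callback, giving you positions and orientations for each bone rather than empty placeholders.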