At Leap Motion, we want to make interaction with technology as seamless and natural as the real world. V2 skeletal tracking, which we released into public developer beta yesterday, was built to bring a new level of robustness to hand tracking and to expose full degrees of freedom for every moving part of the hand.

The Bone API introduces a new way to extract data from tracked hands based on physical hand anatomy. With properties such as joint positions, bone lengths, and individual bone bases, it’s now possible to create onscreen rigged hands that mirror the behavior of real hands. This makes visual feedback more direct and intuitive, especially in physics engines, where hands can appear and interact as physical objects.
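
For example, here’s a minimal sketch (assuming the Leap C++ SDK; the function and its name are ours, purely for illustration) that prints the joints and dimensions of every bone in the index finger:

#include <cstdio>
#include "Leap.h"

void printIndexFingerBones(const Leap::Controller& controller) {
  Leap::Frame frame = controller.frame();
  for (Leap::Hand hand : frame.hands()) {
    for (Leap::Finger finger : hand.fingers()) {
      if (finger.type() != Leap::Finger::TYPE_INDEX) continue;
      for (int b = 0; b < 4; ++b) {
        Leap::Bone bone = finger.bone(static_cast<Leap::Bone::Type>(b));
        std::printf("bone %d: length %.1f mm, width %.1f mm\n",
                    b, bone.length(), bone.width());
        std::printf("  joints: (%.1f, %.1f, %.1f) -> (%.1f, %.1f, %.1f)\n",
                    bone.prevJoint().x, bone.prevJoint().y, bone.prevJoint().z,
                    bone.nextJoint().x, bone.nextJoint().y, bone.nextJoint().z);
      }
    }
  }
}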

Finger bone basics

In the human hand, each finger has four bones, while the thumb has three. For ease of reference, the Bone API gives the thumb a zero-length Metacarpal (which would sit between the first metacarpal bone and the trapezium, for all you anatomy buffs out there), so that every digit has the same four bones.

Bones are ordered from the base of the hand out to the fingertip (indexed 0 to 3), or can be referenced by their anatomical names:

0: TYPE_METACARPAL (within the palm)
1: TYPE_PROXIMAL (the proximal phalanx)
2: TYPE_INTERMEDIATE (the intermediate phalanx)
3: TYPE_DISTAL (the distal phalanx, ending at the fingertip)
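
In code, that means you can fetch a bone either by raw index or by its anatomical type; the two lines below are equivalent:

// Two ways to fetch the fingertip bone of a Leap::Finger.
Leap::Bone distal = finger.bone(Leap::Bone::TYPE_DISTAL);
Leap::Bone same = finger.bone(static_cast<Leap::Bone::Type>(3));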

Developing with the Bone API

The Bone API provides properties and methods that let you easily determine a given bone’s anatomical type, length, width, direction, validity, and more. (Be sure to check out our API documentation in your preferred language; it’s a quick read!) Here’s a short example that draws a cylinder for each bone, written in C++ with Cinder handling the rendering:

// Assumes the Leap C++ SDK and Cinder (which provides gl::drawCylinder).
#include "Leap.h"
#include "cinder/gl/gl.h"

using namespace ci;

#define MATRIX_SIZE 16
#define CYLINDER_RADIUS 4.0f  // display radius; pick whatever looks right

// Draws a cylinder for a bone.
void HelloHands::drawBone(const Leap::Bone& bone) {
  float bone_basis_values[MATRIX_SIZE];
  bone.basis().toArray4x4(bone_basis_values);

  glPushMatrix();
  // Move to the bone's center, then orient along the bone's basis.
  Leap::Vector center = bone.center();
  glTranslatef(center.x, center.y, center.z);
  glMultMatrixf(bone_basis_values);
  // Default rotation is pointing forward in the positive Z direction;
  // rotate so the cylinder runs along the bone instead.
  glRotatef(90, -1, 0, 0);

  gl::drawCylinder(CYLINDER_RADIUS, CYLINDER_RADIUS, bone.length());
  glPopMatrix();
}

void HelloHands::draw() {
  // …
  Leap::Frame frame = controller_.frame();

  // Draw all the bones in the hands!
  for (int h = 0; h < frame.hands().count(); ++h) {
    Leap::Hand hand = frame.hands()[h];

    for (int f = 0; f < hand.fingers().count(); ++f) {
      Leap::Finger finger = hand.fingers()[f];

      // Every digit has exactly four bones (see "Finger bone basics" above).
      for (int b = 0; b < 4; ++b) {
        drawBone(finger.bone(static_cast<Leap::Bone::Type>(b)));
      }
    }
  }
}

What is a rigged hand?

In animation and 3D design, skeletal animation is a technique that pairs a surface representation (the skin, or mesh) with an underlying set of bones (the rig, or skeleton). This makes it possible to create complex animations by controlling the relatively simple movements of the skeleton. With the new robust tracking and Bone API, developers can now rig meshes to hand and finger data to create persistent onscreen hands.
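
Here’s a minimal sketch of what that rigging step can look like, assuming a hypothetical RigJoint type with setPosition()/setRotation(); the conversion is the standard rotation-matrix-to-quaternion routine, with the bone’s basis vectors treated as the columns of the rotation matrix. (One caveat: per the docs, left-hand bases form a left-handed coordinate system, so you may need to flip an axis for left hands.)

#include <cmath>
#include "Leap.h"

struct Quat { float w, x, y, z; };

// Hypothetical stand-in for your engine's joint type.
struct RigJoint {
  void setPosition(float x, float y, float z);
  void setRotation(const Quat& q);
};

// Standard matrix-to-quaternion conversion, reading the bone basis
// vectors as the columns of a 3x3 rotation matrix.
Quat basisToQuat(const Leap::Matrix& m) {
  float r00 = m.xBasis.x, r01 = m.yBasis.x, r02 = m.zBasis.x;
  float r10 = m.xBasis.y, r11 = m.yBasis.y, r12 = m.zBasis.y;
  float r20 = m.xBasis.z, r21 = m.yBasis.z, r22 = m.zBasis.z;
  float trace = r00 + r11 + r22;
  Quat q;
  if (trace > 0.0f) {
    float s = std::sqrt(trace + 1.0f) * 2.0f;  // s = 4w
    q.w = 0.25f * s;
    q.x = (r21 - r12) / s;
    q.y = (r02 - r20) / s;
    q.z = (r10 - r01) / s;
  } else if (r00 > r11 && r00 > r22) {
    float s = std::sqrt(1.0f + r00 - r11 - r22) * 2.0f;  // s = 4x
    q.w = (r21 - r12) / s;
    q.x = 0.25f * s;
    q.y = (r01 + r10) / s;
    q.z = (r02 + r20) / s;
  } else if (r11 > r22) {
    float s = std::sqrt(1.0f + r11 - r00 - r22) * 2.0f;  // s = 4y
    q.w = (r02 - r20) / s;
    q.x = (r01 + r10) / s;
    q.y = 0.25f * s;
    q.z = (r12 + r21) / s;
  } else {
    float s = std::sqrt(1.0f + r22 - r00 - r11) * 2.0f;  // s = 4z
    q.w = (r10 - r01) / s;
    q.x = (r02 + r20) / s;
    q.y = (r12 + r21) / s;
    q.z = 0.25f * s;
  }
  return q;
}

// Drive one joint of a rigged mesh from one tracked bone:
// position at the bone's start joint, orientation from its basis.
void driveJoint(RigJoint& joint, const Leap::Bone& bone) {
  Leap::Vector p = bone.prevJoint();
  joint.setPosition(p.x, p.y, p.z);
  joint.setRotation(basisToQuat(bone.basis()));
}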

Onscreen hand representations can be very simple, like the earlier cylinder example, or more complex, like our example rigged hands for Unity and JavaScript. (You can even make super-realistic meshes, though you’ll want to be careful; it’s easy to fall into the uncanny valley!) Below is the JavaScript rigged hand in action, also showing one of our other new features – distinguishing between left and right hands. You can even try it out yourself, as long as you have the beta installed.
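
On the code side, handedness is a single property check on Leap::Hand (isLeft() and isRight() are part of the v2 beta API):

// Pick the matching mesh for each hand; left_mesh and right_mesh
// are hypothetical rigged meshes loaded elsewhere.
for (Leap::Hand hand : frame.hands()) {
  RiggedHandMesh& mesh = hand.isLeft() ? left_mesh : right_mesh;
  // ... drive the mesh's joints from the hand's bones ...
}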

Visual feedback and physical interactions

When the rigged hand is used in an app, users can reach out and see their hands accurately and intuitively represented in 3D space. This overcomes some significant challenges that we’ve encountered in user testing, because visual feedback is extremely important in natural user interfaces. Without proper feedback, users get frustrated: it’s often not obvious whether they’re interacting incorrectly, or whether their hands have become occluded or moved out of range.
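
One way to soften those failure cases is to make the feedback itself degrade gracefully. A sketch, using the v2 Hand::confidence() property (RiggedHandMesh and setOpacity() are hypothetical stand-ins for your rendering code):

// Fade the onscreen hand as tracking confidence drops, so users can
// see when the device is losing them instead of being left guessing.
void updateHandFeedback(const Leap::Hand& hand, RiggedHandMesh& mesh) {
  float c = hand.confidence();  // 0 = unreliable data, 1 = fully tracked
  mesh.setOpacity(0.2f + 0.8f * c);  // keep a faint ghost rather than vanishing
}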

With physics engines like Unity3D, however, we move beyond visual feedback and into a whole world of possibilities. By treating bones as physical objects in a physics engine, we can stretch, manipulate, grab, stack, and throw 3D objects. These are just a few of the interactions that become fairly straightforward once the bones act as physically interacting objects.
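
Here’s a sketch of what that can look like: each bone is mirrored by a kinematic capsule collider that follows the tracked pose. (PhysicsWorld and KinematicCapsule are hypothetical stand-ins for your engine’s types; kinematic bodies push dynamic objects around without being pushed back.)

// Update one hand's colliders from the current frame.
void syncBoneColliders(const Leap::Hand& hand, PhysicsWorld& world) {
  for (Leap::Finger finger : hand.fingers()) {
    for (int b = 0; b < 4; ++b) {
      Leap::Bone bone = finger.bone(static_cast<Leap::Bone::Type>(b));
      KinematicCapsule& capsule = world.capsuleFor(finger.id(), b);
      capsule.setDimensions(bone.width() * 0.5f, bone.length());
      Leap::Vector c = bone.center();
      capsule.moveTo(c.x, c.y, c.z, bone.basis());  // kinematic target pose
    }
  }
}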


This is where other features of skeletal tracking, like pinch/grab strength and left/right identification, can work together with onscreen hands to create compelling experiences that wouldn’t otherwise be possible: experiences that use Leap Motion technology not to replace the mouse or joystick, but to let you reach into virtual spaces and control them with your bare hands.
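
For instance, pinch detection arrives as a simple scalar, Hand::pinchStrength(), which runs from 0 (open) to 1 (full pinch). A sketch with two thresholds so objects don’t flicker between grabbed and released (GrabbableObject and its helpers are hypothetical):

void updateGrab(const Leap::Hand& hand, GrabbableObject* nearby) {
  if (!nearby) return;
  if (hand.pinchStrength() > 0.8f) {         // firmly pinched: pick it up
    nearby->attachTo(hand);
  } else if (hand.pinchStrength() < 0.4f) {  // clearly released: let it go
    nearby->release();
  }
}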

Next week, we’ll take a closer look at some of the design challenges in building rigged hand interactions, with an eye on the dreaded uncanny valley. (Or as we call them around the bunker, “zombie hands.”) As the skeletal tracking beta continues, we’re hoping to add more new features and build new libraries and examples.

What do you think about the new Bone API? What kinds of games can you imagine being built with onscreen hands?

Kevin is a software engineer who joined Leap Motion when we were a tiny stealth startup over two years ago.