In this post, the second of my three-part series on LeapJS plugins, we’ll take a look at Proximity Alert, an audio feedback plugin which gives beeps based upon your hand position. It can be used in any app, and is fully open-source.

As long as I’ve been programming for the Leap Motion Controller, I’ve wanted to build something which integrates sound as a feedback mechanism. As an input device, the Leap is far more analog than the keyboard – rather than a collection of binary states from keypresses, it provides many dimensions of positional and motion data. It provides more than the paltry X and Y data of the mouse, and more than simply Z data on top of that.

Every hand joint carries information on flexion and tension. The wrist alone carries six dimensions – X, Y, Z, roll, pitch, and yaw – each representing a specific intent of the user. This intent can be just a single signal, such as making a fist or stretching, but it can also be much, much more. Given the right training, we are capable of amazing things, from playing musical instruments to assembling toy blocks.

Rethinking our senses

How we are able to use the Leap Motion device is limited by the feedback we get about how a computer understands our hands. Depending on who you ask, we have between 14 and 20 senses. Some of them, such as proprioception, will almost always be available, whereas others, such as touch, we must largely do without. Visual feedback is certainly the most common, and is getting more and more exciting with the growth of WebGL and other technologies. Smell and taste, while not impossible, are hard to work with, and that leaves sound as something that deserves more attention.

When starting with the Leap Motion Controller, one of the more difficult things is noticing when hands fall out of view and having software respond appropriately. I wanted to build a plugin which would let me know when my hands were becoming hard to detect – such as when leaving the field of view or blocking each other from the camera’s view outright. This is crucial when making apps which allow the manipulation of objects in unbounded 3D space, as it can be easy to leave the view when reaching for something.

Building a proximity alert plugin

I started my Sunday afternoon with a plan to build a warning system for when the hand is near the bounds of the device’s view. If you’ve ever driven one of those vans with ultrasonic sensors in the rear bumper that beep when you’re backing up toward something, you know exactly what I mean.

The Web Audio API is one of the great hidden gems of JavaScript – widespread, sophisticated, and more or less unheard of. There are entire books on the subject, although use cases for it are still rare. As it turns out, the Web Audio API is great for our purpose here – practically any sound can be constructed. In this demo you can start and stop a basic waveform, change the frequency, and set the wave type (from sine to square to sawtooth). This, combined with the idea of beeps, gives us many dimensions that we can adjust in order to give a feeling of proximity (see the sketch after this list):

  • Tone pitch (frequency)
  • Beep length
  • Timing between beeps
  • Wave type
  • Volume
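
Before wiring anything to the Leap, here is a minimal, standalone sketch of those pieces: an oscillator for pitch and wave type, a gain node for volume, and start/stop calls for timing. It assumes a browser that exposes AudioContext (or the webkit-prefixed version used in the plugin snippets below); nothing about it is specific to the plugin itself.

# A standalone beep: oscillator -> gain -> speakers
context = new (window.AudioContext or window.webkitAudioContext)()

gain = context.createGain()
gain.gain.value = 0.5             # volume
gain.connect(context.destination)

oscillator = context.createOscillator()
oscillator.type = 'square'        # wave type: 'sine', 'square', 'sawtooth', or 'triangle'
oscillator.frequency.value = 440  # tone pitch in Hz
oscillator.connect(gain)

oscillator.start(0)                          # begin the beep now
oscillator.stop(context.currentTime + 0.25)  # beep length: stop after 250ms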

I started by scoping out my plugin. The code below returns an object whose hand callback will be called within the Leap loop, once for every hand in the frame. The plugin can be included in any Leap app with a simple call to controller.use('proximityAlert').

Leap.Controller.plugin 'proximityAlert', ->
  context = new webkitAudioContext()

  {
    hand: (hand)->
      # do something in the loop, here
  }
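
For context, this is roughly how an app opts in – a sketch assuming leap.js and the plugin file above are both loaded on the page:

controller = new Leap.Controller()
controller.use('proximityAlert')
controller.connect()

# or, with the looping helper:
# Leap.loop({}, (frame) ->).use('proximityAlert')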

Step 1: Making beeps

A wrapper around the Web Audio API allows a particular tone to be played for a specified amount of time. I started by playing dots in groups of three, and later dropped that in favor of spacing between dots equal to half their duration, as you can see here:

Leap.Controller.plugin 'proximityAlert', ->
  context = new webkitAudioContext()

  # This is a wrapper function around the Web Audio API, taking care of some of the more fiddly bits,
  # such as the fact that oscillators can only be used once.
  oscillate = (freq, duration)->
    oscillator = context.createOscillator()
    oscillator.type = 'sine' # also accepts 'square', 'sawtooth', 'triangle'
    oscillator.connect(context.destination)
    oscillator.frequency.value = freq
    oscillator.start(0)
    oscillator.stop(context.currentTime + duration) if duration
    oscillator

  playingUntil = undefined  # initialize this outside of the playDot function scope.
  # Remember, functions and variables created here in the plugin factory are unique for each controller using the plugin.

  # Plays a tone for a specified amount of time
  playDot = (freq, duration)->
    spacing = duration / 2

    if context.currentTime < playingUntil
      return
    oscillate(freq, duration)
    playingUntil = context.currentTime + duration + spacing

  {
    hand: (hand)->
      playDot(1000, 0.04) # Hz, Seconds
  }
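
As a rough sanity check on the cadence this produces (assuming the hand callback fires roughly every frame, around 60 times per second, which is an assumption rather than something the plugin enforces):

duration = 0.04              # seconds per beep, as passed to playDot above
spacing  = duration / 2      # silence enforced between beeps by playDot
console.log 1 / (duration + spacing) # at most ~16.7 beeps per second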

Step 2: Measuring proximity

That’s great, now we’ve got sound… but it’s constantly beeping while the hand is in view! The next step is to make it only play when we’re actually on the edge. It turns out that this is easy with the help of the Interaction Box, which is scaled automatically based upon the device used. Here is the new code, omitting the previous methods for brevity.

Leap.Controller.plugin 'proximityAlert', ->
  context = new webkitAudioContext()

  # snip ...

  # Takes in a target number and a range
  # Returns how far the target is from the closest end of the range
  # e.g.,
  # distanceFromSegment( 1.5, [0,1]) == 0.5
  # distanceFromSegment(-0.7, [0,1]) == 0.7
  distanceFromSegment = (number, range) ->
    if number > range[1]
      return number - range[1]
    if number < range[0]
      return range[0] - number
    return false # inside the segment

  {
    hand: (hand)->
      return unless iBox = hand.frame.interactionBox

      distance = undefined

      # normalizePoint returns an array of three numbers representing fractional position within the interaction box.
      # e.g., a point in the middle of the box would be [0.5,0.5,0.5], and one above the box could be [0.5,1.2,0.5]
      proximities = iBox.normalizePoint(hand.palmPosition)
      for proximity in proximities
        if (distance = distanceFromSegment(proximity, [0,1]))
          hand.proximity = true # we expose this flag, so that it can be used elsewhere by any application using the plugin

          playContinuous(1000) # continuous-tone counterpart to playDot (defined in the complete source)

          return # for now, only check one proximity at a time.
  }
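
Because the plugin sets hand.proximity, the surrounding app can react to it as well. Here’s a hypothetical example – the element id and the frame listener are made up for illustration, and controller is the same one that called use('proximityAlert'):

# Show a warning element whenever any hand is near the edge of the view.
controller.on 'frame', (frame) ->
  warning = document.getElementById('proximity-warning') # assumed to exist in the page
  nearEdge = frame.hands.some (hand) -> hand.proximity
  warning.style.display = if nearEdge then 'block' else 'none'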

Now that we have hand location and the ability to make noise, we can begin designing the UX. This is the final step, and in some ways the most fun: the end result comes from lots of ideas, experimentation, and ruthless deletion.

The beeps sound different based on the distance from the edge of the box. Inside the box, they are logarithmically spaced, based on some hand-tuned magic numbers (see the complete source). Outside of the box, when the hand is almost out of view, there’s a constant note (which you probably don’t ever want to hear if you work in a hospital).
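
To make that concrete, here is a hypothetical shape of the final mapping. The curve and constants are illustrative only; the plugin’s actual hand-tuned numbers live in the complete source.

FREQ = 1000 # Hz

soundForAxis = (normalized) ->
  # normalized is one component of iBox.normalizePoint(hand.palmPosition):
  # 0..1 is inside the box along that axis, anything outside that range is out of the box.
  if normalized < 0 or normalized > 1
    playContinuous(FREQ) # hand is leaving the field of view: hold a constant note
  else
    edgeDistance = Math.min(normalized, 1 - normalized) # 0 at an edge, 0.5 at the center
    if edgeDistance < 0.25
      # beeps get shorter, and therefore more frequent, as the hand nears an edge
      playDot(FREQ, 0.02 + 0.05 * Math.log(1 + edgeDistance * 20))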

More neat tricks with Web Audio

When I was about to wrap up, I discovered that the Web Audio API, amazingly, supports holophonics – that is, 3D positional sound. Because the API lets you connect modules in sequence, it was very easy to route the oscillator through a “panner” node on its way to the destination. This means that you can actually hear which side of the Leap Motion Controller your hand is on.
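
Here’s a rough sketch of that chain. createPanner and setPosition are real Web Audio API calls; the way hand position is mapped onto the panner is only an illustration.

# Route the tone through a panner node instead of straight to the speakers:
# oscillator -> panner -> destination
panner = context.createPanner()
panner.connect(context.destination)

oscillator = context.createOscillator()
oscillator.frequency.value = 1000
oscillator.connect(panner)
oscillator.start(0)

# e.g., inside the hand callback, pan left/right based on the normalized x
# position (proximities[0]), remapped from 0..1 to -1..1:
# panner.setPosition(proximities[0] * 2 - 1, 0, 0)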

Also, I swapped out my neat-looking but poor-sounding 1000Hz tone for some nice E and F notes, which I found on this nifty table of note frequencies. The difference is remarkable.
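
For reference, a couple of equal-temperament values (A4 = 440Hz) from a table like that one; E5 and F5 here are just examples, not necessarily the exact octave the plugin uses:

E5 = 659.26 # Hz
F5 = 698.46 # Hz
playDot(E5, 0.04)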

In my next post, we’ll take a look at a new plugin for hands. All of this code is open source, and can be used or contributed to at will. There’s a lot that could be built on this – so go wild! Check out the complete source on GitHub and the demo on CodePen, and let me know what you think in the comments below.