I initialize my Node.js server, plug my Leap Motion Controller into the computer’s USB port, and hold my hand in the air. With a hand gesture that looks like left-clicking on a mouse, the rotors on my drone buzz to life.

The drone lifts into the air and hovers there, waiting for me to issue a command. I move my hand forward and the drone edges away from me. With my hand suspended in midair, I flick my wrist and make a circle gesture, driving the drone down and to the right while turning it to face me.

The Emergence of a Drone Industry

We are on the cusp of a personal robotics age. The most optimistic predict that in only another 2–3 years we will all have personal robotic assistants, drone deliveries, and robotic entertainment, giving us new ways to access information and interact with the world around us. I think we won’t see the beginnings of that paradigm shift until sometime in 2018, about 5 years from now.

For those who don’t believe domestic drones will ever be legal: laws passed in 2012 require the FAA to allow commercial drones in domestic airspace by 2015. Just last week, the FAA released a roadmap for drone legalization by 2015 (PDF). In it, the agency sets the stage for legalizing drone use by law enforcement, businesses, universities, and hobbyists. Even if it misses that exact deadline, we are likely about to witness the emergence of a multi-billion-dollar industry seemingly overnight.

Unfortunately, many people associate drones with military operations, and the press has primarily cast them in a negative light. The word “drone” makes most people cringe as they think about the dangers of militarized drones and possible reductions in privacy. However, this way of thinking is akin to fearing computers in the ’70s because black hats might use them to wreak havoc on society.

There are thousands of domestic applications for drones that will enhance our world. Drones will be used in agriculture for targeted weed killing, watering, harvesting, and transportation – resulting in less pesticide use, less water waste, and fresher food. Restaurants and grocery stores will deliver food more quickly. Amazon will deliver packages within hours. Logistical issues like traffic will be monitored in real time. Little League games will be filmed as though they were professional broadcasts, weddings will be recorded from previously impossible angles, and journalists will take pictures and video of previously inaccessible areas.

Other benefits to humanity include search and rescue operations, fire and wildfire control, ecological monitoring, deep ocean surveillance (yes, these are technically drones, despite not flying), medical first responders, medical supply transportation, transporting food and water to impoverished areas, and disaster relief.

Drones can also be used for entertainment. Imagine a stadium filled with spectators watching a game of drone quidditch (think Harry Potter), where the snitch is also a drone (my apologies to anyone who has never read or seen Harry Potter). The drones are controlled by humans whose right arms control movement and whose left arms control a primary mechanism (depending on whether the drone plays defense, offense, etc.).

The future is coming. Can you hack it?

Using Node.js with Leap Motion

Node.js is a runtime built on Google’s V8 JavaScript engine and can serve as the server infrastructure for directing communication between multiple devices and a Leap Motion Controller. Node is asynchronous (non-blocking), so data streams can be sent and received simultaneously without interference. Node also leverages callbacks (functions that run upon success or failure to receive data) to chain instructions, so you can create a series of commands that each run when the previous command completes.
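
As a quick illustration of that callback style, each step fires only after the previous command reports back. The sendCommand helper here is made up purely for illustration and is not part of my project:

// Hypothetical helper, for illustration only: send a command, then call `done`
// once the drone acknowledges it (simulated here with a timeout).
function sendCommand(action, done) {
  console.log('sending', action);
  setTimeout(done, 500);
}

sendCommand('takeoff', function () {
  sendCommand('forward', function () {
    sendCommand('land', function () {
      console.log('sequence complete');
    });
  });
});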

To create my motion-controlled drone, I used Leap.js to translate hand coordinates into drone commands, published instructions to my Node server using Faye (a simple publish-subscribe messaging system), and then issued movement commands to the drone. I use the Leap Motion Controller’s hand-position tracking along the X, Y, and Z axes to control left/right, up/down, and forward/back movement, and Leap Motion gestures to take off, land, and rotate. You can also use gestures to make the drone flip over and do a host of other neat tricks. With Leap Motion’s upcoming software updates that will bolster precision finger control, I will likely refactor my interface to use simple finger motions.
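
Concretely, everything travels over two Faye channels: one for continuous movement (which carries a speed) and one for one-shot commands. The messages in the snippets below look roughly like this; the action names simply have to match methods on the drone client, and the names beyond left, right, and stop are what I’d expect rather than a guarantee:

// The two Faye channels used throughout the snippets below, with message shapes
// inferred from that code (action names must match your drone client's methods):
//   /drone/move  -> { action: 'left' | 'right' | 'up' | 'down' | 'front' | 'back', speed: 0..1 }
//   /drone/drone -> { action: 'takeoff' | 'land' | 'stop' }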

How Does It Work?

To set up your own Leap Motion interface using Node.js, you can check out my repository on GitHub, which uses a Faye client to communicate with an AR Drone 2.0 over Wi-Fi. To get an idea of what your Node configuration will look like, I’ve included a few snippets below.

In your server.js file, add:

var client = new faye.Client("http://localhost:" + (app.get("port")) + "/faye", {});

// Continuous movement commands carry a speed (e.g. left, right)
client.subscribe("/drone/move", function (d) {
  return drone[d.action](d.speed);
});

// One-shot commands take no arguments (e.g. stop, takeoff, land)
client.subscribe("/drone/drone", function (d) {
  return drone[d.action]();
});
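
The snippet above assumes some setup earlier in server.js: an Express app, an HTTP server with a Faye adapter mounted at /faye, and a drone client. Here is a minimal sketch of that wiring, assuming the ar-drone and faye npm packages; the port and static path are placeholders rather than necessarily what my repo uses:

var express = require('express');
var http    = require('http');
var faye    = require('faye');
var arDrone = require('ar-drone');

var app = express();
app.set('port', process.env.PORT || 3000);        // placeholder port
app.use(express.static(__dirname + '/public'));   // serves Leap.js to the browser

var drone  = arDrone.createClient();              // AR Drone client (commands sent over Wi-Fi)
var server = http.createServer(app);

// Mount the Faye pub/sub endpoint that both the browser and this server connect to
var bayeux = new faye.NodeAdapter({ mount: '/faye', timeout: 45 });
bayeux.attach(server);

server.listen(app.get('port'));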

Next, create a Leap.js file in your public folder and add:

var controller = new Leap.Controller({enableGestures: true});
controller.connect();
controller.on('frame', function (data) {
  mainRoutine(data);
});
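
The functions below publish messages over Faye, so this file also needs a browser-side Faye client; the variable name faye matches the publish calls that follow. A minimal sketch (the URL is a placeholder, and the Faye adapter also serves a browser client script from its mount point):

// Browser-side Faye client used by handPos and gestureHandler below
var faye = new Faye.Client('http://localhost:3000/faye'); // adjust host/port for your setup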

Your mainRoutine function is called on every frame sent by the Leap Motion Controller. Inside it, I call two other functions: one checks hand/finger placement, and the other checks whether the user is performing a gesture.

var mainRoutine = function (frame) { // Runs on every frame
    gestureHandler(frame);  // routine for handling takeoff, landing and rotations
    handPos(frame); // all other actions
}

Directional commands are issued on every frame by the handPos function. I’ve only included the commands for left and right below, but you can expand it to cover yPos (up/down) and zPos (forward/back); a sketch of the up/down case follows the snippet.

var handPos = function (frame) {
    var hands = frame.hands; // Leap detects all hands in its field of view
    if (hands.length === 0) { // send the 'stop' command if no hands are in the field
          return faye.publish("/drone/drone", {
            action: 'stop'
          });
    } else if (hands.length > 0) {
      var handOne = hands[0]; // first hand.  Can add and track second hand

      var pos = handOne.palmPosition;  // tracks palm of first hand
       
      var xPos = pos[0]; // position of hand on x axis
      var yPos = pos[1]; // position of hand on y axis
      var zPos = pos[2]; // position of hand on z axis

You’ll need to normalize xPos here so that adjX falls roughly between -0.5 and +0.5; for example, adjX = xPos/250.
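
In code, that adjustment plus a simple speed mapping (using the absolute distance from center, which is my own choice to tune) looks like:

      var adjX = xPos / 250;           // rough normalization; tune the divisor for your setup
      var adjXspeed = Math.abs(adjX);  // speed grows with distance from center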

      if (adjX < 0) { // go left if hand coordinate < 0 (likely need to adjust)
          return faye.publish("/drone/move", {
                  action: 'left',
                  speed: adjXspeed // speed determined by distance from center
          })
      } else if (adjX > 0) {
          return faye.publish("/drone/move", {
                  action: 'right',
                  speed: adjXspeed
          })
      }
    }
}
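
Here is a rough sketch of the up/down case mentioned above, written as a small helper (call it handleYAxis) that you could invoke from handPos with yPos. The 200 mm hover offset, the small deadband, and the 'up'/'down' action names are assumptions to tune for your setup, and because the left/right branch above returns early, you would want to reorganize those returns before combining axes:

var handleYAxis = function (yPos) {
    var adjY = (yPos - 200) / 250;    // Leap's Y axis is height above the controller, in mm
    var adjYspeed = Math.abs(adjY);   // speed grows with distance from the hover height
    if (adjY > 0.05) {
        return faye.publish("/drone/move", { action: 'up', speed: adjYspeed });
    } else if (adjY < -0.05) {
        return faye.publish("/drone/move", { action: 'down', speed: adjYspeed });
    }
}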

The gestureHandler function checks on every frame whether a gesture has been performed. You can expand it to handle rotation, flips, and the other built-in tricks that come with the AR Drone 2.0.

var gestureHandler = function (frame) { // handles rotation, takeoff and landing
    var gestures = frame.gestures; // leap detects that a gesture is being performed
    if (gestures && gestures.length > 0) {
      for( var i = 0; i < gestures.length; i++ ) { 
        var gesture = gestures[i];  // if multiple gestures, loop through each
        if ( gesture.type === 'keyTap' ) { // similar to left-clicking a mouse
          if (flying) { // ensures drone will land while flying & takeoff if dormant
            land();  // defined elsewhere, but similar to faye.publish used above
          } else {
            takeoff();
          }
        }
      }
    }
  }
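
As the comments above suggest, land and takeoff (along with the flying flag) are defined elsewhere in the file and publish on /drone/drone much like the earlier calls. A minimal sketch, with the flag handling simplified and the action names assumed to match the server’s drone client methods:

var flying = false; // tracks whether the drone is currently in the air

var takeoff = function () {
    flying = true;
    return faye.publish("/drone/drone", { action: 'takeoff' });
}

var land = function () {
    flying = false;
    return faye.publish("/drone/drone", { action: 'land' });
}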

And that’s it! With just a little bit of code, you can reach out and control a flying drone with simple hand movements and gestures.

Daniel is a recent graduate of Hack Reactor and was previously an investor at Summit Partners. He is looking for interesting people with whom to collaborate on awesome things and blogs at www.startupdestiny.com.