When the Leap Motion Controller made its rounds at our office a couple of years ago, it's safe to say we were blown away. For me, at least, it felt like something out of the future. I could physically interact with my computer, moving an object on the screen with the motion of my hands. And that was amazing.

Fast-forward two years, and we've found that PubNub has a place in the Internet of Things… a big place. To put it simply, PubNub streams data bidirectionally to control and monitor connected IoT devices. PubNub is the glue that holds any number of connected devices together, making it easy to rapidly build and scale real-time IoT, mobile, and web apps by providing the data stream infrastructure, connections, and key building blocks that developers need for real-time interactivity.

With that in mind, two of our evangelists had the idea to combine the power of Leap Motion with the brains of a Raspberry Pi to create motion-controlled servos. In a nutshell, the application enables a user to control servos using motions from their hands and fingers. Whatever motion their hand makes, the servo mirrors it. And even cooler, because we used PubNub to connect the Leap Motion to the Raspberry Pi, we can control our servos from anywhere on Earth.

[Animated GIF: the servos mirroring hand movements via the Leap Motion and Raspberry Pi]

In this post, we’ll take a general look at how the integration and interactions work. Be sure to check out the full tutorial on our blog, where we show you how to build the entire project from scratch. If you want to check out all the code, it’s available in its entirety in our project GitHub repository and on the Leap Motion Developer Gallery.

[Image: Raspberry Pi, Leap Motion Controller, and servos]

Detecting Motion with Leap Motion

We started by setting up the Leap Motion Controller to detect the exact data we wanted, including the yaw, pitch, and roll of the user's hands. In our tutorial, we walk through how to stream that data (in this case, finger and hand movements) from the Leap Motion to the Raspberry Pi. To mirror the user's hands in real time, the Leap Motion software publishes messages over PubNub 20 times a second with information about each hand and every finger. On the other end, our Raspberry Pi subscribes to the same channel and parses these messages to control the servos and the lights.
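
As a rough sketch of what the publisher side can look like (the actual code lives in the project repository), here's a loop that reads hand orientation with the Leap Motion Python SDK and sends it out with the PubNub Python SDK. The channel name leap2pi, the keys, and the payload fields are illustrative placeholders, not taken from the project.

```python
# Sketch of the Leap Motion publisher: read hand orientation and publish it
# over PubNub roughly 20 times per second. Assumes the Leap Motion Python SDK
# (the Leap module) and the PubNub Python SDK; keys and the "leap2pi" channel
# name are placeholders.
import time

import Leap
from pubnub.pnconfiguration import PNConfiguration
from pubnub.pubnub import PubNub

pnconfig = PNConfiguration()
pnconfig.publish_key = "your-publish-key"
pnconfig.subscribe_key = "your-subscribe-key"
pnconfig.uuid = "leap-publisher"
pubnub = PubNub(pnconfig)

controller = Leap.Controller()

while True:
    frame = controller.frame()
    if not frame.hands.is_empty:
        hand = frame.hands.rightmost
        payload = {
            "yaw": hand.direction.yaw,      # rotation around the vertical axis
            "pitch": hand.direction.pitch,  # tilt forward / backward
            "roll": hand.palm_normal.roll,  # tilt side to side
            "fingers": len(hand.fingers),
        }
        # Fire-and-forget publish to the shared channel.
        pubnub.publish().channel("leap2pi").message(payload).sync()
    time.sleep(0.05)  # ~20 messages per second
```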

Controlling Servos with Raspberry Pi

In the second part of our tutorial, we walk through how to receive the Leap Motion data on the Raspberry Pi: subscribe to the PubNub data channel, parse the incoming JSON, and drive the servos with the new values. The result? Techno magic.
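
Here's a similarly hedged sketch of the receiving side, assuming the PubNub Python SDK and RPi.GPIO for PWM. The pin number, channel name, and the angle-to-duty-cycle mapping are assumptions you'd adapt to your own wiring and servo; the real implementation is in the project repository.

```python
# Sketch of the Raspberry Pi side: subscribe to the same channel, parse the
# orientation values, and map them onto a hobby servo with RPi.GPIO PWM.
import time

import RPi.GPIO as GPIO
from pubnub.callbacks import SubscribeCallback
from pubnub.pnconfiguration import PNConfiguration
from pubnub.pubnub import PubNub

SERVO_PIN = 11  # physical pin carrying the servo signal (assumed wiring)

GPIO.setmode(GPIO.BOARD)
GPIO.setup(SERVO_PIN, GPIO.OUT)
servo = GPIO.PWM(SERVO_PIN, 50)  # standard 50 Hz servo signal
servo.start(7.5)                 # roughly centered

class LeapListener(SubscribeCallback):
    def status(self, pubnub, status):
        pass

    def presence(self, pubnub, presence):
        pass

    def message(self, pubnub, event):
        roll = event.message.get("roll", 0.0)  # radians, roughly -pi/2 to pi/2
        # Convert roll to a 0-180 degree angle, then to a 2.5-12.5% duty cycle.
        angle = max(0.0, min(180.0, 90.0 + roll * 57.3))
        servo.ChangeDutyCycle(2.5 + angle / 18.0)

pnconfig = PNConfiguration()
pnconfig.subscribe_key = "your-subscribe-key"
pnconfig.uuid = "pi-subscriber"
pubnub = PubNub(pnconfig)
pubnub.add_listener(LeapListener())
pubnub.subscribe().channels("leap2pi").execute()

# Keep the process alive while the PubNub listener runs in the background.
while True:
    time.sleep(1)
```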

[Image: Raspberry Pi wired up next to the Leap Motion Controller]

Wrapping Up

We had a ton of fun building this demo, using powerful and affordable technologies to build something really unique. What's even better about this tutorial is that it can be repurposed for any case where you want to detect motion with a Leap Motion Controller, stream that data in real time, and carry out an action on the other end. You can open doors, close window shades, dim lights, or even play musical notes (air guitar, anyone?). We hope to see more Leap Motion, PubNub, and Raspberry Pi projects in the future!

Joe is the Content Marketing Manager at PubNub.
