One of the most powerful things about the Leap Motion platform is its ability to tie into just about any creative platform. That’s why we’ve launched a Platform Integrations & Libraries showcase where you can discover the latest wrappers, plugins, and integrations.
Among developers, interactive designers, and digital artists, Processing is an enormously popular way to build compelling experiences with minimal coding. We’ve seen hundreds of Leap Motion experiments using Processing, from Arduino hacks to outdoor art installations, and the list grows every week.
James Britt, aka Neurogami, is the developer behind the LeapMotionP5 library, which brings together our Java API with the creative power of Processing.
He’s just rolled out a major update to the library, including a new boilerplate example and a demo designed to bridge hand input with musical output. We caught up with James to ask about the library, his latest examples, and how you can get started.
Building a drawing program from scratch
One of the best features of Neurogami’s library is that you can either poll for data or register callbacks, depending on your preference. Polling inside the draw() function is a popular approach for beginners because it’s easy to get started, while more advanced Processing sketches frequently use callback handlers. However you learned to write Processing sketches, this library supports your style.
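To make the two styles concrete, here is a minimal sketch using the official Leap Motion Java API (Controller, Frame, Finger, Listener). How Neurogami’s wrapper exposes the Controller may differ; treat this as an illustration of polling versus callbacks rather than the library’s exact API.

```processing
import com.leapmotion.leap.*;

Controller controller;

void setup() {
  size(640, 480);
  controller = new Controller();
  // Callback style: subclass Listener and react whenever a new frame arrives
  controller.addListener(new FrameListener());
}

void draw() {
  background(0);
  // Polling style: grab the most recent frame once per draw() call
  Frame frame = controller.frame();
  for (Finger finger : frame.fingers()) {
    Vector tip = finger.tipPosition();
    ellipse(width/2 + tip.getX(), height - tip.getY(), 10, 10);
  }
}

class FrameListener extends Listener {
  public void onFrame(Controller c) {
    println("fingers visible: " + c.frame().fingers().count());
  }
}
```

Polling keeps all your logic in draw() at the sketch’s frame rate; callbacks fire as fast as the tracker produces frames, which matters if you want to react to every sample rather than the latest one.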
To help get you started, Neurogami has created a simple drawing program to demonstrate how to reference the Leap Motion library, tell it to look for gestures and finger positions, and use that to draw to the screen.
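A stripped-down version of that idea looks something like the sketch below. It uses the Leap Java API directly; the swipe-to-clear mapping is a hypothetical choice for illustration, not necessarily what Neurogami’s example does.

```processing
import com.leapmotion.leap.*;

Controller controller;

void setup() {
  size(640, 480);
  background(255);
  controller = new Controller();
  // Ask the tracker to report swipe gestures in addition to finger positions
  controller.enableGesture(Gesture.Type.TYPE_SWIPE);
}

void draw() {
  Frame frame = controller.frame();

  // Hypothetical mapping: a swipe gesture clears the canvas
  for (Gesture g : frame.gestures()) {
    if (g.type() == Gesture.Type.TYPE_SWIPE) background(255);
  }

  // Leave a dot wherever the frontmost finger points
  Finger finger = frame.fingers().frontmost();
  if (finger.isValid()) {
    Vector tip = finger.tipPosition();
    // Leap coordinates are in millimeters; scale roughly into the window
    float x = map(tip.getX(), -150, 150, 0, width);
    float y = map(tip.getY(), 50, 350, height, 0);
    fill(0);
    noStroke();
    ellipse(x, y, 8, 8);
  }
}
```

Because the sketch never clears the background in draw(), each frame’s dot stays on screen, so moving a finger through the air leaves a trail.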
OSC plugin for Renoise and beyond
Neurogami’s latest project is an OSC (Open Sound Control) sketch for Renoise and similar music creation programs – one that makes it easy to map Leap Motion input to all kinds of sonic output. His goal, he says, isn’t to create a virtual keyboard, but instead to make it easy for musicians to access the creative potential of hand tracking.
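For a sense of how such a mapping works, here is a sketch of sending a note to Renoise over OSC using the widely used oscP5 library for Processing. Renoise listens on port 8000 by default, and the /renoise/trigger/note_on address and its argument order are taken from Renoise’s OSC documentation; verify both against your Renoise version, and note the hand-height-to-pitch mapping here is an invented example.

```processing
import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress renoise;

void setup() {
  osc = new OscP5(this, 9000);                   // local listening port
  renoise = new NetAddress("127.0.0.1", 8000);   // Renoise's default OSC port
}

// Map a hand height (in mm, from a Leap frame) to a MIDI note and send it
void triggerNote(float handY) {
  int note = int(map(handY, 100, 400, 48, 72));  // roughly C3..C5
  OscMessage msg = new OscMessage("/renoise/trigger/note_on");
  msg.add(1);     // instrument
  msg.add(1);     // track
  msg.add(note);  // pitch
  msg.add(100);   // velocity
  osc.send(msg, renoise);
}
```

The same pattern works for any OSC-aware program – only the address and arguments change – which is what makes OSC a convenient bridge between hand tracking and sound.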
“The interest for me lies in how we can use the space we’re given, and what that means,” he says. “Playing the bass on piano is different from playing the bass on a bass guitar. I create music differently when I’m using a keyboard versus a monome. It all depends on how you and your body use the space.”
Seasoned musicians know that latency is an essential concern in live performance. Since Processing typically runs at around 60 frames per second, each frame adds roughly 16 milliseconds on top of the tracking and audio chain, so Neurogami points out that the overall latency is probably too high for hitting precise notes. “However, it’s great for triggering different cues or mixing in new beats.”
As for the future of the library, his philosophy is to see how people use it first. “Trying to anticipate how people might want to use something is a problem with a lot of open source projects. Ideally, someone will use my project and want to fork it or submit a patch to add additional functionality. I could spend a lot of time trying to add new features that no one wants. That becomes a lot more code to support, and ultimately I want to pursue what interests me and what I’ll actually use. If someone has a feature request, they need to make a case for it.”
Do you have a Processing project in the works, or are you looking to start one? Let us know about it in the comments, or get inspired with previously featured Processing projects.