This week, we’ve explored the creative process that went into building my Leap Motion guitar app, along with some thoughts about mapping and visual feedback. My app was built quickly as a proof of concept, making use of several existing free frameworks, and could certainly be improved. In the last of my guitar app series, we’ll take a look at how I brought these frameworks together, along with one final variation – a theremin-like synthesizer.
To control the Leap Motion sensor, I used its official JavaScript API, Leap.js. For sending MIDI messages, I used MIDIBridge, while the menu with blue sliders on the side was generated with DAT GUI. The buttons were created with Bootstrap. All of these frameworks are powerful and very flexible, but if keeping response time to a minimum is a priority, a more tailored solution would perhaps be ideal. In the performances I filmed, I also used LoopBe MIDI as a virtual MIDI driver to send MIDI messages between programs running on the same PC.
There are three processes running independently in this app, each looping at a different rate:
| Function | Purpose | Rate |
| --- | --- | --- |
| controller.loop() | sense the guitar's position | the default speed of the Leap.js framework |
| MIDILoop() | send MIDI messages to a port | every 50 milliseconds (20 updates per second) |
| drawScreen() | draw visual feedback on canvas | every 16.66 milliseconds (60 frames per second) |
In musical interaction, response time must be as close to real time as possible. Because this is critical, I tried to make each process iterate at its own optimal time scale – but no more often than that – to avoid wasting processing power.
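Here's a minimal sketch of how the two timed loops could be scheduled, assuming MIDILoop and drawScreen are defined elsewhere; controller.loop() needs no scheduling of its own, since Leap.js drives it internally:

// Scheduling sketch (hypothetical) – MIDILoop and drawScreen are assumed to exist.
setInterval(MIDILoop, 50);          // 20 MIDI updates per second
setInterval(drawScreen, 1000 / 60); // roughly 16.66 ms, i.e. 60 frames per second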
Inside the functions
controller.loop()
It turns out that certain guitar shapes (like the one I use in the video) are tracked quite well by the Leap Motion sensor. Other guitars with a more traditional split head (like acoustic guitars) are also tracked, but without the same reliability – possibly a new point to take into account in the timeless Fender vs. Gibson dispute! Thanks to this, all I had to do was place my guitar over the sensor’s field of view. I actually discovered this by just curiously fooling around, and was astonished by how well it worked.
Getting the guitar’s position is extremely easy. I use the handy cursorPosition value that Leap.js adds to each frame, and store the X, Y and Z coordinates of the 3D cursor in a custom object on every iteration of the controller.loop function. This object is then accessed by the other two looping functions (MIDILoop and drawScreen) at their own pace.
controller.addStep(new Leap.UI.Cursor());
controller.loop(function(frame, done) {
  if (frame.cursorPosition) {
    // Map the 3D cursor into a 300 x 150 region and store it for the other loops
    var leapPosition = region.mapToXY(frame.cursorPosition, 300, 150);
    leapvars.leapX = leapPosition[0];
    leapvars.leapY = leapPosition[1];
    leapvars.leapZ = leapPosition[2] * -1;
  }
});
MIDILoop()
Sending MIDI data is also very easy, and using the MIDIBridge library it only takes two commands:
midiMessage = midiAccess.createMIDIMessage(midiBridge.CONTROL_CHANGE, 0, noteNumber, velocity, sequencePosition);
output.sendMIDIMessage(midiMessage);
I use different MIDI notes (noteNumber) as controllers for distinct parameters. The “velocity” of each noteNumber carries the value for its corresponding parameter, since Control Change messages are the standard way of modulating parameters over MIDI. For example, if I define that MIDI note 0 determines volume and I want to boost the volume to its maximum, I can send a MIDI message with noteNumber = 0 and velocity = 127.
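In code, that example could look something like this (the channel and the final sequence-position argument are left at 0 here purely for illustration):

// Hypothetical example: controller 0 is assumed to be mapped to volume
var maxVolume = midiAccess.createMIDIMessage(midiBridge.CONTROL_CHANGE, 0, 0, 127, 0);
output.sendMIDIMessage(maxVolume);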
The MIDILoop function essentially runs the above code several times, once per tracked parameter, making some adjustments to match what’s been set through the GUI.
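My actual implementation reads its assignments and ranges from the DAT GUI menu, but a stripped-down sketch of the idea might look like this – the controller numbers and coordinate ranges below are placeholders, and leapvars, midiAccess, midiBridge and output are the objects set up earlier:

// Hypothetical parameter table – in the real app, assignments and ranges come from the GUI.
var ccMap = [
  { note: 0, read: function () { return leapvars.leapX / 300; } },  // X axis, assuming a 0-300 range
  { note: 1, read: function () { return leapvars.leapY / 150; } },  // Y axis, assuming a 0-150 range
  { note: 2, read: function () { return leapvars.leapZ / 150; } }   // Z axis, placeholder range
];

function MIDILoop() {
  for (var i = 0; i < ccMap.length; i++) {
    // Scale each coordinate into the 0-127 range used by MIDI Control Change values
    var velocity = Math.max(0, Math.min(127, Math.round(ccMap[i].read() * 127)));
    var midiMessage = midiAccess.createMIDIMessage(
      midiBridge.CONTROL_CHANGE, 0, ccMap[i].note, velocity, 0);
    output.sendMIDIMessage(midiMessage);
  }
}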
drawScreen()
All this function does is update a canvas object by drawing squares and circles whose size, position, and color are driven by the custom object updated by controller.loop().
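A minimal sketch of that idea, assuming a canvas element with the id "feedback" and the leapvars object filled in by controller.loop() – the shapes, colors and scaling in the actual app are more elaborate:

var canvas = document.getElementById('feedback');  // assumed <canvas id="feedback"> element
var ctx = canvas.getContext('2d');

function drawScreen() {
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  // A circle that follows the cursor on X/Y and grows with depth (Z)
  var radius = 10 + Math.abs(leapvars.leapZ) / 5;
  ctx.beginPath();
  ctx.arc(leapvars.leapX, leapvars.leapY, radius, 0, 2 * Math.PI);
  ctx.fillStyle = 'rgba(0, 150, 255, 0.6)';
  ctx.fill();
}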
Building a drumstick theremin
In the video at the top of this post, I made yet another variation of my app that goes in an entirely different direction. The Leap Motion sensor captures the position of a drumstick, which my app converts into MIDI messages to control a theremin-like synthesizer. To be honest, I’m not a particularly proficient one-handed drummer – or theremin player, for that matter. Playing both instruments at the same time is quite a challenge, but I was still able to discover several exciting and usable ways of combining them.
In this app, the X axis is mapped to the pitch of the synthesizer, the Z axis is mapped to its volume, and the Y axis to the depth of an added LFO oscillation. I configured one of the drum pads to act as a switch – when hit, it turns on a harsh distortion for the synthesizer. More interesting things could be done with a theremin and a Leap Motion Controller by using two hands and multiple fingers, but I wanted something that could be effortlessly and casually used with one drumstick while simultaneously playing the drums.
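The mapping itself could look something like the sketch below, reusing the MIDIBridge objects from the guitar app – the controller numbers and coordinate ranges are placeholders, and the distortion switch is triggered by the drum pad’s own MIDI note rather than by the sensor:

// Placeholder controller numbers – the real assignments depend on how the synth is set up.
var PITCH_CC = 10, VOLUME_CC = 11, LFO_DEPTH_CC = 12;

function sendCC(cc, value) {
  var clamped = Math.max(0, Math.min(127, Math.round(value)));
  output.sendMIDIMessage(
    midiAccess.createMIDIMessage(midiBridge.CONTROL_CHANGE, 0, cc, clamped, 0));
}

function thereminLoop() {
  sendCC(PITCH_CC,     leapvars.leapX / 300 * 127);  // X axis -> pitch
  sendCC(VOLUME_CC,    leapvars.leapZ / 150 * 127);  // Z axis -> volume
  sendCC(LFO_DEPTH_CC, leapvars.leapY / 150 * 127);  // Y axis -> LFO depth
}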
I also experimented with adding effects to the synth when showing two or more fingers to the sensor (while still grasping the stick with the rest of my hand). Unfortunately, this didn’t work as reliably as I thought it would, as it was hard to avoid false positives and the stick would sometimes hide one of my extended fingers. In another experiment, I mapped the synth’s volume to the stick’s movement, so that it was silent while the stick was still and could be played by swinging the stick in the air.
In theory, it would feel like a more physical instrument, as you would produce sound through the energy of your movements. This idea worked OK, but there was a tiny lag between action and sound, and it confused my rhythm on the drums too much. Perhaps some careful tweaking could make these features usable, but I couldn’t find the right balance myself.
What will you create?
As you can imagine, the simple ideas explored in my apps can also be applied to many other musical instruments and in many other ways. There are a lot more possibilities that I still haven’t looked into, and that you could investigate on your own. Here are some other ideas I invite you to explore:
- Combine both movement tracking and position tracking in one mapping.
- Track the tilt angles of the guitar’s neck around all three axes.
- Track the rotational movement of the guitar’s neck around all three axes.
- Measure movement in independent directions, mapping different parameters to each. (This would condition your interaction, as you can only move so much in one direction before having to move back, but sometimes the paths we’re forced to take because of imposed limitations can result in very original creations.)
- Measure movement acceleration, rather than speed (a rough sketch of estimating both follows this list).
- Perform elaborate hand gestures with a free hand. You could try this out while playing the drums, tapping the guitar with a single hand, or while playing a one-handed wind instrument like a harmonica or a trumpet – or how about just using your own voice and keeping both hands free to use over the sensor!
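As promised above, here is a rough sketch of how speed and acceleration along each axis could be estimated from successive cursor positions – prev, dt and updateMotion are hypothetical names, and dt (the time between frames, in seconds) could also be derived from the frame timestamps that Leap.js provides:

// Hypothetical helper: derives per-axis speed and acceleration from successive positions.
var prev = { x: 0, y: 0, vx: 0, vy: 0 };

function updateMotion(x, y, dt) {
  var vx = (x - prev.x) / dt;    // speed along X
  var vy = (y - prev.y) / dt;    // speed along Y
  var ax = (vx - prev.vx) / dt;  // acceleration along X
  var ay = (vy - prev.vy) / dt;  // acceleration along Y
  prev = { x: x, y: y, vx: vx, vy: vy };
  return { vx: vx, vy: vy, ax: ax, ay: ay };
}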
This is fertile new ground that has only just started being explored. I hope I’ve been able to pass some of my enthusiasm on to you, and that you’ll soon start poking around yourself. I’d love to hear back from you! If you decide to explore any of these ideas, or other musical possibilities, let me know in the comments below or on my Leap Motion guitar forum thread.