There’s a space between an electronic musical instrument and the sound it creates where almost anything is possible. With the right technologies, you can tweak, distort, and transform your performative input to create whole new soundscapes on the other side. In this three-part series, I’d like to talk about the amazing possibilities of using the Leap Motion Controller for musical expression, and show how musicians can add new dimensions to their sound.

But don’t worry, you don’t need to know how to program. All you need is some basic familiarity with MIDI and live audio processing software. There are a few existing tools you can simply download from the Airspace Store, such as Geco MIDI and AeroMIDI, that can get the job done, along with a couple of interesting patches in Usine. However, if you want something more tailored to your needs, some fairly basic JavaScript skills are enough to build your own app.

In my case, none of the existing software matched exactly the kind of interaction and response time that I had in mind, so I went ahead and built my own lightweight JavaScript app (and a few variations of it too). My app is available for anyone to use or even build upon. Today, let’s take a look at the design process and what inspired me to make it.

What is the musical interface gap?

[Image: K1 acoustic guitar]

We can think of any instrument as having an input and an output – an interface for a musician to interact with, and a sound generator that forms the music people hear. In the case of an acoustic guitar like the one above, the strings themselves are both the interface and the sound generator. This means that the mapping between input and output is dictated by the laws of physics – it’s deterministic, richly packed with subtleties at many levels, yet predictable and immutable. The same is true for all acoustic instruments.

[Image: Gibson Les Paul ’54 Custom]

With an electric guitar, the interface and the sound generator are now two separate things. Although the strings on the guitar still generate sound, what people actually hear is the vibration of the amplifier’s speaker, not the strings themselves. We can choose to make the mapping between these two things direct (replicating physics) or we can get creative and alter the sound in any way we like. Here’s where things start to get interesting.

We can choose to make direct mappings to replicate physics, or we can get creative.

There’s nothing new about distorting the sound of a guitar. All the way back in the ‘50s, guitarists discovered by accident that if their amplifiers were damaged, or if they cranked the volume beyond the levels their gear was designed for, interesting new warm, fuzzy sounds would emerge. Later on, equipment was built specifically to alter a guitar’s sound in many different ways, and to some this became a new form of musical alchemy.

A different flavor of sound is enough to express different sensations, but it’s even more interesting to explore the changes in effect parameters over time. Modulating these parameters can be a whole new means of expression – matching accentuations or mood shifts with changes in textures and shades of sound.

Augmenting musical instruments

When we see a guitarist playing an electric guitar, we consider the instrument to be just the guitar in their hands, not the pedals at their feet or the amplifiers behind them. These merely add effects, which may have a strong influence on the sound we hear, but are not being “played” (or interacted with) in order to produce that sound. Occasionally stepping on a button to switch an effect on or off hardly seems to count. Nor are the knobs on these pedals suited to this kind of interaction; they are rarely turned while actively shaping the sound, since playing the instrument already occupies both hands.

The wah-wah pedal was a simple but elegant solution to this limitation, an approach later adopted by some digital pedals to alter a wide range of other parameters like delay time or volume. In this case, we can say there is a skill to using it; it doesn’t sound far-fetched to say that it becomes part of the instrument being played. While the wah-wah pedal is limited to a single degree of freedom, we can take things a step further with the Leap Motion Controller, which offers several distinct yet interlaced degrees of freedom to work with. (I can think of as many as eight, but maybe you can think up even more!)
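To make that concrete, here’s a minimal sketch (assuming the leapjs npm package and a v2-era SDK, so grab and pinch strength are available) that reads eight independent values from a single tracked hand. It only logs them, but each one could drive its own effect parameter:

```javascript
// Minimal sketch: eight degrees of freedom readable from one hand with leapjs.
// Assumes the 'leapjs' npm package and a v2-era SDK (for grab/pinch strength).
var Leap = require('leapjs');

Leap.loop(function (frame) {
  var hand = frame.hands[0];
  if (!hand) return;

  console.log({
    x: hand.palmPosition[0],      // left/right, in mm
    y: hand.palmPosition[1],      // height above the sensor, in mm
    z: hand.palmPosition[2],      // toward/away from you, in mm
    pitch: hand.pitch(),          // tilt forward/back, in radians
    roll: hand.roll(),            // tilt left/right, in radians
    yaw: hand.yaw(),              // turn around the vertical axis, in radians
    grab: hand.grabStrength,      // 0 (open hand) to 1 (fist)
    pinch: hand.pinchStrength     // 0 to 1, thumb against a fingertip
  });
});
```

And that’s without even counting individual fingertips or a second hand.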

If, by interacting with a Leap Motion device, we can meaningfully and expressively manipulate sound as part of a performance, then we can consider it to be an extension or even a part of the musical instrument we’re playing – one that could even be mastered with as much dedication as the musical instrument itself.

Tweaking a guitar with Leap Motion and MIDI

In the video at the top of this post, I’m using a simple app I coded in JavaScript. It measures the distance the guitar moves between each frame, regardless of its current location or direction, and sends this value as a MIDI message in real time. From there, FL Studio picks up the MIDI messages and modulates the sound accordingly, applying the effects to my guitar in real time. In this case, I map the amount of guitar movement to the level of distortion, delay, and high-end equalization.
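For the curious, here’s a minimal sketch of that kind of mapping (not the exact code from my app), assuming the leapjs and midi (node-midi) npm packages. It measures how far the scene the sensor tracks has translated between consecutive frames, scales that magnitude into a 0–127 value, and sends it out as a MIDI control change for the DAW to pick up:

```javascript
// Minimal sketch: turn frame-to-frame movement into a MIDI control change.
// Assumes the 'leapjs' and 'midi' (node-midi) npm packages; the port index,
// CC number, and scaling factor are placeholders to tune for your own setup.
var Leap = require('leapjs');
var midi = require('midi');

var output = new midi.Output();
output.openPort(0);            // first available MIDI output port

var CC_NUMBER = 1;             // arbitrary controller number; map it in your DAW
var SCALE = 4;                 // mm of movement per frame -> CC units (tune by ear)

var lastFrame = null;
var lastValue = -1;

Leap.loop(function (frame) {
  if (lastFrame && lastFrame.valid && frame.valid) {
    // Overall translation of whatever the sensor tracks since the last frame, in mm.
    var t = frame.translation(lastFrame);
    var distance = Math.sqrt(t[0] * t[0] + t[1] * t[1] + t[2] * t[2]);

    // Clamp to the 0-127 range of a MIDI control change value.
    var value = Math.min(127, Math.round(distance * SCALE));

    if (value !== lastValue) {
      output.sendMessage([0xB0, CC_NUMBER, value]); // CC on MIDI channel 1
      lastValue = value;
    }
  }
  lastFrame = frame;
});
```

On the receiving end, FL Studio – or any DAW that accepts MIDI control changes – can link that controller number to a distortion amount, a delay mix, or an EQ band, exactly as it would a hardware knob.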

To me, it feels quite natural to stay relatively still while playing softly, and to move and shake the guitar a bit while adding more intensity to a musical phrase. Through this mapping, I can stress notes and passages almost without having to pay attention to the sensor, since I’m already naturally inclined to move in this way. The dynamic qualities of my movements are also translated into the subtleties of my sound. I’m not just switching distortion on or off, but rather surfing up and down through all of the shades in between. Every instant of my performance is imbued with the intensity that I desire. It’s an amazing experience for a musician.

Why not edit in post-production?

You might be thinking that – although this is an innovation that helps when playing live – the same effects can already be achieved by editing a recording in post-production. After all, values can be set with a lot more precision through the meticulous editing of an audio track.

But this is only partly true. It’s clear that the MIDI messages sent by my app merely turn knobs that were already present in the sound processing software I use, but there’s more to it than that. Imagine that you could take the tension level of every one of the muscles needed to throw a ball, and abstract each into a separate knob. While you could potentially attain much greater precision, would you really be able to throw the ball at all?

The dance of embodied and feedback-driven action is more suited for artistic expression than the carefully pondered tweaking of symbolic representations petrified into a recording.

No. The knowledge you need to throw the ball is not an abstract set of variables you think of numerically; it’s something you perform. It’s engraved in your muscle memory, and you embody it through action. Certain movements of your hand and arm emerge only through the complex, coordinated control of several muscles with a specific sequence and timing. In spite of this overwhelming complexity, throwing a ball just comes naturally to us.

My project is an attempt to make this kind of corporeal knowledge come to life through sound. I believe that the dance of imprecise but embodied and feedback-driven action is more suited for artistic expression than the carefully pondered tweaking of symbolic representations that we carry out when editing a sound that has already been petrified into a recording.

A question about the future of music

Tune in later this week to the upcoming second and third parts of this series! There you’ll see videos of some more elaborate mappings that use variations of this app to independently alter multiple parameters in a more interesting way. In the meantime, I’ll leave you with a question. With the evolution of virtual instruments and the possibilities of the input/output gap, what will a cutting-edge live musical performance look like in 2024? Let me know on my Leap Motion guitar forum thread.

Photo credits: Wikimedia Commons, Fernando García via Guitarpop and StefanieMM

Nicolás Earnshaw believes that music and technology make a great couple. His experiments with innovative musical interfaces include his Master’s thesis and an academic publication together with Reactable inventor Sergi Jordà. Check out more of his work on his portfolio page at nearnshaw.github.io.