Today’s software instruments sound incredibly lifelike. Unfortunately, our interactions with these “electronic sounds” are often limited by computer interfaces. Consider a real instrument such as a cello, where physical contact with the string translates into tangible changes in sound. Gently vibrate your finger, and you can “feel” and “hear” the vibrato. With electronic sounds, these tactile connections are often lost in translation through mouse clicks, a computer screen, keyboard presses, knobs, sliders, and (at best) a multi-touch interface.

With my latest project, I’ve been exploring how the Leap Motion Controller can make our interactions with electronic sounds feel more “direct” and intuitive. In the video above, you can see how I use Max/MSP and a Max object created by Masayuki Akamatsu to translate data from the Leap Motion Controller into MIDI. Later, I’ll demonstrate a granular synthesizer I created that uses natural gestures to produce organic-sounding musical textures.


A small snippet of the Max/MSP patch I wrote for the Leap Motion Controller, which is available (along with my Ableton project file) for download here. To work, it requires Akamatsu’s aka.leapmotion, Max/MSP, Ableton Live, and the Massive synthesizer.

With this project, vertical motion controls the LFO rate of a dubstep bass that I designed in Massive, while horizontal motion selects pre-programmed notes. Using the Leap Motion Controller, I’m able to select the notes and change the sound of the synthesizer simultaneously – using just my right hand.
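The actual mapping lives in the Max/MSP patch, but for readers who think in code, here is a rough C++ sketch of the idea. The palm coordinates (assumed normalized to [0, 1]), the note table, the CC number, and the two MIDI send functions are all hypothetical placeholders I made up for illustration, not parts of any real API:

#include <algorithm>
#include <cstdio>

// Stand-ins for a real MIDI output layer (in the actual project, the data
// flows from Max/MSP into Ableton Live); these just print what they'd send.
void sendControlChange(int cc, int value) { std::printf("CC %d -> %d\n", cc, value); }
void sendNoteOn(int note, int velocity)   { std::printf("note %d vel %d\n", note, velocity); }

// Hypothetical table standing in for the pre-programmed notes.
const int NOTES[] = {36, 38, 41, 43, 46};
const int NUM_NOTES = sizeof(NOTES) / sizeof(NOTES[0]);

// Called once per tracking frame with palm coordinates normalized to [0, 1].
void onHandFrame(float palmX, float palmY) {
    palmX = std::min(std::max(palmX, 0.0f), 1.0f);
    palmY = std::min(std::max(palmY, 0.0f), 1.0f);

    // Vertical position -> a MIDI CC mapped to the LFO rate in Massive.
    sendControlChange(1, static_cast<int>(palmY * 127.0f));

    // Horizontal position -> one of the pre-programmed notes.
    int idx = std::min(static_cast<int>(palmX * NUM_NOTES), NUM_NOTES - 1);
    sendNoteOn(NOTES[idx], 100);
}

The point is simply that a single hand supplies two independent control streams on every frame.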

This goes beyond simply modulating an effect over a pre-recorded musical segment or song. Instead, I’m actively using motion control to create and shape sound in the same way I would when playing a real instrument. For fun, I also sent the data into an Arduino to control some RGB LEDs in sync with my hand motions. Here’s the code:

// Pins driving the RGB LED (all PWM-capable) and the on-board status LED.
const int redPin = 11;
const int greenPin = 10;
const int bluePin = 9;
const int led = 13;

void setup() {
  Serial.begin(9600); // must match the baud rate used by the sender
  pinMode(redPin, OUTPUT);
  pinMode(greenPin, OUTPUT);
  pinMode(bluePin, OUTPUT);
  pinMode(led, OUTPUT);
}

void loop() {
  while (Serial.available() > 0) {
    // Read the next three integers in the stream: the R, G, and B values.
    int red = Serial.parseInt();
    int green = Serial.parseInt();
    int blue = Serial.parseInt();

    // Only update the LEDs once the newline ending the message arrives.
    if (Serial.read() == '\n') {
      // Clamp each value to the 8-bit PWM range.
      red = constrain(red, 0, 255);
      green = constrain(green, 0, 255);
      blue = constrain(blue, 0, 255);

      analogWrite(redPin, red);
      analogWrite(greenPin, green);
      analogWrite(bluePin, blue);
      digitalWrite(led, HIGH); // indicate that a message was received

      // Echo the parsed values back for debugging.
      Serial.print(red);
      Serial.print(',');
      Serial.print(green);
      Serial.print(',');
      Serial.println(blue);
    }
  }
}
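On the sending side, each message is just three numbers followed by a newline (for example, 255,0,64 and then a line break). Serial.parseInt() skips any non-numeric separator, so commas or spaces both work, and the newline check ensures the LEDs only update once a complete triple has arrived.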

Next, I developed the idea of “gestural sound creation” by leveraging Leap Motion’s additional degrees of freedom. I wanted to use finger detection to control sounds in addition to hand movement. That led to the design of a motion-augmented granular synthesizer in Max/MSP.

Natural control of granular synthesis

What is granular synthesis? Put simply, it’s a type of sound synthesis where short fragments of a sample are extracted and then sequenced together to create new textures. Each fragment is called a “grain,” and grains are often triggered by a periodic signal.
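To make “grain” concrete, here is a minimal C++ sketch of the extraction step. The mono float buffer and the Hann window are my own illustrative assumptions; the real synthesizer is a Max/MSP patch:

#include <cmath>
#include <cstddef>
#include <vector>

// Copy one windowed "grain" out of a source sample. The Hann window fades
// the fragment in and out so grains don't click when sequenced together.
std::vector<float> extractGrain(const std::vector<float>& sample,
                                std::size_t start, std::size_t length) {
    std::vector<float> grain(length, 0.0f);
    if (length < 2) return grain;
    const double PI = 3.14159265358979323846;
    for (std::size_t i = 0; i < length && start + i < sample.size(); ++i) {
        double window = 0.5 * (1.0 - std::cos(2.0 * PI * i / (length - 1)));
        grain[i] = sample[start + i] * static_cast<float>(window);
    }
    return grain;
}

Sequencing many such grains with slightly shifted start points is what produces the characteristic granular texture.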

Granular synthesis is an important tool in modern signal processing and sound design. It creates unique effects that can be woven into broader tracks, or even used to build fully fledged works of their own. I’ve been working on my projects with Professor Konrad Kaczmarek at Yale University, and pieces from his “Zone A” project demonstrate some of the interesting sonic textures that granular synthesis can create.

Instead of using periodic intervals to trigger grains, however, we thought about using finger gestures. I programmed the Leap Motion Controller to detect when a finger is pressed down (like playing piano in the air) and to trigger a corresponding grain. This creates a much more “organic”-sounding texture than one generated by a regular granular synthesizer, and it gives the user a more “tactile” interaction with the process of sound creation.
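As a rough sketch of that trigger logic, assume the tracking data delivers a fingertip height (in millimeters above the controller) every frame; the threshold values here are made up for illustration:

// Detect a piano-like "press" from a stream of fingertip heights. A grain
// fires when the finger dips below PRESS_HEIGHT, and the finger must rise
// back above RELEASE_HEIGHT before it can fire again (simple hysteresis
// so one press can't double-trigger).
const float PRESS_HEIGHT = 150.0f;   // illustrative thresholds, in mm
const float RELEASE_HEIGHT = 170.0f;

struct FingerState { bool down = false; };

bool updateFinger(FingerState& f, float tipY) {
    if (!f.down && tipY < PRESS_HEIGHT) {
        f.down = true;
        return true;             // trigger a grain on this frame
    }
    if (f.down && tipY > RELEASE_HEIGHT) {
        f.down = false;          // re-armed for the next press
    }
    return false;
}

Run once per finger, logic like this would let each finger trigger grains independently, piano-style.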

With this setup, horizontal motion scrubs through the sample, while vertical motion controls grain length (sketched in code below). All of these parameters can be controlled simultaneously using just one hand! This frees my left hand to control other sonic aspects, such as the direction of the sound from a 6-channel hemispherical speaker I built. In both videos, controlling this many parameters at once with traditional knobs and faders would be very difficult, but the Leap Motion Controller lets me do it effortlessly and intuitively. I’ve also found that it makes electronic music creation more accessible.
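For concreteness, those two mappings might look like this, again assuming palm coordinates already normalized to [0, 1]; the 20–500 ms grain-length bounds are my own illustrative choice, not the patch’s actual values:

#include <cstddef>

// Horizontal position scrubs the grain start point through the sample.
std::size_t grainStart(float palmX, std::size_t sampleLength, std::size_t grainLen) {
    std::size_t maxStart = (sampleLength > grainLen) ? sampleLength - grainLen : 0;
    return static_cast<std::size_t>(palmX * maxStart);
}

// Vertical position sets the grain length, here between 20 and 500 ms.
std::size_t grainLength(float palmY, float sampleRate) {
    float ms = 20.0f + palmY * (500.0f - 20.0f);
    return static_cast<std::size_t>(ms * sampleRate / 1000.0f);
}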

Teaching my friends how to make a synth sound from scratch – using Ableton Live, NI Massive, and Max/MSP – would take a lot of training. But somehow, everyone who has tried my Leap Motion dubstep project has been able to “wub wub” within a few minutes of trying the setup and has come away smiling and laughing from the experience! When we apply motion-control technology to specific aspects of the music creation process, we can fundamentally change what’s possible when we interact with and create electronic sounds.

Lamtharn Hantrakul is currently a junior at Yale University, double majoring in Applied Physics and Music. To learn more about his work, check out lh-hantrakul.com.