Leap Motion soloist? It’s not as strange as it might sound at first. At a recent performance of the Berklee Symphony Orchestra, Muse co-creator Dr. Richard Boulanger played alongside classical horns and strings – in a composition specially written for his virtual musical instrument.

Available for Mac and Windows on the Leap Motion App Store, Muse is the brainchild of Boulanger’s friend and long-time collaborator BT, a Grammy-nominated composer who wanted to build tools that could match his imagination. We asked Dr. Boulanger about the Muse project and what it was like to bridge the digital and analog worlds of music with Symphonic Muse. We’ve also included some really cool videos from Dr. Boulanger’s students, who often develop with the Leap Motion Controller for their thesis projects.

When did you start experimenting with alternate musical interfaces?

We often think of the computer as an “appliance” or a “virtual recording studio” or a “versatile and mutable production tool,” but to me that sells short the potential of the most beautiful and soulful “instrument” of our age. I’ve been working with synthesizers and computers since the early seventies, when ARP Instruments founder Alan R. Pearlman commissioned my first symphony. Since I am a classical and folk guitar player, he also gave me one of their Avatar guitar synthesizers, which I used in many performances and concerts.

All of these early electronic instruments allowed for some sort of “alternate” form of control – with microphone inputs, and envelope followers, and pitch-to-voltage and pitch-to-MIDI converters. But it wasn’t until my PhD research work on SoundFile-Convolution at the UCSD Computer Audio Research Lab that I truly began working with 3D gestural controllers.

At UCSD, I became good friends with Dr. Max Mathews from Bell Labs, widely considered “the father of computer music.” My thesis advisor and boss, F. Richard Moore, had been Mathews’ assistant at Bell Labs during his work on the GROOVE system. I spent a lot of time with Max, and composed an international-award-winning piece for the electronic violin.

After that, Max and I became great friends and lifelong collaborators. He built me many custom versions of his wonderful Radiodrum and Radio Baton. Working with them is very much like working with the Leap Motion Controller. Alternative controllers give us new and intuitive ways to “play” the computer; to play sounds themselves and to reveal, control, and communicate their inner life and beauty.

With intuitive and expressive controllers like the Radio Baton or the Leap Motion Controller, we can do so much more than “press play” on our computers; we can actually learn to play them and use them as extensions of our inner selves.


Leap Motion + Processing by student Chatchon Srisomburanonant

How did you get involved with the A3E conference and the Berklee Symphony Orchestra?

One of my former Berklee Music Synthesis students was helping to organize a new conference that would reflect a newly emerging paradigm – one in which artists, performers, and composers collaborate closely with developers to innovate and invent the apps, technology, and audio-art of the future. They knew of my work on Muse with BT and felt that it truly represented the innovation model they were focusing on. So they asked me to give one of the keynotes with BT and to focus on Muse. They were also planning a big concert in the Berklee Performance Center that would feature innovative composers/developers/performers.

Berklee asked if I would like to compose something using Muse for The Berklee Symphony Orchestra under the direction of Francisco Noya (and featuring the principal French Horn player from the Boston Symphony, Gus Sebring) to open the concert. I had been writing short chamber pieces for Muse and cello as well as Muse and voice, and had performed them in Boston, NYC, and Spain. But this was my chance to really push the envelope. Both BT and I envisioned that Muse would be a great tool for film and TV composers and so I wanted to write something that would hopefully inspire that creative community.

What is it like to integrate Muse into a live symphony?

It’s a unique challenge to integrate any electronic instrument, track, or sonic element into a live performance. Getting the levels right between the live and the electronic, keeping the live instruments and the electronics in time with each other, and keeping the live instruments and the electronic instruments in tune with each other – these are just a few of the challenges that come to mind.

Muse and the Leap Motion Controller address these issues by letting me actually “play” Muse in time. We have controls in the system that let me change the key and volume on the fly. There are also several “presets” that I can call up during the performance to bring in new samples, new soundfiles, new synthetic sounds, and new sets of arpeggiated chords. Finally, if I need to, I can completely replace any of the built-in sounds, notes, and chords. Everything is in place in the program so that I can follow, lead, or blend in. There is enough harmonic and sonic variety in each screen, and in the three preset soundsets, to let me tell a pretty nice story musically.
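Muse’s internal gesture mappings aren’t published in this post, but a minimal sketch of the general idea (reading a hand from the Leap Motion Controller and turning its position into note and volume choices) might look like the following. It assumes the classic Leap Motion V2 Python bindings (Leap.py); the pentatonic note set and the send_note() function are placeholders for illustration, not part of Muse or the SDK.

```python
# Sketch only: map palm position to note and volume, in the spirit of Muse's
# on-the-fly key and volume controls. Assumes the Leap Motion V2 Python
# bindings (Leap.py); send_note() is a stand-in, not part of Muse or the SDK.
import time
import Leap

A_MINOR_PENTATONIC = [57, 60, 62, 64, 67, 69, 72, 74, 76, 79]  # MIDI notes

def send_note(note, velocity):
    # Placeholder for whatever synth, sampler, or MIDI port receives the data.
    print("note %d  velocity %d" % (note, velocity))

controller = Leap.Controller()
time.sleep(0.5)  # give the Leap service a moment to connect

while True:
    frame = controller.frame()
    if not frame.hands.is_empty:
        pos = frame.hands.frontmost.palm_position  # millimeters above device
        # Left/right (x, roughly -200..200 mm) chooses a scale degree;
        # height (y, roughly 100..400 mm) sets the volume.
        degree = max(0, min(9, int((pos.x + 200) / 40)))
        velocity = max(10, min(127, int((pos.y - 100) / 300.0 * 127)))
        send_note(A_MINOR_PENTATONIC[degree], velocity)
    time.sleep(0.1)
```

In a real instrument, the same palm data could just as easily switch presets or drive continuous parameters such as filter cutoff and reverb depth.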

This was an underlying goal in the design of Muse. I wanted a program that would be easy to play so that my granddaughters and my mom and dad could have fun with it, no matter what they did. But it also needed to present the musical intelligence that would appeal to my Berklee students and my colleagues around the world.

How did Muse inspire the composition?

Symphonic Muse started with Muse – it’s not a composition that I wrote in advance and then added some electronic sound effects to afterwards. Muse has the built-in ability to “record” the user’s improvisations and performances. I would practice with the program and record all of my improvisations and practice sessions. Finally, I created an underlying framework for the piece in Muse alone, and dropped that audio file into a track in my digital audio workstation (DAW), Logic Pro. Around this framework, I began to write the orchestral parts and develop some of the themes. One could almost think of the orchestral parts of Symphonic Muse as the “accompaniment” for a piece that I composed in Muse itself.

What role has Leap Motion played in your curriculum with your students at Berklee?

I have two classes that focus on the Leap Motion Controller – Circuit Bending and Physical Computing, and Composing and Performing on Wireless Mobile Networks and Devices. It’s often the case that when students are doing their final thesis projects with me, they continue to use and develop for the Leap Motion Controller. (Their project videos are sprinkled liberally throughout this post! —Ed.)

Another member of the Muse development team, Tom Zicarelli, uses the Leap Motion Controller in his DSP, Max/MSP, and Jitter classes – encouraging the students to build interactive processing and audio-reactive visual systems. (I should also mention Paul Batchelor and Christopher Konopka, Berklee Electronic Production and Design graduates, for their essential contributions.)


Leap Motion, iPad, and Csound-Based Audio FX Processor by student Nicholas Martins

How do you envision this integration evolving over time as the technology expands?

I sponsor concerts each semester that feature the students performing with gestural devices – commercially available ones like the Leap Motion, Nintendo WiiMote, or HotHands, or even built-in video cameras. I also collaborate with faculty in the Berklee Music Therapy department. My students and I have developed many “hands-free” and “smart-systems” for them to use in clinical settings.

That work has been incredibly inspiring to all of us. It’s life-changing to use the Leap Motion Controller to release the inner music of a severely handicapped child, and to let these children play together with each other and with other musicians. There is a huge future in this area as well. I think that our understanding of the healing role of music is changing, and that these technologies will be the key that unlocks many new breakthroughs.

What drives you to believe in gesture control as a compelling vehicle for musical composition and performance?

Gesture controllers let me literally bring the computer into the chamber ensemble, choir, and orchestra as the solo or ensemble instrument of the 21st century. I am not using the keyboard to replace the live trumpet section or string section from the recording session on stage. Rather, I am trying to add new colors, new roles, and ultimately take music into new areas.

From the first moment that I had my hands on a modular synthesizer back in 1969, I have always dreamed of being able to “finger-paint” soundscapes – to sculpt and shape sounds in a very intuitive and fluid way. The Leap Motion Controller, especially combined with the underlying power of Csound, makes this possible today. It will take me the rest of my life to fully develop the repertoire that shows how powerful, beautiful, tender, passionate, and dramatic SoundArt or AudioArt can be. I continue to work toward that end – on the app level, on the design level, and on the musical level.
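As one small, hedged illustration of that Csound pairing (not Muse’s actual code), palm data from the Leap Motion Controller can be streamed into Csound control channels through the ctcsound Python binding, so that hand height shapes loudness and left/right position shapes pitch:

```python
# Sketch only: "finger-paint" a drone by streaming Leap Motion palm data into
# Csound control channels. Assumes the Leap V2 Python bindings (Leap.py) and
# the ctcsound binding that ships with Csound 6; the mapping is illustrative.
import Leap
import ctcsound

ORC = """
sr = 44100
ksmps = 64
nchnls = 2
0dbfs = 1

instr 1
  kamp chnget "amp"             ; 0-1, from hand height
  kcps chnget "cps"             ; Hz, from left/right position
  kamp port kamp, 0.05          ; smooth abrupt gesture changes
  kcps port kcps, 0.05
  asig vco2 kamp * 0.3, kcps    ; band-limited sawtooth
  asig moogladder asig, 2000, 0.4
  outs asig, asig
endin
"""

cs = ctcsound.Csound()
cs.setOption("-odac")         # real-time audio output
cs.compileOrc(ORC)
cs.readScore("i 1 0 3600")    # hold the instrument on for an hour
cs.start()

controller = Leap.Controller()
try:
    while cs.performKsmps() == 0:
        frame = controller.frame()
        if not frame.hands.is_empty:
            pos = frame.hands.frontmost.palm_position
            cs.setControlChannel("amp", max(0.0, min(1.0, (pos.y - 100) / 300.0)))
            cs.setControlChannel("cps", 110.0 + max(0.0, pos.x + 200.0))
finally:
    cs.stop()
```

The port opcode smooths the incoming gesture data so that sudden hand movements glide rather than click.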


Leap Motion + Csound by student Mark Jordan-Kamholz

Your work pays reverence to the past, the present, and the future in terms of the disciplines and technologies that weave into it. In what ways do you emphasize the importance of straddling temporality to your students?

I was educated and trained as a classical composer and performer, and did a lot of performing in the NEC chamber chorale with the Boston Symphony. This culminated, in some way, with my singing Beethoven’s 9th at Carnegie Hall under the great Seiji Ozawa. But when growing up, I also played in a lot of bands, in a lot of clubs and coffeehouses, and at a lot of weddings. And they were great too!

I’ve always loved all sorts of music and all styles of music. That’s why I think I have been a pretty good fit at Berklee, where I’ve been on the faculty for more than 28 years now. It always made me happy to play, sing and share, through music, my joy for life and my songs. Now, as controller, video, audio, and sensor technology advances, I am able to share some of my other visions about sound and performance – through apps like Muse.

Instead of listening to a song or track of mine on iTunes, anyone with the Leap Motion Controller and the Muse app is able to compose, perform, and capture their own songs that are, in some way, influenced by my aesthetic. In a way, through Muse and this expressive interface, users are collaborating with BT, Dr. B, TomZ, TomS, ChrisK, and PaulB on “our” compositions – and that is quite exciting and quite new.

Are there any new Leap Motion apps on the horizon from Boulanger Labs?

We are developing a Leap Motion app at Boulanger Labs called Catch Your Breath that will let users record (or import) audio and transform it in dramatic ways by opening and closing their fingers and moving their hands left, right, up, and down. It’s going to be a lot of fun – like an audio version of a funhouse hall of mirrors. Stay tuned.
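Catch Your Breath hasn’t shipped yet, so the following is only a loose, hypothetical sketch of that kind of mapping, not the app’s actual design: hand openness (one minus the Leap SDK’s grab_strength value) blends in a reversed “mirror” of the recorded audio, while left/right position scales the playback speed, with numpy standing in for a real audio engine.

```python
# Hypothetical sketch of a Catch-Your-Breath-style transform, not the real app.
# Opening the hand (low grab_strength) blends in a reversed "mirror" of the
# sound; moving left/right changes playback speed. numpy is a stand-in engine.
import numpy as np

def transform(audio, hand_x, grab_strength):
    """audio: mono float array; hand_x: mm, about -200..200; grab_strength: 0..1."""
    openness = 1.0 - grab_strength            # open hand -> more mirror
    rate = 2.0 ** (hand_x / 200.0)            # plus or minus one octave of speed
    # Nearest-neighbor resampling keeps the sketch short (a real app would filter).
    idx = np.arange(0, len(audio), rate).astype(int)
    stretched = audio[idx.clip(0, len(audio) - 1)]
    # Crossfade between the stretched sound and its time-reversed image.
    return (1.0 - openness) * stretched + openness * stretched[::-1]
```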

Dr. Richard Boulanger (a.k.a. Dr. B.) has conducted research in computer music at Bell Labs, CCRMA, the MIT Media Lab, Interval Research, Analog Devices, and IBM. He is now a Professor of Electronic Production and Design at Berklee. Learn more (and listen!) at boulangerlabs.com or on his Vimeo channel.