As our physical reality becomes increasingly augmented, creative coders gain access to a whole new trove of intriguing possibilities. Several weeks back, we stumbled upon one such experiment: TACTUM, which combines projection mapping, motion controls, depth sensing, and 3D printing to create customized wearables. For all that technology, the design process is surprisingly simple – all you need is the light on your skin.

TACTUM is the creation of research and design studio MADLAB.CC. Earlier this week, we caught up with head designer and researcher Madeline Gannon to find out more about the mixed-media work, as well as her artistic process.

[Image: tactum-1]

What’s the hardware setup behind TACTUM?

TACTUM is an augmented modeling tool that lets you design 3D printed wearables directly on your body. It uses depth sensing and projection mapping to detect and display touch gestures on the skin. A person can simply touch, poke, rub, or pinch the geometry projected onto their arm to customize ready-to-print, ready-to-wear forms.

[Image: tactum-2]

What inspired you to incorporate Leap Motion technology into TACTUM?

We first implemented TACTUM using a Microsoft Kinect. However, our second version switched to a Leap Motion Controller. Pragmatically, this let us test whether our system generalized to different kinds of depth sensors. The speed and accuracy of the Leap Motion Controller also made it much easier and more reliable to projection map our digital geometry onto a moving body.

Using the controller's skeletal tracking capabilities, we were able to dynamically project digital content onto a moving arm, and we also used it as a touch sensor to detect and track tactile interactions with the body. The goal of TACTUM was to create a gestural modeling tool that did not rely on mid-air interactions. Instead, we used the controller to detect how a person is touching, pinching, or poking their arm, and used those gestures to modify interactive geometry in our modeling environment.
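
For the technically curious, here is a minimal sketch of how the Leap Motion Java API's skeletal tracking could flag this kind of on-body touch, assuming the v2 API with arm tracking: it projects one hand's index fingertip onto the other arm's elbow-to-wrist segment and treats a fingertip within the arm's tracked radius, plus a small threshold, as a touch. The class name, threshold value, and hand-role assignment are illustrative assumptions, not the actual TACTUM implementation.

```java
// Hypothetical sketch (not MADLAB.CC's code): use the Leap Motion Java API's
// arm tracking to detect when one hand's index fingertip touches the other forearm.
import com.leapmotion.leap.*;

public class ArmTouchSketch {
  static final float TOUCH_THRESHOLD_MM = 15.0f;  // assumed slack beyond the arm surface

  public static void main(String[] args) throws InterruptedException {
    Controller controller = new Controller();
    while (true) {
      Frame frame = controller.frame();
      if (frame.hands().count() >= 2) {
        Hand touching = frame.hands().get(0);   // hand doing the touching (assumed role)
        Hand target   = frame.hands().get(1);   // arm being touched (assumed role)
        Finger index  = touching.fingers().fingerType(Finger.Type.TYPE_INDEX).get(0);
        if (index.isValid() && isTouchingArm(index.tipPosition(), target.arm())) {
          System.out.println("Touch detected on forearm at " + index.tipPosition());
        }
      }
      Thread.sleep(16);  // simple polling loop; a Listener callback would also work
    }
  }

  // True when the fingertip lies within the arm's radius (plus slack) of the
  // elbow-to-wrist segment reported by the skeletal tracker.
  static boolean isTouchingArm(Vector tip, Arm arm) {
    Vector elbow = arm.elbowPosition();
    Vector wrist = arm.wristPosition();
    Vector axis  = wrist.minus(elbow);
    float t = tip.minus(elbow).dot(axis) / axis.dot(axis);  // parameter along the arm
    t = Math.max(0f, Math.min(1f, t));                      // clamp to the segment
    Vector closest = elbow.plus(axis.times(t));
    float armRadius = arm.width() / 2.0f;
    return tip.distanceTo(closest) < armRadius + TOUCH_THRESHOLD_MM;
  }
}
```

A more robust version would presumably smooth fingertip positions over several frames and distinguish pokes, rubs, and pinches before committing to an edit, but the basic fingertip-to-arm distance test above captures the idea of using the controller as an on-skin touch sensor.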

[Image: tactum-jewelry-watch]

What tools or resources did you use in building TACTUM?

We used the Leap Motion Java API for skeletal tracking and touch gesture detection, Processing with the Toxiclibs library for our modeling environment and interactive geometry, and OpenCV to calibrate our projection mapping.
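
On the Processing/Toxiclibs side, the sketch below is a rough Processing (Java mode) illustration of that kind of interactive geometry in miniature: a toxiclibs TriangleMesh band around the wrist that bulges wherever a "touch" lands, with the mouse standing in for Leap Motion touch data. The band dimensions, deformation falloff, and mouse mapping are illustrative assumptions, not MADLAB.CC's actual modeling environment.

```java
// Hypothetical Processing sketch: a deformable wristband built with toxiclibs.
// Clicking/dragging the mouse stands in for a detected on-skin touch.
import toxi.geom.*;
import toxi.geom.mesh.*;
import toxi.processing.*;

ToxiclibsSupport gfx;
int SEGMENTS = 64;          // resolution around the wrist
int RINGS = 20;             // resolution along the band
float BASE_RADIUS = 40;     // nominal wrist radius in model units (assumption)
float BAND_LENGTH = 80;     // band length along the forearm (assumption)
float[][] offsets = new float[RINGS][SEGMENTS];  // per-vertex radial deformation

void setup() {
  size(800, 600, P3D);
  gfx = new ToxiclibsSupport(this);
}

void draw() {
  background(20);
  lights();
  translate(width / 2, height / 2);
  rotateY(frameCount * 0.01f);

  // Map the mouse to a point on the band, standing in for a detected touch.
  if (mousePressed) {
    int ring = (int) map(mouseY, 0, height, 0, RINGS - 1);
    int seg  = (int) map(mouseX, 0, width, 0, SEGMENTS - 1);
    applyPoke(ring, seg, 8);  // push the surface outward near the touch
  }

  gfx.mesh(buildBand());
}

// Raise the surface near (ring, seg) with a simple Gaussian-style falloff --
// an assumed stand-in for however a real poke or rub would edit the geometry.
void applyPoke(int ring, int seg, float amount) {
  for (int r = 0; r < RINGS; r++) {
    for (int s = 0; s < SEGMENTS; s++) {
      float d = dist(r, s, ring, seg);
      offsets[r][s] += amount * exp(-d * d / 8.0f);
    }
  }
}

// Rebuild the band as a toxiclibs TriangleMesh from the current offsets.
TriangleMesh buildBand() {
  TriangleMesh mesh = new TriangleMesh();
  for (int r = 0; r < RINGS - 1; r++) {
    for (int s = 0; s < SEGMENTS; s++) {
      int sNext = (s + 1) % SEGMENTS;
      Vec3D a = bandVertex(r, s);
      Vec3D b = bandVertex(r, sNext);
      Vec3D c = bandVertex(r + 1, s);
      Vec3D d = bandVertex(r + 1, sNext);
      mesh.addFace(a, b, c);
      mesh.addFace(b, d, c);
    }
  }
  return mesh;
}

// Position of one band vertex: base cylinder plus its accumulated deformation.
Vec3D bandVertex(int r, int s) {
  float angle = TWO_PI * s / SEGMENTS;
  float radius = BASE_RADIUS + offsets[r][s];
  float y = map(r, 0, RINGS - 1, -BAND_LENGTH / 2, BAND_LENGTH / 2);
  return new Vec3D(radius * cos(angle), y, radius * sin(angle));
}
```

From a deformed mesh like this, toxiclibs can write an STL for printing, which is roughly where a ready-to-print, ready-to-wear form would come from in a pipeline like the one described above.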

To keep up with Madeline’s latest projects and research, follow her on Twitter, check out her online portfolio, or read her co-authored white paper on skin-centered design.

Want to see another light-bending art project? See how Felix Faire was able to transform any surface into a musical instrument.