Earlier this year, Ultrahaptics and Leap Motion joined forces to combine our expertise and create the world’s leading spatial interaction company. In today’s guest blog, Ultrahaptics CTO, co-founder and long-time Leap Motion developer Tom Carter talks about the story behind the haptic technology he invented – and why adding haptics to hand tracking is so powerful.

I’ve been researching and working in haptics for close to ten years. The more I work on haptics, the more I understand how central touch is to being human. Every day, touch helps us make sense of the world around us. It’s the foundation of our sense of presence and fundamental to intuitive interaction and emotional connections.

Haptic technology is going to bring this vital sense fully into the digital world in the 2020s. And it’s going to be transformative. But for that to happen, haptics needs hand tracking. Here’s why.

Haptics and hand tracking are symbiotic

The touch receptors in your hands are highly specialized and densely clustered, allowing you to feel subtle distinctions in texture and pressure.

It’s the feedback loop between motor control and the rich data stream coming from these sensors that makes naturalistic interaction possible. Think of trying to button a shirt while wearing a pair of gloves. It’s possible, but without tactile sensations it’s significantly slower and more laborious.

When you put together hand tracking and haptic feedback, you’re closing that feedback loop in the digital world. Action and tactile sensation are symbiotic in our hands. The digital technologies that enable them are too.

So, I guess it’s not surprising that the story of Ultrahaptics doesn’t begin with haptics. It begins with hand tracking.

It all started with the Kinect

Much as I’d like to say that Ultrahaptics started when I picked up my first Leap Motion Controller, that’s not quite true.

I started work on what would become Ultrahaptics in the final-year project of my undergraduate degree. That was back in 2011, before Leap Motion had launched their first product. Microsoft Kinect for Xbox had just come out, and the idea of being able to control a computer with 3D movements was fascinating.

What I realised, though, was that you were now using your hands to operate a computer without being in contact with anything. It worked just fine when you were doing big, sweeping movements, or grabbing larger virtual objects. But actions that required fine motor control (such as pressing a virtual button) were really difficult.

I was thinking about how you could restore the sense of touch to address this problem. That’s when my professor had a crazy idea about how maybe we could use ultrasound.

Virtual touch technology

This started a ten-year journey to create a “virtual touch” haptic technology that uses ultrasound to create tactile sensations in mid-air.

There’s no need for controllers or wearables, which is one of the reasons why our technology and Leap Motion are such a good match. (If you want to dig into how the tech works a bit more, our company backgrounder takes you through it step by step.)

Simple idea, fiendishly complex execution

Sound simple? In many ways it is – a beautifully simple, fundamentally sound idea. Actually turning it into a functioning piece of hardware was a whole other story.

It took about a year to get from that initial idea to a simple lab prototype. One thing that was clear from the outset, though, was that we weren’t going to get anywhere without accurate hand tracking. You need to know exactly where a user’s hand is in 3D space in order to position the tactile sensations on it.

A seven-year friendship

That’s where Leap Motion comes in. David Holz sent me a beta Leap Motion device in 2012 – I still have it today. That also started a friendship where, over the course of the next seven years, David and I frequently talked about our visions for the future.

Leap Motion hand tracking has been a part of almost everything we’ve built. We’ve played around with other devices, but we always came back to Leap Motion, because it really is the best.

From haptic idea to haptic device

In the early days of Ultrahaptics, it took 20 minutes on the most expensive graphics card you could buy to do the maths to render a single, static tactile point. We’ve iterated many, many times since then, to the point where now we have a library of ready-to-use sensations (such as rotating circles, hand scans, sparkles or ripples) that is growing more sophisticated all the time.
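For the curious, the core of that maths is classic phased-array focusing: drive each transducer with a phase offset chosen so that every wavefront arrives at the focal point in step, creating a pressure maximum you can feel. Here’s a minimal Python sketch of the idea – the array geometry, frequency and function names are illustrative assumptions, not Ultrahaptics’ actual implementation:

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at ~20 °C
FREQUENCY = 40_000.0     # Hz, typical for ultrasonic haptic arrays
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY

def focus_phases(transducers, focal_point):
    """Phase delay (radians) for each transducer so that all
    emitted waves arrive in phase at the focal point."""
    fx, fy, fz = focal_point
    phases = []
    for tx, ty, tz in transducers:
        distance = math.sqrt((fx - tx)**2 + (fy - ty)**2 + (fz - tz)**2)
        # Advance each emitter by the phase its wave accumulates in transit.
        phases.append((2 * math.pi * distance / WAVELENGTH) % (2 * math.pi))
    return phases

# A toy 4x4 grid of transducers at 1 cm pitch, focusing 20 cm above its centre.
grid = [(x * 0.01, y * 0.01, 0.0) for x in range(4) for y in range(4)]
phases = focus_phases(grid, (0.015, 0.015, 0.2))
```

Re-solving this for a moving focal point, many thousands of times per second, is what made real-time rendering so computationally punishing in those early prototypes.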

The hardware has also evolved hugely. Last year, we launched our new STRATOS platform. I could go on about this for ages, because we’re really proud of it, but I’ll confine myself to the key headlines.

STRATOS creates tactile sensations in a different way to our previous products (if you’re interested in the detail, check out this blog). It enables a much wider range of haptic sensations. Another big breakthrough was arranging the ultrasonic transducers in a “sunflower” spiral design based on the Fibonacci spiral. By doing this, you get stronger and better-defined tactile effects.
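A “sunflower” spiral like the one described is commonly generated with Vogel’s model of phyllotaxis: element k is rotated by the golden angle (about 137.5°) and pushed out from the centre in proportion to the square root of k, which packs points evenly with no repeating rows. A rough sketch under those assumptions – the spacing and element count here are illustrative, not the real STRATOS geometry:

```python
import math

# Golden angle in radians, ≈ 2.39996 rad (≈ 137.5°).
GOLDEN_ANGLE = math.pi * (3 - math.sqrt(5))

def sunflower_layout(n, spacing=0.0105):
    """Vogel's model: point k at angle k * golden_angle and radius
    spacing * sqrt(k), giving an even, non-repeating packing."""
    points = []
    for k in range(n):
        r = spacing * math.sqrt(k)
        theta = k * GOLDEN_ANGLE
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

layout = sunflower_layout(64)
```

Because no two elements ever line up along a common axis, the grating lobes you get from a regular grid are suppressed, which is one reason this kind of layout yields a cleaner, better-defined focal point.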

Joining forces: where do we go from here?

Leap Motion’s hand tracking technology has always been a critical component of what we do. When the opportunity came up to bring the two teams together, we couldn’t pass it up.

At Ultrahaptics we fundamentally understand the value of high-performance hand tracking, because we rely on it ourselves. We’ll be supporting David and the rest of the Leap Motion team to continue to advance Leap Motion’s capabilities and performance, as well as collaborating on some pretty awesome joint projects.

Ultrahaptics and Leap Motion are both about enabling people to reach into and interact with the digital world using only their hands. We’re different pieces of the same puzzle. I’m personally thrilled to be working with the amazing Leap Motion team and community as we build the interfaces that will power the next generation of human-computer interaction.

Tom Carter