When developers first fire up the Leap Motion SDK, their instinct is often to think about how the device can be used in place of a mouse or touchscreen. But as we’ve learned through user testing, mapping legacy interactions like mouse clicks or touchscreen taps to air pokes often results in unhappy users. Unlike a mouse or touchscreen, the Leap Motion Controller doesn’t provide tactile feedback or a neutral resting position, so this interaction can feel unsatisfying.
In this post, I’d like to share a few of the insights we’ve learned from internal prototyping, and all the cool experiences we’ve seen from creative people like you. Let’s think beyond the idea of “touch at a distance” and take a look at what it means when your hand is the interface.
Three questions you should ask
Instead of asking how we can use the Leap Motion Controller to replace the mouse, we need to ask three questions:
1. All technology aside, how would I control my computer with my hands? Close your eyes and imagine a new UI beyond the 2D screen for the use-case you have in mind. This takes us a step away from the mouse mindset.
2. What can my hands do? Your hands can roll, grab, wave, and pinch. They have fingers that can easily trace paths in 3D space. What kind of experience would you like to develop with your hand as the interface?
3. How can I affect the UI with my hand? The mouse is designed to let you quickly and easily interact with a single pixel on a 2D plane. And it does a brilliant job, because it combines precision and ease-of-use with absolute unambiguity. But what if you could affect the UI with your entire hand and its many degrees of freedom? (Daniel Plemmons’ post on boundary-breaking menus is a great way to start grappling with this.)
Brainstorming interactions
Now you’re ready to brainstorm high-level interactions. Think broadly and sketch out your ideas, keeping these questions in mind. Try to think of multiple interactions for each task, so you have many options to evaluate against. Don’t be afraid to write down every idea, no matter how crazy!
Once you’ve got lots of potential hand interactions in mind, pick out the ones that can be readily accomplished with our API. Maybe it requires combining different types of data (like hand.yaw() and hand.palmNormal). Then prototype some interactions. If there’s a usability problem with a certain interaction, don’t just ditch the idea. Many problems can be fixed using additional hand data or better visual feedback! If you’re really stuck, post about it in the developer category on the forums.
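To make combining hand data concrete, here’s a minimal sketch in Python. The function name, thresholds, and input values are all hypothetical stand-ins for the readings you’d pull from hand.yaw() and hand.palmNormal through the Leap API; the point is simply that two signals together can describe an interaction neither captures alone.

```python
import math

def detect_palm_flip(palm_normal, yaw, flip_threshold=0.5, yaw_limit=math.radians(30)):
    """Classify a simple 'palm flip' interaction from two hand signals.

    palm_normal: unit vector (x, y, z) pointing out of the palm; in Leap
                 coordinates, (0, -1, 0) is a palm facing the floor.
    yaw: left/right rotation of the hand in radians.

    Returns "flipped" when the palm faces upward and the hand isn't
    twisted too far sideways; otherwise "neutral".
    """
    palm_up = palm_normal[1] > flip_threshold  # y component points toward the ceiling
    steady = abs(yaw) < yaw_limit              # hand roughly facing forward
    return "flipped" if palm_up and steady else "neutral"

# A palm facing straight down, pointing ahead: neutral.
print(detect_palm_flip((0.0, -1.0, 0.0), 0.0))
# A palm turned upward with a slight yaw: flipped.
print(detect_palm_flip((0.1, 0.9, 0.2), 0.2))
```

In a real prototype you’d feed these values in from each tracking frame and add hysteresis or smoothing, since raw per-frame readings jitter; but even this toy version shows how layering one extra signal (yaw) on top of another (palm direction) can disambiguate a gesture.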
I’d love to know – have you ever reached into a Leap Motion experience that took you beyond the mouse?
I agree that if you change the UI to fit the unique control scheme, this works. And it opens up some incredible opportunities.
The problem with this is, you can’t/won’t change the entirety of conventional computing any time soon. And in the meantime, it makes your potentially awesome tech into an interesting, exciting, and arbitrary novelty, as opposed to a utility. It has to become utility before it can be commonplace enough to change things; so in the meantime, it’s critical to find ways to get past the lack of tactile feedback or a neutral resting position.
Exploration of new interfacing ideas and new computing applications is exciting, and important; and there are absolutely aspects of Leap’s capabilities that can completely change the way we do things. Totally unexpected and novel aspects to the common, understood way of doing things are certain to come up. In other words, it clearly is better than the current control schemes in many ways.
But we have to find ways to weave those into the current control scheme, without interrupting it or making it uncomfortable for the user. And touch/resting position are two very big hurdles to this.
Once people can function in their day-to-day computing needs using the new control scheme, then interest will be high enough that you will most definitely begin to change the UI schemes and what-not to match the now predominant tech. But until then, you have to build backwards until you can build that bridge forwards.
So yes, absolutely. Explore. Theorize. Dream a boatload of crap up. But in the meantime, also make it *useful*. Because I shed another tear every time I look at the really cool, but pretty useless gizmo on the back of my desk as it gathers dust.
Thanks for the comment, Jason. I agree that utility is much better than novelty because it means people *need* the technology to achieve their tasks. Some use cases are already quite compelling for Leap, such as using a computer in a hospital room where touch can get in the way (http://www.tedcas.com/). As for everyday consumers though, it will definitely take longer before motion tracking becomes the norm.