
// Raffi Bedikian

Early last month, Leap Motion kicked off our internal hackathon with a round of pitch sessions, where everyone bounced crazy ideas off each other to see which ones would stick. One of our tracking engineers suggested using our prototype Dragonfly module to augment a physical display with virtual widgets, and our team of five ran with the concept to create AR Screen.

You’ve probably heard the rest of the story. Our team’s video got shared on /r/oculus and led to a feature on Wired. While the Wired story focuses on the experience side of things – the power of spatial thinking and offices of the future – it’s light on the technical details. Since we’ve heard from a lot of VR developers interested in the project, I thought I’d do a deep dive here on the blog.

We’ve seen how hardware, software, and graphics constraints can all work to produce latency. Now it’s time to put them all together, and ask what we can take away from this analysis.

Latency is an important factor in making any human interface feel right. The Leap Motion Controller has lower latency than comparable products on the market, but exactly how low is it? The honest answer is that it depends on several variables, some of which are often overlooked.
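The point about putting the hardware, software, and graphics constraints together can be made concrete with a back-of-the-envelope sketch. The stage names and every figure below are illustrative assumptions, not measured values for the Leap Motion Controller or any other device; the only real claim is that end-to-end latency is the sum of the pipeline stages.

```python
# Hypothetical pipeline stages and example figures (assumptions, in milliseconds).
PIPELINE_MS = {
    "sensor_exposure": 4.0,      # time the camera spends gathering light
    "usb_transfer": 2.0,         # moving the frame to the host
    "tracking_software": 4.0,    # turning the frame into tracking data
    "application": 8.0,          # app logic, often tied to the render loop
    "render_and_display": 12.0,  # GPU work plus display scan-out
}

def total_latency_ms(stages: dict) -> float:
    """End-to-end (motion-to-photon) latency is the sum of the stage latencies."""
    return sum(stages.values())

print(f"total: {total_latency_ms(PIPELINE_MS):.1f} ms")  # prints "total: 30.0 ms"
```

A useful consequence of this additive model is that the slowest stage dominates your options: shaving a millisecond off tracking buys little if render and scan-out cost twelve.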