
Could virtual reality retrain our brains to reverse some types of vision problems? Founded by a former lifelong “lazy eye” sufferer, medical technology firm Vivid Vision has already deployed to hundreds of eye clinics, with startling results in two independent studies.

“Our clinics treat patients using Vivid Vision games that require reaching and grasping. This allows the patient to practice eye-hand coordination skills in a life-like environment where we can assess their performance in real time.”

– James Blaha, CEO of Vivid Vision

A five-year journey to become a global medical technology provider

As a child James Blaha suffered from strabismus, commonly known as being “cross-eyed” or “wall-eyed.” As a result, his brain started ignoring input from his non-dominant eye. This left him with incredibly poor vision in that eye and robbed him of depth perception.

James’ condition, known as amblyopia or “lazy eye,” is estimated to affect 1.75% of the population (around 135 million people worldwide). Early intervention is often key: with conventional therapies, the condition is very hard to treat beyond eight years of age.

As an adult, James decided to try retraining his brain using a combination of Leap Motion hand tracking and VR. Only weeks later, he was able to perceive stereo depth and read with his amblyopic (non-dominant) eye.

After a five-year odyssey that included being part of a Leap Motion startup accelerator program, his medical technology company, Vivid Vision, is now transforming how eye clinics treat strabismus, amblyopia, and other disorders of binocular vision, such as convergence insufficiency.

Standard treatments are beyond boring

Vivid Vision’s VR system shows the patient two different images – one for the strong eye and one for the weak eye. By reducing the signal strength of objects in the strong eye, and increasing them for the weak eye, it becomes easier for the eyes to work together. Over the course of treatment, the system gradually reduces the difference between the two.
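As a rough sketch of that idea (the function name, numbers, and linear schedule below are our own illustration, not Vivid Vision’s actual algorithm), the per-eye contrast balance might evolve over a course of treatment like this:

```python
# Illustrative sketch of the dichoptic principle described above: render each
# eye at a different contrast, then gradually narrow the gap between the two.

def eye_contrasts(session, total_sessions, strong_start=0.4, weak_start=1.0):
    """Return (strong_eye, weak_eye) contrast multipliers for a given session.

    Early on, the strong eye's image is heavily attenuated so the brain must
    rely on the weak eye; the difference shrinks linearly to zero by the final
    session. All values here are hypothetical.
    """
    progress = session / max(total_sessions - 1, 1)   # 0.0 -> 1.0
    strong = strong_start + (weak_start - strong_start) * progress
    return strong, weak_start

# First session: strong eye at 40% contrast, weak eye at full contrast.
print(eye_contrasts(0, 10))   # (0.4, 1.0)
# Final session: both eyes see equal-contrast images.
print(eye_contrasts(9, 10))   # (1.0, 1.0)
```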

Standard behavioural therapies tend (as James puts it) to be “excruciatingly boring.” This has real consequences for patients, especially children, who may not complete treatment courses.

VR and hand tracking also have unique capabilities conventional therapies cannot match. Importantly, VR headsets can show a different image to each eye. Leap Motion hand tracking also makes it possible for patients to interact in 3D – enabling them to practice eye-hand coordination skills in a life-like environment.

How VR medical technology can hack your brain – for good

James Blaha created a game that forces your eyes to work together in order to win. This effectively tricks the player’s brain into strengthening the weaker eye.

Using Leap Motion hand tracking, the game lets the player navigate various controls and play six different games. These include an asteroid shooter, a 3D variant of the classic Atari game Breakout, and targeting levels that force the user to rely on depth and colour perception. The system has a range of difficulty levels and is suitable for all ages.

Outcomes can be digitally tracked and evaluated over time, giving eye care professionals new tools and greater insight into patient outcomes.

Multiple studies show positive results

“I remember the exact time I first saw in 3D. It was the opening screen for the Bubbles game… I asked my vision therapist, ‘uh… I see something odd.’ And when I described it, she said ‘you’re seeing in 3D! That’s depth perception!’ I remember getting chills down my whole body and then crying because… I could see.”

– Andrea, 35

A preliminary 2017 study and a more in-depth study published earlier this year showed that Vivid Vision improved both sight in patients’ “lazy” eye and their depth perception.

In the 2019 study, the improvements in sight were particularly dramatic for children. On the LogMAR scale (a standard measure of visual acuity, in which LogMAR 0.0 is equivalent to 20/20 vision), the sight of children under 11 in the study improved from an average of LogMAR 0.23 to an average of LogMAR 0.06.

This means that the children came out of the study with close to 20/20 vision in their “lazy” eye.
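The LogMAR-to-Snellen relationship can be checked with a quick conversion. LogMAR is the base-10 logarithm of the minimum angle of resolution, so the Snellen denominator is 20 × 10^LogMAR (the function name below is ours; the formula is the standard definition):

```python
# Convert a LogMAR visual acuity score to its approximate US Snellen fraction.
# LogMAR 0.0 corresponds to 20/20; each +0.1 step multiplies the Snellen
# denominator by 10**0.1 (about 1.26).

def logmar_to_snellen(logmar):
    return f"20/{round(20 * 10 ** logmar)}"

print(logmar_to_snellen(0.0))    # "20/20"
print(logmar_to_snellen(0.23))   # "20/34" - the children's average before treatment
print(logmar_to_snellen(0.06))   # "20/23" - after treatment, close to 20/20
```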

How Vivid Vision's VR + hand tracking system restores sight

To date, Vivid Vision’s VR system has been deployed to more than 300 optometry and ophthalmology clinics worldwide. New clinics are being added all the time, and the Vivid Vision team believe their medical technology will ultimately improve the lives of millions of people worldwide.

Tom Carter, CTO of haptic company Ultrahaptics

Earlier this year, Ultrahaptics and Leap Motion joined forces to combine our expertise and create the world’s leading spatial interaction company. In today’s guest blog, Ultrahaptics CTO, co-founder and long-time Leap Motion developer Tom Carter talks about the story behind the haptic technology he invented – and why adding haptics to hand tracking is so powerful.


Building the world’s most advanced augmented reality headset isn’t exactly for beginners. But at the world’s first #BuildYourNorthStar workshop, over 20 participants built their own open-source Project North Star headsets in just 48 hours – using components now available to everyone.

Ultrahaptics and Leap Motion join forces

Today, we’re announcing a strategic deal with Ultrahaptics that combines the two companies and solidifies our collective role as the world’s leading spatial interaction company.

Ultrahaptics is a long-time Leap Motion developer and the two companies have been working together for nearly six years. Their haptic technology creates tactile sensations in mid-air using ultrasonic waves. This deal will create a vertically integrated technology company that brings us that much closer to fully immersive, rich and physically intuitive virtual interfaces.

This will not in any way affect our unwavering support for the incredible Leap Motion community.

In fact, joining forces will not only lead to new and exciting products, but entirely new categories of technologies that could only come from deep collaboration between these teams.

Our two companies together will be more than the sum of their parts. At Leap Motion we’ve always been about breaking down the barriers between people and technology to reach the true potential of both. This announcement represents the next step in this quest, and we are honored to have you continue with us on this journey.

The future of open source augmented reality just got easier to build. Since our last major release, we’ve streamlined Project North Star even further, including improvements to the calibration system and a simplified optics assembly that 3D prints in half the time. Thanks to feedback from the developer community, we’ve focused on lower part counts, minimizing support material, and reducing the barriers to entry as much as possible. Here’s what’s new with version 3.1.

Introducing the Calibration Rig

As we discussed in our post on the North Star calibration system, small variations in the headset’s optical components affect the alignment of the left- and right-eye images. We have to compensate for this in software to produce a convergent image that minimizes eye strain.

Before we designed the calibration stand, each headset’s screen positions and orientations had to be manually compensated for in software. With the North Star calibrator, we’ve automated this step using two visible-light stereo cameras. The optimization algorithm finds the best distortion parameters automatically by comparing images inside the headset with a known reference. This means that auto-calibration can reach the best possible image quality within a few minutes. Check out our GitHub project for instructions on the calibration process.
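The core idea – search for distortion parameters that make the captured images match a known reference – can be sketched in miniature. The radial-polynomial model and brute-force grid search below are our own toy simplification; the real system fits a far richer per-eye distortion model:

```python
# Toy sketch of the auto-calibration idea: find distortion parameters that
# minimize the difference between "captured" measurements and a reference.

def distort(r, k1, k2):
    """Simple radial distortion model: r' = r * (1 + k1*r^2 + k2*r^4)."""
    return r * (1 + k1 * r**2 + k2 * r**4)

def calibration_error(params, reference, captured):
    """Sum of squared differences between predicted and captured radii."""
    k1, k2 = params
    return sum((distort(r, k1, k2) - c) ** 2 for r, c in zip(reference, captured))

def auto_calibrate(reference, captured, grid):
    """Brute-force search over candidate (k1, k2) pairs for the lowest error."""
    return min(grid, key=lambda p: calibration_error(p, reference, captured))

# "Captured" points generated with k1=0.10, k2=0.01, which the search recovers.
reference = [0.1 * i for i in range(1, 10)]
captured = [distort(r, 0.10, 0.01) for r in reference]
grid = [(k1 / 100, k2 / 100) for k1 in range(0, 21) for k2 in range(0, 21)]
print(auto_calibrate(reference, captured, grid))   # (0.1, 0.01)
```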

Mechanical Updates

Building on feedback from the developer community, we’ve made the assembly easier and faster to put together. Mechanical Update 3.1 introduces a simplified optics assembly, designated #130-000, which cuts print time in half and is much sturdier.

The biggest cut in print time comes from the fact that we no longer need support material on the lateral overhangs. In addition, two parts were combined into one. This compounding effect saves an entire workday’s worth of print time!

Left: 1 part, 95 g, 7-hour print, no supports. Right: 2 parts, 87 g, 15-hour print, supports needed.

The new assembly, #130-000, is backwards compatible with Release 3. Its components replace #110-000 and #120-000 – the optics assembly and electronics module, respectively. Check out the assembly drawings in the GitHub repo for the four parts you need!

Cutout for Power Pins

Last but not least, we’ve made a small cutout for the power pins on the driver board mount. When we received our NOA Labs driver board, we quickly noticed the interference and made the change to all the assemblies.

This change makes routing power easier whether you’re using pins or soldered wires, on either the top or bottom of the board.

Want to stay in the loop on the latest North Star updates? Join the discussion on Discord!

Over the past few months we’ve hit several major milestones in the development of Project North Star. At the same time, hardware hackers have built their own versions of the AR headset, with new prototypes appearing in Tokyo and New York. But the most surprising developments come from North Carolina, where a 19-year-old AR enthusiast has built multiple North Star headsets and several new demos.


Bringing new worlds to life doesn’t end with bleeding-edge software – it’s also a battle with the laws of physics. Project North Star is a compelling glimpse into the future of AR interaction and an exciting engineering challenge, with wide-FOV displays and optics that demanded a whole new calibration and distortion system.


Today we’re excited to share the latest major design update for the Leap Motion North Star headset. North Star Release 3 consolidates several months of research and insight into a new set of 3D files and drawings. Our goal with this release is to make Project North Star more inviting, less hacked together, and more reliable. The design includes more adjustments and mechanisms for a greater variety of head and facial geometries – lighter, more balanced, stiffer, and more inclusive.


Earlier this week, we shared an experimental build of our LeapUVC API, which gives you a new level of access to the Leap Motion Controller cameras. Today we’re excited to share a second experimental build – multiple device support.


In 2014 we released the Leap Motion Image API, to unlock the possibilities of using the Leap Motion Controller’s twin infrared cameras. Today we’re releasing an experimental expansion of our Image API called LeapUVC.
