Hand tracking and virtual reality are both emerging technologies, and combining the two into a fluid and seamless experience can be a real challenge. This month, we’re exploring the bleeding edge of VR design with a closer look at our VR Best Practices Guidelines.
Once the most underrated element of virtual reality, sound is now widely recognized as a major element in creating VR with “presence.” In this post, we take a look at 4 ways that sound, VR, and motion controls can be a powerful combination.
1. Ambiance and Mood
This is the first thing that people imagine when they think of sound and VR. Engines like Unity and Unreal are constantly getting better at representing sound effects in 3D space, which is absolutely essential to creating a sense of presence. The more realistic that zombie right behind you sounds, the more your hair will stand on end.
Music also plays a crucial role in setting the mood for an experience. Weightless and Hollow, for example, both include tracks that influence how we experience them. The soft piano tracks of Weightless feel elegant and contemplative:
While Hollow puts us in the rural world of Washington Irving with a growing sense of dread:
Switch those tracks, and you have two seriously different games.
2. Giving Depth to 3D Space
When it comes to depth cues, stereoscopic rendering is a massive improvement over a traditional monitor. But it’s not perfect. In the real world, your eyes dynamically assess the depth of nearby objects – flexing and reshaping their lenses depending on how near or far each object is. With headsets like the Oculus Rift, the user’s eyes remain focused at a single fixed distance, effectively at infinity, no matter how close a virtual object appears.
For this reason, sound is more than just an immersive tool – how (and where) objects around you sound has an enormous effect on your understanding of where they are. This applies to everything from background noises to user interfaces.
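Two of the simplest cues engines like Unity and Unreal build on are loudness falloff with distance and the level difference between your ears. As a rough illustration (plain Python, not engine code – real spatializers add HRTFs, occlusion, and reverb on top of this), the core math might look like:

```python
import math

def spatialize(listener_pos, source_pos, ref_distance=1.0):
    """Toy 3D audio cue: inverse-distance gain plus a simple left/right pan.

    Returns (left_gain, right_gain) for a listener facing down the +z axis.
    """
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    dz = source_pos[2] - listener_pos[2]
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    # Inverse-distance rolloff, clamped so gain never exceeds 1.0.
    gain = min(1.0, ref_distance / max(dist, 1e-6))
    # Azimuth in the horizontal plane: 0 = straight ahead, +pi/2 = hard right.
    azimuth = math.atan2(dx, dz)
    pan = math.sin(azimuth)  # -1 = hard left, +1 = hard right
    left = gain * (1.0 - pan) / 2.0
    right = gain * (1.0 + pan) / 2.0
    return left, right
```

A source directly to the listener’s right ends up entirely in the right channel, while one straight ahead at twice the reference distance plays equally (and more quietly) in both – a crude but recognizable version of what your brain uses to place sounds in space.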
3. Evoking Touch through UI
In the absence of haptic feedback, visual and auditory feedback are essential. They can fill that cognitive gap, reinforcing which elements of a scene are interactive and what happens when the user “touches” them.
From button clicks using the Leap Motion Unity Widgets to the humming Plasma Ball, sound has the power to make users feel more confident interacting with particular objects in the scene. It’s also an essential feedback channel when the user’s eyes are drawn elsewhere.
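One practical pattern behind cues like this is to play sounds on state *transitions* (hover begins, press confirmed) rather than every frame, so the user hears one crisp click per interaction. A minimal sketch in Python – the asset names, thresholds, and injected `play_sound` callback are all hypothetical, not part of the Widgets API:

```python
class AudioFeedbackButton:
    """Sketch of sound feedback for a touchless UI button.

    Cues fire only when the fingertip crosses a threshold, never repeatedly
    while it stays there.
    """
    HOVER, PRESS = "hover.wav", "press.wav"  # placeholder asset names

    def __init__(self, play_sound):
        self.play_sound = play_sound  # injected: e.g. your engine's audio call
        self.hovered = False
        self.pressed = False

    def update(self, fingertip_distance, press_threshold=0.02, hover_threshold=0.10):
        # Hover cue: fingertip enters the approach zone (distances in meters).
        if fingertip_distance < hover_threshold and not self.hovered:
            self.hovered = True
            self.play_sound(self.HOVER)
        elif fingertip_distance >= hover_threshold:
            self.hovered = False
        # Press cue: fingertip crosses the press plane.
        if fingertip_distance < press_threshold and not self.pressed:
            self.pressed = True
            self.play_sound(self.PRESS)
        elif fingertip_distance >= press_threshold:
            self.pressed = False
```

Because the cue is tied to the transition, a finger hovering just above the button stays silent instead of machine-gunning clicks – exactly the kind of restraint that makes audio feedback feel trustworthy.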
4. Tracking Boundaries
First-time Leap Motion users often master interactions faster when they are guided to stay within optimal tracking range. Auditory cues can serve as powerful but invisible reminders. (For a classic example, the Black & White god games included a color boundary to indicate your zone of influence, with a sound cue that played when your cursor crossed that boundary.)
Combined with other cues, like making the hands translucent at lower tracking confidences, auditory feedback helps users quickly learn to keep their hands within tracking range. This is a lot better than hands simply being dropped, which makes users feel like the application is broken. (On a similar note, you can also use audio cues to discourage head clipping within your game.)
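A boundary cue like this needs a little hysteresis, or a hand hovering right at the edge will make the warning stutter. Here’s a rough Python sketch of both ideas – the radii, asset name, and confidence-to-opacity mapping are illustrative assumptions, not actual Leap Motion specs:

```python
class TrackingBoundaryCue:
    """Play a soft warning when the hand leaves the optimal tracking volume.

    Hysteresis: the warning fires when the hand passes `outer`, and only
    re-arms once the hand comes back inside `inner`.
    """

    def __init__(self, play_sound, inner=0.25, outer=0.30):
        self.play_sound = play_sound
        self.inner = inner      # radius (m) considered safely in range
        self.outer = outer      # radius (m) at which the warning fires
        self.outside = False

    def update(self, hand_distance_from_center):
        if hand_distance_from_center > self.outer and not self.outside:
            self.outside = True
            self.play_sound("boundary_warning.wav")  # placeholder asset
        elif hand_distance_from_center < self.inner:
            self.outside = False  # must come well back inside to re-arm


def confidence_to_alpha(confidence, floor=0.3):
    """Map tracking confidence (0..1) to hand opacity, never fully invisible."""
    return floor + (1.0 - floor) * max(0.0, min(1.0, confidence))
```

Pairing the audio warning with the opacity fade gives users two independent, non-blocking signals that they’re drifting out of range, long before tracking is actually lost.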
There are lots of resources for developers digging into sound and VR for the first time. Here are just a few:
- Unity documentation
- Unreal Engine documentation
- Quick guide to audio and VR
- Surrounded by sound: how 3D audio hacks your brain (The Verge)
- SoundCloud search (commercial use)
What’s the most innovative (or unusual) use of sound you’ve ever experienced in a VR demo?