
// Guest Posts

In yesterday’s post, I talked about the need for 3D design tools for VR that can match the power of our imaginations. After being inspired by street artists like Sergio Odeith, I made sketches and notes outlining the functionality I wanted. From there, I researched the space, hoping that someone had created and released exactly what I was looking for. Unfortunately, I didn’t find it: either the output wasn’t compatible with DK2, the system was too limited, the input relied on a device I didn’t own, or it was prohibitively expensive.

What if you could create art outside the boundaries of physics, but still within the real world? For artists like Sergio Odeith, this means playing tricks with perspective. Sergio makes stunning anamorphic (3D-perspective-based) art using spray paint, a surface with a right angle, and his imagination.

Creative 3D thinkers like Odeith should have the ability to use their freehand art skills to craft beautiful volumetric pieces. Not just illusions on the corners of walls, but three-dimensional works that people can share the same space with. This was what inspired me to create Graffiti 3D – a VR demo that I entered into the Leap Motion 3D Jam. It’s available free for Windows, Mac, and Linux on my itch.io site.

What if you could disassemble a robot at a touch? Motion control opens up exciting possibilities for manipulating 3D designs, with VR adding a whole new dimension to the mix. Recently, Battleship VR and Robot Chess developer Nathan Beattie showcased a small CAD experiment at the Avalon Airshow. Supported by the School of Engineering, Deakin University, the demo lets users take apart a small spherical robot created by engineering student Daniel Howard.

Nathan has since open sourced the project, although the laboratory environment is only available in the executable demo for licensing reasons. Check out the source code at github.com/Zaeran/CAD-Demo.

The “Augmented Hand Series” (by Golan Levin, Chris Sugrue, and Kyle McDonald) is a real-time interactive software system that presents playful, dreamlike, and uncanny transformations of its visitors’ hands. It consists of a box into which the visitor inserts their hand, and a screen which displays their ‘reimagined’ hand—for example, with an extra finger, or with fingers that move autonomously. Critically, the project’s transformations operate within the logical space of the hand itself, which is to say: the artwork performs “hand-aware” visualizations that alter the deep structure of how the hand appears.

Menu interfaces are a vital aspect of most software applications. For well-established input methods – mouse, keyboard, game controller, touch – there are a variety of options and accepted standards for menu systems. For the array of new 3D input devices, especially in virtual reality, the lack of options and standards can create significant development challenges.

Yesterday, I introduced you to Hovercast – a hand-controlled menu interface for virtual reality environments. In this post, we’ll take a closer look at the development process behind Hovercast, including some insights on usability and design for virtual reality.

Hovercast is a menu interface for virtual reality environments. Built as a tool for developers, it’s highly customizable, and can include many nested levels of selectors, toggles, triggers, and sliders. All menu actions – including navigation between levels – are controlled by simple hand movements and reliable gestures.

With input from a Leap Motion Controller, Hovercast radiates from the palm of your hand – becoming a powerful, versatile extension of your virtual self. As you rotate your palm toward your eyes, the Hovercast menu fades into view. A wide arc of menu items extends just beyond your fingertips, and follows your hand’s every movement. You can interact with menu items using the index finger of your opposite hand. To select an item, simply move your fingertip (the cursor) nearby, and hover there for a short time.
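The hover-to-select behavior described above boils down to a dwell timer: the cursor must stay near one menu item for a continuous stretch of time before that item fires. The sketch below illustrates that logic in Python; the class name, item layout, and the radius/dwell thresholds are illustrative assumptions, not Hovercast's actual values.

```python
import math

class HoverSelector:
    """Dwell-based menu selection: an item fires when the cursor stays
    within its radius for a continuous dwell period. Illustrative sketch;
    the thresholds here are assumptions, not Hovercast's real settings."""

    def __init__(self, items, radius=0.03, dwell_time=0.4):
        self.items = items            # {name: (x, y, z)} positions, in meters
        self.radius = radius          # hover radius around each item
        self.dwell_time = dwell_time  # seconds the cursor must linger
        self._hovered = None
        self._timer = 0.0

    def update(self, cursor, dt):
        """Advance the timer by dt seconds; return a selected item name or None."""
        hovered = None
        for name, pos in self.items.items():
            if math.dist(cursor, pos) <= self.radius:
                hovered = name
                break
        if hovered != self._hovered:
            # Cursor moved to a different item (or away): restart the dwell.
            self._hovered, self._timer = hovered, 0.0
        elif hovered is not None:
            self._timer += dt
            if self._timer >= self.dwell_time:
                self._timer = 0.0  # reset so each dwell fires only once
                return hovered
        return None
```

A dwell timer like this is what makes open-air selection reliable: a fingertip that merely sweeps past an item never lingers long enough to trigger it.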

There’s no denying the buzz around hackathons transforming computer science education, or technical education overall. Over the course of 24 hours (or a weekend) coders can join together in massive marathon sessions, playing with real-world code for fun and prizes. On the other hand, classes are often portrayed as the opposite extreme – slow, unexciting, overly focused on theory.

What if you could twist a piece of sound in the air? With a little knowledge of SuperCollider, you can create your own live electroacoustic music with Greap (Grain + Leap) – an experimental interactive music environment. It started as a research project to investigate the gestural manipulation of sound events, and explore the affordances of open air interfaces.
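At the core of an instrument like this is a mapping from hand pose to granular-synthesis parameters. The Python sketch below shows one such mapping; the specific assignments and ranges are my own illustrative assumptions, not Greap's actual design (Greap itself runs in SuperCollider).

```python
def clamp(v, lo=0.0, hi=1.0):
    """Constrain a value to [lo, hi]."""
    return max(lo, min(hi, v))

def hand_to_grain(norm_y, norm_x, pinch):
    """Map a normalized hand pose to grain parameters.
    Hypothetical mapping for illustration only:
      hand height   -> playback rate (0.5x .. 2.0x)
      lateral pos   -> read position in the source buffer (0 .. 1)
      pinch strength-> grain duration (200 ms open .. 20 ms pinched)
    """
    rate = 0.5 + 1.5 * clamp(norm_y)
    buf_pos = clamp(norm_x)
    grain_dur = 0.02 + 0.18 * (1.0 - clamp(pinch))
    return {"rate": rate, "pos": buf_pos, "dur": grain_dur}
```

The interesting design question is which gesture axis controls which parameter; continuous axes (height, lateral position) suit continuous parameters, while discrete gestures like a pinch work well as a "grab" on a single quality of the sound.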

I have a recurring dream that starts as a nightmare but turns into something else altogether. Imagine the stage of a monumental concert hall. The auditorium is packed, and as the audience notices you, thousands of conversations turn into a deep, imposing silence that sends a chill down your spine. Spotlights on a majestic grand […]

Hand tremors from diseases such as essential tremor, Parkinson’s disease, Wilson’s disease, dystonia and others affect tens of millions of people around the world, and the neurological and genetic basis for many tremors is still not fully understood. Patients suffer physically – often unable to write or make art – as well as socially, with tremors giving rise to heightened social anxiety.

Unfortunately, there are relatively few ways for individuals and doctors to quickly and reliably track tremor progression over time. With better tremor measurement and tracking using Leap Motion, I believe research into tremor treatment could progress faster, and doctors could have a more efficient tool for quantifying tremors.
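One quantitative measure a hand tracker makes possible is the dominant tremor frequency, estimated from a time series of palm positions. The sketch below shows the basic spectral approach with NumPy; it is a minimal illustration, assuming position samples are already captured at a fixed rate, and a real tool would also filter voluntary movement and analyze all three axes.

```python
import numpy as np

def tremor_frequency(samples, sample_rate):
    """Estimate the dominant tremor frequency (Hz) from a 1-D series of
    hand positions sampled at a fixed rate.

    Minimal sketch: subtract the mean to remove the static hand position,
    take the magnitude spectrum, and return the frequency of the largest
    non-DC peak. (Essential tremor typically falls in the 4-12 Hz range.)
    """
    x = np.asarray(samples, dtype=float)
    x = x - x.mean()                       # remove static offset (DC)
    spectrum = np.abs(np.fft.rfft(x))      # one-sided magnitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
```

Logged over weeks, a per-session frequency and amplitude estimate like this could give patients and doctors the progression data that is currently so hard to collect.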