
// Art & Design

Hand tracking and virtual reality are both emerging technologies, and combining the two into a fluid and seamless experience can be a real challenge. This month, we’re exploring the bleeding edge of VR design with a closer look at our VR Best Practices Guidelines.

Jody Medich is a UX designer and researcher who believes that the next giant leap in technology involves devices and interfaces that can “speak human.” In this essay, she asks how a 3D user interface could transform how we explore and understand content – by giving our brains a whole new dimension of working memory.

In yesterday’s post, I talked about the need for 3D design tools for VR that can match the power of our imaginations. After being inspired by street artists like Sergio Odeith, I made sketches and notes outlining the functionality I wanted. From there, I researched the space, hoping that someone had already built and released exactly what I was looking for. Unfortunately, I didn’t find it: every option either produced output incompatible with the DK2, was severely limited, relied on an input device I didn’t own, or was prohibitively expensive.

What if you could create art outside the boundaries of physics, but still within the real world? For artists like Sergio Odeith, this means playing tricks with perspective. Sergio makes stunning anamorphic (3D-perspective-based) art using spray paint, a surface with a right angle, and his imagination.

Creative 3D thinkers like Odeith should be able to use their freehand art skills to craft beautiful volumetric pieces. Not just illusions on the corners of walls, but three-dimensional works that people can share the same space with. That’s what inspired me to create Graffiti 3D – a VR demo that I entered into the Leap Motion 3D Jam. It’s available free for Windows, Mac, and Linux on my itch.io site.
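The core drawing loop behind a demo like this is surprisingly small. Below is a minimal Unity sketch of the idea (not Graffiti 3D’s actual code): fingertip positions, assumed here to come from the hand-tracking rig, are appended to a LineRenderer stroke whenever the finger has moved far enough.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Minimal freehand-stroke sketch: append fingertip positions to a
// LineRenderer. The fingertip transform is an assumption here; in
// practice it would be driven by the hand-tracking rig.
public class StrokePainter : MonoBehaviour
{
    public Transform fingertip;        // assumed: updated by hand tracking
    public float minSegment = 0.005f;  // meters between recorded points

    LineRenderer stroke;
    readonly List<Vector3> points = new List<Vector3>();

    void Awake()
    {
        stroke = gameObject.AddComponent<LineRenderer>();
        stroke.useWorldSpace = true;
        stroke.startWidth = stroke.endWidth = 0.01f;
    }

    void Update()
    {
        Vector3 p = fingertip.position;
        // Only record a point once the finger has moved a minimum distance,
        // which keeps the stroke from ballooning while the hand is still.
        if (points.Count == 0 ||
            Vector3.Distance(points[points.Count - 1], p) > minSegment)
        {
            points.Add(p);
            stroke.positionCount = points.Count;
            stroke.SetPosition(points.Count - 1, p);
        }
    }
}
```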

What if you could disassemble a robot at a touch? Motion control opens up exciting possibilities for manipulating 3D designs, with VR adding a whole new dimension to the mix. Recently, Battleship VR and Robot Chess developer Nathan Beattie showcased a small CAD experiment at the Avalon Airshow. Supported by Deakin University’s School of Engineering, the demo lets users take apart a small spherical robot created by engineering student Daniel Howard.

Nathan has since open-sourced the project, although the laboratory environment is only available in the executable demo for licensing reasons. Check out the source code at github.com/Zaeran/CAD-Demo.
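For a sense of what’s involved, here’s a toy Unity sketch of a detachable part (illustrative names only; see the repository above for the real implementation). A pinch reparents the part to the hand, releasing it leaves it where it was dropped, and a reset call snaps it back into the assembly.

```csharp
using UnityEngine;

// Toy detachable CAD part. OnGrab/OnRelease are assumed to be called by
// the hand-tracking interaction layer when a pinch starts or ends.
public class DetachablePart : MonoBehaviour
{
    public Transform assemblyRoot;  // where the part lives when assembled

    Vector3 homeLocalPosition;
    Quaternion homeLocalRotation;

    void Awake()
    {
        homeLocalPosition = transform.localPosition;
        homeLocalRotation = transform.localRotation;
    }

    public void OnGrab(Transform hand)
    {
        // Follow the hand while pinched, keeping the current world pose.
        transform.SetParent(hand, true);
    }

    public void OnRelease()
    {
        // Stay wherever the user let go, back under the assembly root.
        transform.SetParent(assemblyRoot, true);
    }

    public void Reassemble()
    {
        // Snap back to the original pose within the assembly.
        transform.SetParent(assemblyRoot, false);
        transform.localPosition = homeLocalPosition;
        transform.localRotation = homeLocalRotation;
    }
}
```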

The “Augmented Hand Series” (by Golan Levin, Chris Sugrue, and Kyle McDonald) is a real-time interactive software system that presents playful, dreamlike, and uncanny transformations of its visitors’ hands. It consists of a box into which the visitor inserts their hand, and a screen which displays their ‘reimagined’ hand—for example, with an extra finger, or with fingers that move autonomously. Critically, the project’s transformations operate within the logical space of the hand itself, which is to say: the artwork performs “hand-aware” visualizations that alter the deep structure of how the hand appears.
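The installation’s own software isn’t reproduced here, but the flavor of one such transformation, a finger that moves on its own, is easy to sketch in Unity. This assumes a rigged hand mesh whose finger joint transforms are exposed:

```csharp
using UnityEngine;

// Sketch of an "autonomous finger": the joints of one rigged finger are
// driven by a slow sinusoid instead of the visitor's tracked finger.
public class AutonomousFinger : MonoBehaviour
{
    public Transform[] joints;      // base, middle, and tip joints (assumed rig)
    public float curlDegrees = 40f; // maximum curl per joint
    public float speed = 2f;        // how fast the finger curls and uncurls

    void Update()
    {
        // Oscillate between 0 and curlDegrees, applied to every joint,
        // so the finger slowly curls and uncurls on its own.
        float curl = (Mathf.Sin(Time.time * speed) * 0.5f + 0.5f) * curlDegrees;
        foreach (Transform joint in joints)
            joint.localRotation = Quaternion.Euler(curl, 0f, 0f);
    }
}
```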

Hi, I’m Wilbur Yu! You might remember me from such webcasts as Let’s Play! Soon You Will Fly and Getting Started with VR. In this post, we’ll look at how we structured Widgets to be as accessible and comprehensive as possible.

Daniel here again! This time around, I’ll talk a bit about how we handled integrating the UI Widgets into the data model for Planetarium, and what this means for you.

The first iteration of Widgets we released to developers was cut almost directly from a set of internal interaction design experiments. They’re useful for quickly setting up a virtual reality interface, but they lack some of the pieces needed for a robust production application. When we sat down to build Planetarium, the need for an explicit event messaging and data-binding layer became obvious.
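To make that concrete, here’s a toy sketch of what an event-driven data-binding layer can look like in Unity. The types are illustrative, not the shipped Widgets API: a widget writes into a bound model, and the model notifies everything listening to it.

```csharp
using System;
using UnityEngine;

// Toy data-binding model: widgets write to Value, and anything bound to
// the model is notified through OnChanged. (Illustrative, not the
// shipped Widgets API.)
public class BoundValue<T>
{
    T current;
    public event Action<T> OnChanged;

    public T Value
    {
        get { return current; }
        set
        {
            if (Equals(current, value)) return; // ignore no-op writes
            current = value;
            if (OnChanged != null) OnChanged(current);
        }
    }
}

// Example: a slider widget pushes input into the model; a listener reacts.
public class SliderBinding : MonoBehaviour
{
    public BoundValue<float> model = new BoundValue<float>();

    void Start()
    {
        model.OnChanged += v => Debug.Log("Slider value is now " + v);
    }

    // Called by the slider widget as the handle moves.
    public void OnSliderMoved(float normalizedValue)
    {
        model.Value = normalizedValue;
    }
}
```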

One of the major features of Planetarium is the ability to travel around the globe using motion controls. While this approach is still rough and experimental, we learned a lot from its development that we’d like to share. Later on in the post, we’ll even take a look under the hood at the code involved with the movement and spinning physics that tie everything together.
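As a preview of the idea (a simplified sketch, not Planetarium’s actual code), grab-and-spin momentum comes down to tracking an angular velocity while the hand drags the globe, then letting it decay after release:

```csharp
using UnityEngine;

// Simplified grab-and-spin globe. BeginDrag/Drag/EndDrag are assumed to
// be called by the hand-tracking layer; handDelta is this frame's hand
// motion in meters.
public class GlobeSpin : MonoBehaviour
{
    public float sensitivity = 90f; // degrees of spin per meter of hand travel
    public float damping = 2f;      // how quickly a released spin dies out

    float angularVelocity;          // degrees per second around world y
    bool held;

    public void BeginDrag() { held = true; angularVelocity = 0f; }
    public void EndDrag()   { held = false; }

    public void Drag(Vector3 handDelta)
    {
        float degrees = -handDelta.x * sensitivity;
        transform.Rotate(0f, degrees, 0f, Space.World);
        angularVelocity = degrees / Time.deltaTime; // remembered for release
    }

    void Update()
    {
        if (held) return;
        // Coast on the last measured velocity, decaying it exponentially.
        transform.Rotate(0f, angularVelocity * Time.deltaTime, 0f, Space.World);
        angularVelocity *= Mathf.Exp(-damping * Time.deltaTime);
    }
}
```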

At Leap Motion, we’ve been working on new resources to make developing VR/AR applications easier, including Widgets – fundamental UI building blocks for Unity. In part 3, Barrett talks about the strange physics bugs we encountered with Time Dial.

One of our new VR Widgets, the Time Dial, surprised (and amused!) us at several points during our intense production push. The Time Dial Widget is our hand-enabled VR interpretation of a typical touch interface’s Date Picker. We built it with a combination of Wilbur Yu’s Widget interaction base, Daniel’s data-binding framework (more on those two later), and a graphic front-end that I coded and built – again using Unity’s new 3D GUI.
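Under the hood, a dial-style Date Picker mostly comes down to quantizing a continuous rotation into discrete entries. Here is a hedged sketch of that mapping (hypothetical names, not the shipped Time Dial code):

```csharp
using System;
using UnityEngine;

// Quantizes a dial's rotation angle into one of entryCount discrete
// entries (e.g. days of the month). Illustrative only.
public class DialPicker : MonoBehaviour
{
    public int entryCount = 31;
    public event Action<int> OnEntryChanged; // data-binding hook

    int currentEntry;

    // Called by the interaction layer with the dial's current angle.
    public void SetAngle(float degrees)
    {
        float step = 360f / entryCount;
        int entry = Mathf.RoundToInt(Mathf.Repeat(degrees, 360f) / step)
                    % entryCount;
        if (entry == currentEntry) return;
        currentEntry = entry;
        if (OnEntryChanged != null) OnEntryChanged(currentEntry);
    }
}
```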

Over the next several weeks, we’re spotlighting the top 20 3D Jam experiences chosen by the jury and community votes. These spotlights will focus on game design, interaction design, and the big ideas driving our community forward. Inspired by games like Myst and the works of H.P. Lovecraft, Wikkit Gate’s 19th-place entry Deify takes you […]