Over the next several weeks, we’re spotlighting the top 20 3D Jam experiences chosen by the jury and community votes. These spotlights will focus on game design, interaction design, and the big ideas driving our community forward.
Can you tell us a bit about your studio’s overall vision? What’s the inspiration behind VRARlab?
We opened the laboratory about a year ago, when we saw the huge potential in VR as a totally new market. Our VR experiments are inspired by our love of science fiction and by the experiments of other developers who are also exploring virtuality. When we see the results of our work and how rapidly development is moving in this direction, we feel that we are already stepping one foot into the future, where VR is a part of the everyday life of ordinary people.
Hauhet is very abstract and futuristic, while Paper Plane uses very classic arcade motifs. How did each idea come about?
The two projects are very different from each other. With Paper Plane, we studied the basic features of Leap Motion using fairly simple mechanics. We were interested in creating a project with a low barrier to entry, where using the sensor within the gameplay would feel quite natural. The idea of controlling a plane came about quite by accident while we were discussing possible options for the game. As children, many of us ran around the house with a toy plane, trying to avoid the corners. That's the feeling we wanted to simulate.
The idea for Hauhet came up while discussing different ways to look at interaction in virtual reality. Usually we see projects where the user manipulates objects that affect the environment (i.e. the geometry of the level). We thought it would be interesting to reverse the situation – controlling the geometry of the level by acting on an object. In this case, the player moves parts of the game scene by changing the direction of a laser beam.
What was it like incorporating Leap Motion into your Unity workflow?
My advice: don’t forget about the orientation of the axes, and test on different computers. Unity provides a very natural way to implement VR in your project. There are many differences from the desktop version related to physical dimensions (the distance between the eyes, the height of the neck, and so on). A more complex task is player movement in VR. We have to figure out how to recreate it naturally so that people don’t feel uncomfortable or disoriented in space.
What types of tools or building blocks have helped you create a sense of immersion?
Methods of immersion can be quite varied, and we try not to limit ourselves to any guidelines. We are a lab – this means that we experiment a lot and constantly try to connect various gadgets and techniques. On our YouTube channel, for example, we show how to combine the Oculus and an iPhone, turning the phone into an input device. Our designers dive into the creation of special interfaces, as it’s very interesting and opens up qualitatively new opportunities within the virtual environment.
What are some of the most exciting developments you’re seeing in augmented reality?
In addition to VR, we work with AR on various devices: mobile platforms, Google Glass, Epson Moverio. Unfortunately, at the moment the possibilities for AR are severely limited by difficulties in marker recognition and in positioning augmented objects in space. We believe that VR will give momentum and direction to AR, including the possibility of hybrid realities.