Over the next several weeks, we’re spotlighting the top 20 3D Jam experiences chosen by the jury and community votes. These spotlights will focus on game design, interaction design, and the big ideas driving our community forward.
Featuring ambient music and beautiful visuals, Hammer Labs’ Otherworld takes you to a strange place where distant spirits stride in mile-long steps and magical puzzles wait to be solved. Available free for the Oculus Rift, Otherworld placed fourth in the 3D Jam. We caught up with coder Oliver Eberlei to ask about the project’s inception.
What’s the story behind Hammer Labs?
Officially, I am the owner and only person working at Hammer Labs. Unofficially, it’s whoever is working on a project with me at the time. I created our first game, Farm for your Life, with my sister. Since releasing it on Steam, the name “Hammer Labs” has built up a track record, so I decided to keep the company and continue working with other developers to create amazing experiences.
At the moment, we are working on a fast-paced, arcade-action shooter similar to Star Fox 64 called Sky Arena. The team working on Sky Arena is actually the same team that created Otherworld. We just wanted to try something completely different to recharge our batteries, and the 3D Jam seemed like the perfect fit.
Otherworld is visually stunning. What was the inspiration for the dark, mysterious aesthetic?
The visuals were all created by our amazing artist, Simon Klein. We decided early on that we wanted to use the novelty of the Oculus to our advantage and just wow the player. Everyone should be able to dive in and lose themselves in a relaxing and enchanting experience. This is why the game itself is very slow and free of pressure. You can’t lose: nothing you do will make the puzzles impossible to solve, and there is no time limit.
Looking around and taking in the world is a big part of the experience we wanted to create. We actually had many more interactive details planned which players could discover in the environment without them affecting the game at all. For example, the sky was supposed to show the real stars that we know from our own sky, and the constellations were supposed to light up when you look at them. But we were short on time, so we couldn’t implement those ideas. Maybe we’ll have time for them if we decide to make a complete game.
Sound is a monumental aspect of this experience. Can you tell me a bit about the sonic design process?
Robert Taubler and Michael Hasselmann, our talented sound designers and composers, were essential in this design process. Simon and I developed the core idea, and when we pitched it to them, Robert immediately thought of the movie Contact. Near the end of the film, there is a scene in which Jodie Foster is on a beach on a faraway planet. Whenever she reaches out to touch the environment, her touch creates a shock wave and a sound. The sounds don’t really have a melody, but they all sound beautiful together. This is what we wanted to achieve as well.
The player was supposed to create a relaxing melody just by playing the game, and even though all the sounds created are random, they sound mystical and beautiful thanks to something called the Lydian scale. I’d never heard of it before, but it sounded amazing when Robert just improvised something on the piano. That was exactly what we needed to round off our experience.
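The trick described above is that any randomly chosen pitch still sounds consonant as long as it is constrained to one scale. A minimal sketch of that idea, assuming MIDI-style note numbers and a C root (the function and constants here are illustrative, not from the Otherworld source):

```python
import random

# Lydian scale intervals in semitones above the root:
# root, major 2nd, major 3rd, augmented 4th, 5th, major 6th, major 7th.
LYDIAN_STEPS = [0, 2, 4, 6, 7, 9, 11]

def random_lydian_note(root=60, octaves=2):
    """Pick a random MIDI note from the Lydian scale built on `root`.

    Hypothetical example: any sequence of these notes shares one
    scale, so random picks still sound harmonious together.
    """
    octave = random.randrange(octaves)       # which octave above the root
    step = random.choice(LYDIAN_STEPS)       # which scale degree
    return root + 12 * octave + step

# Eight random "touches" still form a coherent, floating melody.
notes = [random_lydian_note() for _ in range(8)]
```

Feeding each in-game event through a picker like this is one simple way to get music that is random in order but never dissonant in pitch.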
What was it like developing with Unity and Leap Motion in VR?
Simon and I have been using Unity for all of our projects over the last four or five years now, so the Leap Motion integration was the only area we hadn’t tried before. We knew it was essential to give ourselves time to experiment with the new hardware and see what we can do with it. Since VR and motion controls in VR are a very new concept to basically everybody on earth, we decided to provide a sort of introduction to all this new stuff and just use “full-hand grab” interaction.
It turns out that even this is a big challenge to get right. In our first iteration, players didn’t just have to position the objects, they also had to rotate them correctly. Nobody was able to do it smoothly, and it was really frustrating. By using an indirect control scheme, like the rubber band, you have much more time to position the object exactly where you want it. Even if the “let go” motion isn’t detected immediately, the objects move very slowly, so a small imprecision doesn’t screw you over as much.
My advice for other developers: test your ideas very, very, very… VERY early. Have an amazing idea for using Leap Motion as your input? Build a prototype and have other people try it the same day. And by other people I mean normal people, not developers: somebody who hasn’t even heard of Leap Motion before. They will use the device in a different way than you do, and your gesture-detection algorithms have to account for that.
Even though Otherworld was created in a short amount of time, we had around four or five input iterations after we had other people test the game, and I wish we had time for many more.