Patrick Hackett (@playmorevgames) is a senior gameplay programmer and Drew Skillman (@dskillsaw) is a technical artist at Double Fine Productions. This post originally appeared on Gamasutra.

If you’re a regular Gamasutra reader, you’ve likely seen an article or two about the prototyping process at Double Fine and how it has become a cornerstone of the company’s identity.  Our Amnesia Fortnight game jams are anticipated, celebrated, and have had an extremely positive effect on the company culture.  But in addition to Amnesia Fortnight, prototyping regularly happens on a less grand scale, be it for in-game mechanics, at-home side projects, or the exploration of new technology.

It is in that final example that the topic of prototyping becomes noteworthy again.  How does the prototyping process change when applied to hardware, and not just game mechanics?  It should be noted that when we refer to “hardware”, we’re not talking about the evolution of graphics cards or cell phone batteries; we mean products like the Wiimote, Oculus Rift, DualShock 4, and a recent favorite, the Leap Motion Controller.  We’d like to describe the methods we’ve used for experimental development on these types of hardware, some pitfalls we’ve run into, and the characteristics that attract us to new technology.

Prototyping with Leap Motion

We’ve found two common approaches to developing games for new hardware.  The first is what we’ll call the “direct approach”: integrating the hardware to support a game you’ve already dreamed up.  An example of the direct approach is a mouse-based game, ported to use Kinect, where a mouse click is replaced by a hand push gesture.  It’s common to see developers tackle new hardware with the direct approach initially.  This was especially noticeable with the first wave of iPhone/touchscreen-based games, where developers trivially replaced mouse input with touch input.  However, as developers and users alike become more comfortable with the technology, more effort goes into a second approach: allowing the hardware to inform the type of game you should make.  This “reflective” approach requires a bit more familiarity with the technology, but it makes it easier to achieve an honest union between the software and hardware.  Examples of the reflective approach are easily found in the barrage of Kinect hacks that flooded game blogs in late 2011.

In the relatively short amount of time Double Fine has been working with Leap Motion, we’ve approached game prototyping from both directions: using the direct approach to adapt games to use the Leap Motion Controller as input, as well as creating games inspired completely by the unique data the controller provides.  A great example of the latter was a prototype Jeremy Mitchell, an artist at Double Fine, created over the weekend after we received our first development units.  It’s essentially an endless runner: the player extends two fingers in a peace sign [or bunny ears] to control a bunny running through a field.  When a hunter shows up, the player bends their fingers, hiding the bunny in the grass until the hunter passes.  The player collects carrots and bops along to charming music until nightfall.

Jeremy wasn’t waiting for the right technology to come along so he could realize his bunny runner idea; instead, he took what the Leap Motion Controller was good at providing and asked himself what kind of game could be made with it.  This is a prime example of working with the reflective approach to new technology.

While Jeremy was making his bunny runner, we were also working on a touch-based mobile prototype funded by Dracogen.  We modified the prototype to work with the Leap SDK and found it played extremely well when using input from the Leap Motion Controller.  This game would eventually become Dropchord, a title that launched with the platform.  Even though we used the direct approach initially when getting Dropchord working with the Leap, throughout development we tried to solve each new design problem with both direct and reflective approaches.  Certain mechanics designed for touch input made sense when replaced with gesture input (the direct approach), while others required a rework, forcing us to focus on what was appropriate given the available data (the reflective approach).

For instance, the original Dropchord concept had a pause menu that could be brought up with a button tap.  A direct approach to this mechanic would be to create a button-tap Leap gesture for the game.  However, that solution conflicted with the basic control of the game, so we tried to find a new way to trigger the pause menu: a reflective solution inspired by the hardware rather than the initial game design.  What we settled on was no gesture at all: if the Leap didn’t detect any input, the game would pause.  Simple, clever, and completely informed by the hardware.
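
For the curious, the logic behind that pause is tiny.  Here’s a rough sketch in Processing-style Java (the environment the original prototype was built in), not Dropchord’s actual code; it assumes the Leap Java bindings of that era (Controller, Frame, hands()), and the pauseGame()/resumeGame() hooks are placeholders.

```java
import com.leapmotion.leap.Controller;
import com.leapmotion.leap.Frame;

Controller leap = new Controller();   // connects to the Leap Motion service
boolean paused = false;

void draw() {
  Frame frame = leap.frame();                      // latest tracking frame
  boolean handsPresent = frame.hands().count() > 0;

  if (!handsPresent && !paused) {
    paused = true;                                 // nothing over the device: pause
    // pauseGame();                                // placeholder hook into the game
  } else if (handsPresent && paused) {
    paused = false;                                // hands are back: resume
    // resumeGame();                               // placeholder hook into the game
  }
}
```

A shipping build would likely smooth this over a few frames so a momentary tracking dropout doesn’t flicker the pause screen.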

The Trials of Being First to the Party

During Dropchord playtesting, we found that our conceptually simple game was giving players more trouble than it should have.  What we quickly realized was that the game had to communicate not only its mechanics to the player, but also the interaction with the device.  Our playtesters, deliberately chosen for their unfamiliarity with the Leap Motion Controller, couldn’t grasp the game concepts because they hadn’t made it past the hardware concepts.  And because Dropchord was going to be a launch title on Leap Motion’s digital store, Airspace, we recognized that most of our players would be coming to the hardware blind.  Thus, in addition to communicating game concepts to the player, we were tasked with explaining the controller’s available playspace, valid motions, and recognized gestures.

This is something that is easily overlooked and, most often, rightfully taken for granted.  It’s easy to forget how much dexterity a dual-analog first-person control scheme demands.  And when a form of technology is as mature as the modern console controller, it’s fair and correct to make assumptions about what players already know.  In fact, making those assumptions is what allows us to push the medium.  However, as new technology emerges and we create the initial wave of applications, it’s important to clear the cache and rethink how the first consumers are going to approach the product.

We solved a few of these problems with some simple tutorials dressed up to match the style of the game.  To familiarize players with the ideal locations of their hands, the initial screen requires the player to line up and hold their fingers over spinning circles.  When done correctly, there is audio and visual feedback, and the game beam slowly forms.  Once the player has been trained where to rest their hands, they advance to the main menu, which requires them to hold the beam over a menu node for a period of time to select it.  The selection progress is displayed as a decaying ring around the node.  This menu-selection mechanic is also the first one the player uses in-game, so they understand the core game concepts before their first play session even starts.  In hindsight, we realized we were actually using the reflective approach to teach players how to use the hardware.
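
That dwell-to-select mechanic is straightforward to mock up.  The sketch below is a generic illustration, again in Processing-style Java, rather than Dropchord’s actual code: the node position, hold duration, decay rate, and selectNode() hook are all placeholders, and the mouse stands in for the Leap-driven beam so it runs without the hardware.

```java
// Dwell-to-select: hold the beam over a node until a progress ring completes.
float nodeX = 200, nodeY = 200, nodeRadius = 40;   // placeholder menu node
float dwellMillis = 1200;                          // placeholder hold duration
float heldMillis = 0;
int lastMillis;

void setup() {
  size(400, 400);
  lastMillis = millis();
}

void draw() {
  background(0);
  int now = millis();
  int elapsed = now - lastMillis;
  lastMillis = now;

  // In Dropchord the beam position comes from Leap finger data;
  // the mouse is used here so the sketch runs on its own.
  float beamX = mouseX, beamY = mouseY;

  if (dist(beamX, beamY, nodeX, nodeY) < nodeRadius) {
    heldMillis += elapsed;                         // beam over the node: charge up
  } else {
    heldMillis = max(0, heldMillis - elapsed * 2); // placeholder decay rate
  }

  float progress = constrain(heldMillis / dwellMillis, 0, 1);
  noFill();
  stroke(255);
  ellipse(nodeX, nodeY, nodeRadius * 2, nodeRadius * 2);
  arc(nodeX, nodeY, nodeRadius * 2.5f, nodeRadius * 2.5f,
      -HALF_PI, -HALF_PI + TWO_PI * progress);     // the ring shows progress

  if (progress >= 1) {
    heldMillis = 0;
    // selectNode();                               // placeholder menu action
  }
}
```

In a sketch like this, most of the feel comes down to tuning the charge and decay rates: too fast and players select nodes by accident, too slow and the menu drags.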

Ease of Use

With all prototyping, the speed at which your ideas can become playable should be maximized.  At Double Fine, we regularly use publicly available software in addition to our in-house tools, because the primary goal is to get an idea on screen as fast as possible.  Each piece of software is appropriate for certain ideas, so a developer may be limiting themselves by using only one.  The other benefit of this polyamorous workflow is that familiarity with a variety of tools leaves you well positioned when a new piece of hardware is only compatible with certain software.

On the other side, a new piece of technology with little compatibility with commonly used development software, or little flexibility to work with custom proprietary tech, is likely to miss an audience of excited developers.  Had the initial Leap Motion SDK we received not included Unity support, Jeremy would not have been able to create his bunny runner in a weekend.  In addition, the original Dropchord prototype was written in Processing, and because the Leap Motion team had included a compiled .jar version of the SDK, the integration took under 20 minutes.
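
To give a sense of why that integration was so quick: once the .jar is on a Processing sketch’s classpath, polling the controller takes only a few lines.  The sketch below is a generic illustration rather than Dropchord code, and it assumes the Leap Java bindings of that era (Controller, Frame, Finger, tipPosition()); the mapping from device coordinates to the window is a rough placeholder.

```java
import com.leapmotion.leap.Controller;
import com.leapmotion.leap.Finger;
import com.leapmotion.leap.Frame;
import com.leapmotion.leap.Vector;

Controller leap;

void setup() {
  size(640, 480);
  leap = new Controller();             // connects to the Leap Motion service
}

void draw() {
  background(0);
  Frame frame = leap.frame();          // most recent tracking frame
  for (Finger finger : frame.fingers()) {
    Vector tip = finger.tipPosition(); // millimeters, origin at the device
    // Crude remap from the device's interaction space to the window.
    float x = map(tip.getX(), -200, 200, 0, width);
    float y = map(tip.getY(), 0, 400, height, 0);
    ellipse(x, y, 10, 10);             // draw a dot per tracked fingertip
  }
}
```

From a prototyping standpoint, a direct-approach port is not much more than this: swap the touch coordinates for fingertip positions and see how the game plays.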

Art Imitates Tech

Another topic related to new technology is how a game’s fiction plays off the interaction with the device.  Games allow us to imagine alternate worlds, but occasionally new hardware can feel as alien as anything in our imagination, and that can certainly be used to the game’s advantage.

We leveraged this when creating Kinect Party, Double Fine’s critically acclaimed, child-favorite, short-attention-span non-game.  Kinect Party was a series of activities, each one placing the players in a different environment to explore.  Because the camera feed was used to transpose the players into these new environments, it made sense for the game to take place on a television, with each channel representing a new activity.  In this case, we allowed the hardware to inform the design of the game’s fiction.

In the case of Dropchord and the Leap Motion Controller, we felt that the original design of the game could be highlighted by adding Leap’s technology.  It was intended to be an effects-driven arcade game with a heavy electronic soundtrack, and adding bleeding-edge motion control would give it a futuristic element that complemented the world.

While these examples show positive uses of the reflective and direct approaches as they relate to a game’s fiction, they shouldn’t be taken as discouragement from adopting new technology if it doesn’t fit your game’s world.  Not every Leap Motion game has to project the feeling of a stylized future world, for instance.  You should, however, consider what types of games will best complement the hardware they’ll be played on.

We Want to Make Games for Your Hardware

We’d be lying if we claimed that our desire to experiment with new hardware wasn’t fueled by an inherent geekiness.  We’ve grown up with these toys, experimented with them our whole lives, and are excited about where they’re going.  New hardware gives us an opportunity to rethink game design and opens doors to entirely new styles of games.  It also allows us to refine our methods for approaching these new technologies, define our practices, and communicate back to the hardware developers to move us into the future faster.

Whether it be for a weekend game jam, a two-week prototype, or a full-scale project, our passion lies in manipulating new technology to reimagine old ideas and realize new ones.  By using a range of tools and allowing the hardware to inform our game design process, we’ve been able to approach a wide variety of platforms in a short amount of time.  Hopefully you’re as excited as we are and have a chance to apply these approaches on your own.

So in summary, to all the hardware creators: we want to make games for your hardware.

Sincerely,

The Department of Future Technology at Double Fine Productions