A few weeks back, a first-person immersive animation experiment hit Reddit. True to form, the ever-investigative VR community immediately began unpacking the possibilities a tool like this could bring to the field of animation. Does virtual reality have the potential to unlock new technical and artistic workflows? What new freedoms (or constraints) does it offer creative professionals? Could this proof of concept be transformed into actual software in the near future?

It didn’t take long for Jere Nevalainen, aka /u/Sonicus, to reveal himself as the Master’s student behind the prototype – a thesis project he’s completing within Aalto University’s Department of Computer Science.

“When I was starting my Master’s Thesis, I had no idea what to do, so I asked my thesis supervisor if he had any ideas for a project,” Jere told us. “The department had done quite a bit of animation interface design with Leap Motion and 2D screens, so he said maybe I could do the same, but this time with the Oculus Rift.”

In its current iteration, Jere’s VR animation tool uses our Unity UI widgets. The Slider Widget sets keyframes and drives the interpolation that carries the character in the scene from one frame to the next.
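
To make that idea concrete, here’s a minimal Unity C# sketch of how a slider value between 0 and 1 might drive interpolation between two keyframed poses. This isn’t Jere’s actual code or the Widgets API; the class and member names (KeyframeScrubber, JointPose, OnSliderValueChanged) are purely illustrative.

```csharp
using UnityEngine;

// Hypothetical sketch: blends a character's joints between two keyframed
// poses as a 0..1 slider value changes. Not the actual widget API and not
// Jere's code; all names here are illustrative.
public class KeyframeScrubber : MonoBehaviour
{
    [System.Serializable]
    public struct JointPose
    {
        public Vector3 localPosition;
        public Quaternion localRotation;
    }

    public Transform[] joints;     // the character's bones
    public JointPose[] keyframeA;  // pose at the earlier keyframe
    public JointPose[] keyframeB;  // pose at the later keyframe

    // Called whenever the slider reports a new value
    // (0 = keyframe A, 1 = keyframe B).
    public void OnSliderValueChanged(float t)
    {
        for (int i = 0; i < joints.Length; i++)
        {
            joints[i].localPosition = Vector3.Lerp(
                keyframeA[i].localPosition, keyframeB[i].localPosition, t);
            joints[i].localRotation = Quaternion.Slerp(
                keyframeA[i].localRotation, keyframeB[i].localRotation, t);
        }
    }
}
```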

The Toggle Button Widget modifies keyframes, plays the animation, triggers passthrough, undoes actions, and locks the character to the bottom plane of the scene.
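
As a rough illustration of how a couple of those toggle actions might be wired up, here’s a hedged sketch that reuses the hypothetical KeyframeScrubber above to play the animation back and to pin the character to the ground plane. Again, the names and structure are assumptions for the sake of example, not the actual implementation.

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical sketch of two toggle actions: playing the animation back and
// locking the character to the scene's ground plane. Illustrative only; it
// reuses the KeyframeScrubber sketch above and is not the actual widget API.
public class ToggleActions : MonoBehaviour
{
    public KeyframeScrubber scrubber;   // blends between the two keyframes
    public Transform characterRoot;
    public float playbackSeconds = 2f;
    public float groundHeight = 0f;

    private bool lockToGround;
    private Coroutine playback;

    // Hooked up to a "play" toggle button.
    public void OnPlayToggled(bool isOn)
    {
        if (isOn && playback == null)
        {
            playback = StartCoroutine(Play());
        }
        else if (!isOn && playback != null)
        {
            StopCoroutine(playback);
            playback = null;
        }
    }

    // Hooked up to a "lock to ground" toggle button.
    public void OnGroundLockToggled(bool isOn)
    {
        lockToGround = isOn;
    }

    private IEnumerator Play()
    {
        // Sweep the interpolation parameter over time, one frame at a time.
        for (float t = 0f; t < 1f; t += Time.deltaTime / playbackSeconds)
        {
            scrubber.OnSliderValueChanged(t);
            yield return null;
        }
        scrubber.OnSliderValueChanged(1f);
        playback = null;
    }

    private void LateUpdate()
    {
        // Clamp the character's root to the ground plane while the lock is on.
        if (lockToGround)
        {
            Vector3 p = characterRoot.position;
            characterRoot.position = new Vector3(p.x, groundHeight, p.z);
        }
    }
}
```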

As he zeroed in on a final prototype, Jere recruited some professional animators to test the project hands-on. He’s currently analyzing the results of the tests, which will be incorporated into his final thesis.

“I had five testers from game companies. They all do animation at different levels, and had experience with various pieces of animation software. They weren’t rookies. They were given the tool to use, first for 5 minutes to get used to it, then they had 15 minutes to do whatever posing and animation they wanted.

“Then they filled out a questionnaire about the software’s usability and scale, and gave some pointers about what could be better, what they would like to see, and how they would use it in their workflow.”

Several users reported that the prototype was easy to learn, that manipulating characters with their hands felt very natural, and that they could move around the character in the scene more efficiently. Another common note was that users wanted the character’s limbs to mirror their hand gestures. Additionally, as the tests wore on, the necessity of hand tracking became increasingly apparent.

“When I was calibrating the Oculus Rift for each user, they always looked at their hands. That’s the first thing that people seem to do, so it seems pretty important to have some sort of hand tracking. It seems mandatory.”

Desktop animation using the Leap Motion Controller.

While Jere is still collecting data, his initial findings support the idea that 3D motion control in VR, if integrated wisely into the workflow, could speed up the first series of rough poses. He believes that a fully realized version of his prototype could also lower the barrier to entry for creatives just getting their feet wet in the animation world.

Once his thesis is complete, Jere plans to open source the project this summer. Stay tuned for updates on that front.