How would you sculpt a virtual piece of clay? It’s a deceptively simple question – one that we’ve been thinking about for years. Despite the enormous power of modern CAD applications, it still takes hours on the computer to design something that can be sculpted in five minutes using clay. That’s why we created Sculpting (previously dubbed Freeform), our clay sculpting app, over the course of two months. Here’s how we did it.
Power vs. simplicity
Traditionally, the world of digital sculpting and design has been defined by two extremes: inaccessible power and primitive simplicity. The most powerful tools require many hours of training and often feel clunky, while more intuitive tools tend to be more primitive, failing to provide the power and flexibility needed to create complex shapes.
Using a combination of novel and emerging techniques, Sculpting represents a significant step forward in the delicate balance between usability and power. Using the Leap Motion Controller, users can access a variety of sculpting tools and materials, interacting with the clay in 3D space with their bare hands. We also aimed to let the user focus on the creative process by distilling the toolset to only the functional essentials, and using an intuitive minimalistic design for our interface.
Sculpting's mesh engine is both flexible and highly performant, allowing it to elegantly handle variable-detail resolution and arbitrary changes in surface topology. Traditionally, handling arbitrary topology changes requires voxel-based approaches or implicit surfaces. Instead, we started with Lucian Stanculescu's Freestyle project, which allows for changes in topological genus, making it possible to punch holes and join separate pieces together.
We achieved this flexibility by requiring edge lengths to fall within a certain range: edges are split or collapsed as the mesh is deformed, adding or removing detail where needed. To adapt the topology efficiently, we assume the mesh is a manifold and that vertex movement is bounded per frame; this lets us build a uniform grid around the brush, so that we only have to check a small region of the mesh.
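The edge-length constraint can be sketched as a single pass over the mesh edges: over-long edges are split at their midpoint and under-short edges are collapsed. This is an illustrative reconstruction under assumed names and thresholds, not Sculpting's actual engine code, and a real implementation would also rewire the triangles around each modified edge.

```python
import math

# Illustrative thresholds; Sculpting's actual range is not documented here.
MIN_LEN, MAX_LEN = 0.5, 2.0

def enforce_edge_lengths(verts, edges):
    """One pass: split edges longer than MAX_LEN, collapse edges shorter
    than MIN_LEN, keep the rest. verts is a mutable list of (x, y, z)."""
    out = []
    for a, b in edges:
        length = math.dist(verts[a], verts[b])
        if length > MAX_LEN:
            # Split: insert a midpoint vertex and replace the edge with two.
            mid = tuple((p + q) / 2 for p, q in zip(verts[a], verts[b]))
            verts.append(mid)
            m = len(verts) - 1
            out.extend([(a, m), (m, b)])
        elif length < MIN_LEN:
            # Collapse: merge b into a at their midpoint. A full implementation
            # would also redirect every other edge touching b to a.
            verts[a] = tuple((p + q) / 2 for p, q in zip(verts[a], verts[b]))
        else:
            out.append((a, b))
    return out
```

Running several such passes as the brush deforms the mesh keeps every edge inside the target range, which is what produces the uniform, adaptive tessellation described above.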
Ultimately, this means that situations in which mesh modifications would cause self-intersection are resolved automatically. When you change the camera zoom level or brush size, the mesh dynamically adjusts its tessellation detail. You don’t have to worry about the limitations of the geometry, and the interface between your imagination and your creation begins to disappear.
Sculpting uses an octree data model that supports fast sphere and ray lookups. This approach also made the application quick to build and update, since changes only needed to be made locally.
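To make the spatial-lookup idea concrete, here is a minimal point octree with a sphere query, in the spirit of the data structure described above. All names and parameters are assumptions for illustration, not Sculpting's API; the key property is that whole subtrees whose bounding cubes miss the query sphere are pruned.

```python
import math

class Octree:
    """A simple point octree supporting sphere queries (illustrative sketch)."""
    def __init__(self, center, half, depth=0, max_depth=4, capacity=8):
        self.center, self.half = center, half
        self.depth, self.max_depth, self.capacity = depth, max_depth, capacity
        self.points = []
        self.children = None  # None while this node is a leaf

    def insert(self, p):
        if self.children is None:
            self.points.append(p)
            if len(self.points) > self.capacity and self.depth < self.max_depth:
                self._split()
        else:
            self._child_for(p).insert(p)

    def _split(self):
        h = self.half / 2
        cx, cy, cz = self.center
        self.children = [
            Octree((cx + sx * h, cy + sy * h, cz + sz * h), h,
                   self.depth + 1, self.max_depth, self.capacity)
            for sx in (-1, 1) for sy in (-1, 1) for sz in (-1, 1)
        ]
        pts, self.points = self.points, []
        for p in pts:
            self._child_for(p).insert(p)

    def _child_for(self, p):
        i = ((p[0] >= self.center[0]) * 4 + (p[1] >= self.center[1]) * 2
             + (p[2] >= self.center[2]))
        return self.children[i]

    def query_sphere(self, q, r):
        """Return all stored points within distance r of q."""
        if not self._cube_intersects_sphere(q, r):
            return []  # prune: this node's cube misses the sphere entirely
        if self.children is None:
            return [p for p in self.points if math.dist(p, q) <= r]
        return [p for c in self.children for p in c.query_sphere(q, r)]

    def _cube_intersects_sphere(self, q, r):
        # Distance from q to the nearest point of this node's cube.
        d2 = 0.0
        for i in range(3):
            lo, hi = self.center[i] - self.half, self.center[i] + self.half
            v = min(max(q[i], lo), hi)
            d2 += (q[i] - v) ** 2
        return d2 <= r * r
```

Because brush edits only touch a small neighborhood of the mesh, only the octree nodes overlapping the brush sphere need to be rebuilt, which is what keeps updates local and fast.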
Brushes and sculpting
To design the brush interactions for Sculpting, we kept the fingers in the same position relative to the view, which helped users build a mental mapping from their real hands to the brush. To ensure robustness, the application only uses the frontmost finger on each hand, and ties brush strength to the amount of time the finger has been visible. With v2 tracking, we expect a much fuller use of the hand, as well as clearer onscreen hand representations.
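Tying strength to visibility time can be as simple as a clamped linear ramp, which suppresses spurious strokes from briefly mis-tracked fingers. The ramp duration below is an illustrative value, not the one the app uses.

```python
# Assumed ramp time; Sculpting's actual value is not documented here.
RAMP_SECONDS = 0.5

def brush_strength(seconds_visible, ramp=RAMP_SECONDS):
    """Ramp brush strength from 0 to 1 over `ramp` seconds, then clamp.
    A finger that flickers into view for a few frames contributes almost
    nothing, while a deliberately held finger reaches full strength."""
    return min(max(seconds_visible, 0.0) / ramp, 1.0)
```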
Once a brush gets close enough to the mesh, Sculpting finds all vertices within range and applies a transformation:
- Grow: moves in the direction of the surface normal
- Press: moves along the finger direction
- Smear: moves with finger velocity
While the current brush is spherical in shape, this approach is also extensible to more complex brush types.
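The three transformations above can be sketched as one function that displaces every vertex within the brush radius along a mode-dependent direction. The linear falloff and all names here are illustrative assumptions, not Sculpting's actual brush code.

```python
import math

def apply_brush(verts, normals, brush_pos, radius, mode,
                finger_dir=None, finger_vel=None, strength=0.1):
    """Displace vertices within `radius` of the brush (spherical brush,
    linear falloff toward the rim). mode is 'grow', 'press', or 'smear'."""
    out = []
    for v, n in zip(verts, normals):
        d = math.dist(v, brush_pos)
        if d > radius:
            out.append(v)  # outside the brush: untouched
            continue
        w = strength * (1.0 - d / radius)  # full strength at center, 0 at rim
        if mode == "grow":
            disp = n            # along the surface normal
        elif mode == "press":
            disp = finger_dir   # along the finger direction
        elif mode == "smear":
            disp = finger_vel   # with the finger velocity
        out.append(tuple(p + w * dp for p, dp in zip(v, disp)))
    return out
```

A more complex brush shape would only change the falloff term and the vertex-gathering step; the per-mode displacement logic stays the same.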
We designed Sculpting’s camera controls with a novel isopotential-based system, in which camera panning moves along equipotentials generated from the mesh geometry, while the camera zoom moves perpendicularly to these equipotentials. This control scheme allows for smooth, intuitive traversal over the surface at large distances, while seamlessly enabling finely controlled surface crawling at closer distances.
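One way to reconstruct the isopotential idea: treat the mesh vertices as point sources of a 1/r potential, estimate the field's gradient numerically, then move the camera along the gradient to zoom and in the plane perpendicular to it to pan. This is a hedged sketch of the concept under those assumptions, not Sculpting's actual camera code.

```python
import math

def potential(points, p):
    """Sum of 1/r contributions from each source point (clamped near zero)."""
    return sum(1.0 / max(math.dist(p, q), 1e-6) for q in points)

def gradient(points, p, eps=1e-4):
    """Central-difference estimate of the potential gradient at p."""
    g = []
    for i in range(3):
        hi = list(p); hi[i] += eps
        lo = list(p); lo[i] -= eps
        g.append((potential(points, hi) - potential(points, lo)) / (2 * eps))
    return g

def zoom(points, cam, step):
    """Move along the gradient: perpendicular to the equipotential surface."""
    g = gradient(points, cam)
    norm = math.sqrt(sum(c * c for c in g)) or 1.0
    return tuple(c + step * gc / norm for c, gc in zip(cam, g))

def pan(points, cam, move):
    """Project the requested motion onto the equipotential (tangent) plane,
    so panning crawls along the surface at a roughly constant 'distance'."""
    g = gradient(points, cam)
    n2 = sum(c * c for c in g) or 1.0
    dot = sum(m * gc for m, gc in zip(move, g))
    return tuple(c + m - dot * gc / n2 for c, m, gc in zip(cam, move, g))
```

Because the equipotentials smooth out far from the mesh and hug it up close, the same two controls give broad orbiting at a distance and fine surface crawling nearby.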
During the early stages of development, we quickly discovered that hovering and poking interactions wouldn’t work for Sculpting. Hovering requires a slight time delay, so it would take away the feeling of immediate control, while poking felt awkward within the context of the app’s interaction set.
Quick and two-dimensional – with these goals in mind, we ultimately developed a radial boundary crossing model. This gives a fun feeling of control, while the visual feedback is much clearer and more intuitive. Plus, unlike a poke or hover, it has much lower odds of a false positive, since you can see the interaction start to happen. You can find out more about how the Sculpting marching menus were designed in Daniel Plemmons’ post Rethinking Menu Design in the Natural Interface Wild West.
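In code, a radial boundary-crossing check needs only the cursor's previous and current positions: an item fires when the cursor moves from inside the menu ring to outside it, within that item's angular wedge. The geometry and names below are assumptions for illustration, not the app's menu code.

```python
import math

def crossed_item(prev, cur, center, radius, n_items):
    """Return the index of the selected menu item if the cursor just crossed
    the outer ring from inside to outside, else None. Items are equal angular
    wedges starting at angle 0 (along +x) and going counterclockwise."""
    d_prev = math.dist(prev, center)
    d_cur = math.dist(cur, center)
    if d_prev < radius <= d_cur:
        angle = math.atan2(cur[1] - center[1], cur[0] - center[0]) % (2 * math.pi)
        return int(angle / (2 * math.pi / n_items))
    return None  # no crossing this frame: no selection
```

Because selection requires a deliberate motion across a visible boundary, the cursor's approach telegraphs the interaction before it commits, which is where the lower false-positive rate comes from.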
Lighting and environments
In Sculpting, you can select from several different natural environments to sculpt your creation. The scenes in Sculpting are rendered through high-dynamic-range image-based lighting, which enhances the visual richness of the background environments and the sculpted surface.
Each environment consists of six images at 1.5K resolution, using 48 bit/pixel OpenEXR images rendered from Terragen 3. To create custom sculpting materials – such as glass, clay, and plastic – we employed blurry reflections and refractions. We also made adjustments to the diffuse component, specular component, and refraction index for each material.
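A common way to combine those per-material components is to weight the reflected and refracted environment lookups by a Fresnel term and add a diffuse contribution. The sketch below uses Schlick's approximation, a standard technique; the exact mix Sculpting uses is not documented here, and the weights are illustrative.

```python
def schlick_fresnel(cos_theta, ior):
    """Schlick's approximation of reflectance at a dielectric boundary:
    grazing angles (cos_theta near 0) reflect strongly, head-on views
    mostly refract."""
    r0 = ((1 - ior) / (1 + ior)) ** 2
    return r0 + (1 - r0) * (1 - cos_theta) ** 5

def shade(diffuse, reflected, refracted, cos_theta, ior, kd=0.3):
    """Blend a diffuse color with Fresnel-weighted reflection and refraction.
    `kd` (diffuse weight) is an assumed per-material parameter; glass would
    use a low kd and clay a high one."""
    f = schlick_fresnel(cos_theta, ior)
    return tuple(kd * d + (1 - kd) * (f * rl + (1 - f) * rf)
                 for d, rl, rf in zip(diffuse, reflected, refracted))
```

Blurring the reflected and refracted lookups before this blend is what gives frosted materials their soft, rough-surface appearance.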
As we mentioned earlier, v2 tracking makes it possible to take Sculpting’s interactions beyond the foremost finger. We’re working on onscreen hand representations that will make it fun and easy to use more of the hand for sculpting.
Update 9/12/2014: Freeform has been redubbed Sculpting for V2. This has been reflected throughout the blog with update notes at the bottom of each altered post.