Human anatomy is complex by nature, and people have been trying to understand it since the late Bronze Age. Most anatomy books focus on specific parts of the body or try to give a general overview; since interaction with a printed page is impossible, images in books present the information from a single viewpoint. More recently, software tools have been developed to illustrate anatomy, with aims ranging from education and anatomical research to surgical training and pre-operative planning.

My latest project, the Online Anatomical Human (OAH), is the first of its kind – offering real anatomical data in an online environment, linked to existing knowledge, with the ability to add and share new information. I call this real anatomical data because it is obtained from medical imaging data, not based on an idealized average anatomy.


As an online browser and annotation system for real human anatomy, it makes 2D medical image data – and the 3D models created from it – available to anyone with an Internet connection. The system runs entirely inside a web browser, straight from the web; there is no need to install plugins or any other software.

(You can see the OAH in action at the top of this post, with a 3D model of the human pelvis alongside three orthogonal 2D views containing medical imaging data.)

Designing 3 Types of Annotations

Besides data exploration tools, an editor is available for annotations, which are added directly to the 3D mesh to enrich the model. Three types of annotations are available:

Landmark: A landmark is a single point on the surface of an anatomical structure. This type of annotation is used either to label an exact point, or to sub-label part of a structure when no specific region needs to be marked.

Region: Regions are used when a certain part of a structure needs to be annotated without being precise. This is useful when, for example, two parts of a structure need to be distinguishable but the actual border between them is not evident. Regions are painted with a brush, which works well for quickly annotating larger areas.

Line/Contour: This type of annotation draws lines on the surface – and as the image below shows, lines do occur in anatomy. Lines are also useful for annotations that are more precise than regions: where regions don't have clearly defined edges, lines do. A line can be set to form a contour, in which case its beginning and end points are connected as well. The image below shows the annotation view with an annotated human pelvic bone: the main regions of the bone are annotated and a landmark point has been placed.
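To make the three types concrete, here is one hypothetical way such annotations could be represented as data, referencing mesh vertices by index. The field names and labels are mine for illustration; OAH's internal format may differ.

```javascript
// Hypothetical record shapes for the three annotation types. A landmark is
// one vertex, a region is a set of brushed vertices, and a line is an
// ordered strip of vertices that can optionally be closed into a contour.
const landmark = { type: "landmark", label: "Anterior superior iliac spine", vertex: 1024 };
const region   = { type: "region",   label: "Iliac fossa", vertices: [88, 89, 90, 120, 121] };
const line     = { type: "line",     label: "Arcuate line", vertices: [5, 17, 42, 63], closed: false };
```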


An annotated human pelvic bone.

The difficulty with annotations lies in the fact that they are placed on a 3D surface. For single points this isn't a problem, as long as the point where the marker must be placed is visible in the current view. For regions and (closed) line segments, the solution is less trivial: a brush needs not just one point but all points within a certain radius, and a line needs a strip of points that follows the curvature of the model.

For the region annotations, I used a forward-search algorithm. First, the point on the mesh directly below the cursor is taken. Then the connected vertices within the selected radius are found using a forward search similar to Dijkstra's algorithm. Because the search only travels along edges of the mesh, only points on connected parts of the surface get colored. This prevents discontinuities in the coloring process, as you can see below.
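A minimal sketch of that forward search, assuming the mesh is given as an array of vertex positions plus an adjacency list (the real implementation works on the Three.js mesh directly; these inputs and names are mine):

```javascript
// Forward search for the region brush: starting from the vertex under the
// cursor, visit connected vertices in order of approximate geodesic distance
// (summed edge lengths), stopping at the brush radius. Disconnected surface
// parts are never reached, so they never get colored.
// `positions` is an array of [x, y, z]; `adjacency[i]` lists the vertices
// sharing an edge with vertex i.
function verticesWithinRadius(positions, adjacency, start, radius) {
  const dist = new Map([[start, 0]]);
  // Simple frontier re-sorted each step; fine for brush-sized neighborhoods.
  const frontier = [start];
  while (frontier.length > 0) {
    frontier.sort((a, b) => dist.get(a) - dist.get(b));
    const v = frontier.shift();
    for (const n of adjacency[v]) {
      const [x1, y1, z1] = positions[v];
      const [x2, y2, z2] = positions[n];
      const edge = Math.hypot(x2 - x1, y2 - y1, z2 - z1);
      const d = dist.get(v) + edge;
      if (d <= radius && (!dist.has(n) || d < dist.get(n))) {
        dist.set(n, d);
        frontier.push(n); // relax: shorter path found within the radius
      }
    }
  }
  return [...dist.keys()]; // vertices to color
}
```

Because distances accumulate along edges rather than straight-line distance, a nearby vertex on a disconnected piece of surface is simply never visited.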


When drawing a region in OAH, disconnected parts don’t get colored.

To find the initial point below the cursor, I used off-screen rendering: the mesh is rendered a second time in the background, with each triangle drawn in a unique color. The image below shows how that would look if rendered to screen. To retrieve the triangle below the cursor, we simply check the color of the corresponding pixel in the background rendering. This is much faster than raycasting techniques, and speed was a high priority in the web environment.


Each face of the model is rendered with a distinct color. By checking the color value directly below the cursor, you know the corresponding face.
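The trick boils down to encoding each face index as a unique 24-bit RGB color, then decoding the pixel read back from the off-screen buffer. A minimal sketch of that mapping (the function names are mine, not OAH's):

```javascript
// Encode a face index as a unique 24-bit RGB color for the off-screen
// picking pass. 24 bits distinguish over 16 million faces.
function faceIndexToColor(index) {
  return {
    r: (index >> 16) & 0xff,
    g: (index >> 8) & 0xff,
    b: index & 0xff,
  };
}

// Decode the pixel read back under the cursor into the face index.
function colorToFaceIndex({ r, g, b }) {
  return (r << 16) | (g << 8) | b;
}
```

Reading the pixel back (for example with WebGL's `readPixels` on the off-screen framebuffer) then yields the face under the cursor in constant time, which is why this approach outperforms raycasting here.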

To avoid turning this article into a small book, I won't go into too much detail on the algorithms behind the line annotations for now. If you'd like me to write a separate post about them, let me know in the comments. In short, I wanted to give OAH users the ability to draw perfectly straight lines directly onto a 3D mesh. You can see the results below: from the angle the line was drawn at, it looks perfectly straight, while rotating the mesh reveals that it actually follows the surface's curvature.


Left: Line drawn looks perfectly straight from drawing angle. Right: Rotated view shows that the line follows the curvature of the mesh.

Integrating with Leap Motion

When I got my hands on a Leap Motion device, I just had to see whether I could make it work within the application. This wasn't originally part of the project description at all, but sometimes trying something new at random leads to beautiful things.

In a short time, I built a basic prototype application showing a mesh of a pelvic bone, with the Leap Motion Controller used to control the camera and to paint on the mesh with a brush. I made a video, and the community's reactions were overwhelming.

At that moment, I knew this was something that I needed to explore further. The result was Leap Mesh Painter. Although it started out as a mere proof-of-concept application, it has become a small project on its own.

OAH can be controlled with a mouse, a Leap Motion Controller, or both. The combination is mainly useful for annotation: one hand in the air rotates the model and zooms in or out, while the other hand controls the mouse to do the actual annotation.
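As a rough illustration of the camera half of that interaction, the palm displacement the controller reports each frame can be mapped to orbit angles and a zoom factor. The function name and sensitivity values here are made up for this sketch, not OAH's actual code:

```javascript
// Map a per-frame palm displacement (in mm) to camera changes: horizontal
// motion orbits around the model, vertical motion tilts (clamped so the
// camera can't flip over the poles), and depth motion zooms.
// Sensitivity constants are hypothetical values for this sketch.
function palmDeltaToCamera(dx, dy, dz) {
  const ROTATE_SENS = 0.01; // radians per mm of horizontal/vertical motion
  const ZOOM_SENS = 0.005;  // zoom-factor change per mm of depth motion
  return {
    azimuth: dx * ROTATE_SENS,
    elevation: clamp(dy * ROTATE_SENS, -Math.PI / 2, Math.PI / 2),
    zoom: 1 + dz * ZOOM_SENS,
  };
}

function clamp(x, lo, hi) {
  return Math.min(hi, Math.max(lo, x));
}
```

Each frame, the resulting deltas would be applied to the camera's spherical coordinates around the model, so holding a hand steady leaves the view still while smooth motion orbits it.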

This technique can be applied elsewhere as well. In the operating room, for instance, computers are often used to display information about the patient and medical imaging data such as preoperative scans. When images on this screen need to be rotated, or a different view is needed, surgeons have to ask someone else to do it for them, because their hands must remain sterile. With the Leap Motion Controller, the surgeon would be able to control the screen directly, without touching any input devices.

Leap Motion v2 Tracking Beta

Recently, the beta of version 2 of Leap Motion's tracking software was released. This version is much more accurate and, more importantly, supports more gestures. OAH was built on version 1 of the software, and although most things work quite well, it would have been nice to make all menu items controllable through Leap Motion interaction. After seeing this great demo by Brendan Polley, I can't wait to start playing with the new tracking system.


Lightweight 3D in the Browser

WebGL, which is used to render the 3D views, is a relatively new technology for harnessing the power of the graphics card inside a web browser. It is quickly gaining popularity and support: all major browsers except Internet Explorer support WebGL, even on mobile devices. The project relies on Three.js, a JavaScript framework that simplifies working with WebGL.

One challenge is keeping the system as lightweight as possible, so that it runs not only on high-end machines but on tablet computers as well. This means the models must not grow too large in file size, while retaining enough detail to be anatomically correct. I converted the models to a JSON format that can be imported into Three.js directly; these files are considerably smaller than the Wavefront OBJ files the models were originally stored in. The size drops even further when the mesh is simplified with decimation techniques before conversion.

To make Leap Mesh Painter controllable without ever needing a mouse, I had to find ways to change the properties of the brush, such as its color and diameter. To do this, I built in gestures that open these controls, one of which is the Leap Color Picker – a color picker controlled by moving your hand in the air. A detailed description and a tutorial on building your own Leap-supported color picker are available on my blog.


You'd like to try OAH for yourself, you say? Great! I have good news and I have bad news. The bad news is that although the project was set up to be as open as possible, the data (medical images and 3D models) that have been courteously made available by the Leiden University Medical Center may not be shared. That means the tool cannot be released with the current dataset.

The good news is that I am currently looking into freely available datasets, and once I find a suitable one, the tool will become available. If you are interested in using OAH, I'd love to hear from you! In the meantime, Mesh Painter is available to try out for yourself, including the ability to drop in custom Three.js geometry.

Mesh Painter demo

Cees-Willem is a web developer and entrepreneur. When he's not building software for interactive lightforms or building websites, he's usually editing videos or playing guitar.