Real-time explainers

Animation in film and cinema

Courtesy of Aaron Sims Creative

How is animation used in films?

Animation in film and cinema has a long history dating back to the 1920s. Back then, it was all done with the laborious process of drawing each frame by hand, resulting in tens of thousands of drawings for a feature-length film. These individual frames, drawn on transparent celluloid sheets, were placed against a painted background and photographed, frame by frame, with a camera. The film was then developed, and the shot was reviewed.

Early animated films introduced audiences to a whole new world of entertainment, but the films themselves were time-consuming to produce, and there was little room for error or experimentation. If a character’s arm or leg wasn’t in quite the right position to work with the background, for example, an entire set of drawings needed to be redone. Such changes were costly, adding a measure of financial risk to these productions.

These early days of filmmaking also exposed audiences to other types of animation-based entertainment such as stop-motion puppet animation. You might be familiar with this technique through the iconic holiday film Rudolph the Red-Nosed Reindeer (1964), the more recent film The Nightmare Before Christmas (1993), and the Aardman Animations series starring Wallace and Gromit.


3D animation

Then, in the late 20th century, a new style of animation began to emerge: 3D animation. With this technique, a virtual world is built in three dimensions, similar to a film set for a live shoot. You can point your virtual camera at any part of this virtual set, then position your three-dimensional characters where you like. Characters' faces and bodies can be animated by dragging on them with a mouse, instead of drawing the new position with a pen or pencil. With a well-built 3D world, and 3D characters set up properly, you can experiment with camera angles and character action much more freely than with hand-drawn 2D images.

To avoid confusion, it’s worth pointing out that the “3D” in 3D animation doesn’t refer to stereoscopic films, where the viewer wears special glasses to see the film in three dimensions. Stereoscopic films can be animated, or made with live action, or both. The “3D” in 3D animation refers specifically to the way the animation is created under the hood, as three-dimensional objects, where worlds and characters can be more easily manipulated than if they were drawn in two dimensions.

Although filmmakers had experimented with 3D animation in movies during the 1950s with simple wireframe representations, technology quickly advanced to give artists the tools to create fully realized worlds and characters. Early examples include the photoreal water creature in the film The Abyss (1989) and the first fully computer-generated (CG) feature film, Toy Story (1995).

While we’ve seen more animated feature films since then, animation has found its place in cinema in a variety of other ways:

Motion capture

When a film includes animated characters, the animation team will often do a motion capture session where live actors act out the action while wearing special suits specifically designed for data collection. The animation team records the actors’ motions digitally, then transfers the motion data to digital characters so they’ll look like naturally moving humans, animals, or creatures. When both body and facial expressions are captured, the process is called performance capture.
Courtesy of Cory Strassburger

Visual effects (VFX)

The digital creation or enhancement of visual effects like fire, smoke, fog, clouds, and dust also comes to us through 3D animation technology. These gaseous effects are usually made by clusters of thousands of virtual particles. Ocean water, lightning, and other natural phenomena can also be represented with 3D animation tools. The term VFX also refers to the art of integrating computer-generated imagery with live-action footage.


Previsualization

Previsualization, or previs, is a “first draft” video of a film created with rough shapes (boxes in place of buildings, for example) and simple characters animated with recorded dialogue from the script. Previs gives the director, director of photography, and other crew members a roadmap for shots and sequences, and helps to identify any potential issues before they happen on the shoot.
Courtesy of Engine House Films

About rendering

One thing that all these applications of animation in film and cinema have in common is that they all need to be rendered. Rendering is the process of turning your 3D representation into a series of 2D images that can be played as a film or video.

Traditionally, the time to render a single frame can range from minutes for a simple scene, to hours for a scene with many characters and visual effects. When you’re rendering a sequence 30 seconds long that plays at 24 frames per second, that’s 720 frames. If each frame takes 10 minutes to render, that’s 7,200 minutes, or five days of round-the-clock rendering.
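The arithmetic above can be sketched in a few lines of Python; the sequence length and per-frame render time are the same illustrative numbers used in the text, not figures from any real production:

```python
# Estimate offline render time for a short sequence.
# All input numbers are illustrative assumptions from the example above.

FPS = 24                 # film playback rate, frames per second
SEQUENCE_SECONDS = 30    # length of the sequence
MINUTES_PER_FRAME = 10   # assumed offline render time per frame

frames = FPS * SEQUENCE_SECONDS              # 720 frames
total_minutes = frames * MINUTES_PER_FRAME   # 7,200 minutes
total_days = total_minutes / 60 / 24         # 5.0 days

print(f"{frames} frames -> {total_minutes} minutes ({total_days:.1f} days)")
```

Swapping in your own scene's per-frame render time gives a quick feasibility check before committing to a sequence.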

One of the most significant advancements in animation has been real-time rendering, where each frame renders in a fraction of a second. Real-time rendering is available in some form in many 3D animation packages, and in game engines like Unreal Engine.
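To make “a fraction of a second” concrete, here is a hedged back-of-the-envelope calculation of the per-frame time budget a real-time renderer must meet; 24 fps is the film playback rate mentioned above, and 60 fps is a common target for interactive use:

```python
# Per-frame time budget for real-time playback.
# Frame rates are standard playback values; nothing engine-specific is assumed.

def frame_budget_ms(fps: int) -> float:
    """Milliseconds available to render one frame at the given frame rate."""
    return 1000.0 / fps

print(f"24 fps film budget: {frame_budget_ms(24):.1f} ms per frame")
print(f"60 fps interactive budget: {frame_budget_ms(60):.1f} ms per frame")
```

Compared with the 10 minutes per frame in the offline example, hitting a budget of roughly 42 milliseconds is what makes on-set iteration and live performance capture practical.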

These days, more and more film productions are using real-time rendering to save time in production and give filmmakers more opportunities for creative iteration.

Examples of animation in film and cinema

Here are a few examples of films and series that have made use of real-time animation:

Shows like Game of Thrones (2011-2019) notably used real-time rendering to previsualize famous sequences like the final episode’s pivotal Throne Room scene, and the technique continues to be used on modern hits like House of the Dragon.

Netflix’s Love, Death & Robots recently used real-time animation on In Vaulted Halls Entombed, a gripping episode that featured a CG Cthulhu, MetaHumans, and other mocap-driven digital characters, brought to life with the help of Unreal Engine.

The award-winning animated film Allahyar and the Legend of Markhor (2018) was created entirely in Unreal Engine with real-time rendering. The story of a young Pakistani boy going on a journey to save his friend is slated to be followed by a sequel, Allahyar and the 100 Flowers of God, which is being created with the same process.

In Welcome to Marwen (2018), Unreal Engine was used to capture the actors’ performances and retarget them on virtual dolls in their likenesses, all in real time.

Getting started with animation

To get started with animation, check out some of these free resources to kick-start your journey. You can also visit our Animation hub to learn more, or check out our Animation Field Guide to learn how large studios, small groups, and even individuals are producing quality animation right now with Unreal Engine.
