May 15, 2019
Trends in animation and VFX at FMX 2019
At FMX 2019, which was held Apr 30 – May 3 in Stuttgart, Germany, there were five trends in animation and effects that emerged across the presentations and product demos. Here, we’ll take a tour of these trends, and see where they point for the future of real-time technology.
Trend #1: Realism
The impact of new hardware such as the NVIDIA RTX cards, together with advances in real-time game engines like UE4, was evident across many aspects of animation and visual effects.
With the advent of more physically plausible lighting, especially real-time ray tracing, there is a move to higher realism in animation and effects.
One example is Matt Workman’s Cine Tracer, a real-time cinematography simulator, which Matt presented on the second day of FMX. This hybrid game/app combines Matt’s programming skills with his 10 years of technical on-set cinematography experience. Unlike many pitchvis or previs projects from years past, Cine Tracer enables not only accurate blocking and lensing of projects, but also realistic interactive lighting and depth of field.
The realism of this UE4 project gives filmmakers opportunities for creative exploration of movement, focus, smoke/haze, and lighting with industry-standard virtual lights, cranes, and cameras. With it, “players” can explore real-world-based staging and direction of digital talent/actors in stunning next-gen environments created in Unreal Engine 4.
Trend #2: Virtual production and digital humans
Real-time collaboration on set and a more refined filmmaking pipeline can lead to a more nonlinear story creation process, which in turn fosters creativity in filmmaking. Several talks highlighted the real-time collaborative benefits of incorporating UE4 into animation and effects pipelines.
David Morin, head of Los Angeles lab at Epic Games, chaired the virtual production track at FMX, which was dominated by stories of companies improving and innovating production pipelines with real-time technology. For example, Kevin Baillie, Creative Director and Sr. VFX Supervisor at Method Studios, outlined how his team used creative UE4 virtual production techniques to bring dolls to life in the Robert Zemeckis film Welcome to Marwen.
Trend #3: Simulation
Real-time simulation, which increases engagement and further adds to realism, was discussed at multiple levels and in several talks.
For example, the game Robo Recall from Epic Games was on display, giving attendees the opportunity to interact with the new Chaos destruction tools highlighted in this year’s UE4 GDC demonstration. The new Robo Recall demo illustrated how players can now interact with and directly affect (or destroy) complex scenes during simulation-heavy portions of the action, rather than being limited to noninteractive cutscenes. UE4 has always been a way to render animation in real time, but this demo illustrated the staggering jump in simulation performance that turns players into participants rather than spectators.
Trend #4: Deep learning
One of the techniques animation technical directors and programmers use to enhance realism and produce real-time simulation is the application of machine-learning techniques, such as deep learning. If there was one buzzword heard across the widest variety of presentations, it was deep learning!
Deep learning, in the context of real-time technology for media and entertainment, involves writing programs that “learn” from vast sets of visual data, and then apply these learnings in real time to a rendering. Some of the most notable examples in this field are in denoising, ray tracing, and—most recently—facial and character animation.
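To make the "learn from data, then apply in real time" idea concrete, here is a toy sketch of learned denoising. This is not Digital Domain's or NVIDIA's pipeline (production denoisers are deep convolutional networks trained on rendered frames); it uses a single linear layer and synthetic 1D signals standing in for noisy renders, purely to show the noisy-to-clean training loop:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for rendered data: smooth 1D signals ("rows of pixels").
def make_batch(n, width=32):
    x = np.linspace(0.0, 2.0 * np.pi, width)
    clean = np.sin(np.outer(rng.uniform(0.5, 2.0, n), x))  # noise-free "renders"
    noisy = clean + rng.normal(0.0, 0.3, clean.shape)      # low-sample-count look
    return noisy, clean

# "Training": learn a linear map from noisy to clean by gradient descent.
# A real denoiser would be a deep network; one linear layer is enough
# to illustrate learning a mapping from corrupted to ground-truth data.
noisy_tr, clean_tr = make_batch(n=1024)
W = np.zeros((32, 32))
for _ in range(2000):
    grad = noisy_tr.T @ (noisy_tr @ W - clean_tr) / len(noisy_tr)
    W -= 0.02 * grad

# "Inference": apply the learned map to unseen noisy data (this step is
# cheap enough to run per-frame, which is what makes it real-time friendly).
noisy_te, clean_te = make_batch(n=256)
mse_raw = np.mean((noisy_te - clean_te) ** 2)           # error of the noisy input
mse_denoised = np.mean((noisy_te @ W - clean_te) ** 2)  # error after denoising
print(mse_raw, mse_denoised)
```

Training happens offline on a large dataset; only the cheap learned transform runs at render time, which is why these techniques pair so well with real-time engines.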
A case in point is the outstanding work from Digital Domain in facial tracking and animation. Doug Roble presented Digital Doug, a virtual copy of Roble’s face that he puppeteered in real time, with incredible fidelity and realism, using UE4. This project builds on the work done at Epic Games in recent years and extends it, thanks to new techniques of producing highly detailed training data for facial speech and motion.
Darren Hendler also presented related digital human work for Marvel’s Thanos. Both projects use deep learning as part of the Digital Domain face pipeline.
Not all the apparent AI advances use deep learning. At the conference, Matthias Wittman at Method Studios also demonstrated his “emotional intelligence” research for advanced character animation in UE4. This application doesn’t use machine learning (yet), but it enables highly interactive and realistic acting in secondary characters. The system was designed to run not only interactively, but also in VR. Matthias illustrated the technology (and entertained the audience) by poking characters in the face live in VR.
Trend #5: USD development and adoption
At a technical level, this year’s FMX saw important advances in pipeline integration of UE4 and in the interoperability of assets with other industry-standard tools. Programs such as Autodesk Maya, Foundry’s Nuke, and Unreal Engine are all moving to support USD, the open-source Universal Scene Description format.
Most promisingly, NVIDIA has been developing their new Omniverse tool. It’s still in its early days, but Omniverse may facilitate full, seamless interchange and universal asset updates for a variety of use cases, from an individual artist running multiple applications to an intercontinental company sharing assets across all its offices in real time.
USD was pioneered at Pixar Animation Studios for wide sharing of animation, models, and assets. Unlike the Alembic interchange format, USD includes layering, referencing, and shading variants for individual assets, among other features. The broad adoption of USD would mean tremendous efficiencies in a host of animation and effects pipelines, including those built on Unreal Engine.
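Those three features (layering, referencing, and variants) are visible directly in USD's text format. The sketch below is a hypothetical shot file (the layer and asset names are invented for illustration): weaker sublayers compose under the shot layer, a prim references an asset published in another file, and a shading variant is selected without copying the asset:

```usda
#usda 1.0
(
    "Hypothetical shot layer: opinions here override the sublayers below."
    subLayers = [
        @lighting.usda@,
        @layout.usda@
    ]
)

def Xform "Robot" (
    prepend references = @assets/robot.usda@</Robot>
    variants = {
        string shadingVariant = "weathered"
    }
)
{
}
```

Because the reference points at the published asset rather than copying it, an update to `assets/robot.usda` flows into every shot that references it, which is exactly the kind of cross-tool, cross-team asset sharing the FMX talks highlighted.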
Epic Games remains committed to the open-source movement and to the Academy Software Foundation, and works closely with companies such as NVIDIA not only to push the quality of onscreen imagery, but also to make UE4 productions more efficient. This helps all productions, from animation to virtual production.
To take advantage of all that real-time technology has to offer, download Unreal Engine and get started with virtual production, real-time ray tracing, and more. You can also visit our virtual production hub for interviews, videos, and more insights into the expanding use of real-time technology in film and television production.