Real-time unlocks new animation workflows and opportunities for VFX studio MOFAC
The company's filmmakers -- whose experience includes a variety of leadership positions across a diverse spectrum of film and media -- combined with its VFX technology, provide the foundation for creating IP content in film, TV, animation, and theme-park planning and design. MOFAC is also expanding into new media sectors, including location-based entertainment.
New animation techniques and Unreal Engine
MOFAC's ground-breaking work last year on real-time animation helped land the company an Epic MegaGrant. The company is now pairing its expertise in the traditional film-production pipeline with work on real-time animation in two projects: The Life of Our Lord and Golden Panda. The Life of Our Lord is an animated feature film based on the Charles Dickens novel of the same name. The film was directed by MOFAC CEO Sung-ho Jang, who is also known as one of the top VFX supervisors in Korea, and filmed by cinematographer Woo-hyung Kim, who received the BAFTA TV Craft Award for Photography & Lighting: Fiction for director Chan-wook Park's The Little Drummer Girl. Unreal Engine's performance-capture and virtual production technology creates more realistic character movements and shortens production time, thanks to the ability to view final pixels in real time.
Golden Panda is a 20-episode show about the journey of a baby panda that grows massive when he feeds on golden dumplings. The first episode immediately caught the attention of China's leading video-streaming services and entertainment companies such as Tencent and Youku. The show's success is thanks in part to Unreal Engine's real-time ray tracing, which allows the team to achieve the best rendering possible.
MOFAC decided on real-time animation for the two IPs due to the inefficiencies of the traditional animation workflow. In the old pipeline, the producer plans out the storyline at the storyboard stage, and the relevant departments then work linearly for several months to complete the final content. If anyone misinterprets the producer's direction, if the quality is not up to par, or if a better vision arises, iteration becomes a very inefficient process.
The team at MOFAC looked to the general film-production workflow for a solution that would let key personnel change direction immediately. They recognized that working in real time was necessary to get this level of flexibility within the animation-production process. As a result, the studio was able to try out new techniques and achieve efficient results. For example, producers were able to interact closely with motion-capture actors and communicate the direction for their acting, making it possible to make decisions on set based on screens showing actor-driven characters composited into the background in real time. The new pipeline had to be completely rebuilt, but this endeavor drastically reduced the overall time and cost of the production and enabled better decision-making.
Pipeline for real-time animation
The Life of Our Lord was created with virtual production coupled with traditional film-production techniques in order to design, develop, and improve the animation process. This method differs significantly from traditional animation production in how the director, staff, and actors participate and interact with one another. Here, immersion takes the highest priority. In addition, the new animation pipeline includes casting actors, scouting locations, takes and ad-libs, re-takes, and editing on set, all of which are features unique to film production.
In MOFAC's real-time animation process, there are five main sub-processes. Three of these sub-processes (layout compositing and shooting in a virtual environment, real-time editing and feedback, and real-time feedback for on-set production) were made possible through Unreal Engine. The engine's real-time rendering technology allows on-set production thanks to high-quality visualization and an excellent sense of immersion. To achieve this, asset creation, development, real-time operation, and updates are essential. MOFAC is currently developing an asset-creation pipeline, an I/O pipeline, a real-time editing data-sync system and sub-pipelines, multi-rendering from the user's perspective for feedback, other custom sub-tools, and external drive-control functions.
For Golden Panda, the major post-production stages applied a real-time pipeline similar to the one used to create the Fortnite trailer in 2018. The key difference from the existing production method is that all shots are created with real-time ray tracing to bridge the gap between the existing digital content creation tools and Unreal Engine's final look.
MOFAC currently utilizes a mix of old and new methods for certain areas, with the ultimate goal of unifying the process with Unreal Engine. In the future, the studio aims to shoot a piece from start to finish entirely with Unreal Engine.
The effects of Unreal Engine on real-time animation production
Enhancing productivity through real-time rendering
Previously, reviewing results meant using Maya's Playblast feature for previews and Maya for final renders, but Unreal Engine's real-time rendering enables the directors to communicate changes almost immediately, shortening production times from the pre-production stage through post-production.
It is now possible to add camera blocking and animation in Maya, view the general atmosphere through the master sequence, and fine-tune details by scrubbing through Sequencer in real time. The "what-you-see-is-what-you-get" factor is a huge benefit that streamlined the existing pipeline by several stages.
There were also great improvements in rendering time. The previous tool required at least 30 minutes per 2K frame, but Unreal Engine now renders a 4K frame in just five seconds with ray tracing enabled.
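A quick back-of-the-envelope calculation puts those figures in perspective. This is a rough sketch: the article gives only "2K" and "4K", so the DCI resolutions (2048×1080 and 4096×2160) are assumed here.

```python
# Comparing the per-frame render times cited above.
# Resolutions are assumed DCI 2K/4K; the article does not specify them.

old_seconds_per_2k_frame = 30 * 60  # at least 30 minutes per 2K frame
new_seconds_per_4k_frame = 5        # 5 seconds per 4K frame, ray tracing on

pixels_2k = 2048 * 1080
pixels_4k = 4096 * 2160             # exactly 4x the pixels of 2K

# Per-frame speedup, ignoring the resolution difference
frame_speedup = old_seconds_per_2k_frame / new_seconds_per_4k_frame
print(f"Per-frame speedup: {frame_speedup:.0f}x")  # 360x

# Per-pixel speedup, crediting 4K for rendering 4x as many pixels
old_per_pixel = old_seconds_per_2k_frame / pixels_2k
new_per_pixel = new_seconds_per_4k_frame / pixels_4k
print(f"Per-pixel speedup: {old_per_pixel / new_per_pixel:.0f}x")  # 1440x
```

Even on the conservative per-frame measure, that is a roughly 360-fold reduction in render time per iteration.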
Unreal Engine's comprehensive real-time rendering drives significant innovation by providing real-time feedback not only to directors and actors but to all participants. Participants receive immediate artistic and technical feedback in their areas of expertise, allowing MOFAC to cut the long communication loops between pre-production and post-production. Even the post-production stage has become more efficient, as the director can view the screen and immediately give direction and make decisions on the spot.
Expanding the artist's boundaries
By reducing the side effects of the waterfall method, production was made possible with half the manpower. In the previous pipeline, the content source was passed on from the earlier stages and compiled with a single program, just like a relay race. In contrast, Unreal Engine enables the team to view the final result and go beyond their boundaries without being limited to their respective positions.
Cutting asset costs: Houdini-HDA & Megascans
Certain circumstances, such as limited timelines and budgets, meant no artists were available for asset creation. With no artists on hand who had game-industry experience optimizing assets with levels of detail (LODs), Houdini was used to procedurally model the simple yet high-maintenance props and shorten the time.
When comparing importing FBX files to importing through Houdini Digital Assets (HDAs) with the same files and the same number of meshes, the latter was a much lighter procedure that reduced the data size. That's in part because objects added to large-scale environments were created with many variations and then instanced.
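The data-size saving from instancing comes down to a simple structural change: instead of storing a full mesh copy per placement, the scene stores one shared mesh plus a lightweight transform per instance. A conceptual sketch in plain Python (the numbers and names are illustrative, not Unreal or Houdini data structures):

```python
# One shared mesh: vertex data stored once.
shared_mesh = [(float(i), float(i) * 2.0, 0.0) for i in range(10_000)]  # ~10k verts

# Instanced placement: each copy is just (position, rotation, scale).
instances = [((x * 5.0, 0.0, 0.0), (0.0, 0.0, 0.0), 1.0) for x in range(500)]

# Naive duplication stores the full vertex list for every placement.
naive_records = len(instances) * len(shared_mesh)

# Instancing stores the mesh once plus one small transform per placement.
instanced_records = len(shared_mesh) + len(instances) * 3

print(f"Duplicated: {naive_records:,} vertex records")     # 5,000,000
print(f"Instanced:  {instanced_records:,} records total")  # 11,500
```

For 500 placements of a 10,000-vertex prop, duplication stores five million vertex records while instancing stores about eleven and a half thousand, which is why the HDA route came in so much lighter.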
When creating environments, a small number of artists had to handle a large batch of tasks, which made it challenging to collect enough assets to create beautiful backdrops at the desired quality. Megascans, available for free, enabled much faster placement of dense nature assets. Different versions of the background were also created using foliage, decals, exponential height fog, atmospheric fog, and post-process volumes.
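The dense nature placement described above boils down to scattering assets with randomized transforms. The idea behind foliage-style painting can be sketched in a few lines of plain Python (illustrative only; this is not the Unreal foliage API, and the function and asset names are hypothetical):

```python
import random

def scatter(asset_names, count, area=100.0, seed=42):
    """Place `count` assets at random positions with random yaw and scale,
    the basic operation behind foliage-style scattering."""
    rng = random.Random(seed)  # seeded so the same layout can be reproduced
    placements = []
    for _ in range(count):
        placements.append({
            "asset": rng.choice(asset_names),        # pick one of the scanned assets
            "position": (rng.uniform(0.0, area), rng.uniform(0.0, area)),
            "yaw_degrees": rng.uniform(0.0, 360.0),  # random rotation hides repetition
            "scale": rng.uniform(0.8, 1.2),          # slight size variation
        })
    return placements

layout = scatter(["rock_a", "fern_b", "grass_c"], count=1000)
print(len(layout), "placements;", "first asset:", layout[0]["asset"])
```

Randomized rotation and scale are what keep a small asset library from looking repetitive, and seeding the generator means a layout the director approves can be regenerated exactly.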
Golden Panda director Ho-seok Sung and DP Woo-hyung Kim both said they used Unreal Engine to view the look and set up the lighting in real time, which enabled more creative work. Compared to their existing software, they said this made a huge difference, and they are currently refining their pipeline while looking forward to the possibilities the future brings.
Real-time rendering and future content
In addition to real-time animation projects, MOFAC leveraged Unreal Engine to work on an immersive live performance that combined a virtual character with a physical venue. The studio is also using Unreal Engine in the pre-production phase of a large-scale project scheduled to begin filming in April. This process includes creating a virtual set and checking the location prior to filming using VR scouting. On the virtual set, the director works on movement and acting with the actors in advance. Virtual camera tools are also being customized, and in-house hardware is set up to empower the cinematographer when shooting in VR. MOFAC plans to support on-set visualization using Ncam once filming begins.
The essence of content is the entertainment factor of a story. Today, Unreal Engine allows studios to efficiently test different visions, thanks to new levels of real-time rendering, without exorbitant costs. This puts content creators at the core of the future and paves the way for a much more creative content market. As is evident in this story, Unreal Engine is a powerful tool that offers many benefits to creators in the film and animation industries.