Image courtesy of Locus Corporation

Visualizing thoughts with real-time animation: exploring ‘Yumi’s Cells’

April 21, 2022
When creating Yumi's Cells, a Korean TV drama based on a popular webtoon, LOCUS had a lot to live up to. In order to maintain the charm of the original series, the project presented a particular animation challenge: the depiction of the cells that personify Yumi’s emotions and instincts. Viewers loved these cells, so the show’s success hinged on getting them right.

Yumi's Cells explores the wonders of the human brain, tapping into human emotions and how they translate into behavior. In this project, LOCUS skilfully delivered a seamless combination of live-action performances and 3D animation, which has been key to the show's success. We talked with the artists behind the TV series to find out how they approached the challenge of portraying human thinking in such a unique way.
 

Please tell us about Yumi's Cells and what you did on the project.

The Yumi's Cells TV show is a TVING original series based on a webtoon. In this fascinating project, both live actors and the 3D cells in Yumi's head are presented together.

The most important thing for the animation was to show the unique cells vividly and flawlessly blending with the photorealistic scenes. Fortunately, we delivered it successfully and the show was a hit with viewers of all ages in 160 countries around the world—including Europe, North America, and Southeast Asia.

What was the decision behind using Unreal Engine for the production?

LOCUS has continuously researched and introduced new technologies to efficiently produce high-quality content. Unreal Engine was presented as a new solution to aid production efficiency and innovation, and our studio researched it with the goal of adopting it. The production staff shared our ambition to make Yumi's Cells even better, so we decided to use Unreal Engine extensively in this project.

We had already used Unreal Engine a few years ago for some small internal projects, including a promo video, a VR game prototype, and R&D on our real-time animation pipeline. We thought Yumi's Cells was a golden opportunity to use Unreal Engine in a bigger production.

Where and how did you use Unreal Engine specifically?

Unreal Engine was used in many areas—from asset look development and shot lighting to final rendering—because it delivered results that were both high quality and fast.
Image courtesy of Locus Corporation
For the characters, components like the Skeletal Mesh, Alembic Cache, and Decal were grouped into a single actor class. Once our master Blueprint actor was complete, child actors were derived from it to create each character. Within the Blueprint, we then added the necessary scripts—from simple dynamic material instance manipulations to complex functions—to interact with background actors.
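
The master/child actor pattern described above can be sketched in plain Python. This is a simplified model, not engine code: in production it lives in Blueprint classes, and the component list and character name here are illustrative assumptions.

```python
class CellActorMaster:
    """Mock of the master Blueprint actor: shared components and scripts
    live here, so every character inherits them automatically."""
    def __init__(self, name):
        self.name = name
        # Assumed component set for illustration only.
        self.components = ["SkeletalMesh", "AlembicCache", "Decal"]
        self.blush = 0.0

    def set_blush(self, amount):
        """Stands in for a simple dynamic-material-instance tweak
        that all derived characters share."""
        self.blush = amount


class LoveCell(CellActorMaster):
    """Child actor derived from the master; only character-specific
    data differs, everything else is inherited."""
    def __init__(self):
        super().__init__("love_cell")
```

Deriving each character from one master means a fix or new script added to the master propagates to every character at once.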

We composed our backgrounds using terrain and foliage based on both Unreal Engine's landscapes and static meshes created in Maya. The desert around Yumi village, in particular, was designed as a single space using these landscapes. In addition, each building in the village was generated as a module, and each background was composed as a level within the engine.

We adjusted background assets depending on the circumstances, setting them as Blueprint actors. For lighting, we grouped components including cloud/sky meshes, light, and fog to create a Blueprint actor. Child Blueprints were used again, according to different lighting conditions.
Image courtesy of Locus Corporation
To blend our photorealistic and animation scenes together, we explored various lighting conditions. We matched the lighting in Yumi’s village, for instance, by creating a light Blueprint for each time window, spawning a proper Blueprint in each scene's Sequencer.
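
As a rough sketch of that per-time-window approach, the logic for choosing which lighting Blueprint to spawn can be modeled in plain Python. The preset names and parameter values below are illustrative, not taken from the production.

```python
from dataclasses import dataclass

@dataclass
class LightingPreset:
    """Stand-in for a light Blueprint: one bundle of sky, light, and fog settings."""
    name: str
    sun_intensity: float
    fog_density: float

# Hypothetical time-window presets for Yumi's village.
PRESETS = {
    "morning": LightingPreset("morning", 4.0, 0.05),
    "noon":    LightingPreset("noon",   10.0, 0.01),
    "evening": LightingPreset("evening", 2.0, 0.08),
    "night":   LightingPreset("night",   0.2, 0.12),
}

def preset_for_hour(hour: int) -> LightingPreset:
    """Pick the lighting preset to spawn for a shot set at the given hour."""
    if 5 <= hour < 11:
        return PRESETS["morning"]
    if 11 <= hour < 17:
        return PRESETS["noon"]
    if 17 <= hour < 21:
        return PRESETS["evening"]
    return PRESETS["night"]
```

In the actual pipeline, the selected Blueprint would then be spawned from the shot's Sequencer rather than returned from a function.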

Any effects we needed were either embedded as a component within the characters and prop Blueprints, or inserted in the scene independently in the form of a Blueprint actor. We also used Niagara, which enabled our lighting artists to implement effects by only adjusting parameters of the Blueprint actor. Because of this, they could easily use and control high-quality effects without having to learn the Niagara system. In addition, the team prepared effects including snow, rain, dust, and fog in advance, so they could use them anytime. This simplified the process for the lead lighting artists, helping them plan concepts and compose the shot they wanted.
Image courtesy of Locus Corporation
What were the changes in the production pipeline after the introduction of Unreal Engine in the Yumi's Cells project?

The most significant step was to establish a lighting solution immediately, which we did by delivering a massive amount of asset/shot animation data from Maya to Unreal Engine.

During the production of numerous feature-length and TV animations, LOCUS developed an in-house database tool that collects information on everything from modeling to compositing. This allows us to take data generated across all of our work processes and use it in our various digital content creation (DCC) tools. As a result, we were able to easily feed this data into Unreal Engine's FBX content pipeline.

In the initial asset porting stage, we collectively exported various asset components produced in Maya in a format compatible with Unreal Engine. In any DCC pipeline, version control is important to store the development history of an asset. For this project, we utilized this function to keep assets up to date by checking the latest version of each asset component in Unreal Engine and deciding whether we needed to re-import it.
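
A minimal sketch of that version-check logic, in plain Python and assuming a simple "v###" version naming scheme (the scheme and default version are assumptions for illustration):

```python
def needs_reimport(engine_version: str, dcc_version: str) -> bool:
    """Re-import only when the DCC (Maya) version is newer than
    the version currently held in Unreal Engine."""
    def parse(v: str) -> int:
        return int(v.lstrip("v"))  # "v012" -> 12
    return parse(dcc_version) > parse(engine_version)

def assets_to_update(engine_versions: dict, dcc_versions: dict) -> list:
    """Compare every asset component and list those needing a re-import.
    Components the engine has never seen default to 'v000'."""
    return sorted(
        name for name, version in dcc_versions.items()
        if needs_reimport(engine_versions.get(name, "v000"), version)
    )
```

The point of the check is to skip re-imports for untouched assets, which matters when hundreds of components move between Maya and the engine.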

In the shot-building stage, we created spawnable actors in the level sequence asset, so we could manage every component of our shot in one place. This Sequencer-based approach was combined with the Sequencer scripting plugin, yielding an efficient and improved automation of our tasks, including how we bind assets or add animation assets to different shots.
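
Outside the editor, the shot-building idea can be mocked in plain Python. In production this would go through Unreal's Sequencer scripting API; the field names below are illustrative assumptions.

```python
def build_shot(shot_id: str, characters: list, anim_clips: dict) -> dict:
    """Mock of building a level sequence: one spawnable binding per
    character, with the matching animation asset attached to each."""
    sequence = {"shot": shot_id, "bindings": []}
    for char in characters:
        sequence["bindings"].append({
            "actor": char,
            "spawnable": True,                 # owned by the sequence itself
            "animation": anim_clips.get(char), # None if no clip delivered yet
        })
    return sequence
```

Because spawnables are owned by the sequence, every component of a shot is managed in one place, which is what makes this kind of batch automation practical across hundreds of cuts.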
Image courtesy of Locus Corporation
What are some of the benefits of Unreal Engine you discovered?

Although there are many benefits to using Unreal Engine, the most useful is its real-time capability. Our previous lighting pipeline didn't support real-time setup and adjustment, so our designers had to anticipate how the lighting would look. In Unreal Engine, lighting changes are reflected in real time, so our designers can adjust lighting immediately. It was also invaluable for the entire team to view and discuss the composition in a live environment, so we could resolve problems—such as unexpected sampling errors or long render times—on the fly.
Image courtesy of Locus Corporation
We also obtained rendering quality equivalent to that of an offline renderer, but much faster. If we had worked on the Yumi's Cells project with a traditional approach—rather than real-time rendering—the render time would have been 20 times longer. Thanks to Unreal Engine, we didn't need to worry about a drastic increase in processing times, which let us focus on the creative quality of our shots.

Another huge benefit was compatibility with Python, a cross-platform language supported by various DCC tools. Unreal Engine supports executing engine commands with Python, so you can customize it quickly and easily. In particular, Python exposes the functions an animation pipeline requires, including Sequencer and Movie Render Queue, so we could automate the handling of massive amounts of data and fit it into our existing DCC production pipeline.
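
As a sketch of the kind of batch automation this enables, here is a plain-Python mock of queuing one render job per shot. The real pipeline would call the Movie Render Queue Python API inside the editor; the paths, naming scheme, and resolution here are assumptions.

```python
def queue_render_jobs(shots: list, output_dir: str = "renders") -> list:
    """Mock of batching shots into a render queue. Each shot gets a job
    pointing at its sequence asset and a per-shot output folder."""
    jobs = []
    for shot in shots:
        jobs.append({
            "sequence": f"{shot}_seq",        # assumed naming convention
            "output": f"{output_dir}/{shot}/",
            "resolution": (1920, 1080),       # assumed delivery resolution
        })
    return jobs
```

With 200 to 300 cuts per episode, generating these jobs from shot lists rather than by hand is what keeps the pipeline manageable.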

Material instances were another benefit, and something we hadn't seen in our previous tools. One parent material can branch into many variations, and a large number of materials can be modified at once. These features were extremely useful for managing dozens of similar-looking characters and the materials of many assets, both of which were necessary given the nature of this project.
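
The inheritance behavior that makes this work can be modeled in a few lines of plain Python. This is a simplified model of the concept, not Unreal's actual material API, and the parameter names are illustrative.

```python
class Material:
    """Minimal model of a parent material with named parameters."""
    def __init__(self, **params):
        self.params = params

class MaterialInstance:
    """An instance overrides some parameters and inherits the rest
    from its parent, so parent edits propagate to every instance."""
    def __init__(self, parent, **overrides):
        self.parent = parent
        self.overrides = overrides

    def get(self, name):
        return self.overrides.get(name, self.parent.params[name])

# One master material for the cell characters, many tinted instances.
cell_master = Material(base_color="yellow", roughness=0.6)
happy = MaterialInstance(cell_master, base_color="pink")
angry = MaterialInstance(cell_master, base_color="red")

# Editing the parent updates every instance that doesn't override it.
cell_master.params["roughness"] = 0.3
```

This is exactly why dozens of similar-looking characters are cheap to manage: one change at the parent level fans out to every derived material.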

Blueprint was another important feature we used. It provides a bridge that lets artists who aren't engineers access technical aspects more easily. Blueprint can be used effectively in many areas, because it not only controls or manipulates actor-class settings and scene elements (including the level script), but also lets you build the utility tools a production needs.

For example, take scenes where a character walks in the snow leaving a trail of footprints. In our existing Maya-based process, the footprints were matched to the finished walking animation, so we had to redo the footprint effect whenever the animation changed. In Unreal Engine, you simply implement a setup that creates footprints automatically whenever the character takes a step on the snow.
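
That event-driven setup can be sketched in plain Python. In the engine this would typically be an animation notify firing a Blueprint event that spawns a footprint on the snow; the names and spacing below are illustrative.

```python
footprints = []

def on_footstep(location, foot):
    """Event fired whenever a foot lands. Stands in for a Blueprint
    event that spawns a footprint decal at the contact point."""
    footprints.append({"location": location, "foot": foot})

# Simulating a short walk cycle: footprints follow the animation
# automatically, so changing the walk never requires redoing the
# effect by hand.
for i, foot in enumerate(["left", "right", "left", "right"]):
    on_footstep((i * 0.6, 0.0), foot)  # assumed 0.6-unit stride
```

The key design point is inversion: the animation drives the effect, instead of the effect being hand-matched to a finished animation.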
Image courtesy of Locus Corporation
What was the most challenging aspect of using Unreal Engine and how did you resolve it?

We worried about the final result: Yumi's Cells had a tight broadcast schedule, we'd adopted a new format combining photorealism and animation that required a lot of assets and shot work, and most artists at LOCUS were new to Unreal Engine.

One of the challenges we overcame was figuring out how to deliver animation that was controlled by a complex rig—containing many deformations—using Unreal Engine. We solved this problem by creating the cell characters' faces with Alembic and the other body parts with FBX. The cell characters' eyes had to be expressed in a cartoon style, so we added a geometry component to the Blueprint actors and set it up with scripting. We then connected the Alembic face data in Sequencer.

What know-how is behind your successful work, and what did you learn from the production process?

Based on previous experience, we developed an internal Python-based tool at the onset of this project to prevent confusion between pipelines and artist teams. Without it, we wouldn't have been able to meet our deadlines, considering each episode of Yumi's Cells is composed of 200 to 300 cuts. In the early stages, there were problems with Blueprint and Sequencer asset rendering, but both LOCUS and the Unreal Engine teams actively sought out solutions; Epic Games Korea helped us a lot.

As an animation studio, our biggest concern was introducing a real-time 3D tool like Unreal Engine to staff and helping them get familiar with it quickly. We found that the best approach was to determine the training topics and scope, identify which parts needed to be automated with pipelines and Blueprints, and then discover what we needed to improve as we performed our established process.
Image courtesy of Locus Corporation
Before starting the main production, we hosted an Unreal Engine class to teach our artists the basics of the software. We also conducted 1:1 and hands-on guided training sessions. After that, we documented the cause and solution whenever a new problem occurred while using Unreal Engine, so we could refer back to them easily if the problem recurred. For common problems, like camera-setting errors, we developed a plugin to automate the fix.

We also had to develop field manuals to explain the characteristics of assets and how to use them, so we could capture the workflow in detail. We presented the numerical values of options and how to use the editor for each workflow stage, so artists could unify asset controls and maintain consistency.

What is next for LOCUS?

As a result of the popularity of Yumi's Cells Season 1, we are producing Season 2 and a full 3D feature-length animation, so you'll meet the various Yumi's Cells characters through both the drama and the animated film. In addition, we are producing an animation based on Toemarok, the number one best-selling Korean novel and the most famous fantasy novel in South Korea. The animation will combine the occult genre and the compelling story of the original work with the stylish and modern charm of animation. LOCUS is also planning and producing many animations based on fiction and webtoons, as well as two feature-length animations of Running Man (a popular animation series for children) and another original family animation for global audiences called Red Shoes.

In addition, LOCUS is developing many technologies and designs to bring joy to everyone in the world with cool and funny animation. You can find more information on LOCUS via our website, YouTube, and Instagram.
