Composed of VFX professionals with more than 20 years of experience, Korean production company Westworld has contributed to more than 200 of Korea’s top films and series, including The Soul Guardians, Guardian: The Lonely and Great God, and Mr. Sunshine.
For Netflix’s popular original series Sweet Home, the veteran studio worked alongside esteemed director Lee Eung-Bok, who has helmed a string of hit K-dramas.
Westworld was responsible for the VFX behind the show and, during pre-production, became the first Korean production company to use a new filming method that combines real-time camera tracking and motion capture using Unreal Engine, Ncam, and Xsens. To learn more about how this was done, we interviewed Westworld Supervisor Lee Byeong-joo.
There was a lot of buzz going into the release of Sweet Home due to the production team’s high pedigree. Could you briefly tell us about the project and the team?
Westworld Supervisor Lee Byeong-joo: Sweet Home is a popular thriller webtoon series that kicked off in 2017 and released its final episode in 2020. As one of the many fans who stayed up all night to binge the series, I was ecstatic to hear of plans to recreate it as a Netflix original series and excited to be part of the production. The series was directed by Lee Eung-Bok, who is well known for a number of hit K-drama titles. Having worked with us on previous projects, Director Lee has a deep understanding of and interest in VFX, which enabled us to work together to create the large-scale creatures in this series.
What was Westworld’s role in Sweet Home, and what made Unreal Engine a good fit for the production process?
Byeong-joo: Westworld was responsible for character design, prop design, storyboarding, and previs, as well as the creation of the digital creatures and the overall VFX for the series. Sweet Home has an intriguing premise: humans turn into different monsters based on their individual desires. That made the series incredibly challenging in a technical sense, because we had to create a plethora of monsters, each with a distinct look and set of characteristics, on a scale that had never been attempted before. The biggest challenge was creating shots such as outdoor close-ups of monsters in broad daylight, clashes between monsters, and interactions between monsters and real actors entirely in CG.
In order to create high-quality monsters that were not physically present on set, it was important to leverage virtual production to get a sense of their movement in advance so that the actors could believably interact with them.
Image courtesy of Westworld
We chose Unreal Engine among several virtual production solutions after conducting multiple tests and evaluations. The most crucial factor was Live Link, which makes it easy to control lighting and a character’s position and scale by linking in external data such as camera tracking and motion capture. We leveraged Live Link to drive the virtual character from the real actor’s performance and movement, and to achieve more accurate camera movement.
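To give a sense of how that linkage works on the engine side, below is a minimal sketch of reading a tracked-camera transform from Live Link in C++. It assumes a recent Live Link API (Unreal Engine 4.23 or later) and a hypothetical subject name, "NcamCamera"; it illustrates the general approach rather than Westworld’s actual setup.

```cpp
// Minimal sketch: drive a cine camera from a Live Link transform subject.
// The subject name "NcamCamera" is hypothetical.
#include "CineCameraActor.h"
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"
#include "Roles/LiveLinkTransformRole.h"
#include "Roles/LiveLinkTransformTypes.h"

void UpdateCameraFromLiveLink(ACineCameraActor* Camera)
{
	IModularFeatures& Features = IModularFeatures::Get();
	if (!Camera || !Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
	{
		return; // Live Link plugin not loaded
	}

	ILiveLinkClient& Client =
		Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

	// Ask Live Link for the latest frame published under the subject name.
	FLiveLinkSubjectFrameData FrameData;
	if (Client.EvaluateFrame_AnyThread(
			FLiveLinkSubjectName(TEXT("NcamCamera")),
			ULiveLinkTransformRole::StaticClass(),
			FrameData))
	{
		if (const FLiveLinkTransformFrameData* Transform =
				FrameData.FrameData.Cast<FLiveLinkTransformFrameData>())
		{
			// Apply the tracked transform to the virtual camera.
			Camera->SetActorTransform(Transform->Transform);
		}
	}
}
```

In practice, the same hookup can be made without code via a Live Link controller component on the camera actor; the sketch simply shows how external tracking data arrives in the engine as a named subject that can drive any transform.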
Also, the director’s intent could be communicated to the entire filming team in real time through the on-set screens. This not only sped up communication but also greatly shortened filming time, since the data captured during the shoot could be reused in post-production.
Image courtesy of Netflix
The scenes where monsters and real actors interact are very believable. Could you shed light on how this was done?
Byeong-joo: As we adopted virtual production for this project, the combination of Unreal Engine, Ncam, and Xsens was key to shooting realistic scenes, because it enabled the entire filming crew, production team, and cast to monitor the scene in real time on set.
Normally, on local shoots, we use these two methods separately: Unreal Engine with Ncam to track the cameras in real time, or an Xsens suit to receive motion capture data. For Sweet Home, however, we tried something different by combining real-time camera tracking and real-time motion capture in Unreal Engine.
Image courtesy of Westworld
Here we see our steroid monster placed on set using virtual production and motion capture.
For example, we had to shoot a scene where our more than four-meter-tall “steroid monster” collides with a fire truck. To achieve the interaction between the monster and the truck, the filming team needed to monitor the shot in real time to see how the monster would appear in the scene from a given angle, taking its range of motion into account. This is where Unreal Engine came in very handy. The motion-captured actor performed behind the camera against a dummy fire truck, and we used that data in Unreal Engine to place a virtual monster in front of the actual fire truck. The motion-captured actor could view the shot being filmed in real time, allowing him to monitor the monster’s movement as he performed, while the director and production team could offer guidance on the monster’s movement in real time on site.
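As a rough illustration of the motion capture side of that setup, the sketch below reads the latest skeletal frame from a Live Link animation subject, such as the suit-driven performer. The subject name "MocapActor" is hypothetical, and in a real project the skeleton would typically be bound through a Live Link Pose node in an Animation Blueprint rather than in C++; this only shows the shape of the data that ultimately drives the creature rig.

```cpp
// Minimal sketch: read a skeletal frame from a Live Link animation subject.
// The subject name "MocapActor" is hypothetical.
#include "CoreMinimal.h"
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"
#include "Roles/LiveLinkAnimationRole.h"
#include "Roles/LiveLinkAnimationTypes.h"

void InspectMocapFrame()
{
	IModularFeatures& Features = IModularFeatures::Get();
	if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
	{
		return;
	}

	ILiveLinkClient& Client =
		Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

	FLiveLinkSubjectFrameData FrameData;
	if (Client.EvaluateFrame_AnyThread(
			FLiveLinkSubjectName(TEXT("MocapActor")),
			ULiveLinkAnimationRole::StaticClass(),
			FrameData))
	{
		if (const FLiveLinkAnimationFrameData* Anim =
				FrameData.FrameData.Cast<FLiveLinkAnimationFrameData>())
		{
			// Transforms holds one FTransform per bone in the captured skeleton;
			// a retargeting step maps these onto the (much larger) creature rig.
			if (Anim->Transforms.Num() > 0)
			{
				UE_LOG(LogTemp, Log, TEXT("Mocap root: %s"),
					*Anim->Transforms[0].GetLocation().ToString());
			}
		}
	}
}
```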
Had we used traditional filming methods, the monster would have been far too massive to realize with special effects makeup. In that case, a double dressed in the chroma key color would have had to move along a path of motion determined by the director in advance, and the cinematographer would have had to shoot from a specific angle. The traditional approach would also have required breaks and the director’s guidance between takes. Fortunately, the Westworld team was able to skip these steps by using real-time technologies, which also fostered a unique collaborative process: the actors and directors could communicate and work together in real time, saving a significant amount of time and resources.
Image courtesy of Westworld
Virtual production of our steroid monster in an indoor battle scene.
Our real-time method not only benefited on-set shooting but was also useful during post-production. So that we could shoot and edit at the same time on set, most scenes involving the steroid monster were shot in a virtual studio, where the director and cinematographer used motion capture to monitor the monster’s movement and scale in real time. Meanwhile, the motion capture data and lighting conditions were recorded in real time, letting us complete postvis without additional CG work and use it directly for editing.
We're glad to hear that Unreal Engine provided an effective solution. Is there an Unreal Engine feature that was especially helpful during the filming process?
Byeong-joo: In Unreal Engine 4.19, the Live Link plugin greatly expanded the engine’s scope from previs to actual shooting and post-production. For motion capture and camera tracking, the Westworld team used Xsens and Ncam respectively, both of which support Unreal Engine’s Live Link plugin. Sweet Home, in particular, featured many scenes where a virtual creature had to move within a virtual environment. Without the Live Link plugin, it wouldn’t have been possible to link the motion capture data to the creature within Unreal Engine, or to connect the physical cameras with virtual ones, to enable this real-time filming method. Since Live Link offers great scalability and endless possibilities by letting us bring real-world data into the virtual world, we would like to use it on a variety of future projects. In addition, Unreal Engine provides a structure that makes it easy to manage and use external plugins, which is very convenient for intuitively managing assets for each project.
Image courtesy of Netflix
Considering Westworld has streamlined its filming pipeline using Unreal Engine, what advice do you have for those who want to start using virtual production in their own pipelines?
Byeong-joo: Using virtual production not only in the production stage but also in pre-production allows you to visually confirm the details of a shoot and plan accordingly. As mentioned, we used motion capture to drive our virtual characters, but keyframe animation is equally important. Bipedal characters are easy to drive with motion capture, but four-legged or eight-legged characters, such as the spider-like tentacle monster in Sweet Home, must be keyed by an animator. Keeping in mind that Unreal Engine lets you view virtual characters rendered in real time, make sure to create high-quality assets and define each character’s detailed movements, range of motion, and environments in advance. This will allow the director and cinematographer to take full advantage of a virtual studio to plan the shots in detail.
The film industry is evolving from a linear pipeline of pre-production, production, and post-production into a non-linear process. We are still in the early stages of filming with virtual production, but as industry professionals gain an understanding of the technology, the proliferation of the necessary equipment and sets will soon revolutionize filmmaking.
I believe that Unreal Engine’s real-time rendering performance and quality will eventually replace existing offline renderers. Especially for effects, muscle and fur, and crowd simulations that require large-scale computation, the time and resources needed for rendering will be significantly reduced, boosting productivity across the VFX industry. We’re also looking forward to Unreal Engine 5 features such as Nanite easing the data-compatibility difficulties of existing VFX applications.
What does the future hold for Westworld? What is your next goal?
Byeong-joo: Through Sweet Home, our team gained real-time virtual production experience connecting the real world with the virtual one. We are building on that experience with R&D into LED walls and in-camera VFX, and applying them in our upcoming projects.
Westworld plans to flexibly combine Unreal Engine’s real-time rendering with existing offline rendering methods across a variety of upcoming projects, and we aim to grow into a groundbreaking company whose pipeline produces the best-quality work in the most efficient way possible. You can find the latest news about Westworld on our website and on our Facebook page. Also, Sweet Home is now streaming on Netflix.
Want to ensure a high degree of quality and create projects efficiently using Unreal Engine’s virtual production capabilities? Learn more below.