Image courtesy of Netflix

Netflix smash hit Space Sweepers harnesses Unreal Engine for real-time previs

Jinyoung Choi
With more than 20 years of production experience and know-how, WYSIWYG STUDIOS produces a wide range of video content from movies to TV series.
The newly released Korean sci-fi film Space Sweepers, a Netflix Original, has the VFX world buzzing about its stunning special effects. In addition to the vast background of space and a CG robot as a main character, the dystopian tale features several well-worn digital spacecraft, all of which are intricately designed both inside and out.

Space Sweepers’ success is especially noteworthy for achieving such a high level of visual quality on a budget significantly lower than comparable Hollywood offerings (around $20 million), and in a genre rarely seen from the Korean film industry. The film debuted on Netflix at #1 in February 2021 and held a Top 10 spot for weeks, racking up millions of views from all over the globe.

We met with Hee-sung Yang, Creative Director of WYSIWYG STUDIOS, to learn how the studio managed to hit two major targets—technical advancement and blockbuster popularity—using Unreal Engine.

Please tell us about the film and production team.

WYSIWYG STUDIOS specializes in VFX production to create all kinds of video content that can be made in CG. Since our early beginnings in 2016, we have continued to use Unreal Engine in our content R&D efforts. 

Among the different sci-fi genres, Space Sweepers, which WYSIWYG took part in, is classified as a space opera, which requires a lot of CG. More than 1,000 CG/VFX experts participated in creating various CG elements, from large-scale battle scenes where hundreds of spacecraft are scattered across a vast universe to the detailed depiction of the spacecraft interiors, the steel body of a robot named Bubs, and full-suit armor on squads of soldiers. We were proud to be part of this production.

The production team’s hard work shines through in the space environment of Space Sweepers. Could you share how Unreal Engine was used for previs and what type of scenes it was used for?

All the CG elements mentioned above are combined in two scenes. In the first scene, Bubs launches himself between spaceships to shoot them down with harpoons, with a vast space landscape in the background. In another scene, several spacecraft tailgate one another along complex passageways inside a planet-sized factory. These scenes were 100% CG, so actors had to act out the necessary lines and actions based on what they imagined the final scene to look like. This is where Unreal Engine came through. In short, we started to use Unreal Engine during the previs stage. This not only made it possible for us to test the look of the final pixels early in the production pipeline and use the shots for final lighting and composition, but also enabled us to share the final shots with the entire production team. This ability to communicate the director’s intention minimized trial and error while saving a lot of time and cost.
 
Image courtesy of Wysiwyg Studios
Previs of space battle scene

Hearing about the takeaway from your experience makes us wonder how you introduced Unreal Engine to the pipeline. What led you to use Unreal Engine?

The quality and speed, plus the ability to view high-quality visuals in real time. We believed we could maximize Unreal Engine’s potential in previs, where the director’s vision needs to be fleshed out in order to apply it in a quick, flexible, and precise manner.

In fact, Unreal Engine has helped us build a pipeline optimized to our needs, one that lets us focus on realizing our ideas and respond to any situation. Most people tend to regard previs as a passive process for visualizing the director's vision. However, there are many cases of previs teams brainstorming and presenting new scenes as necessary, and proactively building up scene composition. In this case, it is crucial for the previs team to convince the production team as well as the director that the suggested scenes will engage the audience. Unreal Engine offered high-quality visuals for previs that proved to be highly effective for this purpose.

We’re talking about visual quality that goes beyond simply pleasing shots: the ability to create stunning lighting at the previs stage. In a traditional pipeline, lighting could only be considered during the production stage, but with Unreal Engine, we were able to visualize the lighting and its detailed aspects much earlier. This enabled us to share a clearer vision with the team from the start of production and achieve higher-quality scenes.
 
Image courtesy of Wysiwyg Studios
Previs of tunnel chase scene

How has using Unreal Engine for previs helped increase the scene quality? 

The power of Unreal Engine is highlighted in the tunnel scene, which requires processing heavy data such as highly detailed designs, lighting, and VFX. The scene starts out the same way as in the traditional method: we first create dummy assets with minimal information such as size and location, and build on them in relation to the position and path of the camera and subject. At the same time, we build the levels in Unreal Engine, which opens up possibilities for various tasks that were not possible in the traditional process.

We began by placing assets in Unreal Engine's real-time environment while doing lighting tests at the same time. This enabled us to do simple, intuitive iterations to easily determine a scene’s atmosphere, quality, amount of light, and color tones. Being able to organically work on lighting, camera work, and animation at the same time meant we could add detail to the scene’s blocking. Also, camera features such as lighting settings for different camera angles, motion blur, depth of field, and lens flares could be layered in. 
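To make the kind of camera work described above concrete, here is a minimal Editor Python sketch, using hypothetical lens values rather than the studio's actual setup: it spawns a cine camera in a level and layers in depth of field and motion blur through the camera's lens and post-process settings.

```python
import unreal

# Spawn a cine camera in the current level (editor scripting; UE4-era API).
camera_actor = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor,
    unreal.Vector(0.0, 0.0, 200.0),
    unreal.Rotator(0.0, 0.0, 0.0),
)

# Grab the CineCameraComponent to drive lens and focus settings.
cine_cam = camera_actor.get_component_by_class(unreal.CineCameraComponent)

# Lens: an illustrative 35mm lens at f/2.8 for shallow depth of field.
cine_cam.set_editor_property("current_focal_length", 35.0)
cine_cam.set_editor_property("current_aperture", 2.8)

# Focus: manual focus on a subject assumed to be 5 meters away (500 cm).
focus = cine_cam.get_editor_property("focus_settings")
focus.set_editor_property("manual_focus_distance", 500.0)
cine_cam.set_editor_property("focus_settings", focus)

# Motion blur layered in through the camera's post-process settings.
pp = cine_cam.get_editor_property("post_process_settings")
pp.set_editor_property("override_motion_blur_amount", True)
pp.set_editor_property("motion_blur_amount", 0.5)
cine_cam.set_editor_property("post_process_settings", pp)
```

Because the camera, lighting, and animation all live in the same real-time level, values like these can be adjusted and reviewed shot by shot instead of waiting for offline renders.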

In addition, Unreal Engine made it possible to add VFX to the previs at the desired location with a specific size and timing, which ordinarily would have been possible only at the production stage. As a result, three-dimensional VFX, which was difficult to capture in the previous previs process, was easily attainable. Traditional previs would have been limited to simply placing objects to create the composition, but the power of Unreal Engine enhances the pipeline to achieve a three-dimensional look and lighting with a myriad of geometry.
 
Image courtesy of Wysiwyg Studios

When the new previs pipeline with Unreal Engine is compared to the traditional previs pipeline, what is the benefit of using Unreal Engine?

Although it is important to handle factors such as color, mood, tone, light and shadow, and camera effects when directing, the traditional previs pipeline was limited to placing objects according to the storyboard and camera layout, simply because going beyond that scope was impossible. However, when Unreal Engine was introduced to the previs phase, we were able to test things we couldn’t before. With that in mind, traditional previs and previs using Unreal Engine have three main differences.
Comparison of traditional previs (left) and Unreal Engine previs (right)
The first difference is real-time rendering and lighting processing. In the traditional process, the complex correlation between light and space had to be computed to calculate the values for light placement, look dev, rendering, etc. So, it was nearly impossible to previsualize final pixels in a short time. However, with Unreal Engine, we can now test lighting and textures in real time, which were once considered impossible to manage at the previs stage. 

The second difference is in how assets are processed. In the previous pipeline, only color, simple transparency, and normal values were processed, and there was a hard limit to how many polygons could be handled in one scene. Meanwhile, Unreal Engine can process far more polygon data simultaneously while offering intuitive, instance-based, high-quality texturing and shading in real time, making it much easier to handle highly dense scenes and enhance the visual quality at the same time.

The final difference is blur effects. Since previs is used to compose scenes that require advanced filming techniques or are impossible to shoot in real life, fast-paced action scenes are normally handled in this stage. When there is an abundance of quick movement in scenes, blur effects—especially motion blur—are key to achieving a believable sense of speed. In fact, motion blur makes the difference between still cuts and videos. If a subject passes along a layout in an instant without motion blur, it will simply be perceived as a flickering object. For this reason, motion blur effects are critical to scenes where speed amplifies tension and engagement, such as the harpoon battle and tunnel chase scenes. Despite its importance, motion blur effects have been inaccurately portrayed in the traditional process due to the inability to compute the exact speed of cameras and objects. This made it close to impossible to properly previsualize these scenes. 
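As a rough illustration of why exact speeds matter, a standard shutter-angle calculation (an assumption added for illustration, not taken from the interview) shows how the length of a motion blur streak follows directly from frame rate and on-screen subject speed:

```python
def motion_blur_length(speed_px_per_s: float, fps: float = 24.0,
                       shutter_angle_deg: float = 180.0) -> float:
    """Approximate blur streak length in pixels for one frame.

    A 180-degree shutter at 24 fps exposes each frame for 1/48 s,
    so the streak is the subject's on-screen speed times that exposure.
    """
    exposure_s = (shutter_angle_deg / 360.0) / fps
    return speed_px_per_s * exposure_s

# A subject crossing the frame at 2,000 px/s blurs roughly 41.7 px per frame.
print(motion_blur_length(2000.0))
```

If the previs tool cannot evaluate camera and object speeds per frame, it cannot produce a streak of the right length, which is why fast scenes read as flicker rather than motion.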

In other words, it would not be an exaggeration to state that the harpoon battle scene and tunnel chase scenes were made possible thanks to the various benefits offered by Unreal Engine, which enabled us to compose the scene in real time and achieve final-pixel-quality results.
Image courtesy of Wysiwyg Studios
What feature was the most useful in Unreal Engine?

The first feature I’d like to mention is the Material Editor. Unreal Engine's Material Editor offers the ability to create incredible looks. Materials were used to portray intricate texture detail, and it was possible to achieve repetitive animation and a variety of effects just by adding a few nodes. In addition, it is easy to customize the structure of materials in the Material Editor, which made it suitable for quickly creating effective visuals as needed for the project.
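As a minimal sketch of that "few nodes" idea (asset names and paths here are hypothetical, not the studio's), a repeating animated look can be assembled by scripting a Panner into a texture sample with Unreal's MaterialEditingLibrary:

```python
import unreal

# Create a new material asset (hypothetical path and name).
asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
material = asset_tools.create_asset(
    "M_ScrollingPanel", "/Game/Previs/Materials",
    unreal.Material, unreal.MaterialFactoryNew())

mel = unreal.MaterialEditingLibrary

# A Panner node slowly scrolls UVs to give a repeating animation.
panner = mel.create_material_expression(
    material, unreal.MaterialExpressionPanner, -600, 0)
panner.set_editor_property("speed_x", 0.05)
panner.set_editor_property("speed_y", 0.0)

# A texture sample driven by the panned UVs (texture path is an assumption).
tex_sample = mel.create_material_expression(
    material, unreal.MaterialExpressionTextureSample, -350, 0)
tex_sample.set_editor_property(
    "texture", unreal.load_asset("/Game/Previs/Textures/T_PanelGlow"))

# Wire the panner into the sample's UVs pin, then feed emissive color.
mel.connect_material_expressions(panner, "", tex_sample, "UVs")
mel.connect_material_property(tex_sample, "RGB",
                              unreal.MaterialProperty.MP_EMISSIVE_COLOR)

mel.recompile_material(material)
```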

Also, post process volumes were useful. Unreal Engine's post process pipeline delivered rendered images that did not require additional work, which greatly helped us save time. If we were to use another tool to post-process the renders created in Unreal Engine, it would have consumed a significant amount of time. The post process features also offer a variety of options for the concept work and artistic control, which played a part in increasing the quality of the final pixels.
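For a concrete picture, here is a minimal sketch, again with illustrative values rather than the project's actual settings, that drops an unbound post process volume into a level and overrides a couple of its settings:

```python
import unreal

# Spawn a post process volume and make it affect the whole level.
ppv = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.PostProcessVolume, unreal.Vector(0.0, 0.0, 0.0))
ppv.set_editor_property("unbound", True)

# Override a couple of settings; the values are placeholders for a look pass.
settings = ppv.get_editor_property("settings")
settings.set_editor_property("override_bloom_intensity", True)
settings.set_editor_property("bloom_intensity", 1.2)
settings.set_editor_property("override_auto_exposure_bias", True)
settings.set_editor_property("auto_exposure_bias", 0.5)
ppv.set_editor_property("settings", settings)
```

Because grading of this kind happens inside the engine, the rendered frames already carry the intended look and need no separate post-processing pass.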

Unreal Engine was used for previs in this project, but what are your thoughts on the value and usages Unreal Engine offers for other stages of production moving forward? 

We believe Unreal Engine has endless potential. Even now, the technology that enables heavy asset data to be processed with little overhead, and lighting to be rendered as images in real time, has reached a level that closely matches real-life photography. When Unreal Engine 5 was first revealed, almost everyone in this industry was stunned.

Based on the fact that students and job seekers are greatly interested in Unreal Engine and there is high demand for UE education, it is just a matter of time before we see a significant increase in projects created with Unreal Engine in Korea.

Currently, Unreal Engine is pioneering next-generation technologies such as previs and virtual production using LED walls. For example, LED studios have enabled us to shoot outdoor scenes free from the limitations of weather, time of day, or location. It is also now possible to shoot spaces that don't physically exist as if they were real. In addition, green screen sets and chroma key compositing for VFX post-processing are slowly being phased out, and actors are able to deliver more engaging performances by interacting with the LED screens.
Image courtesy of Netflix
Real actors performing on set. The first actor on the left plays Bubs, a robot fully created in CG.
In traditional VFX production environments, rendering time has always been a challenge. However, for the Space Sweepers project, Unreal Engine enabled us to check the final look and adjust the lighting in real time. Unreal Engine is the key to integrating compartmentalized production processes into one streamlined pipeline, and holds the potential to change the landscape of the film industry. Previously, concept creation and previs were two separate areas, and the idea of merging the two was beyond our imagination. However, Unreal Engine’s features now intuitively integrate the separate processes to deliver one combined result. As a result, it is much easier to respond effectively to issues that may arise at different stages.

With the boundaries between concept art and previs becoming blurred, more potential issues can be tested out during the previs stage. This thoroughly prepares us for the risks that can arise during production, significantly reducing trial and error to secure a sophisticated production environment. Compared to the previous pipeline, we can suggest a much clearer and intuitive visual goal to the entire production team, minimizing communication costs. Also, the quality of Unreal Engine’s real-time rendering gives us reason to anticipate the technology expanding beyond pre-production into post-production. 

Speed up your pipeline

Get to previs and final pixels faster with real-time rendering tools.