September 6, 2019
Evil Eye Pictures uses latest ray tracing features to create branded visuals for UE4
Initially, the team was working with Google’s real-time engine and pipeline, the Story Development Kit. By the end of 2018, they started experimenting with Unreal Engine, hoping to prove that they could recreate their photoreal visual effects in real time. “Our Associate VFX Supervisor Steve DeLuca used a digital asset we were working on at the time as a test case to build a photoreal environment in UE4,” says Dan Rosen, the company’s Co-Founder and Creative Director. “We took that build up to show the director on a VR headset, and it blew him away.”
The concept behind “emergence”

When Epic approached Evil Eye Pictures about the SIGGRAPH brand visuals project, the team was excited by the challenge. “We saw this as a wide-open opportunity to help brand Unreal using the software to create a visual beyond what you might typically see,” says Rosen. For the ambient introduction, they planned to create an infinitely looping short, rendered entirely in-engine and incorporating the latest UE4 features, including real-time ray tracing.
Finding the right design lead was crucial. “Each and every project for us is a design challenge for both art and science,” says Rosen. “For the art on this one, I knew that I wanted to work with Conor Grebel. He’s an amazing designer and motion artist who not only works in Cinema 4D, he also uses Unreal Engine.” Grebel was quick to accept the job of Lead Designer on the project.
“Based on our initial conversations with Epic Games, there was a lot of talk about integrating ‘motion graphics’ design and appeal into this project,” Grebel says. “It was really serendipitous that Evil Eye reached out to me, as I was in the process of switching careers from motion graphics to game art and design. I was excited about the idea of using my newfound love for Unreal Engine to create art that appealed to other artists of a similar design background.”
Grebel came up with a concept that explores the journey from creation to collapse. The ambient short emergence is just one element of that broader concept, which has the potential to be developed further in the future.
“I wanted to use parametric design methods to create a seemingly infinitely complex form,” he says. “Something that combined man-made structures and organic forms, inspired by the complexity and repetition of nature. There is something so undeniably inorganic about brutalist architecture. It is the summation of centuries of engineering and artistic progress, culminating in an oppressive and minimal masterpiece. Its angles, materials, and shapes are completely inorganic. Taking that aesthetic and reinterpreting it in the context of organic and fractal growth was, we thought, a very interesting design contrast.
“I was constantly asking myself questions during the initial design phase of this project: ‘Is this inspiring design? Will UE4 artists be impressed this was done in engine? Will motion graphics artists be impressed with UE4?’,” he explains. “I was constantly checking my progress with these prompts. I wanted the design alone to be fresh and mind-bending, but I also wanted it to be impressive within the context of the game engine. Ultimately the team was motivated by the idea of triggering a slew of ‘How did they make this?’ reactions.”
Goal achieved. As well as those congratulating the team on the beauty of the piece, many of the comments on the YouTube posting of the video ask simply “How?” The team is happy to divulge the details.
Creating light, shadow, and fractal patterns

Acting as Technical Supervisor on the project, DeLuca explains that the three megastructures seen in emergence consist of eight octants, each with identical animation inversely scaled in the appropriate axial planes to create an object that has perfectly mirrored motion in three dimensions.
Each octant is made up of a series of branches, with each branch containing smaller ornaments, which are themselves composed of individual kit pieces.
“The title emergence speaks to this, as any single piece on its own is dull and fairly simple, but when combined with all the other parts becomes something extraordinary and complex,” he says. “Creating a single octant by hand allowed us to art-direct the animation of a branch, an ornament, or even a single kit piece.”
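The eight-octant mirroring DeLuca describes can be illustrated with a small sketch. This is a hypothetical, engine-agnostic illustration (all function names are our own, not the team's actual Blueprint setup): mirroring one hand-authored octant across every combination of the three axial planes yields eight copies whose motion is perfectly symmetric in three dimensions.

```python
from itertools import product

def octant_scales():
    """All eight sign combinations, one per octant: (±1, ±1, ±1)."""
    return list(product((1, -1), repeat=3))

def mirror_point(point, scale):
    """Mirror an animated position into another octant by inversely
    scaling it in the appropriate axial planes."""
    return tuple(p * s for p, s in zip(point, scale))

def expand_to_megastructure(points):
    """Given the animated points of a single hand-authored octant,
    produce the mirrored copies for all eight octants."""
    return {scale: [mirror_point(p, scale) for p in points]
            for scale in octant_scales()}
```

Because only one octant is ever authored by hand, every art-directed tweak to a branch or ornament automatically propagates to the other seven mirrored copies.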
The very first step in the design process was creating a kitbash set of brutalist architecture parts. Starting in Maya, Grebel created about 25 modular pieces that could snap together in various ways on a predetermined grid size.
Grebel then quickly sketched layouts using the mograph tools in Cinema 4D.
After finding a few layouts the team liked, he imported the individual low-poly pieces into ZBrush to use as a base for sculpting the detailed normals. He then created some custom aged-cement materials in Substance Painter using the high-poly sculpt from ZBrush. The base cell blocks and cluster formations were laid out in an overall structure that was now complete in its stationary form.
Next, it was the turn of Euisung Lee, Cinematographer and Animation Lead on the project. Working in Maya, Lee started adding motion both on the cluster level and at the level of the overall structure.
“Imagine a Christmas tree with moving branches and animated ornaments, which is a simple metaphor to understand the process and also the methodology of importing the animated rig into UE,” explains Lee. “I exported one FBX file that contained only the branch motion joints and another with the cluster motion loop. Once we had the rig set up, it was painless to iterate on animation.”
DeLuca takes up the story. “Once we had a single octant animated in Maya, we brought it into UE4, where it was added to a Blueprint that would assign a geometry shader, mirroring that animation in the remaining seven octants,” he says. It was only at this point that the team could truly see the final animation of each megastructure. Each of the three animations was about five minutes in length from initial formation, through transformation, to collapse.
“We did a rough pass of lighting on all the structures and then began the process of creating camera moves that either traveled around and through the structures for the full five minutes or discovered unique compositions hidden in the complexity,” says DeLuca. “At the end of this process we had over three hours of ‘footage’, and our editor Matt McDonald combed through the dailies and assembled our final five-minute edit. After we’d reviewed it several times, we revisited each shot’s lighting; not only to refine it, but also to create an evolution over the piece from a neutral palette to a hyper-realized sunset full of color, then back again to a simple noir scheme.”
Overcoming challenges

Given the complexity of the geometry and the newness of the preview-release Unreal Engine code the team was using, it’s not surprising that they faced—and overcame—some challenges along the way.
“We were constantly exceeding the boundaries of the possible during the creation process, and Epic was enthusiastically encouraging us to break things so they could fix them,” says Producer Yovel Schwartz.
“The elegance of the engine can be frustratingly complex, allowing for multiple approaches that get you tantalizingly close to your goal before revealing a better way to get there,” he explains. “We were working with the absolute latest features and pushing them to their limits the entire time, which naturally created a constant stream of technical hurdles to overcome. But those challenges are at the heart of what makes the engine so powerful and its final results so impressive. The extensive abilities of the engine just make you want to push it farther.”
Another challenge the team had to overcome was handling the texture sizes required for the extreme close-up shots.
“Originally, I had divided the pieces into groups of four or five that shared a single UV space; however, this proved problematic once we viewed everything in UE4,” explains Grebel. “The pieces get instanced thousands of times in the scene, and although the majority of their screen space is very small, there are constant shifts in scale in the film that put the camera very close to some of the geometry. Eventually I was forced to give each piece of geometry its own UV space and 4K texture. While it was overkill for most shots, it dramatically helped the fidelity of the close-ups.”
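Grebel's move to a dedicated 4K map per piece comes down to texel density. As a rough back-of-envelope sketch (the numbers and function are ours, purely illustrative): when several pieces share one square UV space, each piece gets only a fraction of the map's area, and therefore a fraction of its linear resolution in a close-up.

```python
import math

def effective_resolution(texture_size, pieces_sharing_uv):
    """Approximate per-piece texture resolution when several pieces share
    one square UV space: each piece gets ~1/n of the area, so roughly
    1/sqrt(n) of the linear resolution per axis."""
    return texture_size / math.sqrt(pieces_sharing_uv)
```

Sharing a 4K (4096-texel) map among four pieces leaves each piece roughly 2048 texels per axis, which is soft when a single piece fills a 4K frame; a dedicated 4K map per piece keeps full density in those extreme close-ups.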
Lee explains how the team had to learn to design their camera work in Unreal Engine’s Sequencer, rather than doing it in Maya as they were accustomed to, since the final masked and mirrored rig could only be seen in the engine. “We got used to it quickly, and in most cases it turned out to be a better workflow for the shot exploration and composition, thanks to the fact that you see the final lighting and effects,” he says.
“The migration of our animated assets from Maya to UE4 proved to be the trickiest bit due to our desire to use ray-traced global illumination (GI) and shadows in the engine,” says DeLuca. “Since nothing uses pre-baked lighting, everything required fairly high samples to produce quality results at our 4K deliverable resolution.”
To reduce the scene complexity enough to enable them to use a higher number of GI samples, the team had to import the animated ornaments as static meshes rather than skeletal meshes, and manually attach them to the joint sockets of the base skeletal mesh for the megastructure.
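The static-mesh workaround described above can be sketched in the abstract: instead of skinning each ornament, its world position is derived every frame from the animated joint it is attached to, plus a fixed local offset—the way a static mesh attached to a skeletal-mesh socket follows the rig in UE4. This is a minimal, engine-agnostic illustration with hypothetical names (real sockets also carry rotation and scale, omitted here for brevity):

```python
def compose(parent, local):
    """Compose two translation-only transforms: a child offset expressed
    in its parent's space. (Rotation/scale omitted for brevity.)"""
    return tuple(p + l for p, l in zip(parent, local))

def ornament_world_positions(joint_positions, attachments):
    """joint_positions: {joint_name: (x, y, z)} for the current frame.
    attachments: [(ornament_id, joint_name, local_offset), ...].
    Returns each static ornament's world position, following its joint
    each frame as if attached to that joint's socket."""
    return {oid: compose(joint_positions[joint], offset)
            for oid, joint, offset in attachments}
```

Because the ornaments are plain static meshes rather than skeletal meshes, the renderer no longer pays skinning costs for thousands of instances, freeing the budget for higher ray-traced GI sample counts.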
In the end, though, the effort was worth it. “I think the most essential feature that tied this film together was the ray-traced global illumination,” says Grebel. “The moment we enabled ray tracing, the scene just came alive. It was that missing piece that turned our designs from previs to art piece. The light bounces and accurate shadows were such a dramatic improvement that the focus of the project shifted towards highlighting the ray tracing feature.”
Designing a new Unreal Engine logo animation

Another element of the package the team was tasked with delivering was a new Unreal Engine logo animation, to be used for video intros. On this project, Rosen provided the creative direction, while Grebel once again led the design process, DeLuca supervised the final look, and Lee handled camera and cinematography.
The team considered this another unique opportunity to help brand Unreal Engine, and to show off the ray tracing feature—the piece is rendered in real time in-engine. Grebel created a number of concepts, from which the team selected one that featured long light bars, or “light lines,” as Grebel termed them. Working in Substance and Cinema 4D, he created shaders that juxtaposed fully reflective and fully rough surfaces, and animated between the two to create the visual treatment, which he called “Defrost”. He was then able to migrate the assets to Unreal Engine.
“We worked together to strike a balance of design that felt original and authentically Unreal,” says Rosen. “Since it was running in real time in UE4, we could make tweaks on all of it and see results instantaneously.
“What’s so dynamic about Unreal is that we could break off a shot or two from the main logo scene to create separate deliverables,” he continues. “I was able to change camera, lighting, and color on a couple of shots to provide the Unreal logo as a speaker backdrop for the SIGGRAPH show. Then Steve took another angle of it to provide for a print on the Epic booth. The turnaround time was incredibly fast.”
Looking ahead for real-time technology

Asked what excites them about the future of real-time technology in their industry, the team has plenty to say.
Evil Eye Pictures’ Co-Founder and Editor Matt McDonald shares his thoughts about the technology’s potential role in a new world where traditionally separate media and entertainment verticals are starting to converge, and where interactive experiences are customizable for the individual.
“These types of hybrid spaces could be narrative—creating experiences that overlap between games and film—or musical, between musician, composer, and even audience,” he says. “Customization can happen directly through the interaction with objects or simply through using one's movement in a space. This overlap is now possible across countless combinations of visual, audio, and even tactile media.
“Relatedly, I'm excited that real-time technology is blurring the lines of collaboration both as a producer and consumer, and even what those previously distinct roles mean. While I have traditionally created more fixed and finite experiences for others, real-time technology affords the capability to empower an audience or player to be the creator of their own experiences and even save their own versions to share with others or to re-experience later.”
Rosen also looks forward to continuing to use real-time technology across the convergence of movies, games, VR, and AR. “Whether VR lives or dies, it was exciting to stand inside our creations and interact with them,” he says. “The software has caught up, but now we want the hardware that can match it in the same seamless way. Using Unreal Engine for any number of screens, projections, and visuals is an exciting way to share our images with all types of audiences.”
Schwartz offers another angle on the topic. “I think that we’re on the cusp of a really exciting convergence between traditional narratives, games, and other interactive experiences,” he says. “When you can design, light, render, composite, and edit in a single technological space, the creative freedom is almost overwhelming. From virtual production to motion graphics, I can’t imagine a corner of the image-making industry that won’t be affected.”
Grebel lends his perspective from his background in motion graphics and design. “The ability to achieve ray-traced lighting, shadows, and reflections at 60+ fps is something I am majorly excited about,” he enthuses. “Once Octane and Redshift took hold of the 3D design community, making ray-traced rendering an industry standard, it changed the game. High-quality design became more easily accessible, and a wave of new ideas flooded the internet. Now, a similar power is awakening inside Unreal Engine. Combining the interactive capabilities of UE4 with the visual beauty of ray tracing will be another game changer for sure.”
For DeLuca, it’s a question of real-time rendering freeing artists across all graphics industries to be more creative and less burdened by technical hurdles. “After witnessing overnight renders for a few frames of a hard-surface object at near-photoreal quality, being able to relight an entirely photoreal ray-traced scene while it plays back at 30 fps with full lensing effects is staggering!” he exclaims.
Cinematographer Lee agrees. “Real-time graphics overall has made incredible strides lately and UE is clearly at the forefront—I’m very excited that such technology is available to everyone. I have no doubt that it is encouraging more collaboration and sharing across the industry and propelling progress at a faster rate than it would be in an alternate reality where everything is proprietary and costly,” he says. “On a personal note, I love that the real-time workflow gives you a whole new perspective on your own creative process and possibilities. It feels intimidating at times, but it’s a good jolt of stimuli I’ve been missing after a long time of doing the same things over and over.”
Schwartz has the final word. “As a producer, for me everything is in service of artistic collaboration,” he says. “How can we tell the best story? Create an immersive experience? Whatever the art form, there will always be toolsets that help you get there. When you add real-time technology as a resource, your ability to iterate creatively throughout the entire production pipeline is staggering—it’s a revelation.
“If you’re a writer, and halfway through your novel you realize that it would be better if there was some change back in chapter 2, your decision to rework the idea is basically free of consequences. That’s never really been true for necessarily collaborative art forms like film, games, or now VR / AR experiences. There was a time when asking to see what a CG scene looks like at night, for example, instead of the daytime look you’d been developing for weeks or months, could create an existential crisis. Now we can explore those creative changes so comparatively fast that we’re all going to need a lot less therapy!”
Want to find out what real-time technology can bring to your industry? Download Unreal Engine today.