HALON evolves previs with Unreal Engine
The concept of previs isn’t exactly new. Walt Disney used an early version of the technique, leveraging simple storyboards and Leica reels to streamline animation production. From the mid ’90s to the early 2000s, VFX pioneers like George Lucas took a page from Disney’s playbook and expanded the process, applying it to live-action film and visual effects production. Using SGI workstations and off-the-shelf PC and Mac tools, Lucas and his team of CG artists pushed the technology of the time as far as they could. As a result, previs in its modern incarnation was born.
Today, the team at HALON Entertainment is following in Lucas’ footsteps by pushing technology forward once again. By integrating Unreal Engine into its previsualization pipeline, HALON hopes to usher in a smarter, more iterative, higher-fidelity form of previs that changes the way films are made. Traditionally, 3D applications like Maya or 3ds Max are used to get the job done, but the requirements and creative expectations of the film production landscape are changing. On War for the Planet of the Apes, previs and postvis supervisor AJ Briones needed a next-generation tool capable of handling large datasets and complex imagery in a more intuitive manner.
We connected with AJ Briones and lead artist Casey Pyke to learn more about Unreal Engine’s impact on the previs and postvis process for War for the Planet of the Apes.
Q: How did using Unreal alter the way you approached previsualization?
AJ - While Unreal Engine afforded us greater fidelity and quality in the look of our previs, the goal for us was to achieve those benefits without fundamentally altering our approach to previs. We have a lean previs team consisting of generalists who are required to work fast and stay agile, and we couldn’t afford to add any engine programmers, FX artists, render TDs, or any of the specialist luxuries that come with bigger teams. In fact, it is a huge testament to the power of Unreal Engine that we were able to integrate it into our previs/postvis process without disruption.
Casey - Unreal gave us the flexibility to render more complex and dense environments than in the past, communicating the director's vision more accurately.
Q: Where did you do your scene layout and camera work?
AJ - We had a mixed environment of Maya and MotionBuilder. Maya was used for generating assets, constructing and laying out sets, and converting any assets delivered to us by Weta Digital and the art department. Maya was also used for our keyframe animation. We used MotionBuilder whenever we had motion capture to integrate, using it to edit and add layered animation before sending everything to Maya for layout. Everything was then exported into the engine for lighting and effects.
Q: What were some of your challenges getting data into Unreal?
AJ - Like I mentioned before, we have small teams of generalists at HALON, and we are designed to work quickly and wear a lot of hats. The biggest challenges we had were at the beginning of the show, trying to set up a pipeline that would work well and allow us to reap the benefits of the engine without slowing us down. We had a big show with a sustained run, so we needed a pipeline without a single point of failure, where every artist on the team could take a shot from start to finish without much help. None of us had a great deal of experience with the engine (last time I worked in Unreal was in 2007), so it was a lot of trial and error. Once we figured out how to get something in the engine manually, we wrote tools around the process to automate it. We are really grateful to Matt Reeves (Director), Ryan Stafford (Producer) and FOX for giving us ample setup time and believing that we could pull it all together.
Casey - We started our project by developing tools to export multiple skeletal mesh animations and cameras with the click of a button. As the project progressed, we had to work out how to bake cloth simulations and import them into Unreal, and we had to develop a way to convert rigid body simulations into one-off SkelMesh animations. All of that took rounds of research and testing as we went, as we had limited knowledge of Unreal.
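At its core, a one-click exporter like the one Casey describes is a batch job builder: for each shot, gather every skeletal mesh and camera, assign each a predictable FBX path, and hand the list to the DCC's FBX exporter so the Unreal side can import automatically. A hypothetical, tool-agnostic sketch of that job-building step (the names, paths, and `ExportJob` helper are our illustration, not HALON's actual Maya tooling):

```python
from dataclasses import dataclass
from pathlib import PurePosixPath

@dataclass
class ExportJob:
    node: str       # DCC node to export (skeletal mesh root or camera)
    kind: str       # "skelmesh" or "camera"
    fbx_path: str   # where the baked FBX lands for the Unreal import step

def build_export_jobs(shot, skel_meshes, cameras, root="/exports"):
    """One baked-FBX job per skeletal mesh and camera in a shot,
    with predictable paths so the engine-side import can be automated."""
    shot_dir = PurePosixPath(root) / shot
    jobs = [ExportJob(m, "skelmesh", str(shot_dir / f"{shot}_{m}.fbx"))
            for m in skel_meshes]
    jobs += [ExportJob(c, "camera", str(shot_dir / f"{shot}_{c}.fbx"))
             for c in cameras]
    return jobs

# Example: one shot, two animated characters, one shot camera.
jobs = build_export_jobs("apes_0120", ["caesar", "maurice"], ["shotCam"])
```

The predictable path convention is what makes the pipeline "single point of failure"-free in practice: any artist can export a shot and the import side knows exactly where to look.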
Q: Did you attempt to create a master scene with multiple cameras capturing footage or did you use a more traditional sequence/shot type workflow?
AJ - We used master scenes when they made the most sense. A good example of this was during the opening sequence, Battle on the Hill, with the human soldiers ambushing the apes in their trench. Once we completed an animation/mocap hybrid pass of the soldiers huddled at the hill, I sat with Matt Reeves and interactively placed all of the cameras with him. He was very involved in that process, working not only on camera placement but also with lens choices. That particular set piece was a challenge, since it was positioned in a forest at a very steep incline, so we had to be very careful with how we staged the action and blocked the cameras.
This was also an approach we used on bigger action scenes, so we could take advantage of our builds by generating multiple cameras with them, which in turn helped with continuity.
Casey - We used a sequence/shot workflow, but we would often start with a master scene and break it into shots. All of that was done in Maya.
Q: Compared to using a DCC for previs, how did Unreal perform with the number of characters you had in a scene?
AJ - There were a couple of scenes on this film where Unreal really helped us with crowds. The Elder Ape Sacrificed sequence and Avalanche are good examples of this. We worked the crowd scenes much as one would photographically with a limited number of extras: by building crowds in sections and duplicating them in the engine.
Casey - Unreal managed as many characters as we could throw at it. The only performance hits came when we would export 50 soldiers and then duplicate them in Unreal to build an army of a couple thousand soldiers.
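The duplication trick AJ and Casey describe, animating one section of soldiers and tiling copies of it in the engine, amounts to generating an offset transform per duplicate. A minimal pure-Python sketch of that layout math (the section size, spacing, and function name are our illustration, not HALON's tooling):

```python
def tile_crowd_sections(section_size, total_needed, spacing, row_width):
    """Lay out duplicated crowd sections on a grid.

    Returns (copy_index, (x_offset, y_offset)) pairs -- one per duplicate
    of the animated source section -- until total_needed characters are
    covered. In-engine, each offset would translate one duplicated group.
    """
    copies = -(-total_needed // section_size)   # ceiling division
    offsets = []
    for i in range(copies):
        row, col = divmod(i, row_width)
        offsets.append((i, (col * spacing, row * spacing)))
    return offsets

# A 50-soldier animated section duplicated to cover a 2,000-soldier army:
placements = tile_crowd_sections(section_size=50, total_needed=2000,
                                 spacing=1500.0, row_width=8)
```

Because every copy shares the source section's baked animation, the engine only pays for extra draw and skinning work, not extra simulation, which is why the approach scales to armies.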
Q: With extra fidelity, lighting, and effects in-engine, how did the director attempt to solve potential lighting and VFX issues through previs?
AJ - In the instances where I knew the GPS location of the sets we were shooting in and the intended shoot days, we did studies using The Photographer’s Ephemeris (TPE), setting up lighting as accurately as possible to reflect where the sun would be at that date and time, and how any given scene could be lit on that day.
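TPE-style sun studies like the ones AJ describes boil down to a standard solar-position calculation: given a set's latitude and longitude plus a UTC shoot date and time, compute the sun's elevation and azimuth, then aim the scene's directional light accordingly. A minimal Python sketch using a common low-precision approximation (it ignores the equation of time and atmospheric refraction; TPE's internals and HALON's exact setup are not ours to speak for):

```python
import math
from datetime import datetime, timezone

def sun_position(lat_deg, lon_deg, when_utc):
    """Approximate solar elevation and azimuth (degrees) for a GPS
    location and UTC datetime. Low precision, but plenty for blocking
    a previs key light."""
    n = when_utc.timetuple().tm_yday                      # day of year
    # Approximate solar declination (degrees)
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (n + 10)))
    # Approximate local solar time: shift clock by longitude (15 deg/hour)
    solar_hours = when_utc.hour + when_utc.minute / 60.0 + lon_deg / 15.0
    hour_angle = 15.0 * (solar_hours - 12.0)              # degrees

    lat, dec, ha = map(math.radians, (lat_deg, decl, hour_angle))
    sin_el = (math.sin(lat) * math.sin(dec)
              + math.cos(lat) * math.cos(dec) * math.cos(ha))
    elevation = math.degrees(math.asin(sin_el))
    # Azimuth measured clockwise from north
    azimuth = math.degrees(math.atan2(
        math.sin(ha),
        math.cos(ha) * math.sin(lat) - math.tan(dec) * math.cos(lat)
    )) + 180.0
    return elevation, azimuth

# Sanity check: equator, near the March equinox, noon vs. midnight UTC.
el_noon, _ = sun_position(0.0, 0.0,
                          datetime(2017, 3, 21, 12, 0, tzinfo=timezone.utc))
el_mid, _ = sun_position(0.0, 0.0,
                         datetime(2017, 3, 21, 0, 0, tzinfo=timezone.utc))
```

In-engine, the returned elevation and azimuth would simply drive the pitch and yaw of the sun's directional light for the planned shoot day.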
Additionally, on the practical effects side, once we had our scenes with explosions approved, the practical effects teams used the previs as a guide for the placement, and at times, the timing of the explosions. And, while it was not our original intention, the final explosions did end up looking a lot like the ones we put together in previs.
Q: Did you use Unreal for interactive blocking or lighting sessions with the director? What was your revision process like?
AJ - Since time is at a premium in previs and we were all about getting volume and working iteratively, most of our first pass blocking shots were done in grayscale via Maya playblasts. These shots were quickly keyframed, sometimes using only stills. This was so we could get material in front of the director as quickly as possible for feedback.
Once we got to the point where I felt we were starting to home in on the director’s intention, we would begin to introduce motion capture, keyframe animation refinements, and Unreal Engine renders into the process. This enabled us to separate the lighting/textures/effects/performance feedback from the composition and intention of the scene.
Q: What was the greatest benefit of using Unreal for previs and postvis?
AJ - Unreal gave our previs and postvis a render quality that was much more cinematic than anything we could have achieved with our traditional process. This enabled us to better showcase the action in our shots, and in turn gave the director material that he could use to communicate his ideas with all of the various departments and the studio.
Thanks to AJ and Casey for taking the time to answer our questions. To learn even more about the previs and postvis process for War for the Planet of the Apes, check out fxguide’s coverage from San Diego Comic-Con.