January 18, 2019
Take a thrilling ride through the archaic world of FRAGMENT, a student short film
The concept for FRAGMENT began last year during our final semester at NAD, School of Digital Arts, Animation, and Design in Montreal. In order to graduate from our program, we were required to submit a final project. We had just four months, from January until May, for the entire creation process.
The Creative Process
At the start of the project we didn’t know which direction we wanted to take, but we collectively agreed to keep the same team from our previous projects. Our team had a great time working together; everyone was always open to critiquing one another, and the energy of the team stimulated creativity, motivation, and passion. We wanted to keep this synergy for our next milestone.
We wanted to challenge ourselves with this project and showcase our individual talents. After all, our future employers would be watching. With just a four-month window for production and a small team, we had to quickly make a choice between gameplay or virtual production for our project.
With our passion for cinema and our professional ambitions ahead of us, we decided to create a short film in Unreal Engine. With Unreal Engine’s robust interface and powerful performance tools, we knew we would be able to create something incredible to fit our vision.
We spent two weeks researching references to establish the mood for our project. We gathered imagery, movies, and concept references. The short film IFCC by Sava Zivkovic was one of our inspirations. We also took a dive into camera movements and rhythm, drawing inspiration from David Fincher and Wes Anderson movies. After two weeks of gathering theme concepts, we created moodboards to set the path for our production.
The monolithic shapes from Maze Runner and the paintings of Jean-Pierre Ugarte were inspirations for the overall mood of our project. We were also influenced by the work of Art Director Raphael Lacoste and Concept Artist Martin Deschambault. We used their art as a guideline to achieve the theme we wanted to have for our short film: archaic, misty, and timeless.
For the tone of the project we wanted a restricted color palette and a duality with greys and greens. The first part of the film would use only greys to portray big, strong shapes and heavily-polluted mist. The second part of the film would progress into using more natural green elements.
In order to travel within the world, we wanted to create a story with different levels of complexity while portraying a simple chase sequence. The main concept for the narrative started with what we called the “pivot shot.” This “pivot shot” appears at both the starting point of the story and in the middle of the film.
The idea behind this sequence was to bring together the entire storyline while leaving the viewer with an open-ended interpretation of the film. We wanted the viewer to believe this film to be just the tip of the iceberg, perhaps a piece of a larger story. We also wanted to blur the lines of the characters’ roles, leaving it up to the viewer to determine which character is the protagonist and which character is the antagonist.
To lay out the ideas for the sequence shot, we developed a storyboard. From there we did a first editing pass to test the rhythm of our story and our world. This highlighted which aspects we needed to cut from the story and what additions we needed to make the plot understandable and interesting to watch.
To create our sequence in real time, we chose Unreal Engine’s Sequencer. We wanted to stick to game-industry standards, maintain an adequate framerate, and eliminate post-production. Unreal Engine’s technical capabilities let us keep those standards while creating what we wanted.
In Unreal Engine, it is also faster to test shapes and contrasts with primitives and directional light angles. We could use some of these shots for paintovers and further develop the ones that worked.
We wanted to keep the rhythm of the story focused on a chase involving three characters: a drone, a beast pack, and a man plugged into a strange device. At a deeper level, we wanted an omnipresent Firm and the city itself to be important characters of our short, too. We made interconnections between the characters and separated the short film into five clusters of time.
We separated the short film into multiple sequences and levels so that each member of the team could work on their own part. This workflow enabled everyone to work on small segments and test different parts before combining all of the sequences together.
The man is an ex-employee of the Firm, plugged into a man-made machine in his bathtub, trying to upload his consciousness out of his body. We scattered details of his lore throughout the bathroom shots. We wanted to remain vague about what he is doing and why, where he is going, and whether he succeeds.
To create his skin, we used the skin shader from Unreal Engine’s learn tab. We struggled with this aspect of the project because we did not have any animators on the team. Even though this was challenging, it was a lot of fun.
We then used Unreal physics to move the wires plugged into his head. To help animate both the man and the drone, our friend Charles-Etienne Gouin joined the team the last two weeks of the project.
The idea for the drone came late in production from our mentor, Sebastien. He suggested we create a simple disc that could highlight the rhythm of the chase with its movements. From this idea, our friend Louis designed and created the drone we used in the film.
The drone is a mix of the Firm’s omnipresence and the overall rhythm of the film. We wanted the drone to be a powerful, hidden mastermind. We used “Voodoo in My Blood” from Massive Attack as our main reference point for the behavior of our drone. This is our second “pivot shot” with the drone:
It is at this point in the film that the viewer finally understands that the drone is controlling the pack and is seeking the man, who activated the drone when he booted his hacked device.
The flexibility of Sequencer allowed us to animate the drone entirely in Unreal Engine with no rig. It consists of a Blueprint with lights, materials, and post-process effects for the shockwave.
The beasts are a big part of the story in our short film. We wanted them to look feral and deadly, and to feel like an omnipresent threat to everyone. They are also the Firm’s eyes in the streets of this city. We wanted to leave the questions for the viewer to draw their own conclusions: Are these beasts created by the Firm or did they enslave them? Do they come from another reality? Are they holographic?
The deer was a surprising, unanticipated idea that emerged during the first screening, but it became our greatest delight. Glitching with the Firm’s logo, it is the clue that the humans who once lived here destroyed every living species and then tried to recreate nature through the progress of technology.
To create these creatures, we used the game-ready pipeline, then scattered planes in Houdini, and used Unreal Engine’s hair shader to achieve the look we wanted. It was a great challenge and we learned a lot from this experience.
We studied a lot of environment showcases in Unreal Engine before deciding what direction we wanted to go in. We wanted to depict a timeless and misty feel, and a cinematic production look without any post-production.
We played a lot with perspective, hiding the directional light behind big elements to create shadows, and oversizing meshes to exaggerate the perspective of the shots. We also used many of Sequencer’s features to spawn entire parts of the city and to animate the exponential height fog and the directional light in the timeline, and relied on post-process materials and LUT settings to achieve the visual quality we wanted.
Sebastien Primeau and Martin Deschambault helped us a lot with the visual quality and coherence of the short film. They spotted every mistake we made and helped us correct them. From week 10 to week 15, we held a weekly screening for a small audience to gather feedback from fresh spectators, and we readjusted each shot based on these evaluations.
We used only modular assets and materials to stay within game-industry standards and to stay aligned with our desire to work in the games industry. We worked in Substance Designer to make generic materials and easily created variations with the Substance plugin. We used a lot of decals to add noise and grain to our assets.
We also used a world-position material, which allowed us to create really big assets and shapes to build our city with a good texel ratio, no matter the size of the assets. It also allowed us to tweak materials with exposed parameters in Sequencer to ensure the quality of every asset in every shot.
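To illustrate the idea behind world-position mapping (in-engine this is built from material nodes; the function and parameter names below are our own illustrative sketch, not Unreal’s API): texture coordinates are derived from the surface’s world position instead of the mesh’s own UV layout, so texel density stays constant no matter how large the asset is.

```cpp
// Hypothetical sketch of world-aligned ("world-position") UVs.
struct UV { float u, v; };

// tileSize: world-space distance (e.g. in centimeters) covered by one
// texture repeat. A point's UV depends only on where it sits in the world,
// so scaling the mesh never stretches the texture.
UV WorldAlignedUV(float worldX, float worldY, float tileSize) {
    return { worldX / tileSize, worldY / tileSize };
}
```

Because the mapping ignores the mesh entirely, two walls of wildly different sizes sharing the same material tile at exactly the same density.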
The vegetation textures were a mix of Quixel Megascans and Unreal Engine Marketplace photogrammetry packs. We then used SpeedTree to create the assets and Substance Designer to combine and refine maps. Unreal shaders, splines, and foliage types were really useful for quickly creating large surfaces of realistic clutter without performance drops. We used custom noises in the wind (with exposed parameters for Sequencer) to create a special atmosphere for our environment.
It was a great challenge, but we learned so much crafting this environment. We were able to take many shortcuts to achieve what we wanted (faking perspective, exposing parameters to change them shot by shot, moving the fog and directional light) while keeping a game-ready environment.
We packaged the short film scene into a build so spectators could move through our shots with the first-person template and shift the directional light. Our first viewing was scheduled directly after two pre-rendered VFX short films. It was a great experience and an incentive for us to give our best, to get near the quality of the two other VFX teams, and to give spectators the freedom to move through our shots in a build with a solid framerate.
The largest production challenge was to collectively learn the ins and outs of the Sequencer, our main production tool. From cameras and timings to specific control of visual effects, we were ready to dive into all of Sequencer’s features.
While the real-time approach made editing shots extremely fast, it also posed a major yet interesting technical problem: time-based visual effects were difficult to control and preview properly, because their loops didn’t start at the same point on each viewing. For example, an oscillating effect spanning five seconds could start at the first or the fourth second of its loop depending on when the user pressed “play,” introducing an annoying imprecision.
For shaders and effects relying on the “Time” node but requiring precision, we used an alternative: the “Time” node was completely replaced with a keyable variable, linked directly to Sequencer through Material Parameter Collections. This approach let us create multiple variables controlling each part of a shader: from time to color to how “wiggly” the shockwave bubble moved over time. The versatility was incredible: live, controlled visuals editable in real time by artists, including post-process materials.
Easily keyable properties!
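A minimal standalone sketch of why this works (not engine code; the names here are illustrative): the effect’s phase is computed from a time value that the sequence drives explicitly, instead of the engine’s free-running global “Time,” so every playback starts at the same phase.

```cpp
#include <cmath>

const float kPi = 3.14159265358979f;

// A five-second oscillation loop like the one described above.
float Oscillation(float timeSeconds) {
    const float periodSeconds = 5.0f;
    return std::sin(timeSeconds * 2.0f * kPi / periodSeconds);
}

// With global time, timeSeconds depends on when "play" was pressed, so the
// loop's phase differs between viewings. With a keyed parameter (in-engine,
// a scalar in a Material Parameter Collection animated by Sequencer), the
// same sequence frame always feeds the same value, so playback is repeatable.
float EvaluateKeyedFrame(int frame, float fps) {
    float keyedTime = frame / fps; // deterministic per frame
    return Oscillation(keyedTime);
}
```

Scrubbing the timeline then previews the effect exactly as it will render, since frame N always produces the same value.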
The technical feature that required the most development time was the shockwave/glitch effect: it had to have impact, and we wanted it to be different from hologram-like effects. It also had to be controllable, as the narrative needed it to affect only the natural assets, like the elk, the clutter, and the monolithic tree. So we created the “Glitch” Material Function, used for the post-process material, the shockwave bubble, and the screens in the bathroom: a simple UV distortion, but a versatile one!
At the core of the shockwave/glitch effect are moving horizontal bands that deform the UVs of the target entity. Applied to “Masked” materials, it distorts their silhouettes; applied to the screen-space UVs in the post-process material, it distorts the whole image in a glitch-like pattern, without being restricted by geometry or material type, giving the result below:
Commenting and grouping nodes: Just do it.
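The band-based distortion can be sketched as a small pure function (the band count, the hash standing in for a noise texture, and the parameter names are our illustrative assumptions, not the film’s actual material graph):

```cpp
#include <cmath>

struct Float2 { float u, v; };

// Cuts the V axis into `bands` horizontal strips and shifts each strip
// sideways by a pseudo-random amount that changes over time. Applied to
// screen-space UVs it distorts the whole frame; applied to a "Masked"
// material it distorts the silhouette.
Float2 GlitchUV(Float2 uv, float time, int bands, float strength) {
    // Which horizontal band this texel falls in.
    int band = static_cast<int>(std::floor(uv.v * bands));
    // Cheap hash of (band, time step) standing in for a noise texture lookup.
    float n = std::sin(band * 12.9898f + std::floor(time * 10.0f) * 78.233f) * 43758.5453f;
    float offset = (n - std::floor(n) - 0.5f) * strength; // in [-strength/2, strength/2)
    return { uv.u + offset, uv.v };
}
```

Exposing `strength` as a keyable parameter is what lets the effect ramp up with the shockwave and drop back to zero afterward.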
The biggest challenge we faced was in the bathroom scenes: the screens displayed short video files we had made (multiple at once!). Because the videos were decoded separately from Unreal’s own playback, it became increasingly hard to make the frame rates line up exactly as we needed, which led to a lot of manual adjustments and, finally, to converting the videos into image sequences (managed by Unreal) to achieve reasonably accurate timing.
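A sketch of why image sequences fixed the timing: instead of a video decoder advancing on its own clock, each point on the timeline maps deterministically to exactly one image of the sequence (function and parameter names here are illustrative, not Unreal’s API):

```cpp
#include <cmath>

// Maps a timeline position to one frame of an image sequence.
int FrameIndexAt(double sequencerTimeSeconds, double sequenceFps, int frameCount) {
    int index = static_cast<int>(std::floor(sequencerTimeSeconds * sequenceFps));
    // Clamp so scrubbing before the start or past the end still shows a valid frame.
    if (index < 0) index = 0;
    if (index >= frameCount) index = frameCount - 1;
    return index;
}
```

With this mapping, scrubbing, pausing, or replaying the sequence always shows the same image at the same time, which is exactly what independent video decoding could not guarantee.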
Thank you for reading! We hope FRAGMENT inspired you to create your own unique project in UE4. You can watch FRAGMENT on Vimeo. For more information, check out Adrien Brunella’s Artstation.
Looking to learn real-time rendering fundamentals or Blueprint essential concepts? Check out our free, on-demand video tutorials on Unreal Online Learning.
Project directed by Sebastien Primeau
Sandy Chow (environment modeling/texturing)
Pierre-Alexandre Côté (lighting/environment modeling/texturing)
Adrien Paguet-Brunella (photography direction/level art/environment modeling)
Pierre-Alexandre Pascale (technical art/fx/shaders/animation)
Stéphan Provost (sound fx/music)
Alexandre Turcotte Gervais (concept art/environment modeling/texturing)
Isabelle Verdon (character modeling/texturing/animation)
Additional help:
Louis-Alex Boismenu (drone design)
Jimmy Di Nezza (creature animation)
Charles-Etienne Gouin (man/drone animation)
Pierre-Luc Jacques (fx/particles)
Josianne St-Pierre (fx/particles)
David Gagne (Houdini)