September 21, 2016

Illusion Ray Studio Offers Insight Into Creating 3D Movies, VR in UE4

By Lucas Smaga and Dominik Sojka

Hi, my name is Lucas Smaga and I am the founder of the Polish studio Illusion Ray. I have previously worked for companies such as Platige Image and Fuero Games. Together with Dominik Sojka, our Art Director, I am more than happy to share our knowledge about creating both 3D movies and Virtual Reality experiences in Unreal Engine 4.

Dominik and I have been working in the gaming and film industries for many years now. Our VR adventure started with the Dino Safari project, designed for the Oculus Rift DK1 in UDK. With the release of Unreal Engine 4, we expanded this movie with the new engine in mind, thus beginning our adventure with VR movies. So far we have made five VR movies (Dino Safari, Motoride, Afterlife, The Colossus and Solar System) as well as a stereoscopic movie entitled Pirates 3D. We are also creating a new game, inspired by the atmosphere of P.T. (Silent Hills), using photogrammetry.



Each project is a little different, but in most cases the pipeline is more or less as shown below. Note: this timeline can be applied not only to a whole movie, but to each scene as well. This is how we completed the Solar System project, which entered Polish 7Dmax cinemas this summer; there, individual scenes/planets were treated separately.

Script and workflow

The first thing we do is choose the genre we want to enter and the atmosphere we want to create. Then comes the specific content idea and the full script, which includes a breakdown of scenes and the timing for each scene. Because none of our movies are longer than 10 minutes, we must plan all scene timing sparingly. For each project, we create a Pinterest wall to collect inspirations and initial ideas. For the organization of work, each movie has its own wall on Trello, where we define a plan of action and the associated deadlines. It is advisable to set deadlines, even if they are somewhat arbitrary; otherwise, time runs out and the work does not get done. Sticking to schedule milestones makes it possible to bring a project to its conclusion and to constantly monitor progress. Obvious, right? In fact, it does not matter whether we are making a movie or a game: deadlines are necessary.


The scene blockout is made in 3D software, after which everything is imported into Unreal Engine. For mesh collisions, we enable the "Use Complex Collision As Simple" option. Then, we check whether the sense of size and proportion feels correct in Virtual Reality. Tip: you usually want to make corridors and passageways slightly larger than they actually are.


It is time for the most important element of the pipeline: the animatic of the movie. At this stage, we create rough camera movement and basic character and object animations, and we adjust the blockout as well. In most cases, the camera movement is done outside of Unreal Engine, because we use camera animation that follows a spline (the "Camera Rig Rail" was first introduced in Sequencer, but we have been using Matinee). At this stage, the timing of the camera is what matters most to us.

After preparation, we move the animatic to UE4 along with the camera and test it in VR. A very important element here is "the feel of the camera", which includes, for example, the right speed of movement and whether the viewer feels nausea or dizziness. At this stage, there is a lot of going back and forth. We check things, then improve them, then go back again, check again and improve them once more... and so on.

It is worth remembering that the animatic is supposed to show where the scenes will need more detail, where the camera stops for a longer while, and where it shows something at a closer distance. Ultimately, this has an effect on the models: the animatic lets us know which models should be completed in a more detailed way and which can be a little less detailed.


Main models and basic lights

This is where the real fun begins. At this stage, we start modeling basic 3D objects. In short, the entire blockout is converted into walls, interiors, buildings, items and other modular 3D models.


Next, everything is assembled in Unreal Engine with basic lighting and parent shaders. Characters and animations are created in parallel, based on the animatic and on the script discussions and guidelines.



For optimization, we mainly use baked lights. For our purposes, we think it is best to use Stationary lights with "Dynamic Shadows" disabled (of course, only when we do not need them). This gives better specular highlights and reflections than Static lighting. Note: Stationary lights are limited to four overlapping within one area (because of the four available RGBA shadow-map channels), so you must be clever about placing them. We also frequently use an additional static light with a very subtle intensity; it is perfect for simulating GI, with the roughness option set to 1 (this way it does not create specular highlights on objects). In addition, almost every light should use an IES light profile; this gives a better, more realistic falloff. We have also occasionally used a great method from Koola: a spotlight illuminating a white plane, then baking. All our previous movies were made before engine version 4.11, when there were no portal lights yet.

Atmosphere and finalization

After preparing the basic environment, we add props. We create them based on the camera movement, so that each video frame has the appropriate composition and weight. This is followed by a general post-process, which contains basic effects and a LUT. For individual scenes/rooms, we add further post-processes that gently change and improve the atmosphere. At this stage, we also create particles for individual scenes.


Matinee, sounds, camera

This is the very important final part of the movie-making process. At this stage, we animate all the objects (doors, windows, particles, triggering skeletal mesh animations, changing lights, post-process animation, etc.). A second camera is attached to the main imported camera; it carries additional micro-movements, such as camera shake. It is also at this stage that we make the final tweaks to our cameras, without any major changes to the timing of successive scenes and sequences.

Sounds are added at the end of this stage. Some of them are launched by triggers (via Blueprints) or through Matinee (usually for ambient sounds), while others are hooked up to a skeletal mesh in the animations (Notifies).


We've been bringing this interactivity together in Matinee for years, and you can certainly accomplish all this using the new Sequencer cinematic editor.


Our target hardware is the GTX 970. We cannot permit even a one-second drop below 90 FPS, as it would cause ghosting and latency, which are unpleasant for the viewer (the effect is noticeable even at 89 FPS).
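The arithmetic behind that target is worth keeping in mind: at 90 FPS, all CPU and GPU work has to fit into roughly 11 ms per frame. A trivial sketch of the budget (plain Python, just the math):

```python
# Per-frame time budget for a given frame-rate target (plain arithmetic).
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available per frame at the given frame rate."""
    return 1000.0 / target_fps

vr_budget = frame_budget_ms(90)       # ~11.1 ms for every frame, CPU + GPU
desktop_budget = frame_budget_ms(60)  # ~16.7 ms, for comparison
```

Missing that budget even once means a reprojected or repeated frame, which is what the viewer perceives as ghosting.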

Reducing unnecessary materials per model is important for lowering the number of draw calls. This should be done wherever possible without affecting the quality of the image. In addition, we add "Cull Distance Volumes". The basis, however, is the use of the GPU and CPU profilers; they are very helpful in finding bottlenecks.
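As a rough mental model (an illustrative sketch, not UE4 API), every unique material slot on every visible mesh costs about one draw call, so merging material slots directly shrinks the total:

```python
# Illustrative model only (not UE4 API): each unique material slot on each
# visible mesh is roughly one draw call.
def estimate_draw_calls(meshes):
    """meshes: list of (mesh_name, [material slot names]) pairs."""
    return sum(len(set(materials)) for _, materials in meshes)

scene = [
    ("Wall",  ["Brick", "Brick", "Trim"]),  # duplicate slots collapse to one
    ("Floor", ["Wood"]),
]
estimate_draw_calls(scene)  # 3 instead of 4
```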

Each of our projects is divided into as many sub-levels as possible. These are loaded at the beginning and then shown and hidden as needed. We also limit the visibility of transparent objects at close distances (using "Camera Distance Fade"). Of utmost significance is reducing dynamic shadows on skeletal meshes, as they have the biggest influence on FPS drops.

Of course, it is advisable to use the LOD option not only for models but for particles as well.

If we have some FPS headroom left at the end, we can try setting the "Screen Percentage" above 100. This gives a great sharpening effect across the whole image, because Temporal AA then operates on a higher-resolution image that is downsampled to the output resolution.
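Keep in mind that the shading cost grows quadratically with the percentage; a small sketch of the shaded-pixel count (plain Python; the resolution values are just examples):

```python
# How many pixels actually get shaded at a given Screen Percentage
# (illustrative arithmetic, not engine code).
def shaded_pixels(width: int, height: int, screen_percentage: float) -> int:
    scale = screen_percentage / 100.0
    return int(width * scale) * int(height * scale)

base = shaded_pixels(1920, 1080, 100)
sharp = shaded_pixels(1920, 1080, 130)  # ~69% more shading work
```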


We are (almost) done, so here are some practice-based tips.


It is recommended to use APEX, not only for clothes but also for hair, flying debris, curtains, etc.


To transfer simulations from one 3D software package to another, first bake the transformations, parent all the animated objects to bones/nulls/dummies, and then export them as skeletal meshes. Using the same approach, you can create very interesting effects: spawn specific particles, replace them with planes, and export the entire animation in the same way.

It is worth remembering that Unreal Engine allows you to import animated blendshapes (morphs).


It is also worth noting that Unreal Engine allows you to animate material parameters via Matinee or Sequencer. In this way, you can create interesting effects by using “World Position Offset” or creating an original Dissolve.
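The Dissolve idea boils down to comparing a per-pixel noise value against an animated threshold parameter. A minimal sketch of that comparison in plain Python (the names are ours for illustration, not UE4 node names):

```python
# A pixel survives while its noise value is above the animated threshold;
# driving the threshold from 0 to 1 "dissolves" the object away.
def dissolve_visible(noise_value: float, dissolve_amount: float) -> bool:
    """True if the pixel is kept at the current dissolve amount (0..1)."""
    return noise_value > dissolve_amount

pixels = [0.1, 0.4, 0.6, 0.9]               # stand-ins for a noise texture
[dissolve_visible(p, 0.5) for p in pixels]  # [False, False, True, True]
```

In the material this is an opacity-mask clip; animating the threshold via Matinee or Sequencer produces the dissolve over time.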


To export a muscle system into Unreal Engine, you can add multiple skinned bones to the muscle system created earlier, using constraints to the muscle vertices. Next, bake the transformations of those bones every frame and export them into UE4.

Tiling can often be seen across larger areas. In addition to the obvious methods of adding "detail textures" and painting portions with a different material, it is good practice to add a method that gently changes the tiling depending on the distance (PixelDepth). It is advisable to clone the texture, reduce its tiling and rotate it, then blend it with the main texture using Perlin noise as a mask. On top of this, it is good to add another material to the terrain in steep locations, e.g. driven by the dot product of the up vector and the vertex normal (VertexNormalWS dot (0, 0, 1)).
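The blend described above can be sketched as plain math (Python standing in for the material graph; the names, tiling factors and fade distances here are illustrative, not UE4 nodes):

```python
import math

def rotate_uv(u, v, angle):
    """Rotate UV coordinates around the (0.5, 0.5) tile center."""
    cu, cv = u - 0.5, v - 0.5
    c, s = math.cos(angle), math.sin(angle)
    return (cu * c - cv * s + 0.5, cu * s + cv * c + 0.5)

def lerp(a, b, t):
    return a + (b - a) * t

def anti_tiled_sample(sample, u, v, noise, depth,
                      fade_start=1000.0, fade_end=5000.0):
    """sample(u, v) -> texture value; noise is a 0..1 Perlin-style mask.

    Blends a tightly tiled base with a lower-tiled, rotated clone,
    fading the clone in with distance (a PixelDepth analogue)."""
    base = sample(u * 4.0, v * 4.0)               # main texture, tight tiling
    ru, rv = rotate_uv(u, v, math.radians(30.0))  # cloned and rotated
    variant = sample(ru, rv)                      # reduced tiling
    fade = min(max((depth - fade_start) / (fade_end - fade_start), 0.0), 1.0)
    return lerp(base, variant, noise * fade)

def slope_mask(normal):
    """~1 on steep surfaces, 0 on flat ground: 1 - dot(N, (0, 0, 1))."""
    return 1.0 - normal[2]
```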


In games, you can often see the repetition of basic objects, such as grass, bushes or even some props. Therefore, it is a good idea to add slight color variations to their textures, e.g. by feeding "Absolute World Position" (excluding material offset) into the "Noise" node as UVs, and using the result to change the saturation, color or brightness, or to add an effect such as dust or dirt.
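The idea can be sketched as shader-style math in Python (the hash function below is a common illustrative trick, not the engine's actual Noise implementation):

```python
import math

def position_noise(x, y, z):
    """Cheap deterministic hash of a world position into the 0..1 range
    (an illustrative stand-in for sampling a noise node)."""
    h = math.sin(x * 12.9898 + y * 78.233 + z * 37.719) * 43758.5453
    return h - math.floor(h)

def varied_brightness(base, x, y, z, amount=0.2):
    """Scale a base brightness by up to +/- `amount` per placement,
    so identical meshes no longer look identical."""
    return base * (1.0 + (position_noise(x, y, z) - 0.5) * 2.0 * amount)
```

Because the variation comes from world position, every placed instance of the same mesh gets a stable, slightly different tint for free.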


We would like to thank Epic Games for the opportunity to share our knowledge. Now we are going back to work. Currently, we are preparing a demo of our upcoming thriller game "The Beast Inside", which tells the story of Adam as he explores dark flashbacks and traumatic events from another person's past. Any comments and critiques from people working in the industry (and beyond) are welcome, so we encourage you to check out The Beast Inside on Facebook. You can also visit our official website and Facebook page to connect with us.