I’m Lorenzo Drago, an Italian 3D artist with a university background in communication design. I started as a self-taught Blender artist as a teenager, and for the past three years, I’ve been focusing on game art through university classes, freelance work, mod and indie development, as well as personal projects.
In May, I released my newest portfolio piece, an environment based on Etchū-Daimon Station in Imizu, Japan, powered by Unreal Engine 5. Since I’ve received quite a few questions about it, I thought it would be useful to highlight how I created the scene’s day/night lighting scenarios, camera motion, and touch upon how I set the environment up.
The beginning stages of the Etchū-Daimon Station project
I started Etchū-Daimon Station on Unreal Engine 4 and switched to UE5 halfway through. This was also one of the reasons I didn’t end up using Nanite for my assets. At the start of the project, I was limited to working on an old PC and found that UE5 performed better in some ways, especially since Lumen eliminated the long Lightmass bake times and let me iterate on lighting faster. Migrating the project to UE5 was seamless; I only had to enable the new rendering features in the project settings and switch all the lighting to real-time. Since lighting was quite a challenging stage for me, I also used Epic Games’ Lighting Essential Concepts and Effects course as my main learning material.
Every other project currently on my ArtStation started as either a commission or an assignment, so this was my first serious, entirely personal project. For that reason, I wanted to aim for an aesthetic that resonated with me personally. I decided to strive for a very grounded type of photorealism, with a mundane, everyday aesthetic that anyone could relate to. It also allowed me to work on my use of references, proportions, and materials: areas I thought needed improvement after my previous projects.
During a trip to Japan a couple of years ago, I had the chance to stop at a few of the countryside train stations in the middle of nowhere. I think these locations have a very unique, nostalgic atmosphere and feel more lived-in compared to castles, temples, or shrines.
More recently, while looking for references online, I stumbled upon a picture of the real Etchū-Daimon Station. It reminded me of that kind of atmosphere, so I chose it as my next subject.
Creating day/night lighting scenarios using Lumen
Thanks in no small part to Lumen, lighting was a comparatively simple part of the project. For the daytime scene, I decided to start with real-life brightness values, which can be found fairly easily on the Internet. The big numbers take a little time to get used to, but they work pretty well once exposure is properly adjusted. I used them as a starting point before applying further tweaks to get closer to my references and the look I wanted.
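To give a sense of what working with real-life values looks like, here is a small sketch of the standard photographic relation between scene luminance and exposure (the EV100 formula used in physically-based lighting, with the common reflected-light meter constant K = 12.5). The specific luminance values below are generic ballpark figures, not the ones used in this project:

```python
import math

def ev100_from_luminance(luminance_cd_m2: float) -> float:
    """Photographic exposure value at ISO 100 for a given average
    scene luminance, via EV100 = log2(L * S / K) with S = 100 and
    reflected-light meter calibration constant K = 12.5."""
    return math.log2(luminance_cd_m2 * 100.0 / 12.5)

# Generic ballpark luminances (cd/m^2) and the exposure they imply:
for label, lum in [("overcast sky", 2_000.0),
                   ("sunlit scene", 8_000.0),
                   ("moonlit scene", 0.1)]:
    print(f"{label}: ~EV100 {ev100_from_luminance(lum):.1f}")
```

This is why the "big numbers" are manageable in practice: however large the physical intensities get, a matching auto-exposure (or manual EV100) setting brings the final image back into a normal range.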
For the daytime scene, UE5’s default Directional Light + Sky Light setup worked quite well, as it automatically adjusts the color and intensity of the light and sky based on the angle of incidence. That said, for sunset or sunrise scenes, this may be worth adjusting further, as the colors get quite saturated.
For a static scene like mine, I ended up using HDRI backgrounds, which means that, as is, the scene doesn’t feature a dynamic sky. I set the backdrops so they wouldn’t affect sky lighting since the included sun would have affected the indirect lighting in addition to the scene’s Directional Light. With these tweaks, I was happy to see that Lumen calculated Global Illumination in a very natural way.
The nighttime scene, on the other hand, was a little harder to get right with Lumen. Compared to the real-world reference, my scene might appear darker, as if the light isn’t bouncing as much as it should. When I attempted to increase exposure or brightness, it became too hard to hide Lumen’s difficulty in resolving the many half-shadowed areas, which manifested as increased noise and instability. The nighttime scene ended up looking a little more stylized as a result, but I was satisfied with it, so I leaned further into a first-person horror aesthetic.
Setting up the flashlight was simple. I employed a spotlight with a simple Light Function Material, with a “cookie” texture plugged into the material’s Emissive Color input. After that, I tweaked the wetness of various materials in the scene, hoping to get an interesting response to the head-on light.
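Conceptually, a light function just scales the light’s output per pixel by a value sampled from a texture in the light’s projection space. A minimal sketch of the idea, using a hypothetical procedural radial cookie rather than an authored texture:

```python
import math

def radial_cookie(u: float, v: float) -> float:
    """Hypothetical cookie pattern: a bright hotspot at the center of
    the projected disc, fading smoothly to black at the edge, similar
    to a flashlight beam. (u, v) are texture coordinates in [0, 1]."""
    r = math.hypot(u - 0.5, v - 0.5) * 2.0  # 0 at center, 1 at edge
    return max(0.0, 1.0 - r * r)

def flashlight_intensity(base_intensity: float, u: float, v: float) -> float:
    # The light function multiplies the spotlight's base intensity by
    # the cookie value at each projected point.
    return base_intensity * radial_cookie(u, v)

print(flashlight_intensity(5000.0, 0.5, 0.5))  # full brightness at center
print(flashlight_intensity(5000.0, 1.0, 0.5))  # falls to zero at the edge
```

In the actual material, the cookie would be a texture sample rather than a formula, but the multiplication is the whole trick: darker texels carve shapes and falloff out of the beam.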
Setting up camera motion and flashlight animations in Unreal Engine 5
Setting up the camera motion was something quite new and exciting for me. I enjoyed bringing more physicality to my exploration of the environment and making the location feel lived-in.
I initially considered smartphone tracking but then settled on VR, as the ability to see the scene fully rendered in real-time as it’s being filmed was extremely useful, especially as reflections became important in the nighttime version.
Though I had some scripting experience in Unity using C#, this was my first time working with Blueprints, let alone virtual production, so I ran into some challenges creating the setup. Aiden Wilson’s tutorials were extremely useful to me, as they go over all the necessary steps and more.
My solution was hardly the most advanced, clean, performant, or user-friendly, but it served me well enough for my limited needs.
To record the tracked motion, I made use of Unreal Engine’s Take Recorder, set up to track my camera object. The scene runs well enough in real-time that I can play it as is without tweaks, so it’s only a matter of hitting play and starting a new take.
For the nighttime scenes, I hadn’t initially set up flashlight movement. Instead of re-recording each shot with an additional tracker, I recorded the movement on top of the existing takes by editing the Blueprint slightly and using Take Recorder’s option to record over an existing take.
Since the flashlight’s horizontal orientation is inherited from the camera, the result might feel a little more artificial compared to recording flashlight and camera movement separately with, say, a controller in each hand.
Once I had the finished camera motions recorded, I matched those movements in real life while recording with my real-life camera to obtain the main audio tracks for the video. I also added some extra sound effects, trying to match the low-quality audio compression and the reverb of each location. I used Adobe Premiere to combine the audio files with the rendered sequence.
I rendered the project in a fairly straightforward way from Sequencer, as a high-resolution image sequence. Something to note, though, is that I had to disable my tracked Camera Blueprint, since VR input still affected it while the sequence rendered.
I didn’t change the default post-processing settings too much, except for tweaks to exposure, bloom, tone mapping curve, and simple color adjustments.
I also added some further color tweaks, sharpening, and vignette afterward in After Effects. I’m sure it could have been done directly in-engine, but I found this way easier for me to work and experiment with.