Bulkhead Interactive showcases how to port a PC/console game to Nintendo Switch
The upgrade brought significant rendering changes, most notably the introduction of the new filmic tonemapper, which made the final presentation look noticeably different. This shift required color-grading work, tweaks to specularity, and a fresh pass on the reflection setup in our levels to bring the game's look back in line with the artists' original vision.

The downside, however, was that the garbage collector had to work overtime, causing spikes on the game thread during gameplay. On a relatively constrained portable system like the Nintendo Switch, this gave us considerable trouble during level streaming: new actors and components must be registered, initialized, and collated as they enter the world, while the garbage collector is simultaneously tearing down entities leaving the active play set. The problem was compounded by the complex actors with dozens of components used throughout the game, which simply never surfaced as an issue on other platforms.

If there is one great piece of advice we can share for similar cases, it is to look into the available options for level streaming (Project Settings > Streaming). These can cap the amount of time per frame spent on actor and component initialization and teardown, along with other similarly valuable settings.
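As a concrete sketch, these settings can also be set per platform in config files. The console variable names below are standard Unreal level-streaming settings, but the values are illustrative examples, not the ones the team shipped:

```ini
; Illustrative snippet for Config/DefaultEngine.ini (or a platform-specific
; override). These mirror the Project Settings > Streaming options.
[/Script/Engine.StreamingSettings]
; Max milliseconds per frame spent initializing streamed-in actors
s.LevelStreamingActorsUpdateTimeLimit=4.0
; How many components to register per batch while streaming a level in
s.LevelStreamingComponentsRegistrationGranularity=5
; Max milliseconds per frame spent unregistering components on stream-out
s.UnregisterComponentsTimeLimit=2.0
; How many components to unregister per batch while streaming a level out
s.LevelStreamingComponentsUnregistrationGranularity=5
```

Lowering the time limits smooths out hitches by spreading the work across more frames, at the cost of levels taking slightly longer to finish streaming in.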

Our solution was to dynamically create widgets and render them to textures, and to do so only during loading screens or when the player manually changes the game language. This system saved us the overhead of keeping multiple textures for each instance of text in the world, and it required no additional work as the translations were iterated on.
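The shape of that system can be sketched as a cache of baked text textures that is rebuilt in bulk whenever the language changes. Everything here is a hypothetical stand-in: in Unreal the actual rendering step would go through the engine's widget renderer into a render target, which this sketch abstracts behind a callback.

```cpp
#include <functional>
#include <map>
#include <string>

// Hypothetical stand-in for a GPU texture handle; in the engine this would
// be a render target produced by drawing a UMG widget.
struct TextTexture {
    std::string renderedText; // the string that was baked into the texture
};

// Cache of baked text textures, keyed by string-table ID. Rebuilt in bulk
// during loading screens or on a language change, never mid-gameplay.
class TextTextureCache {
public:
    // `renderWidget` stands in for the engine call that draws a text widget
    // into a texture; injected here so the sketch stays self-contained.
    explicit TextTextureCache(std::function<TextTexture(const std::string&)> renderWidget)
        : render(std::move(renderWidget)) {}

    // Bake every localized string for the current language in one pass.
    void rebuild(const std::map<std::string, std::string>& localizedStrings) {
        cache.clear();
        for (const auto& [id, text] : localizedStrings)
            cache[id] = render(text);
    }

    // Gameplay code only reads from the cache; there is no per-frame
    // widget-rendering cost and no per-language texture assets on disk.
    const TextTexture* find(const std::string& id) const {
        auto it = cache.find(id);
        return it == cache.end() ? nullptr : &it->second;
    }

private:
    std::function<TextTexture(const std::string&)> render;
    std::map<std::string, TextTexture> cache;
};
```

Because the cache is keyed by string ID rather than by language, updated translations flow through automatically the next time the cache is rebuilt.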

Alternatively, we could have leaned on modern rendering features provided by Unreal Engine, such as dynamic resolution, but after some deliberation we settled on a more traditional approach.
Among the most intensive real-time rendering features we identified on the Nintendo Switch were ambient occlusion and screen space reflections. We deemed them vital to the visual fidelity of the environment, so we carefully tuned their quality down to levels we found reasonable for meeting our performance targets. Much of the graphical fine-tuning was done on the fly during development, thanks to the engine's excellent support for tweaking settings at runtime. As we slowly converged on the final values, we moved them into device profiles toward the end of the project.
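Platform-specific overrides of this kind typically live in a device profile config. The console variable names below are standard Unreal settings, but the section layout and values are illustrative assumptions, not the project's shipped profile:

```ini
; Illustrative device profile fragment (e.g. in a platform DeviceProfiles.ini).
; Values are examples only.
[Switch DeviceProfile]
DeviceType=Switch
; Reduce ambient occlusion quality relative to the PC defaults
+CVars=r.AmbientOcclusionLevels=1
; Drop screen space reflections to their low-quality tier
+CVars=r.SSR.Quality=1
; Render below native resolution and upscale
+CVars=r.ScreenPercentage=75
```

Keeping these in a device profile lets a single content base ship across platforms, with each device picking up its own cost/quality trade-offs at startup.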
