How Square Enix leveraged Unreal Engine to modernize FINAL FANTASY VII REMAKE

By Jimmy Thang | June 30, 2020
NAOKI HAMAGUCHI, CO-DIRECTOR, FINAL FANTASY VII REMAKE

Joining Square Enix in 2003 to work on the critically acclaimed FINAL FANTASY XII, Naoki Hamaguchi has been involved with a number of revered FINAL FANTASY titles, including the FINAL FANTASY XIII series and MOBIUS FINAL FANTASY.
 
Hamaguchi is the co-director of FINAL FANTASY VII REMAKE alongside fellow co-director Motomu Toriyama and director Tetsuya Nomura, and used his extensive experience to lead the game’s design and programming.
Leading up to the title’s launch, FINAL FANTASY VII REMAKE was undeniably one of the most anticipated games of all time. Square Enix had tremendous pressure on its shoulders to revamp the beloved classic, which some have heralded as the best RPG ever made. Despite monumental shoes to fill, the Japanese studio took some bold new steps to modernize the game and delivered, with review sites like GameSpot stating, “This isn't the Final Fantasy VII your mind remembers, it's the one your heart always knew it to be.” 

To get an in-depth look at how Square Enix revamped FINAL FANTASY VII's graphics, combat system, and world while keeping the spirit of the original game, we interviewed FINAL FANTASY VII REMAKE Co-Director Naoki Hamaguchi. He talks about how Square Enix was able to flesh out the game’s environments to create a new heightened sense of immersion, incorporate top-notch visual effects, optimize the game to be performant, and more. The co-director also elaborates on how transitioning to Unreal Engine helped propel the team forward.
 

FINAL FANTASY VII REMAKE has been praised for maintaining the spirit of the original while also feeling fresh and new. Can you talk about how the team was able to strike this delicate balance?

FINAL FANTASY VII REMAKE Co-Director Naoki Hamaguchi:
Thank you for your kind words. Looking at players’ reactions after release, I am really proud that the remake we strived hard towards was well received.
 
We aimed to create a remake that was not an entirely new title merely inspired by the world and lore of the original, but one that pays homage to that world and lore while reimagining elements so they could be enjoyed as something new and modern. By doing so, we were able to present the remake in a modern style without breaking the image players remember from the original, and provide a “familiar-yet-new” experience. We received many comments from players saying they enjoyed and appreciated the remake that our team aimed to create. So, I believe the hard work the development team put into the title was able to reach fans and newcomers alike.
With tremendous visual variety, the game's environments are more detailed than even the most ardent fans could have imagined. Can you talk about how the team was able to meticulously flesh out the city of Midgar? 

Hamaguchi:
I am very proud that the reimagined Midgar was accepted by the fans. Due to technical limitations in the original FINAL FANTASY VII, traversal through the world was across fixed screens. There were many shortcuts taken in the original, where there were deliberate jumps to the next location. Our goal was to stay with the characters and flesh out those location gaps that the original navigated around. Naturally, this created a greater level of immersion, but it also gave us an opportunity to increase the variety in the locations without dismissing elements of the original. 
 
Ultimately, our aim wasn’t to create an entirely new Midgar from scratch, as we didn’t want to act like we were just borrowing this world. We wanted to pay homage to all elements of the original, including the locations and reimagine them in an appropriate and modern way.
 
For instance, take the Wall Market entertainment area of Midgar as an example. A lot has changed in the 23 years since the original game was released, and we felt this area of the city was something that we needed to modernize, so it could be enjoyed by all audiences. To update the Honeybee Inn, we used Las Vegas, the French Moulin Rouge, and Japanese burlesque as inspirations to make this location more of a spectacular show featuring a dance battle. 

It’s an example of how we paid homage to the original, capturing a familiar yet new feel that runs throughout the game and the world. 
FINAL FANTASY VII REMAKE features fantastic character models that reimagine their old blocky equivalents in a more realistic yet still stylized way. How did the team iterate and pull off their look?

Hamaguchi:
When Producer Yoshinori Kitase and Director Tetsuya Nomura called upon me to join the FINAL FANTASY VII REMAKE team, they mentioned that the visual quality we were striving for in this remake was that of the FINAL FANTASY VII ADVENT CHILDREN animated movie but running in-game in real time.
 
As we used physically based lighting technology, the viewer’s impression can change depending on the lighting settings. While we weren’t aiming for designs that were totally photorealistic, I remember having to redo the scene in which Cloud jumps off of the train at the Mako Reactor 1 train station multiple times before Director Nomura gave us the okay for Cloud’s final expression. However, arriving at a clear direction on lighting from working on that scene made mass production from that point on very smooth, which was a big plus.
With elegant uses of particle effects, bloom, and ambient occlusion, can you speak to how the team implemented the game's impressive post-process effects?

Hamaguchi:
For this title, we used Unreal Engine’s renderer as our framework, but we decided to build much of the light probe, reflection, light baking, skinning, particle, post-effect, and tone-mapping systems ourselves, along with, of course, the materials and lighting, so that we would stand out against other titles in terms of visual quality and performance.
 
Because of that, we didn’t utilize the existing post-process effect materials. Instead, all image processing was implemented by writing the shaders directly. One additional benefit of changing the rendering path was the blurring of particle effects: by adding a rendering path that goes from the effect asset to its own dedicated image processing, we create a blur, much like speed lines in a comic book, that the effects designer can control. The implementation of the bloom effect is similar to a standard one, but the kernel shape was calculated by blending glare and bloom elements, with the glare kernel fitted to the standard measurements of the human retina and the bloom kernel following Mie scattering.
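To illustrate the idea of blending a narrow glare term with a wider scattering-style bloom term into a single convolution kernel, here is a minimal, hypothetical Python sketch. It is not Square Enix’s shader; the function names, Gaussian falloffs (standing in for the retina-fitted and Mie-scattering kernels), and weights are all assumptions for illustration.

```python
import math

def blended_kernel(size, glare_sigma=1.0, bloom_sigma=4.0, glare_weight=0.6):
    """Build a 1-D kernel mixing a tight glare spike with a broad bloom
    falloff, then normalize so the convolution preserves total energy."""
    half = size // 2
    kernel = []
    for i in range(-half, half + 1):
        glare = math.exp(-(i * i) / (2 * glare_sigma ** 2))  # narrow, glare-like
        bloom = math.exp(-(i * i) / (2 * bloom_sigma ** 2))  # wide, scattering-like
        kernel.append(glare_weight * glare + (1 - glare_weight) * bloom)
    total = sum(kernel)
    return [k / total for k in kernel]

def apply_bloom(brightness, kernel):
    """Convolve one row of brightness values with the blended kernel,
    clamping samples at the image edges."""
    half = len(kernel) // 2
    out = []
    for x in range(len(brightness)):
        acc = 0.0
        for k, w in enumerate(kernel):
            src = min(max(x + k - half, 0), len(brightness) - 1)
            acc += brightness[src] * w
        out.append(acc)
    return out
```

In a real renderer this runs as a separable blur on the GPU; the sketch only shows how two differently shaped kernels combine into one.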
 
Additionally, because ambient occlusion has a large impact on the picture, we applied it at full resolution, while working to ensure physical accuracy as much as possible. In order to achieve this, we implemented layers of feedback over multiple frames, while also applying an aggressive noise removal filter. The calculations for ambient occlusion were merged nonlinearly into the baked occlusion, or capsule shadow, and used in a complex way within the indirect and direct lighting calculations.
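The multi-frame feedback described above can be sketched as a per-pixel exponential moving average with a clamp that limits how far any single frame can pull the history (a common ghosting-control trick). This is an illustrative assumption, not the game’s actual filter; the parameter names are hypothetical.

```python
def accumulate_ao(history, current, blend=0.1, clamp=0.3):
    """Blend this frame's noisy AO estimate into a running per-pixel
    history. A small blend factor averages noise away over many frames;
    clamping the per-frame change limits ghosting on fast changes."""
    out = []
    for h, c in zip(history, current):
        step = (c - h) * blend                   # exponential moving average step
        step = max(-clamp, min(clamp, step))     # reject large jumps
        out.append(h + step)
    return out
```

Each call represents one frame; feeding the result back in as `history` the next frame gives the temporal accumulation.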
 
So, thanks not only to the power of the engine itself but also to the ability to customize Unreal Engine, I felt we were able to smoothly integrate and tailor the engine with the knowledge we’ve amassed up to this point at Square Enix.
With a mixture of dynamic and baked lighting, can you elaborate on how the team handled lighting in the game? 

Hamaguchi:
In terms of lighting, we made a clear distinction between how we placed static lighting and dynamic lighting. Static lighting requires less processing load, but the quality pales in comparison to dynamic lighting, so we used it primarily for lower-intensity lights or ones that are far away, and let it fill the role of ambient lighting.
 
As for dynamic lights, they cannot be placed in large numbers due to their heavier processing load, so we limited their use to areas where we wanted to better showcase the texture or three-dimensionality of certain field elements. During a cutscene, the lights or probes placed in the field did not always give the best impression. To combat those instances, we would dial down the existing dynamic lighting and probes during the cutscene, while applying dynamic lighting specifically for the cinematics of that scene. By having the lighting team adjust the lighting on both the background and cutscene, we were able to get a mix of static and dynamic lighting, while striking a nice balance between both.
 
The game features cinematic uses of bokeh depth of field and elegant uses of per-object motion blur. Can you elaborate as to how the team implemented these techniques?

Hamaguchi:
In this title, a massive amount of translucent particle effects is constantly being displayed, so if we used depth-of-field post-process effects, the screen could become disrupted by something as simple as camera movement. Because of this, we limited the use of depth-of-field effects to cutscenes in which the camera and assets could be kept stationary.
 
As for motion blur, because it makes a big impact on how it feels to control the character, we hesitated to remove it completely. Thus, during normal gameplay, in other words whenever the player can control the camera, we subtract the camera movement from the normal motion blur, so you see a blur reminiscent of action lines in a comic book only when the character is actively moving. The amount of subtraction is increased or decreased whenever the game switches in and out of a cutscene, so it seamlessly connects to the standard cinematic blur.
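The camera-subtraction idea can be shown with a tiny sketch: a per-pixel blur vector where the camera’s contribution is removed, scaled by a factor that fades toward zero as the game enters a cutscene (restoring full cinematic blur). This is a hypothetical illustration of the technique described, not the game’s shader; the function and parameter names are assumptions.

```python
def gameplay_blur_vector(object_velocity, camera_velocity, subtraction=1.0):
    """Compute a 2-D blur direction for one pixel.

    subtraction = 1.0 during gameplay: static scenery under a panning
    camera gets zero blur, and only actively moving objects streak.
    subtraction -> 0.0 inside a cutscene: the full (cinematic) motion
    blur, camera movement included, is restored.
    """
    ox, oy = object_velocity
    cx, cy = camera_velocity
    return (ox - cx * subtraction, oy - cy * subtraction)
```

Animating `subtraction` between 0 and 1 across the cutscene boundary gives the seamless transition the answer describes.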
FINAL FANTASY VII REMAKE employs a subtle use of physics that grounds the experience. This is evident in the way liquid flows, cloth sways, and boxes get shoved about. Can you talk about how the team achieved this?

Hamaguchi:
There are three major types of physics simulation used in FINAL FANTASY VII REMAKE.
 
The first type of simulation utilized PhysX. This is used primarily on rigid bodies in the background, and background artists would create these physics assets. There are times when a programmer would support the setup of complex shapes, but for the most part, the setup can be done by the background artist alone, and we did not need to add any functions to the engine.
 
The second type of simulation utilized Vertex Animation Textures (VAT). We would bake in animation per vertex using a DCC tool, and it would be played back as material animation at runtime. This method is used primarily for fluid elements or complex rigid bodies in the background, but it can also be used for animating a large number of characters with consistent movement. The artists would use Houdini or Maya to create the simulations or animations, moving between tools in the Alembic format, and finally export them out of Houdini as a VAT. VAT is not simulated at runtime, so we had the benefit of playing these back at a lower cost.
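Conceptually, a VAT is just a 2-D lookup: one axis indexes vertices, the other indexes baked frames, and each texel stores a position offset. A minimal sketch of that playback, with linear interpolation between baked frames so the animation can run at any rate, might look like the following. The data layout and function name here are illustrative assumptions, not the actual asset format.

```python
def sample_vat(texture, vertex_index, frame, frame_count):
    """Sample a baked vertex offset from a 'texture' laid out as
    rows = vertices, columns = frames, texels = (x, y, z) offsets.
    Fractional frames interpolate between the two nearest bakes,
    wrapping for looping playback."""
    frame = frame % frame_count
    f0 = int(frame)
    f1 = (f0 + 1) % frame_count
    t = frame - f0
    a = texture[vertex_index][f0]
    b = texture[vertex_index][f1]
    return tuple(x + (y - x) * t for x, y in zip(a, b))
```

In the engine this sampling happens in the vertex shader via the material system, which is why playback costs almost nothing compared to simulating at runtime.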
 
The third type of simulation was something we implemented in FINAL FANTASY XV, which was then integrated into UE4 thanks to our Advanced Technology Division: Bonamik. Bonamik was primarily used for character-related simulations. It was used quite a lot in hair, cloth, and soft-body physics, among other things, for its ease of control. We used position-based dynamics as the core algorithm, and we were able to achieve cinematic movements that would otherwise be difficult with a standard physics simulation.
 
In addition to looking fantastic, the game runs at high resolutions on the PlayStation 4 and PlayStation 4 Pro while maintaining a consistent frame rate. How did the team achieve this? 

Hamaguchi:
The biggest thing for us was to maximize use of the CPU cores. From the design phase, we were very careful about desynchronizing threads and tasks.
 
For multi-threading, we adjusted thread priorities and affinity masks with the CPU cluster layout in mind. Frame rate is affected by the GameThread, RenderingThread, and RHI Thread, in that order, so we adjusted the affinity mask and priority so that each of these dedicates one core. TaskGraph threads were placed lower than the above three, but higher than other threads, to reduce wait time for the GameThread, RenderingThread, and RHI Thread. Additionally, we actively used TaskGraph for groups of tasks that depend on each other, and used ThreadPool to execute tasks that did not need to complete within one frame, achieving a maximum utilization rate of over 90% on the CPU analyzer.
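The distinction drawn here, a task graph for interdependent work versus a thread pool for deferrable work, hinges on dependency ordering: a task-graph task runs only once everything it waits on has finished. The following is a small serial stand-in for that scheduling logic (real engine task graphs run tasks concurrently across worker threads); all names are hypothetical.

```python
from collections import deque

def run_task_graph(tasks, deps):
    """Execute interdependent tasks in dependency order via Kahn's
    topological sort. tasks: name -> callable; deps: name -> set of
    names that must finish first. Returns the execution order."""
    indegree = {name: len(deps.get(name, set())) for name in tasks}
    dependents = {name: set() for name in tasks}
    for name, ds in deps.items():
        for d in ds:
            dependents[d].add(name)
    ready = deque(n for n, deg in indegree.items() if deg == 0)
    order = []
    while ready:
        n = ready.popleft()
        tasks[n]()            # run the task once its inputs are complete
        order.append(n)
        for m in dependents[n]:
            indegree[m] -= 1
            if indegree[m] == 0:
                ready.append(m)
    return order
```

Work with no such ordering constraint, and no need to finish within the frame, is what the answer routes to a thread pool instead.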

The remake has been heavily praised for its new, exciting combat system, which infuses strategic planning with engaging real-time action. How did the team come to this design?
 
Hamaguchi:
With the graphical update becoming much more elaborate since the original, we wanted to remove the boundaries between combat and field traversal, and incorporate a battle system that felt like it was happening in real time. However, we didn’t want to completely ignore the battle system of the original in favor of real-time action battles. Rather, we were mindful of paying homage to the Active Time Battle (ATB) combat system of the original and wanted to evolve that into something that’s more real time, so that the battle system would appear new, yet familiar, to fans.
 
It was not a simple merging of turn-based ATB and real-time action for the sake of experimentation; we thought about how to bring out the best in both elements by clearly defining the role each would play. The base of the combat in the remake is the ATB battle system. We retained the original’s simple concept of using ATB charges to perform abilities. The real-time action helps players charge their ATB more efficiently or provides opportunities to leverage its usage; its role is to enhance the ATB battle through short bursts of player technique. This way, the ATB battle system from the original functions as the core of this combat system.
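The relationship described, real-time action accelerating a gauge whose whole charges are then spent on abilities, can be sketched as a tiny state machine. This is a hypothetical illustration of the concept, not the game’s actual tuning; the class name, rates, and multiplier are assumptions.

```python
class ATBGauge:
    """Minimal ATB sketch: the gauge fills over time, real-time attacks
    fill it faster, and abilities spend whole charges."""

    def __init__(self, max_charges=2, fill_rate=0.1):
        self.max_charges = max_charges
        self.fill_rate = fill_rate     # charges gained per second, idle
        self.value = 0.0               # current fill, measured in charges

    def tick(self, dt, attacking=False):
        # Attacking charges the gauge faster, rewarding active play.
        rate = self.fill_rate * (3.0 if attacking else 1.0)
        self.value = min(self.max_charges, self.value + rate * dt)

    def charges(self):
        return int(self.value)

    def spend(self, cost=1):
        """Spend whole charges on an ability; fails if not enough."""
        if self.charges() >= cost:
            self.value -= cost
            return True
        return False
```

The division of roles in the answer maps directly onto this: `tick(attacking=True)` is the action layer, `spend()` is the strategic ATB layer.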
FINAL FANTASY VII REMAKE introduces several new combat mechanics like the stagger system, the ability to cancel attacks, and parrying. Can you elaborate on why these intricate additions were made?

Hamaguchi:
Simple mechanics would allow for many people to jump in and would facilitate ease of gameplay, but on the flip side, if it’s too simple, considering just how often you go into combat, it inevitably would become repetitive, and players could tire of it easily. So, we carefully selected and implemented elements that would provide depth, without being too complex based on two major pillars: action and strategy.
 
The Stagger system, for example, was implemented to avoid repetitive gameplay, where the player would spam high-damage commands when trying to defeat an enemy. You might want to rush your enemy with commands that Stagger them quicker; if the enemy’s health is low, it may be better to attack with high-damage commands rather than aiming to Stagger them; or you might hit them with commands that halt them in their tracks in preparation to Stagger them. Many different options can stem from the commands you choose, and we designed it so that those link to the element of strategy.
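The trade-off in that paragraph, commands that build the Stagger gauge versus commands that deal raw damage, comes from damage being amplified once the gauge is full. A hypothetical minimal model (the field names, gauge scale, and 1.6x bonus are illustrative assumptions, not the game’s values):

```python
def resolve_command(enemy, command):
    """Resolve one command against an enemy dict with 'stagger' and
    'stagger_max' fields. Staggered enemies take bonus damage; otherwise
    the command's stagger_build fills the gauge toward that state."""
    if enemy["stagger"] >= enemy["stagger_max"]:
        return command["damage"] * 1.6        # bonus damage while Staggered
    enemy["stagger"] = min(enemy["stagger_max"],
                           enemy["stagger"] + command["stagger_build"])
    return command["damage"]
```

The strategic choice falls out of the numbers: against a healthy enemy, gauge-building commands pay off later; against a nearly defeated one, raw damage now may beat chasing the Stagger.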
 
The game offers a vast array of enemies with different strengths and weaknesses. How did the team approach designing them?
 
Hamaguchi:
If we only looked at differentiating the ways in which players would beat each enemy, we would run the risk of having no consistent rule, and the gameplay would feel unclear. I knew I always had to be very mindful of that risk.
 
To overcome this, we first focused on enemy designs and characteristics, and how those would serve as an opportunity for anticipating or considering strategies to a certain extent; then, we would come up with weaknesses—like machines would be weak against lightning, or humanoids would be weak against fire—to establish the general, overarching rules. But those alone would just increase the number of enemies with similar strategies, so we would adjust the balance and think about variations that would make players try different methods to breach enemy defenses, and differentiate the gameplay for each of the enemies.
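The "general, overarching rules" described here amount to a weakness table consulted before falling back to neutral damage. A tiny hypothetical sketch, using only the two example pairings the answer actually gives (the table structure and multiplier values are assumptions):

```python
# Overarching weakness rules from the answer: machines are weak to
# lightning, humanoids are weak to fire. Values are illustrative.
WEAKNESS = {
    ("machine", "lightning"): 2.0,
    ("humanoid", "fire"): 2.0,
}

def damage_multiplier(enemy_type, element):
    """Apply the overarching rule if one exists, else neutral damage."""
    return WEAKNESS.get((enemy_type, element), 1.0)
```

Per-enemy variations then layer on top of this baseline so that similar enemies still demand different tactics, as the answer goes on to describe.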
 
Whenever we’re placing enemies in each of their respective locations, we were also mindful about structuring the battle scene to avoid having repetitive gameplay. 
With slick-looking limit breaks and attacks, can you share how the team was able to achieve the game's high-quality animations?
 
Hamaguchi:
We paid careful attention to bring out the great qualities of the keyframe animations, which do not depend solely on motion capture, and the imagination and artistic talents of our animators to the full extent. Of course, parts that required realistic and subtle movements were done by shooting lots of motion capture, but attack actions and whatnot were created by our game designers, camera people, and animators working closely together, who strived to deliver very eye-catching actions that are befitting of FINAL FANTASY VII REMAKE. As a result, I believe the animations ended up being very exhilarating and appealing.
Summons in the series have always been very epic, visually impressive feats. The remake continues this trend but mixes things up by allowing summons to stick around the battlefield. Can you elaborate on this design change?
 
Hamaguchi:
In FINAL FANTASY, summons not only need to play into strategy, but they also need to look cool visually, so I’m glad you liked them. There’s actually a clear reason behind our decision to prevent players from being able to use them freely in any and all battles, which was how it was in the original. This isn’t limited to summons, but whenever there’s a mechanic that can be used at any time, we have to ensure that the battle system has been properly designed to incorporate the pros and cons that accompany its usage. Otherwise, that mechanic can become a sure-fire method for victory every time, and battles could end up becoming repetitive and monotonous. 

When we incorporated summons into the battle design, we also prioritized creating an experience that felt particularly exciting and special. Rather than incorporating summons as a strategic element to be used at any time during battle, we thought it would feel more memorable and exhilarating if they could be experienced when most needed, with the summon staying on the battlefield and helping the player for the duration. Based on this decision, we narrowed down the conditions required for summoning while also reducing the MP consumption to zero. As a result, I think we were able to incorporate summons into the battle system in a well-balanced manner.

Considering FINAL FANTASY VII REMAKE is the first game in the series to use Unreal, why was the engine a good fit? Was Square Enix able to leverage any Unreal experience from working on KINGDOM HEARTS III to build upon for FINAL FANTASY VII REMAKE?
 
Hamaguchi:
Originally, we were driving the development of FINAL FANTASY VII REMAKE with an organizational structure based around external development partners. Around 2017, in order to heighten the quality of the product even further, and to stabilize mass production schedules, we shifted to an organizational structure based around development that would take place internally, although this didn’t change the fact that we continued to work with many external partners. In this respect, we determined that developing on a public engine, with expertise built up both internally and externally, was better suited for us when considering the organizational structure for developing REMAKE.

Further, staff members with experience developing KINGDOM HEARTS III came on board for the development of FINAL FANTASY VII REMAKE, and we were able to circulate the expertise they had already built up with Unreal Engine very well among us internally. This definitely helped propel us forward as we progressed with development. 
 
Did the development team have any favorite Unreal Engine tools or features?
 
Hamaguchi:
This isn’t a tool or feature per se, but incorporating Unreal Engine gave us experiences that differed from what we’ve done thus far.
 
When developing AAA FINAL FANTASY titles, it had been the norm to use development engines produced internally, so it was necessary for us to keep in mind the learning period whenever new staff members joined the team. However, using Unreal Engine, which generally can be utilized by many people all over, meant that there were many staff members who already had experience working with it. This led to shorter learning periods, which was a huge plus for development efficiency.
Has the team seen the Unreal Engine 5 demo? What excited your team most about next-gen?
 
Hamaguchi:
Yes, of course, we have seen it. The graphical quality was something we hadn’t been able to achieve on current-generation consumer consoles, and when I saw it, I could feel the new sparks of creativity being stirred within me as a creator. It was a feeling that filled me with excitement and anticipation. 

Additionally, we had the opportunity to work closely and develop relationships with Mr. Takayuki Kawasaki from Epic Japan and many of the other Epic staff members, which was also a very valuable experience for our development team. I believe that moving forward, powerful gaming engines like Unreal Engine 5 will continue to shine at the center of game development. 
 
Thanks for your time. Where can people learn more about FINAL FANTASY VII REMAKE?
 
Hamaguchi:
As you may know, development is underway on the next game in the project, although we’re not able to say anything more about that at this stage. For the latest information, you can follow us on Twitter or visit our official website.
 
Thank you very much for your time and for giving us this opportunity.
FINAL FANTASY VII REMAKE © 1997, 2020 SQUARE ENIX CO., LTD. All Rights Reserved.
CHARACTER DESIGN: TETSUYA NOMURA / ROBERTO FERRARI
