Next-gen gaming: tech's take on gaming's future

By Brian Crecente | April 8, 2021
EDITOR'S NOTE: Guest author Brian Crecente founded video gaming site Kotaku and co-founded Polygon. He was also the video games editor for Rolling Stone and for Variety. He currently consults for publishers and the video game industry at Pad and Pixel.
While software and engine innovations drive the latest video games, hardware and its advancing technology also define next-gen.
 
The most significant leaps in game development arrive on the back of new tech — like physics chips, improvements in audio hardware, and, most recently, the widespread use of solid-state drives.

The solid-state drive, in particular, is a great example of the immediate impact a pivotal technology shift can have on video game development.

This is most obviously seen in the launch of the PlayStation 5, which features an SSD customized for high-speed data streaming. That means it can access data from the SSD and place it directly into a specific place in memory at record speeds. This leap in the console's input-output system removes the huge latency that historically existed between hard drive and processor, a gap that had become a limiting factor.
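To put that I/O leap in rough numbers, the difference is easy to sketch. This is a back-of-the-envelope calculation with illustrative figures, not a benchmark: Sony quotes roughly 5.5 GB/s of raw throughput for the PS5's SSD, while a typical 7,200 RPM hard drive sustains on the order of 125 MB/s before seek overhead.

```python
# Rough load-time comparison for a 2 GB streaming scene.
# Throughput figures are illustrative, not measured benchmarks.

def load_time_seconds(data_gb: float, throughput_gb_per_s: float) -> float:
    """Idealized transfer time, ignoring decompression and seek overhead."""
    return data_gb / throughput_gb_per_s

SCENE_GB = 2.0
HDD_GBPS = 0.125     # ~125 MB/s sustained, typical 7,200 RPM drive
PS5_SSD_GBPS = 5.5   # Sony's quoted raw throughput

hdd = load_time_seconds(SCENE_GB, HDD_GBPS)      # ~16 s
ssd = load_time_seconds(SCENE_GB, PS5_SSD_GBPS)  # ~0.36 s
print(f"HDD: {hdd:.1f} s, PS5 SSD: {ssd:.2f} s, speedup: {hdd / ssd:.0f}x")
```

Real load times also depend on decompression, seek patterns, and engine overhead, so treat the ratio as directional rather than exact.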

You need only look to Unreal Engine’s Nanite micropolygon geometry tool to see just how much that I/O speed increase can improve the development landscape.

Nanite takes a high-detail source mesh and breaks it down into millions or billions of scalable triangles. Then, the software streams only the data the camera can see (anything back-facing or blocked by another object is culled by a highly accurate algorithm) from the high-speed SSD directly into the console's memory.
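The core idea can be sketched in a few lines. To be clear, this is a toy illustration, not Nanite's actual implementation, which is GPU-driven and hierarchical; every name below is hypothetical, and the culling tests are stubbed out as precomputed flags.

```python
# Toy sketch of visibility-driven streaming: only clusters of triangles
# that face the camera and survive an occlusion test are pulled from fast
# storage into memory. All names are illustrative; Nanite's real pipeline
# is far more sophisticated.
from dataclasses import dataclass

@dataclass
class Cluster:
    id: int
    facing_camera: bool  # stub for a backface/normal-cone test
    occluded: bool       # stub for an occlusion test

def visible_clusters(clusters):
    """Keep only clusters the camera can actually see."""
    return [c for c in clusters if c.facing_camera and not c.occluded]

def stream_from_ssd(cluster_ids):
    """Stand-in for a fast SSD read: returns geometry payloads by id."""
    return {cid: f"triangles-for-cluster-{cid}" for cid in cluster_ids}

scene = [Cluster(0, True, False), Cluster(1, False, False), Cluster(2, True, True)]
needed = [c.id for c in visible_clusters(scene)]
resident = stream_from_ssd(needed)  # only cluster 0 is loaded into memory
print(resident)
```

The payoff is that memory only ever holds what the camera needs this frame, which is exactly why the storage has to be fast enough to swap clusters in and out on demand.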

“This removes the need for multiple level of detail versions of a 3D model,” said Epic Games’ Sjoerd De Jong. “But for it to work, the data needs to be accessible very quickly and also to remove it just as quickly when it is no longer needed. The super fast speeds of the SSD is allowing for content to be near instantly loaded. Over the years, the main focus for graphics has been on 3D capabilities, but that was beginning to be held back by old mechanical hard drive speeds. With SSDs now becoming the norm, that is causing a shift in graphic capabilities driven — for the first time, perhaps in the history of 3D gaming, by hard drives.”

The solid-state drive is just one of myriad ways hardware advancements are empowering game makers and players. We asked Alienware, AMD, Intel, Logitech, and NVIDIA, companies that have each weathered multiple generational shifts over the years, to share their thoughts on what comes next.

Alienware

The constant push for delivering more immersive gaming experiences relies on a complex dialog that computers have with a player's eyes, ears, and fingertips, said Alienware Messaging Lead Eddy Goyanes.

"This means that display technologies will aim to deliver even smoother action and richer graphics," he said.

Take, for instance, the company's decision in 2018 to start offering four distinct panel technologies for its laptops. The display options now range from full-high-definition screens at 60Hz, 144Hz, 240Hz, or 360Hz refresh rates to an ultra-high-definition OLED panel.

"We have also seen developments in the interfaces where our hands touch the PC – this includes everything from touchpads to keyboards to mice," Goyanes said. "They're all taking on improvements specific to performance and ergonomics. This is all intentional for a variety of reasons that may include: giving gamers an edge while in-game, showing off the aesthetics of their setup, more keystroke travel, mouse-to-hand compliance for comfort, or even metaphorical extensions while playing their favorite games.
Alienware m15 Ryzen Edition R5
"These incremental technology improvements are not likely to pause their developments over the years to come for the purpose of providing a more engaged, rich, and fun experience."

Alienware also spends a considerable amount of time trying to identify and resolve new chokepoints for gaming.

"In some circles, they say, 'You're as strong as your weakest link.' If you take that into PC gaming, there are some analogs that can be made," he said. "We've seen it with storage. In the past decade, more gamers are including solid-state technology in their systems to reduce load times for the PC, within games, their personal files, and programs. We also see it with networking. Over the past year, Alienware gaming notebooks have begun to carry 2.5 Gigabit Ethernet connections and 802.11ax wireless. On the internet side of things, we see and hear lots of buzz surrounding 5G networks and cloud gaming. 

"These are all technologies striving to deliver content faster and eliminating bottlenecks in loading and delivery of content, but it's hard to say which ones will fully mature and which ones will last the longest at this time."

AMD

As a provider of both CPUs and GPUs, AMD is in a unique position to identify hardware needs and help shape the direction they take in the future.

Video games are on the cusp of becoming more realistic in the coming generations thanks to a combination of higher-fidelity visuals, more powerful hardware, and new, innovative effects, according to Scott Herkelman, Corporate Vice President and General Manager of AMD's Graphics Business Unit.

“Ray tracing, for example, enables stunning, more life-like visuals and is now available across multiple gaming platforms powered by AMD CPUs and AMD RDNA 2 architecture-based GPUs – including PCs, laptops, and the Xbox Series X and PlayStation 5 consoles,” Herkelman said. 

“At the same time, other technologies will make it easier than ever for developers to integrate amazing capabilities into future games, and AMD is working closely with developers like Epic to this end.”

One such example is how the DirectX 12 Ultimate API streamlines the development process by allowing developers to create games using the same common graphics API and graphics architecture for both PCs and Xbox Series X consoles powered by AMD RDNA 2 graphics. This enables developers to bring advanced effects like ray tracing to more games sooner. Additional DirectX 12 Ultimate features such as Variable Rate Shading (VRS), Mesh Shaders and Sampler Feedback will also allow developers to create more immersive gaming environments, he noted. Herkelman added that AMD's RDNA 2 GPUs will deliver new levels of performance to power the next-generation of demanding games.
Godfall screenshot from a PC running AMD Ryzen processor and AMD Radeon RX graphics card
“In addition,” he said, “collaboration with other technology providers across the ecosystem will ensure the best possible end-to-end gaming experiences, such as working with display manufacturers to incorporate AMD FreeSync technology to deliver fluid, stutter-free gaming visuals on next-generation displays.”

But that next-gen push toward higher fidelity and true-to-life visuals and effects comes at a cost, both in terms of CPU cores and higher graphics performance.

"Ray tracing, for example, carries a heavy performance penalty compared to traditional rasterization techniques," Herkelman said, adding that AMD provides both CPUs and GPUs, putting the company in a unique position to work with the gaming ecosystem on improving visual experiences.

The results of that knowledge are new technologies like AMD SmartShift, which optimizes performance by automatically shifting power between AMD CPUs and GPUs, and AMD Smart Access Memory, which gives the CPU full access to high-speed GPU VRAM.

“Because of the increased level of visual fidelity, games will naturally become larger in size, requiring more space for high-resolution textures and possibly causing longer load times,” Herkelman said. “To mitigate this potential bottleneck, the new generation of gaming PCs and consoles will utilize speedy PCIe 4.0 storage, which will significantly reduce load times.”

The company is also working on improvements meant to target quality-of-life issues.

“There are also a host of other technologies that will make games more fluid, immersive and responsive, such as the AMD FidelityFX developer toolkit that provides sharper, crystal clear visuals, and Radeon Anti-Lag that reduces the delay between a mouse or keyboard click and the resulting on-screen response – providing a competitive advantage in esports and other games,” he said.

Taken together, Herkelman said, these advances will deliver a more immersive, awe-inspiring experience where load times become a thing of the past.

"All of these improvements and leaps in technology will result in a streamlined gaming experience with less waiting, more exciting effects, higher performance, and incredible visual fidelity."

Intel

Intel’s core belief is that the PC, as the leading open, versatile, high-performance gaming platform, continues to be the best place for innovation to drive the gaming experience forward, said Kim Pallister, general manager of gaming solutions at Intel Corporation. The company believes the coming decade will see massive advances in gaming experiences, led by a couple of key underlying trends.

Increased availability of cloud compute will make games better, he said.

“Especially exciting are those that will make use of both a powerful client and powerful compute and storage in the cloud,” he said.

This will allow developers to devote significantly more back-end compute to growing the size and concurrent populations of the worlds they create, while also ramping up the degree of interaction possible in those worlds.

“Server-based physics is a common bottleneck that has great potential for even more parallelism, which will lead to finer and more realistic interactions amongst larger, richer, more populated worlds,” he added. “The huge multi-user events that Epic has hosted in Fortnite, or how Microsoft is streaming highly detailed worlds in Flight Simulator – these are only the beginning. Cloud will also help game creators, who will use data centers in development workflows that rely on machine learning to enhance asset quality and variability, and even NPCs' AI and animation (for example, DeepMotion). With huge amounts of server-side power, researchers are now unleashed to develop and easily deploy more and more algorithm-based improvements to tackle all facets of the user experience (e.g. denoising and supersampling).”

Another big trend Intel is following is the continued improvement in underlying silicon compute power, efficiency, and parallelism.

Pallister said this will continue to create opportunities for developers to enhance client and server computation, with a steady stream of advancements in compilers and tools being released to bring more power and efficiency to all types of games. He points to examples like the work the company is doing with Epic, using ISPC to get more performance out of Chaos Physics.
Image courtesy of Microsoft's Flight Simulator
“Fundamentally, power is the limiter across so many aspects of the gaming experience - whether it's the data center power consumption costs or the thermal limits of a laptop – we’re focused on making that range of power/performance across Intel products easy for developers to access,” he said. “Also, combatting the rising cost of game development in human terms (for example, artist's productivity) will rely more and more on cloud and machine learning approaches, so we look at AI and tools like Nanite to deliver large boosts to today's workflows.”

Finally, as gaming continues to push technology forward, Intel believes we will see increasing sophistication and a broadening of how platforms meet users' needs across the whole user experience.

One example of how this has already happened is how an early focus on frame rates broadened to include things like latency and load times. 

“We know that not all gamers have the same needs and desires, and we see the PC industry continuing to adapt and evolve a slew of solutions - from high performance thin and light notebooks to beefy laptops that might be 'esports' focused with 1440/240Hz displays, or might go another direction with 4K HDR, or another with a focus on battery life using advances in CPU<->GPU power sharing,” Pallister said. “Certainly the growing ubiquity of SSDs and decompression speed of data from those SSDs will foster an expectation of 'instant loading' and rich environments streamed (like Nanite). Over this coming decade, you should expect continued leaps in technology, especially in the competitive PC world where open standards will drive advancement faster than in the walled gardens of some gaming ecosystems.

“We believe this focus on the entirety of the user experience will broaden – from in-game elements and quality to communication and streaming outside the game, the use of computing performance to power robust solutions for anti-toxicity, better voice and video, and easier sharing and streaming will lead to much more enjoyable, high-quality online experiences.”

Logitech

While innovations in computation and graphical fidelity are often the most talked-about leaps in gaming technology, Logitech has a different perspective. The company views this generation's push for customization of peripherals such as mice, keyboards, and controllers as the most impactful.

"The accessibility controller was a huge leap in being able to democratize gaming and make it available for all," said Curtis Brown, Logitech's Strategic Partnership Manager.

The Logitech G Adaptive Gaming Kit, which is compatible with Microsoft's Xbox Adaptive Controller, provides an assortment of buttons and switches, which allows someone to create a controller best suited to their personal needs.

"Now, there is an entirely new group of people who have the ability to play games the way they want to," Brown said. "A lot of us are focused on the way we play games whether we're color blind, hard of hearing or don't have motor skills. This was one of the first big shots from a major company to go out and change the narrative to say everyone should be able to play and engage."
Logitech's G Adaptive Gaming Kit
Logitech also spent the past few years reexamining its approach to peripherals. The result was a decision to redesign many of its products to ensure they are task and purpose-built. That means everything from ensuring the company has mice that fit every size of hand to developing a wider range of keyboards suited for anything from everyday use to minimal pro esports boards.

Brown believes the coming wave of next-gen peripherals will see a proliferation of faster, more accurate mouse sensors; keyboard switches that offer a wider variety of tactile experiences; and a continuing rise of esports that pushes the requirements of high-end peripherals across the board.

For the longest time, the biggest choke points for a company like Logitech all had to do with input/output interactions with a computer. Now that USB-C is starting to take hold, that is opening up a lot of possibilities for devices that require more power and faster connection speeds, Brown said. He also noted that wireless devices are going to continue to evolve and perhaps eventually supplant wired devices.

NVIDIA

Ray tracing is the biggest leap in game development since DirectX 9 pixel shading in 2002, said Brian Burke, NVIDIA's Global PR Principal for Gaming Technology.

"It is changing the way games look, how they are played, and even how they are developed," he said. "NVIDIA pioneered the use of real-time ray tracing in games with the release of GeForce RTX GPUs in 2018 when we added dedicated RT Cores to accelerate ray tracing and make it possible to do in real-time for games. The world's top game franchises have added it, the most popular game engines and APIs are supporting it, along with dozens of content creation and design applications. Even consoles support ray tracing now."

And, he added, the game industry has just begun to truly explore what ray tracing can mean to games.

"The image quality aspect is a given with improvements to lighting, shadows, reflections, and more," Burke said. "But, it can also change gameplay." 

"We have already seen the release of the first game that uses ray tracing in core gameplay: Stay in the Light. It uses ray tracing for the reflections in a mirror that stops the monster's advancement as you navigate randomly generated dungeons. It is hard to bake in reflections to a randomly generated map, so ray tracing is used. Now expand that concept to other game genres. Shadows that give away enemy position and reflections that show enemy movement in first-person shooters. Imagine what ray-traced shadows and global illumination will do for horror games."

Burke also points to AI as a technology category that is quickly changing everything we do. Gaming, he notes, is no exception.

"AI is going to revolutionize gaming from rendering, physics, animation, and even broadcasting," he said. "NVIDIA's DLSS technology is an initial example of real-time AI in games. It renders a game at lower resolution then uses dedicated AI hardware on NVIDIA RTX GPUs (called Tensor Cores) to upscale the game to native resolution. The result is image quality comparable to native resolution but with a 30 to 70% jump in performance. DLSS has proven to be an invaluable companion to ray tracing because it gives you the performance headroom to maximize ray tracing settings, increase output resolution, or even extend laptop battery life. DLSS has been featured in games such as Cyberpunk 2077, Death Stranding, Control, Minecraft with RTX, and Fortnite."
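The performance headroom Burke describes follows from simple pixel arithmetic: shading cost scales roughly with the number of pixels rendered. This back-of-the-envelope sketch uses an internal 1440p render for a 4K output as an illustrative case; real frame-time savings depend on how shader-bound the game is and on the cost of the upscale itself.

```python
# Pixel-count arithmetic behind render-low, upscale-high approaches.
# Assumes shading cost scales with pixel count; actual savings vary.

def pixels(width: int, height: int) -> int:
    return width * height

native_4k = pixels(3840, 2160)  # output resolution
internal = pixels(2560, 1440)   # illustrative internal render resolution

fraction_shaded = internal / native_4k
print(f"Pixels shaded vs native: {fraction_shaded:.0%}")  # ~44%
```

Shading well under half the pixels, then reconstructing the rest, is where the headroom for higher ray tracing settings or longer battery life comes from.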

The move to ray tracing and "increasingly beautiful pixels," Burke said, puts an immense demand on the shader cores inside the GPU.

"To handle this, NVIDIA RTX GPUs have added dedicated RT Cores for ray tracing, as well as created smart AI technologies like DLSS to render fewer pixels while still achieving the desired resolution and image quality."

Finally, Burke noted that ray tracing is going to have a major impact on quality-of-life issues for developers.

"Traditional rasterized graphics require light baking and other tricks to achieve shadows, reflections, and lighting that approach realism," he said. "With ray tracing, light is accurately simulated in the environment, eliminating the need for pre-baking and special tricks to get a realistic look. That will result in major time and quality of life improvements for game developers.”
