Image courtesy of Radiohead

Creating KID A MNESIA: an exclusive look at Radiohead's new virtual exhibit

January 13, 2022
When your record is considered the ‘best album of the 2000s’ by nearly every major music publication, the narrative is pretty locked. After decades of discussion and lovingly tended mythmaking, the album takes on a story of its own. Its parts are unpacked and consumed from every angle, sometimes in service of the music, sometimes in service of the story, until, seemingly, everything has been said.

What is a band supposed to do with such a thing—especially in the face of a major anniversary? How do you grapple with the meaning of a project, and ultimately, decide what’s left to say?

Luckily for Radiohead, the two albums in question—Kid A and Amnesiac—are enigmatic by nature and beloved, in part, for their ability to lure you in without ever really explaining what is going on. This openness makes them a wonder to explore and continues to reward listeners who want to get lost in the swirl and jitter. It also presents a unique opportunity for the artists behind it. When your work defies interpretation, or at least a singular one, you are free to recontextualize it at will, giving your fans a new window into the work and more space to roam.
Image courtesy of Radiohead
The KID A MNESIA EXHIBITION—now available for PC, Mac, and PS5—is that moment for fans: a chance to see both albums, and the art that came with them, anew for the first time in 21 years, through a new virtual museum developed by [namethemachine] & Arbitrarily Good Productions with the help of Radiohead and Epic Games. The move saved the project from near-certain doom after ideas for a physical exhibit were scrapped due to challenges both structural (e.g., shipping containers that “crashed” into the Royal Albert Hall) and pandemic-driven. Seeing only one path left, the team turned to Unreal Engine.
Flash forward to the present, and even Thom Yorke is still coming to grips with what has been created. Or as he put it, “we’ve built… something. We aren’t sure what it is.”

Which feels right. Just like the albums, the exhibit has no easy answers. It is a labyrinth that feels brutalist, alien, and other—which was always the point, even in its original conception. As players walk through its ever-changing rooms, they encounter endless voids, pencil forests, particle paintings, and all the bears, minotaurs, and stick figures Radiohead fans would expect from that era. In fact, that was also part of the point: to take Thom Yorke and Stanley Donwood’s accompanying art—of which there was a lot—and turn it into something new. At the same time, longtime producer Nigel Godrich was set loose on the albums’ original multitrack recordings to reform them, sometimes at what sounds like an almost molecular level.
Image courtesy of Radiohead
The result is captivating, layered, and sometimes even a bit eerie. We’re big fans.

To honor this work, we sat down with the team that created it to hear more about the inspiration points, intentions, and tools guiding this project. We hope you enjoy it.
 
What was the brief for this project?

Sean Evans, Creative Director:
For context, the Kid A / Amnesiac era was an especially productive period for Radiohead, and between the band, Stanley, and Nigel, huge amounts of artwork and music were created. Much of this artwork was created during a period of musical creative block, which led to Thom submerging himself in the visual side. This exhibition was to cover the output from that era, inside a forgotten alien ruin. It was to be a museum that combined a labyrinth with The Library of Babel, a place that instilled a feeling of being lost without feeling hopeless. At times, the player was supposed to feel overwhelmed. But the design would have no one correct path, and contain no dead ends. 

Chelsea Hash, Game Director at Arbitrarily Good Productions: Thom described the exhibition as “trapped in a future that never happened.”

What was the brief on how the songs were to be used? How did you go about incorporating them?
    
Sean Evans:
I remember in 2019, Nigel played me a bit of how he envisioned some of this could work. It was fascinating. It was several songs that melted into one another, at times drifting into bare elements, at times mixing material from one song to another. That was the entryway to the exhibit. Nigel had a pretty solid concept of how that should work. 

In that case, we built the structure to fit around the audio concept. Much of the audio design was meant to have the feel of a gallery and not be punching the viewer in the head. This was great, it allowed us to get into controlling the flow of the whole thing, giving it space and peaks and valleys.
Image courtesy of Radiohead
For the big space inside the pyramid, Nigel and Thom wanted a room that featured three songs: “How to Disappear Completely,” “Pyramid Song,” and “You and Whose Army?” The room we designed was MASSIVE, and based on a hexagon, so Nigel had the thought of remixing the songs in six-point surround. It’s a great effect—as the viewer moves around the space, different aspects of the song are heard. That approach was so successful that it allowed other songs and spaces to be expanded on—like “Treefingers,” which is right inside the pyramid. No multitracks existed for this song, but Nigel was able to use other elements to mix it in surround.
Image courtesy of Radiohead
Matthew Davis, Producer at [namethemachine]: The first brief I heard was this notion of “exploded songs”: there was so much material, both in the records and on the floor, that blasting everything open into its component parts and laying it out in some way was not only true to the spirit of the material, but essential to building an experience from.

How was lighting design used to convey emotions or put someone in a specific mood as they went through the experience? 

Sean Evans:
We didn’t want it lit like a traditional museum: more like some long-forgotten shrine. We achieved this with a mix of traditional lights, projections, and illuminated surfaces. The projection tech was developed by our team so it wouldn’t be crazy expensive and could be used on a vast number of surfaces in one room. Very effective. Same with lit artwork. It’s great to walk down a dark hallway and have the art be what guides the way.

Another piece of custom lighting Carlos Garcias [Senior Technical Artist, Arbitrarily Good Productions] implemented was having video screens cast light. You can see this in The Televisions room or the Pixel Warehouse, where the big screens are what is dynamically illuminating the scene. 

We also used lighting for gear shifts. For instance, when you are in the Pyramid, everything is lit and the lighting goes on into infinity. But as you progress, you enter darker and darker rooms: first the Green Phosphor—a void-like room with green dot-matrix writing as walls—then the Empty Basement, which is pitch black and a massive void. And all of that lighting flow is echoed in the sound design.
Image courtesy of Radiohead
How did you approach the level design-like aspect of this? How did you build your map? How did you make it flow?

Brett Lajzer, Technical Director, Arbitrarily Good Productions:
Many of the rooms and spaces in the exhibition were meant to occupy the same physical space, as they were part of a real-life building, even though some of them could never physically exist at all. We solved the problems and constraints that this placed upon us with a handful of tools, including a robust streaming system, portals, and teleporters.

Very early on, we developed a system that would only load rooms around the player. This made it possible to have overlapping rooms, as long as both didn’t need to be loaded at the same time. The portals and teleporters enabled us to make physically disconnected spaces seem continuous. The Pyramid (inside and out) and the winding gallery are the two areas that use this most heavily: they’re what make it possible for the gallery to seem infinite.
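The streaming rule Brett describes—only rooms near the player stay loaded, which is what lets two rooms share the same world-space—can be sketched in a few lines. This is a minimal illustration, not the exhibition's code; the room names, positions, and radii are hypothetical:

```python
# Minimal sketch of proximity-based room streaming, in the spirit of the
# system described above. Room names, positions, and radii are hypothetical.
from dataclasses import dataclass

@dataclass
class Room:
    name: str
    center: tuple   # (x, y) world position of the room's streaming anchor
    radius: float   # streaming distance: load the room inside this range

def _dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def rooms_to_load(rooms, player_pos):
    """Only rooms near the player stay resident. Two rooms may occupy the
    same world space as long as the layout (portals, teleporters) guarantees
    the player is never within streaming distance of both at once."""
    return {r.name for r in rooms if _dist(r.center, player_pos) <= r.radius}
```

The portals and teleporters then do the rest: by relocating the player between physically disconnected regions, they ensure overlapping rooms never satisfy the load condition simultaneously.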

Chelsea Hash: Ambitious prototype previsualization and grey-box testing in Unreal Engine was developed by Sean in a close loop with the band’s creative group. This allowed us to set up a process where we could confidently start environment polish on the most known spaces very early in production. Knowing the balance and pacing between the ambitious abstract spaces and the more traditional ones allowed us to solve technical problems with content. Long hallways could become canvases for graffiti, or be darkened for effect, or filled with music and people. The exhibition evolved from a concrete warehouse and then increasingly became encrusted, concealed, crowded, and isolated, using every tool in our toolkit to evoke a feeling in the player.
Image courtesy of Radiohead
How was color used to evoke certain feelings/change a space?

Sean Evans:
Chelsea and I were very deliberate about using color as visual waypoints. The viewer would start to recognize where they were and build familiarity because different rooms have different color schemes. This also worked hand in hand with lighting. The whole exhibition has this thought process built in from the very start—all of the schemes are paced and mapped out.

What were the inspiration points for the different rooms?

Sean Evans:
Every room has a theme, even if it is a very abstract one. As I said before, the overall map was influenced by labyrinths, and by drawings and writings about The Library of Babel. That’s where the hexagons come from. The Library of Babel is a fictional space composed solely of equally sized hexagonal rooms with no connecting hallways. We needed long corridors, so we didn’t stick very strictly to that idea.

The majority of the rooms were designed around the artwork they would feature, which was all based on curation done by Thom, Dan, and Christine Jones. The rooms on the first level—before entering the pyramid—are meant to feel more like physically possible spaces. Once inside the pyramid, the spaces are meant to feel more void-like, more impossible.
Image courtesy of Radiohead
Some rooms mutated as development went on, like the Ghost Chamber. We had a version of this in our initial proof-of-concept demo, but wow, how it changed. The addition of the chorus of robot Thom voices and the animated ghosts brought the room to life. Some rooms were more directly based on the source art, like The Paper Chamber. The art to be featured there consisted of tons of sketchbook pages and scribbles and lyric sheets. Basically: let’s cover all the walls, floors, and ceiling in paper. A similar thought was followed for The Televisions room. We had a load of short videos to play, so: let’s stack old TVs and VCRs from floor to ceiling. The Paper Chamber and The Televisions are sort of mirrors of one another.

The inside of the Pyramid is an interesting space; it’s based on Infernal Geometry. That’s this weird idea Dan told us about, where a room keeps shifting how many walls it has in a way that drives its occupant mad. It’s subtle the way that manifested, but each time the video switches inside the Pyramid, the number of walls changes. Once the viewer exits the elevator, they are meant to land in a more traditional museum space. The Rotunda is an infinite gallery that contains all of the artwork from the exhibition. It’s an index of sorts. It’s a beautiful, peaceful space that was based on the Rundetaarn in Copenhagen. The room that follows, the Landscape Gallery, was based on a Monet room at the Musée de l'Orangerie in Paris.
Image courtesy of Radiohead
How much was created in Unreal Engine? What was created outside of it and blended?

Chelsea Hash:
While the environment and characters were developed using traditional DCC tools like Maya, all level design iteration and level sequences were created inside Unreal Engine.

What was the character creation process like? 

Chelsea Hash:
We started with a brief built on the common themes and character representations in Radiohead’s catalog of materials. The team then developed an experimental combination of animations, rendering, and reactivity, so that a large cast of characters could reflect what was going on in the space. It was important that they feel varied without being random. 

We developed them using skeleton sharing, which allowed core behaviors to be shared amongst assets, giving designers a way to quickly configure options that would match the persona of the space. Characters moved along curated paths and a spawning system kept track of how many hand-placed NPCs would reveal themselves in a given area. This ensured we could preserve the intended tone, as players continued to experience new faces in new places, either by moving between rooms or different play-throughs.
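The reveal budget Chelsea describes—capping how many hand-placed NPCs appear in an area so the tone stays controlled while play-throughs feel varied—could look something like this sketch. The function name, cap, and seed are illustrative assumptions, not the project's actual system:

```python
# Hypothetical sketch of a per-area reveal budget for hand-placed NPCs.
# The cap and seeding behavior are illustrative assumptions.
import random

def reveal_npcs(placed, cap, rng=None):
    """Show at most `cap` of the hand-placed NPCs on this visit, so the
    same area can surface different faces across rooms and play-throughs
    while the overall density (and intended tone) stays under control."""
    rng = rng or random.Random()
    if len(placed) <= cap:
        return list(placed)
    return rng.sample(placed, cap)
```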

Devon Chapman, Associate Producer, Arbitrarily Good Productions: Creating the characters for this project was a very exploratory process. The goal was to try and channel the creative energy of Thom and Dan during this era into something that fit this new abstract environment. We took inspiration from the sketches and tried to find ways they would make sense in 3D and evoke an emotional reaction from the player: something new, while also familiar to fans who recognize the Radiohead stickmen, bears, and minotaurs.
Image courtesy of Radiohead
There’s some great particle work in this exhibition, especially in the pathway to the woods and the "How to Disappear Completely" sequence. Can you tell us about the creation process?

Sean Evans:
I had this notion of how I wanted this to work, but I’m not a Niagara guy. I got it working outside of Unreal with a different particle system, but had no idea how to get it to fly in-engine. I finally figured out how to import a still of my particle system as LIDAR data and used that inside Unreal to show pacing and scale to Chelsea and the Niagara ninjas.

However, the Commodore room terminal text was made with a custom Niagara system that could loop and process random word corruptions.

The LIDAR animated painting sequence, which is set to “How To Disappear Completely,” was all done in Niagara. We did this by generating the particle color from the painting texture and keying the noise, velocity, and displacement properties from the color texture itself. These values were all controlled through Sequencer.
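Conceptually, "keying particle properties from the texture" means each pixel spawns a particle whose motion parameters are functions of that pixel's color. This sketch is only illustrative—the specific mappings (red channel to velocity, brightness to displacement) are assumptions standing in for whatever curves the actual Niagara system used:

```python
# Illustrative sketch of driving particle parameters from a texture's
# colors, in the spirit of the Niagara setup described above. The mapping
# functions here are assumptions, not the project's actual curves.

def color_to_particle(r, g, b, x, y):
    """Spawn one particle per pixel (values in 0..1): the red channel
    keys velocity and overall brightness keys displacement."""
    brightness = (r + g + b) / 3.0
    return {
        "pos": (x, y),
        "color": (r, g, b),
        "velocity": r * 2.0,          # red channel -> speed
        "displacement": brightness,   # brighter pixels drift further
    }

def particles_from_image(pixels):
    """pixels: 2D list (rows) of (r, g, b) tuples."""
    return [color_to_particle(*pixels[y][x], x, y)
            for y in range(len(pixels)) for x in range(len(pixels[0]))]
```

In the real setup, Sequencer would then animate the equivalent of these mapping strengths over time to choreograph the painting's dissolve and reform.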
Image courtesy of Radiohead
You’ve touched on this a bit before, but there are so many incredible rooms in this exhibit. Can you walk us through a few? 

For instance, how did you create the Pixel Warehouse? For anyone that hasn’t seen it, it has this great image-burst effect that seems to stop time.

Sean Evans:
This room was a last-minute add. It’s an answer to the Pyramid Atrium, which is bright and red, features daylight, and has both greenery and trees. The Pixel Warehouse, on the other hand, is cold, dark, and synthetic. Because it was a last-minute add, I was trying to come up with an effect that would be relatively simple to implement, but have a big impact on the space. Something that would really change the room. The first take that the environment team did on the blockout blew me away!
Image courtesy of Radiohead
This one was really fun because of how fast it came together. Because it was a lightweight setup (done all in Blueprint, using video playback controls we’d already assembled for other rooms), there was more room to tweak the implementation. It’s one of the reasons why it’s one of the more interactive experiences. The early pre-viz rooms were born from a more filmic standpoint, but this room was born from the game side.

How about The Televisions room? How were the videos chosen/incorporated?

Sean Evans:
The space was built around the concept of the videos. When Kid A and Amnesiac were originally released, the band put out a ton of these short video blips. Some were done by Shynola, some by Chris Bran, and some were excerpts from webcasts at the time. This room is a lift from when the exhibit was being planned as a real space. It’s based on a junk shop, and is one of those rooms that probably shouldn’t be possible inside Unreal.

Ultimately, we wanted to make a room that worked on the macro and the micro. This is another one that the environment team and engineers absolutely killed. It’s great, you can stand back and watch the room as a grid or get right up close and see the pixels on the televisions.
Brett Lajzer: The biggest challenge with this room was figuring out how to have far more videos playing than would be reasonable on most systems—in this case, 72 different “streams” of video. The system we came up with involved pre-compositing the videos together into a grid, as a single 4K video file that could be mapped to the TVs around the room and played without severely affecting performance. The system allows for multiple different mappings, which is what enables each screen to have its own content, or a given blip to be spread over multiple screens.
Chelsea Hash: Our technical artist used the custom asset data system in Unreal Engine to allow different preset modes to be created directly in the Editor.  So for the modes that featured a combination of videos spanning multiple monitors, that data was baked into material UV offsets. This allowed the environment team to just deck the TVs in the most evocative, art-directed way and trust the CRT video system and creative direction to do the mapping.  The other options would have involved a more procedural TV array stacking, or a more typical data-driven setup.
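The core of this trick—one composited video, many screens—comes down to UV arithmetic: each TV's material offsets and scales its texture coordinates to sample just one cell of the grid, or a rectangular block of cells for a blip spanning multiple screens. A minimal sketch, assuming a simple row-major grid layout (the actual grid dimensions and mapping data format are not specified in the interview):

```python
# Sketch of mapping a pre-composited video grid onto individual screens
# via UV offsets, assuming a row-major layout. Grid dimensions (e.g.
# 9x8 = 72 cells in one 4K file) are an assumption for illustration.

def cell_uv(index, cols, rows):
    """UV offset and scale for one cell of the composite; applying these
    to a screen's material samples just that cell of the big video."""
    col, row = index % cols, index // cols
    scale = (1.0 / cols, 1.0 / rows)
    offset = (col * scale[0], row * scale[1])
    return offset, scale

def span_uv(start_index, span_cols, span_rows, cols, rows):
    """Spread one blip over a rectangular block of screens by scaling
    the sampled region up to cover multiple cells."""
    offset, cell = cell_uv(start_index, cols, rows)
    return offset, (cell[0] * span_cols, cell[1] * span_rows)
```

Baking these offsets into per-screen material data, as Chelsea describes, lets the environment team place TVs purely for composition and leave the video routing to the system.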
    
And The Paper Chamber?

Chelsea Hash: The Paper Chamber is interestingly not Niagara; it’s all implemented in-shader, with the geometry processed through Houdini to give each page pivots and UV data. The amount of tight creative direction for the UVs indicated a mesh-based approach.
 
Image courtesy of Radiohead

The complexity of baking paper-like turbulence and uneven fly-away paper edges made for a huge material parameter collection and material instance parameter setup. So the first step was making it look good, the second was making it controllable, and the third was making it sequenceable.

Separately, the presentation of the art itself was refined to make sure that it could be randomized and show different art every time. To do this, we mapped the pages to 4K atlases that stayed performant without looking pixelated, even in the floor-to-ceiling mode; we also unmirrored the backside of the pages and composited everything onto high-quality PBR paper texture sets.

Making sure that all those controls could be simplified and boiled down into something art directable and controllable was a key part of the process.
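The randomization Chelsea mentions—each page drawing different art from a shared atlas—boils down to handing every page a tile offset into that atlas. A hedged sketch (the atlas layout, tile count, and selection scheme are all assumptions):

```python
# Hypothetical sketch of randomizing which piece of art each paper page
# displays, by assigning every page a tile offset into a shared texture
# atlas. Atlas layout and selection scheme are illustrative assumptions.
import random

def assign_page_art(num_pages, atlas_cols, atlas_rows, rng=None):
    """Return one (u_offset, v_offset) per page, so a single paper
    material can show different artwork on every page and every visit."""
    rng = rng or random.Random()
    tiles = [(c / atlas_cols, r / atlas_rows)
             for r in range(atlas_rows) for c in range(atlas_cols)]
    return [rng.choice(tiles) for _ in range(num_pages)]
```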
Matthew Davis: The Paper Chamber was one of the more fun audio challenges, as we had to use almost every technique employed elsewhere in the experience, all in one room. So much of the audio-visual experience relies on a mixture of diegetic and nondiegetic sounds, with the music often being a combination of both, which helps create the feeling of being immersed in these compositions.

The idea for this room was that the melodic components of “In Limbo” represented the ‘constructed’ room, with the rhythm section and reverbs representing the ‘deconstructed,’ empty room. So we had to make sure that compositionally and mix-wise everything gelled as the room dynamically blew away and reformed over an approximately fifteen-minute procedural sequence, essentially crossfading between the two musical states while a windy tornado of papers swirled around the room. We programmed real-time parameter controls (RTPCs) in Wwise and had those driven by how deconstructed the room was at that time, as well as the direction of the wind. The same RTPCs also drove a bin of SFX (papers flapping, wind, etc.) that had various states such as idle, blowing away, and reforming. The result is a storm of constantly changing sound.
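At its heart, the RTPC-driven crossfade Matthew describes maps one game parameter (how deconstructed the room is) onto two bus gains. This sketch uses an equal-power curve as a stand-in; the actual gain curves and parameter names in the Wwise project are not described in the interview:

```python
# Non-Wwise sketch of the RTPC-style crossfade between the 'constructed'
# and 'deconstructed' mixes of "In Limbo". The equal-power gain curve is
# an assumption, standing in for whatever curves the real project used.
import math

def crossfade_gains(deconstruction):
    """deconstruction in 0..1 drives two bus gains: the melodic
    ('constructed') layer fades out as the room blows apart while the
    rhythm/reverb ('deconstructed') layer fades in, keeping the total
    perceived power roughly constant throughout the transition."""
    t = max(0.0, min(1.0, deconstruction))
    return math.cos(t * math.pi / 2), math.sin(t * math.pi / 2)
```

In the shipped exhibition the same parameter (plus wind direction) also drove the paper-flapping and wind SFX, so one value animates both music and effects coherently.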

And lastly, how about that amazing zoetrope work that informs the “You and Whose Army?” sequence?
 
Image courtesy of Radiohead

Sean Evans: When we were making the demo, I had this idea that the inside of the Pyramid would be some sort of a zoetrope. But we ran out of time to even think about something that complex, so I shelved it. 

When it came time to develop the Empty Basement Sequence, we were talking about using the torus shape for “Pyramid Song,” and about things that happen in circular formations. Stanley mentioned zoetropes, how great they were, and what weird old technology they were. I leapt at it and dove in. I mocked up a 3D zoetrope concept outside of Unreal that worked by taking frames of custom mocap and dividing them across instances in a radial array. It was clunky, but the effect worked. I made enough to show a rough cut to Chelsea, and they developed a Blueprint inside of Unreal that took the idea to another level. It provided all sorts of abilities: you could swap out animations, swap out meshes, animate scales, and change all sorts of other variables. The rest was based on 3D characters developed from the artwork, and of course, the music. That bleak feeling of being overpowered, but never giving up and fighting back. It’s such a great song.
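The radial-array idea Sean describes has a simple geometric core: bake N animation frames, then place one instance per frame evenly around a ring, so spinning the ring (or strobing the view) plays the motion back. A minimal sketch—radius and frame count are illustrative, and the shipped Blueprint exposed far more variables (mesh swaps, animated scales, and so on):

```python
# Rough sketch of the 3D zoetrope layout: N baked animation frames placed
# as instances evenly around a ring. Radius and frame count here are
# illustrative, not the exhibition's actual values.
import math

def zoetrope_transforms(num_frames, radius):
    """Return (frame_index, (x, y), yaw_degrees) for each instance in the
    radial array; frame i sits at angle i/N of a full turn, facing outward."""
    placements = []
    for i in range(num_frames):
        angle = 2.0 * math.pi * i / num_frames
        pos = (radius * math.cos(angle), radius * math.sin(angle))
        placements.append((i, pos, math.degrees(angle)))
    return placements
```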

If you’d like to explore the KID A MNESIA EXHIBITION for yourself, it is available now for free via the Epic Games Store and PlayStation 5.
