Generative game helps students guide downtown redesign

March 19, 2021
Changing any part of a city is sure to spur debate, especially when it involves an area many see as common ground. Plans and illustrations can start a conversation, but where do you go when the public doesn’t know how to describe the changes they want? Or what they are willing to accept? This communication breakdown is a classic challenge of municipal design, and the bane of city councils that want to do right by their residents.

In Watertown, South Dakota, the community was greeted with something different when it came time to discuss their own downtown redesign—a set of real-time games. Instead of searching for the right word, residents could suddenly voice their opinions with a slider as they manipulated different parts of a 3D building.
 

For anyone in the know, it was like the skies had parted. The dreaded language barrier had been breached. The insights were flowing, and professor Fang Xu wasn’t surprised in the least. In fact, it confirmed what he had suspected for years: there’s a bright future in generative design.
Image courtesy of Fang Xu, Chris Simmons, Tolulope Oyeniyi

A new way

Like most visions, something had been driving Xu long before he found his inspiration. As an architecture professor at South Dakota State University, Xu had always noticed a gap between what students are taught and how everyday people engage with their designs. The solution materialized online in a video depicting how game environments could be used to present new projects to clients. He immediately saw the potential.

“It was exactly what I had been looking for, but I also knew you could do so much more with it,” says Xu. “Objects and various building features could be more interactive. Auto-generation tools could produce evaluative outcomes that wouldn’t normally occur in the traditional design process. And parametric buttons and sliders could be used to make the whole process incredibly easy. I had to make one.”
Image courtesy of Fang Xu
Online research led him to Unreal Engine, a rising standard in real-time architectural visualizations. His first project was a game that helped determine optimal shading outcomes for different times of the day. Using a mix of courses, forum feedback, and the Blueprint visual scripting system, he finished the game himself, building in parametric tools without any previous coding experience.
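The article doesn’t detail the shading game’s logic, but the underlying geometry is standard: the overhang depth needed to shade a window depends on the sun’s altitude, which changes over the course of the day. A minimal sketch of that relationship (the dimensions here are hypothetical, not from Xu’s project):

```python
import math

def overhang_depth(window_height_m, sun_altitude_deg):
    """Depth of a horizontal overhang needed to fully shade a
    window below it when the sun sits at the given altitude angle."""
    return window_height_m / math.tan(math.radians(sun_altitude_deg))

# A high midday sun needs a much shallower overhang than a low
# afternoon sun -- the kind of trade-off a shading game can visualize.
print(round(overhang_depth(1.5, 60), 2))
print(round(overhang_depth(1.5, 30), 2))
```

A game can sweep `sun_altitude_deg` across the hours of a day and show, in real time, which overhang depths keep the window shaded.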

And while parametricism is nothing new to the architectural world, the push for randomization is still gaining ground. To Xu, the argument is simple. “It snaps you out of linear thinking. Usually, people can only conceptualize the idea of randomizing a few variables. But with these tools, you can randomize dozens of variables at once, which immediately opens up new possibilities and ideas—all without losing the refinement process that lets you put your mark on a design.”
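The randomization Xu describes, drawing dozens of design variables at once rather than tweaking them one at a time, is easy to sketch. A minimal Python illustration; the facade parameters and their ranges are made up for the example:

```python
import random

# Hypothetical facade parameters and allowed ranges -- illustrative only.
PARAM_RANGES = {
    "window_width_m": (0.6, 2.4),
    "window_spacing_m": (0.3, 1.5),
    "sill_height_m": (0.4, 1.2),
    "cornice_depth_m": (0.0, 0.6),
    "awning_angle_deg": (0.0, 45.0),
}

def randomize_facade(ranges, seed=None):
    """Draw a fresh random value for every parameter at once,
    producing a whole new candidate design in a single click."""
    rng = random.Random(seed)
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in ranges.items()}

candidate = randomize_facade(PARAM_RANGES, seed=42)
for name, value in candidate.items():
    print(f"{name}: {value:.2f}")
```

Each call yields a complete, unexpected combination, which is what snaps a designer out of linear thinking while the sliders remain available for refinement afterward.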
Image courtesy of Jared Mulder and Chad Umlauf
More and more, architects are grappling with the question of automation and computer-assisted decision-making in the design process. Computational design certainly has its fans and can open up innovative possibilities, like blending generative designs with Unreal Engine-based previsualization. But the idea of adding rule logic to the design experience is not without controversy, and it can ruffle architects who believe that automation robs the process of inspiration.

Xu, however, believes that there is room for both and prepares his students accordingly. Reining in randomness and sparking ideas is helpful for architects, but he also saw value in using generative games as community-outreach tools. Then the city came calling.

Game time

With a downtown redesign in the wings, city hall needed to establish guidelines for its architects. Some things were known—any update needed to feel like it belonged—but the rest was an open question. One that the community would no doubt have an opinion on. The university was tapped to gather that feedback, helping the city figure out what would be desired and tolerated before the architects got involved. It was also a perfect opportunity to put Xu’s theories to the test.
Image courtesy of Alexandra Kummer, Kaitlyn Walker
Again, he envisioned a game; only in this one, the focus would be on a customizable 3D facade of a future boutique hotel situated in the middle of a downtown street. Working in teams, students would build their own game, complete with all the sliders, menus, tools, and looks needed to help the public effectively influence the process.
Image courtesy of Fang Xu
The students were horrified. 

But Xu brought them back, explaining how he moved from square one to a working game in a matter of weeks. This visual language would feel new, but it would also help them shorten feedback loops and break through the language barrier that stifles designer-to-community conversations. They were still worried—but they were also in.

He began by giving them a rough template. He had been working on his own version of the game, so he already had a general roadmap. He then shared some of the technical prototypes he had been building in Blueprint, including his auto-generation tools and evaluative scoring method. Together, these two features would insert some thoughtfulness into the experience, elevating it from random combinations to true generative design.
Image courtesy of Fang Xu
“You always need a mechanism to help you make a decision,” says Xu. “Parametric tools, by themselves, allow you to make adjustments. But by building in evaluative criteria, you can take it to another level. You can see how decisions are rated. The game can then use that information to refine suggestions as it’s looking for the optimal outcome. It leads to better results.”
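Xu’s Blueprint implementation isn’t shown, but the mechanism he describes, generating candidates, rating each against evaluative criteria, and keeping the best-scoring result, can be sketched in a few lines of Python. All parameter names, ranges, and scoring targets below are illustrative, not from the project:

```python
import random

# Hypothetical parameter ranges -- illustrative only.
RANGES = {
    "window_width_m": (0.6, 2.4),
    "cornice_depth_m": (0.0, 0.6),
    "awning_angle_deg": (0.0, 45.0),
}

def score(design):
    """Hypothetical evaluative criteria: reward mid-sized windows and
    moderate ornament depth. A real game would encode the city's
    design guidelines here instead."""
    window_fit = 1.0 - abs(design["window_width_m"] - 1.5) / 1.5
    ornament_fit = 1.0 - abs(design["cornice_depth_m"] - 0.3) / 0.3
    return window_fit + ornament_fit

def generate_and_refine(ranges, rounds=200, seed=0):
    """The basic generate/evaluate loop behind a generative tool:
    produce random candidates, rate each one, keep the best."""
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(rounds):
        design = {k: rng.uniform(lo, hi) for k, (lo, hi) in ranges.items()}
        s = score(design)
        if s > best_score:
            best, best_score = design, s
    return best, best_score

best_design, best_rating = generate_and_refine(RANGES)
print(best_rating)
```

The scoring function is what turns random combinations into suggestions: each generation is rated, and over many rounds the tool converges toward the criteria rather than wandering aimlessly.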

Parameters remain important in generative design, however, which is why the students continued to employ them in their games. Windows, exterior materials, feature sizing, layout, and more could be adjusted easily with the sliders, all of which could be turned on or off depending on the user’s preference. Most often, students prioritized the parameters based on their research into the city and the development project, adding a touch of uniqueness to each game.

With each new bit of progress, the students began to feel more and more empowered. Many had never used Grasshopper, let alone a game engine, but here they were coding games that would have a direct impact on the future of their city. The next step was to show them off.
 

The big reveal

If there’s any word to describe the reaction, it was probably delight—or relief, depending on who you asked. In all, the community got to experience six games, each a gateway to hundreds of possibilities. As citizens cycled through the stations, students would talk through options and show them how to generate facade designs in the game. Some were immediately drawn to the auto-generation tools, treating each click like a potential lottery win. Others wanted to dial in options to make everything to their liking. But around the room, there was a common refrain: “This is so much easier than I thought.”
“We couldn’t have asked for anything more,” says Xu. “The powerful thing about communities, from an architectural perspective, is they have firsthand knowledge that can help designers build better structures and neighborhoods. Traditionally, that’s been a challenge to crack. But with game engines, we have a new way in.”
