Unreal Engine - News, Developer Interviews, Spotlights, Tech Blogs<br /> Feed containing the latest news, developer interviews, events, spotlights, and tech blogs related to Unreal. Unreal Engine 4 is a professional suite of tools and technologies used for building high-quality games and applications across a range of platforms. Unreal Engine 4’s rendering architecture enables developers to achieve stunning visuals and also scale elegantly to lower-end systems.<br /> <br /> <h1>Find out What’s Next at GDC 2020!</h1> Want a glimpse at the future of game development? With next-gen consoles just around the corner, we’re ready for What’s Next now! Don’t miss amazing partner demos, the latest technology previews, learning sessions, and more.<br /> <br /> In less than two months, this year’s <a href="" target="_blank">Game Developers Conference</a> (GDC) will kick off in San Francisco, and with the next generation of consoles on their way, we couldn’t be more excited! <br /> <br /> GDC is one of our favorite times of the year, when we get the chance to catch you up on the State of Unreal in our keynote and unveil all-new technology developments that will be coming your way down the road. It’s also a great opportunity to find out what talented teams around the world are doing to realize their creative ambitions and achieve success. And of course, our booth is the perfect venue for networking, socializing, and generally having a whole lot of fun.<br /> <br /> Please save the date for the sessions below.  <h1>Epic Games’ State of Unreal</h1> <em>Blue Shield of California Theater at YBCA, 700 Howard Street | </em><em>Wednesday, March 18 | 11:00 AM - 12:00 PM </em><br /> <br /> Be a part of What&#39;s Next! Join Epic Games and special guests as we reveal technological advancements that will open up a new generation of creative possibilities, freeing teams of all sizes to defy limits and redefine what they are able to achieve. 
Plus, get the latest news on Epic Online Services, the Epic Games Store, Epic MegaGrants, and more.<img alt="Blog_Body_Image_8.jpg" height="auto" src="" width="auto" />GDC attendees with an Expo Pass Plus or higher are invited to attend. Can&#39;t make it in person? Join our livestream at <a href="" target="_blank"></a> and get all the news as it happens.  <br /> <br /> <a class="addeventatc" data-id="sy4608124" href="" rel="nofollow" target="_blank" title="Add to calendar">Add to calendar</a> <h1>Tech Talks</h1> <em>Blue Shield of California Theater at YBCA, 700 Howard Street | </em><em>Wednesday, March 18</em><br /> <br /> Getting ready for next-gen? There’s never been a better time to get an in-depth look into how Unreal Engine can help you create stunning games that will take advantage of everything the new hardware has to offer. Don’t miss our series of Tech Talks that will provide insights into today’s powerful toolset and a glimpse at what’s around the corner. 
<br /> <br /> <strong>Unreal Engine for Next-Gen Games</strong> <br /> <em>12:30 PM  - 1:30 PM</em><br /> Take a look at new and existing Unreal Engine features designed for next-gen game development.<br /> <br /> <strong>The Future of Unreal Rendering </strong><br /> <em>2:15 PM  - 3:00 PM</em><br /> Get a sneak peek at yet-to-be-released Unreal Engine rendering technology.<br /> <br /> <strong>The Evolution of Real-Time VFX with Unreal Engine&#39;s Niagara </strong><br /> <em>3:30 PM  - 4:30 PM</em><br /> Join us for an in-depth look into the next phase of development and innovation.<br /> <br /> <strong>Building Worlds in Fortnite with Unreal Engine</strong> <br /> <em>5:00 PM  - 6:00 PM</em><br /> Learn how the Fortnite team used Unreal Engine worldbuilding tools to create Chapter 2.<img alt="Blog_Body_Image_5.jpg" height="auto" src="" width="auto" /> <h1>Expo</h1> <em>Moscone Convention Center, 747 Howard Street </em><br /> <em>Wednesday, March 18 & Thursday, March 19 | 10:00 AM - 6:00 PM<br /> Friday, March 20 | 10:00 AM - 3:00 PM</em> <h2>Unreal Engine booth, South 349</h2> Visit our booth and be the first to see the latest cutting-edge Unreal Engine tech at our <strong>demo stations</strong>, and <strong>meet the devs</strong> behind the code to get your questions answered. Plus, our <strong>community team </strong>and <strong>evangelists</strong> will be there and would love to chat about how you can better tap into our resources and get involved in our user groups.<br /> <br /> Want to get up to speed on topics like <em>Sky and Atmosphere, Niagara VFX, Chaos Physics and Gameplay, Environment Building</em>, and <em>Quixel</em> with some free training? Epic staff will be delivering rotating presentations for all three days of the expo in our <strong>Learning Theater</strong>.<br /> <img alt="Blog_Body_Image_2.jpg" height="auto" src="" width="auto" /> <h2>Games. Jobs. Beer. 
South 327</h2> Drop in at South 327, where you can check out dozens of <strong>awesome games</strong> and meet the developers behind them face-to-face, find plenty of <strong>tasty snacks</strong> and <strong>cold beer</strong>, and network for career opportunities. <img alt="Blog_Body_Image_14.jpg" height="auto" src="" width="auto" /> <h1>Additional sessions</h1> <em>Moscone Convention Center, 747 Howard Street</em><br /> <br /> But wait, there’s more! If you’re in the process of—or thinking about—moving to Unreal Engine, you won’t want to miss our crash course. Plus, find out more about cross-play from our Epic Online Services team, and get answers to your burning questions and a glimpse of the road ahead for the Epic Games Store.<br /> <br /> <strong>Epic Games Store: An Update and Q&A</strong><br /> <em>Wednesday, March 18 | 3:30 PM - 4:30 PM | West Hall, Room 2020</em><br /> Find out where the store is today, a year after launch, and learn what to expect in the future.<br /> <br /> <strong>Crash Course for Studios New to Unreal Engine</strong><br /> <em>Thursday, March 19 | 10:00 AM - 11:00 AM | West Hall, Room 3001</em><br /> Moving to Unreal Engine? Learn the best practices and essential features for success.<br /> <br /> <strong>Why Cross-Play Matters</strong><br /> <em>Thursday, March 19 | 11:30 AM - 12:30 PM | West Hall, Room 2000</em><br /> This panel session discusses the benefits and challenges of supporting cross-platform play, progression, and purchasing. <br /> <br /> <strong>Deconstructing Fortnite&#39;s Cross-Platform Experience</strong><br /> <em>Thursday, March 19  | 12:45 PM - 1:45 PM | West Hall, Room 2000</em><br /> Hear what it takes to build and operate cross-play and cross-progression games.<br /> <br /> <br /> Remember to bookmark our <a href="/gdc2020" target="_blank">event page</a>, which will be updated with more information in the coming weeks. 
We look forward to seeing you at GDC 2020!<br /> <br /> <em>GDC 2020 | Games | Film & Television | Community | Learning | News | Tue, 04 Feb 2020 14:00:00 GMT</em><br /> <br /> <h1>How <em>Darksiders Genesis</em> successfully reinvented itself as a co-op isometric action game</h1> Airship Syndicate details how they made the <em>Darksiders</em> formula work from a new top-down perspective.<br /> <br /> Airship Syndicate took bold development steps with <a href="" target="_blank"><em>Darksiders Genesis</em></a>. Not only is it the first game in the venerated series to include co-op, but the title eschewed the third-person perspective in favor of a top-down isometric view. This fresh direction garnered the game great reviews, with <a href="" target="_blank">GamingTrend</a> stating, “<em>Darksiders Genesis</em> is by far the best game in the series.”<br /> <br /> Released on PC late last year, <em>Darksiders Genesis</em> recently launched on consoles. While the game is technically Airship Syndicate’s first foray into the series, the studio retains plenty of <em>Darksiders</em> DNA, with roughly 10 developers who worked on the original game, including CEO and Co-founder Joe Mad. We caught up with several members of the team to see how they accomplished their goal of making the <em>Darksiders</em> formula flourish from a new perspective.  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong>With the original <em>Darksiders</em> games employing a third-person perspective, why did it make sense to use an isometric view for <em>Darksiders Genesis</em>? <br /> <br /> THQ Nordic Development Director Reinhard Pollice: </strong>When we at THQ Nordic acquired the franchise back in 2013, one of our initial thoughts was that it would be a cool way to advance the franchise and give it a new spin. 
We felt the core elements of a <em>Darksiders</em> game would translate very well to a top-down perspective and we were eyeing a soft multiplayer introduction this way. Fortunately, Airship Syndicate had very similar thoughts and the project gained traction.<br /> <br /> <strong>Was it challenging reinventing the <em>Darksiders</em> franchise to use this new perspective? What were some of the stand-out issues to overcome?<br /> <br /> Pollice: </strong>It was a big challenge and we were constantly worried if it was going to be perceived as a core <em>Darksiders</em> experience. Finding the right pacing for combat and designing the traversal scenes in a way that the camera wouldn’t be in the way were two of the biggest challenges during development.<br /> <img alt="DeveloperInterview_Darksiders_Genesis_005.jpg" height="auto" src="" width="auto" /><br /> <strong>Besides the original <em>Darksiders</em> games, what other titles might have influenced <em>Darksiders Genesis</em>?<br /> <br /> Pollice: </strong>We really enjoyed the top-down Lara Croft games. The weight of elements was different, but they also had combat, traversal, and puzzles.<br /> <br /> <strong>How early on in the project did Airship Syndicate know that local co-op would be a major component of the game?<br /> <br /> Pollice: </strong>From the very get-go, we wanted to do multiplayer and that meant online as well as local. In fact, one of our first prototypes involved split-screen multiplayer as it was a big unknown for us. It was something we wanted to figure out early on. 
<br /> <img alt="DeveloperInterview_Darksiders_Genesis_007.jpg" height="auto" src="" width="auto" /><br /> <strong>With Strife being a new playable character in the <em>Darksiders</em> universe, how did you approach designing him?<br /> <br /> CEO Joe Mad: </strong>Funnily enough, the initial design for Strife was done around the same time as War, Death, and Fury, which was way before we even signed the original <em>Darksiders</em>. Each of the Horsemen got a minor update before they went into the game but Strife remained very close to his original concept. The important thing about Strife is that he [needed to] look quick and dangerous, and that he stood apart from his hulking brother War. <br /> <br /> <strong>The game has a wide variety of enemies. Did the team have to think about designing them differently given the new camera angle?<br /> <br /> Mad: </strong>There are always constraints when designing characters for<em> any</em> game, and having a fixed camera doesn’t really add any further complexity. You just want to keep the detail and interest areas where people will see them, so in this case, it meant keeping the detail across creatures’ heads and upper torso. <br /> <br /> If there was any challenge, it was due to the camera being zoomed out so far. Everything had to read great at a very small size. We tend to make big chunky characters with defined silhouettes, so that helped! <br /> <img alt="DeveloperInterview_Darksiders_Genesis_001.jpg" height="auto" src="" width="auto" /><br /> <strong>The game features beautiful environments. How did you design them?<br /> <br /> Lead Environment Artist Jesse Carpenter: </strong>Designing the environments involves a collaboration between the designers, asset makers, world builders, and many others. Our designers are the ones who do the very first take on an area that we’ve decided to make. They focus on the flow and gameplay that needs to be there. They will block in the core paths and ideas. 
Throughout this stage, the world builder, who will be working on the area, will work with the designer to help think of ideas for how it can look in the end and what we can do with it to make the environment feel unique and special. <br /> <br /> Once the design is approved, the world builder takes over and begins to set dress the designed map. It is very likely during this process that the map can change a decent amount, requiring a designer to update their ideas afterwards. The world builder&#39;s job is to make the level feel like a real place while facilitating the gameplay needs and design goals. They will do multiple passes to polish and finish a level, including building and placing assets, lighting, VFX placement, performance optimizations, and post-process settings. Asset makers build any custom assets needed to finish the level, and VFX artists provide its effects. Tech artists and engineers help support artists with custom tools as needed or help make levels performant on lower-end consoles.<br /> <br /> <em>A lot</em> goes into making environments that are beautiful and play well. In the end, every level is the result of a lot of people working in collaboration.<br /> <img alt="DeveloperInterview_Darksiders_Genesis_003.jpg" height="auto" src="" width="auto" /><br /> <strong>Considering <em>Darksiders Genesis</em> features polished graphics and awesome particle effects, can you elaborate on how you executed on the look of the game?<br /> <br /> Carpenter: </strong>Arriving at a final look for games is a very iterative process and it&#39;s usually being polished up until the last day we are allowed. We had a lot of ideas for how we wanted to translate our art style into the <em>Darksiders</em> universe after working on <em>Battle Chasers</em>. 
There were a lot of things we learned while making art for a top-down game on <em>Battle Chasers</em> that we were able to integrate into <em>Genesis</em>.<br /> <br /> That being said, the <em>Darksiders</em> franchise has a much more gritty overall theme and we had to lean into that a bit more than we did on <em>Battle Chasers</em>. The <em>Darksiders</em> games have always walked a line between being really stylized and realistic. To account for this, we knew that we needed to use more fully realized <a href="" target="_blank">PBR materials</a> and quite a bit more detail than we did on <em>Battle Chasers</em>.<br /> <br /> It did take many iterations and changes to the process to arrive at a place that felt like the right mix of Airship and <em>Darksiders</em>. Once we had some art and a level that was feeling right, it became much faster and easier to populate those ideas and changes across the other levels and art. We had a lot of art at this point that had to be updated to fit the approved look, so we were working backwards and forwards at the same time.<br /> <br /> Sometimes you find the right look early on and sometimes it takes a while to really nail the feel and look of a world you&#39;re trying to make. Working with the <em>Darksiders</em> franchise provided a lot of amazing source material to work with, which was great, but it did take quite a bit of work to interpret that into our own take. Luckily, we have an amazing team of artists, FX artists, and engineers that made it happen.<br /> <img alt="DeveloperInterview_Darksiders_Genesis_009.jpg" height="auto" src="" width="auto" /><br /> <strong><em>Darksiders Genesis</em> has awesome over-the-top animations. Can you delve into how those were created?<br /> <br /> Lead Animator Jeremy Pantoja: </strong>We drew inspiration for the animation style from games like <em>League of Legends</em>, the former <em>Darksiders games</em>, and our own <em>Battle Chasers</em>. 
I wanted the animations to be snappy and exciting, but still feel heavy and brutal. We always had to keep the characters’ personalities in mind when making them, and from there, we would record reference videos of ourselves performing the actions, or at least as close as we could get. We used the reference for rough posing and timing, but from there, we’d exaggerate the poses, adjust the timing, and add embellishments.<br /> <br /> I think the thing we did to really sell the faster, snappy action was to utilize a lot of smear frames. If you were to freeze some of the animations during a very fast action, you’d see the characters were usually very stretched out and distorted. On a single frame this looks strange, but in motion, it fools the eye into thinking there are more frames than there actually are. So, we were able to do very fast motions without the animation looking like it was “popping.”<br /> <br /> <strong>Why was Unreal Engine a good fit for <em>Darksiders Genesis</em>? <br /> <br /> Game Director Ryan Stefanelli: </strong>Based on the success other games of similar scope have had with Unreal, it felt like a very natural fit, especially for a multiplayer project. There’s also a massive resource base for group-thinking through problems that proved very valuable in the end. And it&#39;s a very proven technology.<br /> <br /> <strong>Technical Director Chris Brooks: </strong>Having access to the full <a href="" target="_blank">source code</a> is a huge benefit Unreal offers.<br /> <img alt="DeveloperInterview_Darksiders_Genesis_008.jpg" height="auto" src="" width="auto" /><br /> <strong>Considering this is the studio&#39;s first Unreal Engine title, how was the switch to UE?<br /> <br /> Stefanelli:</strong> Unreal is such a common development tool now that many people on the team had used it on previous projects. And those who hadn’t touched it before were quickly trained up by those who had. 
<br /> <br /> <strong>For more information on <em>Darksiders Genesis</em>, visit:</strong> <ul style="margin-left: 40px;"> <li><a href="" target="_blank"></a></li> <li><a href="" target="_blank">Darksiders Discord server</a></li> <li><a href="" target="_blank">Darksiders Twitter account</a></li> <li><a href="" target="_blank">Airship Syndicate’s Twitter account</a></li> <li><a href="" target="_blank">THQ Nordic’s Twitter account</a></li> </ul> <em>Airship Syndicate | Darksiders Genesis | Games | Art | Community | Jimmy Thang | Fri, 21 Feb 2020 14:30:00 GMT</em><br /> <br /> <h1>Forging new paths for filmmakers on &#39;The Mandalorian&#39;</h1> Epic Games, in collaboration with Jon Favreau and ILM, forged a new production path for <em>The Mandalorian</em> Season 1 by evolving real-time technology to meet complex demands for filming scenes and actors.<br /> <br /> Having spent the last 15 years as a game engineer and technical lead at Epic Games, I’ve learned firsthand that rapid feedback loops are critical for successful creative collaborations. The quick iteration, spontaneity, and sense of shared purpose that come from working closely together are irreplaceable. At Epic, we go to great lengths to give our creative teams as much time together as possible.<br />   <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <br /> So when I began to learn more about filmmaking, it was surprising to realize that it’s common for critical departments on a traditional visual effects-heavy production to be decentralized. Weeks or months can pass between the on-set work of key creatives and the post-production work to fully realize the vision. This seemed like an opportunity where real-time game engine technology could make a real difference.<br />  <br /> Fortunately, Jon Favreau is way ahead of the curve. 
His pioneering vision for filming <em>The Mandalorian</em> presented an opportunity to turn the conventional filmmaking paradigm on its head.<br /> <img alt="Mandalorian_HUC-003903_R.pip.jpg" height="auto" src="" width="auto" /><br /> When we first met with Jon, he was excited to bring more real-time interactivity and collaboration back into the production process. It was clear he was willing to experiment with new workflows and take risks to achieve that goal. Ultimately, these early talks evolved into a groundbreaking virtual production methodology: shooting the series on a stage surrounded by massive LED walls displaying dynamic digital sets, with the ability to react to and manipulate this digital content in real time during live production. Working together with ILM, we drew up plans for how the pieces would fit together. The result was an ambitious new system and a suite of technologies to be deployed at a scale that had never been attempted for the fast-paced nature of episodic television production.<br /> <img alt="Mandalorian_HUC-027199.pip.jpg" height="auto" src="" width="auto" /><br /> By the time shooting began, Unreal Engine was running on four synchronized PCs to drive the pixels on the LED walls in real time. At the same time, three Unreal operators could simultaneously manipulate the virtual scene, lighting, and effects on the walls. The crew inside the LED volume was also able to control the scene remotely from an iPad, working side-by-side with the director and DP. This virtual production workflow was used to film more than half of <em>The Mandalorian</em> Season 1, enabling the filmmakers to eliminate location shoots, capture a significant amount of complex VFX shots with accurate lighting and reflections in-camera, and iterate on scenes together in real time while on set. 
The combination of Unreal Engine’s real-time capabilities and the immersive LED screens enabled a creative flexibility previously unimaginable.<br /> <img alt="Mandalorian_HUC-058679.pip.jpg" height="auto" src="" width="auto" /> <blockquote> <p style="margin-left: 40px;"><em>The Mandalorian</em> was not only an inspiring challenge, but a powerful test bed for developing production-proven tools that benefit all Unreal Engine users. Our <strong>multi-user collaboration tools</strong> were a big part of this, along with the <strong>nDisplay system</strong> to allow a cluster of machines to synchronously co-render massive images in real time, and our <strong>live compositing system</strong> that enabled the filmmakers to see real-time previews. We also focused on abilities to interface with the engine from external sources, such as recording take data into Sequencer or manipulating the LED wall environment from the iPad. All of these features are available now in 4.24 or coming soon in 4.25.</p> </blockquote> <div style="text-align: center;"><img alt="FEED_THUMB_Mandalorian_V1.jpg" height="auto" src="" width="auto" /></div> Ultimately, being part of <em>The Mandalorian</em> Season 1 was one of the highlights of my career – the scope of what we were able to achieve with real-time technology was unlike anything else I’ve worked on. Giving filmmakers like Jon Favreau, executive producer and director Dave Filoni, visual effects supervisor Richard Bluff, cinematographers Greig Fraser and Barry “Baz” Idoine, and the episodic directors the freedom and opportunities to make creative decisions on the fly, fostering live collaboration across all departments, and letting everyone see their full creative vision realized in mere seconds, was a truly gratifying experience. ILM, Golem Creations, Lux Machina, Fuse, Profile, ARRI, and all of the amazing collaborators on this project were deeply inspiring to work with and I&#39;m proud to have been a part of it. 
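To make the nDisplay idea mentioned above concrete: a cluster is described declaratively, with each networked machine assigned a window, viewport, and physical screen slice of the wall. The sketch below is purely illustrative and not a file from the production; the node IDs, addresses, and screen dimensions are invented, and the keys follow my understanding of the UE 4.2x-era text config format, so treat it as a shape, not a reference.

```ini
; Hypothetical minimal nDisplay cluster config (UE 4.2x text format, assumed).
; One master node and one slave node, each rendering half of an LED wall.
[info] version=23
[cluster_node] id=node_master addr=192.168.0.10 window=wnd_left master=true
[cluster_node] id=node_slave addr=192.168.0.11 window=wnd_right
[window] id=wnd_left fullscreen=true viewports=vp_left
[window] id=wnd_right fullscreen=true viewports=vp_right
[viewport] id=vp_left x=0 y=0 width=3840 height=2160 projection=proj_left
[viewport] id=vp_right x=0 y=0 width=3840 height=2160 projection=proj_right
[projection] id=proj_left type=simple screen=scr_left
[projection] id=proj_right type=simple screen=scr_right
; Physical screen placement in meters, relative to the stage origin.
[screen] id=scr_left loc="X=2,Y=-1.5,Z=1" rot="P=0,Y=0,R=0" size="X=3,Y=1.7"
[screen] id=scr_right loc="X=2,Y=1.5,Z=1" rot="P=0,Y=0,R=0" size="X=3,Y=1.7"
[camera] id=cam_tracked loc="X=0,Y=0,Z=1.7"
```

Each machine runs the same project and renders only its own viewport, while nDisplay keeps frame timing synchronized across the cluster; the four-PC setup described above applies the same principle at stage scale.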
But what&#39;s even more exciting is that the techniques and technology we developed on <em>The Mandalorian</em> are only the tip of the iceberg – I can’t wait to see what the future has in store.<br /> <img alt="Mandalorian_HUC-066962.PIP.jpg" height="auto" src="" width="auto" /><br /> <br /> <em>Film & Television | The Mandalorian | News | Virtual Production | Virtual Sets | Star Wars | Jeff Farris | Thu, 20 Feb 2020 16:30:00 GMT</em><br /> <br /> <h1>GDC 2020 games booth announced</h1> With dozens of amazing titles from around the world, the Unreal Engine Games.Jobs.Beer. booth is set to be the life of the party at GDC 2020.<br /> <br /> Today, we’re excited to announce that we’ll once again be hosting a three-day celebration of our amazing developer community at <a href="" target="_blank">GDC 2020</a> inside Moscone South (Booth #327, to be exact) with a wide range of immersive experiences across PC, console, mobile, VR, and AR from studios of all shapes, sizes, and locations.<br /> <br /> If you’ve attended the Game Developers Conference the past few years, we hope you’ve taken the opportunity to swing by the Unreal Engine booth in Moscone South to hang out with talented developers from around the world, do some networking, and go hands-on with a variety of Unreal Engine-powered titles while enjoying tasty snacks (and cold beer) on us.<br /> <br /> <img alt="GDC2019_Epic_ShowFloor_PartnerBooth_General_HiRes_00018.jpg" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>The Unreal Engine Games.Jobs.Beer. booth at GDC 2019</em></div> <br /> Some newsworthy items include the opportunity to experience SQUARE ENIX’s <em>FINAL FANTASY VII REMAKE</em> before it ships worldwide on April 10, 2020, a playable demo of Arc System Works’ <em>Guilty Gear Strive</em>, and Mojang’s highly-anticipated <em>Minecraft Dungeons</em>. 
Of course, we’ll also have plenty of awe-inspiring indie titles to test drive like Beethoven & Dinosaur’s <em>The Artful Escape</em>, Eggnut’s <em>Backbone</em>, and Fanclub’s <em>Dead Static Drive</em>! Best of all, many of the developers themselves will be on-hand to mingle with the masses and provide insight into how they built their games.<br /> <br /> These titles are just the tip of the iceberg, so let’s take a look at some of the amazing games that will be on display in the Unreal Engine booth at GDC 2020—we hope to see you there!<br />   <h3><a href="" target="_blank"><strong>A Juggler&#39;s Tale</strong></a></h3> <strong>Developer: </strong>kaleidoscube |<strong> Publisher:</strong> Mixtvision<br /> <br /> <img alt="AJugglersTale_KeyArt.jpg" height="auto" src="" width="auto" /><br /> <br /> <em>A Juggler&#39;s Tale</em> is an atmospheric 3D side scroller set in a puppet theatre play. The string puppet Abby flees from her captors into freedom and adventure: a world of beauty and wonder - but also danger!<br /> <br /> She finds herself in a war-torn, medieval fairytale, surrounded by ravaged, starving citizens and hunted by the relentless cut-throat Tonda. Who can she trust? Can she avoid the traps and betrayal?<br /> <br /> Despite dangling from her threads, Abby learns that she can still influence her destiny - if only by winning over the audience. Help Abby navigate a traumatized, yet hauntingly beautiful world. Lead her forward through riddles and around traps, evade her pursuers and find - perhaps - freedom!<br />   <h3><a href="" target="_blank"><strong>Backbone</strong></a></h3> <strong>Developer: </strong>Eggnut | <strong>Publisher:</strong> Raw Fury <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <em>Backbone </em>is a pixel art noir detective adventure. Step into the shoes of Howard Lotor, a raccoon private eye. 
Interrogate witnesses through branching dialogues inspired by classic CRPGs, sneak through diverse districts of a now walled-off dystopian Vancouver, sniff out clues, and choose which leads to follow.<br />   <h3><a href="" target="_blank"><strong>Darksiders Genesis</strong></a></h3> <strong>Developer:</strong> Airship Syndicate | <strong>Publisher:</strong> THQ Nordic <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <em>Darksiders Genesis </em>is the first top-down action adventure in the <em>Darksiders</em> franchise. The game also introduces the horseman STRIFE to the franchise, along with two-player co-op play! In <em>Darksiders Genesis</em>, STRIFE can battle alongside his brother and fellow horseman WAR in two-player co-op. Solo players will be able to alternate between STRIFE and WAR on the fly, taking advantage of STRIFE’s ranged abilities and WAR’s melee-style attacks. In addition, gameplay includes combat atop both horsemen’s fabled steeds: RAMPAGE and MAYHEM. <em>Darksiders Genesis</em> also introduces Creature Cores, a combat system that lets players build their own upgrade system fueled by the enemies they defeat.<br />   <h3><a href="" target="_blank"><strong>Dauntless</strong></a></h3> <strong>Developer:</strong> Phoenix Labs | <strong>Publisher: </strong>Phoenix Labs <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> As a Slayer, you and your hunting party are all that stands between your world and the Behemoths that seek to devour it. 
Take on boss-sized monsters, forge powerful weapons, and craft armour from the very creatures you slay — all in a massive, free-to-play online world.<br />   <h3><a href="" target="_blank"><strong>Dead Static Drive</strong></a></h3> <strong>Developer: </strong>Fanclub <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> Grand Theft Cthulhu. Grab your bat, steal a car, and take your chances against the unearthly horrors found along Route 666. Your family needs you alive.<br />   <h3><a href="" target="_blank"><strong>Destroy All Humans!</strong></a></h3> <strong>Developer: </strong>Black Forest Games |<strong> Publisher: </strong>THQ Nordic <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> The cult classic returns! Terrorize the people of 1950s Earth in the role of the evil alien Crypto-137. Harvest DNA and bring down the US government in the remake of the legendary alien invasion action adventure. Annihilate puny humans using an assortment of alien weaponry and psychic abilities. Reduce their cities to rubble with your flying saucer!<br /> <br /> Only <em>Destroy All Humans!</em> allows you to explore idyllic US cities of the 1950s, read the thoughts of their citizens to uncover their secret desires… and then burn the very same cities to the ground with the mighty Death Ray of your flying saucer!<br />   <h3><a href="" target="_blank"><strong>Dragon Ball Z: Kakarot</strong></a></h3> <strong>Developer: </strong>CyberConnect2 |<strong> Publisher: </strong>BANDAI NAMCO Entertainment Inc. 
<div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> You’ve watched the iconic anime series, now play through it as Earth’s greatest defender, Kakarot! Relive the story of Goku and other Z-Fighters in <em>Dragon Ball Z: Kakarot</em>! Beyond the epic battles, experience life in the Dragon Ball Z world as you fight, fish, eat, and train with Goku, Gohan, Vegeta, and others.<br />   <h3><a href="" target="_blank"><strong>Drake Hollow</strong></a></h3> <strong>Developer: </strong>The Molasses Flood | <strong>Publisher: </strong>The Molasses Flood <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <em>Drake Hollow </em>is a cooperative action village building game set in the Hollow - a blighted mirror of our world - in which you build and defend villages of Drakes, the local vegetable folk. Either solo or with friends, explore a procedurally generated world of islands trapped in poisonous aether. 
Gather supplies, build networks to bring them back to your camp, find and rescue Drakes in the wilderness, raise them, and defend them from attacks by a menagerie of feral beasts.<br />   <h3><a href="" target="_blank"><strong>Endling</strong></a></h3> <strong>Developer: </strong>Herobeat Studios | <strong>Publisher:</strong> HandyGames <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> As the last mother fox, keep your cubs alive and reach the only place on Earth where humans cannot harm them.<br />   <h3><a href="" target="_blank"><strong>FINAL FANTASY VII REMAKE</strong></a></h3> <strong>Developer: </strong>SQUARE ENIX | <strong>Publisher:</strong> SQUARE ENIX <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> Coming to the PlayStation 4 computer entertainment system on April 10, 2020, <em>FINAL FANTASY VII REMAKE</em> is a reimagining of the iconic original game that redefined the RPG genre, diving deeper into the world and its characters than ever before. The first game in the project will be set in the eclectic city of Midgar and presents a fully standalone gaming experience that provides a great starting point to the series. Along with unforgettable characters and a powerful story, <em>FINAL FANTASY VII REMAKE</em> features a hybrid battle system that merges real-time action with strategic, command-based combat.<br />   <h3><a href="" target="_blank"><strong>Fuga: Melodies of Steel</strong></a></h3> <strong>Developer: </strong>CyberConnect2 Co., Ltd. | <strong>Publisher: </strong>CyberConnect2 Co., Ltd.<br /> <br /> <img alt="FUGA.jpg" height="auto" src="" width="auto" /><br /> <br /> <em>Fuga: Melodies of Steel </em>is a dramatic action-strategy RPG depicting hope and despair. 
Friendships and social drama are both born from within the tank. This game is all about “die and retry,” and features tons of different endings! Events, choices, and even the story differ with each playthrough!<br />   <h3><a href="" target="_blank"><strong>Guilty Gear -Strive-</strong></a></h3> <strong>Developer: </strong>Arc System Works Co., Ltd. | <strong>Publisher: </strong>Arc System Works America, Inc. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <em>Guilty Gear -Strive- </em>is the latest entry in the critically acclaimed Guilty Gear fighting game franchise. Created by Daisuke Ishiwatari and developed by Arc System Works, <em>Guilty Gear -Strive- </em>upholds the series’ reputation for groundbreaking hybrid 2D/3D cel-shaded graphics coupled with intense, rewarding gameplay.<br />   <h3><a href="" target="_blank"><strong>Journey to the Savage Planet</strong></a></h3> <strong>Developer: </strong>Typhoon Studios | <strong>Publisher:</strong> 505 Games <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> Welcome to the Pioneer Program! In this upbeat & colorful co-op adventure game, you play as the newest recruit to Kindred Aerospace. Dropped onto an uncharted planet with little equipment and no real plan, you must explore, catalog alien flora and fauna, and determine if this planet is fit for human habitation. But perhaps you are not the first to set foot here… Onward to adventure! 
Good luck – and mind the goo!<br />   <h3><a href="" target="_blank"><strong>Minecraft Dungeons</strong></a></h3> <strong>Developer:</strong> Mojang | <strong>Publisher: </strong>Xbox Game Studios <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> Fight your way through an all-new action-adventure game, inspired by classic dungeon crawlers and set in the <em>Minecraft</em> universe! Brave the dungeons alone, or team up with up to four players through action-packed, treasure-stuffed, wildly-varied levels – all in an epic quest to save the villagers and take down the evil Arch-Illager!<br />   <h3><a href="" target="_blank"><strong>Monster Energy Supercross - The Official Videogame 3</strong></a></h3> <strong>Developer:</strong> Milestone | <strong>Publisher:</strong> Milestone <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <em>Monster Energy Supercross - The Official Videogame 3 </em>is the latest release of the most beloved and realistic Supercross video game, featuring the 2019 Monster Energy Supercross season with 100 riders across both 450SX and 250SX categories, coupled with 15 official stadiums and tracks. 
For the very first time in the series, it will also allow players to choose between a sponsor team and an Official Supercross Team of the 2019 Championship in career mode: you will finally have the chance to become a teammate of your favorite Supercross rider and show the whole world who’s the king of the hill!<br />   <h3><a href="" target="_blank"><strong>MotoGP™ 20</strong></a></h3> <strong>Developer:</strong> Milestone |<strong> Publisher: </strong>Milestone <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <br /> <em>MotoGP™20</em> is the latest chapter in the beloved MotoGP™ franchise. The traditional adrenaline-filled gameplay, loved by the community, is now packed with a more strategic and realistic approach to races. The game offers a pure <em>MotoGP™ </em>experience, from the box to the track! In <em>MotoGP™20</em>, players are now able to take full control of their careers, making decisions that can make the difference on track. The Managerial Career is finally back with a lot of new features that will put players’ riding and strategic skills to the test! Joining an official 2020 team or a new team sponsored by real brands from the MotoGP world, players will have a full entourage to manage that will help them select a new team, analyze race data, and develop a bike. Just like in the real MotoGP™, players will need to make the best decisions to find a winning strategy to master the championship.<br />   <h3><strong>Ni No Kuni: Cross Worlds</strong></h3> <strong>Developer:</strong> NetmarbleNeo | <strong>Publisher:</strong> Netmarble<br /> <br /> <img alt="NiNoKuni_CrossWorlds.jpg" height="auto" src="" width="auto" /><br /> <br /> Fantasy game <em>Ni no Kuni</em>, originally made in collaboration between Level-5 and Studio Ghibli, will be reborn on mobile with Unreal Engine. 
Developed by Netmarble, makers of <em>Lineage 2: Revolution</em>, the game includes the entire Ni no Kuni universe from the original series. The story of the game is based on a universe where the real world and fantasy world coexist. <br />  <br /> <em>Netmarble&#39;s Ni no Kuni: Cross Worlds</em> presents a world filled with Ghibli&#39;s style and fairytale-like animations, colorful 3D graphics, and high-quality cutscenes. In addition, it provides high-quality music from the original IP to deliver a fantastic experience.<br />   <h3><a href="" target="_blank"><strong>No Straight Roads</strong></a></h3> <strong>Developer:</strong> Metronomik | <strong>Publisher:</strong> Sold Out <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> Take back Vinyl City - with rock! Embark on a music-based action-adventure as indie rock band members Mayday & Zuke and lead a musical revolution against EDM empire <em>No Straight Roads</em>. After being unfairly rejected in their audition to join <em>No Straight Roads</em>, Mayday & Zuke uncover the evil intentions behind the NSR empire. It’s now down to them to save their city from corruption. Enjoy fast & frenetic combat with a musical twist as these two aspiring rock artists fight back with the power of music!<br /> <br /> Directed by Wan Hazmer, lead game designer of <em>Final Fantasy XV</em>, and Daim Dziauddin, concept artist of <em>Street Fighter V</em>.<br />   <h3><a href="" target="_blank"><strong>Omno</strong></a></h3> <strong>Developer: </strong>Studio Inkyfox <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> The indie game <em>OMNO</em> is a third-person adventure by solo developer Jonas Manke. 
It will take you through lush forests, across a sun-blasted desert, over a frigid tundra, and, with the power of a lost civilization, to the clouds.<br /> <br /> On the way, you will meet strange creatures, encounter many surprises, and maybe make a friend.<br /> <br /> Besides offering an interactive world, <em>Omno </em>challenges you with puzzles, hidden secrets, and obstacles to overcome in 3D puzzle platformer style.<br />   <h3><a href="" target="_blank"><strong>Phantom: Covert Ops</strong></a></h3> <strong>Developer:</strong> nDreams | <strong>Publisher: </strong>nDreams <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> You are a Phantom: an elite and deadly covert operative with a single night to prevent all-out war.<br /> <br /> Dispatched into remote, hostile wetlands in your tactical kayak, you will utilize military-grade weapons and equipment to evade and neutralize the enemy threat. Immerse yourself in a gritty and authentic arena of war across an intense campaign in VR. Engage your targets lethally or infiltrate unnoticed from within the shadows: it’s your mission to execute your own way.<br /> <br /> <em>Phantom: Covert Ops </em>is stealth action redefined for Oculus Quest and Rift platforms.<br />   <h3><a href="" target="_blank"><strong>SpongeBob SquarePants: Battle for Bikini Bottom - Rehydrated</strong></a></h3> <strong>Developer: </strong>Purple Lamp Studios | <strong>Publisher:</strong> THQ Nordic <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> Are you ready, kids? The cult classic is back, faithfully remade in Spongetastic splendor! Play as SpongeBob, Patrick, and Sandy and show the evil Plankton that crime pays even less than Mr. Krabs. 
Want to save Bikini Bottom from lots of rampant robots with your mighty bubbles? Of course you do! Want to underpants bungee jump? Why wouldn&#39;t you! Want to join forces in a brand new multiplayer mode? The battle is on!<br />   <h3><a href="" target="_blank"><strong>Star Wars Jedi: Fallen Order</strong></a></h3> <strong>Developer:</strong> Respawn Entertainment | <strong>Publisher: </strong>Electronic Arts <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> A galaxy-spanning adventure awaits in <em>Star Wars Jedi: Fallen Order</em>, a new third-person action-adventure title from Respawn Entertainment. This narratively-driven single-player game puts you in the role of a Jedi Padawan who narrowly escaped the purge of Order 66 following the events of <em>Star Wars™: Episode III - Revenge of the Sith™</em>. On a quest to rebuild the Jedi Order, you must pick up the pieces of your shattered past to complete your training, develop new powerful Force abilities, and master the art of the iconic lightsaber - all while staying one step ahead of the Empire and its deadly Inquisitors.<br />   <h3><a href="" target="_blank"><strong>The Artful Escape</strong></a></h3> <strong>Developer: </strong>Beethoven & Dinosaur | <strong>Publisher: </strong>Annapurna Interactive <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> Discover who you aren’t. <br /> <br /> On the eve of his first performance, Francis Vendetti battles with the legacy of a dead folk legend and the cosmic wanderings of his own imagination. 
Francis, a teenage guitar prodigy, sets out on a psychedelic, multidimensional journey to inspire his stage persona.<br />   <h3><a href="" target="_blank"><strong>The Outer Worlds</strong></a></h3> <strong>Developer:</strong> Obsidian Entertainment | <strong>Publisher: </strong>Private Division <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <em>The Outer Worlds </em>is an award-winning single-player RPG from Obsidian Entertainment and Private Division. In <em>The Outer Worlds</em>, you awake from hibernation on a colonist ship that was lost in transit to Halcyon, the furthest colony from Earth located at the edge of the galaxy, only to find yourself in the midst of a deep conspiracy threatening to destroy it. As you explore the furthest reaches of space and encounter various factions, all vying for power, the character you decide to become will determine how this player-driven story unfolds. In the corporate equation for the colony, you are the unplanned variable. <em>The Outer Worlds</em> is out now for Xbox One, PS4, and PC, and coming to Nintendo Switch this year.<br />   <h3><a href="" target="_blank"><strong>Trials of Mana</strong></a></h3> <strong>Developer:</strong> SQUARE ENIX | <strong>Publisher: </strong>SQUARE ENIX <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <em>Trials of Mana </em>is a full high-definition remake of the third entry in the Mana series, previously exclusively released in Japan in 1995 as Seiken Densetsu 3. <em>Trials of Mana</em> tells the story of six heroes as they battle against the monsters of destruction that threaten a world where Mana has been weakened. 
Players can customize their own party of three, select from six unique characters, and experience different stories.<br /> <br /> Games | Community | Events | GDC 2020 | Daniel Kayser | Thu, 20 Feb 2020 14:30:00 GMT<br /> <br /> <h1>It actually is rocket science: SkyReal Suite for aerospace design</h1> Aerospace design can quickly get messy, with expensive prototypes that don’t always point up flaws. With the Unreal Engine-based SkyReal Suite, engineers can test before they build, reducing errors and time to market by up to 30%.<br /> <br /> In recent years, we’ve seen a surge in the use of virtual reality for industrial design, including aircraft and other vehicles. But while aerospace engineers might understand the benefits of designing in VR, they tend to focus their talents where they’re needed: toward the efficient development of quality aircraft and spacecraft. Taking the time to learn new tools for VR could be considered an unnecessary distraction.<br /> <br /> This is where <a href="" target="_blank">SkyReal</a> comes in. The Paris-based company builds custom Unreal Engine-based VR solutions for aerospace clients like <a href="" target="_blank">Airbus</a>, <a href="" target="_blank">ArianeGroup</a>, and <a href="" target="_blank">STELIA Aerospace</a>, whose engineers can edit designs, experiment, and collaborate in a way that feels natural to them, without special training. “Physical prototyping can be complex and expensive for these large-scale projects, but the engineers and other stakeholders want to be able to feel and touch the design,” explains Hugo Falgarone, CEO and Founder of SkyReal. “In VR, they can have the same experience but without the prototype.”<br /> <img alt="Spotlight_SkyReal_blog_body_img_7.jpg" height="auto" src="" width="auto" />In addition to providing an accurate platform for testing and experimentation, SkyReal reduces design cycle time by up to 30 percent by saving on iterations between designers and stakeholders. 
“They don’t need to build a miniature, prototype, or physical version of any kind,” says Falgarone. “Instead, they can build a daily digital prototype.” <h3><strong>Making VR iterations autonomous</strong></h3> It’s no mean feat to create a tool that’s both effective for complex, large-scale virtual design and easy to use. Falgarone, along with Benjamin Ray, CTO at SkyReal, started developing VR solutions at Airbus, where they worked on internal projects. After around 10 years, the team spun off a separate company to produce SkyReal Suite, a standalone product that could be offered to companies outside the Airbus family.<br /> <img alt="Spotlight_SkyReal_blog_body_img_2.jpg" height="auto" src="" width="auto" /><br /> “If your business is designing launchers, or cars, or boats, you don’t want to spend your time building or operating a complex VR system,” says Falgarone. “You just want to see the latest modification of your design, to click and play and iterate with your colleagues. That’s what SkyReal Suite is built for.”<br /> <br /> SkyReal Suite consists of three modules: one to prepare and import CAD data, another to store the data in “rooms” that users can explore, and a third to provide the VR interface and environment. The system is designed to present models at 1:1 scale.<br /> <img alt="Spotlight_SkyReal_blog_body_img_4.jpg" height="auto" src="" width="auto" /><br /> The Unreal Engine import toolset <a href="" target="_blank">Datasmith</a> has been available for some time, so the conversion of CAD data to Unreal Engine assets isn’t new. What SkyReal Suite adds is the seamless conversion of complex systems to a high-level VR experience within the engine, which in turn can be converted back to CAD data. 
This abstraction of complex structures is particularly important for large-scale design in real time, where the number of objects in the project can easily reach 100,000 or more, and poly counts routinely top two million.<br /> <br /> Perhaps most importantly, when an engineer changes a part while in virtual reality, the change is automatically propagated back to the CAD software, eliminating the need for a manual export/import cycle. The idea was to create a software suite where the user could work autonomously without needing to train someone on data management or virtual reality development, and without requiring help from an outside or custom service every time they update a design. <br /> <br /> “We deliver not only the technology, but also the capability to produce each design iteration,” Falgarone says. “This means that every week or every day, they can reproduce the experience by themselves.”<br /> <img alt="Spotlight_SkyReal_blog_body_img_3.jpg" height="auto" src="" width="auto" /> <h3><strong>Beyond aerospace design</strong></h3> Another benefit of the SkyReal workflow is that the manufacturing team can make use of assets created for the design process. After that, assets can be further repurposed all the way through to sales, support, and training. Because the assets are all in Unreal Engine, training tools can make use of game logic for a rich VR training environment from the actual design, and materials and lighting can be applied to create advertising-quality mockups for marketing and sales.<br /> <img alt="Spotlight_SkyReal_blog_body_img_6.jpg" height="auto" src="" width="auto" /><br /> Because SkyReal Suite presents models at 1:1 scale, it can be used for any large-scale design project—customers have used it to visualize the engineering behind ground vehicles, boats, and even factory machinery.<br /> <br /> More importantly, it helps engineers detect design issues before they become costly, or worse, cause the design to fail altogether. 
“By using SkyReal, our partners can detect and qualify their mistakes at a very early stage, and eventually find solutions to the problems they encounter,” says Jan B&oslash;rre Rydningen, Senior Advisor at <a href="" target="_blank">&Aring;KP</a>, which uses SkyReal for shipbuilding. “They can avoid and anticipate a multiplied series of errors that could be fatal for a project.”<br /> <img alt="Spotlight_SkyReal_blog_body_img_1.jpg" height="auto" src="" width="auto" /> <h3><strong>Unreal Engine: tools for a complete solution</strong></h3> When developing SkyReal Suite, Falgarone and Ray had the choice of a number of real-time engines, but chose Unreal Engine for its rich feature set. “Collaboration is very important inside SkyReal, and Unreal supports this really well with things like Replication Graph,” says Ray. “We also needed something that would support remote access and control.”<br /> <br /> He adds that the ability to accurately simulate physics is an important part of many projects. Also, Unreal Engine’s <a href="" target="_blank">nDisplay technology</a> comes in handy for the multi-screen and multi-projector displays many of their customers use for VR, such as powerwalls and CAVEs.<br /> <br /> While SkyReal could create their own custom branch of the source code—an option available to any Unreal Engine developer at no cost—SkyReal chooses to always use the currently shipping version while still providing a dedicated experience for engineering. The benefit is that SkyReal always has access to the latest features, and can update their software with these features shortly after each point version of Unreal Engine is released.<br /> <br /> When asked to name the most important goal for SkyReal Suite, Falgarone is quick to answer. “Reliability! We want to be 100% reliable, no matter how large the project,” he says. 
“Our customers are always pushing the limit—they want to get more and more models inside the same simulation.<br /> <br /> “We need to be inventive to keep up,” he continues, “and Unreal Engine gives us what we need to be inventive.”<br /> <br /> Want to explore the use of real-time technology for design? <a href="" target="_blank">Get in touch</a> and we’ll be happy to start that conversation.<br /> <br /> Training & Simulation | Automotive & Transportation | Design | Manufacturing | VR | SkyReal | Airbus | Sebastien Lozé | Tue, 18 Feb 2020 17:00:00 GMT<br /> <br /> <h1>Manga artist Inio Asano creates real-time backdrops in Unreal Engine</h1> Manga artist Inio Asano uses an innovative technique that harnesses the power of real-time technology to develop backgrounds for his artwork. Find out how he’s pushing the boundaries of manga illustration and opening up new creative avenues.<br /> <br /> When you think about the creative process behind manga artwork, real-time game engine technology may not be the first thing that springs to mind. However, a flash of inspiration one day led acclaimed <a href="" target="_blank">manga artist Inio Asano</a> to ask, “What if…?”<br /> <br /> After some experimentation, his efforts gave rise to an innovative technique for creating compelling, atmospheric backdrops for the manga scenes he illustrates. He’s currently using Unreal Engine as part of the artistic process for his comic series <a href="" target="_blank">Dead Dead Demon&#39;s Dededede Destruction</a>, which is being serialized in the manga magazine <a href="" target="_blank">Big Comic Spirits</a>. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> “Whenever I wanted to create a realistic background, I would modify a photo and then draw a manga illustration of it,” explains Asano. “But after doing that for over 10 years, I began to feel that the technique was getting a bit stale. 
I also realized that I was only using photos taken from my own vantage point.”<br /> <img alt="Spotlight_InioAsano_blog_body_img1.jpg" height="auto" src="" width="auto" /><br /> He started to wish he had more creative freedom, and that’s when he began thinking about the possibility of using 3D models. “Though I had known about Unreal Engine for many years, I had always thought that it was strictly a game engine,” he says. “As a manga artist, at first I didn’t think there was much overlap between it and my work. Then one day it occurred to me to look into Unreal Engine, and I discovered that it was free!” <h3><strong>Experimental manga using 3D visualization </strong></h3> With no barrier to entry, Asano decided to download the engine and try it out. That’s when he got his first taste of the power of real-time rendering. “It was easy to move around in 3D space and to adjust the lighting and colors,” he recalls. “The shadows were realistic too, more than if I tried to draw them by hand. I would then refer back to these assets and make some hand-drawn adjustments.”<br /> <img alt="Spotlight_InioAsano_blog_body_img2.jpg" height="auto" src="" width="auto" /><br /> Using the assets available in the <a href="" target="_blank">Unreal Engine Marketplace</a>, Asano found himself creating the backdrop for entire stories in the engine. “Though I find buildings relatively easy to draw, I struggle to draw plants and wildlife more than anything else,” he says. “There are a lot of foliage assets available for sale as building blocks, making it very easy to create a forest with just a few mouse clicks. I realized that I could draw a story set in these environments I created. 
When I use Unreal Engine, I actually reduce the quality of the renderings to make the images resemble hand-drawn manga artwork.”<br /> <img alt="Spotlight_InioAsano_blog_body_img3.jpg" height="auto" src="" width="auto" /> <h3><strong>Real-time technology opens new creative avenues </strong></h3> The further he explored real-time technology, the more ways Asano found to enhance his artwork—some of which he would not previously have entertained. “With Unreal Engine, I have begun to try things that I would have dismissed in the past as not being worth the trouble,” he says. “If a scene is missing something, I&#39;ll buy some assets and place them on screen. In that way, I can set up the environment just the way I want in a relatively short period of time.” <br /> <img alt="Spotlight_InioAsano_blog_body_img4.jpg" height="auto" src="" width="auto" /><br /> Another advantage of 3D models that Asano has found is that unlike manga illustrations, which can only be used once, 3D models can be reused or drawn from different angles. That means he gets even greater value for money, with the opportunity to reuse assets for future drawings. <br /> <img alt="Spotlight_InioAsano_blog_body_img5.jpg" height="auto" src="" width="auto" /><br /> Asano’s journey with real-time technology has opened up a world of creative possibilities, giving rise to some exciting new ideas. “I’d like to try turning my creative process on its head by building an entire city in Unreal Engine and then proceeding to draw a manga that depicts events that take place in the city,” he explains. <br /> <br /> The veteran artist is a strong advocate for experimentation of this type, and believes more artists could benefit from exploring real-time visualization. “Over the past 10 years, it has become possible for anyone to create artwork using Unreal Engine,” he says. 
“Since these tools are freely available, I think everyone should embrace them to make it easier than ever to bring ideas to life.”<br /> <br /> Want to explore how real-time technology could enhance your own illustration techniques? <a href="" target="_blank">Download Unreal Engine</a> for free today!<br /> <br /> More Uses | Marketplace | Art | Inio Asano | Mon, 17 Feb 2020 16:00:00 GMT<br /> <br /> <h1>Unreal Engine-powered games honored at 23rd Annual D.I.C.E. Awards</h1> Epic Games CEO Tim Sweeney provides the opening keynote to the 23rd Annual D.I.C.E. Awards, and 15 Unreal Engine-powered games were nominated across 15 categories.<br /> <br /> A fantastic selection of Unreal Engine-powered games was honored at the 23rd annual D.I.C.E. Awards in Las Vegas earlier this week. The awards are run by the Academy of Interactive Arts & Sciences, and nominees and winners were voted on by members of the non-profit organization. Fifteen Unreal Engine-powered games shone at the show, garnering nominations across 15 categories, including Outstanding Technical Achievement, Action Game of the Year, Immersive Reality Game of the Year, and more. Winners included <em>Star Wars Jedi: Fallen Order</em> for Adventure Game of the Year, <em>The Outer Worlds</em> for Role-Playing Game of the Year, and <em>Mortal Kombat 11</em> for Fighting Game of the Year.<br /> <br /> Epic Games Founder and CEO Tim Sweeney provided the <a href="" target="_blank">opening keynote</a> for the event. You can watch the full talk, titled “The Times They Are A-Changin&#39;,” in the embedded video below: <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> For the complete list of Unreal Engine-powered winners and nominees from the event, check out the following list! 
<h2><strong>AWARD NOMINEES AND WINNERS</strong></h2> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3><strong><a href="" target="_blank">Arise: A Simple Story</a> | Piccolo Studio/Techland</strong></h3> <p><strong>NOMINEE: </strong>Outstanding Achievement in Original Music Composition</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3><strong><a href="" target="_blank">Asgard&#39;s Wrath</a> | Sanzaru Games/Oculus Studios</strong></h3> <p><strong>NOMINEE:</strong> Immersive Reality Technical Achievement, Immersive Reality Game of the Year</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3><strong><a href="" target="_blank">Concrete Genie</a> | Pixelopus/Sony Interactive Entertainment</strong></h3> <p><strong>NOMINEE: </strong>Outstanding Achievement in Art Direction, Outstanding Technical Achievement</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3><strong><a href="" target="_blank">Days Gone</a> | SIE Bend Studio/Sony Interactive Entertainment</strong></h3> <p><strong>NOMINEE: </strong>Outstanding Achievement in Animation</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3><strong><a href="" target="_blank">Gears 5</a> | The Coalition/Xbox Game Studios</strong></h3> <p><strong>NOMINEE: </strong>Action Game of the Year</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" 
width="100%"></iframe></div> <h3><strong><a href="" target="_blank">Jump Force</a> | Spike Chunsoft/Bandai Namco</strong></h3> <p><strong>NOMINEE:</strong> Fighting Game of the Year</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3><strong><a href="" target="_blank">Kingdom Hearts III</a> | Square Enix</strong></h3> <p><strong>NOMINEE: </strong>Role-Playing Game of the Year</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3><strong><a href="" target="_blank">Mortal Kombat 11</a> | NetherRealm Studios/Warner Bros. Interactive Entertainment</strong></h3> <p><strong>NOMINEE: </strong>Outstanding Achievement in Original Music Composition, Outstanding Achievement in Audio Design, Fighting Game of the Year<br /> <strong>WINNER: </strong>Fighting Game of the Year</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3><strong><a href="" target="_blank">Samurai Shodown</a> | SNK Corporation/Athlon Games</strong></h3> <p><strong>NOMINEE: </strong>Fighting Game of the Year</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3><strong><a href="" target="_blank">Star Wars Jedi: Fallen Order</a> | Respawn Entertainment/Electronic Arts</strong></h3> <p><strong>NOMINEE: </strong>Outstanding Achievement in Character, Adventure Game of the Year<br /> <strong>WINNER:</strong> Adventure Game of the Year</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3><strong><a href="" 
target="_blank">The Outer Worlds</a> | Obsidian/Private Division</strong></h3> <p><strong>NOMINEE:</strong> Outstanding Achievement in Story, Role-Playing Game of the Year<br /> <strong>WINNER: </strong>Role-Playing Game of the Year</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3><strong><a href="" target="_blank">Trover Saves the Universe</a> | Squanch Games</strong></h3> <p><strong>NOMINEE: </strong>Immersive Reality Game of the Year</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3><strong><a href="" target="_blank">Vader Immortal</a> | ILMxLAB/Disney Interactive Studios</strong></h3> <p><strong>NOMINEE: </strong>Audience Choice</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3><strong><a href="" target="_blank">Westworld Awakening</a> | Survios/Warner Bros. Interactive Entertainment and HBO</strong></h3> <p><strong>NOMINEE: </strong>Immersive Reality Technical Achievement</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3><strong><a href="" target="_blank">Yoshi&#39;s Crafted World</a> | Good-Feel/Nintendo</strong></h3> <p><strong>NOMINEE:</strong> Family Game of the Year</p> Games | Community | D.I.C.E. Awards | Jimmy Thang | Fri, 14 Feb 2020 19:30:00 GMT<br /> <br /> <h1>SpongeBob SquarePants: Battle for Bikini Bottom - Rehydrated revamps the graphics and design of a cult classic</h1> Are you ready, kids? 
SpongeBob SquarePants: Battle for Bikini Bottom remake updates visuals and adds new content to beloved platformer.Since its original release in 2003, <em>SpongeBob SquarePants: Battle for Bikini Bottom </em>has gone on to become a cult classic. The 3D platformer steadily accrued a passionate following in internet circles and even became an unlikely breeding ground for lightning-quick speedruns. Nearly two decades later, <a href="" target="_blank">Purple Lamp Studios</a> and <a href="" target="_blank">THQ Nordic</a> have set out to remaster, or <a href="" target="_blank">rehydrate</a>, the game with updated visuals, tweaked game design, and new content.<br /> <br /> We interviewed THQ Nordic Producer Martin Kreuch to see how the company is improving upon the original with Unreal Engine.  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong>How are you trying to update <em>SpongeBob SquarePants: Battle for Bikini Bottom - Rehydrated</em>, while keeping what fans loved about the original?</strong><br /> <br /> It was very important for us to keep the abundance of distinct gameplay moments in Battle for Bikini Bottom, which is still one of the most varied platformers out there. Each level has a distinct look and feel, and often adds some unique gameplay mechanics to support this. You have three playable characters–SpongeBob, Patrick, and Sandy–each with a unique set of skills. Other than that, the biggest challenge was simply to do justice to both one of the greatest animated series ever and arguably the best game based on the franchise.<br /> <br /> Since the original game is still very fun to play, we focused on the details that would make it feel like a modern console game. This involved tweaking the camera, the movement, and dialogue animations, and implementing a lot of ease-of-use features, like a modern saving system.  
<br /> <img alt="BFBB_05_HD.png" height="auto" src="" width="auto" /><br /> <strong>With the original having such a strong cult following, how important is community feedback to the development of the game? </strong><br />  <br /> Being fans of the game and series ourselves, staying faithful to the original was our prime objective. At the same time, we wanted to make sure we don’t replicate some of the old frustrations from the PS2 era. We aimed to build the game according to how fans remembered it–[recreating] that feeling of playing a wacky, fun platformer in a beloved underwater world.<br />  <br /> In addition, Battle for Bikini Bottom has a vibrant speedrunner community. After showing the game for the first time at Gamescom 2019, <a href="" target="_blank">SHiFT</a>, one of the most prolific speedrunners of the original game, was kind enough to visit our studio in Vienna. This way, the team had a chance to tap into his insights, built on numerous hours of playing the original game, to get an understanding of all the little tricks and shortcuts that are easy to miss.<br /> <img alt="BFBB_01_HD.png" height="auto" src="" width="auto" /><br /> <strong>Rehydrated includes content that was cut from the original as well as new content such as a multiplayer mode. Can you elaborate on your decision to implement these new elements? </strong><br />  <br /> It really came from our research into what fans of the original were hoping for. For example, the concept art for Robo-Squidward was unlockable in the original game, and fans have been wondering ever since how this fight would have looked if it had made it into the game.<br />  <br /> The multiplayer idea stems both from the long history of SpongeBob party games and the fact that the series has fans from across all age groups. 
It’s a series for everybody, and we wanted to offer a simple, easy-to-pick-up multiplayer mode for fans to enjoy together.<br />  <br /> <strong>Are there any other remasters or remakes that you used as reference points?</strong><br /> <br /> Our primary influence was the original game, but we also looked at great remakes like <em>Spyro Reignited Trilogy</em> and the <em>Crash Bandicoot N. Sane Trilogy</em>, as well as genre-defining games like <em>Super Mario Odyssey</em>.<br /> <br /> On top of that, at THQ Nordic, we have a strong dedication to bringing back classic games, both as HD remasters and, as with <em>Battle for Bikini Bottom – Rehydrated</em>, as full remakes. This experience is, of course, helpful in supporting Purple Lamp as they create their new vision of the SpongeBob classic.<br /> <img alt="BFBB_02_HD.png" height="auto" src="" width="auto" /><br /> <strong>SpongeBob is an incredibly iconic franchise with fans young and old, and it comes with a very specific and surreal brand of comedy. How did you approach injecting these comedic elements into the game?</strong><br /> <br /> We are lucky to be working with an original that was created during the golden age of the SpongeBob franchise. It’s full of legendary dialogue by a wide range of characters from the series. Every fan has their favorite one-liner; it’s a lot of fun seeing them pop up in the comments section of videos on the game.  <br /> <br /> Aside from that, we are now in the position to really support these lines of dialogue and the overall craziness of the game with high-quality animations. So, I wouldn’t say we added a lot of comedic elements, but we mainly brought out the existing ones and allowed them to shine. We did add new idle animations, though, which are the movements the playable characters do when they stand still for a while. This was done to reflect some meme-able moments that happened later in the series and was a lot of fun to make. 
<br />  <br /> <strong>Considering SpongeBob SquarePants: Battle for Bikini Bottom - Rehydrated is one of Purple Lamp’s first games, what was the team’s familiarity with Unreal Engine like going into development?</strong><br /> <br /> The team in its current form is relatively “fresh,” but many members have worked together before or have had lots of experience in the gaming industry.<br /> <br /> As for UE experience, the team had prior experience developing the PvP action-multiplayer game <a href="" target="_blank"><em>MisBits</em></a> and got additional knowledge and support from a colleague who worked with Unreal in the architectural visualization industry.<br /> <img alt="BFBB_03_HD.png" height="auto" src="" width="auto" /><br /> <strong>How did the team leverage Unreal Engine to modernize the game’s graphics, while also using the cartoon as a reference point?</strong><br />  <br /> We collaborated closely with Nickelodeon, who guided us on the art style with their unparalleled insight into the SpongeBob franchise. In addition, Unreal Engine gives us much higher visual fidelity than the original.<br />  <br /> On the more technical side, the <a href="" target="_blank">profiling</a> and visualization tools offered by Unreal Engine 4 provide many ways to identify otherwise hard-to-find performance bottlenecks. In that regard, the engine also includes an easy-to-use <a href="" target="_blank">LOD</a> creation tool to optimize geometry that’s far away from the player. Our team also uses the <a href="" target="_blank">Material Editor</a> to create the game’s unique look and its stylized effects.<br />  <br /> <strong>Precise jumping mechanics coupled with well-designed environments are vital for any action platformer. Are there any ways the team is using Unreal Engine to ensure the gameplay feels good? 
</strong><br />  <br /> The [default] Unreal camera setup for a third-person character was a solid starting point for our camera development; additionally, the article, “<a href="" target="_blank">Six ingredients for a dynamic third-person camera</a>,” from Daedalic Entertainment was of great help to us.<br /> <br /> Using the existing Unreal <a href="" target="_blank">Gameplay Ability System</a> gave us a huge boost when starting to develop the different character skills and abilities. In addition, expanding multiplayer abilities became a lot easier using an existing framework.<br /> <img alt="BFBB_04_HD.png" height="auto" src="" width="auto" /><br /> <strong>What is one thing you would like to let fans know about the game as it heads towards release?</strong><br />  <br /> Thanks for the wave of support and enthusiasm that drives us to work every day! We can’t wait to put this very special game in your eagerly waiting hands!GamesArtCommunityDesignSpongeBob SquarePantsPurple Lamp StudiosTHQ NordicMichael LuisFri, 14 Feb 2020 15:00:00 GMT Twinmotion Materials collection now available for Unreal Engine on the Marketplace. Previously only available for Twinmotion users, this material collection includes a mixture of high-quality assets such as bricks, concrete, glass, plastics, and more. Since Epic Games acquired <a href="" target="_blank">Twinmotion</a> last year and made the high-quality, easy-to-use real-time visualization solution freely available to the general public, we immediately started thinking about how we could best make it interoperable with Unreal Engine. While we’re excited to reveal more on how we’ll be integrating the two workflows together in the future, we wanted to begin bridging that gap today by offering Unreal users a free material collection that’s based on Twinmotion materials. 
There’s a wide variety of categories here including: <br />   <ul style="margin-left: 40px;"> <li>Bricks</li> <li>Concrete</li> <li>Fabrics </li> <li>Glass</li> <li>Grass and dirt</li> <li>Wood </li> <li>Plastics</li> </ul> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <img alt="HighresScreenshot00002.png" height="auto" src="" width="auto" /><br /> These rich and powerful master materials are <a href="" target="_blank">available now on the Marketplace</a>. We’ve ensured that they support the latest ray-tracing advancements and have followed best practices in building the nearly 500 PBR materials. This work includes:<br />   <ul style="margin-left: 40px;"> <li>Specific optimizations for ray tracing</li> <li>Advanced shading techniques, such as parallax occlusion mapping for materials needing relief, which is useful for surfaces like bricks </li> <li>Ability to use an object’s UVs or to use tri-planar mapping, which can assist texture alignment by automatically aligning textures on objects that might not have been given proper UV coordinates </li> <li>Ability to define real-world scale</li> </ul> <img alt="HighresScreenshot00003_wood.png" height="auto" src="" width="auto" /><br /> Unreal users will be able to use these master materials on their custom content to achieve a high level of photorealism and can combine them with their own textures and assets across any project. <br /> <br /> <br /> <a href="" target="_blank">Download the collection</a>, try it for yourself, and stay tuned for more news about how Twinmotion and Unreal will intertwine their workflows moving forward! 
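For readers curious how the tri-planar mapping mentioned above works under the hood, here is a minimal, illustrative sketch in plain Python — not Unreal's material graph or API, and all function names here are hypothetical. The idea: project the texture along each world axis, then blend the three samples using weights derived from the surface normal, so textures stay aligned even on meshes with no usable UVs.

```python
# Illustrative sketch of tri-planar texture blending (hypothetical names,
# not Unreal Engine code). A surface that faces an axis more strongly
# receives more of that axis's planar projection.

def triplanar_weights(normal, sharpness=4.0):
    """Return normalized blend weights for the X-, Y-, and Z-facing
    projections. `sharpness` tightens the transition zones between them."""
    wx, wy, wz = (abs(c) ** sharpness for c in normal)
    total = wx + wy + wz
    return wx / total, wy / total, wz / total

def triplanar_sample(texture, position, normal):
    """Sample `texture(u, v)` once per world axis and blend the results.

    `texture` is any 2D function; in a real shader this would be a
    texture fetch at UVs taken from the world-space position.
    """
    x, y, z = position
    wx, wy, wz = triplanar_weights(normal)
    return (wx * texture(y, z)    # projection along the X axis
            + wy * texture(x, z)  # projection along the Y axis
            + wz * texture(x, y)) # projection along the Z axis
```

For a surface facing straight up (normal `(0, 0, 1)`), all the weight goes to the Z projection, so the texture is sampled purely in the XY plane — which is why this approach hides the stretching that a single planar projection would produce on walls and slopes.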
<br />  ArtDesignMarketplaceRay TracingTwinmotionPierre-Felix Breton and Francis MaheuxThu, 13 Feb 2020 18:00:00 GMT Twinmotion helps KA DesignWorks fine-tune archviz designs with VR walkthroughs to provide compelling VR experiences that speed up the review process and help identify improvements to designs that could have gone unnoticed in 2D renders. KA DesignWorks has been designing buildings and interiors for the past 14 years. Based in the Roaring Fork Valley just outside of Aspen, Colorado, the architecture studio covers the full spectrum of design, including commercial and renovation, specializing in the custom residential sector.<br /> <br /> The team at <a href="" target="_blank">KA DesignWorks</a> has been using interactive archviz tool <a href="" target="_blank">Twinmotion</a> for three years, pairing it with the HTC VIVE to leverage the software’s powerful VR capability. “Still renderings are great,” says Andrew Chaloupka, Architectural Visualization/VR Specialist at KA. “Video is great. But neither can convey the sense of space and scale that VR can.”<br /> <br /> Recently, the studio worked on a project to reinvigorate an intriguing building with spectacular views of the Rocky Mountains. Harnessing the power of Twinmotion, the team dramatically sped up the client review process—and used the software’s compelling VR functionality to make improvements to the designs that could have gone unnoticed in 2D renders. <br /> <br />   <div style="text-align: center;"><iframe allow="autoplay; fullscreen" allowfullscreen="" frameborder="0" height="480" src=";title=0&amp;byline=0&amp;portrait=0" width="854"></iframe></div> <h3><strong>Seeing designs in a new light with VR</strong></h3> The White Horse Springs Lane project centered around a house that was well-known locally for its unique architecture. 
Located close to a rural road, the original house was designed in the 1990s in the style of <a href="" target="_blank">Ricardo Legorreta</a>, and the structure felt incongruous with its surroundings.<br /> <br /> KA DesignWorks’ new design for the property began as a conceptual project, and evolved into a development proposal to reinvent the building. A number of interesting features, including multiple exterior rooms formed by the various wings of the plan, along with stunning mountain views, meant there was huge potential for development. <br /> <img alt="Spotlight_KADesignworks_blog_body_img5.jpg" height="auto" src="" width="auto" /><br /> The studio’s plans for the property aimed to open up the house to the impressive mountain vista, create a stronger connection to its context through a new palette of materials and textures, and make the most of the outdoor spaces to increase the usable area of the plot. Deliverables included still images as well as video, which were to be used primarily to convey interior design direction, as well as to satisfy marketing requirements to drum up interest in the project. <br /> <br /> For all its projects, KA starts by building a highly detailed and coordinated 3D model using Graphisoft’s <a href="" target="_blank">ARCHICAD</a> BIM software. The team can then import this model into Twinmotion to offer VR design review using the <a href="" target="_blank">ARCHICAD Direct Link</a>, which enables one-click synchronization between Twinmotion and ARCHICAD. <br /> <br /> By enabling architectural designers to go from 3D model to VR without any technical expertise required, Twinmotion puts compelling immersive walkthroughs into the hands of everyone, regardless of their previous VR experience. “The controls are very intuitive and the results are impressive right out of the box, without an enormous amount of training,” confirms Chaloupka. 
<br /> <img alt="Spotlight_KADesignworks_blog_body_img6.jpg" height="auto" src="" width="auto" /><br /> The team has always used 3D software for renderings and animations, but a lack of VR functionality meant it needed another piece of software to take projects to the next level. “When viewing a 2D rendering or floor plan, clients often think they have a good grasp on what they’re seeing,” explains Chaloupka. “But as soon as you put on the headset, they instantly have feedback when a window is too big, a hallway is too narrow, or a ceiling is too high. This type of feedback cannot be garnered in a 2D world.” <h3><strong>Revealing areas for improvement </strong></h3> The ability to assess designs at human scale in an immersive environment ended up having a huge impact on the White Horse Springs Lane project. In one instance, the designers were evaluating the use of glass in a detached section of the house that was originally used as a caretaker’s unit. <br /> <br /> The initial design simply called for the replacement of the existing windows. “Using Twinmotion’s VR capability, it became clear that a much grander statement could be made with larger expanses of glass,” says Chaloupka. “By increasing the glass and connecting the unit to the main structure, we were able to create a multi-level spa that truly stood out.”<br /> <img alt="Spotlight_KADesignworks_blog_body_img3.jpg" height="auto" src="" width="auto" /><br /> In a second review session, the stakeholders were evaluating the pool. While the initial design started with just a simple lap pool, it became clear upon experiencing the design in VR that it needed something else. “The final design incorporated an integrated hot tub and multiple tiers in the pool,” says Chaloupka. 
“We were able to actually ‘get in the pool’ to place and scale all of the individual elements, as if we were on site.” <h3><strong>Improved ROI and reduced feedback loops</strong></h3> As well as improving the final designs, using VR for client and stakeholder reviews has had positive repercussions for the bottom line on KA’s projects—both for the studio and its clients. “Rather than going back and forth with clients on certain details, one trip through the project in VR brings up any issues instantly which can be resolved there and then,” says Kenneth Adler, Principal Architect at the studio. “This obviously saves vast amounts of time, which in turn saves money for both the client and ourselves.”<br /> <img alt="Spotlight_KADesignworks_blog_body_img7.jpg" height="auto" src="" width="auto" /><br /> What’s more, clients are increasingly enthusiastic to experience these VR walkthroughs. “We noticed that once they had given it a try, a large percentage of our clients wouldn’t leave a subsequent meeting without putting on the goggles,” says Chaloupka. “Once they saw the potential, they were hooked.” The studio has even been contacted to set up demonstrations for other architectural firms curious about the process, hardware, and software. <br /> <br /> For Chaloupka, one of the most exciting things about using VR in architectural design is how quickly the technology is advancing—and how the barriers to entry have lowered. “Now that the hardware can actually keep up with the vision of the designer, at a reasonable cost, it can become the tool it has always wanted to be,” he says. “With the near future already looking so exciting, I can only imagine what we’ll be doing in 10 or 20 years, and I can’t wait.”<br /> <br /> <br /> Want to create VR walkthroughs of your own architectural designs in just a few clicks? 
<a href="" target="_blank">Download Twinmotion</a> for free through early 2020!<br />  ArchitectureDesignTwinmotionVisualizationVRKA DesignWorksWed, 12 Feb 2020 16:30:00 GMTWed, 12 Feb 2020 16:30:00 GMT Invisible Walls creates asymmetrical multiplayer game with no previous Unreal experience Copenhagen, Denmark-based developers discuss how specific tools effectively served their team and explain how the switch to Unreal Engine allowed them to develop for all platforms.<a href="" target="_blank"><em>First Class Trouble</em></a> is an asymmetric multiplayer game that blends cooperation with deception, as players try to figure out who they can trust while they fight to survive aboard a luxury spaceship that’s had an unceremonious A.I. uprising.<br /> <br /> Players take on the role of one of the last remaining survivors on this ship as it experiences severe technical difficulties. Chief among these problems is that both the A.I. and onboard robotic servants have decided to kill every human onboard. They’ve already wiped out most of the passengers by simply turning the air off, but a lucky few with Oxygen Rebreathers survived.<br /> <br /> The band of surviving players must now work together to try and reset the A.I. to revert it to a non-homicidal state, but all is not as it seems. Some of the survivors are robots in disguise, and cooperation is key to success, but trust will be in short supply as players attempt to figure out who the robots in their ranks are.<br /> <br /> <em>First Class Trouble</em> has gone through a handful of changes throughout development. We spoke with Creative Director Sebastian Hurup Bevensee, who elaborated on the tools and processes that are helping studio Invisible Walls develop the game. 
<div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong>What was key to <em>First Class Trouble’s</em> conceptualization and development?</strong><br /> <br /> <strong>Creative Director Sebastian Hurup Bevensee: </strong>Our concept has gone through a pretty hard birth so far. We actually built a prototype in the first couple of months, which on paper seemed fun, but when we played it in a more complex setting, it proved to offer very little replayability and wasn’t that engaging. What we wanted to facilitate was a high degree of information exchange, because that is, at its core, trust-inducing. However, the mechanics and design we had chosen did the opposite and were, at points, almost boring. It hit us hard because we had to decide whether it was a sunk cost or if it was merely a question of small changes to our design. In the midst of that process, we identified a lot of the flaws that didn’t make it appealing, and we started to recognize those elements and design choices that would make it a fun game to play. <br /> <br /> Now, the design is a lot better, and with the help of the Unreal Engine team, we have had the opportunity to test it with audiences in <a href="" target="_blank">Cologne</a> and at <a href="" target="_blank">Nordic Game</a>. Testing with non-biased players and communicating with your end users is at the core of creating a great experience. <br /> <br /> However, whenever you are doing anything artistic, it is natural to doubt yourself along the way. It seems to be a flowing motion; at times, we can get very excited about how well it works. We then work on it, test it, play it, and then get “tired” of it, which causes us to start doubting it. So, up until release day, we’ll be unsure of whether we have made something that is appealing. It ultimately comes down to whether the audience likes it. 
And we will try to involve them as much as we can along the way.<br /> <br /> <strong>Did prior experience with Unreal Engine inspire some of the decisions that went into the game? </strong><br /> <br /> <strong>Bevensee: </strong>Funnily enough, it did not. We started using UE with this project and we were really surprised by how fast we learned to use it. The flexibility and openness of the engine made it easy for us to tailor the engine to achieve our visual and gameplay goals. Even in a project like <em>First Class Trouble</em>, where the game changed a lot from its early stages, we really appreciated how the engine adapted to the different needs of the project.<br /> <br /> Previously, in our other projects, we had sometimes felt a bit hindered by the specific game engine, which at times negatively influenced our decision making. Now we can develop for all platforms and know that the end result will function as intended.<br /> <img alt="FCT-shot-5.png" height="auto" src="" width="auto" /><br /> <strong>What was the team’s history with other engines prior to this project? Why was Unreal Engine a good fit?<br /> <br /> Bevensee:</strong> We have previously worked primarily with CryEngine, where we developed our previous game <em>Aporia: Beyond the Valley</em>. We really enjoyed our partnership with Crytek, but it became quite evident that UE4’s development was at a different level. So, when we were done with the project, we looked at our options. A lot of people would expect us to use Unity because we are a Danish studio coupled with Unity being [founded in Denmark]. However, with the type of game we’re making, we instead decided to go with UE4 due to how well it performs. We really liked how fast-paced the implementation of the roadmap appeared to be, and the <a href="" target="_blank">Blueprints</a> system seemed to be as powerful as, if not more powerful than, the flowgraph system. 
Moreover, because we put a large emphasis on visuals, it seemed like UE4 offered the fastest and most affordable way to reach our visual goals.<br /> <br /> <strong>What Unreal tools helped the team realize that visual fidelity?<br /> <br /> Bevensee: </strong>We have extensively used <a href="" target="_blank">UE’s post-processing effects</a> to achieve our unique visual style. Moreover, the <a href="" target="_blank">Material</a> system and shaders really help create a unified and coherent visual presentation across the board. The mere fact that what the engine shows you in the editor is what the player sees in the game makes it tremendously easy to achieve the visual fidelity. Even a novice designer can create something beautiful effortlessly.<br /> <br /> Furthermore, <a href="" target="_blank">lighting</a> and VFX in UE4 have an incredibly low performance cost on hardware, which means we can reach an even higher level of aesthetics while creating a deep and rich presentation for players across different graphical specs. It is important to us that players and audiences have a sense of awe and wonder when they traverse the space cruise ship, because on a space cruise ship, it should be almost magical. Filling the ship with procedural light and particle effects really helps in achieving that feeling. <br /> <img alt="FCT-shot-2.png" height="auto" src="" width="auto" /><br /> <strong>Beyond social interactions, what are the other ways players will be able to interact with their surroundings as they work to find the robotic Personoids and human residents? <br /> <br /> Bevensee: </strong>This is a game about social interaction, but we didn’t want to just build a 3D tabletop board game where people stand around and discuss who to get rid of. There are other types of games in this genre, which focus a lot on stealth, disguise, and [dispatching your enemies]. 
We really wanted to keep the core experience centered around trust, because that makes the betrayal or deceit that much more powerful. So, a lot of the interactions are created to build trust. You will need to work together to open doors in some areas, and you need to put your life in the hands of others, such as with the “emergency airlock,” in order to get items that let you progress in the game. You need to replenish your oxygen tank. You need to pick up passenger logs that give you information about other players and you will need to explore the areas for key cards that allow the whole team to progress. These mechanics will require you to work with other people and so facilitate a social gaming experience.<br /> <br /> Besides the core mechanics, we want the game to emanate a 1950s leisure cruise [motif]. This means lots of cigars, cigarettes, and alcohol all over the place; because if you are on a cruise to the stars, you should have a good time on the way there, and those things are synonymous with the 1950s. For instance, you can pick up champagne bottles and knock other players unconscious just for the fun of it, or you could use them to your advantage. When people die, they aren’t immediately taken out of the game; instead, they become one of the many robotic vacuum cleaners that roam the ship, knocking over furniture in front of the other players and generally messing about, while still being able to follow the rest of the game.<br /> <br /> Of course, the environments will have unique mechanics that fit with the given environment; for example, playing dress-up at the mall or getting drinks at the ship’s bar.<br /> <img alt="FCT-shot-6.png" height="auto" src="" width="auto" /><br /> <strong>What inspired the decision to build this social experience within a sci-fi world?<br /> <br /> Bevensee:</strong> Our first game, <em>Aporia: Beyond the Valley</em>, was an ambitious semi-open world adventure puzzle game with a big emphasis on story. 
We think that, coming from the Nordics, we have a strong tradition of storytelling. We are a team who really likes world-building. It’s sort of in our DNA.<br /> <br /> Science fiction is a great genre, because it allows you to explore both the dystopian and utopian aspects of the human condition, without it becoming too “academic.” Sci-fi has a great storytelling tradition with so much inspiration to draw upon. It allows us to create imaginary worlds that you can traverse around in. The genre is not just a backdrop; instead, for us, science fiction is a sandbox that opens up endless possibilities. It has allowed us to elevate the gameplay so you can be serious without losing the thrill that [you’re] playing a game. Trust and deceit is a pretty serious topic when it comes down to it, but with <em>First Class Trouble</em>, we, first of all, want to create a super entertaining game that facilitates having a lot of fun with your friends.<br /> <br /> Moreover, we wanted to create an experience where your appearance matters. The 1950s is a very expressive time period with very well-defined social roles, which works great for a game like this. The ’50s also stand as a very civilized period in our society (which, of course, it wasn’t). And we wanted to play around with those structures and dichotomies of civilized feeling vs. brutal mob-like behavior.<br /> <br /> The sci-fi element helps us in every part of the design, as we can bend all the rules. When we play games, we do it to escape. Hopefully, we are on the path to doing that. A lot of the fun in playing a game is the ability to be someone or do something you can’t do in real life. <br /> <br /> <strong>Are there any key Unreal Engine development tools that were instrumental in realizing <em>First Class Trouble’s</em> ideas?<br /> <br /> Bevensee:</strong> Blueprints! The Blueprints system has been one of the decisive factors when developing <em>First Class Trouble</em>. 
As a studio, we are quite starved for dedicated back-end programmers and therefore need the capability for any member of the team to design and implement their own ideas into the game. We are comprised of a lot of generalists on our team. Therefore, our knowledge, which is basically “a little bit of everything,” is really helped along by the Unreal Engine Blueprints system, as it takes our “generalist” knowledge and visualizes it for us. It presents it on the screen clearly and makes it easy to forge new connections. It is a highly versatile development tool throughout the prototype and production phases. Its ease of use and the speed with which you can combine the different elements of the engine help us achieve our overall design goals. <br /> <br /> Another instrumental UE feature was the network model, because with <em>First Class Trouble</em>, we ventured into something we hadn’t done before; namely, real-time online multiplayer. As a team, we have experience creating single-player games, and moving into multiplayer was, to put it bluntly, nerve-wracking. Going from a state where everything seamlessly worked as intended to having to work with clients, servers, replication, and so forth really was a daunting task, but when we got a general understanding of it all, we really benefited from the way UE4’s <a href="" target="_blank">networking</a> is set up. UE4, being built around the client-server model, already includes a basic framework for networked games that gave us a solid foundation to work with. It’s quite easy, in relative terms, to reach the goals of our game design given how everything is connected and well-documented.<br /> <img alt="FCT-shot-3.png" height="auto" src="" width="auto" /><br /> <strong>How has <em>First Class Trouble </em>evolved since starting as Project Cainwood? 
What inspired the official name?<br /> <br /> Bevensee: </strong>Going from Project Cainwood to <em>First Class Trouble </em>has been a long and winding path. Initially, the game was a cabin/ranch-in-the-woods title where you played as different flat-colored characters. This is where the name Cainwood originated, i.e., “CAbIN in the WOODs,” and Cain was the biblical character who killed his brother. So, it was a great working title. However, it mostly functioned as a minimum viable prototype for the core concept of a game based on trust and deceit. We then talked for weeks and decided to transform the game into a more arctic setting, à la “The Thing.” However, the eerie, action-packed, and more horror-oriented direction eliminated a lot of things that we felt were imperative for the vision we were looking for. For example, when people are scared, they are not really communicating. As we transitioned into the new setting, we knew we had to change the name of the game. We kept CAIN as the name of the “Central Artificial Intelligence Network,” because it&#39;s still a great reference to the ultimate betrayal between two brothers.<br /> <br /> <strong>Considering <em>First Class Trouble’s</em> art direction features a lot of retro-future designs, was it at all influenced by the Tex Avery cartoon “The House of Tomorrow?”<br /> <br /> Bevensee:</strong> Quite early in the development, we fell in love with the creative possibilities that science fiction could provide for the game. Science fiction, as a genre, opens up to a lot of world-building, which we as a team love. From a historical perspective, it serves as a great genre to frame more serious topics in an exciting and thrilling way. We started to envision how our space future could look, and how other people have tried to predict the future in the past. 
It quickly became evident that the future is not progressing as fast as people predicted.<br /> <br /> Modern technology is far less integrated into everyday objects than was imagined in a lot of sci-fi. In our homes today, technology sits among classic interior design; a table, for instance, is still a pretty solid concept and hasn’t changed that much. This is where we saw a unique take on the art style and overall feel. We started to think about how a 1950s technological vision could be paired with some of the many traditional design aesthetics still used today. As a Danish developer, this meant drawing on some of our renowned design legends as an influence for the interiors. This is also exactly where we fell in love with classics such as “The House of Tomorrow,” but also the “Westinghouse All-Electric House,” Jacques Tati’s “Mon Oncle,” General Motors’ vision of the future highway, “Key to the Future,” Disney’s “Monsanto House of the Future,” the EPCOT Center, and so forth.<br /> <br /> What “The House of Tomorrow” perfectly captures is the marriage of humor and technological innovation. We strive to draw as many mechanics and design elements as we can from that combined world; of course, the humor in <em>First Class Trouble</em> is not quite as satirical as it is in “The House of Tomorrow,” but it is important to us that the actions you carry out allow for humorous role-play between players. For example, the spectating robotic vacuum cleaners that knock over furniture in front of other players.<br /> <img alt="FCT-shot-12.png" height="auto" src="" width="auto" /><br /> <strong>Where are all the places people can go to learn about <em>First Class Trouble</em>?
</strong><br />   <ul style="margin-left: 40px;"> <li><a href="" target="_blank">Twitter</a></li> <li><a href="" target="_blank">Discord</a></li> <li><a href="" target="_blank">Facebook</a></li> <li><a href="" target="_blank">YouTube</a></li> </ul> Games | Art | Blueprints | Community | Design | Invisible Walls | First Class Trouble | Charles Singletary Jr. | Fri, 07 Feb 2020 19:00:00 GMT<br /> <br /> <strong>Cornell University partners with industry on a new approach to urban design and planning</strong><br /> Cornell University is working with Kohn Pedersen Fox, FXCollaborative, Esri, and Epic Games to develop a new collaborative real-time design solution that will change the way the next generation of architects and designers approaches urban design and planning.<br /> <br /> The world of architectural design is evolving at an unprecedented rate. The next generation of architects and designers is growing up with real-time technology, and it’s beginning to affect every aspect of their lives—not just in the living spaces they’ll one day create, but in the way they shop, play, learn, travel, and communicate.<br /> <br /> Real-time technology is changing how designers view and interact with concepts. The new ability to test-drive a structure at the earliest stages of design—and influence its evolution as a result of that experience—leads to rapid iteration and exploration of the design envelope. It has the potential to foster greater creativity and to expose new possibilities.<br /> <br /> To make sure all this power is harnessed in the most beneficial way, established industry leaders and technology providers need to come together with a united vision. A great example of this is a joint initiative called Virtual Places, the brainchild of leading educator <a href="" target="_blank">Cornell University</a> and industry powerhouses <a href="" target="_blank">KPF</a> (Kohn Pedersen Fox Associates) and <a href="" target="_blank">FXCollaborative</a>, all based in New York.
<div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> The core objective of the Virtual Places project, which also receives support from HP, is to develop an Unreal Engine-based toolset that will enable architects and designers to quickly design virtual urban spaces that users can experience collaboratively in real time. <br /> <br /> “Cities are getting denser and denser as we are going away from a planning paradigm that was made for cars, to a kind of much more human experience with walkable cities, bikeable cities,” says Timur Dogan, Asst. Prof., Dept. of Architecture, Dir. Env. & Systems Lab at Cornell University. “I think these public spaces are really becoming much more important.”<br /> <br /> The project required a way to generate cities procedurally. Instead of trying to create an entirely new tool for this, the team turned to <a href="" target="_blank">Esri CityEngine</a>, an existing parametric building creator already used for city-scale projects. The company was happy to lend its support, offering technical assistance, licenses of CityEngine, and sample projects. <br /> <img alt="Spotlight_ProjectWren_blog_body_imgCornell_1.jpg" height="auto" src="" width="auto" /><br /> To oversee the technical aspects of the project, and to ensure it met the stakeholders’ requirements, the team relied on the expertise of locally based VR and AR platform company <a href="" target="_blank">The Glimpse Group</a>. Working with Esri and Epic, Glimpse created a plugin to bridge CityEngine and Unreal Engine. Glimpse also enhanced Unreal Engine’s <a href="" target="_blank">Collaborative Viewer</a> template to enable multiple people to simultaneously experience and edit parametric buildings in both desktop and VR. <br /> <br /> The ability to see fully rendered, photorealistic results in VR enables users to really experience their environment as they make changes. 
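To make the idea of parametric buildings concrete: in a CityEngine-style system, a rule takes a lot footprint plus a handful of parameters and regenerates the whole massing whenever a parameter changes. The sketch below is our own minimal, engine-agnostic illustration of that concept in Python; the function and parameter names are hypothetical, and this is not CityEngine's actual CGA rule language:

```python
# Minimal sketch of a parametric building rule: a footprint and a few
# parameters deterministically produce a stack of floor slabs, so editing
# one parameter (say, total height) regenerates the entire massing.

def generate_building(footprint, total_height, floor_height=3.5):
    """Return one slab per floor as (footprint, base_z, top_z) tuples."""
    floors = max(1, int(total_height // floor_height))
    slabs = []
    for i in range(floors):
        base = i * floor_height
        slabs.append((footprint, base, base + floor_height))
    return slabs

# A collaborative editor only needs to re-run the rule on a parameter change.
lot = [(0, 0), (30, 0), (30, 20), (0, 20)]  # simple rectangular lot (meters)
massing = generate_building(lot, total_height=35.0)
print(len(massing))  # number of floors generated -> 10
```

Changing `total_height` to 70.0 immediately doubles the floor count; that kind of instant, rule-driven regeneration is what makes it practical for several people to adjust a shared urban model in real time.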
Henry Richardson, Professor at Cornell University, understands the importance of this experiential quality. <br /> <br /> “The buildings we can make, we can develop them parametrically, we can change them and so on,” he says. “But it&#39;s the atmosphere, the ambience, the feel of the place, the meaning-making symbolisms, and so on, that really are the key to this process.”<br /> <img alt="Spotlight_ProjectWren_blog_body_imgEsri2.jpg" height="auto" src="" width="auto" /><br /> For the industry stakeholders, the project offers insights into how various sources of design information and environmental constraints could be used to explore urban projects in entirely new ways.<br /> <br /> “The first iteration of this tool is that you can collaboratively adjust and, in real time, understand what those changes in the urban environment will feel like,” says Alexandra Pollock, Principal and Director of Design Technology at FXCollaborative. “It also allows you to get into the space as an occupant to be able to look around, to walk around, to understand how that space may function.”<br /> <br /> Luc Wilson, Senior Associate Principal and Director at Kohn Pedersen Fox, agrees. “I&#39;m hoping we&#39;re able to link this to the simulation and quantification we&#39;re doing in a one-to-one manner,” he says. “We can say a place is going to be well daylit, but it&#39;s better if someone actually goes there and says, ‘Oh no, this is actually not well daylit, the space over here is.’ ”<br /> <br /> Cornell has wholeheartedly embraced the project, spinning up an associated class in record time—in parallel with the continuing research and development efforts—and even going as far as to establish a new VR Lab in the School of Architecture.
These efforts have been met with unbridled enthusiasm.<br /> <img alt="Spotlight_ProjectWren_blog_body_imgCornell1.jpg" height="auto" src="" width="auto" /><br /> “The response is pretty much always positive from students,” says Martin Miller, another professor on the faculty. “They&#39;re always excited to see new things, and they see the potential in these new tools and new techniques.” <br /> <img alt="Spotlight_ProjectWren_blog_body_imgCornell_2.jpg" height="auto" src="" width="auto" /><br /> “They&#39;re really thinking critically and creatively about how that technology can actually be a catalyst to transform how we use it in the industry,” agrees Pollock. <br /> <br /> The project is garnering attention in the wider industry, with firms like Zaha Hadid Architects, HOK, SHoP Architects, Jacobs, CRTKL, and Woods Bagot all showing keen interest. <br /> <br /> “Design thinking is going through a paradigm shift,” Richardson says. “The firms that embrace this are going to be ahead. This research we are doing actually will help to mitigate risk. These technologies are going to really help architects to revamp the way they work, and we are very happy to be part of developing the know-how and the tools for making it happen.”<br /> <img alt="Spotlight_ProjectWren_blog_body_imgUE.jpg" height="auto" src="" width="auto" /><br /> The team is keen to ensure that schools and other firms have free access to the research work resulting from this project, and Esri and Epic are collaborating to maintain the plugin and template as both CityEngine and Unreal Engine continue to evolve. Interested parties will be able to access the plugin code base on the <a href="" target="_blank">community site</a> Esri has set up for the project, which they’ve dubbed Vitruvio.
The CityEngine run-time DLL is free for non-commercial use; for details, refer to the Esri SDK <a href="" target="_blank">license</a>. The Collaborative Viewer template is included with Unreal Engine.<br /> <br /> <br /> Want to use real-time technology to get ahead in your industry? <a href="" target="_blank">Download Unreal Engine</a> today, and visit our <a href="" target="_blank">Unreal Online Learning</a> hub to access over 40 hours of easy-to-follow self-paced video courses—all for free.<br /> <br /> If you’re an educator looking to bring Unreal Engine into your classroom, visit our <a href="" target="_blank">Educators page</a> to access free Epic-approved curricula, projects, content examples, documentation, and more.<br />  Architecture | VR | DXR | Design | Education | Cornell University | Kohn Pedersen Fox | FXCollaborative | Esri CityEngine | The Glimpse Group | Ken Pimentel | Fri, 07 Feb 2020 14:00:00 GMT<br /> <br /> <strong>Real-time ray tracing in Unreal Engine - Part 4: media and entertainment</strong><br /> From cinematic-quality gameplay to breathtaking live broadcasts, real-time ray tracing is bringing new levels of realism and immersion to interactive media and entertainment projects, as well as delivering new efficiencies for linear content production.<br /> <br /> In the final part of our series looking at the <a href="" target="_blank">evolution of real-time ray tracing in Unreal Engine</a>, and what the technology brings to various markets like <a href="" target="_blank">architecture</a> and <a href="" target="_blank">automotive</a>, we dive into the world of media and entertainment: games, film, television, broadcast, and live events.<br /> <br /> As we’ve previously discussed, the key benefits of ray tracing over rasterization lie in more accurate dynamic shadows, whose softness varies depending on distance
from the casting object, and more accurate and richer reflections that correctly capture off-screen and moving elements. Together with other ray-traced effects like global illumination, ambient occlusion, and translucency, these offer a much more photorealistic result that helps viewers suspend disbelief and fully buy into virtual worlds. <br /> <br /> A further benefit of ray tracing is that setting up shadows and reflections is significantly faster and easier, eliminating the need for techniques like light baking and the manual placement of reflection probes and planes.<br /> <br /> Perhaps surprisingly, ray-traced shadows can actually be <a href="" target="_blank">faster to compute</a> than rasterized <a href="" target="_blank">Cascaded Shadow Maps</a>, improving performance in cases where shadows are required for moving objects, such as trees with leaves that sway in the wind.<br /> <br /> So let’s take a look at how real-time ray tracing is already benefiting media and entertainment projects specifically.<br />   <h3><strong>More believable real-time entertainment</strong></h3> From AAA games to live broadcast, real-time ray tracing is just starting to move from technology demo to commercial offering, and we can expect to see many more examples over the coming months.<br /> <br /> One of the first titles to offer players in-game real-time ray tracing is <a href="" target="_blank">Deliver Us The Moon</a>, a sci-fi thriller from Dutch indie developer KeokeN Interactive set in an apocalyptic near future. Despite having almost completed the game when real-time ray tracing became available, the team was able to retroactively <a href="" target="_blank">enable ray tracing for shadows and reflections in Unreal Engine</a>, tweaking materials and lighting setups to get the desired look.
The RTX-enabled update to the original game, which also supports <a href="" target="_blank">NVIDIA</a>’s performance-boosting <a href="" target="_blank">DLSS</a>, launched in December 2019 and has garnered many positive reviews.<br /> <img alt="News_RayTracingME_blog_body_img_Moon3.jpg" height="auto" src="" width="auto" /><br /> Live broadcast is another area where real-time ray tracing is making rapid inroads. In September 2019, <a href="" target="_blank">The Future Group announced</a> that Riot Games had used its Unreal Engine-based AR solution <a href="" target="_blank">Pixotope</a>—along with <a href="" target="_blank">Cubic Motion</a> for real-time facial animation, <a href="" target="_blank">Animatrik</a> for motion capture, and <a href="" target="_blank">Stype</a> for camera tracking—to deliver the first live broadcast containing real-time ray tracing and real-time facial animation. <br /> <br /> The event, which took place at the <em>League of Legends</em> LPL Regional Finals in Shanghai, featured one of the game’s characters, Akali, dancing with live performers and being interviewed by a real-life presenter. Thanks to ray tracing, the quality of the shadows and subtle reflections on the character differentiates this live-to-air broadcast offering from previous live shows of a similar nature.<br /> <img alt="News_RayTracingME_blog_body_imgFutureGroup.jpg" height="auto" src="" width="auto" /><br /> At <a href="" target="_blank">IBC</a>, also in September, Dutch broadcast services and media solutions provider <a href="" target="_blank">NEP The Netherlands</a> demonstrated a <a href="" target="_blank">real-time ray-traced virtual studio</a> powered by <a href="" target="_blank">Zero Density’s Reality Engine</a>, another software offering that uses Unreal Engine’s renderer.
The studio showed how reflections from offscreen objects, made possible by ray tracing, significantly increase the set’s credibility.<br /> <img alt="News_RayTracing_blog_body_img_NEP4.jpg" height="auto" src="" width="auto" /> <h3><strong>Final-pixel output in less time</strong></h3> While real-time performance is not a necessity for linear content—such as episodic television, commercials, and game trailers—the ability to render at blazingly fast speeds and achieve results previously only possible from offline renderers is a boon.<br /> <br /> Creative agency <a href="" target="_blank">Capacity</a> recently delivered its <a href="" target="_blank">third installment of an intro cinematic</a> for the <a href="" target="_blank">Rocket League Championship Series (RLCS)</a>, the official competitive league of the hit sports-action game, Rocket League. As with the previous two intros, the team used the original game assets provided by developer <a href="" target="_blank">Psyonix</a>, increasing their resolution and retexturing them for cinematic fidelity and added realism. This time, however, due to the advent of ray tracing in Unreal Engine, the team was able to move the project to real-time rendering instead of using an offline rendering solution. <br /> <img alt="News_RayTracing_blog_body_img_RocketLeague3.jpg" height="auto" src="" width="auto" /><br /> As well as being able to achieve the requisite high-quality lighting and accurate reflections at incredible speeds, the team identifies an enhanced creative process as a key benefit.<br /> <br /> "We&#39;re thrilled about what the real-time workflow has done for our creative process in terms of iteration and being able to quickly adapt to feedback,” says Ellerey Gave, Creative Director at Capacity. “And with the quality that we&#39;re able to achieve through real-time ray tracing and practically instantaneous renders (compared to traditional pre-rendered pipelines), it&#39;s quickly become our preferred way to work." 
<br /> <img alt="News_RayTracing_blog_body_img_RocketLeague4.jpg" height="auto" src="" width="auto" /><br /> Capacity is not alone in employing real-time ray tracing in this way. Part of what makes Unreal Engine different from other real-time engines is that we “eat our own dog food,” production-testing features on our own internal projects and portfolio of published titles. The Fortnite team at Epic Games has been putting Unreal Engine’s ray tracing implementation through its paces since its inception, and is currently using it for trailers and cinematics. The Fortnite Season 11 Chapter 2 trailer is a great illustration of this. It’s also an interesting example of the fact that, even for stylized content that doesn’t attempt to be photorealistic, ray tracing lends an extra layer of richness that draws the viewer into the world. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> However, when photorealism is the aim, it’s fascinating to see what’s possible. In <a href="" target="_blank">this example</a>, artist Sertac Tasdemir experimented with recreating the Batmobile from <em>The Dark Knight</em>, aiming to emulate the film’s lighting to the greatest possible extent. As Senior Editor of community website <a href="" target="_blank">80 Level</a> Arti Sergeev says: “The results are kind of amazing, and it makes you think about the future of the industry, and the way engines like Unreal Engine could be used in film and animation projects.”<br /> <img alt="News_RayTracingME_blog_body_img_Batmobile.jpg" height="auto" src="" width="auto" /> <h3><strong>Accelerated lighting and look development</strong></h3> It’s not just in rendering the finished frames that real-time ray tracing speeds things up. 
When used in the lighting and look development process, Unreal Engine enables lighters to see their end results directly in the editor, and make adjustments on the fly to refine their look in real time.<br /> <br /> With real-time ray tracing, the requirement to bake lights—a time-consuming, iterative process that requires many adjustments to UVs and settings to achieve a high-quality result—can, in some cases, be completely eliminated. A real-time ray tracing solution provides an immediacy that enables lighters to make the most of their creative talents.<br />   <h3><strong>More accurate techvis </strong></h3> Real-time ray tracing can also be used to good effect during shot planning, as we discovered when working on a project with Lux Machina, Magnopus, Profile Studios, Quixel, ARRI, and DP Matt Workman to test-drive our new virtual production toolset.  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> Revealed at SIGGRAPH 2019, one of the things the project demonstrated was how LED walls can not only provide environments for real-world actors and props, but also light them and cast reflections on them, making it possible to capture final pixels in camera. <br /> <br /> Set in a desert environment at sunset, the shot featured an actor approaching and mounting a motorcycle, and donning a pair of reflective sunglasses. The aim was to achieve the best-looking shot with accurate reflections from the LED walls.<br /> <img alt="News_RayTracingME_blog_body_imgQuokka.jpg" height="auto" src="" width="auto" /><br /> As CG Supervisor on the project, I used real-time ray tracing to plan where exactly the motorcycle and actor needed to be placed in relation to the LED wall and the camera. This enabled me to quickly see how I could get the best reflections on the subjects, and to judge how much light I was going to get from the LED walls. 
I was able to validate that the layout of the LED panels was giving me the most reflection coverage on the motorcycle and the actor’s glasses.<br /> <img alt="News_RayTracingME_blog_body_imgTechviz.jpg" height="auto" src="" width="auto" /><br /> You can get a more in-depth behind-the-scenes look at the entire project in <a href="" target="_blank">this blog post</a>.<br />   <h3><strong>This is just the beginning! </strong></h3> As we embark on a new decade, there’s little question that technologies like real-time ray tracing will continue to evolve at an incredible pace, and that virtual worlds will edge ever closer to being indistinguishable from physical worlds. It’s exciting to imagine the new and innovative ways in which creators will harness this power to change the way we live, work, and play, and we’re thrilled to be part of that journey.<br /> <br /> <br /> Ready to see what real-time ray tracing can bring to your media and entertainment projects right now? Download <a href="" target="_blank">Unreal Engine</a> today.<br /> <br /> This article is part of a series on real-time ray tracing in Unreal Engine. You can also read <a href="" target="_blank">Part 1: the evolution</a>, <a href="" target="_blank">Part 2: architectural visualization</a>, and <a href="" target="_blank">Part 3: automotive design and visualization</a>.<br />  Film & Television | Broadcast & Live Events | Games | Virtual Production | Ray Tracing | Juan S. Gomez | Thu, 06 Feb 2020 20:00:00 GMT<br /> <br /> <strong>A comprehensive guide to creating 360-degree game trailers using Unreal</strong><br /> Marketing Manager Renee Klint and Technical Art Director John Cruz walk users through how to create 360-degree game trailers that effectively portray the immersiveness of VR on 2D screens.<br /> <br /> As with most creative rabbit holes, we fell into the concept of 360 trailers by asking questions. What is the best way to represent a VR experience in non-VR mediums? How can we extend our immersive expertise to our game promotions?
Trailers are the most powerful promotional tool for games; can we push a trailer to be just as immersive and engaging as the VR experience itself?<br /> <br /> We at Archiact were planning the announcement of our recent VR adventure experience, <a href="" target="_blank">FREEDIVER: Triton Down</a>, which is available now on <a href="" target="_blank">Steam</a> and the <a href="" target="_blank">Oculus Store</a>, and were nosing around for the best way to spread the word about this game we’d made and loved. When our 360-degree teaser trailer went live, it not only kicked off the most successful announcement week in our studio’s history; the video itself shattered every record our previous trailers had ever set. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> Best of all, no fancy third-party tech or expensive program licenses were needed: we accomplished all of this using Unreal Engine with only a small team of three developers spending a few hours on setup, plus another day or so for render time. The workflow was amazingly smooth, and we’d love to share it with you.<br /> <br /> If you’re wondering whether 360 video might be right for your VR game promotions, this post will walk you through the technical steps within Unreal to film, record, and render your in-game content in 360 degrees. We’ll also share some of the lessons we learned along the way about the less-tangible side of 360 content creation, such as non-linear storytelling, viewer engagement, and more. <br />   <h2><strong>Okay, But Why 360?</strong></h2> For VR developers, the problem is a familiar one: how can you accurately convey just how amazingly immersive and interactive your VR game is, when the vast majority of your marketing materials will be experienced on a 2D screen?
Most of us stick to what we know, and rely on creative ways to portray a three-dimensional experience through a 2D medium, largely in the form of trailers, screenshots, and GIFs.<br /> <br /> We knew right away that FREEDIVER needed more than that. Between the intense underwater environments and the use of one-to-one gestural swimming locomotion, it was screaming for a promotional asset that matched its immersive chops: 360 video seemed like a good place to start.<br /> <br /> There was just one catch for us: we had never made 360 content before. <br /> <img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG02.png" height="auto" src="" width="auto" /> <h2><strong>Who Else is Using 360 Content?</strong></h2> What’s the first thing you do when you’re about to do something for the first time? See what everyone else has already done!<br /> <br /> Somewhat surprisingly, only a handful of VR games have chosen the 360 format for their promotional videos. The first and most prominent example is the trailer for <a href="" target="_blank">The Climb</a>. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> The Climb had an advantage here: because its gameplay is straightforward and relatable (it&#39;s literally summarized by the title), the team didn&#39;t need to spend too much time establishing features or mechanics. What we noticed right away in their trailer is the sense of presence: a big part of The Climb&#39;s appeal is the incredible vistas you’re rewarded with during and at the end of each climb, and the way this trailer is shot encourages the viewer to really look around and take in the beautiful scenery. You want to be there, right now—the act of climbing is almost secondary, and that’s okay.
<div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <a href="" target="_blank">Arizona Sunshine’s</a> 360 trailer takes advantage of both space and time. While you’re whisked from scene to scene in this apocalyptic tableau, time is slowed down enough to give the impression that it is all happening at once. It lends a real sense of chaos to the view, and sets up expectations for a game that will drop you right in the middle of that intense whirlwind. Interestingly, the storytelling here is quite linear: where The Climb&#39;s trailer rewards the viewer no matter where they choose to look, Arizona Sunshine’s trailer still focuses the action right in front of the player, and there isn’t much else to see beyond the immediate action you’re served. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> Last, but not least, is the 360 trailer for <a href="" target="_blank">Psychonauts: The Rhombus of Ruin</a>. At a runtime of 93 seconds, this trailer appears to be pre-rendered, and is almost entirely story-based. The use of binaural audio here is key: as the viewer turns their head to explore the spaceship scene, the direction of the character&#39;s voice always tugs them back to center. One challenge this trailer does highlight is how to tell a clear linear story in a non-linear format; without the use of voice-over, there are few context clues to bring the viewer into the story and, therefore, to give them a presence in the world the developers have created. It’s a tough challenge!
<h2><strong>What We Learned</strong></h2>   <ul style="margin-left: 40px;"> <li>Reward the viewer for looking around</li> <li>Presence is key, gameplay less so</li> <li>Find a way to keep your viewer’s attention centered on the most important thing, without punishing them for straying</li> <li>In other words, treat it a lot like VR! Many of the same principles apply here.</li> </ul> <h2><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG06.png" height="auto" src="" width="auto" /><br /> <strong>What Does 360 Content Need to Succeed?</strong></h2> With some new knowledge under our belts, we set the goal for our own 360 trailer. In the end, we decided the focus needed to be on tone and energy: we wanted viewers to feel what it was like to be trapped inside the hull of a sinking ship, to get a taste of the nerves and the shortness of breath that the game so wonderfully douses you in.<br /> <br /> <strong>Right away, we sketched out the following guiding elements:</strong><br />   <ul style="margin-left: 40px;"> <li>Establish the scene quickly: you are on a boat!</li> <li>Establish presence: you are a person! Look at those arms. Those are yours!</li> <li>Establish elements of anxiety: this is not a friendly place, and the swimmer is definitely in trouble</li> <li>Establish our highest-level gameplay mechanics: swimming and oxygen management</li> <li>Always give the viewer something interesting to look at</li> <li>Put the viewer in danger, then give them hope, and go out with a bang</li> </ul> <br /> From that, our first storyboard was born: the viewer opens their “eyes” in the hull of a completely flooded ship galley. They must swim to the nearby intake hatch—past their dead shipmate’s floating body—and get enough oxygen to swim down into the vent and—hopefully—to safety.
<div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG07.jpg" height="auto" src="" width="auto" /></div> <h2><strong>The Hidden Challenges of 360</strong></h2> Armed with our storyboard and a rough script, we dove into production. Right away, we faced challenges. Some were story-based: How do we direct the viewer’s eye? How long should we keep the camera in a specific location before moving on? Others were technical: How can we even record this in the way we need? What will our in-game avatar look like when you can swivel the head around in any direction? <br /> <br /> With Unreal, the road to answers for these questions was much smoother than we anticipated, and even allowed for iteration to get to that perfect end result. Here are the technical steps, one by one, for you to follow along with and/or troubleshoot your existing process. <h2><strong>Step 1: Get Equipped</strong></h2> The first step in production was to get our ducks in a row. And by ducks, we mean plugins. The one we used is the [experimental] <a href="" target="_blank">Stereoscopic Panoramic Capture plugin</a> from Unreal. Our trailer was created using a pre-4.23 version of the plugin, so be sure to check out the notes on the new version for the most up-to-date workflow! <div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPostImg.jpg" height="auto" src="" width="auto" /></div> With the plugin installed, make sure that Instanced Stereo rendering is OFF in the Project Settings. <div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG10.jpg" height="auto" src="" width="auto" /></div> Restart the Editor for the change to take effect. Then add the following execute-console-command nodes right after the Begin Play event node. Now you’re ready to load up your scene! <div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG11.jpg" height="auto" src="" width="auto" /></div> <h2><strong>Step 2: VR Mo-Cap?
Easier Than It Sounds!</strong></h2> Since presence is key, we absolutely needed to have the arms of the main character (Ren Tanaka) in every shot. That meant essentially performing motion capture inside the game itself. <a href="" target="_blank">Sequencer</a> was the best tool for this job, and we used it to record gameplay.<br /> <br /> In the FREEDIVER project, our base character is spawned only when the user plays the game. In order for <a href="" target="_blank">Sequence Recorder</a> to have a character it could record, we needed to set the GameMode Override to “GameMode” in the World Settings. <div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG12.jpg" height="auto" src="" width="auto" /></div> Next, we manually dragged the BaseCharacter into the level. <div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG13.jpg" height="auto" src="" width="auto" /></div> We selected the BaseCharacter in the level and set its Auto Possess Player property to ‘Player 0’. <div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG14.jpg" height="auto" src="" width="auto" /></div> With those set, the BaseCharacter took inputs from the controller and could be used to record Sequencer animation. From there, we opened the Sequence Recorder window... <div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG15.jpg" height="auto" src="" width="auto" /></div> ...selected the BaseCharacter, and pressed the Add button at the top of the Sequence Recorder window. <div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG16.jpg" height="auto" src="" width="auto" /></div> Now it’s time to dive into the motion capture!
Launch the Game in VR mode… <div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG17.jpg" height="auto" src="" width="auto" /></div> ...then press Shift+F1 to shift focus away from the VR window and back to the Editor while VR mode is still playing. Press the Record button at the top of the Sequence Recorder window. <div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG18.jpg" height="auto" src="" width="auto" /></div> Click back on the VR window to give it focus. You should be able to resume control over the character and camera, and you should see a countdown overlay (4, 3, 2, 1) indicating that the recording is about to start.<br /> <br /> Lights, camera, action! Once the recording begins, move about the virtual world and perform your actions as planned. Remember, every action, from head movement to controller inputs, will be recorded as Sequencer animation, so don’t forget to act the part from head to toe. <div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG19.gif" height="auto" src="" width="auto" /></div> <h3><em><strong>Tips for VR Mo-Cap:</strong></em></h3>   <ul style="margin-left: 40px;"> <li>Don’t be afraid of multiple takes. Just like capturing live action, it will take a few run-throughs to get everything right.</li> <li>Keep your head as steady as you can. Avoid swinging around wildly.</li> <li>The viewer needs two to three seconds to fully focus on a new object or action.</li> <li>If you don’t have a player avatar with visible hands, definitely consider adding one! We were amazed by how much character and storytelling was possible through Ren Tanaka’s gestures. </li> <li>Exaggerate: tiny motions may not register in the final animation sequence, so keep your arms up high and wide, and your movements slow.</li> </ul> <br /> Once the performance is wrapped, press Shift+F1 again to shift focus back to the Editor and stop the recording in the Sequence Recorder window. 
A recorded sequence will then be created. <div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG20.jpg" height="auto" src="" width="auto" /></div> Open up that sequence to see the animation track contents. <div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG21.jpg" height="auto" src="" width="auto" /></div> You can inspect the character’s body animation by right-clicking on the SM_VRPlayer animation track properties and double-clicking on the recorded animation asset. <div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG22.jpg" height="auto" src="" width="auto" /></div> <h2><strong>Step 3: Fine-tuning Your Animation</strong></h2> While our method of motion capture was effective at recording the essence of the player’s movement through the scene—timing, head movement, and general placement/interaction of hands—no IK system is perfect, and there will likely always be room for improvement in the resulting animation.<br /> <br /> Because this teaser trailer would be the VR community’s first-ever glimpse of what FREEDIVER has to offer, we wanted the animation to be perfect. It’s more than just a quality issue; it’s also a question of storytelling. First-person footage from a VR game is indistinguishable from first-person 2D game footage unless you have significant player presence in the form of hands/interaction fidelity.<br /> <br /> To tell the story of Ren’s underwater struggle for survival in the 30 or so seconds we had, ensuring that her hands were believable “actors” in their movement and interactions was key. <div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG23.jpg" height="auto" src="" width="auto" /></div> Thanks to Sequencer, you can export FBX files of your recorded gameplay motion directly to the animation software of your choice. 
In our case, that was Maya, which our animator used to fine-tune the animation keyframes recorded in Sequencer.<br /> <br /> <em><strong>Remember: </strong></em>When you export to your animation software, you will only see the player avatar, not the game world/objects around it. Since this makes orientation and fine interactions difficult, we recommend recording an additional 2D render from the viewpoint of your player avatar in Unreal and attaching it to the head bone of your animating character. <div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG24.gif" height="auto" src="" width="auto" /></div> <h2><strong>Step 4: Final Tweaks & Export</strong></h2> Reimport your polished animations into Sequencer. Now you have the chance to make any final tweaks to your in-game world. You can add lighting to guide your viewer’s eye, nudge in-game objects to better fit the frame, and even hand-animate moving elements for exact timing. This was by far one of the most useful steps in the process, and Unreal gives you the flexibility to make changes as you need.<br /> <br /> <em><strong>Remember:</strong></em> If you want visual elements that aren’t part of the game world (such as a logo in the corner, legal text, etc.) to appear in your 360 video, now is the best time to add them. This way, they won’t be “stuck” to the viewer’s gaze like a sticker on their virtual eyeball, which is a distraction and an instant immersion-killer. If a logo “floats” along with the viewer but remains in place while they look around, it’s much less intrusive.<br /> <br /> When you’re ready to capture, all that’s left is to play in Standalone Game mode and render out the sequence as it unfolds, exactly as you planned it. <div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG24.jpg" height="auto" src="" width="auto" /></div> You can select the output resolution for your render; for ultra-crisp 360 video, we recommend 8K, or 4K at a minimum. 
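To put those resolution recommendations in perspective, here is a quick back-of-the-envelope sketch of the disk space a 360 still-image sequence can consume. The 60 fps rate and stereo (two-eye) output are from this guide; the 7680×3840 "8K" frame size, the 3-bytes-per-pixel uncompressed figure, and the function name are illustrative assumptions, not numbers from the original workflow:

```python
# Rough disk-space estimate for a 360 still-image sequence (illustrative sketch).
# Equirectangular 360 frames use a 2:1 aspect ratio (360 x 180 degrees of view).
# Assumption: ~3 bytes/pixel (24-bit RGB, uncompressed) per eye.

def sequence_size_gb(width, seconds, fps=60, eyes=2, bytes_per_pixel=3):
    height = width // 2                      # 2:1 equirectangular aspect
    frame_bytes = width * height * bytes_per_pixel
    total = frame_bytes * fps * seconds * eyes
    return total / 1024**3

# A 30-second clip at an assumed 8K width (7680x3840), 60 fps, stereo:
print(round(sequence_size_gb(7680, 30), 1), "GB")  # → 296.6 GB
```

Actual sizes depend heavily on the image format and compression the capture writes out, but the order of magnitude is why it pays to budget storage before kicking off an 8K render.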
360 images for both the left- and right-eye views will be rendered out and saved individually.<br /> <br /> <em><strong>Remember:</strong></em> Because the render outputs 360 stills in sequence, there will be no attached audio. To capture your in-game soundscapes organically, use a screen recorder such as OBS or ShadowPlay to record the Sequencer events independently, then import the audio into your editor later. <div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG25.jpg" height="auto" src="" width="auto" /></div> Next, fire up your editing software of choice and import the images as a sequence. From here, you can color correct as needed and render out the final master file in the desired video format.<br /> <br /> VLC can play the resulting 8K 360 panoramic video, with full click-and-drag gaze control. <h2><strong>Step 5: Editing Your 360 Footage</strong></h2> This is it: you’re finally ready to take your master 360 files and edit them into your trailer, or whatever video asset you’re creating. 360 files will behave just like 2D files in most editing software, so simply arrange the sequences as needed, color correct them, add transitions and title cards, and render out your final cut. If you recorded the game audio separately, this is the time to add it back in. <h3><em><strong>Quick Tips for Editing in 360:</strong></em><br />  </h3> <ul style="margin-left: 40px;"> <li>Because your footage will be at least 4K, you will likely need a beefy PC to handle the render.</li> <li>You can add 2D elements such as text and still image files, but they must be projected in 360/VR mode to avoid severe distortion when rendered in 360, as seen below. (Many editing suites have this function built in; otherwise, you should be able to find a plugin to handle your projections.)</li> <li>Not all graphics cards are equipped to render video effects in 360. 
Ensure you have a <a href="" target="_blank">supported graphics card</a> and update your drivers.</li> <li>As mentioned above, avoid overlaying text or graphics in layers directly over the 360 video, as they will remain static and become a severe distraction in the “corner” of your viewer’s virtual eye. If you want a logo permanently on-screen, for instance, add it to the game world and attach it to the player avatar instead.</li> </ul> <img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG26.jpg" height="auto" src="" width="auto" /> <h2><strong>Step 6: Audio</strong></h2> We knew that audio was going to be paramount to the success of this trailer, so we left it to the audio experts!<br /> <br /> The audio for FREEDIVER was created by <a href="" target="_blank">Interleave</a>, and was designed to be as realistic and immersive as possible. The instrumentation is meant to function organically with the ship&#39;s sounds, since the ship, a primary character in FREEDIVER, sings and speaks as it sinks. Instead of approaching the score with traditional brass or strings, the audio designer and composer settled on a musical direction where the ship became the actual source of tone and tension throughout the score. They even rubbed different sustained frequencies against each other based on the user&#39;s input as a way of enhancing the tension the user would feel underwater.<br /> <img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG26.png" height="auto" src="" width="auto" /><br /> Melodies were conjured by manipulating sounds like dry ice placed in large ventilation shafts, and string instrument bows were used on different densities of metal in software samplers. When designing the ship&#39;s large impacts and huge metallic whines, Interleave tuned them to work together with the goal of blending them with the music, making it difficult to distinguish one from the other. 
In-game, they also played with this idea of blending music and sound design further: for instance, the pause menu&#39;s swelling metal sounds play back in 3D, with randomized sample choices, and with volume and position moving around the listener&#39;s head.<br /> <br /> The teaser provided the additional challenge of hitting many emotional beats in a short time span. At the beginning, the viewer hears the uncertain whines of the ship, which crescendo into large haunting blasts as Ren Tanaka struggles to get to the air pocket in the hatch. As she submerges, the music shifts and builds into a more triumphant and courageous section. The percussion and bass kick off with a rising heartbeat, settling down to begin a shift to an increasingly panicked heartbeat aligned with Ren&#39;s fight to survive.  <div style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG29.PNG" height="auto" src="" width="auto" /></div> The teaser sound effects were edited to picture primarily from gameplay assets and then tweaked and sweetened. Interleave wanted a strong contrast between the air- and water-filled environments, so they used a tonal contrast as well as a perceptual one. Plugins were used to add credible space to the air environments and to keep the underwater environment intimate. They also smeared the position of underwater sounds to accentuate the difference in sound wave speed between the two mediums, and many of the sounds were positioned using 360° surround tools.<br /> <br /> In the end, Interleave supplied us with impeccable 5.1 surround sound audio, as well as 2-channel files in case we were uploading the final video somewhere without 5.1 support. The trailer’s <a href="" target="_blank">final form</a> is a haunting, powerful representation of the FREEDIVER experience, and the audio plays a huge role in telling the story of Ren Tanaka’s plight. 
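The separate 2-channel deliverable mentioned above is typically produced with a standard stereo downmix of the 5.1 stems. A minimal sketch of the common ITU-R BS.775-style mix, assuming a per-sample-frame helper (the function name and the exact handling of the LFE channel here are illustrative, not Interleave's actual process):

```python
# Downmix one 5.1 sample frame (FL, FR, C, LFE, SL, SR) to stereo.
# ITU-R BS.775-style coefficients: center and surrounds at ~-3 dB (0.707);
# the LFE channel is conventionally omitted from the downmix.

def downmix_to_stereo(fl, fr, c, lfe, sl, sr, k=0.707):
    left = fl + k * c + k * sl
    right = fr + k * c + k * sr
    return left, right

# Center-only content lands equally in both output channels:
print(downmix_to_stereo(0.0, 0.0, 1.0, 0.0, 0.0, 0.0))  # → (0.707, 0.707)
```

In practice a renderer would apply this per sample across the whole file and normalize afterward to avoid clipping, which is exactly why having the audio team deliver a purpose-built 2-channel mix beats letting a video host crunch the 5.1 file on upload.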
<h2 style="text-align: center;"><img alt="TECH-BLOG_ARCH_UE4BlogPost_IMG27.gif" height="auto" src="" width="auto" /></h2> <h2><strong>Step 7: Rendering & Upload</strong></h2> Once you’ve obtained the audio and laid it under your visuals, it’s time to kick off your final renders. Here are the exact settings we used for our two master renders: one for 5.1 surround, and one for 2-channel. <h3><br /> <strong>Video Settings</strong></h3> Format: H.264 with MP4 wrapper<br /> Width: 4096<br /> Height: 2048<br /> Frame Rate: 60<br /> Field Order: Progressive<br /> Aspect: Square Pixels<br /> TV Standard: NTSC<br /> Performance: Software Encoding<br /> Profile: Main<br /> Level: 5.2<br /> Bitrate Encoding: VBR, 2 Pass<br /> Target Bitrate [Mbps]: 30<br /> Maximum Bitrate [Mbps]: 35<br /> Video is VR: YES<br /> Frame Layout: Monoscopic<br /> Horizontal Field of View: 360<br /> Vertical Field of View: 180<br />   <h3><strong>Audio Settings</strong></h3> Audio Format: AAC<br /> Audio Codec: AAC<br /> Sample Rate: 48,000 Hz<br /> Channels: 5.1 (or 2-Ch)<br /> Quality: High<br /> Bitrate [kbps]: 320<br /> <br /> <em><strong>Remember:</strong></em> Many video hosts, including Steam, do not support 5.1 sound, and will crunch it down to a messy 2-channel format. To avoid unexpected results, take a page from our book and have a custom 2-channel audio file rendered out for this exact purpose.<br /> <br /> Now that you have your final renders (hooray!), it’s time to upload to the video host of your choosing. YouTube, Vimeo, and Facebook all support fully interactive 360 content, but uploads can take a very long time to process, so be sure to give yourself plenty of time before the big reveal.<br /> <br /> Once your file has processed, take a moment to double-check that the video plays in 360 and responds to interaction as expected, then share that awesome creation with the world!<br /> <br /> <br /> Thanks for reading! 
If you create your own 360 content using this guide, go ahead and share it with us via Twitter (tag us <a href="" target="_blank">@ArchiactVR</a> and <a href="" target="_blank">@UnrealEngine</a>) so we can see all your hard work.<br /> <br /> Tags: Games, Art, Community, Design, Learning, Mocap, Product Design, Tutorials, VR, Archiact VR, FREEDIVER: Triton Down | By Archiact Marketing Manager Renee Klint and Technical Art Director John Cruz | Thu, 06 Feb 2020 15:00:00 GMT<br /> <br /> Importing and working with external assets in Twinmotion: Missed our webinar on working with external assets in Twinmotion? Now you can watch it on demand! Learn the requirements and file formats for imported assets, and how to work with your own user library of custom-built assets and third-party content. We recently hosted the live webinar <strong>Importing and working with external assets in Twinmotion</strong>. If you missed it, no problem! The replay is available right here.<br /> <br /> In this webinar, Technical Evangelist Craig Barr dives into common workflows and provides tips for working with external assets in Twinmotion. He takes you through the requirements and file formats for assets coming into Twinmotion, and shows you how to work with your own user library of custom-built assets and third-party content.<br /> <br /> <strong>What you’ll learn:</strong><br />   <ul style="margin-left: 40px;"> <li>How to work with assets from providers such as AXYZ, Quixel, Turbosquid, and Xfrog</li> <li>Which file formats are supported for Twinmotion workflows</li> <li>How to properly import and create custom asset libraries</li> <li>Tips and tricks for working with models and materials</li> </ul> <br /> You can watch the full webinar below. Looking for more webinars? 
Check out the full series <a href="" target="_blank">here</a>.<br />   <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div>  Tags: Architecture, Community, Design, Learning, News, Visualization, Twinmotion, Webinar | Tue, 04 Feb 2020 16:00:00 GMT<br /> <br /> Free Marketplace content - February 2020: Set the mood, unleash a horde, and so much more with this month’s free Unreal Engine content, now available on the Unreal Engine Marketplace. In an ongoing partnership with Unreal Engine Marketplace creators, select content is available for free to the Unreal community each month, giving artists, designers, and programmers access to additional resources at no extra cost.<br /> <br /> Check out this month’s great selection below! <h2><strong>February’s featured free content:</strong></h2> <h2><a href="" target="_blank">Amplify LUT Pack</a> | <a href="" target="_blank">Amplify Creations</a></h2> <div style="text-align: center;"><img alt="News_UESC_FEB2020_Blog1.jpg" height="auto" src="" width="auto" /><br /> <em>Color grade and set the mood of your scene in just a few short clicks with over 200 high-quality LUTs.</em></div> <h2><a href="" target="_blank">Auto Settings</a> | <a href="" target="_blank">Sam Bonifacio</a> </h2> <div style="text-align: center;"><img alt="News_UESC_FEB2020_Blog2.jpg" height="auto" src="" width="auto" /><br /> <em>Configure in-game settings and button inputs with ease. 
Turn any console command into a setting, quickly adjust your settings panels, and more!</em></div> <h2><a href="" target="_blank">Combat Systems - Constructor</a> | <a href="" target="_blank">Vladimir Tim</a></h2> <div style="text-align: center;"><img alt="News_UESC_FEB2020_Blog3.jpg" height="auto" src="" width="auto" /><br /> <em>Prepare for combat—generate and control more than 200 mobile and desktop-ready variants of mechanical robots.</em></div> <h2><a href="" target="_blank">First Person Puzzle Template</a> | <a href="" target="_blank">Divivor</a></h2> <div style="text-align: center;"><img alt="News_UESC_FEB2020_Blog4.jpg" height="auto" src="" width="auto" /><br /> <em>Challenge your players with easy-to-implement puzzle mechanics—designed entirely in Blueprints.</em></div> <h2><a href="" target="_blank">Open World AI Spawn System</a> | <a href="" target="_blank">TiMer Games</a> </h2> <div style="text-align: center;"><img alt="News_UESC_FEB2020_Blog5_v2.png" height="auto" src="" width="auto" /><br /> <em>A performance-friendly spawning system for allies and enemies alike, including example AIs.</em></div> <h2><strong>New permanently free content:</strong></h2> <h2><a href="" target="_blank">Advanced Locomotion System V4</a> | <a href="" target="_blank">LongmireLocomotion</a></h2> <div style="text-align: center;"><img alt="News_UESC_FEB2020_Blog6.jpg" height="auto" src="" width="auto" /><br /> <em>Get up and get going with this flexible advanced bipedal locomotion Blueprint system for single-player projects.</em></div> <br /> Jumpstart a new project or ramp up development on one you’ve already started. Don’t forget to check back in March for yet another round of free products!<br /> <br /> <br /> Are you a Marketplace creator interested in sharing your content for free with the community? 
Visit <a href="" target="_blank"></a> to learn how you could be featured!<br />  Tags: Community, Events, News, Games, Marketplace | Amanda Schade | Tue, 04 Feb 2020 15:00:00 GMT<br /> <br /> Winners and nominees dazzle audiences at the VES Awards 2020: The Unreal Engine community continued to raise the bar in 2019 and was recognized for its contributions to VFX over the past year at the 18th annual Visual Effects Society (VES) Awards. It was a magical night at the 18th annual VES Awards, as artists and industry luminaries from around the world came together to celebrate the craft of visual effects across film, television, commercials, animation, video games, and beyond.<br /> <br /> The Unreal Engine community continued to raise the bar for VFX over this past year, with nominees realizing a range of new and imaginative stories through mesmerizing blockbuster features, intimate VR shorts, stunning episodic series, and immersive IRL attractions.<br /> <br /> This year’s Unreal Engine-powered VES Award winners include ILM’s team behind “The Mandalorian,” which utilized Unreal Engine for capturing in-camera visual effects, and teams from “Game of Thrones,” whose incredible work was informed by detailed previz created by The Third Floor. See a full list of the winners <a href="" target="_blank">here</a>.<br /> <br /> Additionally, Unreal Engine-powered VES Award 2020 nominees included the breathtaking VR short “Myth: A Frozen Tale” from Disney Animation, the immersive escapades of ILMxLAB’s “Vader Immortal,” and more stunning work.<br /> <img alt="body_image_1.jpg" height="auto" src="" width="auto" /><br /> Each of these spectacular projects shows that the lines between real and digital are blurring, not just for audiences, but also on set. With real-time tools, the traditional linear production pipeline is shifting into a more fluid and collaborative process from start to finish. 
Virtual production allows all departments to get a better sense of the final shots much earlier in the creative process, empowering teams to align on creative decisions faster, utilizing methods such as virtual location scouting or replacing green screens with photoreal LED walls. You can get a foundational understanding of virtual production and the ways it can be implemented in our <a href="" target="_blank">Virtual Production Field Guide</a>, a wide-ranging industry resource.<br /> <br /> As more and more filmmakers and artists adopt Unreal Engine to achieve final-pixel quality earlier in the creative process, we are thrilled to partner with industry leaders such as ILM, The Third Floor, Halon, Framestore, Disney Animation, and many other studios and artists to power their ambitious virtual production pipelines.<br /> <br /> Congratulations to all the nominees and winners!<br /> <br />  Tags: VES, Games, Community | Brian Sharon | Mon, 03 Feb 2020 21:30:00 GMT<br /> <br /> Polter Pals uses sharp level design to craft a hauntingly memorable puzzle game: Studio Split Hare Games is hard at work on Polter Pals, a puzzle-platformer that finds the fun in deadly supernatural possessions. It’s not always easy making friends, and it’s even harder when you’re in an unfamiliar place like a new city, school, or workplace. Now, imagine how much harder this must be when you’re invisible to the people around you.<br /> <br /> <a href="" target="_blank">Polter Pals</a> is a puzzle-platformer from <a href="" target="_blank">Split Hare Games</a>, a one-person studio run by Nicholas Decker, who’s based in Louisiana. The <a href="" target="_blank">Unreal Dev Grant recipient</a> tells the story of a lonely ghost who simply wants people to join him in the afterlife so they can become his new friends. He speeds this process up by possessing his neighbors, controlling and tricking them to their untimely deaths. 
While being set ablaze by a house fire sounds like a horrific tragedy to most, to Polter Pals’ protagonist, it simply means he gets a new ghostly friend to hang out with!<br /> <br /> Satirical in tone and featuring an art style that evokes preschooler television, Polter Pals offers 60-plus levels packed with brain-tickling puzzles and morbid comedy. Fresh off an appearance at <a href="" target="_blank">PAX South</a>, where Polter Pals was on display alongside other UE titles such as <a href="" target="_blank">Dungeon Defenders: Awakened</a>, <a href="" target="_blank">Everspace 2</a>, <a href="" target="_blank">Mythic Ocean</a>, and <a href="" target="_blank">Suicide of Rachel Foster</a>, we chatted with Decker about the game’s influences, goals, and how he’s using Unreal Engine to infuse life into this ghostly story.  <br />   <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <br /> <strong>When and how did you first get into game design, and how did the idea for Polter Pals come together?</strong><br /> <br /> <strong>Nicholas Decker: </strong>I&#39;ve been interested in making games since I first realized it was a job that you could have, but my experiences in college took me down a different career path. Around two and a half years ago, I came to a crossroads where I could keep doing the kind of work I had been doing or I could finally listen to that voice in the back of my head telling me how much I wanted to make games. I started reading about and taking courses on things like 3D modeling and development using UE4. Eventually, I started participating in game jams, which was my first actual development experience.<br /> <br /> Polter Pals started life as the second jam game I worked on, which was then known as Ghost Town. 
Based on the jam&#39;s prompt, a friend and I came up with an idea for a game where you possess people and walk them into dangerous environmental hazards. As I worked on it, the idea expanded further into the concept that you&#39;re a lonely ghost trying to make friends, and each person you [get to join you in the afterlife] becomes a ghost that friends you on social media. A lot of really fun ideas just naturally fell out of the concept and made it a blast to work on. When the game won first place in the jam, I knew I had to keep working on it and fully flesh it out.<br /> <br /> The jam version was pretty unfocused and didn&#39;t have much of a narrative. Continuing development into the full version, I had a ton of disparate ideas that didn&#39;t feel like they could fit into a cohesive game, and I struggled to come up with a story to tell. Eventually, after turning all these ideas around in my head for months, the dots just kind of connected themselves and I knew exactly where the game was going. The story itself just came out of a need to connect all of these fun ideas I had for things, places, and gameplay mechanics. If you look at any individual elements of the game (especially some of the things I haven&#39;t shown off yet), you&#39;d probably think none of it really makes sense in a game about a lonely ghost, but I think and hope that it will make sense to people once they play the full game.<br /> <br /> <img alt="polterpals_dieway.png" height="auto" src="" width="auto" /><br /> <strong>What are Polter Pals’ most significant influences, gaming or otherwise?</strong><br /> <br /> <strong>Decker: </strong>Polter Pals&#39; sense of humor is probably most influenced by the comedies I grew up with like Monty Python and South Park; not by being edgy, but by creating a world that is slightly off-kilter and unleashing chaos in it. Similarly, its thematic influences game-wise are from classics like Toe Jam & Earl and Theme Hospital. 
I always loved how TJ&E created its own reality that [resembled ours] but when viewed through an outside lens seemed wrong or backwards. Polter Pals does something similar by viewing the world of the living through the lens of the afterlife. From the perspective of a ghost, that rake wasn&#39;t designed for lawn care, but was designed to bring people to the never-ending party of the afterlife more quickly!<br /> <br /> Aesthetically, the game takes a lot of influence from stop-motion and claymation. It&#39;s trying to land in the sweet spot between the bright and cheerful style of the original Bob the Builder and the richly detailed, twisted visuals of movies like Nightmare Before Christmas and Coraline. The music, which is being composed by the wonderful and mysterious <a href="" target="_blank">Breakmaster Cylinder</a>, is inspired by a wide range of things from MF Doom beats to cheery old Hanna-Barbera tunes and industrial tracks from classic games.<br /> <img alt="feature_image.png" height="auto" src="" width="auto" /><br /> <strong>The game utilizes dark satire in a unique way. It’s a game about death, but the art style is cute. It also explores friendship, loneliness, and belonging. How are you going about blending darkness and comedy in a way that does justice to both?</strong><br /> <br /> <strong>Decker: </strong>That is definitely one of the hardest aspects of it creatively, right next to telling the narrative through a series of unrelated social media posts. I worried a lot initially about it being too dark conceptually or sending the wrong message. I&#39;ve tried to limit most of that darkness to the action that you see on screen, which is balanced by the bright and colorful aesthetic. 
The writing is skewed more toward goofiness and irreverence to create a world that&#39;s funny but also where those heavy topics don&#39;t feel too dark.<br /> <br /> <strong>What was your familiarity with Unreal Engine like going into the creation of Polter Pals?</strong><br /> <br /> <strong>Decker: </strong>Prior to Polter Pals, I had taken a UE4 course on Udemy and made a single, short game for a jam. Starting work on Ghost Town is when everything I learned really came together and my skills started to improve (and they&#39;re still improving every day). Part of me wishes I could start fresh on Polter Pals knowing what I know now, because a lot of the foundational systems in the game were designed when I had no clue what I was doing. Luckily for me, UE4 is super flexible and allows me a million ways to script around my rookie mistakes.<br /> <img alt="polterpals_househarming.png" height="auto" src="" width="auto" /><br /> <br /> <strong>A game like Polter Pals requires intuitive interactions between the characters and the environment. How is Unreal Engine assisting in creating this?</strong><br /> <br /> <strong>Decker: </strong><a href="" target="_blank">Blueprints</a> are probably the most helpful part of UE4 to me. I have some experience in <a href="" target="_blank">C++</a>, but I made the decision to script the game entirely with Blueprints. It&#39;s given me a lot of flexibility to prototype things quickly and make changes without breaking the game. They feel like they&#39;re at just the right level to provide huge amounts of depth and possibility for game design without needing to dive too far into programming. 
There are also countless other incredibly useful tools from simple things like Actor Tags, which I&#39;ve used to denote what&#39;s flammable for a system that dynamically spreads fire, to complex systems like <a href="" target="_blank">AI and Perception</a>, which have allowed me to relatively easily create a cheap way for the humans in Polter Pals to know what to be scared of.<br /> <img alt="polterpals_burnindown.png" height="auto" src="" width="auto" /><br /> <strong>How are you approaching puzzle design in Polter Pals? What different level mechanics are you implementing, and were there any helpful Unreal Engine tools here? </strong><br /> <br /> <strong>Decker: </strong>My approach has been to create as much variety as possible and try not to reuse mechanics too much. Each area introduces new mechanics at a basic level and then expands on them for a couple of levels before moving on to something else. That amount of variety is definitely where Blueprints comes in most handy. I can prototype new mechanics and test how they interact with everything else in minutes. I&#39;ve also tried to reduce the amount of handholding and tutorials in the game, so mechanics are always introduced at a really simple level to keep things easier to understand. Interactions are really simple, too. The only way you can interact with the world is by jumping on a human&#39;s head and pressing their noses against everything in the environment. That way, you always know the first step to figuring things out is to poke at the environment until hell breaks loose.<br /> <br /> In terms of specific mechanics, I mentioned there&#39;s a fire mechanic that is based in Blueprints but also uses <a href="" target="_blank">Materials</a> and the <a href="" target="_blank">Cascade</a> particle system to generate impressive wildfires. 
There are some more malicious actors in the world like bosses, and adorable little puppies that leverage UE4&#39;s <a href="" target="_blank">Blackboard</a> and <a href="" target="_blank">Behavior Tree</a> tools to hunt down the living for you. The construction-themed area of the game makes heavy use of UE4&#39;s destructible mesh tools and <a href="" target="_blank">physics</a> engine for some satisfyingly destructive puzzles.<br /> <br /> <strong>Do you have any favorite Unreal Engine tools?</strong><br /> <br /> <strong>Decker: </strong>I love using the <a href="" target="_blank">Material Editor</a>. It&#39;s probably my favorite part of the engine, and I use it for a lot of really fun visual effects. I also use <a href="" target="_blank">Sequencer</a> in UE4 to create videos that I cut into GIFs using Photoshop, because what game about social media would be complete without memes?<br /> <br /> <strong>What strategies are you using to keep the challenges fresh and new as the player progresses?</strong><br /> <br /> <strong>Decker: </strong>I try to make every level feel unique. Aside from introducing new mechanics often, levels using already-introduced mechanics generally use them in different ways. What was a hazard last level might now be a tool. That way, instead of making you master the mechanics, the game forces you to think about other ways you can use them.<br /> <img alt="polterpals_liveburial.png" height="auto" src="" width="auto" /><br /> <strong>On top of making the game, you&#39;ve also been posting some Unreal Engine tutorials online. What was the inspiration behind making these?</strong><br /> <br /> <strong>Decker: </strong>When I started making tutorials, there weren&#39;t a lot of beginner-level VFX videos on YouTube, especially not for <a href="" target="_blank">Niagara</a> (which had recently released at the time). 
One of the best tools I&#39;ve had for learning UE4 has been tutorials on YouTube, and since there wasn&#39;t much out there, I had to do a little hands-on, trial-and-error learning with Niagara. <br /> <br /> I figured it would be nice, since I was already figuring it out for myself, to wrap some of those learnings up in introductory-level videos for other folks who were just getting started on VFX in UE4.<br /> <br /> The other inspiration was a <a href="" target="_blank">series of tutorials</a> by <a href="" target="_blank">Matthew Palaje</a>. He actually ran the jam where Polter Pals was born, and his videos on Blueprints were some of the more helpful tutorials I&#39;ve watched. He did a series where he looked at mechanics in popular games and replicated them in Blueprints. I thought it would be neat to do something similar for VFX, since a lot of effects can look like wizardry when you&#39;re staring at the finished product.<br /> <br /> <strong>Aside from the music, you’re doing everything yourself. What are the pros and cons of being a one-man dev team?</strong><br /> <br /> <strong>Decker: </strong>It&#39;s a double-edged sword, for sure! On one hand, I get to have complete creative control and learn every part of the development process. On the other hand, it&#39;s a truly massive amount of work. Doing everything means you never get to master any one skill, and you often spend so long working on one thing that your skills atrophy in other areas or you forget parts of your pipeline. It also makes motivation a bit of a struggle, because there&#39;s no one else depending on you and you don&#39;t have anyone else to feed off of. It definitely has its challenges, but I feel it will pay off in the long run.<br /> <br /> <strong>What do you hope players take away from their time with Polter Pals?</strong><br /> <br /> <strong>Decker: </strong>I hope it makes people laugh! 
I&#39;d like to think it might make people think about their relationship with social media, but I also don&#39;t pretend that I or the game has any answers about how much people should or shouldn&#39;t interact with social media.<br /> <br /> <strong>On that note, here’s how to follow Polter Pals online and via social media:</strong> <ul style="margin-left: 40px;"> <li><a href="" target="_blank">Twitch</a></li> <li><a href="" target="_blank">Twitter</a></li> <li><a href="" target="_blank">YouTube</a></li> <li><a href="" target="_blank">Website</a></li> </ul> <em>Blueprints | Community | Games | Polter Pals | Split Hare Games | Michael Luis | Thu, 30 Jan 2020 07:30:00 GMT</em> <h1>Choosing a performance capture system for real-time mocap</h1> Real-time performance capture can transfer an actor’s entire performance to a digital character on the fly. Get an understanding of how these systems work, and which one is right for you, in our new white paper “Choosing a real-time performance capture system.”<br /> <br /> Performance capture gives filmmakers, game developers, and other creatives the opportunity to bring an actor’s full range of skills to a digital character. While motion capture has been around for a long time, historically it was used only to capture the broader motions of the body. Now these systems are sensitive enough to capture subtle details in the face and fingers, effectively capturing the actor’s entire performance. <br /> <br /> In the meantime, real-time technology has continued to evolve, making it possible to capture and target such performances on the fly.
Using performance capture in this way has made all kinds of media possible, from live-capture stage performances to interactive experiences.<br /> <img alt="Tech_PerformanceCapture_blog_body_img1.jpg" height="auto" src="" width="auto" /><br /> In the new white paper <a href="" target="_blank">Choosing a real-time performance capture system</a>, we explore this emerging field from suits and gloves to cameras, and examine how different systems are best suited to different types of production.<br /> <br /> Real-time performance capture systems aren’t inexpensive, and the time to process the data you get from them is a key factor in the overall cost of their use. To keep your costs and processing times as low as possible, you need to understand how each type of system works so you can choose the right one for your projects. <br /> <img alt="Tech_PerformanceCapture_blog_imgK_L.jpg" height="auto" src="" width="auto" /><br /> In this paper, we go over the functions and usages of various performance capture systems in detail, and we explore two real-time projects: the <a href="" target="_blank">Siren Real-Time Performance video</a> released at GDC 2018, and SIGGRAPH 2018 Real-Time Live! 
Winner: <a href="" target="_blank">Democratizing MoCap: Real-Time Full-Performance Motion Capture with an iPhone X, Xsens, IKINEMA, and Unreal Engine</a>.<br /> <br /> Download the white paper <a href="" target="_blank">Choosing a real-time performance capture system</a> and get started with the exciting world of real-time performance capture today.<br /> <br /> <em>Film & Television | Community | Mocap | Virtual Production | Broadcast & Live Events | Games | Wed, 29 Jan 2020 21:30:00 GMT</em> <h1>An early look at next-generation real-time hair and fur</h1> An experimental peek at how Unreal Engine 4.24 handles hair and fur.<br /> <br /> Each release of Unreal Engine in recent memory has included improvements to advance the <a href="" target="_blank">creation of the most photorealistic</a> real-time <a href="" target="_blank">digital humans</a> possible. Over the years, Epic has been committed to providing industry-leading tools and capabilities for studios of all sizes to craft beautiful and believable characters. In the <a href="" target="_blank">4.20</a> release, we focused on skin and eyes through lighting and subsurface scattering enhancements. With <a href="" target="_blank">4.22</a> and <a href="" target="_blank">4.23</a>, we took a quantum leap forward with the introduction of real-time ray tracing with accurate reflections and soft shadows. This effort continues in <a href="" target="_blank">4.24</a> with a new feature we are proud to launch: real-time hair rendering and dynamics support in Unreal Engine.<br /> <br /> While our <a href="" target="_blank">Siren</a> showcase was a breakthrough when it debuted in 2018, her virtual hair leveraged a traditional card-based technique, relying on textured meshes to give the approximate shape of a much larger number of individual hairs. This approach was the best option at the time, despite its tradeoffs in fine-strand detail, accurate shading, and physically based simulation.
NVIDIA <a href="" target="_blank">HairWorks</a> helped move the conversation forward, but its official support ended with <a href="" target="_blank">Unreal Engine 4.16</a>. The time has come for first-class real-time hair and fur implemented natively in Unreal Engine.<br /> <br /> With our new strand-based workflow, each individual strand of hair is rendered with accurate motion, resulting in dramatically improved visual quality. Unreal Engine 4.24 introduces a full hair shader and rendering system with multiple scattering from light sources, ray-traced shadows, and dynamics integration through the <a href="" target="_blank">Niagara</a> visual effects editor.<br /> <img alt="DeveloperInterview_-Mechwarrior_5_002.jpg" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>This hair groom was rendered with our multi-scattering approach to generate a more realistic look. </em></div> <h2><strong>Import Pipeline</strong></h2> Unreal Engine 4.24 focuses on hair rendering and simulation with easy integration into an existing pipeline using the <a href="" target="_blank">Alembic</a> file format. We provide a naming-convention-based <a href="" target="_blank">schema</a>, which facilitates the import of static grooms from DCC applications like <a href="" target="_blank">Ornatrix</a>, <a href="" target="_blank">Yeti</a>, and XGen into Unreal Engine, and offers a straightforward pathway to tie in with proprietary studio hair tools.<br /> <br /> Our schema allows the transfer of attributes such as “width” and “color” into Unreal Engine, along with “guide” attributes that mark the strands used to drive the simulation of interpolated hairs.
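To make the idea concrete, here is a rough sketch of how per-strand attributes from a naming-convention schema of this kind could be consumed on import. The data layout and key names (`groom_width`, `groom_color`, `groom_guide`) are illustrative only, not the actual schema; a real pipeline would read these arrays through an Alembic library rather than plain dicts.

```python
from dataclasses import dataclass

@dataclass
class Strand:
    points: list                        # (x, y, z) control vertices
    width: float = 0.01                 # per-strand width attribute
    color: tuple = (0.1, 0.05, 0.02)    # per-strand base color
    is_guide: bool = False              # guides drive interpolated hairs

def load_groom(raw_curves):
    """Map hypothetical naming-convention attributes onto strands.

    `raw_curves` is a list of dicts standing in for Alembic curve
    primitives; missing attributes fall back to defaults."""
    strands = [
        Strand(
            points=c["points"],
            width=c.get("groom_width", 0.01),
            color=c.get("groom_color", (0.1, 0.05, 0.02)),
            is_guide=bool(c.get("groom_guide", 0)),
        )
        for c in raw_curves
    ]
    # Separate guide strands (for simulation) from rendered strands.
    guides = [s for s in strands if s.is_guide]
    rendered = [s for s in strands if not s.is_guide]
    return guides, rendered

raw = [
    {"points": [(0.0, 0.0, 0.0), (0.0, 1.0, 0.0)], "groom_guide": 1},
    {"points": [(0.1, 0.0, 0.0), (0.1, 1.0, 0.0)], "groom_width": 0.02},
]
guides, rendered = load_groom(raw)
print(len(guides), len(rendered))  # → 1 1
```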
Multiple hair groups within a single Alembic file are supported via “group_id.” We partnered with the talented teams behind beloved hair grooming applications <a href="" target="_blank">Ornatrix</a> and <a href="" target="_blank">Yeti</a> to provide built-in support; each can natively export to our Alembic schema for import into Unreal Engine 4.24. Guidelines for using XGen with our Alembic schema can be found in the <a href="" target="_blank">documentation</a>. <br /> <br /> Once imported, the hair and fur can be attached to a <a href="" target="_blank">Skeletal Mesh</a>, binding each hair root to the closest triangle on the mesh. This enables morph and skinning deformation, which are critical to constraining hair and fur to skin animation.<br /> <img alt="Tech_Blog_Ostrich.jpg" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>Ostrich groom created in Yeti by Character Artist <a href="" target="_blank">Yuriy Dulich</a> of <a href="" target="_blank">Biotic Factory</a> and rendered in Unreal Engine.</em></div> <h2><strong>Shading & Rendering </strong></h2> Rendering hair strands poses three distinct challenges: aliasing, light interacting with a single hair fiber, and light bouncing between fibers.<br /> <br /> A human head has hundreds of thousands of hair strands, each with an average diameter of about 100 micrometers. Generating an image with hair means resolving multiple visible strands within each pixel.
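A quick back-of-envelope calculation shows why this is a subpixel problem. The framing numbers below are assumptions for illustration — a head roughly 20 cm wide covering about 1,000 pixels of the frame — with only the 100-micrometer strand diameter taken from the text:

```python
head_width_m = 0.20          # ~20 cm head width (assumed framing)
head_width_px = 1000         # pixels the head spans on screen (assumed)
strand_diameter_m = 100e-6   # ~100 micrometer average strand diameter

meters_per_pixel = head_width_m / head_width_px          # 0.0002 m
strand_width_px = strand_diameter_m / meters_per_pixel   # 0.5 px

print(f"one pixel spans {meters_per_pixel * 1e6:.0f} micrometers")
print(f"a strand covers about {strand_width_px:.2f} of a pixel's width")
# Each strand is thinner than a pixel, so several strands can overlap
# inside any single pixel -- naive rasterization aliases badly.
```

Under these assumptions a strand is only about half a pixel wide, which is why dedicated anti-aliasing and transmittance handling are needed rather than ordinary triangle rasterization.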
For rendering with our workflow, the imported hair strands are converted into individual polylines and rendered as thin triangle strips facing the camera.<br /> <br /> Rendering hundreds of thousands of strands also makes light scattering far more complex, which we address with <a href="" target="_blank">our own physically based hair shading model</a>. This provides a more realistic appearance for individual hair strands. <br /> <br /> To accurately evaluate the light transmitted through each hair strand, we use a mixture of <a href="" target="_blank">Deep Opacity Maps</a> and a runtime voxelization. Once we know how much light has traveled through the volume of hair, local light scattering is evaluated using the <a href="" target="_blank">dual scattering approximation</a> as an estimate of multiple scattering. This enables Unreal Engine to more faithfully render hair, especially light blond hair, which has traditionally been more challenging. <br /> <img alt="DeveloperInterview_-Mechwarrior_5_001.jpg" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>This groom was generated in XGen and imported to Unreal Engine. It contains approximately 50k curves. </em></div> <h2><strong>Simulation </strong></h2> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> Hair simulation is implemented as part of Niagara and is available on the GPU. Once a Physics Asset is created, the simulation solver handles body collisions. Niagara handles self-collisions and computes them based on an average velocity field. Adjustable parameters such as bend, stretch, and thickness can be accessed without opening the Niagara editor. <br /> <h2><strong>Performance</strong></h2> High-density grooms can easily contain hundreds of thousands or even millions of strands.
Each strand can contain dozens of Control Vertices (CVs). Together, these two factors affect performance for import, rendering, and simulation. For reference, to achieve real-time performance on a high-end PC, we’ve been generating grooms with an average of 50k strands for long hair with a relatively high CV count, and an average of 200k strands for short hair with a comparatively low CV count. <br /> <h2><strong>Conclusion</strong></h2> We couldn’t be more excited about our curve-based implementation in Unreal Engine 4.24, as it represents a huge milestone for real-time hair and fur. But this is only the beginning. Going forward, our mission is to continue improving the system’s quality and performance.<br /> <br /> For shading and rendering, we plan to enhance our hair BSDF with additional parameterization to produce a more accurate lighting response from hair fibers. While our multiple scattering approach conveys the appearance of light-colored hair in a real-time environment better than ever, we will push the engine to further refine hair quality. <br /> <br /> In addition, collision response upgrades, better elastic Materials, and finer artistic controls for the simulation system will be developed. Optimizations to support larger and denser grooms will be implemented in future releases, and to ensure scalability on all platforms, we are also considering card generation for lower-end devices. <br /> <br /> Above all else, we are committed to building a hair-and-fur system that helps our clients ship amazing projects, and we would love to hear your feedback on how we can best serve your needs.
This experimental feature is available for you to try now. To get started, <a href="" target="_blank">download Unreal Engine</a> today and check out our <a href="" target="_blank">Hair Rendering and Simulation</a> documentation.<br /> <br /> <em>Art | Design | Features | News | Product Design | Charles de Rousiers, Gaelle Morand, and Michael Forot | Mon, 27 Jan 2020 16:00:00 GMT</em> <h1>Updates to mobile rendering support in Unreal Engine 4.24 and 4.25</h1> An update from the Epic Mobile team on the upcoming removal of ES2 and Metal 1.2 in Unreal Engine 4.25, and some tips for continuing to use them in 4.24.<br /> <br /> Greetings from the Unreal Engine Mobile team! Today we’d like to share some key updates regarding the feature levels supported by our mobile renderer, including recent changes in <a href="" target="_blank">4.24</a>, a heads-up about our plans for 4.25, and some insights into our overall direction for improving mobile support in future versions of the engine.<br /> <h2><strong>Our Goals for Mobile Platform Support</strong></h2> Historically, limitations and variance in mobile hardware have made it necessary for us to maintain alternate rendering pipelines between desktop, Android, and iOS, with certain rendering features only exposed to specific feature levels. However, our long-term goal is to establish consistency between each of these platforms in terms of fidelity, performance, and available features.<br /> <br /> To that end, we are working to develop functionality for mobile devices that is equivalent to desktop rendering, and to do so much sooner in each release cycle than we have in the past.
For example, we have made <a href="" target="_blank">mesh auto-instancing</a> available for mobile devices, and soon we will also bring both <a href="" target="_blank">virtual texturing</a> and the desktop forward renderer to mobile.<br />   <h2><strong>New Default Feature Levels</strong></h2> In support of this initiative, we have taken advantage of mobile hardware improvements within the last several years to gradually improve our base feature levels for mobile devices, bringing each of them up to similar standards.<br /> <br /> Prior to <a href="" target="_blank">Unreal Engine 4.23</a>, OpenGL ES2 was Unreal Engine&#39;s default feature level for Android projects. However, with the release of 4.23, the new default for Android became ES3.1. <strong>With Unreal Engine 4.25, we will remove ES2 completely.</strong><br /> <br /> Similarly, because Unreal Engine 4.25 will use iOS 11 as the minimum version of iOS, <strong>we are also going to update the minimum version of Metal to 2.0 and remove Metal 1.2.</strong><br /> <br /> Because the capabilities of ES3.1 and Metal 2.0 are very similar, these changes will simplify the process of parallel feature development between iOS and Android in the future, which in turn will make it easier for us to progress our technology for mobile games in future releases. This change will also benefit artists making assets for mobile games in Unreal Engine by making their available tools and features more consistent.<br />   <h2><strong>Using ES2 and Metal 1.2 in Unreal Engine 4.24</strong></h2> This comes with caveats for users upgrading to Unreal Engine 4.24. 
While ES2 and Metal 1.2 are still available in this version of the engine, we have already begun to deprecate them, and users whose projects still rely on these feature levels will need to take special action to keep using them.<br /> <br /> If you are upgrading to 4.24 from a version of Unreal Engine where ES2 was the default, and you never changed your feature levels in your Project Settings, ES3.1 will automatically take over as the new default setting for your project. <strong>If you need to continue using ES2, you will need to open your Project Settings and manually re-enable it. </strong>The setting can be found in the <strong>Platforms</strong> > <strong>Android</strong> section, under the <strong>Build</strong> category.<br /> <img alt="NEWS_ES2-blog_bodyimg.png" height="auto" src="" width="auto" /><br /> Similarly, if your project uses Metal 1.2 by default, then when you upgrade to 4.24, your feature level will automatically be changed to Metal 2.0 as the new default.<strong> If you want to continue using Metal 1.2, you will need to manually override it. </strong>Unlike with ES2, it is no longer possible to enable Metal 1.2 in the Project Settings UI. Instead, you need to make the following change to the appropriate *<strong>Engine.ini</strong> for your project:<br /> <br />    <code>[/Script/IOSRuntimeSettings.IOSRuntimeSettings]</code><br />     <code>MaxShaderLanguageVersion=2</code><br /> <br /> With these settings, you should be able to migrate to 4.24 without losing support for your legacy devices. However, keep in mind that this option will no longer be available in 4.25 and beyond.<br /> <em>Features | Games | Mobile | the Unreal Engine Mobile team | Thu, 23 Jan 2020 14:30:00 GMT</em>