Unreal Engine - News, Developer Interviews, Spotlights, Tech Blogs
Feed containing the latest news, developer interviews, events, spotlights, and tech blogs related to Unreal. Unreal Engine 4 is a professional suite of tools and technologies used for building high-quality games and applications across a range of platforms. Unreal Engine 4’s rendering architecture enables developers to achieve stunning visuals and scale elegantly to lower-end systems. (en-US)

Connect with the Unreal Engine community online
Although many physical events around the world are on hold, there are plenty of places to connect with the Unreal Engine community online. From forums to webinars, livestreams, and full-on virtual events, our community of creators stays continually active.<br /> <br /> Below is a listing of permanent resources and online activities that we’d love to invite you to. Please check this post often, as it will be updated regularly with newly added events. <div style="text-align: center;"><img alt="UE_Community_Online_Feed-thumb-desktop.png" height="auto" src="" width="auto" /></div> <h2><strong>PERMANENT, FREE RESOURCES</strong></h2> <a href="" target="_blank"><span style="color:#3498db;"><strong>Support and Documentation</strong></span></a><br /> From your first steps with Unreal Engine to completing your most ambitious real-time project, we’re here to help. 
With comprehensive reference documentation, instructional guides, community-based support, and options for dedicated professional support, you have full access to everything you need to succeed.<br /> <br /> <a href="" target="_blank"><span style="color:#3498db;"><strong>Unreal Online Learning</strong></span></a><br /> This growing catalog of nearly 50 courses and guided learning paths tracks your progression and awards your achievements, whether you’re spending your first hours in tools such as Sequencer or brushing up on your visualization skills.<br /> <br /> <a href="" target="_blank"><span style="color:#3498db;"><strong>Unreal Engine on YouTube</strong></span></a><br /> Here's where you'll find archives of Inside Unreal, live training, and other broadcasts from our <a href="" target="_blank">Twitch</a> channel; tech talks from GDC, Unreal Fest Europe, and other conferences; and so much more.<br /> <br /> <a href="" target="_blank"><span style="color:#3498db;"><strong>Webinar Series</strong></span></a><br /> Check out our free webinars to learn all about the latest Twinmotion features and workflows and how to use Unreal Engine to create photorealistic scenes and interactive designs.<br />   <hr /> <h2><strong>UNREAL EVENTS</strong></h2> <strong>March 26, 2020 | 9am ET & 2pm ET - <a href="" target="_blank"><span style="color:#3498db;"><strong>WEBINAR: What’s New in Twinmotion 2020.1</strong></span></a></strong><br /> We recently hosted the live webinar What’s New in Twinmotion 2020.1. The replay is now available. In this webinar, Martin Krasemann, Twinmotion Technical Marketing Specialist at Epic Games, presents a deep dive into some of the new Twinmotion 2020.1 features.<br /> <br /> Find out how the release brings improved fidelity and higher-quality lighting, more realistic vegetation and humans, new features to review and present projects, and more. <a href="" target="_blank">Watch Now</a>. 
<br /> <br /> <br /> <strong>March 26, 2020 | 2pm ET - <a href="" target="_blank"><span style="color:#3498db;"><strong>INSIDE UNREAL: Blender to Unreal Tools, Part 3</strong></span></a></strong><br /> It's time for the third part in our "Blender to Unreal" series!<br /> <br /> In part 2, we covered how to work with Rigify and the Unreal Mannequin. We're leaving Manny behind for this adventure, and will demonstrate how to import custom animations, characters, and skeletons.<br /> <br /> <br /> <strong>April 2, 2020 | 2pm ET - <a href="" target="_blank"><span style="color:#3498db;"><strong>INSIDE UNREAL: State of Audio in 4.25</strong></span></a></strong><br /> You may have heard that we have several exciting new audio features coming in 4.25, and this week we thought we’d take a look at them!<br /> <br /> To kick things off, the audio team will give a quick update on the Audio Mixer. From there we will shift to discussions and demos of the Native Convolution Reverb, Native Ambisonics Decoding and Encoding Support, as well as the new non-real-time audio analysis plugin Synesthesia. And finally, Wyeth Johnson will cap off the show with a demo on visualizing audio with the Niagara Audio Data Interface.<br /> <br /> <br /> <strong>April 2, 2020 | 9am ET & 2pm ET - <a href="" target="_blank"><span style="color:#3498db;"><strong>WEBINAR: Unreal Engine and Quixel: pushing the boundaries of 3D</strong></span></a></strong><br /> Traveling the planet with a mission to build the world’s largest library of scans, Quixel has sought to vastly simplify production of all digital experiences. Now, since joining forces with Epic Games, this mission is accelerating.  <br /> <br /> Teddy Bergsman and Galen Davis will join Daryl Obert to demonstrate how Quixel Megascans, Bridge, and Mixer 2020—along with the power of Unreal Engine—are pushing the boundaries of what’s now possible with 3D. 
<a href="" target="_blank">Register Now</a>.<br /> <br /> <br /> <strong>April 9, 2020 | 2pm ET - <a href="" target="_blank">INSIDE UNREAL - Iterative Design for Comfort</a></strong><br /> This week we'll focus on building functionality that helps designers understand what's going on in the game levels they're building. Little features, like simple debug indicators in the level editor, or easily-accessed Blueprint functions, can greatly increase the speed and comfort of work while also reducing user error and frustration. We'll also discuss how to iterate on code and design, to avoid impacting users when underlying code changes take place.<br />   <hr /> <h2><strong>ADDITIONAL EVENTS</strong></h2> Members of our team often perform presentations, sit in on panel discussions, or otherwise share their insight while participating in a variety of online events. Here are some events where Epic plans to have a presence.<br /> <br /> <strong><a href="" target="_blank"><span style="color:#3498db;"><strong>NVIDIA GTC Digital</strong></span></a></strong><br /> GTC Digital is a great training, research, insights, and direct access to the brilliant minds of NVIDIA’s GPU Technology Conference, now online. Epic Games' Film & TV industry manager, David Morin, presents: <a href="" target="_blank">Creating In-Camera VFX with Real-Time Workflows</a>.<br /> <br /> <strong><a href="" target="_blank"><span style="color:#3498db;"><strong>Pocket Gamer Connects Digital</strong></span></a></strong><br /> A new virtual conference for the games industry. Unreal Engine is a Platinum sponsor with two solo presentations and two panel seats on April 6, 2020. 
Alan Noon, Evangelist Lead, will present <em>Enabling Creators: The Unreal Ecosystem</em> at both 10am BST and 1pm ET.<br /> <br /> Evangelist Paulo Souza will participate on the <em>Tools for Making Your Game More Awesome</em> panel at 3pm ET, while Evangelist Arran Langmead will participate on the <em>Tips and Tricks for Making Production Workflow Easier</em> panel at 12pm BST.<br /> <br /> <a href="" target="_blank"><strong>RealTime Conference 2020</strong></a><br /> Created by industry veterans from DreamWorks, FMX, and Weta Digital, RTC 2020 is a new cross-industry event devoted to the impact real-time is having on art, technology, business, and society. Epic Games is a founding partner of this now fully virtual event with two keynote presentations and four panel seats April 6-7, 2020.<br /> <br /> David Morin, Industry Manager for Film & TV, Unreal Engine, will deliver a keynote on Virtual Production as well as participate in a Virtual Production panel discussion on Monday, April 6.  Vladimir Mastilovic, CEO of 3Lateral, will deliver a Digital Humans keynote and will also participate in a Digital Humans panel on Tuesday, April 7. Kim Libreri, CTO of Epic Games, will participate in two panel discussions on <em>Bringing Real-Time Pipelines to Movie Making</em> on Tuesday, April 7. 
<br /> <br /> --<br /> <br /> More online events will be added as they are confirmed, so please check back often!<br />
Documentation | Events | Unreal Online Learning | Webinar | Livestream | Mon, 30 Mar 2020 18:00:00 GMT

Featured free Marketplace content - April 2020
Crown your worlds with breathtaking mountains, fill your landscapes with sprawling vegetation, and rest by soothing ocean waves—all with this month’s free content!
In an ongoing partnership with Unreal Engine Marketplace creators, select content is available for free to the Unreal community each month, giving artists, designers, and programmers access to additional resources at no extra cost.<br /> <br /> Check out this month’s great selection below!<br /> &nbsp; <h2><strong>April’s featured free content:</strong></h2> <h2><a href="" target="_blank">Fog Gradients</a> | <a href="" target="_blank">David Grouse</a></h2> <div style="text-align:center"><img alt="News_UESC_APR2020_Blog1.jpg" src="" style="height:auto; width:auto" /></div> <div style="text-align:center"><em>Add the finishing touches to your environments with this post-process effect, built with enough flexibility and customization to match your artistic needs.</em><br /> &nbsp;</div> <h2><a href="" target="_blank">Landscapes Pack</a> | <a href="" target="_blank">DigitalTris</a></h2> <div style="text-align:center"><em><img alt="News_UESC_APR2020_Blog2.jpg" src="" style="height:auto; width:auto" /><br /> Guide your audience on a journey through snowcapped mountains, peaceful grasslands, and arid deserts with these high-quality landscapes.</em><br /> &nbsp;</div> <h2><a href="" target="_blank">Nature Package</a> | <a href="" target="_blank">SilverTm</a></h2> <div style="text-align:center"><em><img alt="News_UESC_APR2020_Blog3.jpg" src="" style="height:auto; width:auto" /><br /> Quickly and easily adorn your environments with lush vegetation, including trees, flowers, shrubs, and more!</em><br /> &nbsp;</div> <h2><a href="" 
target="_blank">Scanned Poplar and Aspen Forest with Seasons</a> | <a href="" target="_blank">Tirido</a></h2> <div style="text-align:center"><img alt="News_UESC_APR2020_Blog4.jpg" src="" style="height:auto; width:auto" /><br /> <em>Grow photorealistic plantlife with this collection of photoscanned trees, ferns, shrubs, and more—with all four seasons!</em><br /> &nbsp;</div> <h2><a href="" target="_blank">Tropical Ocean Tool</a> | <a href="" target="_blank">James Stone</a></h2> <div style="text-align:center"><em><img alt="News_UESC_APR2020_Blog5.jpg" src="" style="height:auto; width:auto" /><br /> Save time to relax by the waves with this easy-to-use water shader with crashing waves and underwater effects.</em></div> &nbsp; <h2><strong>Permanently free content:</strong></h2> <h2><a href="" target="_blank">Broadcast Studio</a> | <a href="" target="_blank">Warren Marshall</a></h2> <div style="text-align:center"><em><img alt="News_UESC_APR2020_Blog6.jpg" src="" style="height:auto; width:auto" /><br /> Get ready to go on-air with a set of props and materials perfect for building your own customized broadcast studio.</em></div> <br /> Don’t miss out on these environment-rich assets before month’s end, but do check back in May for a new batch of free content.<br /> &nbsp; <hr />Are you a Marketplace creator interested in sharing your content for free with the community? Visit <a href="" target="_blank"></a> to learn how you could be featured!<br /> &nbsp;CommunityEventsNewsGamesMarketplaceAmanda SchadeTue, 07 Apr 2020 14:00:00 GMTTue, 07 Apr 2020 14:00:00 GMT evaluates vehicle ergonomics utilizing VR and Unreal Engine technology enables Toyota to validate the ergonomics of automotive designs faster and at lower cost in a virtual environment. Ergonomics involves the application of biotechnology and engineering principles to develop products that are more user-friendly for consumers. 
In an automotive context, that means assessing everything from the position of the passenger seat to the location of the drinks holder. This is what some describe as "human factors engineering." <br /> <br /> Fifty years ago, the process would have relied on physical prototypes and mannequins. The late 1960s saw the measurements of hundreds of people taken and stored in huge databases, which were used to define the typical shape and size of human bodies. From these databases, physical models of people were built and used to work out things like the steering wheel position and the reachability of pedals.<br /> <br /> In recent years, virtual ergonomics technology has transformed that traditional process. Designers and engineers can now simulate human interaction with a vehicle far more realistically by testing the reactions of real people in a virtual environment. <br /> <br /> One innovative team at <a href="" target="_blank">Toyota</a> is channeling the power of Unreal Engine and VR to validate designs even faster and at far lower cost, capitalizing on the open nature of the engine to connect industry-leading software and technology. <br /> <img alt="Spotlight_Toyota_Blog_Body_Image_4.jpg" height="auto" src="" width="auto" /><br /> The resulting workflow not only improves the ergonomics of Toyota’s cars today, it opens up opportunities to test out proofs of concept for the autonomous vehicles of the future.  <h3><strong>Real-time tools for human factors engineering</strong></h3> Typically, it takes three years for a vehicle to go from the early stages of design to the dealership floor. Ergonomic validation takes place at the beginning of that process in the first year to test out concepts and ideas between the design and engineering stages.<br /> <br /> Mikiya Matsumoto is the general manager of the Prototype Division, Digital Engineering Department at Toyota. 
While it’s now common for automotive companies to harness real-time technology on the showroom floor for <a href="" target="_blank">interactive configurators</a>, his team has been leveraging Unreal Engine far earlier in the automotive life cycle to assess the user friendliness of vehicle designs and identify areas for improvement.<br /> <img alt="Spotlight_Toyota_Blog_Body_Image_5.jpg" height="auto" src="" width="auto" /><br /> The process starts with the import of a 3D mockup into a virtual environment built in Unreal Engine. A person wearing a VR headset sits in a real car seat and experiences a series of simulated scenarios to test out the design and usability of the vehicle. <br /> <img alt="Spotlight_Toyota_Blog_Body_Image_6.jpg" height="auto" src="" width="auto" /><br /> One such validation scenario developed by the Toyota team involved testing the visibility of other road users out of the rear quarter window of a new-generation car. “We prepared several pedestrians and bikers in a virtual city environment,” says Matsumoto. “The evaluator could see the simulated pedestrians and bikers passing near to the vehicle through the rear quarter window from the driver seat position via a VR headset. The test enabled us to improve visibility and we were able to complete it very quickly at a low cost compared to conventional methods.”<br /> <img alt="Spotlight_Toyota_Blog_Body_Image_1.jpg" height="auto" src="" width="auto" /><br /> The team also leverages the setup to perform accessibility checks, using tracking gloves to evaluate how easy it is to reach various buttons and controls. 
It uses a <a href="" target="_blank">HTC Vive headset</a>, <a href="" target="_blank">CarSim</a> for vehicle dynamics, <a href="" target="_blank">Leap Motion controllers</a> for hand tracking, and a combination of different physical prototype parts and VR simulation, depending on the evaluation performed.<br /> <br /> Because Unreal Engine is an open platform, Matsumoto’s team doesn’t have to jump through hoops to work with these industry-leading third-party tools. Many software providers are connecting their tools and systems to the Unreal Engine platform via plugins—like the <a href="" target="_blank">Mechanical Simulation CarSim plugin</a> the team uses.<br /> <br /> Car model data is imported into Unreal Engine via <a href="" target="_blank">Datasmith</a>. Datasmith is a collection of tools and plugins for bringing content into the engine, many of which interoperate with the CAD and product lifecycle management (PLM) systems used in the automotive industry. Datasmith enables the team at Toyota to go straight from CAD to Unreal Engine in a couple of clicks without using any third-party software in between.<br /> <br /> With their ability to create complex scenarios that include virtual vehicles and human characters, game engine toolsets are the best way to perform real-time ergonomics testing in a virtual environment. The Toyota team leverages the <a href="" target="_blank">Blueprint visual scripting system</a> in Unreal Engine to create these virtual scenarios for each test. <br /> <img alt="Spotlight_Toyota_Blog_Body_Image_2.jpg" height="auto" src="" width="auto" /><br /> Blueprint is a powerful visual coding toolset that puts programming features into the hands of non-programmers. It gives teams the flexibility to build custom functionality for very specific requirements, such as the unique testing simulations necessary for Toyota’s ergonomics validation. 
<br /> <br /> Other ergonomics tools that are used in the automotive industry generally do not offer VR functionality. Users must validate ergonomic tasks from a third-person perspective, which lacks the immersive realism of VR. In addition to being expensive, these systems tend to be closed, which makes them harder to use in combination with third-party software. <h3><strong>Real-time ergonomics testing for future vehicles</strong></h3> Many automotive ergonomics studies today are still carried out on real physical mockups of vehicles. These are costly and time-consuming to build as well as inefficient—the design has often changed by the time the physical mockup is available.<br /> <br /> The workflow developed by Matsumoto’s team saves time and money compared to traditional methods of ergonomic assessment, and provides a more flexible development path.<br /> <br /> “Real-time technology allows us to perform virtual user experience testing,” explains Matsumoto. "This reduces the cost of and time taken for proof-of-concepting, leading to a more agile way of development.”<br /> <br /> What’s more, Matsumoto believes real-time technology will be pivotal in proving the concepts of the vehicles of the future. “Future vehicles might not have a traditional steering wheel—they might have something totally different to control the direction of movement. By using real-time technology and VR, we can evaluate any type of human-machine interface (HMI) and user experience.”<br />   <hr />Want to harness the power of real-time technology for ergonomic evaluation? 
<a href="" target="_blank">Download Unreal Engine</a> today for free!<br />  Automotive & TransportationBlueprintsDatasmithDesignVRToyotaMon, 06 Apr 2020 15:30:00 GMTMon, 06 Apr 2020 15:30:00 GMT Unreal Engine and Quixel: pushing the boundaries of 3D how Quixel Megascans, Bridge, and Mixer 2020—along with Unreal Engine—are pushing the boundaries of what’s now possible with 3D visualization.We recently hosted the live webinar <strong>Unreal Engine and Quixel: pushing the boundaries of 3D</strong>. If you missed it, no problem! The replay is available right here.<br /> <img alt="Webinar_Quixel_Unreal_Blog_Body_Image_2.jpg" height="auto" src="" width="auto" /><br /> In this webinar, Teddy Bergsman and Galen Davis join Daryl Obert to demonstrate how Quixel Megascans, Bridge, and Mixer 2020—along with the power of Unreal Engine—are pushing the boundaries of what’s now possible with 3D. You’ll learn about: <br />   <ul style="margin-left: 40px;"> <li>The different industries using the Megascans ecosystem to create powerful visual experiences</li> <li>How Quixel Megascans, Bridge, and Mixer work together with Unreal Engine to create a smooth and entertaining creative experience for artists</li> <li>What’s to come for Unreal Engine and Quixel later in 2020</li> </ul> <br /> You can watch the full webinar below. Looking for more webinars? Check out the full series <a href="" target="_blank">here</a>. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <br /> <em>*Header image courtesy of B.O.W Qin.</em>ArchitectureAutomotive & TransportationBroadcast & Live EventsFilm & TelevisionGamesTraining & SimulationLearningWebinarQuixelMegascansMon, 06 Apr 2020 14:30:00 GMTMon, 06 Apr 2020 14:30:00 GMT digital worlds with Megascans and Unreal Engine the Megascans ecosystem and how it integrates with Unreal Engine to help you create photo-realistic worlds. 
All for free.<h3><strong>Free access to the Megascans Ecosystem</strong></h3> Since <a href="" target="_blank">joining forces with Epic Games</a> last year, the Megascans Ecosystem has received major updates for both Bridge and Mixer, and seen a wide variety of new assets added to the Megascans library.<br /> <img alt="Megascans-Ecosystem-Intro_body-img-01.png" height="auto" src="" width="auto" /><br /> This ecosystem is tightly integrated with Unreal Engine, from the robust integration plugin to the LODs that come with each asset, and, of course, the fact that the entire ecosystem is free to use with the engine.<br /> <br /> For free access to Bridge, Mixer, and every asset in the Megascans library, just click the “Sign In” button on the <a href="" target="_blank">Quixel homepage</a>, log in with your Epic Games credentials, and you’re good to go. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3><strong>Quixel Bridge 2020</strong></h3> With the 2020 update, Bridge has been revamped from the ground up to give you an unparalleled asset-browsing, downloading, and exporting experience. Over a hundred new collections have been added to the library, along with hundreds of thousands of new tags for every asset, making it easier than ever to find what you’re looking for.<br /> <img alt="Megascans-Ecosystem-Intro_body-img-02.png" height="auto" src="" width="auto" /><br /> <img alt="Megascans-Ecosystem-Intro_body-img-03.png" height="auto" src="" width="auto" /><br /> Once you’ve found the perfect asset, just hit the download button, then send it straight to Unreal Engine with the “Export” button, which sets up the proper material, textures, and LODs automatically. 
<div style="text-align: center;"><img alt="Megascans-Ecosystem-Intro_gif-03.gif" height="auto" src="" width="auto" /></div> Quixel Bridge is now completely free for everyone, and as an Unreal Engine user, every single asset you see in the Megascans library is available for free. <a href="" target="_blank">Download Bridge today</a>. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3><strong>Quixel Mixer 2020</strong></h3> Quixel Mixer is an all-in-one texturing solution that combines 3D texturing, tillable material creation, and asset style transfer, all powered with Mixer’s hybrid scan-based/procedural texturing workflow. Mixer is also completely free to use—no strings attached. <div style="text-align: center;"><img alt="Megascans-Ecosystem-Intro_gif-04.gif" height="auto" src="" width="auto" /></div> <div style="text-align: center;"><img alt="Megascans-Ecosystem-Intro_gif-02.gif" height="auto" src="" width="auto" /></div> With 12,000 assets in the Megascans library, every asset can be fully customized to fit your own style, whether it’s transferring the style of one biome to another, or creating stylized assets quickly.<br /> <br /> Mixer 2020 allows you to push the boundaries of what you can do with scan data, and this update is just the first step in a long-term effort to give you powerful texturing tools that make every asset dynamic.<br /> <img alt="Megascans-Ecosystem-Intro_gif-05.gif" height="auto" src="" width="auto" /><br /> Thanks to the Smart Material system, such materials can then be saved for later use, and applied to as many assets as you want, all with a single click. Additionally, Mixer comes with a vast array of Smart Materials, and a plethora of new ones are already in the works for the next update. 
<div style="text-align: center;"><img alt="Megascans-Ecosystem-Intro_gif-01.gif" height="auto" src="" width="auto" /></div> Mixer is rich with features and this is just the beginning. It’s completely free, with dozens of Smart Materials in the works to help you texture your assets faster than ever.<br /> <br /> Quixel Mixer is available for everyone, for free. <a href="" target="_blank">Download Mixer now</a>.  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> Want to learn more? Tune in to our live webinar <a href="" target="_blank"><strong>Unreal Engine and Quixel: pushing the boundaries of 3D</strong></a> on Thursday, April 2 at 9 AM and 2 PM EDT. In this webinar, Quixel’s Teddy Bergsman and Galen Davis join Epic’s Daryl Obert to demonstrate how Quixel Megascans, Bridge, and Mixer 2020—along with the power of Unreal Engine—are pushing the boundaries of what’s now possible with 3D. <strong>UPDATE:</strong> You can now watch the full webinar below.<br />   <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> GamesArtDesignFeaturesQuixelTeam Quixel Wed, 01 Apr 2020 19:00:00 GMTWed, 01 Apr 2020 19:00:00 GMT Games explains how to design a modern vehicular combat game Game Director Neil Barnden explains how they designed an arena-style shooter starring cars.While <a href="" target="_blank"><em>ShockRods</em></a> possesses many of the qualities you'd expect to find in popular vehicular combat games, it actually plays more like an arena shooter with cars. Developed by the studio that made the beloved <em>Carmageddon</em> series, <em>ShockRods</em> separates itself from its predecessors by implementing first-person shooter-style WASD controls coupled with strafing capabilities. 
To gain insight into how <a href="" target="_blank">Stainless Games</a> designed the unique shooter, we interviewed Game Director Neil Barnden.<br /> <br /> The studio co-founder talks about how games like <em>Quake III Arena</em> and <em>Unreal Tournament</em>, along with <em>Rocket League</em>, influenced the project and elaborates on how the team used a game-jam approach with its development. Early on, the studio decided to inject a high level of mobility into the vehicles; not only could they drive side-to-side, but they could also double jump. This added depth to the mechanics and empowered the team to create more inventive modes and environments. Rounding out our interview, Barnden shares how leaning into <em>ShockRods’</em> community allowed them to refine their game. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong><em>ShockRods</em> plays a lot like an arena shooter. What made the team inject cars into the equation?</strong><br /> <br /> <strong>Game Director Neil Barnden:</strong> We like arena shooters, and we also like driving games! It's as simple as that, really. We've been around for over 25 years, but have never made a shooter, and there was a strong desire to get into the genre as there are many avid shooter fans among the team. At the same time, Stainless has a long history of making games that feature cars and a lot of real car enthusiasts also work at the studio. So, the project actually began as an exploration of how we could take the "<em>Carmageddon</em>" brand in a new direction, moving it away from its previous destruction derby/death-race gameplay. The game, therefore, started out with the core concept of exploring how we could bring cars and guns together in a way that would allow players to control both simultaneously while allowing us to develop a new style of car-to-car combat. 
<br />  <br /> <strong>Were there any particular games that inspired <em>ShockRods</em>?</strong><br /> <br /> <strong>Barnden: </strong>We've pitched the game as "<em>Quake</em> meets <em>Rocket League</em>," which is a handy distillation of the concept. <em>Quake III Arena</em> is specifically the game that we used to spend a lot of time playing during out-of-work hours at the office. We would also like to give a shout-out to <em>Unreal Tournament</em>. <br /> <img alt="DeveloperInterview_ShockRods_05.jpg" height="auto" src="" width="auto" /><br /> <strong>With a history developing the <em>Carmageddon</em> games, was there anything you learned from developing that beloved series that you built on for <em>ShockRods</em>? </strong><br /> <br /> <strong>Barnden:</strong> It's interesting because the further we got into the project, the less there was that tied it back to the <em>Carmageddon</em> games. Where <em>Carmageddon</em> was all about complex destruction code for vehicles and meticulous attention to the physics of car movement, <em>ShockRods</em> had to be built for speed, both in terms of movement and code calculations, so that the game's performance could match its twitch-shooter pace. Earlier in development, we also experimented with the idea of having human characters in the game; the drivers could be ejected from the cars and run around. But this interfered with the core gameplay too much, and was dropped, severing another connection with the <em>Carmageddon</em> series.<br /> <br /> The only lingering similarity in gameplay terms between <em>ShockRods</em> and <em>Carmageddon</em> is the concept of pickups that have fun effects when you use them, such as the balloon and ice-cube guns. 
And this is probably more of a Stainless signature than a <em>Carmageddon</em> connection, because we love games that make you shout and laugh when you play them!<br />  <br /> <strong>Not many vehicular combat games support WASD controls that allow cars to strafe side-to-side. Can you walk us through how the team came to this design?</strong><br /> <br /> <strong>Barnden: </strong>That goes back to the important point in our first answer: How do you make a game that brings cars and guns together, and lets players control both at the same time in a satisfying way? This is something that games have struggled with, and either they don't solve it well, or they have to compromise how steering and wheels work. Fortunately, we discovered these things called "omni-directional wheels." They exist in the real world, and allow a vehicle to move in any direction. So that solves the problem of conventional cars not being able to strafe, because with omni-directional wheels, our cars can strafe. <br /> <br /> We then added the jump/boost function, which is not such a fanciful concept either. We just took jump-jet technology and adapted it to our futuristic arena. <br /> <img alt="DeveloperInterview_ShockRods_17.jpg" height="auto" src="" width="auto" /><br /> <strong><em>ShockRods</em> features a wide assortment of inventive weapons. How did you design them?</strong><br /> <br /> <strong>Barnden:</strong> <em>ShockRods</em>’ design has been refreshingly like a game jam. We’ve had a small, passionate team on the project for a couple of years, and they constantly try to see if stuff works. All of the core gameplay came about this way. Some of the game’s weapons came about pretty much instantly, while others got iterated on and refined over time. We experimented a lot. <br />  <br /> <strong>Cars in <em>ShockRods</em> can double jump, which adds a platforming element to the game. 
Why was this a good inclusion?</strong><br /> <br /> <strong>Barnden:</strong> A conventional car is tied to the plane it's driving on; unless you drive it off a cliff, its movement is two dimensional. This makes a car's direction of travel highly predictable, and it's an easy target. It also limits the options for level design, particularly when thinking in terms of an arena shooter. Adding omni-directional wheels improves the manoeuvrability of the car and makes it a less predictable target, but it's still moving along a 2D plane.<br /> <br /> As soon as you add the ability to jump, the car is far less of an easy target to predictably fire at. And now the driver is able to travel from one place to another by using the third dimension. This opens up the opportunities for levels that have a far greater 3D design than would otherwise be possible without a network of ramps and jumps. And it's fun to jump, use the "Aftertouch" control to elegantly spin your car as you fly through the air, and railgun your opponent while you're upside-down! <br /> <img alt="DeveloperInterview_ShockRods_26.jpg" height="auto" src="" width="auto" /><br /> <strong>What can you tell us about the way Stainless Games designed the maps in <em>ShockRods</em>? </strong><br /> <br /> <strong>Barnden: </strong>The maps were developed alongside the evolution of our cars’ movement system and in conjunction with the development and testing of our game modes. Once again, this was an iterative process and involved playing simple grey-box versions of levels prior to building full levels. Some maps were completed but eventually removed or substantially reworked as the game evolved. 
Plus, later in the project's development there was the requirement for the game to run on mobile devices, which also impacted map design (although, fortunately, this was less of an issue than it might have been due to the constant emphasis on keeping the maps as economical and efficient as possible to maximize the game's performance).<br />  <br /> <strong><em>ShockRods</em> offers fun modes such as Golden Ram, which forces players to rack up points by taking control of a single battering ram, and Shockball, which wears its <em>Rocket League</em> inspirations on its sleeve, but adds guns. How did the studio design and implement these fun multiplayer modes?</strong><br /> <br /> <strong>Barnden:</strong> One of the great strengths of using Unreal Engine is that its dev environment allows very rapid prototyping and testing. So, we can try out all the positive game mode ideas that we talk about on Slack. Again, with a small team of passionate gamers on the project dedicated to making the game as good as it can be, there's total enthusiasm for trying stuff out. We analyze whether or not something is worth pursuing, and then refine it until everyone has a blast playing it. If it's not fun, it doesn't go in the game.<br /> <img alt="DeveloperInterview_ShockRods_46.jpg" height="auto" src="" width="auto" /><br /> <strong>The studio has stated that it's relied heavily on the community to design <em>ShockRods</em>. In what ways did the community influence the game's development? </strong><br /> <br /> <strong>Barnden:</strong> The community involvement has been great, thanks to our <a href="" target="_blank">Discord channel</a> and a band of merry players who helped us out with pre-release testing, gave essential feedback on issues, and came up with suggestions to improve the game. Lots of additional tweaking came from these tests with first-time users. 
Some specific changes were implemented directly because of community feedback, such as when pickups slightly vacuum towards you when you drive close to them (which means you don't have to drive right over them to pick them up).<br />  <br /> <strong>Stainless Games has said that the game was made to be easy to learn, but hard to master. How has the studio tried to deliver on that philosophy with <em>ShockRods</em>’ design?</strong><br /> <br /> <strong>Barnden: </strong>The game uses a core set of controls that are very easy to pick up. The player can drive around, shoot and dodge [without] being shot really quickly. They don't have to jump at all, or double-jump, or boost in order to play the game and have fun. But as they get familiar with the game, they'll start to appreciate how much their fun increases as they start to jump around, switch to more appropriate weapons, use pickups in innovative ways, or use Aftertouch to maneuver more skilfully.  <br /> <br /> This was evident when the game was showcased at video game expos like Rezzed. Visitors of all ages who had never played the game could sit down and get straight into the action with minimal or no guidance, and were having fun within seconds of starting.<br /> <img alt="DeveloperInterview_ShockRods_55.jpg" height="auto" src="" width="auto" /><br /> <strong>Why was Unreal Engine a good fit for <em>ShockRods</em>?</strong><br /> <br /> <strong>Barnden: </strong>Unreal Engine has been the perfect engine for <em>ShockRods</em> because of the speed with which ideas can be prototyped and tried out. It's also optimized to allow us to achieve really good performance during network play. And because it's been developed for all platforms, we were also able to create the <a href="" target="_blank">Apple Arcade version</a> of the game within a really tight timeframe. 
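The pickup "vacuum" behavior the community suggested (pickups drifting toward a car that comes near, so you don't have to drive right over them) can be sketched as a simple per-tick attraction. This is a hypothetical illustration, not Stainless Games' actual code; the function name, radius, and pull speed are invented for the example:

```python
import math

def vacuum_pickup(pickup_pos, car_pos, attract_radius=300.0,
                  pull_speed=600.0, dt=1.0 / 60.0):
    """Nudge a pickup toward a nearby car each tick (illustrative sketch)."""
    dx = car_pos[0] - pickup_pos[0]
    dy = car_pos[1] - pickup_pos[1]
    dist = math.hypot(dx, dy)
    if 0.0 < dist < attract_radius:
        step = min(pull_speed * dt, dist)  # clamp so the pickup never overshoots
        return (pickup_pos[0] + dx / dist * step,
                pickup_pos[1] + dy / dist * step)
    return pickup_pos  # out of range: the pickup stays put

print(vacuum_pickup((100.0, 0.0), (0.0, 0.0)))  # -> (90.0, 0.0)
```

Run every frame, this gives the gentle "magnet" feel described above while leaving distant pickups untouched.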
<br />  <br /> <strong>Considering the game is available across various platforms, was it challenging designing the game around so many vastly different hardware systems?</strong><br /> <br /> <strong>Barnden: </strong>As mentioned, making the game run well on so many significantly different devices, and within a tight schedule, would have been impossible if we didn’t use Unreal. <br /> <br /> But there was an added challenge to make the game work on devices that used touch controls, since we originally conceived it for devices with controllers. However, thanks again to Unreal enabling rapid implementation/test cycles, the team was able to come up with a very intuitive control scheme for mobile/touch devices.<br /> <img alt="DeveloperInterview_ShockRods_34.jpg" height="auto" src="" width="auto" /><br /> <strong>Did the team have any favorite Unreal Engine tools?</strong><br /> <br /> <strong>Barnden:</strong> Each member of the team is happy with their own toolset within the editor. The artists really like <a href="" target="_blank">Material Editor</a>. In the past, a coder would have to make a shader based on an artist describing what they wanted. Now, artists can do it themselves, and in real time. Our designers and coders are also big fans of <a href="" target="_blank">Blueprints</a>.<br />  <br /> <strong>Are there any tips you can offer game designers?</strong><br /> <br /> <strong>Barnden: </strong>Try stuff out! If you're still having fun doing a thing after doing it over and over and over again during development, other people will probably have fun doing it, too!<br />  <br /> <strong>Thanks for your time. 
Where can people learn more about <em>ShockRods</em>?</strong><br /> <br /> <strong>Barnden:</strong> Go to <a href="" target="_blank"></a> to find out more about the game.<br /> <br /> Jimmy Thang | Wed, 01 Apr 2020 15:30:00 GMT <hr /> <h2><strong>Transferable real-time skills at the University of Hertfordshire</strong></h2> A world-class games program gives students the opportunity to develop skills in interactive 3D that can be used in diverse industries.<br /> <br /> The <a href="" target="_blank">University of Hertfordshire</a> is one of the top game design schools in the world, with a highly successful games program. It was ranked <a href=",-according-to-latest-rookies-rankings" target="_blank">number one in the world </a>for “Game Design & Development Schools - Production Excellence 2019” in the Rookies World School Rankings 2019; fielded both <a href="" target="_blank">the winner</a> and <a href="" target="_blank">the runner up</a> of the “Game of the Year” category in the annual Rookie Awards; and won <a href="" target="_blank">Best Education Institution</a> at the TIGA Games Industry Awards. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> One of the driving forces behind this success is the opportunity the program provides to go beyond games and explore other applications of real-time technology. <div><br /> The university's staff specializes in teaching Unreal Engine, exposing students to an expansive range of real-time skills to prepare them for the diversity of <a href="" target="_blank">careers in interactive 3D</a>. “Cross-disciplinarity gives students the opportunity to go beyond the thing they thought they wanted to study,” says David Tree, Lecturer of Games and Animation Technology. 
“And that means that they can explore areas that they wouldn't have thought of.”<br /> <img alt="UniversityofHertfordshireBlog_Body_Image_2.jpg" height="auto" src="" width="auto" /></div> <h3><strong>Developing transferable skills in interactive 3D</strong></h3> The university’s digital animation program focuses on encouraging students to find their creativity. It develops their art skills and also bolts on the technical knowledge they’ll need for real-world production.<br /> <br /> Students who join the course complete a common first year regardless of the degree they're on. This enables them to try out new things they may never have thought to look into. “They might say ‘I came on as a 3D modeler, but I’ve discovered I love working with real-time shaders and I love building levels in Unreal,’” says Martin Bowman, Joint Program Leader of Digital Animation. “So at the end of that year, we just switch them from whichever degree they started from to the one they never realized is where their true talent lies.”<br /> <img alt="UniversityofHertfordshireBlog_Body_Image_4.jpg" height="auto" src="" width="auto" /><br /> By learning Unreal Engine, students are equipped with a set of transferable skills—ones that have traditionally been associated with game development and which are in high demand across an incredible range of industries, including virtual production, animation, automotive, aerospace, medical visualization, and AR/VR. “Our students no longer look at the future ahead of them and think ‘I must work in the entertainment industry,’” says Bowman. “Obviously many will. But a lot of them realize there’s this vast area of other stuff that requires real-time visualization, and most courses aren't really catering to that.”<br /> <img alt="UniversityofHertfordshireBlog_Body_Image_11.jpg" height="auto" src="" width="auto" /><br /> This ethos has had real, tangible effects. 
“Over the summer I ended up getting a job doing cinematics in Unreal Engine,” says student Justin Mills. “I was using techniques I picked up in the first year studying things like cinematography and camera angles—things you don’t normally associate with games. So the overall skill sets that you learn here are great.”<br /> <br /> With a range of transferable real-time skills under their belts, students who graduate from the program have a high success rate for entering the competitive job market. “We've got students all around the world now,” says Neil Gallagher, Senior Lecturer of BA and MA Games Art. “We've got students at Nintendo, at Ubisoft. We've got students at Bentley and visualizing at McLaren.”<br /> <img alt="UniversityofHertfordshireBlog_Body_Image_5.jpg" height="auto" src="" width="auto" /><br /> When they get to companies like these, the fact the students have learned real-time workflows using Unreal Engine comes as a huge advantage. “Unreal Engine is the tool they're using at these really high-end companies—our students are using the same tool for their everyday projects here.”  <h3><strong>Switching engines to increase student employability </strong></h3> While Unreal Engine has proven a huge success for teaching real-time skills and giving students the best chances in the job market, the university did not always use it. “Many years ago, we used to use another game engine. It was a good game engine, but it had a lot of limitations,” says Bowman. “It didn't have great uptake in the industry, and we realized that for our students to maximize their employment chances, we needed to move to a different one.”<br /> <img alt="UniversityofHertfordshireBlog_Body_Image_7.jpg" height="auto" src="" width="auto" /><br /> After assessing a number of options, the teaching staff settled on Unreal Engine. With millions of people already using it worldwide, they realized there was a wealth of support to help their students learn the technology. 
“That worldwide community is massively helpful in engaging students and making them realize that there's jobs for them—not just locally, but jobs across the world,” says Bowman.<br /> <br /> The ability of Unreal Engine to serve myriad real-time uses beyond games is one of the things that makes it so powerful, Bowman believes. “Where it excels is the fact that it can be customized, it can be made into pretty much anything you want,” he says. “Companies like <a href="" target="_blank">Framestore</a> and <a href="" target="_blank">Blue Zoo</a> are using Unreal and hiring our students, because they can work with the director on shots and get instant feedback from them.”<br /> <br /> What’s more, with some systems designed specifically for those of a non-technical outlook, it becomes possible for anyone to learn the engine. “It's accessible to people who maybe signed up to an arts course and didn't realize they were going to have to learn some programming,” says Tree. “The first Blueprints lesson comes and they say ‘But I'm not a programmer! I’m terrified of this.’ And then you explain that you just plug the wires in and you can experiment, and if you get it wrong, it just tells you where you got it wrong.”<br /> <img alt="UniversityofHertfordshireBlog_Body_Image_8.jpg" height="auto" src="" width="auto" /><br /> The <a href="" target="_blank">Blueprint visual scripting system</a> in Unreal Engine puts tools normally reserved for programmers into the hands of non-technical users via an intuitive visual node-based interface. <br /> <br /> For those interested in learning C++, the open nature of Unreal Engine provides the opportunity to dive into the code: “If we need to, we can modify or alter the engine in such a way that we can make it do what we want to and then use it to train technical artists,” says Tree.  
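The “plug the wires in” idea Tree describes can be illustrated with a toy node graph: the output of one node is wired into the inputs of another, and evaluating the final node pulls values through the wires. This is purely a teaching sketch; Unreal Engine's Blueprint system is far richer and is not implemented this way:

```python
# Toy node graph in the spirit of visual scripting (hypothetical sketch,
# not Unreal's Blueprint implementation).
class Node:
    def __init__(self, name, func, *inputs):
        self.name = name      # label the user would see on the node
        self.func = func      # what the node computes
        self.inputs = inputs  # wires from other nodes

    def evaluate(self):
        # Pull values through the wires, then compute this node's output.
        args = [node.evaluate() for node in self.inputs]
        return self.func(*args)

# Wire three nodes together: two constants feeding an "Add" node.
a = Node("A", lambda: 2)
b = Node("B", lambda: 3)
add = Node("Add", lambda x, y: x + y, a, b)
print(add.evaluate())  # prints 5
```

Re-wiring `add` to different source nodes changes the result without touching any "code," which is the essence of the experimentation Tree mentions.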
<h3><strong>The future is interactive 3D</strong></h3> The lecturers on the games program at the University of Hertfordshire believe real-time skills are going to be increasingly important in the job market in the years to come. “We're in a very fast-changing industry—it's constantly evolving, constantly developing,” says Bowman. “One thing we can guarantee is that the future is going to become more interactive, it’s going to be 3D, and it will be made in Unreal.”<br /> <img alt="UniversityofHertfordshireBlog_Body_Image_3.jpg" height="auto" src="" width="auto" /> <hr />Want to start teaching Unreal Engine as part of your school’s game development curriculum? <a href="" target="_blank">Contact us</a> and we can start that conversation with you! <br /> <br /> You’ll also find free hands-on video courses and guided learning paths on <a href="" target="_blank">Unreal Online Learning</a>.<br /> <br /> Melissa Robinson | Mon, 30 Mar 2020 14:00:00 GMT <hr /> <h2><strong>What’s new in Twinmotion 2020.1</strong></h2> The release brings improved realism, new features to review and present projects, and much more! Watch the webinar.<br /> <br /> We recently hosted the live webinar What’s new in Twinmotion 2020.1. If you missed it, no problem! The replay is available below. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> The <a href="" target="_blank">recent release of Twinmotion 2020.1</a> raises the bar for fast, easy real-time architectural visualization, providing a greater sense of presence and realism. In this webinar, Martin Krasemann, Twinmotion Technical Marketing Specialist at Epic Games, presents a deep dive into some of the new Twinmotion 2020.1 features. 
Find out how Twinmotion 2020.1 brings: <br />   <ul style="margin-left: 40px;"> <li>Improved visual fidelity and higher-quality lighting </li> <li>More realistic trees, plants, and people </li> <li>Faster and easier ways to add vegetation </li> <li>New features to review and present projects </li> </ul> <br /> You can watch the full webinar above. Looking for more webinars? Check out the full series <a href="" target="_blank">here</a>.<br /> <br /> Fri, 27 Mar 2020 12:00:00 GMT <hr /> <h2><strong>Virtual cities and digital twins with Unreal Engine</strong></h2> Studio Zoan leverages real-time tools to create everything from a digital 3D replica of Helsinki to digital twins of buildings that can interact with third-party data from Internet of Things sensors.<br /> <br /> If you could replicate an entire city in 3D, the possibilities would be endless. Real estate could be bought and sold with prospective clients touring houses and neighborhoods in VR. Planners could test out urban designs and assess their impact on traffic and the environment. Virtual tours of key attractions could boost tourism. You could even host VR concerts, connecting music fans from around the world.  <br />   <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <br /> One visualization studio has achieved just this. <a href="" target="_blank">Zoan</a> is a Finnish company that uses real-time technologies to create different kinds of immersive experiences across a range of sectors. The studio’s passion project over the past few years has been a virtual recreation of the Finnish capital city, Helsinki.<br /> <br /> Virtual Helsinki is just one of many projects the studio has worked on that share a common ethos: supercharging ROI by creating many different types of content from one model. 
“Using Unreal Engine, we can create animation, images, VR, 360 tours,” says Miikka Rosendahl, Founder and CEO of Zoan. “This is a totally new concept for clients, because they used to just pay for that one image. Now, they get all types of content.”<br />   <h3><strong>A digital 3D replica of Helsinki </strong></h3> <a href="" target="_blank">Virtual Helsinki</a> is a 3D digital copy of the city that can be used for everything from VR experiences to real-estate solutions. It’s been leveraged by <a href="" target="_blank">Business Finland</a> to promote Helsinki at events such as <a href="" target="_blank">Slush Singapore</a>, and formed part of Finland’s technology showcase when the city was named the <a href="" target="_blank">2019 European Capital of Smart Tourism</a>.<br /> <img alt="Spotlight_Zoan2_blog_Image_6.jpg" height="auto" src="" width="auto" /><br /> Developing a digital replica of a city naturally involves working with huge datasets. This is one of the reasons Zoan turned to Unreal Engine to build the environment. “Dealing with large city files was previously unthinkable if we tried to use traditional 3D modeling software,” explains Jonathan Biz Medina, Partner and CPO at Zoan. “Unreal Engine was able to give us real-time previews of such high quality that we fell in love with it, and we’ve been using it ever since.”<br /> <img alt="Spotlight_Zoan2_blog_Body_Image_2.jpg" height="auto" src="" width="auto" /><br /> In fact, the visual quality factor is one of the reasons the studio switched to Unreal Engine from an alternative solution. “In the beginning, we actually started with another game engine, but we were always lacking that visual quality,” says Rosendahl. 
“Then we saw some content pieces created with Unreal that were so visually stunning that we said ‘OK, if we want to achieve that level of photorealism, this is the way to go for us.’ ”<br /> <img alt="Spotlight_Zoan2_blog_Body_Image_3.jpg" height="auto" src="" width="auto" /><br /> While some of the team were initially cautious about making the switch, any fears were quickly allayed once they dived in. “When they realized you have all these cool new tools such as Blueprints that enable you to achieve way more, quicker, they saw that this was actually much easier than they thought,” says Rosendahl.<br /> <br /> The <a href="" target="_blank">Blueprint visual scripting system</a> puts Unreal Engine’s programming tools at the fingertips of non-programmers via a system that is visually orientated rather than code-orientated. It provides a gateway into real-time workflows for less technically minded members of teams.<br /> <br /> For team members versed in C++, on the other hand, open access to Unreal Engine’s source code provides the freedom to customize and extend the engine. This was another reason Zoan decided to switch real-time tools. “That's something that we couldn't do with the other game engine,” says Rosendahl. <br />   <div style="padding:56.25% 0 0 0;position:relative;"><iframe allow="autoplay; fullscreen" allowfullscreen="" frameborder="0" src="" style="position:absolute;top:0;left:0;width:100%;height:100%;"></iframe></div> <script src=""></script>   <h3><strong>Digital twins with virtual buildings and IoT technology </strong></h3> Zoan also provides architectural visualization through <a href="" target="_blank">Zoan Digital Building</a>—a virtual design and construction service. Real-time technology is the driving force behind this offering. “Before, we were too dependent on rendering an image and waiting for the client to approve it,” says Medina. 
“Unreal gave us the ability to make those changes in real time, so we no longer have to wait hours for one image to appear.”<br /> <img alt="Spotlight_Zoan2_blog_Body_Image9.jpg" height="auto" src="" width="auto" /><br /> This has had a significant impact on the studio’s turnaround times, much to the delight of its clients. Rosendahl recalls a recent archviz project on which Zoan switched from traditional rendering to real-time rendering halfway through the project in order to deliver on time. “The client didn't see any difference in the quality, and now they were getting their new iterations quickly,” he says. “They were like, ‘What happened?’ We said, ‘That's our new technology. That's Unreal Engine.’ ”<br /> <br /> Now Zoan can complete 90% of their projects in house, visiting the client’s office to finish the last 10% for small tweaks such as swapping out furniture variations. “That would never have been possible with traditional rendering,” says Rosendahl.<br /> <img alt="Spotlight_Zoan2_blog_Body_Image_7.jpg" height="auto" src="" width="auto" /><br /> Leveraging the power of real-time technology has fundamentally flipped the archviz business model for Zoan. Using traditional rendering, the studio would sell images and seconds of animation. With photorealistic interactive environments at its disposal, that paradigm no longer makes sense. “Now, it's more about selling square meters and creating that space with cool lighting,” says Rosendahl. “Then we can just take screenshots and create animation in real time, which there was no way of doing with a traditional rendering pipeline.”<br /> <img alt="Spotlight_Zoan2_Blog_Body_Image_11.jpg" height="auto" src="" width="auto" /><br /> The advent of <a href="" target="_blank">real-time ray tracing</a> has meant that the quality of work the studio can deliver has skyrocketed. 
“It was a very exciting moment for us when ray tracing came into the picture, because we were able to tell clients that we could deliver the same quality as traditional renderings and mass-produce the content,” says Rosendahl. “It has been totally groundbreaking for us.”<br /> <br /> Zoan has started to go beyond architectural visualization to experiment with digital twins, leveraging Unreal Engine to create physical user interfaces that can be used to interact with third-party data such as people flow or data from <a href="" target="_blank">Internet of Things</a> (IoT) sensors.<br /> <br /> The idea came about after the studio realized it was creating realistic digital 3D models for clients that were then often discarded after the marketing phase was over. “We thought we could provide lifetime value for our clients,” says Rosendahl. “With real-time technology, we realized we could pull in IoT data that could be visualized in the digital twin that we created.”<br />   <h3><strong>Augmented reality for live entertainment </strong></h3> Beyond architectural visualization, digital twins, and virtual cities, Zoan is also active in the fledgling XR live entertainment industry. A recent project saw the studio collaborate with Warner Music Group and <a href="" target="_blank">Finnish singer-songwriter Vesala</a> for the artist’s live show at the largest indoor arena in Finland. “We made this spectacular augmented reality experience where the audience could share a synchronized AR layer for the whole duration of the concert,” says Laura Olin, COO and Partner at Zoan.<br /> <img alt="Spotlight_Zoan2_blog_Body_Image_5.jpg" height="auto" src="" width="auto" /><br /> The audience was able to view visual effects such as blooming flowers and shooting stars via mobile phone, which synchronized with Vesala’s songs. “What real-time technology allows us to do, especially AR, is break the boundaries of creativity,” says Medina. 
“We can give creators freedom to create whatever they want and explore artistic possibilities that are no longer bound by physical reality.”<br /> <img alt="Spotlight_Zoan2_Blog_Body_Image_10.jpg" height="auto" src="" width="auto" /><br /> Whether it’s opening up new creative avenues for performers like Vesala, or providing diverse marketing materials for real-estate developers, Zoan is convinced real-time technology is set to transform the industries it operates in. The studio predicts all its clients will soon start to look beyond the limitations of traditional processes. “They're not going to be satisfied with their long rendering times anymore,” says Rosendahl. “They're not going to want to wait to make the design iterations. It’s game-changing.”<br /> <br /> Waving goodbye to long render times will come as something of a relief to the Zoan CEO, too. “I would say I'm pretty traumatized from render farms,” says Rosendahl. “We have spent so much money getting something and it not being right. Being able to say ‘that's history for us’ just makes us super happy.”<br />   <hr />Want to create your own architectural visualizations or VR experiences? 
<a href="" target="_blank">Download Unreal Engine</a> for free today!<br />  ARArchitectureBlueprintsBroadcast & Live EventsDesignRay TracingVirtual HelsinkiVisualizationVRZoanDigital TwinWed, 25 Mar 2020 11:30:00 GMTWed, 25 Mar 2020 11:30:00 GMT into action with this free environment collection from Project Nature available to download for free, the Project Nature environment collection helps you create lush landscapes and interactive plant life with ease.In collaboration with <a href="" target="_blank">Project Nature</a>, Epic Games has released over 20 environment and vegetation products for free on the Unreal Engine Marketplace.<br /> <br /> The collection includes over 75 varieties of plant species, many of which were photoscanned from real-world vegetation, ranging from towering trees to the smallest of ground plants and flowers. Create a refreshing oasis retreat, escape to a serene meadow, or adorn a warm desert landscape for your audience to explore; with more than 550 optimized models, the sky, or the ground, is the limit on what you can grow!<br /> <br /> <span style="color:#3498db;"><strong><a href="" target="_blank">Download Project Nature’s environment collection</a></strong></span><br /> <img alt="News_SpringAssetGiveaway_BodyImg.jpg" height="auto" src="" width="auto" /><br /> Crossing grasslands and forests, Project Nature scanned interesting plants across the globe and assembled this collection over the course of two years. Now you have unlimited access to:<br />   <ul style="margin-left: 40px;"> <li>Hundreds of optimized plant models and materials</li> <li>The shader-based Project Nature Wind System 2.0</li> <li>The Project Nature Interaction System 2.0, including replicated animations</li> <li>A dynamic low-poly grass system</li> <li>A dynamic ivy creation pack</li> <li>And more!</li> </ul> <br /> This is the latest free content offering available to Unreal Engine creators. 
To find all available free content, visit the <a href="" target="_blank">free section of the Marketplace</a> and make sure to check back on the first Tuesday of each month to find even more free content!<br /> <br /> Tue, 24 Mar 2020 14:00:00 GMT <hr /> <h2><strong>Students create stunning real-time ray tracing automotive project</strong></h2> Valentin Bécart, a student from ISD Rubika, explains how he and fellow student Odilon Loïez created an impressive real-time ray-traced project showcasing Hyundai’s Genesis Essentia Concept car.<br /> <br /> Hi! I’m Valentin Bécart, a digital design student studying at ISD Rubika in France. I started learning Unreal Engine last year and my goal has become to work in the automotive industry. I’m here to showcase how my two-person team developed the <a href="" target="_blank">Genesis Essentia real-time experiment</a> using Unreal, which was a passion project of ours. The <a href="" target="_blank">Hyundai concept car</a> you see in the video was designed by professional automotive designer <a href="" target="_blank">Sasha Selipanov</a>, and my friend and development partner <a href="" target="_blank">Odilon Loïez</a> modeled it for our project. You can see the video below.  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> With this project, we really wanted to establish an efficient pipeline we could leverage moving forward. Odilon was in charge of the Blender phase, the model of the car, all the Data Prep to use the model in real time, and the rig of the vehicle to create the animation. My job was to work with Unreal and all the visualization aspects. <br /> <br /> We chose Unreal Engine over pre-calculated, offline rendering software because of its ability to render images and videos in real time, coupled with the fact that the engine started offering ray tracing support. It’s an aspect we were excited to experiment with. 
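In Unreal Engine 4 (4.22 and later), ray tracing support of this kind is switched on and tuned largely through console variables. As a hedged sketch only, settings along these lines in a project's DefaultEngine.ini enable ray-traced reflections; the cvar names below are from the stock UE4 set, and both names and values should be verified against your engine version's documentation:

```ini
; Sketch only -- verify cvar names and values against your UE4 version.
[SystemSettings]
; The compute skin cache must be enabled for ray tracing in UE4.
r.SkinCache.CompileShaders=1
; Enable the ray tracing pipeline.
r.RayTracing=1
; Reflection bounce count and samples per pixel, the kind of
; quality/performance trade-off discussed later in this article.
r.RayTracing.Reflections.MaxBounces=2
r.RayTracing.Reflections.SamplesPerPixel=8
```

At runtime, the same `r.RayTracing.Reflections.*` variables can also be changed from the in-editor console to compare different bounce counts interactively.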
<br /> <br /> For our modeling toolset, we decided to use Blender because it's powerful, free software. Odilon had also been using it for years, thanks in large part to his last internship at KISKA, a design agency in Austria.<br /> <br /> We went with Unreal Engine, in part, because a lot of automotive companies are including the tech in their pipelines. We also like how Unreal has the ability to create high-quality real-time ray-traced images and videos. In addition, thanks to my last internship at Light & Shadows, I’ve learned a lot about how to incorporate data within a real-time engine. <br /> <br /> Throughout the process, we also leveraged Unreal Engine’s VR capabilities, which we found very helpful to visualize our materials and the volume of the car. <br /> <br /> Below, you can see the final model Odilon made using Blender:<br /> <img alt="TechBlog_Genesis_012.jpg" height="auto" src="" width="auto" /> <h3><strong>The layout of our scene</strong></h3> The best way to see the effects of real-time ray tracing is to use lots of reflections, so we created a layout that would showcase many reflections throughout the car and the environment. <br /> <br /> As you can see in the image below, we decided to use two screens to project lights and reflections within our scene. The background videos you see on the panels are the work and property of <a href="" target="_blank">Studio A N F</a>, which really fit the mood of our artistic vision. <br /> <img alt="TechBlog_Genesis_013.png" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>Here is the final representation of our environment.</em></div> <h3><br /> <strong>Setup and lighting </strong></h3> To display and play the video on our screens, we used <a href="" target="_blank">Media Texture</a> and Blueprints, respectively. With this setup, the lights from the screen are able to reflect onto our car. To create our illumination, we used a rect light coupled with Gaussian blur. 
This technique is used for both screens surrounding the vehicle. <br /> <img alt="TechBlog_Genesis_014.png" height="auto" src="" width="auto" /><br /> To create the animation, we used Unreal’s <a href="" target="_blank">Sequence Editor</a>, which we found to be very powerful and intuitive. It was very helpful because the tool not only lets you adjust the location of shots, but also lets you adjust object settings like materials, intensity, and color. We also liked how it allowed us to play around with the camera’s aperture and depth of field.  <br />   <h3><strong>View Modes</strong></h3> We found that View Modes can really help with visualization, as they provide access to reflections, diffuse, and lighting. We’ve sectioned off the different view modes from our project below. From left to right, they include: Final, Diffuse, Lighting + Reflection, and Reflection only.<br /> <img alt="TechBlog_Genesis_015.jpg" height="auto" src="" width="auto" /> <h3><strong>Ray tracing</strong></h3> When real-time ray tracing became available, we were impressed by the power and beauty of what we could get within the viewport.  <br /> <br /> For our project, we used low quality settings for the reflections due to the limitations of our graphics card. Specifically, we chose to use two reflection bounces with between 8 and 16 samples per pixel. The contrast between Screen Space Reflections and Ray Traced Reflections is astonishing. 
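In UE4, these reflection limits are typically controlled through console variables; the fragment below is a hedged sketch of the equivalent configuration (cvar names from UE4's ray tracing settings; the values simply mirror the limits described above, not a recommendation):

```ini
; DefaultEngine.ini: cvars placed under [SystemSettings] apply at startup.
; Two ray-traced reflection bounces at a low sample count.
[SystemSettings]
r.RayTracing.Reflections.MaxBounces=2
r.RayTracing.Reflections.SamplesPerPixel=8
```

Higher bounce and sample counts improve inter-reflections at a steep GPU cost, which is why the team capped them on their hardware.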
You can see the difference between both in the following images:  <div class="juxtapose"><img alt="TechBlog_Genesis_016.png" class="before-after-slider-1" height="auto" src="" width="auto" /><img alt="TechBlog_Genesis_017.png" class="before-after-slider-2" height="auto" src="" width="auto" /></div> <div style="text-align: center;"><em>Left: Screen Space Reflections; Right: Ray Traced Reflections</em></div> <div style="text-align: center;"> </div> <h3><strong>Ray tracing details</strong></h3> To experiment with the power of real-time ray-traced reflections, we played around with various settings. It was impressive to see how many reflection bounces we could have in the viewport. For example, let’s take a look at how it applies to our car’s wheels:<br /> <img alt="TechBlog_Genesis_018_One-bound.png" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>One reflection bounce</em><br /> <img alt="TechBlog_Genesis_019_Two-bound.png" height="auto" src="" width="auto" /><br /> <em>Two reflection bounces</em><br /> <img alt="TechBlog_Genesis_020_Three-bound.png" height="auto" src="" width="auto" /><br /> <em>Three reflection bounces</em><br /> <img alt="TechBlog_Genesis_021_Final-result.png" height="auto" src="" width="auto" /></div> <div style="text-align: center;"><em>Above is the final result with two reflection bounces in the viewport, a limit imposed by our personal computer.</em></div>   <h3><strong>In conclusion</strong></h3> Tackling this endeavor allowed us to learn and experiment a lot. It enabled us to find an effective workflow between Blender and Unreal Engine within the context of an automotive project. We’re happy to say that we were able to establish an efficient process that we’ll be able to leverage in future projects. <br />   <hr />Looking for more Unreal Engine learning resources? 
We have over 40 hours of free, hands-on video courses and guided learning paths available on <a href="" target="_blank">Unreal Online Learning</a>.<br />  Automotive & Transportation | Art | Blueprints | Community | Design | Education | Ray Tracing | VR | Genesis Essentia | Valentin Bécart | Tue, 24 Mar 2020 11:30:00 GMT<br /> <br /> <em>Arise: A Simple Story</em> uses its art direction to tie its story and gameplay together<br /> Spanish studio Piccolo shares their artistic workflow and explains how they created <em>Arise</em>’s amazing animation, VFX, and lighting. <div>New Spanish studio <a href="" target="_blank">Piccolo</a> formed when partners in the digital advertising industry decided to quit their financially stable jobs working with big brands such as Nike and Coca-Cola to pursue their game-development dreams. Considering the studio founders were not only financially successful but had earned many accolades in their previous roles, this departure equated to a significant gamble on their part. Despite stepping into unfamiliar territory, Piccolo’s first title, <em>Arise: A Simple Story</em>, garnered great reviews, which is a rarity for a brand-new studio. The game’s art direction, in particular, was heavily praised, with sites like <a href="" target="_blank">PlayStation Universe</a> stating, “<em>Arise</em> has an amazing art style. I was captivated by the visuals from beginning to end.” </div> <br /> To see how the fledgling developer was able to create such a charming, minimalistic look, we reached out to several members of the team. The Piccolo devs talk about how they looked outside of the gaming industry for inspiration, most notably to Studio Ghibli and classic animated Disney films. They share their philosophy for how colors and shapes can be used to convey different moods and delve into how they created the game’s innovative rewind mechanic, which allows players to advance and reverse the flow of time. 
Finally, they walk artists through their workflow and talk about how they created <em>Arise</em>’s lighting, VFX, and animation systems.   <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong>Arise features a very imaginative, minimalist art style that is at times playful, enchanting, and mysterious. How did you come up with the artistic vision for the game?</strong> <br /> <br /> <strong>Art Director Jose Luís Vaello: </strong>There are multiple factors to take into account here. First, we were telling a simple story and wanted to translate that simplicity to the art style, but we didn’t want to go too low-poly or low-fidelity. We spent a lot of time experimenting with the lighting and atmosphere because those are elements that really set the mood of the game.<br /> <img alt="arise-1920.jpg" height="auto" src="" width="auto" /><br /> We were also telling a very emotional story, and we wanted the emotion conveyed in each level to [resonate]. If you add too many elements into a scene, your attention is drawn to each individual detail instead of the whole. For this reason, each of our levels has a different color palette with a dominant color.<br /> <br /> In the end, it’s all about conveying an emotion: cheerful, melancholic, sad, etc. It was an exercise of self-restraint, especially in the bleaker chapters. In video games, it’s common to just add as many things as the budget and GPU can handle to “enrich” the visuals. <br /> <br /> <strong>Were there any games or works of art that influenced the look of <em>Arise</em>?</strong><br /> <br /> <strong>Vaello:</strong> We try not to use other games as references when building our own because we believe leaning on references outside the industry will make our games more unique. Still, we are gamers ourselves and we are influenced by what we play. 
I would say that <em>Inside</em> is the most obvious influence. It is minimalistic, too, but its simplicity comes from a lot of work, and it has an outstanding attention to light and atmosphere.<br /> <br /> Still, our main influences actually come from traditional hand-drawn animated movies. Studio Ghibli, of course, but also a lot of classic Disney films. Fantasia, for instance, was a huge inspiration visually, and serves as a great example of how to create emotion and a wordless narrative with [inanimate objects] coupled with a [compelling] soundtrack. <br /> <br /> Also, in terms of animation and color palettes, we took inspiration from many Pixar films, particularly “Up,” which uses color effectively to play a big role in shaping its narrative arc. <br /> <br /> <strong>With the game's unique time-bending mechanic, snow in an environment can spring forward and melt to reveal flowers underneath. Was it challenging to get the same levels to look equally stunning across various seasons?</strong><br /> <br /> <strong>Technical Artist Jordi Ministral: </strong>The biggest challenge when producing the game was transforming the environments. Most games feature static environments with characters moving through them, so engines are optimized to work this way. Our game was the opposite: We have a single character and we dynamically alter the environment around him. Depending on the chapter, this involved updating the transforms of all assets in the level, or morphing a mesh in real time for a melting patch of snow, updating collisions, and more. In other chapters, the daylight cycle forced us to use dynamic sunlight with dynamic shadows. <br /> <br /> Getting this to run at a good framerate on consoles was tricky, and we had to come up with a number of creative tricks to make it work. For example, all of our environment actors have collisions disabled, and collision is only enabled when an object comes within a certain distance of the player. 
This saved a lot of CPU [cycles] in levels where scrubbing time involved updating thousands of actors.<br /> <img alt="sc75k6.png" height="auto" src="" width="auto" /><br /> <strong><em>Arise</em> features a wide variety of visual effects that showcase fire, snow, wind, and more. How did the development team implement the game’s VFX?</strong><br /> <br /> <strong>Ministral: </strong>As mentioned in the previous answer, since our gameplay features a timelapse mechanic, we had to make everything “rewindable.” Traditional particle effects move forward all the time, and the state at time t depends on the state at t-1. You can’t send the simulation to a specific time in real time, because that would mean running the simulation from time 0 to the time you want to display at each tick. So we ended up creating our own “rewindable” particle systems that use two different techniques:<br /> <br /> 1) If the movement you want to represent can be expressed with a mathematical formula, you implement the formula in the vertex shader. Falling debris is a good example of this, as a falling object’s position is:<br /> <br /> P(t) = P(t<sub>0</sub>) + V(t<sub>0</sub>)·t + (G·t<sup>2</sup>)/2<br /> <br /> So, with a base formula and some clever per-particle randomization, you can simulate falling rocks that execute on the GPU and can be sent to any moment in time [simply] by setting a value for t. We did this for dust clouds that simply project outwards from a starting position and also for ambient effects like rain and snowflakes.<br /> <br /> 2) For more complex simulations that cannot be easily expressed with a formula, we created a system that bakes the complete simulation into textures. Then the shader simply samples the texture at the position corresponding to the point in time that is to be represented. 
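Both techniques amount to stateless evaluation: the particle state at any time t is computed directly, never stepped from t-1. A minimal CPU-side sketch in Python (hypothetical names; in the game itself this logic runs in vertex shaders and textures):

```python
G = -9.8  # gravity constant (assumed units, m/s^2)

def debris_position(p0, v0, t):
    """Technique 1: closed-form ballistic motion, evaluable at any time t.
    Implements P(t) = P(t0) + V(t0)*t + (G*t^2)/2 on the vertical axis."""
    x0, y0 = p0
    vx, vy = v0
    return (x0 + vx * t, y0 + vy * t + 0.5 * G * t * t)

def bake_simulation(simulate, duration, samples):
    """Technique 2: run a complex simulation once and bake its states into
    a lookup table (standing in here for a baked texture)."""
    return [simulate(duration * i / (samples - 1)) for i in range(samples)]

def sample_baked(baked, duration, t):
    """Shader-style lookup: fetch the baked state nearest to time t."""
    i = round((t / duration) * (len(baked) - 1))
    return baked[max(0, min(i, len(baked) - 1))]
```

Because both evaluations depend only on t, scrubbing time backward is exactly as cheap as playing it forward, which is what makes the particles "rewindable."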
We get fully rewindable, cheap GPU particles at the cost of some memory, but since our game is light on textures, it was not a problem [for us].<br /> <br /> We also have a system that bakes a spline’s positions and tangents into textures that can be accessed within the shader to drive an animation, so artists can just tweak the spline within the editor and see the effect adapt. Once you start seeing a texture as abstract information, there are lots of creative things you can do.<br /> <img alt="sc75k7.png" height="auto" src="" width="auto" /><br /> <strong>With lighting that includes immersive sun shafts, bloom, and more, how did Piccolo Studio light the game?</strong><br /> <br /> <strong>Ministral:</strong> First of all, we use only what the engine provides, which is more than enough for us. We are a small team, and we think we should focus on creativity and let Epic do what they do best.<br /> <br /> The most important decision [we made] was to spend a big part of our GPU budget on lighting, volumetric fog, and DOF. So, for consoles, we knew we couldn’t afford complex materials or a huge amount of triangles. Draw calls needed to be minimized, too. We had already decided on a simple art style, so it was doable.<br /> <br /> Then we had to come up with a workflow that allowed us to iterate on lighting fast. We wanted the art director to be able to adjust lighting. The problem is that lighting a scene involves a skylight component, a directional light component, exponential height fog, and so on: many components with many different parameters, organized in a way that makes sense from a programmer’s point of view but is very difficult for an artist to grasp.<br /> <br /> If I want the shadows of my scene “bluer,” I know I need to adjust the indirect lighting tint in a global post-process volume; but if I want the background of the scene lighter, I will have to adjust the scattering color of the height fog, and this is another actor in the level. 
Light shafts affect fog but are set up in the directional light. Moreover, if, like in <em>Arise</em>, all of these evolve over time, you have to set up a sequence for all of these actors, add keys for all of these properties, and teach artists these methodologies in Sequencer. If you are a big team, you can afford a lighting artist, but the art director will still need to talk to them, and sometimes things can get lost in translation.<br /> <br /> After trying several approaches, we came up with a system that artists could learn: we have a single actor that encapsulates all components relevant to [lighting] a scene (skylight, directional light, fog, and an unbound post-process volume). This actor is given a data asset, which we call a light asset, that is a collection of all the relevant properties of those components. The actor copies values from the collection to the components in real time. Then the art director only needs to open the asset, adjust values, and see the changes in the scene; he doesn’t need to worry about where each property really belongs. The asset is just values that an artist can understand, so they can iterate on light and ambiance without knowing about components or [relying on] a technical artist.<br /> <img alt="image7.png" height="auto" src="" width="auto" /><br /> Once this system is in place, you get some positive effects: a scene’s entire ambience is contained within one asset, so you can have multiple settings and backup versions of your light by simply duplicating the asset. You can switch between them to see how the mood changes. You can also have one asset define the beginning of your time lapse, another define the end, and lerp between them on an alpha value.<br /> <img alt="image8.png" height="auto" src="" width="auto" /><img alt="image6.png" height="auto" src="" width="auto" /><br /> Then we came up with the concept of light volumes that work exactly [like] post-process volumes. 
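Lerping between two light assets on an alpha value, as described above, boils down to a per-property interpolation. A minimal sketch (hypothetical property names; the real light assets are UE data assets feeding skylight, fog, and post-process components):

```python
def lerp(a, b, alpha):
    """Linear interpolation between two scalar property values."""
    return a + (b - a) * alpha

def blend_light_assets(asset_a, asset_b, alpha):
    """Blend every shared property of two 'light assets' (plain dicts here).
    Tuple-valued properties (e.g. RGB colors) are blended per channel."""
    blended = {}
    for key in asset_a:
        a, b = asset_a[key], asset_b[key]
        if isinstance(a, tuple):
            blended[key] = tuple(lerp(x, y, alpha) for x, y in zip(a, b))
        else:
            blended[key] = lerp(a, b, alpha)
    return blended

# Hypothetical example: two moods for one chapter, blended halfway.
morning = {"sun_intensity": 10.0, "fog_density": 0.02, "shadow_tint": (0.2, 0.3, 0.8)}
evening = {"sun_intensity": 2.0, "fog_density": 0.08, "shadow_tint": (0.8, 0.4, 0.2)}
halfway = blend_light_assets(morning, evening, 0.5)
```

A light-volume blend works the same way, except alpha is derived from the camera's position relative to the volume, and the result is blended against the level's default light asset before being pushed to the underlying components.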
As the camera enters a light volume, all properties of that volume’s light asset are weight-blended with the default light of the level and fed into the underlying components. We ended up with lighting that was dynamic in space and time and that artists could tweak on their own, so we could iterate many more times, and the result was way better.<br /> <br /> Finally, we invested a lot in lightmaps. Getting subtle, indirect lighting that was never too dark or too bright was tricky, and required a lot of iteration. Lightmap artifacts and low-resolution imperfections are usually hidden by a texture’s detail, but our textures are almost flat, so we were very sensitive to lightmap bleeding. We tweaked lightmap resolution for each actor manually and ended up changing some internal Lightmass settings to achieve the clean look we wanted.<br /> <br /> <strong>The way the unnamed protagonist traverses the environment, coupled with how the seasons change the look of levels, is charming. How did the team execute the animations in <em>Arise</em>? </strong><br /> <br /> <strong>Producer Alexis Corominas: </strong>Acting-wise, we used Clint Eastwood as a reference. We are big fans of his and the way he hides his emotions, but still manages to convey complex [subtleties] with the slightest of moves. When we modeled the character, we gave him gloves so he didn’t have fingers, and we didn’t rig the face on purpose. This forced the animators to work only with body language.<br /> <img alt="image4.png" height="auto" src="" width="auto" /><br /> We also set up our animation Blueprint so that for each state in our state machine, we can randomly pick one animation from a list of candidates, and animators can simply add variations of each action to that pool of animations. This makes the character feel more alive because each time he executes an action, it is a little different.<br /> <br /> We have “personality” volumes that override the pool of animations for a specific action. 
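The per-state random pick from a pool of candidates, with a "personality" volume overriding the pool, can be sketched as follows (hypothetical names; in <em>Arise</em> this logic lives in an animation Blueprint):

```python
import random

def pick_animation(state, base_pools, active_volume=None, rng=random):
    """Pick one animation for a state machine state.
    base_pools: dict mapping state name -> list of candidate animation names.
    active_volume: optional dict of overriding pools from a 'personality'
    volume the character is currently standing inside."""
    pool = base_pools[state]
    if active_volume and state in active_volume:
        pool = active_volume[state]  # the volume's mood-specific variants win
    return rng.choice(pool)

# Hypothetical pools: walking normally vs. inside a "sad" personality volume.
pools = {"walk": ["walk_a", "walk_b", "walk_c"]}
sad_volume = {"walk": ["walk_slouched"]}
```

Each time the state fires, a different variant can play, which is what keeps repeated actions from looking identical.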
Animators can place these volumes in the map, and when the character is inside one, he will play a different animation for that action that better matches the emotion of the moment.<br /> <br /> Also, we had different pools of idle breakers for each chapter, because each is devoted to an overall emotion. In the end, it was very important for us that the animations felt cohesive, but that you felt subtle differences at every step of the journey.<br /> <br /> <strong>With the brighter, more cheerful levels generally being easier and the sadder, darker areas more challenging, <em>Arise</em> elegantly blends story, gameplay, and art together. Can you elaborate on how you pulled this off?</strong><br /> <br /> <strong>Corominas:</strong> First, for each chapter, we decided on the single emotion that we wanted to convey. Then we came up with all of the [design] elements that would work together to achieve that emotion, [such as:]<br /> <br /> <u>Shapes</u>: Rounded shapes are easier on the eyes, so we used them for “good” memories, whereas sharp edges, high walls, and menacing cliffs were used for our “bad” memories.<br /> <br /> <u>Colors</u>: We used saturated, vivid colors to convey positive and happy emotions. For sad and bleak moments, we used desaturated colors and a more uniform palette, because in sad moments, the world itself seems muffled and lifeless.<br /> <br /> <u>Gameplay</u>: Platforming and jumping are exhilarating, so we mainly used them in the childhood, youth, and joyful moments. Somber moments, on the other hand, demand more reflective gameplay. As an example, we had some clever puzzles with snail shells in the [early] level “Joy,” but we removed them. 
They were good puzzles, but they felt like work, and childhood is about experiencing the world in a safe environment, not thinking your way out of problems.<br /> <br /> <u>Difficulty</u>: We used the traditional concept of video game “challenge” in our memories of youth because it makes sense for teenagers to challenge themselves. Difficulty during the old-age phase comes by way of environmental hazards, because being old can be about ordinary actions that become harder and harder, not about overcoming great challenges.<br /> <br /> If you have a clear idea of what you want to achieve at a high level, it helps a lot with all of your low-level decisions, because you have a reason for everything. It gives your project consistency.<br /> <br /> <strong><em>Arise</em> features a very minimal user interface. Why was this a good fit for the game and how did the team design it?</strong><br /> <br /> <strong>UI Designer Oriol Pujadó:</strong> As with everything else, this stems from our high-level concept of achieving simplicity. We wanted players to connect with the story in a very raw, wordless, and emotional way, so the simpler, the better. 
However, with user interfaces and especially with tutorials, there’s a thin line there that you cannot cross, because if players don’t understand your tutorials, it will ruin the game for them.<br /> <br /> <strong>Can you walk us through the studio's artistic workflow?</strong><br /> <br /> <strong>Corominas: </strong>Once we decide on what a chapter will be about and what players will be doing in terms of gameplay, designers start working on a grey-box layout of the level and the art team does some concept artwork [to establish] the general mood of the scene.<br /> <img alt="image2.png" height="auto" src="" width="auto" /><br /> Then when designers have something “playable,” the art director does some paintovers to make structures more interesting, and artists do the first iteration of assets.<br /> <img alt="image3.png" height="auto" src="" width="auto" /><br /> Then the map goes back to designers and they make sure everything is still playable and re-adjust things. Basically, we go back and forth between the art and design team until we run out of production time! <br /> <br /> In <em>Arise</em>, we also have an automatic camera. You will notice when you’re playing the game that we are constantly adjusting the point of view, distance, and FOV depending on where you are. So, as the level art is being iterated, we are also iterating the camera. Each level has hundreds of camera volumes that are blended together as the player moves around, and these need to be tested and iterated a lot to ensure the perspective is not making you miscalculate a jump.<br /> <br /> The last thing we do is lighting (including lightmaps) and [manage the] atmosphere, as it relies on the art assets for each level to be reasonably “final.”<br /> <br /> <strong>Can you talk about your Unreal experience coming into the project?</strong><br /> <br /> <strong>Corominas:</strong> Some of the team members we hired did have previous Unreal experience and had worked on released games. 
For us, it was our first game, so we didn’t have any experience with any engine. We actually had to do research on game engines. After we decided on UE, we then recruited people with UE experience. <br /> <br /> <strong>Why was Unreal Engine a good fit for the game?</strong><br /> <br /> <strong>Corominas:</strong> Because it is brilliant, powerful, and easy to work with. <a href="" target="_blank">Blueprints</a>, on its own, is a game changer, as programmers are no longer your bottleneck. Same thing with the shader editor; 3D artists can learn how to create and adjust most of the <a href="" target="_blank">materials</a>, and your technical artists can focus on the more complex stuff. The more your non-technical [people] can work within the editor on their own without the help of a technical profile, the bigger your productivity boost, and you can see how UE was designed with that in mind. For small studios with limited resources, this productivity boost is even more important than the pure technological edge of having the newest graphical effects available to you.<br /> <br /> <strong>Ministral: </strong>There’s this misconception of Unreal being only fit for hyper-realistic rendering and AAA games, but this is simply not true. Just because the default settings are geared for realism doesn’t mean stylized rendering is difficult to achieve; it is not. Sometimes developers want to use every feature just because it is available to them. You have to “decide” not to use AO or motion blur if they don’t suit your style, or to keep your materials simple. Sometimes just a diffuse and a normal map will do. <br /> <br /> <strong>How did you learn the engine?</strong><br /> <br /> <strong>Ministral:</strong> Lots of tutorials and many hours analyzing template projects. We do have a background in digital production and motion graphics, so the core concepts of rendering are not unfamiliar to us; we only had to learn the specifics of the engine. 
Whoever designed it must have a similar mindset to ours, because everything just made sense; learning it came very naturally. Of course, it takes time to learn all the available components and what every property does, but the basic workflow was easy to grasp. Also, as we recruited people with experience, we could lean on that experience to speed up the process.<br /> <br /> <strong>Can you provide any development tips for artists using Unreal Engine?</strong><br /> <br /> <strong>Corominas:</strong> It sounds obvious, but keep your Content Browser tidy: agree on a naming convention at the start of the project and enforce it. It is tempting to just start working in personal folders and intend to figure out a proper organization later in the project, but this never happens, and you are stuck with messy folders forever.<br /> <br /> <strong>Ministral: </strong>Trust the engine. Epic’s defaults are better than you think. Especially when it comes to lightmaps, your first iterations will very likely be disappointing, and you will be tempted to just ramp up all the quality settings until it looks nice. If you don’t really know what you are doing, don’t touch them. Follow the documentation. Make sure your lightmap UVs are properly set. Take into account that light color and intensity affect bouncing, too. Sometimes we do a first lightmap calculation with a white skylight of intensity 1; since we are basically bouncing an initial light of 1, it helps us see what is happening: which materials are bouncing enough light, and which ones are not.<br /> <br /> <strong>Thanks for your time. Where can people learn more about <em>Arise: A Simple Story</em>?</strong><br /> <br /> <strong>Ministral:</strong> You can find more about the game at <a href="" target="_blank"></a>. 
<br />  Arise: A Simple Story | Art | Artist | Artists | Blueprints | Community | Games | Piccolo | Design | Unreal Engine | UE4 | Jimmy Thang | Thu, 19 Mar 2020 17:00:00 GMT<br /> <br /> Virtual production on the battlegrounds of “Game of Thrones”<br /> Real-time technology powered virtual sets, virtual scouting, and shot exploration as The Third Floor helped showmakers realize increasingly complex scenes on <em>Game of Thrones</em>.<br /> The Third Floor is one of the entertainment industry’s most remarkable visualization companies. It has won a multitude of awards, including five Emmys, and is a major contributor to the visual storytelling that audiences see today—from Marvel Studios’ films like <em>Avengers: Endgame</em> to streaming <em>Star Wars</em> series such as <em>The Mandalorian</em> on Disney+. <br /> <br /> Since its inception in 2004, <a href="" target="_blank">The Third Floor</a> has helped content makers in film, television, video games, and location-based entertainment design and realize compelling projects. Its teams are experts at using the virtual world to plan and problem-solve ahead of shooting, leading the next wave of virtual production.<br /> <br /> The Third Floor’s work on the mega-popular TV series <a href="" target="_blank">Game of Thrones</a> helped to create some of the most memorable scenes on the show. In turn, constantly striving to push the boundaries of what’s possible has had a transformative impact on the studio’s pipeline. “The experience we had on <em>Game of Thrones</em> gave us the confidence to adopt Unreal Engine fully across the company,” says Kaya Jabar, Virtual Production Supervisor. 
<div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3><strong>Building virtual sets for huge-scale action</strong> </h3> The Third Floor became involved in the production of <em>Game of Thrones</em> back in the early days of the series, previsualizing select scenes that were going to be particularly complicated. With multiple departments, VFX vendors, and units working on the show, the studio’s mockups helped define and communicate the showmakers’ creative and technical vision. <br /> <br /> With the bar for epic visuals soaring and audience expectations rising season over season, The Third Floor’s role as a hub for design and planning became increasingly important. The studio needed a way to keep improving on the benefits of visualization—it needed new tools and new approaches.<br /> <img alt="Spotlight_ThirdFloor_Blog_Body_Image_1.jpg" height="auto" src="" width="auto" /><br /> One of these tools was a virtual scouting toolset, built by The Third Floor in Unreal Engine, that enabled the show’s team to explore and stage dramatic Season 8 scenes in increasingly ambitious environments. This became a vital asset to plan out shots for sets that were still under construction or were in various phases of design. The studio deployed multiple artists to Belfast early on to work with the art department and build virtual versions of sets in Unreal Engine for scouting. <br /> <img alt="Spotlight_ThirdFloor_Blog_Body_Image_4.jpg" height="auto" src="" width="auto" /> <h3><strong>Empowering production to experiment with real-time tools</strong></h3> Whether evaluating a location or envisioning a yet-to-be-created set, virtual scouting enables more time for experimentation, greater discussion, and the ability to better refine the shot list. Environments can be traversed in a matter of seconds from the comfort of the art department. 
Testing different lenses with real-world camera settings, a variety of lighting setups, and various character animation paths allows for more creative iteration.<br /> <img alt="Spotlight_ThirdFloor_Blog_Body_Image_3.jpg" height="auto" src="" width="auto" /><br /> Compared to pre-rendered frames, a real-time workflow lends itself to flexibility, quick iteration, and constant feedback that increases creativity and promotes communication between departments within the production. "As an extension of real-time, virtual reality offers complete immersion in the environment, leading to not only an increase in artistic considerations but also aiding with technical decisions," says Adam Kiriloff, Senior Real-Time Technical Artist at The Third Floor. "For example, how many horses can fit side by side in the main street of King’s Landing? Is a crane really needed for that over-the-wall shot and how do you go about framing with a massive dragon in the room?" <br /> <img alt="Spotlight_ThirdFloor_Blog_Body_Image_2.jpg" height="auto" src="" width="auto" /><br /> The virtual scouting toolset developed in house by The Third Floor enables shots to be planned in VR. The idea behind the tool was to provide a way to view previs environments from a more immersive perspective and accelerate the process by planning camera shots and testing lenses virtually. Using either a tablet or wearing a headset, users can view the environment or be fully immersed within it.<br /> <br /> A key feature is the “virtual lens”—a virtual screen attached to the controller in VR that users can hold up to plan shots within the virtual environment. The virtual lens mimics real-life camera and film-back settings. Lens configurations can be set in advance and lens swapping in VR is as easy as scrolling through a selection list. 3D annotation makes it possible to take notes and do group reviews. 
Measurements and distances can be calculated using a variety of laser and point-to-point measuring tools.<br /> <img alt="Spotlight_ThirdFloor_Blog_Body_Image_7.jpg" height="auto" src="" width="auto" /><br /> The Third Floor’s virtual scouting solution came into play on <em>Game of Thrones</em> in Season 8 as a way to plan camera shots and decide on character movements in sets like King’s Landing that were just starting to be constructed in the production backlot. Other sets that already existed, like the Throne Room, were virtually scouted as well in order to visualize the set in several stages of destruction and in different lighting scenarios. <br /> <br /> Working within the art department, artists from The Third Floor built digital 3D assets and ingested them into Unreal Engine, creating environments including Winterfell, The Red Keep, and Castle Black. "We created environment materials in Substance Designer, painted bespoke props in Substance Painter, and developed particle effects and set up lighting and atmospheric effects like fog and ash in Unreal Engine," says Kiriloff. "In some instances, we used photogrammetry to capture key props, such as the Iron Throne itself."<br /> <img alt="Spotlight_ThirdFloor_Blog_Body_Image_8.jpg" height="auto" src="" width="auto" /><br /> The environments were faithfully recreated from a variety of sources and accurately reflected real-world scale. Once a version of the set was in Unreal Engine, it could be scouted by everyone from production to visual effects to the art department in an HTC Vive head-mounted display using The Third Floor’s virtual scouting toolset. 
This approach was key for Season 8, Episode 6—the series finale—that saw the director of photography working with The Third Floor team to virtually scout action and camera coverage to produce rough blockings.<br /> <img alt="Spotlight_ThirdFloor_Blog_Body_Image_11.jpg" height="auto" src="" width="auto" /><br /> One such sequence in this episode sees Jon Snow making his way up the stairs of the crumbling Red Keep to meet Queen Daenerys as she surveys the ruins of King's Landing and addresses her armies. This sequence began life as an animatic created by Director of Photography Jonathan Freeman as he worked with The Third Floor’s VR set scouting and virtual camera team. <br /> <img alt="Spotlight_ThirdFloor_Blog_Body_Image_12.jpg" height="auto" src="" width="auto" /><br /> Once the animatics were approved, the process moved into previs. The Third Floor’s artists took files directly from the virtual camera team and, using these as a guide, began to add animation and camera moves to the shots.<br /> <br /> For the pivotal Throne Room scene, the episode cinematographer scouted a virtual version of the Throne Room created by The Third Floor. Working in the environment in real time, it was possible to quickly find the best possibilities for shots and angles that fit for the story, taking still-frame photoboards on the fly to develop a refined shot list. In a scene featuring a 747-sized CG character within a real set, virtual scouting provided an effective way to evaluate the area the dragon would take up and what frame-ups worked best to capture the action.<br /> <img alt="Spotlight_ThirdFloor_Blog_Body_Image_6.jpg" height="auto" src="" width="auto" /><br /> The interactivity inherent in real-time technology provides the power to visualize and evaluate while being accurate to production’s evolving designs, helping to inform them. 
With wait times eliminated, it was not an issue to try out an idea or see what the scene might look like with different lighting options or even at a different time of day. This put the project into a more flexible space, with more opportunity to explore new creative avenues. <h3><strong>Empowering creatives with new virtual toolsets </strong></h3> Clients love the immediacy of real-time rendering, but also want uncompromised visual quality. This is one of the key reasons The Third Floor adopted Unreal Engine in its real-time workflow. “If we're going to be catering to cinematographers, directors, and designers who demand the highest fidelity, the best lighting, the best overall user interface, we need a workflow with Unreal Engine,” says Chris Edwards, CEO of The Third Floor. <br /> <br /> In the end, it comes down to that ability to facilitate greater creativity. “For me, Unreal Engine is the ultimate holy grail that helps bring together everything we are trying to do, allowing us to provide directors, designers, and other visionaries with much more capability than they could have ever imagined,” says Edwards. “They now have so much more freedom to create whatever they want.”<br /> <br /> The Third Floor foresees many more creators in entertainment catching on to these benefits in the future, even for smaller-scale productions. “You know, a lot of people think that this is a big, expensive thing because our work includes Hollywood blockbuster movies,” says Edwards. “But really, there’s so much potential for many more films and projects, and I think Unreal Engine is going to unlock mainstream access to this type of toolset and capability.” <br /> <br /> <br /> Want to test out and explore your own creative ideas? 
<a href="" target="_blank">Download Unreal Engine</a> for free today!<br /> Film & Television | Previs | Virtual Production | Virtual Sets | The Third Floor | Game of Thrones | Wed, 18 Mar 2020 20:00:00 GMT

Prepare for the jobs of tomorrow with a new field guide for creators. Are you prepared for the jobs of tomorrow? Introducing the <em>Creator’s Field Guide to Emerging Careers in Interactive 3D</em>. Learn about careers in the 3D graphics industry, the skills needed for each role, and the skills requested most by employers. Are you prepared for the jobs of the future? The <a href="" target="_blank"><em>Creator’s Field Guide to Emerging Careers in Interactive 3D</em></a> will help you understand what it takes to build a workforce for an immersive world.<br />   <h3><strong>Interactive 3D skills are in huge demand</strong></h3> Interactive 3D started in the games industry, but this technology is now being used everywhere—from advertising to manufacturing, architecture, healthcare, film, and more. The ability to simulate and interact with the virtual world is helping us solve problems in the physical world that we couldn’t solve before, fundamentally changing the way we work and communicate.<br /> <br /> As more and more employers embrace this technology, the demand for real-time 3D skills is skyrocketing. <a href="" target="_blank">Burning Glass reports</a> that jobs that require real-time and interactive 3D skills are growing <strong>601% faster</strong> than the job market overall. Real-time skills are real-world skills, and Unreal Engine is the fastest-growing real-time engine skillset to have on your resume.<br /> <img alt="News_CreatorsFieldGuide_blog_body_img_burningglass.png" height="auto" src="" width="auto" /><br /> New job roles and job titles are also appearing.
For example, <a href="" target="_blank">Hired’s 2020 Report on software engineering jobs</a> found that AR/VR engineer is now the fastest-growing software engineering role—growing 1,400% faster than other engineering roles.<br /> <br /> New roles such as experiential designer, simulation specialist, and architectural visualization specialist are also emerging. Given how fast technology is changing, students and job seekers may not even be aware of these opportunities and the skills that unlock them. <br />   <h3><strong>Introducing the “Creator’s Field Guide to Emerging Careers in Interactive 3D”</strong></h3> The <em>Creator’s Field Guide to Emerging Careers in Interactive 3D</em> is a free roadmap for students and job seekers, aimed at helping them navigate the exciting world of interactive 3D careers and providing guidance on the specific skills and competencies these new roles require.<br /> <br /> <a href="" target="_blank"><span style="color:#3498db;"><strong>Download the <em>Creator’s Field Guide to Emerging Careers in Interactive 3D</em></strong></span></a><br /> <img alt="News_CreatorsFieldGuide_Blog_Body_Image_3.jpg" height="auto" src="" width="auto" /><br /> The guide was created through interviews with hiring managers, professionals in these roles, and Unreal Engine experts. We specifically focused this guide on the entry-level skills and competencies needed for eight new and emerging job roles.<br /> <br /> This guide is designed to:<br />   <ol style="margin-left: 40px;"> <li>Introduce students and job seekers to emerging careers in interactive 3D</li> <li>Arm job seekers and hiring managers with a roadmap for evaluating and communicating their skills</li> <li>Help educators identify which competencies to focus on when teaching Unreal Engine</li> </ol> <br /> The world of work is changing fast. Technology is not only reshaping the way people do their existing jobs; it’s opening up a whole world of new ones.
These emerging careers are just a sample of what’s on the horizon as the demand for immersive content grows.<br /> <br /> In the future, everyone will be a creator.<br />   <h3><strong>Get started building your career in interactive 3D</strong></h3> Get the <a href="" target="_blank"><em>Creator’s Field Guide to Emerging Careers in Interactive 3D</em></a> for free, and kick off your career with one of our free video courses on <a href="" target="_blank">Unreal Online Learning</a>.<br /> <a href="" target="_blank"><img alt="News_CreatorsFieldGuide_Blog_Body_Image_1.jpg" height="auto" src="" width="auto" /></a><br /> Whether you are interested in building your career in games, architecture, automotive, or film & broadcast, we’ve got courses to get you started.<br /> Education | Learning | Linda Sellheim | Tue, 17 Mar 2020 18:00:00 GMT

Split-screen audio in Unreal Engine explained. In this developer blog, Lead Audio Programmer Aaron McLeran discusses how split-screen audio works in Unreal Engine. Since split-screen support rolled out for Fortnite in Season 11, I have found myself regularly walking production staff, gameplay programmers, and sound designers through how split-screen audio works and what results to expect. Even now, after shipping the feature, I am sometimes pinged by other developers within Epic asking how a detail or two works.<br /> <br /> It turns out that most people’s intuitive understanding of how it should work is counter to how it actually does and needs to work. Because of the surprising amount of interest in this topic from the game audio community, I wrote this developer tech blog to provide a basic walkthrough of how split-screen audio works in Unreal Engine.
<div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3><strong>Background On Split-Screen Gaming</strong></h3> Split-screen gaming has seen something of a comeback in recent years. It was huge in the early days of multiplayer gaming, before the advent of console networking support. Some of my fondest teenage memories are playing split-screen games like GoldenEye and Halo with my friends during pizza-fueled sleepovers. Then, after the first consoles started offering broadband internet support, we saw a slow drop-off in split-screen support. That is slowly reversing. People like playing together online *and* in person with their friends.<br /> <br /> Fortunately, Unreal Engine supports up to four-way split screen. Often, games are made to support both local “couch” co-op and remote networked multiplayer. This is what we do in Fortnite: you can have friends over to share the same console, then jump in and play with your other friends across the world. <br /> <br /> To enable split screen in Unreal Engine, you just need to enable the option in your project settings in the Local Multiplayer settings tab and select the split-screen mode you want. You can also customize and do much more using the C++ API.<br />   <h3><strong>The Challenges of Split-Screen Gaming</strong></h3> Enabling split screen is ultimately not as simple as toggling a game project option and hoping for the best. Adding more perspectives into the world results in more objects requiring rendering and fewer opportunities for culling. It also means more things are in use and loaded, and thus referenced by the garbage collector.
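For reference, the Project Settings toggle described above is backed by the engine's UGameMapsSettings class, so the same options can be expressed in a project's DefaultEngine.ini. A sketch (verify the section and value names against your engine version's BaseEngine.ini before relying on them):

```ini
; Sketch: enabling split screen from config rather than the editor UI.
; These settings are read by UGameMapsSettings.
[/Script/EngineSettings.GameMapsSettings]
bUseSplitscreen=True
TwoPlayerSplitscreenLayout=Horizontal
ThreePlayerSplitscreenLayout=FavorTop
```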
Doing all this extra work places great strains on nearly every subsystem in a game engine.<br /> <br /> On the audio front, <strong><em>things get interesting</em></strong>.<br />   <h3><strong>Intuition Is Not Always Right</strong></h3> From my conversations with fellow developers, the intuition most people seem to have is that audio for each split perspective should render out all the audio audible to that player (i.e. “listener”). Indeed, it makes sense: if a gun goes off right next to player one, it should sound near to player one. But from the perspective of player two, who is further away, it should also sound like it’s far away! After all, <em><strong>ears can’t be split</strong></em>. We should hear all audio from both perspectives, right? Wrong.<br /> <br /> Setting aside the doubled CPU cost (for two-way split screen) of audio rendering, this scenario would simply cause an unending sonic catastrophe. Think about it: every single event that happens on screen within audible range of all players is doubled for two-way split screen. For three-way or four-way, it could be tripled or quadrupled! One footstep happens and… it’s the sound of a group of people taking a step. If that footstep happens at the exact same time for each listener (which it would) and they play the same footstep variation, you’ll suddenly get very loud audio as all the identical sounds constructively add together. You might even get clipping. Just imagine the chaos of a battlefield, where every gunshot is rendered and audible from all perspectives at once.<br /> <br /> OK, so what <em><strong>is</strong></em> the right way to do it? It’s simple: <strong>render sounds once relative to the closest listener</strong>.<br /> <img alt="TechBlog_FNBR_Splitscreen_TV1_BodyImg.png" height="auto" src="" width="auto" /> <h3><strong>How Unreal Engine Handles Split Screen</strong></h3> Unreal Engine’s solution is quite elegant.
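The closest-listener rule can be sketched in a few lines of C++ (illustrative only, with simplified stand-in types; this is not the engine's actual code): pick the nearest listener, then express the emitter's position in that listener's local frame, so the renderer always spatializes against a single, identity listener transform.

```cpp
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3  operator-(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float LenSq(const Vec3& v) { return Dot(v, v); }

// A listener is a position plus an orthonormal basis (its local axes).
struct Listener { Vec3 pos, right, up, forward; };

// Transform a world-space emitter position into the local frame of the
// nearest listener. Assumes at least one listener exists. The renderer
// can then treat every sound as heard by a listener at the origin with
// an identity orientation.
Vec3 EmitterToNearestListenerSpace(const Vec3& emitter, const std::vector<Listener>& listeners)
{
    const Listener* nearest = &listeners[0];
    float bestDistSq = LenSq(emitter - nearest->pos);
    for (const Listener& l : listeners) {
        const float d = LenSq(emitter - l.pos);
        if (d < bestDistSq) { bestDistSq = d; nearest = &l; }
    }
    // Project the relative position onto the listener's axes to get
    // local-space coordinates.
    const Vec3 rel = emitter - nearest->pos;
    return { Dot(rel, nearest->right), Dot(rel, nearest->up), Dot(rel, nearest->forward) };
}
```

Because every emitter arrives at the renderer already expressed relative to the origin, the mixer needs no representation of listener geometry at all.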
Its implementation predates me, but it impressed me when I joined Epic five years ago. Although it's similar in principle to methods I’ve seen elsewhere, Unreal Engine’s take is particularly clean.<br /> <br /> Essentially, Unreal Engine transforms every sound emitter location into the local space of its nearest listener. This not only simplifies a ton of lower-level details, but also means that the lower-level audio renderer only needs to worry about <strong>one</strong> listener transform. Indeed, because of this simplicity, the <a href="" target="_blank">audio mixer</a>, which is our new multiplatform audio renderer, doesn’t require any listener geometry representation. Since all sound emitters are transformed into listener-local space before being sent to the audio renderer, all sounds are simply rendered spatialized relative to an identity matrix. This simplicity has paid dividends in streamlining our spatialization code and the development of our more exciting next-gen spatialization features. <br />   <h3><strong>Downsides</strong></h3> Although counterintuitive, rendering sounds only relative to their closest listener makes a lot of sense, but it does come with some drawbacks, such as:<br />   <ul style="margin-left: 40px;"> <li>Perspectives that aren’t too helpful</li> </ul> <div style="margin-left: 120px;">Obviously, a gunshot near one of the split-screen players may not even be angled towards the closest listener, but could be pointing at the split-screen player furthest away. So, you might think you should prioritize playing the sound relative to the player further away. The problem, however, is that such an audio cue might be confusing to the player getting shot at! But, as I stated, the alternative is more problematic.
</div>   <ul style="margin-left: 40px;"> <li>Traveling sounds flipping perspective</li> </ul> <div style="margin-left: 120px;">A long-duration or looping sound might play long enough to flip from one listener perspective to another. This flip can sound jarring. A worst-case scenario (and one we tested quite a bit) is two players/listeners facing the same direction but reasonably far away from each other, with a sound that travels from one player to the other. At the halfway point, the sound goes from sounding like it’s in front of one player to being behind the other player. It’ll “pop” from front to back, which isn’t great.</div>   <ul style="margin-left: 40px;"> <li>Rendering more sounds can change mix and priority balances</li> </ul> <div style="margin-left: 120px;">Any sound designer can attest that getting a balanced game mix is not easy. Getting a mix that works for both single and split screens is much more challenging. </div>   <ul style="margin-left: 40px;"> <li>Additional sounds and CPU costs</li> </ul> <div style="margin-left: 120px;">Although rendering audio only once (relative to the closest listener) is far less expensive than rendering audio differently for each listener, it still adds CPU cost. This is because we simply have more audio “within range”, as there are two places to consider when rendering in-range sounds. For a game like Fortnite, where we push the limits of CPU and memory on a bi-weekly basis, this is no trivial challenge.</div>   <h3><strong>Multiple Endpoints</strong></h3> The savvy among you might ask about rendering audio to different audio endpoints, meaning separate hardware outputs (e.g. different sets of speakers, different controllers, etc.). Some consoles do support this, and on PC it is possible to render audio to different endpoints.
<br /> <br /> The idea here is that you could indeed render audio for each split-screen perspective and route it to the different hardware outputs that, presumably, each player would hear through headphones.<br /> <br /> As of <a href="" target="_blank">4.24</a>, however, this is not something the UE audio engine supports (though, as of this writing, we are planning to support it in 4.25). Furthermore, it’s not something that would be supported equally on all platforms. So, even if you wanted to mitigate the CPU cost of all the extra audio rendering this way, you’d likely still need an alternative solution for cases where rendering to multiple hardware endpoints isn’t possible. I also find it amusing that you’d invite your friend over to play couch co-op in split screen and then immediately put on headphones and not talk, but that’s just me!<br />   <h3><strong>The Takeaway</strong></h3> So, yes, handling split-screen audio can be a complex and hairy topic with lots of technical details. However, it turns out that, counterintuitively, if you simply render audio relative to the nearest listener (i.e. the nearest split-screen view), it all works out reasonably well.<br /> <br /> For more information, check out the chapter I wrote for the upcoming third volume of the book series <a href="" target="_blank">Game Audio Programming Principles and Practices</a>, edited by Guy Somberg.<br /> Games | Design | Features | Learning | Tutorials | Audio Rendering | Fortnite | Lead Audio Programmer Aaron McLeran | Tue, 17 Mar 2020 15:30:00 GMT

Automotive Materials pack now available on the Marketplace for free. Epic Games has updated the Automotive Materials pack to include 10 new master materials, over 150 instances, and more.
Released in 2016, the original <a href="" target="_blank">Automotive Materials pack</a> was an extremely valuable resource, not just for automotive visualization artists: it also laid out a path for many Unreal users to create stunning visuals. With a passion for automobiles that runs deep at Epic Games, we set out to create a fitting update to that material pack. <a href="" target="_blank">Available for free now</a>, the revised collection comes with a completely new set of 10 master materials and over 150 instances that will give artists the foundation to add new materials to fit their product range.  <div style="text-align: center;"> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> </div> The new master materials now utilize object-space triplanar projections instead of world-space mapping, though users can switch to UVs if they prefer. <br /> <br /> We also ensured that the new materials work seamlessly with <a href="" target="_blank">ray tracing</a> as we developed them. This means users can get more realistic depictions of automotive-type surfaces more efficiently than ever. Going well beyond fixing outdated materials, we created new dedicated master materials for textiles, car paint, and decals. We also added new metals, reflectors, and flip-flop paint effects, and introduced new emissive materials for interior displays, new interior textiles, and a much-needed wood material. Furthermore, we took advantage of the amazing <a href="" target="_blank">Quixel</a> library of textures to deliver 4K texture maps in the process.  <div style="text-align: center;"><img alt="interior_5_Audi_HDR_Credit.png" height="auto" src="" width="auto" /></div> Rounding out the collection, we reorganized the materials so that finding the right material among the numerous instances is easier than ever.
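Object-space triplanar projection, mentioned above, samples a texture along each of the object's three local axes and blends the samples by the surface normal, which avoids UV seams and stretching. A minimal sketch of the blend-weight math (illustrative C++; the pack's actual materials are built from material graph nodes):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Triplanar blend weights: the more a surface faces an axis, the more
// that axis's planar projection contributes. Weights are normalized to
// sum to 1 so the three texture samples blend without changing overall
// brightness. Higher sharpness tightens the transition between planes.
Vec3 TriplanarWeights(Vec3 normal, float sharpness = 1.0f)
{
    Vec3 w{ std::pow(std::fabs(normal.x), sharpness),
            std::pow(std::fabs(normal.y), sharpness),
            std::pow(std::fabs(normal.z), sharpness) };
    const float sum = w.x + w.y + w.z;
    return { w.x / sum, w.y / sum, w.z / sum };
}
```

In a material, each weight would scale a texture sample whose UVs are the other two object-space coordinates; a surface facing straight up, for instance, gets its texture entirely from the top-down projection.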
The materials are now simpler, more organized, more efficient, and, best of all, more fun to use. <br /> <br /> Special thanks go out to Audi Business Innovation and Audi AG for providing the A5 Cabriolet 3D model.<br /> <br /> Download the collection today and try it for yourself! <br /> Art | Design | Marketplace | Ray Tracing | Automotive & Transportation | Product Design | Features | Manufacturing | Russell Paul, Francis Maheux, and Nicolas Longchamps | Thu, 12 Mar 2020 18:30:00 GMT

Cubic Motion joins the Unreal Engine team. Leading provider of automated performance-driven facial animation technology Cubic Motion is joining the Epic family, extending our commitment to advancing the state of the art in the creation of digital humans. Today we are thrilled to welcome <a href="" target="_blank">Cubic Motion</a> to the Epic Games family. Cubic Motion is a longtime Epic partner and a leading provider of automated performance-driven facial animation technology and services for video games, film, broadcast, and immersive experiences. By joining forces, our teams are solidifying our commitment to advancing the state of the art in the creation of believable digital humans for all Unreal Engine users. <br /> <img alt="News_Cubicmotion_Epic_blog_body_image_1.jpg" height="auto" src="" width="auto" /><br /> Cubic Motion’s talent will work hand in hand with <a href="" target="_blank">3Lateral</a>, developer of innovative technologies that enable the digitization of human appearance and motion at unprecedented levels of realism. 3Lateral joined the Unreal Engine team in January 2019 to lead development of state-of-the-art real-time capabilities for the creation of virtual humans and creatures.<br /> <br /> “We are delighted to be joining Epic Games and look forward with excitement to this next chapter in our story,” said Cubic Motion CEO Dr. Gareth Edwards.
“Together, we are uniquely positioned to push the boundaries of digital human technology, bringing ever more realism and immersion to all forms of visual entertainment.”<br /> <br /> “Digital humans are not only the next frontier of content creation, but also the most complex endeavor in computer graphics. With Cubic Motion bringing their computer vision and animation technology and expertise to our digital human efforts, Epic along with our team at 3Lateral are one step closer to democratizing these capabilities for creators everywhere,” said Tim Sweeney, founder and CEO of Epic Games.<br /> <br /> “Facial animation that conveys the slightest nuance of human expression is essential to crossing the uncanny valley. We believe that holistically combining Epic's Unreal Engine with 3Lateral’s facial rig creation and Cubic Motion’s solving technology is the only way to answer this challenge, and ultimately, to reach the pinnacle of digital human artistry with Unreal Engine,” said Epic Games CTO Kim Libreri. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> Cubic Motion has been integral to numerous notable Unreal Engine real-time demonstrations, including the first “<a href="" target="_blank">Hellblade: Senua’s Sacrifice</a>” live character performance at GDC 2016, followed by the expanded “<a href="" target="_blank">From Previs to Final in Five Minutes</a>,” which earned Best Real-Time Graphics and Interactivity at SIGGRAPH 2016. 
Epic and Cubic Motion have continued to collaborate, showcasing high levels of quality and believability in photorealistic digital humans in “<a href="" target="_blank">Meet Mike</a>” at SIGGRAPH 2017, and “<a href="" target="_blank">Siren</a>” at GDC 2018.<br /> <br /> Cubic Motion also develops the <a href="" target="_blank">Persona</a> system, which provides an end-to-end hardware and software solution for capturing and translating an actor’s performance onto their digital counterpart in real time, and enables immediate character facial animation in Unreal Engine.<br /> <br /> Cubic Motion’s facial animation technology has also been used in the production of many notable AAA titles, including Sony Interactive Entertainment’s “<a href="" target="_blank">God of War</a>” and Insomniac Games’ “<a href="" target="_blank">Marvel’s Spider-Man</a>.”<br /> Cubic Motion | Features | Mocap | Games | Film & Television | Broadcast & Live Events | Virtual Production | VR | Dana Cowley | Thu, 12 Mar 2020 15:30:00 GMT

Twinmotion 2020 delivers new levels of realism and much more. With higher-fidelity assets, more realistic lighting, and new options for presentation and review, Twinmotion 2020 takes fast, easy, real-time archviz to a whole new level. Get it today. We’re excited to announce that Twinmotion 2020.1 is now available! It’s packed with new features that help turn your CAD and BIM models into even more convincing visualizations and immersive experiences in just a few clicks, and provide new options for presentation and review.
<div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h2>Enhanced lighting and rendering</h2> Right out of the box, you’ll notice a higher-quality look to your scenes, thanks to an entirely reworked lighting and shadowing system that features a new screen-space global illumination (SSGI) method for indirect lighting and better baseline settings to more closely match physical lights.<br /> <br /> For exterior scenes, there’s a new physically based atmospheric sun and sky, offering more realistic skies that accurately reflect different locations, seasons, and times of day—including beautiful sunsets.<br /> <img alt="News_Twinmotion2020_Blog_Body_Image_1.jpg" height="auto" src="" width="auto" /><br /> New area and volumetric lighting let you simulate illumination from large surfaces like ceiling panels or windows, and add atmosphere to your scene with fog, mist, dust, or smoke. And the depth of field feature now closely mimics that of a real camera, for cinematic effects. There’s also a new automatic exposure option to provide a better viewing experience when moving from interiors to exteriors and vice versa.<br /> <img alt="News_Twinmotion2020Blog_Body_Image_10.jpg" height="auto" src="" width="auto" /><br /> On the materials front, this release adds a new X-Ray material for viewing occluded objects such as vents and plumbing, a Frosted Glass material, a light-emitting Glow material (courtesy of SSGI), and the ability to add video as a material—perfect for simulating TV screens or flickering fires.<br /> <img alt="News_Twinmotion2020_Blog_Body_Image_4jpg.jpg" height="auto" src="" width="auto" /> <h2>Enhanced vegetation system</h2> It’s not just lighting that’s received a quality boost. 
Twinmotion’s vegetation system gets some significant upgrades, starting with a new set of high-resolution tree assets from procedural organic 3D modeller <a href="" target="_blank">Xfrog</a>. Each tree, which has an order of magnitude more polygons than the previous assets, comes in three different ages and four seasons. To further enhance the realism, the texture resolution of materials has been increased, there’s a new two-sided foliage shader with sub-surface scattering that portrays the effect of light filtering through leaves, and the effects of wind are simulated at the leaf level. <br /> <img alt="News_Twinmotion2020_Blog_Body_Image_8.jpg" height="auto" src="" width="auto" /><br /> Other vegetation assets, including all bushes, have been replaced by high-quality Quixel Megascans assets.<br /> <br /> Visualizing how new plantings will develop over time is critical for many architectural and landscape design projects, and in some cases—such as for certain state-funded project bids—is an obligatory requirement. With Twinmotion 2020’s new Growth slider that blends and scales between the three provided tree ages, it’s easy to show how a project will look at delivery, a few years later, and then finally once everything has reached full maturity.<br /> <br /> Other new features in this area include a new scattering tool to apply vegetation on selected geometry, more customization options for grass, and improvements to the painting system.<br /> <img alt="News_Twinmotion2020_Blog_Body_Image_2.jpg" height="auto" src="" width="auto" /> <h2>Higher-fidelity humans</h2> Not to be outdone, Twinmotion’s built-in library of 3D human characters has been replaced with new high-quality photo-scanned assets, enhancing the credibility and ambience of your scene. There are 62 mocap-animated characters, each with five changes of clothing, and a further 82 ready-posed characters, so there’s plenty of variety. 
All characters are sourced from <a href="" target="_blank">AXYZ design</a>, who specialize in providing 3D people for archviz. <br /> <img alt="News_Twinmotion2020_Blog_Body_Image_7.jpg" height="auto" src="" width="auto" /> <h2>One-click synchronization with Rhino</h2> Twinmotion 2020 sees Rhino added to its list of packages supported by Direct Link, joining SketchUp Pro, Revit, ARCHICAD, and RIKCAD. This enables you to synchronize your Rhino and Grasshopper data into Twinmotion with a single click while retaining organization and hierarchy and automatically substituting native materials with Twinmotion PBR materials. Support has also been updated for SketchUp Pro 2020.<br /> <img alt="News_Twinmotion2020_Blog_Body_Image_6.jpg" height="auto" src="" width="auto" /> <h2>Project presentation and review</h2> The successor to BIMmotion, the new Twinmotion Presenter enables you to share an individual project in a standalone viewer as a lightweight packaged executable, so clients and stakeholders can review a project without needing to have Twinmotion installed. You can easily create a presentation with multiple points of view and camera paths; the view can be set to free, guided, or locked, enabling you to choose whether viewers can explore at will, see the project only from pre-selected positions, or only play back the animation as rendered, respectively.<br /> <br /> When it comes to providing feedback in a design review, or recording non-visual data such as brand names or pricing, the new Note tool is great for making sure information is captured in context. Annotations can be exported to BCF format (IFC standard) in a zip file and loaded into Revit, ARCHICAD, or many other BIM packages, streamlining the iterative process. <br /> <br /> To see all the new features in Twinmotion 2020.1, take a look at the <a href="" target="_blank">Release Notes</a>. You can also watch our <a href="" target="_blank">recent webinar on Twinmotion 2020.1</a> on demand. 
<br /> <br /> Twinmotion 2020.1 is currently available at a 50% discount price of $249 USD (regional pricing may vary) for a perpetual license. This introductory price includes all subsequent upgrade releases until the end of December 2021. <br /> <br /> As a special thank-you to our early adopters, those who downloaded the previous Twinmotion release will receive Twinmotion 2020.1 for free; entitlements are automatically added to their accounts, and the software can be accessed through the Epic Games launcher. There’s also a free trial option for those wishing to evaluate the new features before purchasing, and a free educational version for students and teachers.<br /> <br /> <a href="" target="_blank"><img alt="Button.jpg" height="auto" src="" width="auto" /></a><br /> <br /> Twinmotion | Architecture | Features | News | Wed, 11 Mar 2020 13:14:00 GMT

Updates to required setup for Android NDK 21 in Unreal Engine 4.25. We are updating Unreal Engine 4.25 to use Android NDK r21. In this post, we will walk you through the new setup process that Android developers will need to take advantage of the update. Greetings from the Unreal Engine Mobile team! Today, we’re bringing you an update concerning the setup of your Android development environment in 4.25 and onward. <br /> <br /> Specifically, Unreal Engine 4.25 now requires <strong>Android Native Development Kit Revision 21 (NDK r21)</strong> to support the development of Android projects. This requires a new setup process using <strong>Android Studio</strong> instead of CodeWorks for Android 1R7u1.
While we will be publishing new documentation for this process for 4.25’s full release, we wanted to share the new setup steps ahead of time for those using 4.25 preview builds.<br />   <h2><strong>Recommended Setup</strong></h2> Make sure that the Unreal Editor and the Epic Games Launcher are both closed, so that there are no problems with either the installation of NDK components or the setting of the engine’s environment variables.<br /> <br /> If you are moving to Unreal 4.25 from 4.24 or earlier, we recommend that you uninstall CodeWorks for Android and any existing NDK components, and delete the folder CodeWorks was installed to, before proceeding with further setup; otherwise, Android Studio will continue to use the previous CodeWorks installation folder for SDK updates and your environment variables will not be set correctly. The default installation directory for CodeWorks is C:/NVPACK.<br /> <br /> If you need to support an earlier installation of Unreal Engine while also maintaining an installation of Unreal 4.25 or later, refer to the section on Using Earlier NDK or Unreal Versions below.<br />   <h3><strong>1. Installing Android Studio</strong></h3> To set up the required NDK components on your computer, you need to install Android Studio version 3.5.3.<br />   <ol style="margin-left: 40px;"> <li>Navigate to the <a href="" target="_blank">Android Studio Archive</a> page in your web browser. Scroll down to <strong>Android Studio 3.5.3</strong>, click on it to unfold the dropdown, and download the appropriate installer or zip file for your operating system.</li> </ol> <img alt="TechBlog_AndroidStudio_Download.png" height="auto" src="" width="auto" /> <ol start="2" style="margin-left: 40px;"> <li>Run the <strong>Android Studio installer</strong>. 
In the <strong>Welcome to Android Studio Setup </strong>dialogue, click <strong>Next</strong> to continue.</li> </ol> <div style="text-align: center;"><img alt="TechBlog_AndroidStudio_Step2.png" height="auto" src="" width="auto" /></div> <ol start="3" style="margin-left: 40px;"> <li>In the <strong>Choose Components</strong> dialogue, click <strong>Next </strong>to continue. You can leave the default components enabled.</li> </ol> <div style="text-align: center;"><img alt="TechBlog_AndroidStudio_Step3.png" height="auto" src="" width="auto" /></div> <ol start="4" style="margin-left: 40px;"> <li>In the <strong>Configuration Settings</strong> dialogue, select an appropriate install location and click <strong>Next</strong> to continue. We recommend using the default location.</li> </ol> <div style="text-align: center;"><img alt="TechBlog_AndroidStudio_Step4.png" height="auto" src="" width="auto" /></div> <ol start="5" style="margin-left: 40px;"> <li>In the <strong>Choose Start Menu Folder</strong> dialogue, click <strong>Install </strong>to begin the installation process.</li> </ol> <div style="text-align: center;"><img alt="TechBlog_AndroidStudio_Step5.png" height="auto" src="" width="auto" /></div> <ol start="6" style="margin-left: 40px;"> <li>When the installation finishes, click <strong>Next</strong> to begin setting up components. </li> </ol> <div style="text-align: center;"><img alt="TechBlog_AndroidStudio_Step6.png" height="auto" src="" width="auto" /></div> <ol start="7" style="margin-left: 40px;"> <li>When setup completes, make sure the <strong>Start Android Studio</strong> box is checked and click <strong>Finish</strong> to exit the installer.</li> </ol>   <h3><strong>2. 
Setting Up Android Studio for First-Time Use</strong></h3> When you start Android Studio for the first time, follow these steps:<br />   <ol style="margin-left: 40px;"> <li>When the <strong>Import Android Studio Settings</strong> dialog appears, select <strong>Do not import settings</strong> and click <strong>OK</strong> to continue.</li> </ol> <div style="text-align: center;"><img alt="TechBlog_AndroidStudio_Step2-1.png" height="auto" src="" width="auto" /></div> <ol start="2" style="margin-left: 40px;"> <li>When the <strong>Data Sharing</strong> dialog appears, choose whether or not you want to send usage statistics to Google; either choice will take you to the next step.</li> </ol> <div style="text-align: center;"><img alt="TechBlog_AndroidStudio_Step2-2.png" height="auto" src="" width="auto" /></div> <ol start="3" style="margin-left: 40px;"> <li>The <strong>Android Studio Setup Wizard</strong> will appear. Click <strong>Next</strong> to continue. If you are prompted for an update, click the <strong>X</strong> button to dismiss the prompt. 
</li> </ol> <div style="text-align: center;"><img alt="TechBlog_AndroidStudio_Step2-3.png" height="auto" src="" width="auto" /></div> <ol start="4" style="margin-left: 40px;"> <li>In the<strong> Install Type</strong> dialog, select <strong>Standard</strong> and click <strong>Next</strong>.</li> </ol> <div style="text-align: center;"><img alt="TechBlog_AndroidStudio_Step2-4.png" height="auto" src="" width="auto" /></div> <ol start="5" style="margin-left: 40px;"> <li>In the <strong>Select UI Theme </strong>dialog, choose your preferred theme and click <strong>Next</strong>.</li> </ol> <div style="text-align: center;"><img alt="TechBlog_AndroidStudio_Step2-5.png" height="auto" src="" width="auto" /></div> <ol start="6" style="margin-left: 40px;"> <li>In the <strong>Verify Settings</strong> dialog, click <strong>Finish</strong> to finalize your setup and begin downloading components.</li> </ol> <div style="text-align: center;"><img alt="TechBlog_AndroidStudio_Step2-6.png" height="auto" src="" width="auto" /></div> <ol start="7" style="margin-left: 40px;"> <li>When components are finished downloading, click <strong>Finish </strong>again to end setup.<br />  </li> </ol> <strong>Finalizing Android Studio Installation on Windows, Mac, and Linux</strong><br /> <br /> If you are using Windows, restart your computer for all settings to take effect. If you are using Linux, close your terminal window and reopen it. If you are using a Mac, you can either close your terminal window and reopen it or log out and log back in. You must do this before moving on to the next section.<br />   <h3><strong>3. Setting Up Unreal to Use NDK r21</strong></h3> To set up Unreal Editor to use Android NDK r21: <br />   <ol style="margin-left: 40px;"> <li>Navigate to your Unreal 4 engine install directory. 
For example, <strong>C:/Program Files/Epic Games/UE_4.25</strong>.</li> <li>Open <strong>Engine/Extras/Android</strong>.</li> <li>Inside this directory, run the <strong>SetupAndroid </strong>script appropriate for your operating system. SetupAndroid.bat is for Windows, SetupAndroid.command is for Mac, and is for Linux.</li> </ol> <div style="text-align: center;"><img alt="TechBlog_AndroidStudio_Step3-3.png" height="auto" src="" width="auto" /></div> <div style="margin-left: 40px;"> </div> <ol start="4" style="margin-left: 40px;"> <li>You will be prompted to accept the Android SDK license agreement. Type <strong>Y</strong> and press <strong>Enter</strong> to accept.</li> </ol> <div style="text-align: center;"><img alt="TechBlog_AndroidStudio_Step3-4.png" height="auto" src="" width="auto" /></div> <ol start="5" style="margin-left: 40px;"> <li>When the installation completes, press any key to dismiss the command prompt and finish the process.</li> </ol> <br /> This script will download and install NDK r21 for you in your Android home directory. The install directory should be <strong>C:/Users/Username/AppData/Local/Android/Sdk/ndk/21.0.6113669</strong>, where “Username” is your login name for your computer.<br />   <h2 id="earlierSection"><strong>Using Earlier NDK or Unreal Versions</strong></h2> If you require an installation of Unreal Engine 4.24 or older, or your project needs to target an earlier version of Android NDK not supported by this installation process, you can manually set your environment variables to target the version you need. In Unreal Editor, you will find your Android SDK paths in the <strong>Project Settings</strong> menu under <strong>Platforms > Android SDK</strong>. 
<div style="text-align: center;"><img alt="TechBlog_AndroidManualNDK.png" height="auto" src="" width="auto" /></div> Alternatively, you can manually edit the <strong>BaseEngine.ini</strong> for your engine installation under <code>[/Script/AndroidPlatformEditor.AndroidSDKSettings]</code>: <div style="margin-left: 40px;"><br /> <code>NDKPath=(Path="D:/[NDKInstallPath]")</code></div> <br /> Where <code>[NDKInstallPath]</code> is the location of your desired NDK installation.<br /> <br /> To make your future installations of Unreal Engine as smooth as possible, we recommend using Android Studio per the above installation steps. You can then download NDK r14b from the <a href="" target="_blank">Unsupported NDK Downloads</a> page on the Android developer site and manually target it in your installations of 4.24 or earlier.<br /> <br /> Alternatively, if you want to keep an installation of CodeWorks for earlier projects, you can perform the Android Studio installation above without removing CodeWorks. The SetupAndroid script in 4.25 will still automatically download NDK r21, but you will need to manually target its location in your Unreal 4.25 installation.<br /> Android | The Unreal Engine Mobile Team | Tue, 10 Mar 2020 18:00:00 GMT <h2><strong>Audio in Unreal Engine’s Shooter Game Sample Project</strong></h2> Technical Sound Designer Adam Block walks users through all the revisions that went into revamping the audio in Unreal’s Shooter Game sample.<br /> <br /> Hi friends! My name is Adam Block. I’m a senior technical sound designer who’s been working professionally in the video game industry since 2008. I’m an active member of the game-audio community and moderator of the Facebook group <a href="" target="_blank">Unreal Engine Audio Tips & Tricks</a>, where I try to help folks with questions about audio in Unreal Engine. 
In the first six years of my game audio career, I worked in-house at several Activision and Sony studios on <em>Guitar Hero</em>, <em>Marvel Ultimate Alliance II</em>, <em>Transformers: Fall Of Cybertron</em>, <em>Planetside II</em>, and several others. In 2014, I founded <a href="" target="_blank">Craft Media Group</a>, a game-audio company that provides scalable, end-to-end game-audio development on a remote basis for projects of all sizes. To date, I’ve personally worked on over forty games, and Craft has evolved into an offsite audio department for game studios who require a team of senior-level game-audio talent that can hit the ground running. <br /> <br /> This dev blog is intended for anyone who’s interested in the sound design process or implementation side of working with Unreal Engine. I was hired by Epic Games to perform a complete audio overhaul of their “<a href="" target="_blank">Shooter Game</a>” project, which can be installed via the <a href="" target="_blank">Epic Games Launcher</a>. It’s also Epic’s most downloaded starter template, so if you haven’t done so, grab it and check it out. <br /> <br /> I not only redesigned all of the audio assets but also included scalable Blueprint logic, trying my best to depict things in a clear and digestible manner while still incorporating a few useful parameters, systems, variables, and components that I feel most sound designers will want to look into. What I’ve set up is merely a design suggestion and one of many ways to approach audio implementation in Unreal Engine. I feel there’s a misconception within the game-audio community that in order to be “next-gen sounding” (whatever that actually means), there needs to be middleware incorporated into your UE project, which simply isn’t true.<br /> <br /> My hope is that we see an increased number of game audio folks who recognize the value of becoming more technically savvy content creators. 
There’s no disputing the massive improvements to the native Unreal Engine tools and features that Epic’s programming and QA teams have been implementing, testing, and modifying over the past few years, and they have a very ambitious and exciting roadmap ahead. I feel it’s our responsibility as professional sound designers to take advantage of these tools and technology, becoming better game designers to boot, rather than relying on a programmer to complete often simple implementation tasks for us, especially considering the dawn of the <a href="" target="_blank">Blueprint</a> era.<br /> <br /> As someone who has been a huge advocate of Unreal Engine since 2012, and who’s passionate about the art of sound design and the craft of implementation, I was humbled to be hired for this project. After taking everything into consideration (sound design, implementation, discovery, bug fixing, playtesting, meetings, syncing, etc.), I estimated this to be a 15- to 20-day job. Once I synced to the project, I’ve gotta say, it was pretty rewarding to select all the actors in the audio sub-level and delete them. Here’s a quick overview of my experience and some key things I kept in mind. <br />   <h3><strong>The Project Goal </strong></h3> My main objective on this project was to update the soundscape, making it more modern-sounding and dynamic. The previous assets were over-compressed, the mix wasn’t attended to as well as it should have been, and the assets had a pretty dated sound. There was also a rather imposing music track, which was pretty demanding on the ears in terms of frequency and volume. It really took away many opportunities to experience individual game events and ultimately led to gameplay that was less fun. Among the bigger offenders: the weapons lacked punch, the ambience needed to offer more of a sense of space and texture, the footsteps were super heavy and loud, and the pickups were borderline explosive. 
I could go on about individual assets, but for the sake of time, I felt that all assets needed an update. I basically just nuked everything that was in the game and started from scratch.   <div style="text-align: center;"><img alt="TechBlogs_ShooterGame_CraftMediaGroup_NukeLevel.gif" height="auto" src="" width="auto" /><br /> <em>Here’s some rather satisfying footage of me nuking everything in the audio sub-level</em></div> <h3><strong>Sound (Re)Design</strong></h3> The audio in Shooter Game had a dated aesthetic and an imbalanced mix. Since it is still one of the most commonly downloaded sample projects for new licensees and people new to Unreal Engine, this may have given people the wrong impression regarding what’s possible with the native audio engine. Aside from that, the dated sound, combined with an inconsistent implementation strategy, made the sample game much less fun to play. The audio assets and implementation lacked balance between the main classes of sound (e.g. weapons vs. footsteps vs. pickups). The .wav assets had limited dynamic range and were competing with each other in the same frequency space. They also lacked individuality and articulation. The soundscape simply seemed bland and, as a result, diminished any emotional connection to the experience, which is arguably the most crucial role of audio in any media. Sound designers want to ensure that any cause or effect in a video game has purpose-driven and meaningful audio design that supports the overarching game design and story. Because of these issues, I felt that a complete overhaul was necessary.<br /> <br /> For a couple of days, I recorded a good amount of original source material by going around different areas of San Diego armed with mics and field recorders. I grabbed all sorts of stuff ranging from ambiences to footsteps on different surfaces. I also recorded custom foley, such as metal impacts and other textured recordings, at my studio. 
I also got some excellent source material using different synthesizers and ran all sorts of weird stuff through FX chains to evoke a mysterious and futuristic sci-fi world.  <div style="text-align: center;"><img alt="Dev_Blog_img_Com.jpg" height="auto" src="" width="auto" /></div> I knew most of my time would be spent on implementation and learning about the legacy systems that pre-existed in the project, some of which are still in use. I also spent a lot of time sketching out ideas for how I wanted to approach the implementation process and what new systems I thought I might need. One refreshing thing about this project was that Epic’s Lead Audio Programmer Aaron McLeran and Technical Sound Designer Dan Reynolds basically gave me the freedom to do anything I wanted! Let’s get into those details.<br />   <h3><strong>Implementation </strong></h3> Some gameplay systems in this project were hardcoded in project-level C++ code. In other cases, I kept the existing implementation as-is and simply swapped out SoundCues and source files with newer sound content. For example, the weapon-shooting system uses a hardcoded looping sound system. That system references a custom SoundCue node called Sound Node Local Player, which exists only in Shooter Game and switches assets based on first- or third-person instantiation. Aaron and Dan informed me that this is usually done in higher-level gameplay code systems and not in Sound Cues. But for Shooter Game, because of engineering priorities, changing how Shooter Game implements local vs. non-local player sounds wasn’t an option. I had to do this redesign without involving any programmers for new features or code work, so I had to continue using this implementation method. For other implementation cases, like impact templates, stingers, and pickups, I generally left the sound hooks for those alone and only swapped out the sounds themselves and their associated cues. 
<br /> <br /> For the remaining implementation work, I felt there were three main areas I could focus on that most people would likely find insightful. These areas included: <br />   <ul style="margin-left: 40px;"> <li><strong>Overlap Events</strong>: Changing ambience and one-shot sounds based on overlap events.</li> <li><strong>Actor Blueprints</strong>: Creating custom utilities to repurpose and customize.</li> <li><strong>Material-Based Footsteps</strong>: Changing footstep (or other) sounds based on surface-type hit results.</li> </ul> <br /> <strong>Overlap Events</strong>: In my opinion, overlap events represent Unreal Engine implementation 101. They allow us to create a Blueprint event from something we can place in the level. Once we are in Blueprints, sound designers have a lot of control over any sound system they might design.  <div style="text-align: center;"><img alt="TechBlogs_ShooterGame_CraftMediaGroup_TriggerVolume_compressed.gif" height="auto" src="" width="auto" /></div> <div style="text-align: center;"><em>Figure 1. Placing a Brush Actor to trigger an overlap event in an area where I want to have custom audio behavior in the level.</em></div> <br /> To illustrate this, in Figure 1 above, I’ve placed a custom volume (for the reverb) and trigger (for the ambience) to match the layout of the room. Any time the player overlaps these invisible Brush Volumes in the level, we can do any of the following:<br />   <ul style="margin-left: 40px;"> <li>Change the room tone or ambient bed that’s playing in the space.</li> <li>Switch to more unique one-shot sounds that trigger randomly around the player’s world position while we’re in that space. 
</li> <li>Change the reverb for any sounds inside this space.</li> <li>Turn sounds on and off either inside or outside the volume.</li> <li>Use HPFs and LPFs for sounds that are playing either in this volume or outside the volume.</li> </ul> <div style="text-align: center;"><img alt="TechBlogs_ShooterGame_BP_Overlap_Ambience_01.PNG" height="auto" src="" width="auto" /></div> <div style="text-align: center;"><em>Figure 2. Overlap events execute logic that controls ambience.</em></div> <br /> In Figure 2, I demonstrate a simple Blueprint script, which executes when an overlap event occurs. First, I do a check to ensure that the actor that caused the overlap event and the controlled pawn are the same. I then retrieve an audio component I have in the world by reference and fade it in. Boom, simple. Note that the reason I need to check that the right actor is triggering the volume is that NPCs (or a character controlled by another player across a network connection) might change the ambient sounds on our local client based on their actions. That would get confusing really fast.<br /> <img alt="TechBlogs_ShooterGame_BP_Ambient_OS.PNG" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>Figure 3. Here’s audio logic that spawns one-shot sounds at random vector coordinates around the player’s world location.</em></div> <br /> In Figure 3, I’ve created some logic that uses overlap events and a timer (Set Timer by Function Name) to periodically spawn a random one-shot sound around the player’s location. The distance and frequency at which these sounds are spawned are completely user-defined via public float variables that I can easily tweak and iterate on as I try different sounds out in PIE. This is, of course, in addition to our standard pitch and volume variations within the cue.<br /> <img alt="TechBlogs_ShooterGame_CraftMediaGroup_P_Audio_Spline.PNG" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>Figure 4. 
Audio logic for the “BP_Audio_Spline” Blueprint.</em></div> <h3><strong>Actor Blueprints</strong></h3> If you haven’t already discovered their power, take it from me: Blueprints are extremely powerful and unlock an immense amount of audio functionality and customization. For example, in Figure 4, I show my utility script that handles the logic of playing a sound on the point of a spline closest to the listener. Figure 5 below shows an example spline path for the sound to travel. <br /> <img alt="TechBlogs_ShooterGame_spline_Wind.PNG" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>Figure 5. I draw a spline which follows the balcony’s edge in the High-Rise Level. My looping wind sound will emit from the closest point on the spline to the player’s position.</em></div> <br /> Typical use cases for this kind of thing might be water flowing in pipes, looping water sounds for curved rivers or streams, and so on. The reason you would want to do this is to create the sense of a larger, distributed sound source versus a single point source or a series of point sources. <div style="text-align: center;"><strong><img alt="TechBlogs_ShooterGame_Spline_Follows.gif" height="auto" src="" width="auto" /></strong><br /> <em>Figure 6. A visual depiction of audio following the player’s closest position to a spline.  </em></div> <br /> In this project, I’m using it in a few areas. Figure 6 shows one case where I use it along the balcony edge. I simply created a spline path for the audio to traverse and used a windy/exterior looping sound to play on that path. As a result, it feels like the closer you get to the edge, the more wind you hear. But it won’t rise and fall in volume as you move along the edge, the way a series of one-shot point sources would. <h3><br /> <strong>Material-Based Footsteps</strong></h3> <img alt="TechBlogs_ShooterGame_Animation_Foosteps_01.PNG" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>Figure 7. 
The Blueprint script that changes footstep sounds based on the hit result of a line trace.</em></div> <br /> In general, changing which footstep or other foley sounds play based on what physical material the character is walking on or interacting with does not require code support in Unreal Engine. Figure 7 shows my logic for footsteps and jumplands (i.e. the sound that plays when the character hits the ground after a jump). Something like this script can be used for any case where you want to use a line trace, obtain information about what was hit, and change sounds based on that result. <br /> <img alt="TechBlogs_ShooterGame_CraftMediaGroup_Anim_Notify.PNG" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>Figure 8. Playsound notifies trigger sounds while animation notifies trigger Anim BP graph events.</em></div> <br /> Figure 8 shows that the events seen in this Animation Blueprint are being triggered by individual animation notify events that I’ve set up and named “Footstep” in various run and sprint animations. <br /> <br /> Before utilizing this method, you first need to ensure that the project actually has all the physical materials set up on different surfaces. Which physical material types exist is defined in the Project Settings. In this sample project, I’m using only tile, grass, and metal materials. However, if your project has a unique material like the “liquid metal” in the movie <em>Terminator</em>, you’ll want to define it in the project’s settings. Once they are set up, you’ll be able to assign those surface types to physical materials and meshes that have been placed in the level. Usually, environment artists have set these up for graphics purposes already, so sound designers often just need to go and see what exists and use them. <div style="text-align: center;"><em><img alt="TechBlogs_ShooterGame_footstep_map.png" height="auto" src="" width="auto" /><br /> Figure 9. 
Using a “map” variable to associate physical material types and sound cues</em></div> <br /> In Figure 9, I’ve created a Map variable where I associate the hit result of my line trace with different sounds based on the surface that’s been hit. The Blueprint script performs the following logical sequence:<br />   <ul style="margin-left: 40px;"> <li>The anim notify triggers an event</li> <li>It gets the Pawn Owner and casts to the Player Pawn BP</li> <li>It then gets the world location (my starting vector location)</li> <li>It then subtracts 100 units (my ending vector location)</li> <li>It performs a line trace using those coordinates</li> <li>If the line trace hits a surface, it tells me what it was</li> <li>It returns the physical material that was hit and finds the sound reference I’ve set up in my map variable</li> <li>It then sets a sound variable called “Sound To Play” using the result found in my map variable</li> <li>Finally, it references the mesh on the player pawn and spawns my footstep sound attached to the mesh</li> </ul>   <h3><strong>Closing Notes</strong></h3> There are often several ways to implement an idea or system in Unreal Engine. While there’s usually not a right or wrong way to do things per se, there’s often a more efficient way. If you’re working on a team, I recommend asking a programmer or designer to periodically review and critique your Blueprint logic as a general best practice. The more you educate yourself and tap into systems outside of audio, the more deeply engaged you will become in the design process, which allows you to be a co-equal developer alongside other game designers and developers using Unreal Engine. 
That’s a pretty cool thing.<br /> <br /> For anyone looking to learn more about Unreal Engine audio, I recommend joining my Facebook group “<a href="" target="_blank">Unreal Engine Audio Tips & Tricks</a>,” checking out some of the UE courses on <a href="" target="_blank">Udemy</a>, joining the <a href="" target="_blank">Unreal Engine Forums</a>, and exploring YouTube channels like <a href="" target="_blank">Mathew Wadstein’s WTF Is</a> series. Epic has also recently released <a href="" target="_blank">Online Learning Courses</a> that you should check out. Feel free to contact me via my <a href="" target="_blank">LinkedIn</a> or visit my website <a href="" target="_blank"></a>.<br /> <br /> Thanks for reading!<br /> Games | Blueprints | Design | Features | Learning | Senior Technical Sound Designer Adam Block | Mon, 09 Mar 2020 21:00:00 GMT <h2><strong>Unreal Online Learning courses for learning AI with Blueprints, physics-based shotviz, and more</strong></h2> Check out the latest Unreal Online Learning courses! Learn how to become a Marketplace publisher, get started with Datasmith, explore how AI agents work within a video game environment, and learn how shotviz can save you time and money while filming on location.<br /> <br /> Our roster of <a href="" target="_blank">Unreal Online Learning</a> courses is growing, and new courses are being added every month. Whether you’re looking to learn something entirely new in Unreal Engine or build upon your existing skillset, Unreal Online Learning has the resources you need.  <br /> <br /> Check out our newest Unreal Online Learning courses below!<br /> <br /> <a href="" target="_blank">Creating Marketplace Content</a><br /> <br /> The Unreal Engine Marketplace is an e-commerce platform that developers can use to purchase and download assets for use in their projects. This course will provide all the information you need to create a Marketplace publisher profile and launch your first product. 
<br /> <img alt="News_UOL_blogimg.jpg" height="auto" src="" width="auto" /><br /> <a href="" target="_blank">Getting Started with Datasmith</a><br /> <br /> This course covers how to use Datasmith to bring AEC and manufacturing 3D assets into Unreal Engine. Find out how to install Datasmith exporters, how to import and work with assets, and how to showcase the results. <div style="text-align: center;"><img alt="News_UOL_LMS-Hero-Image-1200x675.png" height="auto" src="" width="auto" /></div> <a href="" target="_blank"> Introduction to AI with Blueprints</a><br /> <br /> In this course, you’ll be introduced to Unreal Engine's AI tools, exploring how AI agents work within a video game environment and the systems used to achieve realistic behaviors. <br /> <img alt="News_UOLBlog_Body_Image_1.jpg" height="auto" src="" width="auto" /><br /> <a href="" target="_blank">Physics-Based Shotviz</a><br /> <br /> In this course, you’ll learn about shotviz. Save time and money while filming on location when you scout and pre-plan setups using physically accurate lighting and cameras on a virtual location. <div style="text-align: center;"><img alt="News_UOL_PhyscisBasedShotviz_cms_L_1200_675.jpg" height="auto" src="" width="auto" /></div> <br /> <a href="" target="_blank">Revit to Unreal Engine Fundamentals</a><br /> <br /> Learn how to leverage your Revit models in Unreal Engine and create stunning images and animations in real time. <div style="text-align: center;"><img alt="News_UOL_214-CMS-Desktop-1200x675.png" height="auto" src="" width="auto" /></div> Ready to take your interactive 3D career to the next level? Check out all of our other <a href="" target="_blank">Unreal Online Learning</a> courses and guided learning paths. 
You can even earn badges for every course you complete and share them on social media!<br /> Unreal Online Learning | Learning | Education | Melissa Robinson | Fri, 06 Mar 2020 20:00:00 GMT <h2><strong><em>The Cycle</em> blends genres to create a PVPVE shooter</strong></h2> The Cycle Creative Director Torkel Forner and Executive Producer Jonathan Lindsay share their inspirations behind The Cycle and detail the Unreal Engine tools that helped make the unique game possible.<br /> <br /> When you think about an online multiplayer game, what comes to mind? Probably something PVP-oriented, where players compete against each other to win a prize. Maybe it’s something a little more cooperative, like a PVE experience, where players band together to take on swarms of enemies in the pursuit of survival or glory. <em><a href="" target="_blank">The Cycle</a></em> is both of these ideas mixed together, while being completely unique in and of itself. <br /> <br /> Developed by <a href="" target="_blank">YAGER</a>, <em>The Cycle</em> features a unique, simultaneous blend of “PVPVE” gameplay that forces players to fight dangerous monsters on an alien planet, while also tasking them with completing contracts and escaping a devastating weather anomaly. <br /> <br /> To see how the Berlin-based studio is continually evolving the competitive online game, we interviewed YAGER Creative Director Torkel Forner and Executive Producer Jonathan Lindsay. The two share their inspirations behind <em>The Cycle</em> and delve into the tools that helped the team materialize their game.   <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong><em>The Cycle</em> features a unique premise that has players duking it out for resources on an alien planet. What inspired this concept? 
</strong><br />  <br /> The original gameplay inspiration came from old memories of playing World of Warcraft, in particular going to the first PVP areas. The added tension of other hostile players made an already fantastic experience all the more memorable. As for the universe, setting, and feeling that we’re trying to achieve, we’ve drawn inspiration from westerns and science fiction in our pursuit of creating a world that players continuously want to come back to. Another inspiration is the main loop of a typical MOBA game, [which requires you] to start every match from zero, level up your character and gear rapidly, and join a competitive endgame that has you fighting other players. We’ve strived to achieve a similar loop in a shooter game, creating a tactical and intense experience.<br />  <br /> <strong>What do you hope for <em>The Cycle</em> and what kind of audience would you like to attract? </strong><br /> <br /> We’re looking to create a game and an experience for players who want to be free to experiment, explore, and express themselves, all within a competitive match. We want to create a game that allows players to play in many different ways, and we believe that we can do this best by creating, essentially, a competitive sandbox game. Of course, we are looking for experienced players, as the competitive elements in our game are quite strong; having smarts and developed shooter skills will definitely help you win matches.<br /> <img alt="DeveloperInterview_TheCycle_PVE.jpg" height="auto" src="" width="auto" /><br /> <strong>What was a key moment in the game’s development or conceptualization that made the team think they stumbled onto something promising?</strong><br /> <br /> There were a lot of moments like that during development; we playtest the game every day with the entire team, and even early on, we had moments and matches where it felt like we were hitting the mark.
However, we strive to constantly improve the game, and whenever we feel that we’re hitting that sweet spot, we try to aim higher and make it even sweeter.<br /> <br /> <strong>How big is the team?</strong><br />  <br /> The team comprises approximately 60 developers, all working together in our Kreuzberg studio in Berlin, Germany.<br /> <img alt="DeveloperInterview_TheCycle_EGS_makeover_uplinkcobots_11.jpg" height="auto" src="" width="auto" /><br /> <strong>Can you speak to the team’s experience using Unreal?</strong><br /> <br /> We initially switched from our in-house engine to Unreal 3 back in 2004. Later, we became early adopters of UE4. We stayed with Unreal because of the engine’s incredibly high usability, which allows us to iterate very quickly. <br /> <br /> <strong>Why was Unreal a good fit for <em>The Cycle</em>?</strong><br /> <br /> We really value the usability of the editor and the ability for everyone on the team to create gameplay. Also, the wide range of visual possibilities that the engine offers is unmatched in the industry. Over the years, this has helped us deliver experiences to players and customers in a broad variety of ways. We are able to look back at more than 15 years of strong and knowledgeable support from Epic’s Unreal team. Now, we have such a high level of Unreal expertise in our studio that it’s become pointless for us to look for alternative engines, as we’re able to develop much faster and better with Unreal.  <br /> <img alt="DeveloperInterview_TheCycle_21.jpg" height="auto" src="" width="auto" /><br /> <strong>Does the team have any favorite Unreal Engine tools? </strong><br /> <br /> A major boost for us was <a href="" target="_blank">Blueprint</a> scripting, which allows us to experiment much faster and empowers our whole team to create gameplay experiences.
Also, the introduction of <a href="" target="_blank">live coding</a> greatly helped our gameplay engineering team.<br /> <br /> <strong>For an ever-evolving game like <em>The Cycle</em>, are there any tools that help the team deliver a regular cadence of content?</strong><br /> <br /> With [Unreal’s] streamlined import pipeline, which includes <a href="" target="_blank">automatic LOD generation</a>, texture management/streaming, and the ease of using the <a href="" target="_blank">material editor</a>, we are able to add content on a constant basis without needing to create a [custom] pipeline, as one is [already] available out of the box.<br /> <img alt="DeveloperInterview_TheCycle_6.jpg" height="auto" src="" width="auto" /><br /> <strong>Was there a difficult concept to execute on that Unreal assisted with?</strong><br /> <br /> Without Unreal’s <a href="" target="_blank">online subsystems</a>, the features we are currently developing (friends, clans, and leaderboards) would be much harder to build, especially as we plan to take <em>The Cycle</em> to multiple platforms.<br />  <br /> <strong>For more information on <em>The Cycle</em>, visit: </strong> <ul style="margin-left: 40px;"> <li><a href="" target="_blank"></a></li> <li><a href="" target="_blank"><em>The Cycle</em>’s Facebook page</a></li> <li><a href="" target="_blank"><em>The Cycle</em>’s Twitter</a></li> <li><a href="" target="_blank"><em>The Cycle</em>’s YouTube page</a></li> <li><a href="" target="_blank"><em>The Cycle</em>’s Instagram</a></li> </ul> Tags: PVPVE, The Cycle, UE4, Unreal Engine, YAGER, Games, Blueprints, Community | Charles Singletary Jr. | Thu, 05 Mar 2020 15:30:00 GMT <hr /> <h2><strong>4D simulation transforms maintenance planning for the UK’s railways</strong></h2> Network Rail dramatically improves the efficiency, cost, and time to completion for track renewal programs, thanks to a UE4-powered virtual simulation environment developed at the University of Salford.<br /> <br /> The UK’s railways carry around <a href="" target="_blank">50 percent more
passengers</a> today than a decade ago, bringing the total to about four million people a day. With those numbers set to increase, it’s imperative that improvements and modernizations are carried out to deliver more frequent, reliable, and safe services, while causing minimal disruption to passengers. <br /> <br /> For Network Rail, the publicly owned infrastructure manager that maintains large swathes of Britain’s train tracks, this means track renewal programs must be carefully planned and designed to avoid major service disruption. <br /> <br /> To achieve this, the engineering team that designs track renewal programs can now use a UE4-powered 4D simulation tool developed by the University of Salford’s <a href="" target="_blank">THINKlab</a>. By evaluating rail works in a virtual simulation environment, Network Rail can dramatically improve efficiency and reduce the cost and time it takes to deliver a project. “In the past it would take a full week to plan a weekend’s work,” says Steve Naybour, Head of Transformation at the Network Rail South Alliance. “This effort can now be reduced to a few hours using the new tool.” <br />   <div style="padding:56.25% 0 0 0;position:relative;"><iframe allow="autoplay; fullscreen" allowfullscreen="" frameborder="0" src="" style="position:absolute;top:0;left:0;width:100%;height:100%;"></iframe></div> <script src=""></script> <h3><strong>Harnessing the power of BIM and real-time simulation</strong></h3> THINKlab is a state-of-the-art “future center” that is part of the University of Salford’s <a href="" target="_blank">School of Science, Engineering & Environment (SEE)</a>. It leads research across a number of fields including ICT platforms for urban regeneration, engineering, and smart city applications, with a strong emphasis on industry and interdisciplinary collaboration.<br /> <br /> The simulation tool THINKlab has developed enables Network Rail to build a 3D model of any site from a range of data sources.
Virtual tracks are positioned onto 3D digital terrain from CAD or laser-scanned track data. Building Information Modelling (BIM) models of overhead line equipment, ballast, sleepers, rails, and signalling apparatus can be imported for a more accurate representation of the physical site. And a library of plant equipment assets provides excavators and diggers to include in the simulation. <br /> <img alt="Spotlight_NetworkRail_blog_body_img3.jpg" height="auto" src="" width="auto" /><br /> Users can define the resources needed to complete the project and build a timeline of activities, with task interdependencies mapped. Costing functionality in the software provides an accurate view of the financial impact of choices. As planning decisions are made and data is input, the software automatically simulates the work, offering views from a number of camera angles, at different levels of magnification, and at different speeds. <br /> <br /> Any changes in the planning inputs are immediately represented in the simulation. This results in much shorter feedback loops compared to the conventional approach to computer modeling, because engineers, designers and planners don’t have to rely on computer-modeling specialists to see the implications of changes to their plans. <br /> <img alt="Spotlight_NetworkRail_blog_body_img1.jpg" height="auto" src="" width="auto" /><br /> The tool has proven invaluable for assigning the right amount of resources to Network Rail’s track renewal programs. “Whereas we might have used five pieces of equipment for a job, we can often use two or three,” says Stephen Kearney, Head of Development at S&C Alliance South East. “With this package, we can see and prove what we will need in advance.” <h3><strong>Leveraging Blueprints to improve workflow efficiency</strong></h3> Unreal Engine is the go-to solution for nearly all of THINKlab’s visualization work. 
“We started migrating most of our projects to UE4 shortly after it became public, as it provided vastly superior image quality out of the box and allowed much easier and faster iteration of our scenes,” says Michal Cieciura, Lead Developer at THINKlab. “The licensing model was fantastic, and we were getting a top-tier engine for next to nothing.”<br /> <img alt="Spotlight_NetworkRail_blog_body_imgBlueprints.jpg" height="auto" src="" width="auto" /><br /> With access to the engine’s source code, the team’s developers could familiarize themselves with the framework much faster, which allowed them to optimize the mechanics early on and to a much greater degree.<br /> <br /> They also made good use of the Blueprint visual scripting system, which puts tools ordinarily reserved for programmers into the hands of designers and other non-programmers. “The Blueprint system not only made prototyping and experimental designs a breeze for our programmers, but it also allowed non-programming team members to become more self-sufficient, as they were able to build parts of the needed functionality themselves,” says Cieciura. “This, in turn, made our workflow more efficient overall.”<br /> <img alt="Spotlight_NetworkRail_blog_body_imgBlueprints2.jpg" height="auto" src="" width="auto" /><br /> Having worked with node-based interfaces in their respective modeling packages, team members on the project found the engine’s Material Editor easy to get to grips with. “What’s more, the PBR materials, in combination with Unreal Engine’s lighting solutions, have not only improved visual realism in real time, but also removed the requirement for offline rendering,” says Cieciura.
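The planning workflow described above (define activities with durations and costs, map their interdependencies, and let the software recompute the schedule whenever an input changes) can be sketched as a small dependency-aware scheduler. The following is a minimal, hypothetical illustration in plain C++, not THINKlab's actual implementation; every name in it (`Task`, `Plan`, `schedule`) is invented for the sketch.

```cpp
// Sketch of a dependency-aware work planner: each task has a duration,
// a cost, and a list of tasks that must finish before it can start.
// All names here are illustrative only.
#include <algorithm>
#include <cassert>
#include <map>
#include <string>
#include <vector>

struct Task {
    std::string name;
    int durationHours;
    double cost;                        // e.g. plant hire plus labour
    std::vector<std::string> dependsOn; // tasks that must finish first
};

struct Plan {
    std::map<std::string, int> start;   // earliest start hour per task
    int makespanHours = 0;              // total elapsed time for the job
    double totalCost = 0.0;
};

// Compute earliest start times by relaxing dependency constraints until
// stable (a longest-path pass; assumes the dependency graph is acyclic,
// which holds for a work timeline).
Plan schedule(const std::vector<Task>& tasks) {
    Plan plan;
    std::map<std::string, const Task*> byName;
    for (const auto& t : tasks) byName[t.name] = &t;
    for (const auto& t : tasks) plan.start[t.name] = 0;

    bool changed = true;
    while (changed) {
        changed = false;
        for (const auto& t : tasks) {
            int earliest = 0;
            for (const auto& dep : t.dependsOn) {
                const Task* d = byName.at(dep);
                earliest = std::max(earliest, plan.start[dep] + d->durationHours);
            }
            if (earliest > plan.start[t.name]) {
                plan.start[t.name] = earliest;
                changed = true;
            }
        }
    }
    for (const auto& t : tasks) {
        plan.makespanHours =
            std::max(plan.makespanHours, plan.start[t.name] + t.durationHours);
        plan.totalCost += t.cost;
    }
    return plan;
}
```

Because the whole plan is recomputed from the task list, changing any duration, cost, or dependency and calling `schedule` again immediately reflects the change, mirroring the short feedback loops the article describes when planning inputs are edited.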
<h3><strong>Improving ROI and delivery times with interactive workflows</strong></h3> The transformation of Network Rail’s infrastructure renewal programs comes at a time when the UK government has embarked on a <a href="" target="_blank">series of programs</a> aimed at reinvigorating the UK construction sector using digital technologies. <br /> <img alt="Spotlight_NetworkRail_blog_body_img2B.jpg" height="auto" src="" width="auto" /><br /> One of the key objectives of these programs has been to drive the adoption of BIM to better plan, build, maintain, and use infrastructure. By leveraging the power of real-time simulation along with BIM models, the virtual simulation environment developed by the THINKlab shines a light on where this digital transformation could go next. <br /> <br /> For Network Rail, the innovation has already proved to be worth its weight in gold. “This tool reduces the risk of incurring additional cost and time in the delivery of the work,” says Ameet Masania, Programme Manager at Network Rail. “It’s becoming an integral tool in the way we deliver railway jobs.”<br /> <br /> Are you interested in finding out how Unreal Engine can be used for simulation in your industry? <a href="" target="_blank">Get in touch</a> and we’ll get that conversation started.<br /> Tags: Automotive & Transportation, Training & Simulation, Blueprints, Infrastructure, Network Rail | Ken Pimentel | Thu, 05 Mar 2020 12:00:00 GMT <hr /> <h2><strong>Unreal Engine 4.25 Preview 1 now available</strong></h2> Take the production-ready Niagara VFX system and ray-tracing technology for a test run while trying out even more of the latest features coming in Unreal Engine 4.25.<br /> <br /> <strong>UPDATE:</strong> 4.25 Preview 3 is now available. <hr /> Ready to see what's next with Unreal Engine? Unreal Engine 4.25 Preview 1 is here, and you can start putting the latest features to the test today.
<a href="" target="_blank">Niagara VFX</a> and <a href="" target="_blank">real-time ray tracing</a> are both production-ready in this release, so you can unlock even more creative potential with these production-proven features. Be sure to try out the latest updates to Hair and Fur, the Unreal Insights profiling tools, and general navigation improvements.<br /> <br /> To take 4.25 Preview 1 for a spin, head to your Library on the Epic Games launcher under the Unreal Engine tab, select “Add Versions”, and choose 4.25 Preview 1. <div style="text-align: center;"><img alt="425Preview1_Blog.jpg" src="" /></div> For a complete list of updates included in 4.25 Preview 1, visit the <a href="" target="_blank">Unreal Engine 4.25 Preview thread</a>. We invite you to provide feedback on this Preview and subsequent releases in the respective thread. <br /> <br /> Please keep in mind that Previews are intended only to provide a sample of what will be included in the full release and are not production-ready. We encourage you to use copies of your projects with the Previews and to wait for the final 4.25 release before updating to the new engine version.<br /> Tags: Architecture, Film & Television, Games, Training & Simulation, Virtual Production, Broadcast & Live Events, Community, News | Amanda Schade | Wed, 04 Mar 2020 14:30:00 GMT