Unreal Engine - News, Developer Interviews, Spotlights, Tech Blogs<br /> Feed containing the latest news, developer interviews, events, spotlights, and tech blogs related to Unreal. Unreal Engine 4 is a professional suite of tools and technologies used for building high-quality games and applications across a range of platforms. Unreal Engine 4’s rendering architecture enables developers to achieve stunning visuals and also scale elegantly to lower-end systems.<br /> Source: https://www.unrealengine.com/rss (en-US)<br /> <hr /> <h1>Connect with the Unreal Engine community online</h1> Although many physical events around the world are currently on hold, there are plenty of places to connect with the Unreal Engine community online. From forums to webinars, livestreams, and full-on virtual events, our community of creators is staying continually active.<br /> <br /> Below is a list of permanent resources and online activities that we’d love to invite you to. Please check this post often, as it will be updated regularly with newly added events. <div style="text-align:center"><img alt="UE_Community_Online_Feed-thumb-desktop.png" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fconnect-with-the-unreal-engine-community-online%2FUE_Community_Online_Feed-thumb-desktop-1400x788-a6917dba2f2cc41cc1426a3afdff135521cea738.png" style="height:auto; width:auto" /></div> <h2><strong>PERMANENT, FREE RESOURCES</strong></h2> <a href="https://www.unrealengine.com/support" target="_blank"><strong>Support and Documentation</strong></a><br /> From your first steps with Unreal Engine to completing your most ambitious real-time project, we’re here to help.
With comprehensive reference documentation, instructional guides, community-based support, and options for dedicated professional support, you have full access to everything you need to succeed.<br /> <br /> <a href="https://www.unrealengine.com/onlinelearning-courses" target="_blank"><strong>Unreal Online Learning</strong></a><br /> This growing catalog of nearly 50 courses and guided learning paths tracks your progression and awards your achievements, whether you're spending your first hours in tools such as Sequencer or brushing up on your visualization skills.<br /> <br /> <a href="https://www.youtube.com/channel/UCBobmJyzsJ6Ll7UbfhI4iwQ" target="_blank"><strong>Unreal Engine on YouTube</strong></a><br /> Here's where you'll find archives of Inside Unreal, live training, and other broadcasts from our <a href="https://www.twitch.tv/unrealengine" target="_blank">Twitch</a> channel; tech talks from GDC, Unreal Fest Europe, and other conferences; and much more.<br /> <br /> <strong><a href="https://www.unrealengine.com/en-US/events/webinar-series" target="_blank">Webinar Series</a></strong><br /> Check out our free webinars to learn all about the latest Twinmotion features and workflows, and how to use Unreal Engine to create photorealistic scenes and interactive designs.<br /> <hr /> <h2><strong>UNREAL EVENTS</strong></h2> <br /> <strong>May 7, 2020 | <a href="https://forums.unrealengine.com/unreal-engine/events/1755035-inside-unreal-spyder-from-day-zero-to-post-launch" target="_blank">INSIDE UNREAL: Spyder: From Day Zero to Post-Launch</a></strong><br /> We're excited to have Sumo Digital on Inside Unreal this week! The team behind the spy-on-the-wall adventure game Spyder talks about their journey from day zero to the weeks after launch. Learn more about the Apple Arcade mobile game, as well as how the team overcame difficult challenges during its production—including the experience of launching a game from home.<br /> <br /> <strong>May 8, 2020 | 2pm ET - <a href="https://www.twitch.tv/unrealengine" target="_blank">Twinmotion 2020 for Education</a></strong><br /> During this livestream, we will explore the new Twinmotion 2020 and discuss how best to integrate it into your academic environment. We'll show you how incredibly easy it is to assemble a stunning environment in minutes, and walk you through the newest features.<br /> <hr /> <h2><strong>ADDITIONAL EVENTS</strong></h2> Members of our team often give presentations, join panel discussions, or otherwise share their insight at a variety of online events. Here are some events where Epic plans to have a presence.<br /> <br /> <strong><a href="https://www.awexr.com/" target="_blank">AWE</a></strong><br /> Get awe-inspired with the best in AR/VR, now fully online from May 26-29, 2020.
<a href="https://augmentedworldexpo.secure.force.com/BuyTicket?id=7011H000000ypzO&amp;__hstc=236398302.aaa4a43e8e4309bf5239d031331d5b85.1578939614506.1588861528455.1589294573334.23&amp;__hssc=236398302.1.1589294573334&amp;__hsfp=3853071260&amp;_ga=2.122877774.439324939.1589294573-1518862469.1578939613" target="_blank">Register now</a> to view presentations from Unreal Engine experts in the realms of 5G and spatial computing technologies, virtual production, and simulation technologies.<br /> <br /> Marc Petit, General Manager of Unreal Engine, presents: <a href="https://www.awexr.com/usa-2020/agenda/1285-what-s-next" target="_blank">What's Next?</a><br /> David Morin, Industry Manager for Film &amp; TV, presents: <a href="https://www.awexr.com/usa-2020/agenda/1773-creating-in-camera-vfx-with-real-time-workflows" target="_blank">Creating In-Camera VFX with Real-Time Workflows</a><br /> Sebastien Loze, Industry Manager for Training and Simulation, co-presents with Myra LalDin, CEO of Perspectives: <a href="https://www.awexr.com/usa-2020/agenda/1579-learning-by-living-memorable-virtual-experiences" target="_blank">Learning by Living Memorable Virtual Experiences</a><br /> <br /> <strong><a href="https://www.nvidia.com/gtc/" target="_blank"><strong>NVIDIA GTC Digital</strong></a></strong><br /> GTC Digital is a training, research, insights, and direct access to the brilliant minds of NVIDIA’s GPU Technology Conference, now online. Epic Games' Film &amp; TV industry manager, David Morin, presented: <em><a href="https://developer.nvidia.com/gtc/2020/video/s22160" target="_blank">Creating In-Camera VFX with Real-Time Workflows</a></em>, now available on demand.<br /> <br /> <strong><a href="http://www.c4dlive.com/" target="_blank"><strong>C4D Live</strong></a></strong><br /> In light of the cancellation of the 2020 NAB Show, Maxon hosted a virtual NAB presence on C4DLive.com featuring an incredible line-up of presenters. Epic Senior Technical Product Designer Andy Blondin gave a 25-minute virtual presentation: <a href="https://youtu.be/tipehpvKHzE" target="_blank"><em>Blurring the Lines between Film, Television, and Games</em></a>, now available on demand.<br /> <br /> <strong><a href="https://nabshow.com/express/" target="_blank">NAB Show Express</a></strong><br /> On Wednesday, May 13, Miles Perkins, Business Development Manager for Film &amp; TV, participated in a panel: <a href="https://dir.nabshowexpress.com/8_0/sessions/session-details.cfm?scheduleid=1713" target="_blank">How is Virtual Production Changing Television?</a> To watch the archived recording, <a href="https://dir.nabshowexpress.com/8_0/login/login.cfm" target="_blank">register for a free account</a>.<br /> &nbsp; <hr /> <h2><strong>PREVIOUS UNREAL EVENTS</strong></h2> <br /> <strong>March 26, 2020 | 9am ET &amp; 2pm ET - <a href="https://www.unrealengine.com/en-US/blog/webinar-what-s-new-in-twinmotion-2020-1" target="_blank"><strong>WEBINAR: What’s New in Twinmotion 2020.1</strong></a></strong><br /> We recently hosted the live webinar What’s New in Twinmotion 2020.1. The replay is now available.&nbsp;In this webinar, Martin Krasemann, Twinmotion Technical Marketing Specialist at Epic Games presents a deep dive into some of the new Twinmotion 2020.1 features.<br /> <br /> Find out how the release brings improved fidelity and higher-quality lighting, more realistic vegetation and humans, new features to review and present projects, and more. 
<a href="https://www.unrealengine.com/en-US/blog/webinar-what-s-new-in-twinmotion-2020-1" target="_blank">Watch now</a>.&nbsp;<br /> <br /> <br /> <strong>March 26, 2020 | 2pm ET - <a href="https://forums.unrealengine.com/unreal-engine/events/1728586-blender-to-unreal-tools-part-3-march-26" target="_blank"><strong>INSIDE UNREAL: Blender to Unreal Tools, Part 3</strong></a></strong><br /> It's time for the third part in our "Blender to Unreal" series!<br /> <br /> In part 2 we covered how to work with Rigify and the Unreal Mannequin. We're leaving Manny behind for this adventure, and will proceed to demonstrate how to import custom animations, characters, and skeletons. <a href="https://youtu.be/yOELHvKqiHU" target="_blank">Watch now</a>.<br /> <br /> <br /> <strong>April 2, 2020 | 2pm ET - <a href="https://forums.unrealengine.com/unreal-engine/events/1728189-state-of-audio-in-4-25-date-tbd" target="_blank"><strong>INSIDE UNREAL: State of Audio in 4.25</strong></a></strong><br /> You may have heard that we have several new exciting audio features coming in 4.25, and this week we thought we’d take a look at them!<br /> <br /> To kick things off, the audio team will give a quick update on the Audio Mixer. From there we will shift to discussions and demos of the Native Convolution Reverb, Native Ambisonics Decoding and Encoding Support, as well as the new non-real-time audio analysis plugin Synesthesia. And finally, Wyeth Johnson will cap off the show with a demo on visualizing audio with the Niagara Audio Data Interface. <a href="https://youtu.be/wux2TZHwmck" target="_blank">Watch now</a>.<br /> <br /> <br /> <strong>April 2, 2020 | 9am ET &amp; 2pm ET - <a href="https://www.unrealengine.com/en-US/blog/webinar-unreal-engine-and-quixel-pushing-the-boundaries-of-3d" target="_blank"><strong>WEBINAR: Unreal Engine and Quixel: pushing the boundaries of 3D</strong></a></strong><br /> Traveling the planet with a mission to build the world’s largest library of scans, Quixel has sought to vastly simplify production of all digital experiences. Now, since joining forces with Epic Games, this mission is accelerating. &nbsp;<br /> <br /> In this presentation, Teddy Bergsman and Galen Davis demonstrated how Quixel Megascans, Bridge, and Mixer 2020—along with the power of Unreal Engine—are pushing the boundaries of what’s now possible with 3D. <a href="https://youtu.be/KhTPayu_YUs" target="_blank">Watch now</a>.&nbsp;<br /> <br /> <br /> <strong>April 9, 2020 | 2pm ET - <a href="https://forums.unrealengine.com/unreal-engine/events/1742289-iterative-design-for-comfort-april-9" target="_blank">INSIDE UNREAL - Iterative Design for Comfort</a></strong><br /> This week we'll focus on building functionality that helps designers understand what's going on in the game levels they're building. Little features, like simple debug indicators in the level editor, or easily-accessed Blueprint functions, can greatly increase the speed and comfort of work while also reducing user error and frustration. We'll also discuss how to iterate on code and design, to avoid impacting users when underlying code changes take place. 
<a href="https://youtu.be/2YiN8r-ZwrE" target="_blank">Watch now</a>.<br /> <br /> <br /> <strong>April 9, 2020 - May 14, 2020 - <a href="https://www.unrealengine.com/en-US/blog/join-the-unreal-fast-track-student-challenge-and-learn-unreal-engine-online-this-spring" target="_blank">Unreal Fast Track: Join the student challenge</a></strong><br /> Epic Games is bringing university students from around the world together to participate in the <a href="https://www.unrealengine.com/fast-track" target="_blank">Unreal Fast Track</a>, a personalized way to learn Unreal Engine, take part in exclusive Q&amp;As with industry professionals, and get to know best practices for working on remote teams. Epic is hosting an exclusive Discord hub for the initiative, which runs through May 14. <a href="https://www.unrealengine.com/en-US/blog/join-the-unreal-fast-track-student-challenge-and-learn-unreal-engine-online-this-spring" target="_blank">Read more</a>.<br /> <br /> &nbsp;<br /> <strong>April 16, 2020 | 2pm ET - <a href="https://forums.unrealengine.com/unreal-engine/events/1742628-houdini-workflows-with-unreal-engine-april-16" target="_blank">INSIDE UNREAL: Houdini Workflows with Unreal Engine</a></strong><br /> Mike Lyndon from SideFX will explore how the latest iteration of the Houdini Niagara data interface has simplified the process of exporting point caches and added new ways of using that data in Niagara. With it, you can import fluid sims, crowds or other types of static or animated caches. The second half of the presentation from Paul Ambrosiussen will provide you with a snapshot of the SideFXLabs toolset, and how it fits into an artists workflow of creating and exporting procedural data for Unreal Engine.<br /> <br /> <br /> <strong>April 17, 2020 | 2pm ET - <a href="https://youtu.be/ywu4l1RTFPU" target="_blank">Unreal Educator livestream series: Collaboration and Teamwork in Unreal Engine</a></strong><br /> Join us Fridays for a brand new livestream series geared specifically towards teaching and learning Unreal Engine in an academic environment. We are kicking off with a topic that is essential to achieving success in Unreal Engine: collaboration. Unreal Engine is designed from the ground up for teams and it’s easier than you think to get started using collaborative workflows. <a href="https://youtu.be/ywu4l1RTFPU" target="_blank">Watch now</a>.<br /> <br /> <br /> <strong>April 24, 2020 | 2pm ET - <a href="https://youtu.be/k1UBeAizxkQ" target="_blank">Unreal Educator livestream series: Unreal Educator’s Field Guide - Tips and Tricks for Instructors</a></strong><br /> In this presentation, Luis Cataldi explores the breadth of Epic resources available to help you teach Unreal Engine in your schools, online, or within your studio. We also share our favorite community resources that we have learned from and have been inspired by. This is a valuable video for anyone who teaches or helps others learn Unreal Engine. <a href="https://youtu.be/k1UBeAizxkQ" target="_blank">Watch now</a>.<br /> <br /> <br /> <strong>April 29, 2020 | 9am ET &amp; 2pm ET - <a href="https://www.unrealengine.com/en-US/blog/webinar-working-collaboratively-in-unreal-engine" target="_blank">WEBINAR: Working collaboratively in Unreal Engine</a>&nbsp;</strong><br /> It's easy to build Unreal Engine projects with numerous people in the same environment, live and in real time, whether you're creating a virtual set or designing a new product. 
In this presentation, we will demonstrate two unique collaborative workflows.<br /> <br /> The first enables multiple artists to make changes simultaneously to the same Unreal Engine project, safely and reliably. Updates happen on the fly for everyone in the group, with no wait. The second shows how to create a runtime experience of design data to conduct collaborative design review sessions. <a href="https://www.unrealengine.com/en-US/blog/webinar-working-collaboratively-in-unreal-engine" target="_blank">Watch now</a>.<br /> <br /> <br /> <strong>April 30, 2020 | 2pm ET - <a href="https://forums.unrealengine.com/unreal-engine/events/1749627-unreal-engine-4-25-release-highlights-april-23" target="_blank">INSIDE UNREAL: 4.25 Release Highlights</a></strong><br /> Join the team for a look at highlights from the Unreal Engine 4.25 release.<br /> <br /> <br /> <strong>May 1, 2020 | 2pm ET - <a href="https://forums.unrealengine.com/unreal-engine/events/1749631-unreal-educator-livestream-series-unreal-educator%E2%80%99s-field-guide-april-24" target="_blank">Unreal Educator livestream series: Quixel Bridge, Mixer, and Megascans for Education</a></strong><br /> During this livestream, we will dig into the Quixel ecosystem to better understand the amazing toolset and vast content library available to all for free, and explore academic-friendly workflows for generating breathtaking environments. <a href="https://forums.unrealengine.com/unreal-engine/events/1749631-unreal-educator-livestream-series-unreal-educator%E2%80%99s-field-guide-april-24" target="_blank">Learn more</a>.<br /> <br /> <br /> <strong>May 13, 2020 | 9am ET &amp; 2pm ET - <a href="https://www.unrealengine.com/en-US/blog/webinar-source-control-in-unreal-engine" target="_blank">WEBINAR: Source control in Unreal Engine</a></strong><br /> Planning, tools, and execution lie at the heart of any successful project. Agile teams often implement version control systems to record changes to files over time, enabling them to easily recall specific versions at a later point. This reduces risk and improves collaboration between developers.<br /> <br /> In this webinar, Technical Artist Matthew Doyle goes over the different options for source control in the Unreal Editor. <a href="https://bit.ly/3cPBLhM" target="_blank">Watch now</a>.<br /> <br /> <br /> <strong>May 20, 2020 | 9am ET &amp; 2pm ET - <a href="https://www.unrealengine.com/en-US/blog/webinar-synchronizing-rhino-data-into-twinmotion-in-one-click" target="_blank">WEBINAR: Synchronizing Rhino data into Twinmotion in one click</a></strong><br /> In this webinar, Scott Davidson of McNeel and Associates shows how to prepare your model in Rhino and get it ready for Twinmotion. Next, Martin Krasemann, Twinmotion Product Specialist at Epic Games, demonstrates how to import the Rhino scene file into Twinmotion, converting native materials into Twinmotion PBR materials while retaining the scene hierarchy. Martin enriches the scene with paths, vegetation, vehicles, and characters, taking it all the way to the final rendering.
<a href="https://www.youtube.com/watch?v=COTjOpslp_Q" target="_blank">Watch&nbsp;now</a>.<br /> <br /> --<br /> <br /> More online events will be added as they are confirmed, so please check back often!<br /> &nbsp;DocumentationEventsUnreal Online LearningWebinarLivestreamMon, 20 Apr 2020 20:00:00 GMTMon, 20 Apr 2020 20:00:00 GMThttps://www.unrealengine.com/blog/connect-with-the-unreal-engine-community-online VR medical simulation from Precision OS trains surgeons five times fasterhttps://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fvr-medical-simulation-from-precision-os-trains-surgeons-five-times-faster%2FSpotlight_PrecisionOS_Thumbnail-375x275-2dd3056dcfb86740fdd96ef225f6fc15451aba74.jpgVirtual reality gives surgeons the opportunity to train for every possible scenario. Find out how Precision OS’s virtual reality solution is changing the face of medical education through faster, more effective, more portable training.In June of last year, <a href="/spotlights/precision-os-delivers-accredited-curriculum-for-orthopedic-surgical-training-in-vr" target="_blank">we wrote about Precision OS</a> and their innovative use of Unreal Engine to run lifelike training for orthopedic surgeons. In this new video spotlight, the company talks about how their virtual reality solution is changing the face of medical education through faster, more thorough, more portable training, and the ability to train new surgeons for every possible scenario. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/6XrfSB0lvFE" width="100%"></iframe></div> <h3>Changing the way surgeons learn</h3> Dr. Danny P. Goel, CEO of Precision OS and an orthopedic surgeon himself, says their goal is to change the way we think about surgical education. “It's actually challenging 400 years of dogma, which has been to learn on plastic models or to learn on cadavers,” he says.&nbsp;<br /> <br /> Whereas medical textbooks and plastic models show neat-and-tidy representations of bone, muscle, and organs, the reality is quite different. Medical education also makes use of cadaver labs, lectures, and on-the-job training, but it can take healthcare providers months to become proficient. And even then, they won’t have seen every possible situation that can arise during a procedure.<br /> <br /> “So how do they get that practice before they get to the O.R.?” asks Goel. “You don't do the same procedure over and over again to become an expert, because every patient is different. It's doing the same thing with variation, which actually brings you towards expertise. And that is what we focus on with our technology.”<br /> <img alt="Spotlight_PrecisionOSBlog_Body_Image_1.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fvr-medical-simulation-from-precision-os-trains-surgeons-five-times-faster%2FSpotlight_PrecisionOSBlog_Body_Image_1-1640x900-ca437838a01c6e8549fa48edd51e7cf22f7d0113.jpg" style="height:auto; width:auto" /><br /> Rob Oliveira, Co-Founder and CCO at Precision OS, likens the training to a flight simulator that runs numerous different scenarios. “Pilots, when they get on a plane, they've had to have quite a few hours in the simulator,” he says. “VR is exciting because it moves it closer to that model, and they can practice and fail hundreds of times in a simulator or in VR. 
That's where you want them to fail, not on you.”<br /> <br /> In a recent independent study, experienced surgeons assessed the skills of VR and non-VR trainees. The VR group was found not only to be superior to non-VR trainees in technical skill, but also to have learned the information 570% faster. The study was published in the prestigious <a href="https://journals.lww.com/jbjsjournal/Abstract/2020/03180/Improved_Complex_Skill_Acquisition_by_Immersive.14.aspx" target="_blank">Journal of Bone and Joint Surgery</a> in early 2020.<br /> <img alt="Spotlight_PrecisionOSBlog_Body_Image_10.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fvr-medical-simulation-from-precision-os-trains-surgeons-five-times-faster%2FSpotlight_PrecisionOSBlog_Body_Image_10-1640x900-7c345416096e33e3c0101f28d96925ca20785d61.jpg" style="height:auto; width:auto" /><br /> Precision OS recently formed a <a href="https://www.precisionostech.com/sign-fracture-care-international-and-precision-os-join-efforts/" target="_blank">partnership with SIGN Fracture Care</a>, a humanitarian organization that provides fracture surgery to the injured poor. SIGN will distribute Oculus Quest wireless VR headsets to 365 hospitals in 53 countries, and then deliver Precision OS’s surgical training to these regions remotely. These efforts will not only shorten the time for training, but will also free up the time and funds previously spent on flying surgeons to training facilities. <h3>The right tool for the job</h3> Colin O’Connor, Co-Founder and CTO at Precision OS, says they chose Unreal Engine for its visual quality and its toolset. “We want to make sure that we hit that high-fidelity mark right from the outset,” he says. “On top of that, Unreal has everything built into it.”<br /> <br /> Interested in finding out how you could use Unreal Engine to improve your field? <a href="mailto:simulation@epicgames.com" target="_blank">Get in touch</a> to start that conversation.<br /> Tags: Medical, Precision OS, Training & Simulation, VR | By Sébastien Lozé | Published: Tue, 26 May 2020 16:24:17 GMT | https://www.unrealengine.com/spotlights/vr-medical-simulation-from-precision-os-trains-surgeons-five-times-faster<br /> <hr /> <h1>Finding Funding: So you need some money to make a video game…</h1> <em><strong>EDITOR'S NOTE:</strong> Author Mike Futter is a freelance journalist and gaming industry consultant.</em><br /> <br /> Making a video game is a series of creative challenges.<br /> <br /> Some of those challenges, you’ve prepared for. You’ve gone to school for programming, you’ve honed your visual arts skills, or you’ve built a career in sound design. Perhaps you’ve worked in a QA shop or done some freelance localization. Maybe you’re freshly “indie” after years in AAA. But unless you’ve studied finance, happened to stumble into those responsibilities earlier in your career, or have the benefit of a business professional on staff, you may not be quite as ready for those important tasks.<br /> <br /> You’ve got big creative ideas.
Whether slightly blurry or completely in focus, you have a picture in your mind of a game you want to make. What you may not have is the money to execute on that vision. You know you need financing, but you might not know the differences between the types of funding or how to secure it.<br /> <br /> Before you start pitching your studio or project, it’s important to understand who you’re talking to. To that end, we’ve assembled a brief primer on the different types of funding you might pursue. <div style="text-align:center"><img alt="Bloodstained_Pic1.png" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Ffinding-funding-so-you-need-some-money-to-make-a-video-game%2FBloodstained_Pic1-800x450-4c002ed1264d8d294894654ec094e350c712714b.png" style="height:auto; width:auto" /></div> <h2><strong>Venture capital, angel investment, project funding, equity investing... What does it all mean?</strong></h2> <strong>Private equity (or “PE”):</strong> Private equity investment comes from individuals or firms that purchase ownership in companies that have not yet gone public, or in public companies that investors plan to take private (though the latter isn’t common in the video game industry). The goal, as with any investment, is to see a return through a company’s growth.<br /> <br /> Private equity is a diverse field, with some funders choosing to take a passive role. Others are more hands-on, providing guidance and supporting management. Often, PE firms will target established but underperforming companies in the belief they can turn them around and reap big returns.<br /> <br /> Some private equity firms investing in video games include Providence Equity Partners, Insight Venture Partners, and Lightspeed Venture Partners. There are also funds that focus specifically on the video game industry: Makers Fund, Kowloon Nights, and Hiro Capital are just a few of the new groups springing up to support studios and industry innovators.<br /> <br /> <br /> <strong>Venture capital (or “VC”):</strong> Venture capital is a catch-all term for early-stage investments. It can come from private equity, financial institutions, or solo investors looking to take small, promising companies and help them reach their potential. VC is risky, but it’s becoming a common way for new businesses to secure the funding necessary for growth. VCs usually want to make sure their risk is mitigated as well as possible; that often means exercising governance control of a company through board seats and voting rights. It’s important to note that an investor may require veto power over certain salary increases, sales of intellectual property, or other significant business decisions to make up for their votes, which will likely always be in the minority compared to the founders’.<br /> <br /> Venture capital firms investing in games include Sequoia Capital, Kleiner Perkins Caufield &amp; Byers, and Andreessen Horowitz.<br /> <br /> <br /> <strong>Angel investment:</strong> Angel investors are typically experienced entrepreneurs or seasoned executives who have made their own fortunes. They typically invest in fields in which they have professional or academic experience, often seeking out companies that have cohesive business plans and strong managerial teams.<br /> <br /> Although the term "angel investor" may sound heavenly, these individuals are still looking for a return on their investment.
They just may be in a position to offer expert guidance, mentorship, and access to other high-profile funders that might otherwise be out of reach for a nascent business.<br /> <br /> <br /> <strong>Crowdfunding:</strong> While not the first video game crowdfunding project, 2012’s Double Fine Adventure (later renamed Broken Age) put Kickstarter on the map for game developers. The idea is simple: pitch your game to your audience and ask them to fund it years before it will be delivered. In practice, crowdfunding is a complicated, intense, and stressful way to seek funding, and the bubble has long since burst. <a href="https://www.polygon.com/2020/1/22/21068797/kickstarter-2019-board-games-video-games-tabletop-data-china-tariffs-trump" target="_blank">In 2019, video games earned $16.9 million on Kickstarter</a>, down from <a href="https://www.kickstarter.com/blog/the-year-of-the-game" target="_blank">more than $50 million in 2012</a>. To date, <a href="https://www.kickstarter.com/help/stats" target="_blank">more than 50,000 projects have launched in the games category</a>, with more than 20,000 successfully funding (though it is important to note that this group includes tabletop campaigns, which make up the bulk of funding in the category). Crowdfunding isn’t the magic bullet for video game funding some hoped it would be, but it is still a viable path if you have the right project.<br /> <br /> Kickstarter continues to be a popular choice for companies seeking crowdfunding, though Indiegogo offers different options, including one that still awards backer funds to projects that do not reach their targets.<br /> <br /> <br /> <strong>Crowdinvesting:</strong> The crowdinvesting model blends the grassroots community building of crowdfunding with a more traditional outlook on funding video game projects. Fig, founded in 2015, is a curated platform for seeking both crowdfunding and microinvestment. The model has seen success, with <a href="https://www.fig.co/campaigns" target="_blank">25 of 39 campaigns successfully funding</a> (64%), including Double Fine’s upcoming Psychonauts 2 and Drastic Games’ Soundfall. Of the platform’s eleven released games, five have been profitable for investors, with a sixth close to breaking even. Fig continues to evolve its model, recently expanding to “Open Access,” an ongoing funding campaign for games that are in an early access state.<br /> <br /> While some people see crowdfunding and crowdinvesting as generators of momentum, Fig founder and CEO Justin Bailey cautions against seeing either as a magic bullet for success. He suggests that these types of campaigns come after developers have traction.<br /> <br /> “In reality, we work better when we're trying to amplify what they already have,” Bailey explained. “Everybody likes to be part of a success story. Everybody likes it when momentum is already established. It's just way too hard to try to get things moving. When they come into this, they should think, ‘Hey, I'm going to bring at least half of both the traffic and the funding I need for this campaign to succeed.’”<br /> <br /> We’ll discuss crowdfunding and crowdinvesting in greater depth in a future article.<br /> <br /> <br /> <strong>Publishing:</strong> Traditional publishing is still a viable option for a number of developers.
While there are different models under this umbrella, a publisher typically provides funding early in development, offers a variety of services (marketing, public relations, quality assurance, localization, first-party relations, etc.), and allows studios to focus on making the best game possible instead of dealing with a number of non-creative concerns. We’ll be talking more about publishing in the future.<br /> <br /> <br /> <strong>Grants and Tax Credits:</strong> Your studio’s geographical location might make you eligible for certain types of funding. The <a href="https://www.cmf-fmc.ca/" target="_blank">Canada Media Fund</a> and the <a href="https://ukgamesfund.com/" target="_blank">UK Games Fund</a> are two examples of government financial support for local companies working in digital interactive media. The <a href="https://www.georgia.org/industries/film-entertainment" target="_blank">state of Georgia</a> in the United States and <a href="https://www.investquebec.com/international/en/industries/multimedia.html" target="_blank">the province of Quebec, Canada</a>, are examples of regional governments offering tax credits.<br /> <br /> Likewise, some companies offer incentives or funding opportunities for using certain tools. With $100 million set aside, <a href="https://www.unrealengine.com/en-US/megagrants" target="_blank">Epic MegaGrants</a> provides funding for a wide range of projects and creators, including game and tool developers, students and educators working in academic settings, media and entertainment creators, and more. If you’re working with Unreal Engine, or contributing open-source tools and resources for the community, applying for an Epic MegaGrant may be a good option. <div style="text-align:center"><img alt="Psychonauts2_Pic1.png" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Ffinding-funding-so-you-need-some-money-to-make-a-video-game%2FPsychonauts2_Pic1-800x450-487de667a7cf87186474267c3925b84c290c0848.png" style="height:auto; width:auto" /></div> <h2><strong>Funding is not a problem; it’s an opportunity</strong></h2> A big part of preparing to seek funding is understanding that your motives need to align with your prospective funders’. It’s easy to look at your need for cash as a problem requiring a solution, but that will only get you so far. You may convince friends and family to support your project based on need alone, but the investment community isn’t going to bite.<br /> <br /> “I see it every single day where it's like, ‘Hey man, I just need money. I don't care if it comes from VCs or publishers or banks. I just need a bag of cash,’” says Execution Labs co-founder Jason Della Rocca. “It doesn't work that way.”<br /> <br /> Della Rocca says that funders are looking for opportunities; positioning funding for your studio or game as a problem isn’t likely to work.<br /> <br /> “A person who's going to give you money is looking for an opportunity, and the opportunity looks different depending on who they are,” Della Rocca explains. “If I'm your mom, the opportunity is different than if I'm a bank. It’s different if I'm a VC. It’s different if I'm a platform or a publisher.”<br /> <br /> Shifting perspective from what Della Rocca calls a “missing money” problem to one that entices an investor with opportunity is key to being successful.
Finding those hooks starts with identifying what your team is good at, touting the talent you’re working with, and highlighting the path you’ve taken so far.<br /> <br /> That ultimately leads to a major fork in the road: project funding versus company equity financing, which we’ll cover in depth in our next piece. <div style="text-align:center"><img alt="Moss_Pic1.png" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Ffinding-funding-so-you-need-some-money-to-make-a-video-game%2FMoss_Pic1-800x450-c758ee965abce7db28a9852e685d56c7d7f527c8.png" style="height:auto; width:auto" /></div> <h2><strong>Let your principles guide your pursuit of funding</strong></h2> Finding funding can be challenging, but it’s important to remember that taking equity investment or project funding is a long-term proposition. You’re going to be working with your funder for years, and it’s crucial that you find the right match.<br /> <br /> That might even mean making the hard decision to turn down your only offer if the terms aren’t right. However, if you have a strong studio and/or a great game concept, chances are another offer isn’t too far away.<br /> <br /> Just as it’s necessary to determine what kinds of opportunities you are offering a funder, it’s important to figure out exactly what you want from a relationship with an investor, and what your core principles are when seeking funding.<br /> <br /> Polyarc (<em>Moss</em>) took a number of years to find the right funder for the studio. CEO and co-founder Tam Armstrong revealed that he and his team spent years figuring out whether they wanted equity funding or project funding.<br /> <br /> “The project funding we were able to have productive conversations around came with constraints or requirements that we were not particularly interested in,” Armstrong said. “Not to say that they're either good or bad, but we were very interested in retaining ownership of our intellectual property. At the time, the project financing options we had didn't seem to allow for that, whereas the equity investment does intrinsically allow for that, because they're investing in the IPs.”<br /> <br /> Armstrong noted that there are, of course, project funding deals that allow developers to retain control of their intellectual property. (Note that your legal counsel should review any contracts and can best advise you about IP ownership, control, and usage on a case-by-case basis.) Ultimately, Polyarc pursued equity investment, but not exclusively to retain control of its intellectual property.<br /> <br /> “Once we were talking to enough of these people, we realized that the incentives being aligned to see the success of the company just felt better to us,” Armstrong explained. “It felt better to know that the money we were taking had similar goals to ourselves, which is the long-term health of the company.”<br /> <h2><strong>Looking ahead…</strong></h2> In our next article, we’ll discuss finding funding at different stages of the development process, provide guidance on pitch materials, and share tips from funders who are out there looking for games and studios just like yours. In the future, we’ll take a closer look at crowdfunding, crowdinvesting, and publishing as options for making your game a reality.<br /> <br /> <em>Mike Futter is a freelance journalist and gaming industry consultant. He is the author of the Gamedev Business Handbook and co-host of the <a href="http://virtualeconomy.libsyn.com/" target="_blank">Virtual Economy</a> podcast.
Find out more <a href="https://www.fsquared.biz/about" target="_blank">here</a>.</em><br /> Tags: Games, Funding | By Mike Futter | Published: Tue, 26 May 2020 13:00:00 GMT | https://www.unrealengine.com/blog/finding-funding-so-you-need-some-money-to-make-a-video-game<br /> <hr /> <h1>Unearth savings up to 70% during the Unreal Engine Marketplace Spring Sale</h1> <div>With up to 70% off more than 5,000 select <a href="http://www.unrealengine.com/marketplace" target="_blank">Unreal Engine Marketplace</a> products during the Spring Sale, the Marketplace is teeming with content to make development a breeze. Now through Wednesday, June 3, discover discounted construction kits, character collections, captivating countrysides, and so much more!</div> <div style="text-align:center"><img alt="Blog_Body_Image_1.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funearth-savings-up-to-70-during-the-unreal-engine-marketplace-spring-sale%2FBlog_Body_Image_1-1640x900-ece10c1ad0f31f30609f516473434950f821d625.jpg" style="height:auto; width:auto" /></div> Uncover a treasure trove of products to help you construct lavish hotels and lively low-poly campsites, or vast solar systems with starry skyboxes and advanced sci-fi spaceships. Don’t miss out on ambient sounds and tracks to envelop your audience, explosive effects, and handy tools to tidy up your projects.<br /> <br /> <a href="http://epic.gm/eventsale" target="_blank">The sale lasts through June 3 at 11:59 PM EDT</a>. Happy shopping!<br /> Tags: Community, Events, News, Games, Marketplace | By Amanda Schade | Published: Mon, 25 May 2020 14:00:00 GMT | https://www.unrealengine.com/blog/unearth-savings-up-to-70-during-the-unreal-engine-marketplace-spring-sale<br /> <hr /> <h1>Webinar: Synchronizing Rhino data into Twinmotion in one click</h1> Did you catch our recent live webinar <strong>Synchronizing Rhino data into Twinmotion in one click</strong>? In case you missed it, the replay is available right here. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/COTjOpslp_Q" width="100%"></iframe></div> The recent release of Twinmotion 2020 brings a new Direct Link with Rhino, enabling you to synchronize your Rhino data into Twinmotion with a single click.<br /> <br /> In this webinar, Scott Davidson of McNeel and Associates shows how to prepare your model in Rhino and get it ready for Twinmotion. Next, Martin Krasemann, Twinmotion Product Specialist at Epic Games, demonstrates how to import the Rhino scene file into Twinmotion, converting native materials into Twinmotion PBR materials while retaining the scene hierarchy.
Martin enriches the scene with paths, vegetation, vehicles, and characters, taking it all the way to the final rendering.<br /> <br /> What you’ll learn:<br /> <ul style="margin-left:40px"> <li>Tips on keeping your Rhino workflow efficient when prepping your file</li> <li>How to use the new Rhino Direct Link for a faster and easier way to work with Rhino data in Twinmotion</li> <li>How to use the Twinmotion feature set to breathe life into your project</li> </ul> <br /> Looking for more learning content? Check out the full <a href="/events/webinar-series" target="_blank">webinar series</a> and visit our <a href="/onlinelearning-courses" target="_blank">Unreal Online Learning</a> portal.<br /> Tags: Architecture, Community, Design, Learning, Visualization, Twinmotion, Webinar, Rhino | Published: Thu, 21 May 2020 19:30:00 GMT | https://www.unrealengine.com/blog/webinar-synchronizing-rhino-data-into-twinmotion-in-one-click<br /> <hr /> <h1>Learn game development for free with Unreal Online Learning</h1> Looking to create your first game in Unreal Engine? Want to learn the skills needed for a <a href="/blog/prepare-for-the-jobs-of-tomorrow-with-a-new-field-guide-for-creators" target="_blank">career in real-time 3D</a>? Learn the core concepts of game creation with <a href="/onlinelearning-courses" target="_blank">Unreal Online Learning</a>’s newest game development courses.<br /> <br /> These free courses are a great start to learning the foundational skills needed for game development and design.<br /> <br /> Take the next steps in your journey toward becoming a gameplay designer, level designer, audio designer, game environment artist, lighting artist, or <a href="/tech-blog/jobs-in-unreal-engine---technical-artist" target="_blank">technical artist</a>—or learn how to make a video game of your own from start to finish.<br /> <br /> Check out the new courses below!
<h3><a href="/onlinelearning-courses/build-a-detectives-office-game-environment" target="_blank">Build a Detective’s Office Game Environment</a></h3> Create a scene from scratch, from planning and prototyping to adding the elements that will take it to an alpha state.<br /> <br /> You will learn how to:&nbsp;<br /> &nbsp; <ul style="margin-left:40px"> <li>Recreate the process for blocking out a 3D environment.</li> <li>Edit a blocked-out scene based on testing of pacing and flow.</li> <li>Use best practices to light a scene to create a believable mood.</li> <li>Apply post-processing volumes to modify the color grading and atmosphere of a scene.</li> <li>Import custom assets into Unreal Engine.</li> </ul> <img alt="News_GameDevTrackBlog_Body_Image_3.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Flearn-game-development-for-free-with-unreal-online-learning%2FNews_GameDevTrackBlog_Body_Image_3-1640x900-35c7330c0cb3af3b7c3889e82ad2c9ea14732c7c.jpg" style="height:auto; width:auto" /> <h3><a href="/onlinelearning-courses/ambient-and-procedural-sound-design" target="_blank">Ambient and Procedural Sound</a></h3> Industry experts Richard Stevens and Dave Raybould will guide you in learning the core techniques you need to get started in game audio design in Unreal Engine.<br /> <br /> You will learn how to:&nbsp;<br /> &nbsp; <ul style="margin-left:40px"> <li>Create sound cues and ambient actors that can be played and controlled in a level with Blueprint.</li> <li>Generate sounds and effects which loop, but vary in sound each loop.</li> <li>Control audio playback using Blueprints.</li> <li>Build audio systems that will play around a player or have spatialization.</li> <li>Recognize different methods for building sound into a level and when each method is appropriate.</li> </ul> <img alt="News_GameDevTrackBlog_Body_Image_2.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Flearn-game-development-for-free-with-unreal-online-learning%2FNews_GameDevTrackBlog_Body_Image_2-1640x925-5161e5134705de8a3838eb2ca40cc505bdfdd5b6.jpg" style="height:auto; width:auto" /> <h3><a href="/onlinelearning-courses/converting-blueprints-to-c" target="_blank">Converting Blueprints to C++&nbsp;</a></h3> This course takes you from a foundational knowledge of Blueprints and an understanding of the fundamentals of C++, through the process of converting a Blueprint project to C++. 
<h3><a href="/onlinelearning-courses/converting-blueprints-to-c" target="_blank">Converting Blueprints to C++</a></h3> This course takes you from a foundational knowledge of Blueprints and an understanding of the fundamentals of C++ through the process of converting a Blueprint project to C++. Over the course of that journey, you will learn the core concepts and best practices of using C++ in Unreal Engine.<br /> <br /> The final result will be an AI agent that senses the world around it, considers what to do based upon those senses, and navigates the world intelligently to reach a specified goal.<br /> <br /> You will learn how to:<br /> <ul style="margin-left:40px"> <li>Create an actor or component when appropriate.</li> <li>Find functions in C++ and determine which types map to which in Blueprint.</li> <li>Bind C++ functions to user input and delegates.</li> <li>Create UPROPERTY variables with different levels of access for the editor and Blueprint.</li> </ul> <img alt="News_GameDevTrackBlog_Body_Image_5.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Flearn-game-development-for-free-with-unreal-online-learning%2FNews_GameDevTrackBlog_Body_Image_5-1640x920-5c31c3fef4c64e6c1457e84ce16ffc155d41c519.jpg" style="height:auto; width:auto" />
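For a taste of the concepts in the bullets above, here is a short snippet showing UPROPERTY specifiers with different levels of editor/Blueprint access, plus a UFUNCTION callable from Blueprint. The class and its members (ASensingAgent, SightRadius, CanSee) are hypothetical examples in the spirit of the course, not code taken from it.

```cpp
// Hypothetical example of exposing C++ to the editor and Blueprint.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "SensingAgent.generated.h"

UCLASS()
class ASensingAgent : public AActor
{
	GENERATED_BODY()

public:
	// Editable in the editor, and readable/writable from Blueprint graphs.
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Senses")
	float SightRadius = 1500.0f;

	// Visible in the editor for debugging, but read-only in Blueprint.
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Senses")
	AActor* CurrentTarget = nullptr;

	// Callable from Blueprint, e.g. from level scripting or an input event.
	UFUNCTION(BlueprintCallable, Category = "Senses")
	bool CanSee(const AActor* Other) const
	{
		return Other != nullptr &&
			FVector::Dist(GetActorLocation(), Other->GetActorLocation()) <= SightRadius;
	}
};
```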
<div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/faYmvw_Pd-A " width="100%"></iframe></div> Planning, tools, and execution lie at the heart of any successful project. Agile teams often implement version control systems to record changes to files over time, enabling them to easily recall specific versions at a later point. This reduces risk and improves collaboration between developers.<br /> <br /> In this webinar, Technical Artist Matthew Doyle goes over the different options for source control in the Unreal Editor.<br /> <br /> You’ll learn: <ul style="margin-left:40px"> <li>The basics of using source control in Unreal Engine to manage versioning of Levels, Materials, Blueprints, C++ code, and other content</li> <li>The pros and cons of each source control option</li> <li>How to set up and use Perforce, Git, Plastic SCM, and Subversion</li> </ul> Looking for more learning content? Check out the full <a href="https://www.unrealengine.com/en-US/events/webinar-series" target="_blank">webinar series</a>&nbsp;and visit our <a href="https://www.unrealengine.com/en-US/onlinelearning-courses" target="_blank">Unreal Online Learning</a> portal.WebinarLearningArchitectureAutomotive & TransportationBroadcast & Live EventsFilm & TelevisionGamesTraining & SimulationMore UsesMon, 18 May 2020 16:08:22 GMTMon, 18 May 2020 16:08:22 GMThttps://www.unrealengine.com/blog/webinar-source-control-in-unreal-engineWith top-notch visuals and design, Lies Beneath pushes the boundaries of the Oculus Questhttps://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Fwith-top-notch-visuals-and-design-lies-beneath-pushes-the-boundaries-of-the-oculus-quest%2FDeveloper_Interview_THUMB_ALT-375x275-96371fd211f1e0d7f6c75c9b352daeb90005bcfa.jpgDeveloper Drifter shares how it executed on Lies Beneath comic-book visuals and designed the VR horror game’s environments, weapons, enemies, and more.With a <a href="https://www.oculus.com/experiences/quest/1706349256136062/?locale=en_US" target="_blank">near five-star user rating</a>, <em>Lies Beneath</em> is one of the most highly-rated VR games designed for the Oculus Quest. It’s undoubtedly one of the best visually, both from a technical and artistic standpoint with sites like <a href="https://www.useapotion.com/2020/04/lies-beneath-review/" target="_blank">Use A Potion!</a> stating that the game offers “truly impressive sights, some of which will stick with you for some time thanks to their horrific presentation.” The title exemplifies how a AAA quality experience can be tailor-made for Oculus’ mobile VR headset.<br /> &nbsp;<br /> The game was developed by VR veteran studio Drifter, the studio behind 2018’s <em><a href="http://www.driftervr.com/gunheart" target="_blank">Gunheart</a></em> and, more recently, 2019’s <em><a href="http://www.driftervr.com/roborecall" target="_blank">Robo Recall: Unplugged</a></em> for the Oculus Quest. To learn how the team developed <em>Lies Beneath</em>, we interviewed Creative Director Brian Murphy, Art Director Kenneth Scott, VFX Artist Aaron Mortensen, Concept Artist John Wallin, Audio Director Ken Kato, and Tech Director Matt Tonks. They share their inspirations and talk about how they executed on the game’s stylized comic-book aesthetic. 
They also share how they designed <em>Lies Beneath</em>’s environments, weapons, enemies, and more. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/S5FzvCWs8JI" width="100%"></iframe></div> <strong><em>Lies Beneath</em> has been compared to <em>Resident Evil 4</em>, <em>Evil Dead</em>, <em>The Thing</em>, and more. Were there any particular games or works of fiction that influenced the title?</strong><br /> <br /> <strong>Creative Director Brian Murphy:</strong> Yeah, the gameplay is heavily influenced by <em>Resident Evil</em> and <em>Silent Hill</em>, for sure. We loved the idea of taking their weird and distinctive brand of action horror and translating it into an immersive VR experience. And then, obviously, when it came to the world and aesthetic of the game, we drew heavily from classic mid-century horror comics like <em>Tales from the Crypt</em>, and more modern Japanese horror comics like the works of Junji Ito.<br /> <br /> <strong>Why was <em>Lies Beneath</em>’s comic-book art style a good fit for the game?</strong><br /> <br /> <strong>Art Director Kenneth Scott:</strong> Every narrative-driven experience, games, films, or otherwise, owes its core to the written word. Comics, as a medium, strikes a great chord between the artist’s playground of invention and the reader’s agency and personal interpretation. We’ve all experienced the emotional clash of seeing our favorite childhood illustrated characters brought to film. It’s rough. There is rarely an upside. A lot of the character nuances, timing, and voices took place in our heads, and no Hollywood budget or smart casting could ever compete.<br /> <img alt="DeveloperInterview_Phantom_15.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Fwith-top-notch-visuals-and-design-lies-beneath-pushes-the-boundaries-of-the-oculus-quest%2FDeveloperInterview_Phantom_15-1080x618-cb890b1dcda5d641848743a5f28b4e5e941d4bfa.jpg" style="height:auto; width:auto" /><br /> <strong>To bolster <em>Lies Beneath</em>'s comic-book motif, the game features page-turning comic-book illustrations with an awesome 3D parallax effect that's especially cool when you view them from different angles. How did you create this effect?</strong><br /> <br /> <strong>Scott:</strong> This was quality Drifter-team shenanigans. There is some under-the-hood wizardry that Brian Murphy and Matt Tonks did to make the interactive experience invisible and intuitive, and to quote our Tech Art Czar Drew Hunt, "Through Unreal Engine, we leveraged dark magic that split pixels into an illusion of depth."<br /> <br /> <strong>VFX Artist Aaron Mortensen:</strong> On the <a href="https://docs.unrealengine.com/Engine/Rendering/Materials/index.html" target="_blank">materials</a> side, most of the grunt work was done using the bump offset node, which let us create an illusion of depth on a single mesh plane and kept animation complexity and overdraw down. By splitting up John and Kenneth’s awesome comic art into separate layers, we could push and pull those layers however we wanted using the bump offset, and then merge them into a final image that appears to have depth.
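For readers curious what the bump offset node is actually doing: it offsets a texture’s UV coordinates along the tangent-space view direction in proportion to a height (here, layer depth) value, so deeper layers appear to slide as the view angle changes. Below is a rough C++ rendition of that math, purely for illustration; in the game this lives in the Material graph, and the function and parameter names are invented.

```cpp
// Illustrative math behind a bump-offset (parallax) layer shift; in practice
// this is the Material editor's BumpOffset node, not C++ code.
#include "Math/Vector.h"
#include "Math/Vector2D.h"

// Shift a comic layer's UVs based on the camera direction in tangent space.
// LayerDepth: how deep this layer sits behind the page surface (0 = on it).
// HeightRatio: overall strength of the parallax effect.
FVector2D BumpOffsetUV(const FVector2D& UV,
                       const FVector& TangentSpaceViewDir,
                       float LayerDepth,
                       float HeightRatio = 0.05f)
{
	// Only the components parallel to the surface (x and y in tangent space)
	// shift the UVs; deeper layers slide further as the view angle changes.
	const FVector2D ViewXY(TangentSpaceViewDir.X, TangentSpaceViewDir.Y);
	return UV + ViewXY * (LayerDepth * HeightRatio);
}
```

Sampling each of the art layers with its own depth value and compositing them front to back is what produces the parallax effect described above.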
<strong>Concept Artist John Wallin:</strong> We started with rough sketches of each page, thinking about where things would be placed in depth. We had three layers to play with, and the frame/foreground would go first, then the middle ground, and then the background. Sometimes things like snow or a character holding something, or even a car interior, would benefit from being separated and placed across multiple layers.<br /> &nbsp;<br /> Once the testing phase was approved, it was time to make fully detailed paintings out of the rough sketches. I stuck to the layer order as best I could so I wouldn’t run into nasty surprises afterward. Having no experience with comic-book art, I tried to mimic a classic comic book while taking advantage of the liberating third dimension. It was hard to figure out the layer orders, but with help from the team and thorough testing, it was pretty straightforward during the polishing phase, and I could focus on just making it look as good as I could.<img alt="DeveloperInterview_Phantom_04.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Fwith-top-notch-visuals-and-design-lies-beneath-pushes-the-boundaries-of-the-oculus-quest%2FDeveloperInterview_Phantom_04-1080x608-08a2e4af70540e43c4b51c81b619b6ac60bf3849.jpg" style="height:auto; width:auto" />&nbsp;<br /> <strong>From creatures that fly to ghouls that stalk you on foot, the monsters in <em>Lies Beneath</em> are menacing. How did you approach designing the game's enemies?</strong><br /> &nbsp;<br /> <strong>Audio Director Ken Kato:</strong> Creature sound design was the hardest part about this game. I tried to enhance each creature’s unique qualities and gameplay role. For example, the chase sounds of the Hunter make it sound like he’s breathing down your neck, because that’s what excites people when faced with the Hunter. The head-crab sounds were punched up with exaggerated footstep sounds to communicate to the player that these creepy creatures are crawling everywhere. For bipeds, I was initially going to push the creepiness factor with their vocal emotes, but I soon realized that when you’re engaged with these characters, the situation is no longer creepy, and you’re in combat. So I approached it more like designing NPC emotes for shooters.<br /> <img alt="DeveloperInterview_Phantom_07.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Fwith-top-notch-visuals-and-design-lies-beneath-pushes-the-boundaries-of-the-oculus-quest%2FDeveloperInterview_Phantom_07-1080x608-cc627c89c29a43468901d10f454a91a00b3bb8f7.jpg" style="height:auto; width:auto" /><br /> <strong>Drifter has stated that the game's massive boss battles were created almost entirely using <a href="https://docs.unrealengine.com/Engine/Blueprints/index.html" target="_blank">Blueprints</a>. Can you walk us through how you created these larger-than-life encounters?&nbsp;</strong><br /> <br /> <strong>Tech Director Matt Tonks: </strong>Blueprints are very well suited to something like a boss battle. Being able to rapidly iterate and make use of deep ties into the level scripting so easily was a huge asset when we were building these fights. For our boss battles, we typically built a prototype (in Blueprint) in a small test environment to work out how the core mechanics of the fight would play out, and then moved it to the real environment where we brought everything together with VFX, audio, and more.
Each fight was very different, so the specifics of building them varied, but the process was always:&nbsp;<br /> &nbsp; <ul style="margin-left:40px"> <li>Prototype what the player actually does in the fight</li> <li>Move functionality into the level and polish all the rough edges to make things cohesive for the player</li> </ul> <img alt="DeveloperInterview_Phantom_14.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Fwith-top-notch-visuals-and-design-lies-beneath-pushes-the-boundaries-of-the-oculus-quest%2FDeveloperInterview_Phantom_14-1080x601-3488123255118328a40a2aba19ba72793ddb75d0.jpg" style="height:auto; width:auto" /><br /> <strong>The ominous environments in the game's mid-twentieth-century depiction of Alaska have been praised for delivering a palpable and consistent level of tension. How did you approach world and level design?&nbsp;</strong><br /> &nbsp;<br /> <strong>Murphy:</strong> The first iteration of the level design was actually more like a classic <em>Resident Evil</em> game, where a given chapter might have been broken into three or four smaller “rooms” that, at the time, even had loading screens between them. It wasn’t until months into the game that we decided to stitch the entire game world together into one giant seamless map. That process actually ended up being a really great framework for us, because it meant that the entire game was broken into 20- to 30-meter chunks, and for each section, we got to ask ourselves the same questions:&nbsp;<br /> &nbsp; <ul style="margin-left:40px"> <li>What is the player’s mode of play in this section? (spooky haunted house, sneaky, action, puzzling)&nbsp;</li> <li>What will be scary or memorable in this section? (set-piece, new monster, cinematic moment, boss, new weapon, etc.)</li> <li>What hidden secrets and narrative beats do we need to unfold here?</li> </ul> <strong>The game features a variety of melee weapons and guns that include axes, revolvers, shotguns, and more. How did you approach designing them along with the game's combat system?</strong><br /> &nbsp;<br /> <strong>Murphy:</strong> Building a combat system in a horror game is always an interesting challenge because it’s kind of the inverse of your standard video game power fantasy. Instead of building your skill to a level where you can effortlessly wipe out everything that gets in your way, we wanted to create a combat system that often made you feel vulnerable and helpless. One big way we did that was by building little flaws into every weapon that required some kind of VR motion control interaction to overcome. Melee weapons occasionally get stuck in the things you hit, and can even get yanked from your hand from time to time. Bear traps dangle heavily from your hands and are more difficult to throw, and each gun requires its own physical interaction to reload.&nbsp;<br /> <br /> Of course, we didn’t want the game to be pure desperation, so we also added quite a few systems that rewarded players for improvising or demonstrating skill or effort. For melee weapons, we reward bigger, harder swings with more damage, and we did a ton of work on the throwing system so that weapons always go where you feel like they should when you release them. (If we left throwing completely up to physics, the thrown item would tumble off in a hilariously random direction.)
We also paired the gun with the lighter, so that if you held the lighter up while aiming down the sight of the gun, we’d show you a little reticle and vulnerable spots on an enemy, but all of those bonuses come at a cost. Holding the lighter instead of a melee weapon makes you more vulnerable up close, and throwing a melee weapon will leave you empty-handed. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/jLGckK5MLjk" width="100%"></iframe></div> <strong>In addition to providing an aiming reticle for guns and brightening dark environments, the lighter also lights the correct path forward. How did you come up with this mechanic?&nbsp;</strong><br /> &nbsp;<br /> <strong>Murphy:</strong> From the very beginning, we knew that giving the player an item that revealed hidden information and spooky details would be cool, especially in VR. We actually prototyped lots of different objects. My favorite, before the lighter, was a hoop that the player could hold up and look through to reveal an alternate reality. That felt cool in VR, as it had a nice physical and understandable element to it, but it also felt like it didn’t fit into the world we were creating. When the idea for the lighter came along, it felt perfect because it was a little more vague and unreliable than the magic disk, and we felt it was a scarier tool to use. It also introduced all kinds of fun physical interaction, like flicking it on and watching the embers, that made it a really interesting VR experience as well.<br /> &nbsp;<br /> <strong><em>Lies Beneath</em> features a very minimal UI. The blood on your hands indicates your health, and holsters across your virtual body grant quick access to weapons. Was there a lot of iteration involved here?&nbsp;</strong><br /> &nbsp;<br /> <strong>Murphy: </strong>We knew from the beginning that we didn’t want to have any health bars or anything. We tried as much as we could to always place pertinent gameplay information in physical context with game objects in the world. That’s why the number of bullets in a gun can only be seen while reloading, when you’re looking at the gun, and your health can only be seen via the red splatters on your hands. By providing incomplete information on your game state, or forcing the player to split their attention between checking their combat readiness and the encounter happening in front of them, we hoped to make every encounter more frantic and more immersive.<br /> <img alt="DeveloperInterview_Phantom_12.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Fwith-top-notch-visuals-and-design-lies-beneath-pushes-the-boundaries-of-the-oculus-quest%2FDeveloperInterview_Phantom_12-1080x608-0f18451512c7307552a76fcce7fea47a9a6d285c.jpg" style="height:auto; width:auto" /><br /> <strong>With this being Drifter Entertainment's third VR game, what have you learned from those experiences that you've built upon for <em>Lies Beneath</em>?</strong><br /> <br /> <strong>Tonks:</strong> <em>Gunheart</em> and <em>Robo</em> turned out to be a great combo for us. We cut our teeth on VR interactions and gameplay mechanics in <em>Gunheart</em> and got deep experience with the Quest on <em>Robo Recall: Unplugged</em>.
<em>Gunheart</em> helped us build a more interesting VR experience, and <em>Robo</em> helped us build it at framerate on the more modest resources of the Quest.<br /> &nbsp;<br /> <strong><em>Lies Beneath</em> is one of the most visually stunning games on the Oculus Quest and runs well, too. Can you provide some insight into how the team was able to strike this graphical balance?</strong><br /> <br /> <strong>Tonks: </strong>At this point, Drifter collectively has a huge amount of experience with the Quest and what it’s capable of. Based on that prior knowledge, we knew from day one how many enemies on screen we could handle, how complex our shaders could be, and how many things can be drawn at once before the frame rate dips. That said, I think there are three main points that summarize the bulk of our efforts in this regard:<br /> &nbsp; <ul style="margin-left:40px"> <li>Selecting an art style that we were confident we could make look awesome within the constraints of the platform</li> <li>Leveraging (and modifying) the <a href="https://docs.unrealengine.com/Engine/HLOD/index.html" target="_blank">HLOD</a> system to merge draw calls down in order to get the environments looking the way we wanted without blowing the budget</li> <li>Switching to <a href="https://docs.unrealengine.com/Platforms/Mobile/Android/VulkanMobileRenderer/index.html" target="_blank">Vulkan</a> in order to bring down rendering overhead and enable visual effects like tone mapping (via subpasses) while still staying within budget</li> </ul> <img alt="DeveloperInterview_Phantom_05.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Fwith-top-notch-visuals-and-design-lies-beneath-pushes-the-boundaries-of-the-oculus-quest%2FDeveloperInterview_Phantom_05-1080x608-82ef41afa32ecb190034b07af047edde814913a3.jpg" style="height:auto; width:auto" /><br /> <strong>From scary monster moans to eerie musical compositions, <em>Lies Beneath</em> features awesome audio design. Can you elaborate on how this was achieved?</strong><br /> &nbsp;<br /> <strong>Kato: </strong>I touched a little bit on sound design earlier, so I’ll talk about music. We had three composers: Kazuma Jinnouchi (<em>Metal Gear Solid 4, Halo 4, Halo 5, Pokemon Detective Pikachu</em>), Richard Williams, and myself. I basically assigned different areas and uses of music to different composers. Kazuma worked on large combat music. Richard did minor battles. And I handled the rest, which included the menu music, incidental stingers, and cut scenes. In terms of production, I personally found horror music to be really hard to execute. Dissonance is hard, especially when you don’t have access to live musicians, since we’re an indie company with limited resources. So, we relied on commercially available samples. At the beginning of the project, I immediately found out that playing random notes on a keyboard doesn’t necessarily produce “dissonant” music. So we had to make up some rules in order to make the pieces dissonant but cohesive. It was really fun to work on horror-themed music. I treated it as a once-in-a-lifetime opportunity, but I hope to do it again in the future.<br /> &nbsp;<br /> <strong>Thanks for your time.
Where can people learn more about the game?</strong><br /> &nbsp;<br /> Right on the Oculus store pages: <a href="https://www.oculus.com/experiences/quest/1706349256136062/" target="_blank">Quest</a> and <a href="https://www.oculus.com/experiences/rift/3567174723300615/" target="_blank">Rift</a>.lies beneathOculus QuestDrifter EntertainmentUnreal EngineUE4GamesArtBlueprintsCommunityDesignVRJimmy ThangMon, 11 May 2020 18:30:00 GMTMon, 11 May 2020 18:30:00 GMThttps://www.unrealengine.com/developer-interviews/with-top-notch-visuals-and-design-lies-beneath-pushes-the-boundaries-of-the-oculus-questInside Xbox event reveals wave of new Unreal Engine-powered gameshttps://cdn2.unrealengine.com/Unreal+Engine%2Fevents%2Finside-xbox-event-reveals-wave-of-new-unreal-engine-powered-games%2FInsideXbox_THUMB_ALT-375x275-212a3b61a71289a428042fbd3127e41a80329ad4.jpgWith games like Bright Memory: Infinite, Call of the Sea, Chorus, and more, the majority of newly revealed titles shown running on Xbox Series X during Microsoft’s gaming event are powered by Unreal Engine.&nbsp;Microsoft recently unveiled several next-generation games running on Xbox Series X during its <a href="https://www.youtube.com/watch?time_continue=9&amp;v=XCTPu1aUsTE&amp;feature=emb_title" target="_blank">Inside Xbox livestream</a>, the first in its Xbox 20/20 series. The majority of the newly revealed titles are created with Unreal Engine. Check out the full list of exciting new UE-powered games along with their descriptions and trailers below.&nbsp; <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/xtwPMBdgPGE" width="100%"></iframe></div> <strong><em><a href="https://www.xbox.com/games/bright-memory-infinite" target="_blank">Bright Memory: Infinite</a></em> | FYQD-Studio/Playism</strong><br /> <br /> Unreal Dev Grant recipient <em>Bright Memory: Infinite</em> is an all-new lightning-fast fusion of the FPS and action genres created by FYQD-Studio. Combine a wide variety of skills and abilities to unleash dazzling combo attacks. <em>Bright Memory: Infinite</em> is set in a sprawling, futuristic metropolis in the year 2036. A strange phenomenon that scientists can’t explain has occurred in the skies around the world. The Supernatural Science Research Organization (SRO) has sent agents to various regions to investigate this phenomenon. It is soon discovered that these strange occurrences are connected to an archaic mystery – an as-yet-unknown history of two worlds that’s about to come to light.<br /> &nbsp; <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/6lq25iJbmMY" width="100%"></iframe></div> <strong><em><a href="https://www.xbox.com/games/call-of-the-sea" target="_blank">Call of the Sea</a></em> | Out of the Blue/Raw Fury</strong><br /> <br /> <em>Call of the Sea</em> is an otherworldly adventure game set in the 1930s South Pacific.
Explore a lush island paradise and puzzle out the secrets of a lost civilization in the hunt for your husband’s missing expedition.<br /> &nbsp; <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/gnRy-waJBjA" width="100%"></iframe></div> <strong><em><a href="https://www.xbox.com/games/chorus" target="_blank">Chorus</a></em> | Deep Silver Fishlabs/Deep Silver</strong><br /> <br /> <em>Chorus</em> is a new space-flight combat shooter releasing in 2021. Become Nara and Forsaken, her sentient starfighter, on a compelling, personal journey of redemption. Unlock devastating weapons and mind-bending abilities in a true evolution of the space-combat shooter. Explore breathtaking interstellar vistas, ancient temples, and venture beyond our waking reality. Outgun, outwit, and outmaneuver your enemies in an epic quest to free the galaxy from oppression.&nbsp;<br /> &nbsp; <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/fyCdiZ10Tgc" width="100%"></iframe></div> <strong><em><a href="https://en.bandainamcoent.eu/scarlet-nexus/scarlet-nexus" target="_blank">SCARLET NEXUS</a></em> | Bandai Namco Studios/Bandai Namco Entertainment</strong><br /> <br /> In a far distant future, humanity’s last hope falls into the hands of an elite group of psionic soldiers, who battle an invincible threat known as Others. Unravel the mysteries of a Brain-Punk future caught between technology and psychic abilities in <em>SCARLET NEXUS</em>.<br /> &nbsp; <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/i9eg7Pzk15g" width="100%"></iframe></div> <strong><em><a href="https://scorn-game.com/" target="_blank">Scorn</a></em> | Ebb Software&nbsp;</strong><br /> <br /> <em>Scorn</em> is an atmospheric first-person thriller adventure currently in development by Ebb Software.<br /> &nbsp; <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/32gaWV-5hXg" width="100%"></iframe></div> <strong><em><a href="https://www.xbox.com/games/the-ascent" target="_blank">The Ascent</a></em> | Neon Giant/Curve Digital</strong><br /> <br /> <em>The Ascent</em> is a solo and co-op action RPG set in a cyberpunk world created by Unreal Dev Grant recipient Neon Giant. The mega-corporation that owns you and everyone, The Ascent Group, has just collapsed. Can you survive without it?<br /> &nbsp; <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/Yw2DGm280Pg" width="100%"></iframe></div> <strong><em><a href="http://www.themediumgame.com/" target="_blank">The Medium</a></em> | Bloober Team</strong><br /> <br /> <em>The Medium</em> is a next-gen psychological-horror game, featuring a “dual” soundtrack by Akira Yamaoka and Arkadiusz Reikowski.<br /> <br /> Become a medium living in two worlds: the real and the spirit. Haunted by a vision of a child’s murder, you travel to an abandoned hotel resort, which years ago became the stage of an unthinkable tragedy.
There you begin your search for difficult answers.<br /> &nbsp; <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/5MzIbWbMFi0" width="100%"></iframe></div> <strong><a href="https://www.bloodlines2.com/" target="_blank"><em>Vampire: The Masquerade – Bloodlines 2</em></a> | Hardsuit Labs/Paradox Interactive</strong><br /> <br /> Enter the World of Darkness and rise through vampire society. Experience Seattle: a city full of alluring, dangerous characters and factions. You are dead now, but you are stronger, quicker, more alluring, and have the potential for much more. Choose to be brutal and unflinching or cultured and seductive. Use charm, cunning, terror, and sheer will to rise through vampire society. What monster will you be? <hr />Interested in developing for next-gen? We’ve added initial support for Sony’s PlayStation 5 and Microsoft’s Xbox Series X consoles in the recently released Unreal Engine 4.25. Learn more <a href="https://www.unrealengine.com/blog/unreal-engine-4-25-released" target="_blank">here</a>.<br /> &nbsp;inside xboxnext-gen gamesUnreal EngineUE4Xbox Series XGamesCommunityNewsJimmy ThangThu, 07 May 2020 21:30:00 GMTThu, 07 May 2020 21:30:00 GMThttps://www.unrealengine.com/events/inside-xbox-event-reveals-wave-of-new-unreal-engine-powered-gamesExploring the depths of the new Sky Atmosphere system https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fexploring-the-depths-of-the-new-sky-atmosphere-system%2FTHUMBNAIL_SkyAtmosphere_Presentation-375x275-9925b72cd2ec6cd27276b7ee383ff168eb2403d6.jpgUsing Quixel Megascans assets, Epic's Sjoerd De Jong breaks down how to render a beautiful and fully dynamic sky while showcasing atmospheric changes and powerful Material Editor integration.“Exploring the depths of the new Sky &amp; Atmosphere system” is a new hands-on presentation in which Epic’s Sjoerd De Jong uses <a href="https://quixel.com/" target="_blank">Quixel Megascans</a> assets to explore the robust Sky Atmosphere system that’s now available in <a href="https://www.unrealengine.com/en-US/blog/unreal-engine-4-25-released" target="_blank">Unreal Engine 4.25</a>.&nbsp;<br /> &nbsp; <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/SeNM9zBPLCA" width="100%"></iframe></div> <br /> Starting off with the basics, you'll learn how to render a beautiful and fully dynamic sky within seconds. From there, the session showcases how to create alien, dusty, or wet atmospheres, after which you'll be taken all the way up into space to witness how atmosphere rendering changes with altitude. Finally, explore the powerful Material Editor integration and how it can be used to create different kinds of sky styles.
The session wraps up with a brief look at the engine’s upcoming volumetric cloud features.<br /> <br /> For more information on the Sky Atmosphere system, please visit <a href="https://docs.unrealengine.com/en-US/Engine/Actors/FogEffects/SkyAtmosphere/index.html" target="_blank">this documentation page</a>.<br /> <br /> You can also learn more about Quixel Megascans’ mission and the future of the Sky Atmosphere system in <a href="https://youtu.be/oDQl9gw_fRM" target="_blank">this short video</a>.GamesNewsQuixelThu, 07 May 2020 18:00:00 GMTThu, 07 May 2020 18:00:00 GMThttps://www.unrealengine.com/blog/exploring-the-depths-of-the-new-sky-atmosphere-systemIke develops virtual simulator for automated trucks with Unreal Enginehttps://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fike-develops-virtual-simulator-for-automated-trucks-with-unreal-engine%2FSpotlight_IKE_Thumbnail-375x275-6449835fd2a65d448894aa09496ee049551401e6.jpgWith a goal of automating long-distance freight transportation in partnership with the trucking industry, Ike is developing a virtual simulator by building on Unreal Engine’s out-of-the-box AI and simulation features.Based in Silicon Valley, automation technology company <a href="https://www.ike.com/" target="_blank">Ike</a> is currently working in partnership with the trucking industry to automate long-distance freight transportation. The team, which is made up of a number of veterans of the self-driving industry—alumni of companies including Uber, Waymo, Tesla, Apple, and Cruise—shares a common belief that moving goods instead of people massively simplifies the technical challenges for automated vehicles.&nbsp;<br /> <br /> With decades of experience in robotics, the team is confident that it can bring a safe, scalable solution to market. Proving that safety aspect remains the key challenge, as Simulation Lead Pete Melick explains. “The rates of difficult events that you encounter driving many miles over the highway are very low,” he says. “To prove with statistical confidence that your truck is going to be resilient in any different combination of extreme things that could happen—but don't happen very often to normal trucks—is quite difficult. It would require driving literally tens of millions of miles.” <h2>A hybrid approach to simulation</h2> Instead, the team elected to use simulation as its primary validation tool. There are two types of simulation for autonomous driving, each with its pros and cons. By using both methods, Ike is able to get the best of both worlds.<br /> <br /> <strong>Log simulation</strong> means feeding data from real driving into the automation system. The advantage is that logged data contains all of the nuanced imperfections of real sensors and real interactions; the disadvantage is that closed-loop control is not possible. In other words, the scene recorded in a real-world log can’t change in response to the behavior of the simulated vehicle.
If the automated truck decides to drive slower, other vehicles in the simulation will not react to that change, and the simulation loses a lot of its value.&nbsp;<br /> <img alt="Spotlight_IKEBlog_Body_Image_3.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fike-develops-virtual-simulator-for-automated-trucks-with-unreal-engine%2FSpotlight_IKEBlog_Body_Image_3-1640x900-f2cbc0c26667d574a4dd451bf3adaf2aefd142aa.jpg" style="height:auto; width:auto" /> <div style="text-align:center"><em>A log sim where the simulated vehicle’s behavior (blue) diverges from that of the logged vehicle (red)</em></div> <br /> <strong>Virtual simulation</strong> uses fabricated scenarios and responsive Actors, like a video game. Unlike log simulation, other vehicles in a virtual scenario can respond to behavior from the automation system. The unit of a virtual sim is called a <em>scenario</em>, a specific event that the truck might encounter while driving on the highway. Virtual simulation unlocks the possibility of procedurally varying the parameters that define a scenario to create many similar but distinct scenarios, each of which is known as a <em>variation</em>. Variations enable Ike to test its automation software in a vast array of possible circumstances.&nbsp;<br /> <br /> Ike had an advantage in its simulation development through its prior acquisition of a code base from another automated driving company, <a href="https://medium.com/nuro/ike-hits-the-road-fec294de7a5c" target="_blank">Nuro</a>. The result of two years of development effort, the code base constituted what Melick describes as a world-class infrastructure for robotics, including essential technologies like visualization, mapping, on-board infrastructure, cloud infrastructure, machine learning, labeling pipelines, and log simulation. The team took this as its starting point, and then added its members’ own expertise in the domain. <h2>Designing scenarios in Unreal Engine</h2> To develop its virtual simulation tool, the team turned to Unreal Engine, spending a year extending the many relevant out-of-the-box features for its specific needs.&nbsp;<br /> <br /> Melick explains that while everyone working on virtual simulation recognizes the benefits of closed-loop control and variations, not everyone is implementing them in the same way.&nbsp;<br /> <br /> “The way that we have differentiated ourselves is the level to which we have embraced the game engine as a key component of our virtual simulation tool, as opposed to some other companies who are using a game engine—maybe even using Unreal Engine—but are kind of keeping it at arm's length and using it as just an image generator in conjunction with another simulation tool,” he says.<br /> <br /> As a first step in that process, Ike customized the Unreal Engine Level Editor to be its scenario design tool.&nbsp;<br /> Ike’s trucks calculate their position in the world using high-definition maps, consisting of LiDAR intensity and elevation data, which is collected and processed into lane centers and boundaries. That same map data is streamed into Unreal Engine using the Landscape API, so that the team can design and run their scenarios on it.<br /> <img alt="Spotlight_IKEBlog_Body_Image_4.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fike-develops-virtual-simulator-for-automated-trucks-with-unreal-engine%2FSpotlight_IKEBlog_Body_Image_4-1640x896-faf0a745f6ae316cfed258eaedc2bdb007454697.jpg" style="height:auto; width:auto" /> <div style="text-align:center"><em>Designing a scenario in the Unreal Editor on Ike’s maps</em></div> <br /> The automation system requires higher-resolution map data than is easily found in open-source data formats; to capture the necessary data, Ike uses a special mapping vehicle fitted out with two LiDAR scanners and physically drives it down the highway. This makes the company completely self-sufficient, giving it the power to simulate anywhere it can drive its mapping vehicle.<br /> <br /> Once the maps are imported, most of the building blocks for scenario design are available out of the box: triggers based on time or distance, splines for Actors to follow, an efficient environmental query system, a fully featured and customizable GUI, and a scripting language for designing arbitrarily complex choreographies.&nbsp;<br /> <br /> “Game engines are tools for building living, breathing worlds—levels in which the player makes decisions and the world reacts realistically,” says Melick. “That's all a scenario is, except the player is a robot.”<br /> <br /> How should that robot control its avatar in the simulator? And how should the simulator provide the necessary inputs to the robot? Ike’s automation software consists of code modules, which communicate with each other using <a href="https://developers.google.com/protocol-buffers" target="_blank">Google Protocol Buffer</a> messages. The team has made some small modifications to the engine to enable it to send and receive those same messages. Any Actor or Component in the simulator can publish or subscribe to any message, just like the onboard software modules.&nbsp;
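<br /> <br /> To give a flavor of what that pattern enables, the sketch below shows one publish and one subscribe in C++. The message type, bus, and topic name are hypothetical stand-ins (Ike's actual plumbing isn't public), while SerializeToString and ParseFromString are the standard protobuf C++ API:<br />
<pre>
#include "brake_command.pb.h" // hypothetical generated protobuf header
#include <functional>
#include <string>

// Stand-in for the engine-side message layer described above.
struct SimMessageBus
{
    void Publish(const std::string& Topic, const std::string& Bytes);
    void Subscribe(const std::string& Topic,
                   std::function<void(const std::string&)> Handler);
};

void PublishBrake(SimMessageBus& Bus, double BrakePct)
{
    ike::BrakeCommand Cmd;        // hypothetical message type
    Cmd.set_brake_pct(BrakePct);

    std::string Wire;
    Cmd.SerializeToString(&Wire); // protobuf message to bytes
    Bus.Publish("vehicle/brake", Wire);
}

void ListenForBrake(SimMessageBus& Bus)
{
    Bus.Subscribe("vehicle/brake", [](const std::string& Wire)
    {
        ike::BrakeCommand Cmd;
        if (Cmd.ParseFromString(Wire)) // bytes back to a protobuf message
        {
            // Apply Cmd.brake_pct() to the simulated vehicle here.
        }
    });
}
</pre>
Because the simulator speaks the same wire format as the onboard modules, a simulator-side Actor or Component can stand in for real software on either end of the conversation.<br />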
That same map data is streamed into Unreal Engine using the Landscape API, so that the team can design and run their scenarios on it.<br /> <img alt="Spotlight_IKEBlog_Body_Image_4.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fike-develops-virtual-simulator-for-automated-trucks-with-unreal-engine%2FSpotlight_IKEBlog_Body_Image_4-1640x896-faf0a745f6ae316cfed258eaedc2bdb007454697.jpg" style="height:auto; width:auto" /> <div style="text-align:center"><em>Designing a scenario in the Unreal Editor on Ike’s maps</em></div> <br /> The automation system requires higher-resolution map data than is easily found in open-source data formats; to capture the necessary data, Ike uses a special mapping vehicle fitted out with two LiDAR scanners and physically drives it down the highway. This makes the company completely self-sufficient, giving it the power to simulate anywhere it can drive its mapping vehicle.<br /> <br /> Once the maps are imported, most of the building blocks for scenario design are available out of the box: triggers based on time or distance, splines for Actors to follow, an efficient environmental query system, a fully featured and customizable GUI, a scripting language for designing arbitrarily complex choreographies.&nbsp;<br /> <br /> “Game engines are tools for building living, breathing worlds—levels in which the player makes decisions and the world reacts realistically,” says Melick. “That's all a scenario is, except the player is a robot.”<br /> <br /> How should that robot control its avatar in the simulator? And how should the simulator provide the necessary inputs to the robot? Ike’s automation software consists of code modules, which communicate with each other using <a href="https://developers.google.com/protocol-buffers" target="_blank">Google Protocol Buffer</a> messages. The team has made some small modifications to the engine to enable it to send and receive those same messages. Any Actor or Component in the simulator can publish or subscribe to any message, just like the onboard software modules.&nbsp; <div style="text-align:center"><img alt="side_by_side_optimized.gif" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fike-develops-virtual-simulator-for-automated-trucks-with-unreal-engine%2Fside_by_side_optimized-00bfe7287be6f77f706281b4f25b918676471df5.gif" style="height:auto; width:auto" /><br /> <em>A scenario playing in Ike’s log viewer (left) and simulator (right). Animation courtesy of Ike.</em></div> <br /> In the current setup, the simulator publishes mock object detections, which are fed to the tracking software, and subscribes to steering, throttle, and brake commands, which control the motion of the simulated vehicle.<br /> <br /> Ike has also used Unreal Engine's AI Perception system to add some intelligent behaviors to its simulated agents. For example, while following a mapped lane or a predetermined spline, they can detect an obstacle in their path and use an IDM-based speed controller to avoid a collision. <h2>Harnessing the power of Blueprint</h2> To enable designers to extend the range of scenarios, the team exposes functionality to <a href="https://docs.unrealengine.com/en-US/Engine/Blueprints/index.html" target="_blank">Blueprint</a>, Unreal Engine’s visual scripting system.<br /> <br /> “We know that as fast as we work, we can never outpace the imaginations of our scenario designers,” says Melick. 
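<br /> <br /> IDM here refers to the Intelligent Driver Model, a widely published car-following law. As a self-contained illustration of what such a speed controller computes (the parameter values below are textbook defaults, not Ike's tuning):<br />
<pre>
#include <cmath>

// Intelligent Driver Model (IDM): a standard car-following law that
// yields smooth gap-keeping. Returns longitudinal acceleration (m/s^2).
struct IdmParams
{
    double DesiredSpeed = 27.0; // v0: free-flow speed (m/s)
    double TimeHeadway  = 1.5;  // T: desired time gap (s)
    double MinGap       = 2.0;  // s0: standstill gap (m)
    double MaxAccel     = 1.0;  // a: maximum acceleration (m/s^2)
    double ComfortDecel = 2.0;  // b: comfortable braking (m/s^2)
    double Delta        = 4.0;  // free-road exponent
};

double IdmAcceleration(const IdmParams& P,
                       double Speed,        // own speed (m/s)
                       double Gap,          // distance to leader (m), must be positive
                       double ClosingSpeed) // own speed minus leader speed (m/s)
{
    // Desired dynamic gap s*: grows with speed and with the closing rate.
    const double SStar = P.MinGap + Speed * P.TimeHeadway
        + (Speed * ClosingSpeed) / (2.0 * std::sqrt(P.MaxAccel * P.ComfortDecel));

    const double FreeRoad    = std::pow(Speed / P.DesiredSpeed, P.Delta);
    const double Interaction = (SStar / Gap) * (SStar / Gap);

    return P.MaxAccel * (1.0 - FreeRoad - Interaction);
}
</pre>
In a scenario, an agent would evaluate something like this each tick against the obstacle its perception has flagged, then convert the result into throttle or brake input.<br />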
<h2>Harnessing the power of Blueprint</h2> To enable designers to extend the range of scenarios, the team exposes functionality to <a href="https://docs.unrealengine.com/en-US/Engine/Blueprints/index.html" target="_blank">Blueprint</a>, Unreal Engine’s visual scripting system.<br /> <br /> “We know that as fast as we work, we can never outpace the imaginations of our scenario designers,” says Melick. “We need to help them create scenarios in efficient and repeatable ways.”&nbsp;<br /> <br /> Using Blueprint, designers create new behaviors and choreography. For example, they can make an Actor weave left and right about its lane with a parameterized period and amplitude. Or they can add simulated noise to the detections fed to the autonomy software to test its sensitivity to imperfect inputs. They can even use Blueprint to create a keyboard-controlled Actor to interact with the simulated truck—all without writing a line of C++.<br /> <img alt="Spotlight_IKE_Blog_Body_Image_1.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fike-develops-virtual-simulator-for-automated-trucks-with-unreal-engine%2FSpotlight_IKE_Blog_Body_Image_1-1640x900-5d877df8f49d903680fae2f10544cb0b96668226.jpg" style="height:auto; width:auto" /> <div style="text-align:center"><em>Adding noise to simulated autonomy inputs in Blueprint</em></div> <br /> Blueprint is also the key to Ike’s variations system. A designer adds a variable parameter by adding a Component to an Actor and implementing a Blueprint function that defines the effect of varying that parameter.&nbsp;<br /> <br /> “In this way, we can vary just about any property of a scenario,” says Melick. “We can vary the position, orientation, speed, or size of any Actor. We can vary the target an Actor wants to drive towards, their desired following distance, or how aggressively they'll change lanes. If it can be expressed in Blueprint, it can be varied.”
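<br /> <br /> A minimal sketch, with hypothetical names, of what such a component might look like on the C++ side; the designer then implements ApplyVariation in Blueprint to give the parameter its meaning:<br />
<pre>
// Hypothetical sketch in the spirit Melick describes (not Ike's actual
// class names).
#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "VariationParameterComponent.generated.h"

UCLASS(ClassGroup=(Scenario), meta=(BlueprintSpawnableComponent))
class UVariationParameterComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    // The value swept by the variations system, e.g. 0.0 to 1.0.
    UPROPERTY(EditAnywhere, BlueprintReadOnly, Category="Variation")
    float ParameterValue = 0.0f;

    // Designers implement this in Blueprint to define what varying the
    // parameter means for this Actor: nudge a spawn position, scale a
    // speed, change lane-change aggressiveness, and so on.
    UFUNCTION(BlueprintImplementableEvent, Category="Variation")
    void ApplyVariation(float Value);

    // Called by the variations system for each generated variation.
    void SetParameter(float Value)
    {
        ParameterValue = Value;
        ApplyVariation(Value); // hands control to the designer's Blueprint
    }
};
</pre>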
<a href="mailto:simulation@epicgames.com" target="_blank">Get in touch</a> to start that conversation.<br /> &nbsp;AIAutomotive & TransportationIkeTraining & SimulationAutonomous vehiclesSébastien LozéWed, 06 May 2020 20:14:53 GMTWed, 06 May 2020 20:14:53 GMThttps://www.unrealengine.com/spotlights/ike-develops-virtual-simulator-for-automated-trucks-with-unreal-engineFeatured free Marketplace content - May 2020https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Ffeatured-free-marketplace-content---may-2020%2FNews_UESC_MAY2020_THUMB-375x281-8cbe6c0d629b45d7e16ecc1ea72cab564f2e5d1c.jpgMake development a breeze! Rapidly layout scenes, design metropolises, add flair with effects, and more with May’s free Unreal Engine Marketplace content.In an ongoing partnership with Unreal Engine Marketplace creators, select content is available for free to the Unreal community each month, giving artists, designers, and programmers access to additional resources at no extra cost.<br /> &nbsp; <h2><strong>May's featured free content:</strong></h2> <h2><a href="https://www.unrealengine.com/marketplace/product/driveable-cars-basic-pack" target="_blank">Drivable Cars Basic Pack</a> | <a href="https://www.unrealengine.com/marketplace/profile/Digital+Dive+Studio" target="_blank">Digital Dive Studio</a></h2> <div style="text-align:center"><img alt="News_UESC_MAY2020_Blog1.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Ffeatured-free-marketplace-content---may-2020%2FNews_UESC_MAY2020_Blog1-770x433-ba82a346cc47aea78b34ac36a749d7bcd74208e1.jpg" /></div> <div style="text-align:center"><em>Get behind the wheel of three VR-ready drivable cars, featuring hatchback, sedan, and SUV designs, complete with LODs.</em></div> <h2><a href="https://www.unrealengine.com/marketplace/product/materialize-vfx" target="_blank">Materialize VFX</a> | <a href="https://www.unrealengine.com/marketplace/profile/W3+Studios" target="_blank">W3 Studios&nbsp;</a></h2> <div style="text-align:center"><img alt="News_UESC_MAY2020_Blog2.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Ffeatured-free-marketplace-content---may-2020%2FNews_UESC_MAY2020_Blog2-770x433-2f19edbb8a2fc24582b7d84be86700cc245494b2.jpg" /><br /> <em>Warp models in and out with style, using a variety of preset effects such as icy frost or a spectral effect—modify them, and add new patterns!</em></div> <h2><a href="https://www.unrealengine.com/marketplace/product/modern-city-downtown-with-interiors-megapack" target="_blank">Modern City Downtown with Interiors Megapack</a> | <a href="https://www.unrealengine.com/marketplace/profile/Leartes+Studios" target="_blank">Leartes Studios</a></h2> <div style="text-align:center"><img alt="News_UESC_MAY2020_Blog3.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Ffeatured-free-marketplace-content---may-2020%2FNews_UESC_MAY2020_Blog3-770x433-502e48ddaff36af12872cfd735a5527c112ad47e.jpg" /><br /> <em>Build a bustling metropolis with over 350 game-ready meshes, including buildings, vehicles, props, signs, and more!</em></div> <h2><a href="https://www.unrealengine.com/marketplace/product/sci-fi-robot-01" target="_blank">Sci Fi Robot</a> | <a href="https://www.unrealengine.com/marketplace/profile/Dspazio" target="_blank">Dspazio</a></h2> <div style="text-align:center"><img alt="News_UESC_MAY2020_Blog4.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Ffeatured-free-marketplace-content---may-2020%2FNews_UESC_MAY2020_Blog4-770x433-ce07a4ffb2e7c793ac657a5ffb874ddb6ee483c9.jpg" /><br /> <em>A&nbsp;VR 
and console-ready robot character with high-quality customizable materials perfect for your sci-fi projects.</em></div> <h2><a href="https://www.unrealengine.com/marketplace/product/the-targeting-system" target="_blank">The Targeting System</a> | <a href="https://www.unrealengine.com/marketplace/profile/Ibrahim+Akinde" target="_blank">Ibrahim Akinde</a></h2> <div style="text-align:center"><img alt="News_UESC_MAY2020_Blog5.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Ffeatured-free-marketplace-content---may-2020%2FNews_UESC_MAY2020_Blog5-770x433-7426b4340bbf822a35011f07336e2d7a4c7ed5b9.jpg" /><br /> <em>Keep your eye on the target with this plug-and-play solution for directional targeting that is both customizable and multiplayer-ready.</em></div> <br /> Download this month’s array of free products—then come back for a special treat, and more free content, from the Marketplace in June!<br /> &nbsp; <hr />Are you a Marketplace creator interested in sharing your content for free with the community? Visit <a href="https://www.unrealengine.com/uesponsoredcontent" target="_blank">unrealengine.com/uesponsoredcontent</a> to learn how you could be featured!<br /> &nbsp;CommunityEventsNewsGamesMarketplaceAmanda SchadeTue, 05 May 2020 13:30:00 GMTTue, 05 May 2020 13:30:00 GMThttps://www.unrealengine.com/blog/featured-free-marketplace-content---may-2020Unreal Engine Spotlight - BANDAI NAMCO Entertainmenthttps://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Funreal-engine-spotlight---bandai-namco-entertainment%2FTHUMBNAIL_BNE_Spotlight-375x275-e14f20125a87709d2bc954bce816cccf47bc986f.jpgThe team at BNE breaks down its use of UE across major franchises including TEKKEN, SOULCALIBUR, and ACE COMBAT.With storied histories dating back to the 1950s, both Bandai and Namco have given the world some of its most memorable toys, anime, and gaming experiences. From <em>Power Rangers</em> to <em>Pac-Man</em>, there’s no shortage of iconic franchises under the umbrella of the two influential companies that merged in 2005 to form BANDAI NAMCO Entertainment.&nbsp; <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/yju3K7cYlgM" width="100%"></iframe></div> Today, the unified team continually looks to evolve its franchises while fostering creativity both within its own company and amongst its global gaming community. At its core, BANDAI NAMCO strives to serve its fans with compelling content that carries on the legacy of legendary gaming franchises, including <em>TEKKEN</em>, <em>SOULCALIBUR</em>, and <em>ACE COMBAT</em>.<br /> <br /> <img alt="BNE_Spotlight_SOULCALIBUR_2.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Funreal-engine-spotlight---bandai-namco-entertainment%2FBNE_Spotlight_SOULCALIBUR_2-1920x1080-afd30c72f3821f1fb676f11ad75f7e16dc05d047.jpg" style="height:auto; width:auto" /> <div style="text-align:center"><em>SOULCALIBUR VI</em></div> <br /> Of course, keeping multiple franchises fresh while maintaining the flexibility to iterate on new ideas isn’t an easy task. 
It requires an ambitious approach and proven, constantly improving tools that enable teams of all sizes to succeed while delivering a level of quality that signifies a true evolution for each franchise.<br /> <br /> In 2013, at the launch of the PlayStation 4 and Xbox One, game director Katsuhiro Harada understood these technical and creative challenges and led the charge at BANDAI NAMCO Entertainment to adopt Unreal Engine to accommodate each team’s individual needs and the never-ending pursuit of quality.<br /> <br /> <img alt="BNE_Spotlight_TEKKEN_2.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Funreal-engine-spotlight---bandai-namco-entertainment%2FBNE_Spotlight_TEKKEN_2-1920x1080-d8c2ca147376cd12a200eda9d2e71800ff33b06a.jpg" style="height:auto; width:auto" /> <div style="text-align:center"><em>TEKKEN 7</em></div> <br /> “<em>TEKKEN</em> has a 25-year history, and <em>SOULCALIBUR</em> and <em>ACE COMBAT</em> both have more than 20 years of history. So, there are multiple generations of fans waiting for these titles, and it’s very difficult to approach all of them,” said Harada. “Interestingly, all three of these titles were born in the polygon generation, when people called them 3D polygon games. Not the gameplay, but the graphic impact was the first thing to help us establish these franchises when they debuted 20 or 25 years ago. So, it is important for us to make the audience understand, at first glance, that this is something different and new. The visual impact, at first glance, is very important.”&nbsp;<br /> <br /> <img alt="BNE_Spotlight_ACECOMBAT_2.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Funreal-engine-spotlight---bandai-namco-entertainment%2FBNE_Spotlight_ACECOMBAT_2-1920x1080-89cb1fc9b5d3188d7c0555040ec5ddc70010e9ea.jpg" style="height:auto; width:auto" /> <div style="text-align:center"><em>ACE COMBAT 7: SKIES UNKNOWN</em></div> <br /> “I know many companies and developers are trying to develop and use their original engine,” said Harada. “But with BANDAI NAMCO being such a large company and having such a wide variety of titles, not only in content but also in the cost and size of games, it is very difficult for one original engine to take care of all of these requirements across such a wide variety of titles.”<br /> <br /> As seen in the video above, we had the chance to visit the BANDAI NAMCO Entertainment team in Japan to learn more about their use of Unreal Engine.
From <a href="https://www.bandainamcoent.com/games/tekken-7" target="_blank"><em>TEKKEN 7</em></a> to <a href="https://www.unrealengine.com/en-US/spotlights/how-soulcalibur-vi-became-the-best-fighting-game-in-the-franchise" target="_blank"><em>SOULCALIBUR VI</em></a> and <em><a href="https://www.unrealengine.com/en-US/developer-interviews/ace-combat-7-soars-high-with-ue4-to-become-franchise-s-best-installment" target="_blank">ACE COMBAT 7: SKIES UNKNOWN</a></em>, including its VR capabilities, the developers break down how Unreal Engine has altered their content-creation pipeline and helped them build upon the legacies of these storied gaming franchises.Bandai Namco EntertainmentGamesTEKKEN 7SOULCALIBUR VIACE COMBAT 7: Skies UnknownDaniel KayserTue, 05 May 2020 11:30:00 GMTTue, 05 May 2020 11:30:00 GMThttps://www.unrealengine.com/spotlights/unreal-engine-spotlight---bandai-namco-entertainmentOre Creative explains how to leverage Editor Utility Widgets to stylize your gamehttps://cdn2.unrealengine.com/Unreal+Engine%2Ftech-blog%2Fore-creative-explains-how-to-leverage-editor-utility-widgets-to-stylize-your-game%2FTechBlog_IRA_THUMB_ALT-375x275-bb726674ff5d7067df739e667fb31a532483c856.jpgIra Act 1: Pilgrimage creator Zachary Downer shares his stylization pipeline to create the atmospheric visuals for the upcoming adventure game.&nbsp;<strong>Introduction&nbsp;</strong><br /> <br /> Hello, my name is Zachary Downer and I’m the creator of <em>Ira Act 1: Pilgrimage</em>, an upcoming atmospheric adventure game from Ore Creative that you can check out <a href="http://www.iragame.com/" target="_blank">here</a>.&nbsp;<br /> <br /> In this tech blog, I’ll be covering <em>Ira</em>’s <a href="https://docs.unrealengine.com/Engine/Blueprints/index.html" target="_blank">Blueprint</a>-based asset stylization pipeline, why it was created, and how you can leverage Unreal Engine’s <a href="https://docs.unrealengine.com/Engine/UMG/UserGuide/EditorUtilityWidgets/index.html" target="_blank">Editor Utility Widgets</a> in your projects.<br /> <br /> <strong>Benefits of stylization</strong><br /> <br /> Stylizing your game is a great way to do a lot with your visuals using limited resources. On our end, stylization allows <em>Ira</em> to use almost any asset from the asset store, speed up development time, define ourselves visually, and allows us to create an overall richer experience with a lot less work.<br /> <br /> Like many in the indie scene, I’m a solo developer who can 3D model, texture, use Blueprints, and more. I know enough to be dangerous in a handful of areas, but I’m not going to be creating all the assets for <em>Ira Act 1: Pilgrimage</em> single-handedly as it would take too much time and resources. If I attempted to make everything in Ira myself, the game may never get released. The limitation of time means I need to leverage premade assets from places like the <a href="https://www.unrealengine.com/marketplace/store" target="_blank">Unreal Marketplace</a>, and be strategic about what needs to be created from scratch or modified from pre-existing assets. 
There is no need to spend large amounts of time modeling generic assets when they are widely available online, and, in some cases, free for commercial use.&nbsp;<br /> <img alt="Looping-asset-gif.gif" src="https://cdn2.unrealengine.com/Unreal+Engine%2Ftech-blog%2Fore-creative-explains-how-to-leverage-editor-utility-widgets-to-stylize-your-game%2FLooping-asset-gif-63ed44c59c2397951679e9d5b7d9350836befa6a.gif" style="height:auto; width:auto" /><br /> <em>The Unreal Marketplace features high quality assets from talented individuals and is a great resource to leverage for your project.</em><br /> <br /> While the benefits are clear, using premade assets can come with certain limitations and challenges. For example, many of these assets have varying styles, material formats, and creator idiosyncrasies, and, if not used properly, they could turn people off from your game (regardless of how enjoyable it is), as their initial impression might be that the game looks like an “asset flip.” Even assets that attempt a more realistic look may showcase an individual creator’s style, which can create visual inconsistencies. The way I solve for this in <em>Ira</em> is by (you guessed it) asset stylization. Once everything is stylized, people won’t know where the assets have come from. They will simply be integrated into the world seamlessly. This has saved me a significant amount of time and energy, so I can focus on making the game the best it can be.<br /> <br /> Another great advantage of stylization is that it can be done in an almost limitless number of ways. This allows developers to create a unique style that fits the mood and atmosphere of their experience while simultaneously helping them stand out from the crowd. In today's world, if your game can’t distinguish itself visually, then it might be held back from reaching its full potential. If you are new to defining an art style for your project, however, you’re in luck. There are plenty of tutorials and livestreams that can help you get started with post-process and material stylization. Feel free to check out this <a href="https://www.youtube.com/watch?v=cQw1CL0xYBE" target="_blank">livestream</a> from Epic to get started with stylized post-process effects.<br /> <br /> With that said, when stylization is leveraged properly in your development process, the benefits can be far-reaching.<br /> <br /> <strong>How <em>Ira</em> leverages Editor Utility Widgets to create an asset stylization pipeline</strong><br /> <br /> Using Editor Utility Widgets allows me to significantly cut down the time it takes to take asset packs and convert them into a style that fits <em>Ira</em>. Editor Utility Widgets have saved me countless hours over the course of the game’s development. Let's take a closer look at the system I’ve put together, and once you see what’s possible, you can borrow some of these ideas and use them to your advantage in your projects. The pipeline is split into a few parts.<br /> <br /> <strong>Breaking up material groups</strong><br /> <br /> Some assets require minor preparation before being converted using <em>Ira</em>’s asset conversion tool. In this first step, I break up asset material groups into their own unique material slots (when necessary). This allows me to quickly create solid color groups that match Ira’s visual style.
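<br /> <br /> To illustrate the kind of batch automation an Editor Utility Widget can drive, here is a minimal editor-only sketch (the function name and overall shape are ours, not the actual <em>Ira</em> tool) that points every material slot on the selected Static Meshes at one shared stylized base material:<br />
<pre>
#include "CoreMinimal.h"
#include "EditorUtilityLibrary.h" // editor-only (Blutility module)
#include "Engine/StaticMesh.h"
#include "Materials/MaterialInterface.h"

// Assign one stylized base material to every slot of the Static Meshes
// currently selected in the Content Browser.
void ApplyStylizedMaterial(UMaterialInterface* StylizedBase)
{
    TArray<UObject*> Selected = UEditorUtilityLibrary::GetSelectedAssets();

    for (UObject* Asset : Selected)
    {
        if (UStaticMesh* Mesh = Cast<UStaticMesh>(Asset))
        {
            const int32 NumSlots = Mesh->StaticMaterials.Num();
            for (int32 Slot = 0; Slot < NumSlots; ++Slot)
            {
                // Swap the slot to the shared base; material instances
                // can then recolor each group to taste.
                Mesh->SetMaterial(Slot, StylizedBase);
            }
        }
    }
}
</pre>
Exposed as a BlueprintCallable function, or rebuilt directly in Blueprint nodes, a button in an Editor Utility Widget can run something like this over an entire asset pack in one click.<br /> <br />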
<div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/DPf3cB0Cnns" width="100%"></iframe></div> Part 1: Break up materials&nbsp; <em>Breaking up material groups (if necessary) using the built-in Mesh Editing Plugin.</em><br /> <br /> <strong>Processing assets</strong><br /> <br /> After making sure the proper material groups are assigned, I use Editor Utility Widgets to automate repetitive tasks in the editor using Blueprints; in this instance, converting assets to be used in <em>Ira</em>. Once the assets are converted, I simply tweak a few material settings (if necessary) and the asset is game-ready.<br /> <br /> Part 2: Process assets with <em>Ira</em>'s utility widget tool&nbsp; <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/5kRO8wy2A1A" width="100%"></iframe></div> <em>Processing assets with Ira's asset conversion tool so they will be game-ready.</em><br /> <br /> <strong>In scene stylization tools</strong><br /> <br /> Once the assets are processed, I use my visual stylization tool (based off a custom post process actor) to better control the lighting and atmosphere of the scene. This tool allows for scene and actor stylization without having to re-setup the logic each time, and provides me with an unparalleled level of control. This is another time-saving tool that makes stylization fast and easy.<br /> <br /> Part 3: Scene stylization tool&nbsp; <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/fDamLn2wBLE" width="100%"></iframe></div> <em>Using a custom post process volume actor and custom stencils to affect the scene’s visual style in a more granular way.</em><br /> <br /> <strong>Conclusion</strong><br /> <br /> In this clip, you can see the culmination of the asset pipeline on this mesh/scene. By leveraging Editor Utility Widgets to automate <em>Ira</em>’s stylization pipeline, I’ve been able to save time, money, resources, and even myself from unneeded stress. I hope you found this peek into Ira’s visual pipeline enlightening and that you’ll be able to walk away with some new techniques and ideas for your Unreal projects. &nbsp;<br /> <br /> Part 4: Conclusion <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/IXJrgsqBk2I" width="100%"></iframe></div> <em>Find ways to leverage <a href="https://docs.unrealengine.com/en-US/Engine/UMG/UserGuide/EditorUtilityWidgets/index.html" target="_blank">Editor Utility Widgets</a> in your own project. 
Follow <a href="https://www.youtube.com/watch?v=s_rt49atj0Y" target="_blank">this link</a> for a live demonstration on creating editor utility widgets.</em><br /> <br /> For more information on<em> Ira Act 1: Pilgrimage</em>, follow us on Twitter <a href="https://twitter.com/IraGame?ref_src=twsrc%5Egoogle%7Ctwcamp%5Eserp%7Ctwgr%5Eauthor" target="_blank">@IraGame</a> or visit <a href="http://www.iragame.com/" target="_blank">our website</a>.&nbsp;<br /> &nbsp;ArtBlueprintsCommunityDesignGamesira act 1pilgrimageTrainingTutorialszachary downerUnreal EngineUE4Zachary DownerThu, 30 Apr 2020 19:00:00 GMTThu, 30 Apr 2020 19:00:00 GMThttps://www.unrealengine.com/tech-blog/ore-creative-explains-how-to-leverage-editor-utility-widgets-to-stylize-your-gameWebinar: Working collaboratively in Unreal Engine https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fwebinar-working-collaboratively-in-unreal-engine%2FWebinar_Working_Collaboratively_Thumbnail-375x275-22d9148eed792cc516a5fc059f6ed807813b8e0f.jpgMissed our webinar on collaborative workflows in Unreal Engine? Now you can watch it on demand! Learn how multiple artists can simultaneously make changes to the same project safely and reliably, and how to perform collaborative design review.We recently hosted the live webinar <strong>Working collaboratively in Unreal Engine</strong>. If you missed it, no problem! The replay is available right here, in two parts.<br /> <br /> In this webinar, Senior Technical Marketing Manager Daryl Obert and Technical Artist Matthew Doyle demonstrate two unique collaboration workflows. &nbsp;<br /> <br /> The first focuses on the Multi-User Editor—a powerful tool that enables multiple artists to make changes simultaneously to the same Unreal Engine project safely and reliably. Updates happen on the fly for everyone in the group, with no wait. &nbsp;&nbsp; <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/MPIpOdNmNGE" width="100%"></iframe></div> The second explores how to use the Collab Viewer Template to create a runtime experience of design data to conduct collaborative design review sessions. The template has a variety of navigation modes as well as VR support. Additionally, it provides tools that are easy to set up and use for moving objects, animating exploded views, X-Raying geometry, and setting bookmarks.&nbsp; <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/5hZvKNv8wZo" width="100%"></iframe></div> You’ll learn how to:&nbsp;<br /> &nbsp; <ul style="margin-left:40px"> <li>Set up the Multi-User Editor and work with it</li> <li>Work with the Virtual Scouting tools to make better creative decisions</li> <li>Add the Collab Viewer Template to a project</li> <li>Work with the Collab Viewer Template’s pre-built Blueprints</li> <li>Add a custom menu to the Collab Viewer Template’s UI</li> </ul> <br /> Looking for more webinars? 
You can <a href="https://bit.ly/3cPBLhM" target="_blank">register today</a> for our next webinar: “Source control in Unreal Engine”, scheduled for May 13, 2020, and check out the full series <a href="https://www.unrealengine.com/en-US/events/enterprise-webinar-series" target="_blank">here</a>.<br /> <br /> <em>Models courtesy of TurboSquid</em>LearningWebinarArchitectureAutomotive & TransportationBroadcast & Live EventsFilm & TelevisionGamesTraining & SimulationMore UsesThu, 30 Apr 2020 18:00:00 GMTThu, 30 Apr 2020 18:00:00 GMThttps://www.unrealengine.com/blog/webinar-working-collaboratively-in-unreal-engineEmotional documentary explores new compassionate possibilities of VRhttps://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Femotional-documentary-explores-new-compassionate-possibilities-of-vr%2FDeveloperInterview_MeetingYou_THUMBNAIL-375x275-652d11fdc9aa027022d452e30e4f2ebd9a043e37.jpgLearn how Korean broadcasting station MBC and Vive Studios leveraged Unreal Engine to create a special VR experience based on a mother’s memories.It was a moment captured by a documentary that brought its audience to tears.<br /> <br /> In <em>Meeting You</em>, a grieving mother is reunited in virtual reality with her deceased daughter. The documentary, produced by Korean broadcasting station Munhwa Broadcasting Corporation (MBC) and shown on February 6, delivered a powerfully emotional journey for the woman and the audience who witnessed it.<br /> <br /> The project started with a question: “What would you say to your loved ones in heaven if you could meet them once again?” After experimenting with different ideas, the production team decided to combine VR with a documentary approach.&nbsp;<br /> <br /> The team was seeking participants for the project when they heard the story of Ji-sung Jang, a mother of four. Her third child, 7-year-old Na-yeon, tragically passed away in the fall of 2016. Ji-sung wished for a chance to cook miyeok-guk, or Korean seaweed soup served on birthdays, for her late daughter and to tell her that she loves her and thinks about her daily.&nbsp;<br /> <br /> After the project was confirmed, Vive Studios helped recreate Na-yeon in VR so that Ji-sung could share a moment with her late daughter.&nbsp;<br /> <br /> Thanks to the efforts of the broadcasting team and Vive Studios, Ji-sung was reunited with Na-yeon in MBC’s virtual studio. After spending years missing her daughter, Ji-sung finally had a chance to tell her one more time that she loved her. Na-yeon seemed to delight in her special meal, made a birthday wish, and then said, "I love you, mom." The moment ended as the little girl transformed into a delicate white butterfly and drifted away.<br /> <br /> MBC and Vive Studios spent seven months creating the experience for Ji-sung, allowing her to feel like she was able to spend one more precious moment in time with her child.<br /> <br /> We spoke to MBC’s producer Jong-woo Kim and Vive Studios, a creative storyteller which provides production technology in VR, AR, VFX, and film, about the project and how it presented new opportunities and direction for VR beyond typical entertainment uses.<br /> &nbsp;<br /> <strong>Combining VR with a documentary was a very original idea. What was the inspiration for producing <em>Meeting You</em>?<br /> <br /> MBC Producer Jong-woo Kim: </strong>The motif of reuniting with family members who passed away came to mind while I was planning a new program. 
I was inspired, in part, by photorealistic CG renders created by a game engine. The CG images were very realistic yet transcendental, which seemed to align with the theme of our project.<br /> <br /> We decided to develop in VR because linear CG footage wouldn’t be any different from viewing existing videos, no matter how high the quality. VR allows the participant to interact with their counterpart, which delivers a completely different experience. For this reason, we believed that it would be possible to give an impression of actually “meeting” someone. We also had to choose between VR and AR to proceed, and eventually decided on VR for its immersive nature.<br /> <img alt="DeveloperInterview_Meeting-you_blog_body-image1.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Femotional-documentary-explores-new-compassionate-possibilities-of-vr%2FDeveloperInterview_Meeting-you_blog_body-image1-1920x1080-167ab46147bc7e0303ce4375d5abf745df147b5b.jpg" style="height:auto; width:auto" /><br /> <strong>Vive Studios: </strong>When the broadcasting station approached us with their project, we thought it would give us the opportunity to go beyond VR technology that [typically] served as a form of entertainment up until now. The project would allow us to create something with which the general public could emotionally resonate. The fact that our technology can comfort and touch viewers was enough of a reason for us to take part in the project, disregarding other factors like revenue. It also allowed us to further pioneer a new frontier in the VR realm, which has so far been heavily characterized by its entertainment purposes.<br /> <br /> <strong>What was the most challenging part of the production process?<br /> <br /> Kim: </strong>At first, we lacked the assurance that our work could be a genuine experience for someone, rather than merely an entertaining experiment. Another challenging aspect was that we were unsure whether there would be a family brave enough to participate.<br /> <br /> After the project progressed to a certain extent, we faced technical limitations. For example, enabling uninhibited facial expressions while the character moves around was harder than we imagined, and synthesizing a voice using deep learning was also a difficult task. Many difficulties arose right up to the day of the VR experience, but in retrospect, the most challenging aspect was technically recreating the unique characteristics of the human Na-yeon.<br /> <br /> Also, we put a lot of thought into how to arrange the encounter and where it would take place. Unlike with films, we were cautious about letting any of the filmmaker's intent get involved. Our team gained the trust of the participating family, aiming to create the experience based solely on the family's memories, even if it meant having less emotional impact. Through this approach, the family also opened up to us. <div style="text-align:center"><img alt="DeveloperInterview_Meeting-you_blog_body-image2.png" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Femotional-documentary-explores-new-compassionate-possibilities-of-vr%2FDeveloperInterview_Meeting-you_blog_body-image2-1453x820-d175277cd84f7474bd63ca7eb6c5e44e81d615df.png" style="height:auto; width:auto" /></div> <strong>Vive Studios:</strong> At the beginning of the project, we proceeded with our existing methods, because it was our first time working on a project that required such a sensitive and emotional approach.
We integrated voice recognition and AI to make Na-yeon react automatically, and tested out engaging interactions between Na-yeon and her mother, such as taking a picture or drawing together.<br /> <br /> However, several pilot tests led us to recognize the need for an extremely cautious approach. We realized that it isn’t easy for a bereaved mother to keep calm and use logical thinking to partake in various tasks in an unfamiliar virtual environment. We also felt immense pressure knowing that we had just one opportunity to film. In other words, capturing the dramatic moment of the mother and daughter reunion made multiple takes impossible.<br /> <br /> <strong>What did you focus on the most to create a “VR experience tailored for one”?<br /> <br /> Kim: </strong>A VR experience tailored for one person required an analysis of intimate memories that only that person knows. We focused on analyzing and replicating Na-yeon’s overall impression, behavior, and facial expressions based on the recollections of her mother. Although limited time and technical resources made it difficult to depict all of these details realistically, we tried our best to create a digital human based on memory, albeit within our limitations.<br /> <br /> <strong>What was the production process?<br /> <br /> Kim: </strong>It was important to give the mother the impression that she had entered her memories. Many aspects of Na-yeon’s image were based on the mother’s memory, and the setting was a park that the mother recalled visiting with Na-yeon. The entire point of the story was to show that Na-yeon was alive and well in her memory as well as in heaven, so that we could console the mother even though the experience was artificial.<br /> <br /> From a technical perspective, we intended to give an impression of actually spending time with a person by mixing in pauses in time and interaction as the story unfolds. This is the kind of interactivity that all VR filmmakers put a lot of thought into. With more technical advancements, it will be possible to feature much more interaction and open-ended stories.<br /> <br /> <strong>Vive Studios: </strong>During production, we set three different standards in consideration of the aforementioned difficulties.<br /> <br /> First, we tried our best to avoid factors that might interfere with the mother’s emotions. This meant eliminating automated features such as voice recognition or AI-driven character reactions, which were considered in the early stages of planning, because they had a chance of failing and not giving us the results we were looking for. Instead, we used <a href="https://www.unrealengine.com/en-US/tech-blog/choosing-a-performance-capture-system-for-real-time-mocap" target="_blank">motion capture</a> to create a set of animation clips that portrayed Na-yeon’s everyday behavior in great detail and applied them according to the scenario. Between these authored clips, we inserted idle loops so that we could adjust the timing between idling and progressing through the scenario, depending on the state of Na-yeon’s mother.
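To make the mechanism concrete, here is a minimal sketch of how operator-triggered playback of authored mocap montages, with a looping idle as the fallback state, might look in Unreal Engine C++. The names are hypothetical; this is not Vive Studios' actual code.

```cpp
// Minimal sketch (hypothetical; not Vive Studios' code). ANayeonCharacter
// is assumed to derive from ACharacter, with ScenarioClips an authored,
// ordered TArray<UAnimMontage*> property and NextClipIndex an int32.
#include "GameFramework/Character.h"
#include "Animation/AnimInstance.h"
#include "Animation/AnimMontage.h"

void ANayeonCharacter::AdvanceScenario()
{
    if (!ScenarioClips.IsValidIndex(NextClipIndex))
    {
        return; // Scenario finished; the character simply keeps idling.
    }
    if (UAnimInstance* AnimInstance = GetMesh()->GetAnimInstance())
    {
        // Blend out of the looping idle into the next authored clip; when
        // the montage ends, the idle loop resumes until the operator
        // decides the mother is ready for the next story beat.
        AnimInstance->Montage_Play(ScenarioClips[NextClipIndex++]);
    }
}
```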
<img alt="DeveloperInterview_Meeting-you_blog_body-image3.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Femotional-documentary-explores-new-compassionate-possibilities-of-vr%2FDeveloperInterview_Meeting-you_blog_body-image3-1920x1080-7721d521e171660aa953690bcd5cc156868d54ae.jpg" style="height:auto; width:auto" /><br /> <img alt="DeveloperInterview_Meeting-you_blog_body-image4.png" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Femotional-documentary-explores-new-compassionate-possibilities-of-vr%2FDeveloperInterview_Meeting-you_blog_body-image4-3840x2160-b8d71b3f06730e4e9eb315d1908a6331dc7070b4.png" style="height:auto; width:auto" /><br /> We also added simple interactions, such as blowing out candles and Na-yeon reacting naturally when her hair is caressed, to replace more complex interactions. Here, eye contact was key. We used <a href="https://docs.unrealengine.com/en-US/Engine/Animation/AimOffset/index.html" target="_blank">Aim Offset</a> so that Na-yeon naturally makes eye contact with her mother while going through the set of motions.<br /> <br /> The second key point was configuring devices and optimizing the program in order to deliver a natural VR experience for Na-yeon’s mother. To meet the broadcasting standards for graphics quality and performance, we chose HMD equipment powered by high-end PCs, but achieving high rendering performance was still not an easy endeavor. Various <a href="https://www.unrealengine.com/en-US/blog/spring-into-action-with-this-free-environment-collection-from-project-nature" target="_blank">environment props</a> such as grass had to be rendered across a vast space, and Na-yeon’s skin, hair, outfit, and overall appearance had to look realistic even when viewed up close while also remaining lightweight. To achieve this, we continuously optimized the modeling and Unreal Engine material setup, and actively utilized the <a href="https://docs.unrealengine.com/en-US/Engine/Rendering/ScreenPercentage/index.html#enablingtemporalupsample" target="_blank">temporal upsampling</a> feature.<br /> <br /> For the HMD, a wireless VR module was used so that the mother could move around the virtual space untethered, and special gloves were used that could deliver a sense of warmth when she embraced Na-yeon. An external fan was also programmed in Unreal Engine so that it could be controlled to deliver the experience of a windy outdoor environment.<br /> <br /> Finally, we provided various perspectives so that the mother’s experience could be effectively delivered straight to the viewers. The mirrored VR footage from the mother’s point of view wasn’t suitable for the documentary, as it was too shaky and too low-resolution, and could easily cause motion sickness. We resolved this issue by using the <a href="https://www.unrealengine.com/en-US/spotlights/zero-density-delivers-live-broadcast-virtual-production-solutions" target="_blank">Unreal Engine-powered Zero Density</a> solution, which allowed us to provide third-person point-of-view footage of the mother shown simultaneously with the CG background as the final broadcast footage. <div style="text-align:center"><img alt="DeveloperInterview_Meeting-you_blog_body-image5.png" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Femotional-documentary-explores-new-compassionate-possibilities-of-vr%2FDeveloperInterview_Meeting-you_blog_body-image5-818x434-08566e9c8ada7dd6325f76aa131ff863984320e7.png" style="height:auto; width:auto" /></div> The VR footage and the third-person point-of-view footage were rendered by two separate PCs, which had to show matching scenes from the same position, so we developed a solution to sync the space and internal settings between the two programs. This enabled the footage from the mother’s perspective and the footage from a separate camera to be composited in real time and shared with the viewers. To achieve natural camera work, a hand-held camera with a spatial tracking sensor was used to film the virtual space designed in Unreal Engine as if it were an actual location.
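Vive Studios doesn't detail that sync solution, but its core requirement (two Unreal Engine instances agreeing on one tracked camera transform) maps naturally onto the engine's standard replication. A purely illustrative sketch, with a hypothetical class and a hypothetical sensor-reading helper:

```cpp
// Purely illustrative (hypothetical class; not Vive Studios' solution):
// replicate a tracked camera transform so a second Unreal instance can
// render a matching view for real-time compositing.
#include "GameFramework/Actor.h"
#include "Net/UnrealNetwork.h"
#include "TrackedBroadcastCamera.generated.h"

UCLASS()
class ATrackedBroadcastCamera : public AActor
{
    GENERATED_BODY()

public:
    ATrackedBroadcastCamera()
    {
        bReplicates = true;
        PrimaryActorTick.bCanEverTick = true;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        if (HasAuthority())
        {
            // Authoritative instance: sample the spatial tracking sensor.
            TrackedTransform = ReadTransformFromTrackingSensor(); // hypothetical helper
        }
        else
        {
            // Compositing instance: mirror the replicated camera pose.
            SetActorTransform(TrackedTransform);
        }
    }

protected:
    // Written on the authoritative instance; received by the other.
    UPROPERTY(Replicated)
    FTransform TrackedTransform;

    virtual void GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const override
    {
        Super::GetLifetimeReplicatedProps(OutLifetimeProps);
        DOREPLIFETIME(ATrackedBroadcastCamera, TrackedTransform);
    }

    FTransform ReadTransformFromTrackingSensor(); // supplied by the tracking hardware layer
};
```

In practice the receiving instance would interpolate rather than snap to the incoming transform, but the replication pattern is the same.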
<div style="text-align:center"><img alt="DeveloperInterview_Meeting-you_blog_body-image5.png" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Femotional-documentary-explores-new-compassionate-possibilities-of-vr%2FDeveloperInterview_Meeting-you_blog_body-image5-818x434-08566e9c8ada7dd6325f76aa131ff863984320e7.png" style="height:auto; width:auto" /></div> The VR footage and third-person point of view footage were rendered by two separate PCs, which had to show matching scenes from the same position. So we developed a solution to sync the space and internal settings between the two programs. This enabled the footage from the mother’s perspective and another from a separate camera to be composited in real time, which was then shared with the viewers. To achieve natural camera work, a hand-held camera with a spatial tracking sensor was used to film the virtual space designed in Unreal Engine as if it were an actual location.<br /> <img alt="DeveloperInterview_Meeting-you_blog_body-image6.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Femotional-documentary-explores-new-compassionate-possibilities-of-vr%2FDeveloperInterview_Meeting-you_blog_body-image6-4608x2592-910ff8225a3c3d50dbe0c740c5876ea5eb8028b6.jpg" style="height:auto; width:auto" /><br /> <strong>Why did you choose Unreal Engine for <em>Meeting You</em>?<br /> <br /> Vive Studios: </strong>Terrestrial broadcasting has restricted production times and budgets. In order to quickly produce high-quality interactive graphics, we decided that Unreal Engine was a great fit.<br /> <br /> We had experience working with Unreal Engine, so we were aware of Unreal Engine’s ability to create sophisticated real-time graphics as well as features like shaders and logic using a highly intuitive method. Another reason we chose Unreal Engine was to shorten the production time using <a href="https://unrealengine.com/marketplace/en-US/profile/Epic+Games" target="_blank">free resources offered by Epic Games</a> or assets from the <a href="https://unrealengine.com/marketplace/en-US/new-content" target="_blank">Unreal Engine Marketplace</a>.<br /> <br /> <strong>Where was Unreal Engine used in production?<br /> <br /> Vive Studios:</strong> <a href="https://quixel.com/megascans" target="_blank">Quixel’s Megascans library</a> was very useful for placing various environment props within the large space to deliver a believable VR experience for Ji-sung.<br /> <br /> When restoring the child as a digital human, we faced difficulties creating photoreal skin texture and wrinkles because they were too smooth. 
Unreal Engine’s documentation on <a href="https://docs.unrealengine.com/Resources/Showcases/DigitalHumans/index.html" target="_blank">digital humans</a> served as a reference that helped us set up the <a href="https://docs.unrealengine.com/en-US/Resources/Showcases/DigitalHumans/index.html" target="_blank">subsurface</a> profile for light scattering through skin and author shaders that <a href="https://docs.unrealengine.com/en-US/Resources/Showcases/DigitalHumans/#microgeometry" target="_blank">create minute pore details</a>.<br /> <img alt="DeveloperInterview_Meeting-you_blog_body-image7.png" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Femotional-documentary-explores-new-compassionate-possibilities-of-vr%2FDeveloperInterview_Meeting-you_blog_body-image7-3840x2160-c3a32eb66265b5936214caf86f08e9f3ca406b08.png" style="height:auto; width:auto" /><br /> Na-yeon’s movement was designed using features like <a href="https://docs.unrealengine.com/en-US/Engine/Animation/AnimBlueprints/index.html" target="_blank">animation Blueprints</a> and Aim Offset for a natural blend between animation motions and real-time eye contact with Ji-sung. Unreal Engine was also used to control external devices, such as the fans that simulated wind and the devices designed to simulate body warmth.<br /> <img alt="DeveloperInterview_Meeting-you_blog_body-image8.png" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Femotional-documentary-explores-new-compassionate-possibilities-of-vr%2FDeveloperInterview_Meeting-you_blog_body-image8-1709x1042-f806bd6192f15479bd9a4cff4d99a2768b68480e.png" style="height:auto; width:auto" /><br /> <strong>What have you learned from this project?<br /> <br /> Kim: </strong>After the documentary aired, many other families sent us their stories and expressed their desire to meet their deceased loved ones. We were very grateful for their responses but also found ourselves thinking hard about them. This project was far more difficult and larger in scale than we had imagined. Although we’re unsure whether we could make this a regular program, we have demonstrated the possibilities of leveraging technology for compelling stories. If we were to work on a similar project, we would want to take it to the next level.<br /> <br /> <strong>Vive Studios: </strong>After <em>Meeting You</em> aired, international press like Reuters and the BBC covered the project, and we were met with explosive interest and responses that surpassed our expectations. Many sympathized with Na-yeon’s mother, but there were also mixed opinions on the scope of the technology’s application and its ethical standards.<br /> <br /> We not only discovered the scale of influence that can be generated when cutting-edge technology meets human emotion, but also learned that we must approach sensitive topics with great discretion. Now seems to be the time for a public discussion on how to manage and distinguish between real life and virtual reality, as such advanced technology becomes the norm in the near future.<br /> <br /> <strong>What does the future hold for you and what are your next goals?<br /> <br /> Vive Studios: </strong>Vive Studios will put more focus on the technical recreation of humanistic experiences through various experiments at the forefront of the technology. To achieve this, we will continue our digital human research and development using Unreal Engine.
We also plan on developing a real-time virtual production pipeline so that we can utilize virtual reality technology to produce films and TV series.<br /> <br /> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/uflTK8c4w0c" width="100%"></iframe></div> <br /> <em>Thu, 30 Apr 2020 | <a href="https://www.unrealengine.com/developer-interviews/emotional-documentary-explores-new-compassionate-possibilities-of-vr" target="_blank">Original post</a></em> <hr /> <h2><strong>Mixed reality takes giant leap forward with Unreal Engine and HoloLens 2</strong></h2> Holographic computing will transform industries from retail to engineering. We take a look at how we can now achieve the highest-quality visuals yet seen in mixed reality, with a deep dive into the groundbreaking Apollo 11 HoloLens 2 project.<br /> <br /> Picture a team of engineers around a table. They’re collaborating and interacting with the same ultra-high-fidelity digital 3D hologram, streamed wirelessly to their HoloLens 2 devices from a high-end PC.<br /> <br /> This is the technical capability showcased in the live Apollo 11 HoloLens 2 demonstration—and it’s a giant leap forward for mixed reality in an enterprise context. The demo illustrates that by leveraging Unreal Engine and HoloLens 2, we can achieve the highest-quality visuals yet seen in mixed reality. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/r0ubiU3PRHw" width="100%"></iframe></div> We <a href="https://www.unrealengine.com/en-US/blog/microsoft-build-2019-epic-games-apollo-11-mission-hololens-2-unreal-engine-4" target="_blank">previously took a look at this project</a> when it came to fruition last year. Now, we’re going to dive deep into some of the technical aspects of its creation. <h3>High-fidelity MR visuals streamed wirelessly</h3> The Apollo 11 demo—originally intended to be presented onstage at Microsoft Build—is a collaboration between Epic Games and Microsoft to bring best-in-class visuals to HoloLens 2. “When Alex Kipman, Microsoft’s technical fellow for AI and mixed reality, invited us to try out early prototypes of the HoloLens 2, we were blown away by the generational leap in immersion and comfort it provided,” explains Francois Antoine, Epic Games’ Director of Advanced Projects, who supervised the project. “We left the meeting with only one thought in mind: what would be the best showcase to demonstrate the HoloLens 2’s incredible capabilities?”<br /> <br /> It just so happened that last year marked the 50th anniversary of mankind’s biggest technical achievement—the first humans landing on the moon. This provided the context they were looking for, and so the Epic team set out to create a showcase that retold key moments of the historic mission.<br /> <br /> To make sure they were as faithful as possible to the source material, the team called upon some of the industry’s foremost experts on the Apollo 11 mission.
The live demo was presented by ILM’s Chief Creative Officer John Knoll and Andrew Chaikin, space historian and author of <em>A Man on the Moon</em>. Knoll provided much of the reference material, advised on the realism of the 3D assets, and explained the whole mission to the Epic team working on the project.<br /> <br /> Diving into many aspects of the Apollo 11 mission, the demo offers an unprecedented level of visual detail. “To date, there is nothing that looks this photoreal in mixed reality,” says MinJie Wu, Technical Artist on the project.<br /> <img alt="Spotlight_Apollo11_HololensBlog_Body_Image_3.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fone-giant-step-for-mixed-reality-visuals-with-unreal-engine-and-hololens-2%2FSpotlight_Apollo11_HololensBlog_Body_Image_3-1640x900-e69e77ce220ddcc45237cf568ee597ea9e789c50.jpg" style="height:auto; width:auto" /><br /> Visuals are streamed wirelessly from Unreal Engine, running on networked PCs, to the HoloLens 2 devices that Knoll and Chaikin are wearing, using a prototype version of <a href="https://azure.microsoft.com/en-us/services/spatial-anchors/#features" target="_blank">Azure Spatial Anchors</a> to create a shared experience between the two presenters.<br /> <br /> By networking the two HoloLens devices, each understands where the other is in physical space, enabling them to track one another. A third, <a href="https://en.wikipedia.org/wiki/Steadicam" target="_blank">Steadicam</a>-mounted camera is also tracked, by bolting an HP Windows Mixed Reality headset onto the front of it.<br /> <br /> Calibrating this third physical camera to align with the Unreal Engine camera was particularly challenging. “We shot a lens grid from multiple views and used OpenCV to calculate the lens parameters,” says David Hibbitts, Virtual Production Engineer. “I then used those lens parameters inside of Unreal with the Lens Distortion plugin to both calculate the properties for the Unreal camera (field of view, focal length, and so on) and also generate a displacement map, which can be used to distort the Unreal render to match the camera footage.”<br /> <br /> Even if you are able to get the camera settings and distortion correct, there can still be a mismatch when the camera starts moving. This is because the point you're tracking on the camera doesn't match the nodal point of the lens, which is what Unreal Engine uses as the camera transform location, so you need to calculate this offset.<img alt="Spotlight_Apollo11_Hololesn_blog_body_illustration.png" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fone-giant-step-for-mixed-reality-visuals-with-unreal-engine-and-hololens-2%2FSpotlight_Apollo11_Hololesn_blog_body_illustration-1640x1000-60c0db727822bf7c02e17dc8fc766b4dd716bbab.png" style="height:auto; width:auto" />“To solve this, you need to know some known 3D positions in the world and some known 2D positions of those same points in the camera view, which lets you calculate the nodal point's 3D position. And if you know the tracked position of the camera when you capture the image, you can find the offset,” explains Hibbitts.<br /> <br /> To get the known points and known camera position, the team made sure to have the tracking system running while it shot the lens grids, using the points on the grid as the known points. This enabled them to calibrate both the camera and the tracking offset in a single pass.
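As a rough sketch of that two-step calibration, here is what the math looks like with OpenCV's C++ API. The grid detection and capture plumbing are omitted, and the names are illustrative rather than taken from the project.

```cpp
// Illustrative sketch of the calibration described above (OpenCV C++ API).
// objectPoints holds the lens-grid corners in grid space, one set per view;
// imagePoints holds the same corners as detected in each captured frame.
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <vector>

void CalibrateTrackedCamera(
    const std::vector<std::vector<cv::Point3f>>& objectPoints,
    const std::vector<std::vector<cv::Point2f>>& imagePoints,
    cv::Size imageSize)
{
    // Step 1: solve for the intrinsics (focal length, principal point)
    // and the lens distortion coefficients.
    cv::Mat cameraMatrix, distCoeffs;
    std::vector<cv::Mat> rvecs, tvecs;
    cv::calibrateCamera(objectPoints, imagePoints, imageSize,
                        cameraMatrix, distCoeffs, rvecs, tvecs);

    // Step 2: with intrinsics known, one view of known 3D points yields
    // the lens's nodal-point pose via solvePnP. Comparing that pose with
    // the pose the tracking system recorded for the same frame gives the
    // fixed offset between the tracked point and the nodal point.
    cv::Mat rvec, tvec;
    cv::solvePnP(objectPoints[0], imagePoints[0], cameraMatrix, distCoeffs,
                 rvec, tvec);
    // ... subtract the tracker's recorded transform to obtain the offset.
}
```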
Three Unreal Engine instances are required for the setup: one for the camera and one for each HoloLens. They all network to a separate, dedicated server. “They're all talking to each other to figure out where they are in the physical space, so everybody could look at the same thing at the same time,” explains Ryan Vance, XR Lead on the Apollo 11 project.<br /> <br /> Shifting the computing process away from the mobile device and onto a high-powered PC is a significant step forward for mixed reality. Previously, you’d have to run the full Unreal Engine stack on the mobile device. “The advantage of that is that it’s the standard mobile deployment strategy you’d use for a phone or anything else,” says Vance. “The disadvantage is that you’re limited by the hardware capability of the device itself from a compute standpoint.”<br /> <br /> Now, by leveraging Unreal Engine’s support for Holographic Remoting, high-end PC graphics can be brought to HoloLens devices. “Being able to render really high-quality visuals on a PC and then deliver those to the HoloLens gives a new experience—people haven’t had that before,” says Vance.<br /> <img alt="Spotlight_Apollo11_HololensBlog_Body_Image_7.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fone-giant-step-for-mixed-reality-visuals-with-unreal-engine-and-hololens-2%2FSpotlight_Apollo11_HololensBlog_Body_Image_7-1640x900-93a50706abba6820cf13513af74ea53a3e84363f.jpg" style="height:auto; width:auto" /><br /> Holographic Remoting streams holographic content from a PC to Microsoft HoloLens devices in real time, using a Wi-Fi connection. “Not being dependent on native mobile hardware to generate your final images is huge,” says Wu.<br /> <br /> The interplay between the presenters and the holograms illustrates the difference between designing interactions for MR and VR. “In VR, you set up your ‘safe-zone’ and promise you’re only going to move around in that,” says Jason Bestimt, Lead Programmer on the project. “In MR, you are not bound in this way. In fact, moving around the space is how you get the most out of the experience.”<br /> <img alt="Spotlight_Apollo11_HololensBlog_Body_Image_1B.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fone-giant-step-for-mixed-reality-visuals-with-unreal-engine-and-hololens-2%2FSpotlight_Apollo11_HololensBlog_Body_Image_1B-1640x900-f3da86f162930400c439db3b158daaa6f7e72343.jpg" style="height:auto; width:auto" /> <h3>Addressing round-trip latency with reprojection</h3> To create the Apollo 11 demo, the team leveraged features that have already been used in games for many years. “The cool thing is, you can take standard Unreal networking and gameplay framework concepts—which are pretty well understood at this point—and use them to build a collaborative holographic experience with multiple people,” says Vance.<br /> <br /> Whenever you add a network connection into an XR system, there’s latency in streaming the tracking data. One of the biggest challenges on the project was ensuring the tracking systems stayed aligned despite at least 60 milliseconds of round-trip latency between the PC and the HoloLens devices. “The world ends up being behind where your head actually is,” explains Vance.
“Even if you're trying to stand extremely still, your head moves a little bit, and you'll notice that.”<br /> <br /> To address this, Microsoft integrated its <a href="https://docs.microsoft.com/en-us/windows/mixed-reality/hologram-stability#reprojection" target="_blank">reprojection technology</a> into the remoting layer—a standard approach for dealing with latency issues. Reprojection is a sophisticated hardware-assisted holographic stabilization technique that takes into account motion and changes to the point of view as the scene animates and the user moves their head. Applying reprojection techniques ensured all parts of the system could communicate and agree on where a point in the virtual world corresponded to a point in the physical world. <h3>Experimenting with mixed reality interactions</h3> After the live event proved the HoloLens 2-to-Unreal Engine setup viable, the team created a version of the demo for public release. “We wanted to repackage it as something a bit simpler, so that anybody could just go into deploying the single-user experience,” says Antoine. The Apollo 11 demo is <a href="https://www.unrealengine.com/marketplace/en-US/product/missionar" target="_blank">available for download</a> on the Unreal Engine Marketplace for free.<br /> <img alt="Spotlight_Apollo11_HololensBlog_Body_Image_5.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fone-giant-step-for-mixed-reality-visuals-with-unreal-engine-and-hololens-2%2FSpotlight_Apollo11_HololensBlog_Body_Image_5-1640x900-bca9f8e40fe36cede622a34c0fa4243ffdc79dac.jpg" style="height:auto; width:auto" /><br /> While creating this public version, the team discovered much about which types of mixed reality interactions are most successful. “At first, we created plenty of insane interactions—Iron Man-style,” says Simone Lombardo, Tech Artist on the project. “But while we thought these were really fun and intuitive to use, that wasn’t the case for everybody.”<br /> <br /> Users new to mixed reality found the more complicated interactions difficult to understand, with many triggering interactions at the wrong time. The easiest interactions proved to be a simple grab or touch, because these mirror real-world interactions.<br /> <br /> Based on this finding, the Apollo 11 demo leverages straightforward touch movements for interaction. “We removed all the ‘complex’ interactions such as ‘double pinch’ not only because demonstrators were accidentally triggering them, but also because we ended up getting a lot of unintentional positives when the users’ hands were just out of range,” explains Bestimt. “Many users at rest have their thumbs and index fingers next to each other, creating a ‘pinch’ posture. As they raised their hand to begin interacting, it would automatically detect an unintentional pinch.”<br /> <br /> Other methods of interaction—such as using menus—also proved detrimental to the experience, resulting in the user unconsciously moving around the object far less.<br /> <br /> The team also found novel ways to create a user experience that bridged the gap between the virtual and physical worlds. Many users’ natural instinct upon encountering a hologram is to try to touch it, which results in their fingers going through the hologram. “We added a distance field effect that made contact with real fingers imprint the holograms with a bluish glow,” says Lombardo.
“This creates a connection between the real and unreal.”<br /> <img alt="Spotlight_Apollo11_HololensBlog_Body_Image_6.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fone-giant-step-for-mixed-reality-visuals-with-unreal-engine-and-hololens-2%2FSpotlight_Apollo11_HololensBlog_Body_Image_6-1640x900-3b912f667165d7143afaea3a8aaee7c0363e12c5.jpg" style="height:auto; width:auto" /><br /> The team has since been working on Azure Spatial Anchors support for the HoloLens 2, which is now available in <a href="https://docs.unrealengine.com/en-US/Support/Builds/ReleaseNotes/4_25/index.html#new:hololens2improvements" target="_blank">Unreal Engine 4.25</a>. Spatial anchors allow holograms to persist in real-world space between sessions. “That makes it a lot easier to get everything to align,” explains Vance. “So it should be relatively simple to reproduce the capabilities we demonstrated in our networked stage demo, with multiple people all sharing the same space.”<br /> <br /> Since the launch of Microsoft HoloLens 2, holographic computing has been predicted to have a seismic impact on industries ranging from retail to civil engineering. The setup showcased in the Apollo 11 demo represents a huge progression in the quality of visuals and interactions that will take us there.<br /> <img alt="Spotlight_Apollo11_HololensBlog_Body_Image_8.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fone-giant-step-for-mixed-reality-visuals-with-unreal-engine-and-hololens-2%2FSpotlight_Apollo11_HololensBlog_Body_Image_8-1640x900-be39f28d413cbad95dd925c7e2fb932b2a788469.jpg" style="height:auto; width:auto" /><br /> <br /> Want to create your own photorealistic mixed reality experiences? <a href="https://www.unrealengine.com/en-US/" target="_blank">Download Unreal Engine</a> for free today.<br /> <br /> <em>Wed, 29 Apr 2020 | <a href="https://www.unrealengine.com/spotlights/mixed-reality-takes-giant-leap-forward-with-unreal-engine-and-hololens-2" target="_blank">Original post</a></em> <hr /> <h2><strong>nDreams shares lessons learned from developing innovative VR shooter Phantom: Covert Ops</strong></h2> nDreams Game Director Lewis Brundish and Technical Director Grant Bolton talk about how they developed a stealth VR shooter that takes place entirely from a kayak.<br /> <br /> <em><a href="https://www.ndreams.com/titles/phantom/" target="_blank">Phantom: Covert Ops</a></em> is one of the most interesting games in development. Not only is it a VR stealth shooter set during the Cold War, but players are seated inside a virtual kayak the entire time. Developer nDreams started prototyping the concept as a way for players to naturally explore in VR while avoiding simulation sickness. Their experiments paid dividends, with numerous publications praising the title. Shacknews awarded <em>Phantom: Covert Ops</em> its <a href="https://www.shacknews.com/article/112638/game-critics-awards-best-of-e3-2019-winners-announced" target="_blank">best VR game of E3 award</a>, and we liked it so much that we honored it with a <a href="https://www.unrealengine.com/events/e32019/unreal-e3-awards-2019---most-engaging" target="_blank">most engaging game of E3 nomination</a>.
<br /> <br /> To gain insight into how the UK-based studio is designing one of the most anticipated VR titles on the horizon, we reached out to nDreams Game Director Lewis Brundish and Technical Director Grant Bolton. The pair reveal the painstaking steps it took to iterate on the kayaking mechanics so that they would feel realistic yet accessible to players who have never kayaked before. They elaborate on how they designed gameplay that would be highly tactile and physical and leverage VR’s strengths. The duo also talk about how they implemented a slick, minimal UI designed for VR, and touch upon how they optimized the game to work on Oculus’ standalone Quest VR headset. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/GelDZYAWl4g" width="100%"></iframe></div> <strong>There's certainly never been a stealth kayak shooter, let alone one in VR. Can you talk about how the inventive game originated?</strong><br /> <br /> <strong>Game Director Lewis Brundish:</strong> Our main goal at the start of the project was to come up with a movement system for VR that would be totally immersive and comfortable to play. The main strength of VR is the sense that you are really present in the game world, but this is always slightly compromised whenever you have to teleport to move around. We wanted to find a method of navigation that would be smooth and comfortable, allowing exploration without ever breaking that sense of immersion and tactile control. One of the early suggestions was putting the player in a boat, and after a few prototypes, we knew that we were onto something.<br /> <br /> <strong>At what point during development did nDreams feel like they had a compelling gameplay loop?</strong><br /> <br /> <strong>Brundish:</strong> We were convinced by the idea of our kayak-movement system very early on during prototyping, but what we weren’t sure about was how well this would work with the player in the boat. Our earliest test map just had a single guard on a low bridge crossing a river – the player had no weapons at this point, and their only option was to wait below the bridge for the guard to look away before paddling on. Something about this simple setup worked far better than we had anticipated; the low angle that the player viewed the world from made them feel naturally sneaky, the fact that the guard and player don’t inhabit the same space made the river feel like an inventive hidden route, and the wait directly beneath the guard felt tense and suspenseful… even the movements you make with your arms while paddling describe an exaggerated sneaking gesture. Obviously, there was a lot of work ahead of us at this point to flesh out and balance the gameplay loop and combat.
Those systems took several months to nail down, but this early test convinced us that the combination of stealth and kayaking was going to work.<br /> <img alt="DeveloperInterview_Phantom_01.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Fndreams-shares-lessons-learned-from-developing-innovative-vr-shooter-phantom-covert-ops%2FDeveloperInterview_Phantom_01-1080x608-a20f1d013c75946a292e8bab7987a998cc35ba7b.jpg" style="height:auto; width:auto" /><br /> <strong>Considering military kayaks are real, how did the studio balance making the experience realistic versus fun?</strong><br /> <br /> <strong>Brundish: </strong>Something that really helped us with this early on was listening to a diverse range of voices and opinions across the team. We have several team members who have real-life kayaking experience and were very insistent on the gameplay feeling realistic, while others had no frame of reference at all and wanted the game to control in an intuitive, arcade-style way. We considered both of these groups to be equally important and went through countless iterations of the controls and balance, trying to get it right for both crowds. For a long time, we thought that we were going to have to offer two separate control schemes, but we eventually found a way to combine the requirements of both groups into a system that is accessible and intuitive for newcomers, but with the depth and nuance of a simulation beneath it. Getting this right was one of the biggest challenges we faced and took almost the entire length of the project.<br /> <br /> <strong>The game takes place in the Black Sea region in 1991. Can you establish why that location and time period were chosen?</strong><br /> <br /> <strong>Brundish: </strong>Since we wanted the player to use weapons and equipment that were analog and tactile, going too modern seemed out of the question; however, we didn’t want to go so far back that some of the stealth gadgets would seem unrealistic. The early ’90s felt like the perfect time period. It also provided the military equipment and tone we were looking for. The end of the Cold War was a perfect backdrop for our military espionage narrative.<br /> <br /> Regarding the location, it was important that our environments supported the core mechanics of our gameplay. The levels would all be waterlogged, with military equipment readily accessible to someone in a boat – a naval installation made the most sense, and we incorporated the idea of it being previously abandoned to allow for more variation in theming and potential routes for the player. We researched the historical and political significance of naval bases throughout the Cold War, and the Black Sea leaped out as an evocative and appropriate setting.<br /> <img alt="DeveloperInterview_Phantom_03.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Fndreams-shares-lessons-learned-from-developing-innovative-vr-shooter-phantom-covert-ops%2FDeveloperInterview_Phantom_03-1080x608-7c9b00608c33f0232df0c8126c3e289e6e7a6683.jpg" style="height:auto; width:auto" /><br /> <strong>Considering players can take a stealthy approach or go in with guns blazing, how did you balance gameplay so that both play-styles would be viable?</strong><br /> <br /> <strong>Brundish:</strong> Initially, we intended to balance the game around a more combat-heavy approach, but we found that this made the stealth gameplay too easy to bypass entirely.
Making the combat too difficult, on the other hand, meant that it wasn’t a viable approach, and we wanted the player to have as many options as possible.<br /> <br /> We settled on a balance that rewards patience and planning – combat is viable as long as it’s planned and executed cleanly. Once you have scouted an area and located all of the enemies and obstacles, you can formulate a plan of attack for the order you will take them out in, or think about how you can avoid them entirely. If you go charging into an area without care, you will likely be taken out very quickly.<br /> <br /> <strong>Can you delve into how you designed the game's various guns and refined <em>Phantom: Covert Ops</em>' shooting and reloading mechanics for VR?</strong><br /> <br /> <strong>Technical Director Grant Bolton: </strong>Lewis described an aspiration for “movie realism,” where the player can act naturally but isn’t punished with overly fiddly or cumbersome detail. In <em>Phantom</em>, you don’t reload with a button-press; you have to reach for a new magazine from your pouch and place it in the weapon. We have balanced the detection of these actions to be generous and satisfying. Our goal is to quickly teach each player how to reload so that it feels instinctive to them, as if they were a trained Phantom operative.<br /> <br /> We also studied the way real players use each weapon and adapted the software in subtle ways to accommodate them – for example, the pistol can be operated one-handed, but we found many people use a second hand to steady their grip. Thus, we added a hand animation to the game so that the avatar visibly cups the pistol handle with their off-hand to mirror the player’s real-life action.<br /> <img alt="DeveloperInterview_Phantom_05.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Fndreams-shares-lessons-learned-from-developing-innovative-vr-shooter-phantom-covert-ops%2FDeveloperInterview_Phantom_05-1080x608-05589fc14e635a185e3bcd8987ad22b596438a60.jpg" style="height:auto; width:auto" /><br /> <strong>Can you explain your approach to designing the game's environments to facilitate varied tactical scenarios?</strong><br /> <br /> <strong>Brundish:</strong> It was very important to us that the player always feel an advantage in the kayak; we didn’t want them to feel like it would ever be a better idea to get out. As such, the environments are designed in a way that presents more options than the enemies walking on land have access to. To ramp up the challenge and variety as the game progresses, we introduce a number of new enemies, interactions, and objectives.<br /> <br /> Environmentally, we vary from wide-open spaces to claustrophobic tunnels and everything in between. In a game where the goal is to go unnoticed, minor differences, from the number of enemies in an area down to the directions they face, can have huge implications for how a scenario will play out. The narrative also introduces unique beats into the gameplay, such as objectives that force the player to raise alarms, or a section where the player can no longer use their weapons.<br /> <br /> <strong>People have praised <em>Phantom: Covert Ops</em> for how realistic the kayaking feels, with players able to use their paddle to push off walls and lean in a direction to sharpen a turn.
How much experimentation and iteration did the studio have to do to ensure this aspect of the game felt good?</strong><br /> <br /> <strong>Bolton: </strong>A lot of experimentation! There were many factors involved – we needed the game to feel familiar to seasoned kayakers but be accessible and comfortable for extended periods of play. During the early stages of development, we tried very “gamey” models with discrete controls – these were very reliable and comfortable but required us to teach players various abstract button presses instead of letting them act naturally. We also tried full simulations, where the player’s every movement and paddle-stroke had a realistic effect on the boat, but this was difficult to teach and could cause comfort issues. It was also much more tiring – making broad sweeping strokes without the resistance of the water takes a toll on shoulders!<br /> <br /> The solution we arrived at is a bit like the “fly-by-wire” concept modern jet fighters use; we analyze the player’s inputs – how they’re moving their paddle and body – and translate these into the best physics forces to comfortably move the boat in the way they intended.<br /> <br /> <a href="https://docs.unrealengine.com/Gameplay/Tools/VisualLogger/index.html" target="_blank">Unreal Engine’s visual logging system</a> was incredibly helpful during this process, as we could record the paddle strokes of playtesters and play them back in the editor, zooming in to see exactly how they were moving.<br /> <img alt="DeveloperInterview_Phantom_06.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Fndreams-shares-lessons-learned-from-developing-innovative-vr-shooter-phantom-covert-ops%2FDeveloperInterview_Phantom_06-1080x608-ecc4f32b170135323b73723b1d76599cb72b8ec1.jpg" style="height:auto; width:auto" /><br /> <strong>One of VR's greatest strengths is that it can provide gamers unparalleled freedom to reach out and interact with the world. Can you delve into how the studio leveraged VR to make the game more physically immersive?</strong><br /> <br /> <strong>Bolton:</strong> Right from our earliest prototypes, we knew interaction with the paddle would be key. Having it ripple and splash the water was an obvious but important feature, but we ended up needing more subtle systems as well. For example, there's no solid paddle pole between the player's hands in the real world, which could lead to a disconnect with the avatar. We account for this by maintaining the avatar’s grip on the paddle even if it were to be wrenched away by a strong force. We also constrained movements that would intersect the avatar and the paddle.<br /> <br /> Weapons and equipment are another important area; we tried to make the larger guns feel heavy if held with just one hand, filtering movement and rotation slightly per object. A second hand can be used for stabilization, reducing the effect. Magazines must be slotted into the weapon but can also be thrown to create a distraction. We tried to support actions that players would naturally try, to make the game feel more natural and immersive.<br /> <br /> Throughout <em>Phantom</em>'s missions, there are areas where players can rip off panels, pull levers and shutters, pull out plugs, and more. To make these feel substantial and physical, we combine animation, audio, and haptic cues with kinematic tricks. We've learned that every real-world movement a player performs should have an effect in the software; this is the key to immersion.
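As a purely illustrative sketch of that fly-by-wire idea (not nDreams' code; the members HullMesh, ForcePerSpeed, and MaxPaddleForce are hypothetical), the input-to-force translation, instrumented with Unreal's Visual Logger so each stroke can be replayed in the editor, might look like this:

```cpp
// Illustrative sketch only (hypothetical members): translate raw paddle
// motion into a clamped propulsion force instead of a raw water simulation,
// and record each stroke with the Visual Logger for editor playback.
#include "GameFramework/Pawn.h"
#include "VisualLogger/VisualLogger.h"

DEFINE_LOG_CATEGORY_STATIC(LogPaddle, Log, All);

void AKayakPawn::ApplyStroke(const FVector& BladeStart, const FVector& BladeEnd, float DeltaTime)
{
    // "Fly-by-wire": infer the intended stroke from controller motion...
    const FVector StrokeDelta = BladeEnd - BladeStart;
    const float StrokeSpeed = StrokeDelta.Size() / DeltaTime;

    // ...then apply a clamped force opposite the blade's travel, rather
    // than simulating real water resistance against the paddle.
    const FVector Force = -StrokeDelta.GetSafeNormal() *
        FMath::Min(StrokeSpeed * ForcePerSpeed, MaxPaddleForce);
    HullMesh->AddForce(Force);

    // Draw the stroke into the visual log so it can be scrubbed later.
    UE_VLOG_SEGMENT(this, LogPaddle, Log, BladeStart, BladeEnd, FColor::Cyan,
                    TEXT("Stroke speed: %.0f cm/s"), StrokeSpeed);
}
```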
However, this doesn't necessarily mean true 1:1 movement. We can trick the brain (to an extent) by filtering real-world movements before they appear in the game. For example, to represent pulling a rusty lever, we might translate the first few inches of the pull into a much smaller movement, with an accompanying creaking sound. After crossing some threshold, we play a “crack” sound and trigger dust particles, then allow the avatar's hand to catch up with the player's real-world position. This provides the feeling of wrenching a stiff object until it gives.<br /> <br /> This system also allows the player to pull themselves towards or away from things they can grab in the environment, which is useful for getting up close to ammo pickups, control panels, and more. When the player grips a piece of the environment, their virtual hand locks in place, and we begin to map the movement of their controller to simulated pressure in-game. This pressure is applied as a physical force to the avatar and boat, such that moving the controller away from the body pushes the kayak away from the environment, and pulling draws it closer.
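Pulling those tricks together, a hypothetical sketch (again, not nDreams' actual code) of the "stiff lever" filtering and catch-up could reduce to:

```cpp
// Hypothetical sketch: map real hand movement onto a damped virtual
// movement until a "break" threshold is crossed, then track the real hand
// 1:1 again, selling the feel of wrenching a stiff lever until it gives.
#include "CoreMinimal.h"

FVector FilterLeverPull(const FVector& GrabPoint, const FVector& RealHandPos,
                        bool& bInOutBroken, float BreakDistance, float StiffScale)
{
    const FVector PullOffset = RealHandPos - GrabPoint;
    if (!bInOutBroken && PullOffset.Size() > BreakDistance)
    {
        bInOutBroken = true; // Trigger the "crack" sound and dust here.
    }
    // Before the break: show only a fraction of the real movement (with a
    // creaking sound elsewhere). After: the virtual hand catches up.
    const float Scale = bInOutBroken ? 1.0f : StiffScale;
    return GrabPoint + PullOffset * Scale;
}
```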
<img alt="DeveloperInterview_Phantom_04.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Fndreams-shares-lessons-learned-from-developing-innovative-vr-shooter-phantom-covert-ops%2FDeveloperInterview_Phantom_04-1080x608-da027531ec0f0ba0a2cb154e0d0ea516b4b74270.jpg" style="height:auto; width:auto" /><br /> <strong>Why was Unreal Engine a good fit for the game?</strong><br /> <br /> <strong>Bolton:</strong> Unreal is fantastic for multi-platform VR development, as we were able to quickly iterate using the editor preview mode and a PC VR headset. Within seconds of making a data change, we can be playing the game in VR, with confidence that the gameplay systems will translate well to other platforms.<br /> <br /> The sophisticated asset pipeline and lighting tools are favored by our artists, and Unreal’s profiling and scalable rendering features have been essential for delivering on standalone VR headsets like the Oculus Quest.<br /> <br /> <strong>The game has no traditional in-game HUD, just slick motion graphics that relay mission objectives to players as they float downstream. Can you share how the team designed the VR title's minimal UI?</strong><br /> <br /> <strong>Brundish:</strong> As Grant mentioned, rather than treating the experience as a simulation, we wanted players to feel like they were inside a movie. As such, the general rule was that giving the player on-screen information was fine as long as it felt cinematic and immersive. We looked at opening film titles to inspire the style of our objective text, which appears in-world and doesn’t interrupt the flow of gameplay. When we were trying to draw attention to ammo clips in the equipment bags, we didn’t want to draw highlights around them, as this didn’t feel cinematic – eventually, we settled on dramatically spotlighting the relevant ammo whenever you are holding a gun, as dramatic lighting choices are consistent with a cinematic visual language.<br /> <br /> Another discovery we made with UI elements in VR is that they are less obtrusive if the player opts into them intentionally. <em>Phantom</em> allows you to tag enemies using a viewfinder (as is the case in many stealth games). Originally, we had these tagging effects visible on the enemies whenever the player looked through the viewfinder, but we felt that the additional UI elements looked too distracting. As soon as we added the requirement to take a photo of the enemies first, this feeling disappeared – we found that more information on screen feels appropriate if the player has specifically requested it first.<br /> <br /> <strong>What was the most challenging aspect of designing the game, and how did you overcome it?</strong><br /> <br /> <strong>Bolton: </strong>The large explorable environments of <em>Phantom</em> were the most challenging aspect to achieve on the Oculus Quest. The Quest is a very capable piece of hardware, but as a standalone device, it requires extra care and attention around performance. Our world-builders were meticulous with scene composition and encounter design to make the best use of the device, ensuring that sightlines, dynamic objects, destructible lights, and enemy patrol paths wouldn’t result in any view or scenario being too resource-intensive to render. We also developed several bespoke tools to help with optimization and profiling, as well as relying heavily on existing Unreal Engine features.<br /> <img alt="DeveloperInterview_Phantom_02.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fdeveloper-interviews%2Fndreams-shares-lessons-learned-from-developing-innovative-vr-shooter-phantom-covert-ops%2FDeveloperInterview_Phantom_02-1080x608-b3db7285cbb858fe67ff2d678f6b1e7747f47a3f.jpg" style="height:auto; width:auto" /><br /> <strong>nDreams has developed many VR games. What has the studio learned about the burgeoning medium thus far that you're building upon for <em>Phantom: Covert Ops</em>' development?</strong><br /> <br /> <strong>Bolton: </strong>At nDreams, we talk a lot about the power of VR for immersion and try to place the player in a believable world that behaves as they’d expect. We try to focus on the space immediately around the player first, as this is where detail and depth perception have the greatest impact. With <em>Phantom</em>, we’ve placed the player in a kayak and loaded it with powerful military hardware, so they begin to explore their equipment and locomotion before they’ve even paddled out into the wider game world.<br /> <br /> <strong>Does the studio have any game-design advice for aspiring VR developers?</strong><br /> <br /> <strong>Brundish: </strong>Don’t be afraid to try something new. VR is still in its infancy, and many of the tropes and genres that we are familiar with from traditional gaming don’t translate directly. I think we need to explore new solutions and ideas that wouldn’t necessarily make sense in other mediums but might make perfect sense in VR.
It’s a scary proposition to develop a game that doesn’t sound familiar to people or match expectations carried over from other platforms, but I hope we’ve shown with <em>Phantom</em> that unique ideas can be worth following through on!<br /> <br /> <em>Jimmy Thang | Tue, 28 Apr 2020 | <a href="https://www.unrealengine.com/developer-interviews/ndreams-shares-lessons-learned-from-developing-innovative-vr-shooter-phantom-covert-ops" target="_blank">Original post</a></em> <hr /> <h2><strong>Designing sets and action sequences on “His Dark Materials” with virtual production</strong></h2> Painting Practice used UE4 to help directors and producers plan vast sets and key action sequences on “His Dark Materials.”<br /> <br /> Adapted from the bestselling novels by Philip Pullman, <a href="https://www.imdb.com/title/tt5607976/?ref_=fn_al_tt_1" target="_blank">His Dark Materials</a> is an epic fantasy drama TV series produced by <a href="https://bad-wolf.com/" target="_blank">Bad Wolf Studios</a> and <a href="https://warnerbros.fandom.com/wiki/New_Line_Cinema" target="_blank">New Line Cinema</a> for the BBC and HBO. It’s set mainly in a parallel universe, a world of mythical cities and vast landscapes inhabited by everything from witches to armored polar bears.<br /> <br /> To help design and visualize the enormous sets and complex action sequences required, the production team turned to <a href="https://paintingpractice.com/" target="_blank">Painting Practice</a>, an award-winning design studio whose services include—among other things—previs, animation, VFX, production design, concept art, and digital matte painting.<br /> <br /> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/CowUqRrKzp4" width="100%"></iframe></div> <br /> The team at Painting Practice, who had previously dabbled in real-time rendering on a couple of other shows, started implementing Unreal Engine at the earliest stages of preproduction for <em>His Dark Materials</em>.<br /> <br /> One of the challenging sets they needed to design was Trollesund, the main port of Lapland in the world of Lyra Belacqua, the story’s heroine. For this, the team scanned a quarry using photogrammetry drones and used the result to create a high-resolution model to import into Unreal Engine. With a few additional models and some textures, the real-time mock-up was complete.<br /> <img alt="Spotlight_HisDarkMaterials_Blog_Body_Image_4.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fdesigning-sets-and-action-sequences-on-his-dark-materials-with-virtual-production%2FSpotlight_HisDarkMaterials_Blog_Body_Image_4-1640x900-118e39c9bfcdba7200fdb21d044e9fde4ab03bd2.jpg" style="height:auto; width:auto" /><br /> The previs enabled the director and producers to explore the conceptual environment and make key decisions on the scale of the buildings and the set. “Unreal was a great way of getting people to physically be in a space that was not even conceived,” says Dan May, Creative Director at Painting Practice and one of the company’s co-founders.<br /> <br /> Joel Collins, Executive Producer and Production Designer on the show, elaborates.
“These are very expensive sets, and on a show like <em>His Dark Materials</em>, where you're building an entire town for one episode, you've got to make really good decisions that are practical and financial and creative, and they're all combined,” he says.&nbsp;<br /> <br /> “What Painting Practice's team created gave us an ability to make critical judgments that meant that we could get it right on the edge of affordable and absolutely on the edge of shootable, in the sense of there was no fat on what we built.”<br /> <img alt="Spotlight_HisDarkMaterials_Blog_Body_Image_3.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fdesigning-sets-and-action-sequences-on-his-dark-materials-with-virtual-production%2FSpotlight_HisDarkMaterials_Blog_Body_Image_3-1640x900-6c703caa7ed591d8d5794512635c00d5e49033e4.jpg" style="height:auto; width:auto" /><br /> Another critical piece of previs was for a fight scene between two of the story’s armored polar bears. Episode 7,<em> The Fight to the Death</em>, involves mortal combat between Lyra’s friend Iorek Byrnison and the usurping king Iofur Raknison. The episode’s director, Jamie Childs, explains that being able to experiment with a virtual camera on a set that did not yet exist enabled him to explore the best camera angles to maximize the scene’s believability. &nbsp;<br /> <br /> “I wanted to walk around and look at the bear fight, because I wanted that fight to feel like it was a real fight being shot, not a CG fight,” he says. “I could actually do that with Unreal, I could move around that room physically, get the camera in the position I wanted to, and see on my camera monitor what was going on, and record those shots and cut it together—and that was really freeing for a director.”&nbsp;<br /> <img alt="Spotlight_HisDarkMaterials_Blog_Body_Image_5.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fdesigning-sets-and-action-sequences-on-his-dark-materials-with-virtual-production%2FSpotlight_HisDarkMaterials_Blog_Body_Image_5-1640x900-a7ba0c9cc04c7f14abe424e506b03b1fcd7f8293.jpg" style="height:auto; width:auto" /><br /> Childs goes on to explain how the sensation reminded him of his early days learning his craft, when he would go out and just film things to try them out, letting his creativity find the essence of the story.<br /> <br /> “I didn't really think that previs would help me do that side of things,” he says. “I thought it might help me with technically where to put a camera and stuff like that, but it actually made me go ‘Right, I don’t need to worry about any of the noise; I can actually just go and create something.’ ”<br /> <br /> Real-time technology is already moving beyond preproduction into some elements of full production, and Collins sees a time when it will span the entire production process.<br /> <br /> “I'm now really excited by the next 10 years because of this Unreal thing, in terms of it will just almost be start to finish,” he says. “Maybe all production will end up like that.”<br /> <br /> Painting Practice is also looking to increase the reach of real-time technology within the industry by launching Plan V, a new software app based on Unreal Engine. 
Painting Practice is also looking to increase the reach of real-time technology within the industry by launching Plan V, a new software app based on Unreal Engine. Plan V is a bespoke virtual reality studio environment that enables artists, producers, directors, and many other members of a production crew to experiment with lenses, storyboards, previs, and more.<br /> <img alt="Spotlight_HisDarkMaterials_Blog_Body_Image_7.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fspotlights%2Fdesigning-sets-and-action-sequences-on-his-dark-materials-with-virtual-production%2FSpotlight_HisDarkMaterials_Blog_Body_Image_7-1640x900-c3d9e7863255c3acc0602c657763efbecf91fa2c.jpg" style="height:auto; width:auto" /><br /> With a simplified workflow and a user-friendly interface, Plan V is designed to enable less technically inclined users to interact in a high-quality 3D environment and design worlds, sets, and scenes for films, television, advertising, or games. It supports both local and remote collaboration.<br /> <br /> A first version of the free app will be available to download on May 1 via Steam and on the <a href="https://paintingpractice.com/" target="_blank">Painting Practice website</a>.<br /> &nbsp; <hr />Want to make the most of your production budget? <a href="https://www.unrealengine.com/en-US/" target="_blank">Download Unreal Engine</a> for free today.<br /> <br /> Tags: Film & Television, His Dark Materials, Virtual Production, Previs, Painting Practice, Plan V | By Ben Lumsden | Mon, 27 Apr 2020 12:30:00 GMT | https://www.unrealengine.com/spotlights/designing-sets-and-action-sequences-on-his-dark-materials-with-virtual-production

Twinmotion Community Challenge #3: “Green space in an urban jungle”
https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Ftwinmotion-community-challenge-3-green-space-in-an-urban-jungle%2FNEWS_TM-Community-Challenge-03_Thumbnail-375x275-6d65fdb45ef0dce337037b07fef28ec179accc8b.jpg
Our third contest is looking for the best architectural visualizations created in Twinmotion, this time with a park-themed twist. Find out more here.

Spring is in the air, and that means it’s time for our next Twinmotion Community Challenge! The previous competition took place back in the fall, and we’d like to congratulate all those who took part, with a special mention for our winner, Paweł Rymsza. You can find out how Paweł created the winning entry “House at the Waterfall” <a href="https://www.unrealengine.com/en-US/events/house-at-the-waterfall-breakdown-the-winning-entry-of-twinmotion-community-challenge-2" target="_blank">in his breakdown article</a>.<br /> <br /> Want to win some Epic prizes? Enter our new contest and share your art with the world. This time around we’re looking for your best static images of a green space in an urban jungle.<br /> <br /> A cash prize of $500 is up for grabs for the most compelling depiction of a park in an urban development. All entries must be created in Twinmotion 2020. Those who don’t have the full license can use the free trial, and students and teachers can use Twinmotion for education.<br /> <br /> This is a great opportunity to put the enhanced vegetation system in Twinmotion 2020 through its paces.
Fill your park with the new high-resolution tree and vegetation assets, test out the new grass customization options, and explore the features that make propagating vegetation throughout a scene easier.<br /> <br /> All entries are to be submitted via social media, and the best will be showcased on Twinmotion’s own social media channels.<br /> <br /> Check out the challenge details below and get your entry posted by the deadline!<br /> <img alt="NEWS_TM-Community-Challenge-03_Blog_Body_Image_1.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Ftwinmotion-community-challenge-3-green-space-in-an-urban-jungle%2FNEWS_TM-Community-Challenge-03_Blog_Body_Image_1-1640x900-6cd5f3bf449f8622141bf39d8b880d884b9c710e.jpg" style="height:auto; width:auto" /> <h2><strong>Challenge #3: “Green space in an urban jungle”</strong></h2> Parks in built-up areas are oases in an ocean of concrete. For this challenge, we’re looking for the most impressive image of a park in an urban development.<br /> <br /> Entrants will need to upload an image of at least 2K resolution created using Twinmotion 2020. The deadline for entry is May 21, 2020 at 11:59 PM GMT+2.<br /> <br /> The winning entry will receive a cash prize of $500 plus Epic Games swag.<br /> <br /> See below for important info on submissions and rules.<br /> <img alt="NEWS_TM-Community-Challenge-03_Blog_Body_Image_2.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Ftwinmotion-community-challenge-3-green-space-in-an-urban-jungle%2FNEWS_TM-Community-Challenge-03_Blog_Body_Image_2-1640x900-df4d18a7cda1b2bf3d62c52d38456cc73f10db7d.jpg" style="height:auto; width:auto" /> <h2><strong>Submissions</strong></h2> Your submission should be a 2K image created with Twinmotion 2020. Entries created using the trial and education versions of the software will be accepted. Make your submission by May 21, 2020 through social media, with the hashtag #TwinmotionChallenge in the text.<br /> <br /> There are several ways to enter:<br /> &nbsp; <ul style="margin-left:40px"> <li>Post the image in the <a href="https://www.facebook.com/groups/twinmotion.community/" target="_blank">Facebook Twinmotion Community Group</a></li> <li>Post the entry on your Twitter or LinkedIn account with the tag @Twinmotion</li> <li>Post the entry on your Instagram account with the tag @twinmotionofficial</li> </ul> <br /> Your entry can be posted via any one or more of the methods above. Just be sure to include #TwinmotionChallenge in the text! <h2><strong>Rules</strong></h2> <a href="https://epicgames.ent.box.com/s/wzuor59xqjvminklwg3jm3lnsapph9nz" target="_blank">Check out the official rules for this challenge</a>; they vary slightly each time. However, there’s one important rule that’s the same for every challenge:<br /> <br /> <strong>The items in your image cannot include any trademarked branding or logos.</strong> For example, if a laptop or a branded refrigerator appears in your image, the item itself can stay in the scene, but no brand logo can be visible anywhere in the image.<br /> <br /> Unfortunately, we have to disqualify any image with such branding or logos from the competition.
We don’t want this to happen to you, so be sure to strip out all logos and branding before posting your entry.<br /> <br /> The rules for this challenge are also available in <a href="https://epicgames.ent.box.com/s/wzuor59xqjvminklwg3jm3lnsapph9nz/file/656279064868" target="_blank">Arabic</a>, <a href="https://epicgames.ent.box.com/s/wzuor59xqjvminklwg3jm3lnsapph9nz/file/656276413845" target="_blank">Chinese</a>, <a href="https://epicgames.ent.box.com/s/wzuor59xqjvminklwg3jm3lnsapph9nz/file/656271107252" target="_blank">French</a>, <a href="https://epicgames.ent.box.com/s/wzuor59xqjvminklwg3jm3lnsapph9nz/file/656283436022" target="_blank">German</a>, <a href="https://epicgames.ent.box.com/s/wzuor59xqjvminklwg3jm3lnsapph9nz/file/656285882029" target="_blank">Italian</a>, <a href="https://epicgames.ent.box.com/s/wzuor59xqjvminklwg3jm3lnsapph9nz/file/656281408140" target="_blank">Japanese</a>, <a href="https://epicgames.ent.box.com/s/wzuor59xqjvminklwg3jm3lnsapph9nz/file/656279058192" target="_blank">Korean</a>, <a href="https://epicgames.ent.box.com/s/wzuor59xqjvminklwg3jm3lnsapph9nz/file/656269230827" target="_blank">Polish</a>, <a href="https://epicgames.ent.box.com/s/wzuor59xqjvminklwg3jm3lnsapph9nz/file/656270465373" target="_blank">Portuguese</a>, <a href="https://epicgames.ent.box.com/s/wzuor59xqjvminklwg3jm3lnsapph9nz/file/656284320143" target="_blank">Russian</a>, Spanish (<a href="https://epicgames.ent.box.com/s/wzuor59xqjvminklwg3jm3lnsapph9nz/file/656280969649" target="_blank">EU</a>) (<a href="https://epicgames.ent.box.com/s/wzuor59xqjvminklwg3jm3lnsapph9nz/file/656279944357" target="_blank">LATAM</a>), and <a href="https://epicgames.ent.box.com/s/wzuor59xqjvminklwg3jm3lnsapph9nz/file/656283714826" target="_blank">Turkish</a>.<br /> <br /> We look forward to seeing your entries for the third Twinmotion Community Challenge. If you don’t have Twinmotion yet, you can <a href="https://www.unrealengine.com/en-US/twinmotion" target="_blank">download it here</a>.<br /> <br /> Tags: Architecture, Community, Design, Twinmotion, Twinmotion Community Challenge | Thu, 23 Apr 2020 13:00:00 GMT | https://www.unrealengine.com/blog/twinmotion-community-challenge-3-green-space-in-an-urban-jungle