Unreal Engine 4.23 released! From massive-scale physics and destruction to subtle ray-traced effects, Unreal Engine 4.23 continues to push the boundaries of cinematic quality and realism for real-time experiences. Meanwhile, our latest virtual production toolset is poised to change the art of filmmaking, while new developer-focused tools help you squeeze every ounce of performance out of your hardware.<h1 id="what'snew"><strong>What&#39;s New</strong></h1> <p>Thanks to our next-gen virtual production tools and enhanced real-time ray tracing, <strong>film and TV production is transformed</strong>. Now you can achieve final shots live on set, with LED walls powered by nDisplay that not only place real-world actors and props within UE4 environments, but also light and cast reflections onto them (Beta). We&#39;ve also added VR scouting tools (Beta), enhanced Live Link real-time data streaming, and the ability to remotely control UE4 from an iPad or other device (Beta). Ray tracing has received numerous enhancements to improve stability and performance, and to support additional material and geometry types—including landscape geometry, instanced static meshes, procedural meshes, and Niagara sprite particles.</p> <p>Unreal Engine lets you <strong>build realistic worlds without bounds</strong>. Fracture, shatter, and demolish massive-scale scenes at cinematic quality with unprecedented levels of artistic control using the new Chaos physics and destruction system. 
Paint stunning vistas for users to experience with runtime Virtual Texturing, non-destructive Landscape editing, and interactive Actor placement via the Foliage tool.</p> <p>We have optimized systems, provided new tools, and added features to help you <strong>do more for less</strong>. Virtual Texturing reduces texture memory overhead for light maps and detailed artist-created textures, and improves rendering performance for procedural and layered materials. Animation streaming enables more animations to be used by limiting the runtime memory impact to only those currently in use. Use Unreal Insights to collect, analyze, and visualize data on UE4 behavior for profiling, helping you understand engine performance from either live or pre-recorded sessions.</p> <p>This release includes 192 improvements submitted by the incredible community of Unreal Engine developers on GitHub! Thanks to each of these contributors to Unreal Engine 4.23:</p> <p>Doug Richardson "drichardson", Morva Krist&oacute;f "KristofMorva", Reece Dunham "RDIL", "projectgheist", Jorgen P. 
Tjerno "jorgenpt", Ondrej Hrusovsky "Skylonxe", Miguel Fernandez "muit", Geordie Hall "geordiemhall", Artem Umerov "umerov1999", Marat Radchenko "slonopotamus", "AgentOttsel", Eric Spevacek "Mouthlessbobcat", Danny de Bruijne “danskidb”, Serta&ccedil; Ogan “SertacOgan”, Trond Abusdal “trond”, Joe Best-Rotheray “cajoebestrotheray”, Nick Edwards “NEdwards-SumoDigital”, Marcel “Zaratusa”, Mark Whitty “Mosel3y”, “YuchenMei”, Branislav Grujic “grujicbr”, “Rei-halycon”, Michael Hills “MichaelHills”, Nick Pearson “Nick-Pearson”, “mastercoms”, Zhi Kang Shao “ZKShao”, Nick “eezstreet”, “temporalflux”, Vladislav Dmitrievich Turbanov “vladipus”, Daniel Marshall “SuperWig”, Brian Marshall “TurtleSimos”, Sergey Vikhirev “Bormor”, Robert Rouhani “Robmaister”, Maxime Griot “yamashi”, Igor Karatayev “yatagarasu25”, “Zeblote”, Hesham Wahba “druidsbane”, MoRunChang “MoRunChang2015”, S&eacute;bastien Rombauts “SRombauts”, JinWook Kim “zelon”, Riley Labrecque “rlabrecque”, Дмитрий “Yakim3396”, “DanMilwardSumo”, Wesley Barlow “Wesxdz”, Franco Pulido “Franco Pulido”, Kimmo Hernborg “KimmoHernborg”, John Dunbar “Volbard”, Michał Siejak “Nadrin”, kalle H&auml;m&auml;l&auml;inen “kallehamalainen”, “KaosSpectrum”, Evan Hart “ehartNV”, Skyler Clark “sclark39”, Thomas Miller “tmiv”, Stephen A. Imhoff “Clockwork-Muse”, David Payne “davidpayne-cv”, “CyberKatana”, “roidanton”, Milan &Scaron;ťastn&yacute; “aknarts”, Alex P-B “chozabu”, Marco Antonio Alvarez “surakin”, “Taikatou”, Doğa Can Yanıkoğlu “dyanikoglu”, “Kalmalyzer”, “phi16”, Mikhail Zakharov “zz77”, Paul Hampson "TBBle", “NextTurn”, “Punlord”, Robert Pr&ouml;pper “rproepp”, Yohann Martel “ymartel06”, Francis J. 
Sun “francisjsun”, Eric Wasylishen “ericwa”, Phillip Baxter “PhilBax”, Alan Liu “PicaroonX”, Mathias H&uuml;bscher “user37337”, Daisuke Ban “exceed-alae”, Brandon Wilson “Brandon-Wilson”, Marcin Gorzel “mgorzel”, “prolenorm”</p> <h1 id="majorfeatures"><strong>Major Features</strong></h1> <h2 id="new:chaos-destruction_beta_"><strong>New: Chaos - Destruction (Beta)</strong></h2> <p>Revealed in a <a href="" target="_blank">demo</a> at GDC 2019, Chaos is Unreal Engine&#39;s new high-performance physics and destruction system, available to preview in Beta form with the 4.23 release. With Chaos, users can achieve cinematic-quality visuals in real-time in scenes with massive-scale levels of destruction and unprecedented artist control over content creation.</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <div class="note"> <p>Chaos functionality in Unreal Engine 4.23 must be enabled and compiled using a source build. See <a href="" target="_blank">this guide</a> for instructions on enabling Chaos.</p> </div> <p>For more information on Chaos Destruction, refer to the Chaos Destruction documentation pages. We have also added a Chaos Destruction Demo sample to the Learn Tab in the launcher to demonstrate how to set up various types of simulations and effects.</p> <h3 id="geometrycollections"><strong>Geometry Collections</strong></h3> <p>Geometry Collections are a new type of asset in Unreal for destructible objects. They can be built from one or more Static Meshes, including those gathered together in Blueprints or even nested Blueprints. 
Geometry Collections let you choose what to simulate, and they also offer flexibility in how you organize and author your destruction.<img src="" /></p> <p style="text-align: center;"><em>Left - One Wall Section - 31 Geometry Collections; Right - Exploded view of Static Mesh parts</em></p> <h3 id="fracturing"><strong>Fracturing</strong></h3> <p>Once you have a Geometry Collection, you can break it into pieces using the Fracturing tools. You can fracture each part individually, or apply one pattern across multiple pieces. In addition to standard Voronoi fractures, you can use Radial fractures, Clustered Voronoi fractures, and Planar Cutting using noise to get more natural results.</p> <div style="text-align: center;"><img src="" /></div> <p style="text-align: center;"><em>Left - Original Geometry Collection; Center - Fracture Across Entire Mesh; Right - Sub-fracturing Only Large Pieces</em></p> <h3 id="clustering"><strong>Clustering</strong></h3> <p>With optimization in mind, Sub-fracturing allows you to control where to add complexity. Each time you sub-fracture, an extra Level is added to the Geometry Collection. The Chaos system keeps track of each subsequent Level and stores that information in a controllable grouping called a Cluster. Below is an example of a mesh where each fracture Level is combined into its own set of Clusters.</p> <div style="text-align: center;"><img src="" /><em>Left - Level 1: 6 Objects; Center - Level 3: 50 Objects; Right - Level 5: 513 Objects</em></div> <h3 id="connectiongraph"><strong>Connection Graph</strong></h3> <p>The Connection Graph is a lightweight connectivity map that represents a somewhat different paradigm for destruction simulation. 
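</p> <p>To make the idea concrete, here is a toy sketch, in Python rather than the actual Chaos API: pieces are graph nodes, strain breaks connections, and anything no longer reachable from an anchored piece is released to simulate dynamically.</p>

```python
from collections import deque

def release_dynamic_pieces(connections, anchors, broken_edges):
    """Toy connection-graph pass (not the Chaos API): after some edges
    break under strain, any piece no longer reachable from an anchored
    piece becomes dynamic and is free to simulate."""
    # Rebuild adjacency without the broken connections.
    alive = {n: set() for n in connections}
    for a, nbrs in connections.items():
        for b in nbrs:
            if frozenset((a, b)) not in broken_edges:
                alive[a].add(b)
    # Flood-fill from the anchors; unreached nodes go dynamic.
    seen, queue = set(anchors), deque(anchors)
    while queue:
        n = queue.popleft()
        for m in alive[n]:
            if m not in seen:
                seen.add(m)
                queue.append(m)
    return {n for n in connections if n not in seen}
```

<p>With three pieces chained to one anchor, breaking the middle connection releases only the far piece; everything still linked to the anchor stays put.</p> <p>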
In the image below, we have a few statically anchored pieces, but everything else in the scene is potentially dynamic.</p> <div style="text-align: center;"><img src="" /><em>Blue - Potential for Chaos; Yellow - Anchored Nodes</em></div> <p>Rather than swapping from kinematic to dynamic, the Chaos system applies strain, which in turn breaks connections, and Chaos destruction ensues. This is a great way to maximize interactivity, while retaining control over the number of active rigid bodies.</p> <h3 id="fields"><strong>Fields</strong></h3> <p>Fields are the way that you can directly interact with and control simulations. Fields can be used to control any attribute on any part of your Geometry Collection. Whether you want to vary the mass, make something static, make a corner more breakable than the middle, or apply a force, all of this can be controlled with Fields.</p> <h3 id="cachedsimulations"><strong>Cached Simulations</strong></h3> <p>With caching, high-fidelity simulations can be pre-cached and played back in real time, resulting in a kinematic Geometry Collection. This means that you can author large-scale destruction events and still allow interactivity with the player and environment.</p> <h3 id="niagaraintegration"><strong>Niagara Integration</strong></h3> <p>The Chaos system is a first-class citizen of UE4, and as such, lives alongside all other systems that simulate your world, including Niagara. Incorporating visual effects into your simulations can add a lot of depth and realism to the world. For example, when a building breaks apart, it generates a large amount of dust and smoke. 
To marry destruction and VFX, data from the physics system can be sent to Niagara when an object collides or breaks apart, and that data can be used to generate interesting secondary effects.</p> <h2 id="new:real-timeraytracingimprovements_beta_"><strong>New: Real-Time Ray Tracing Improvements (Beta)</strong></h2> <p>Ray Tracing support has received many optimizations and stability improvements in addition to several important new features.</p> <div><img src="" /></div> <h3 id="performanceandstability"><strong>Performance and Stability</strong></h3> <p>A major focus of this release has been improving the stability, performance, and quality of Ray Tracing features in Unreal Engine. This means:</p> <ul> <li>Expanded DirectX 12 Support for Unreal Engine as a whole</li> <li>Improved Denoiser quality for Ray Traced Features</li> <li>Increased Ray Traced Global Illumination (RTGI) quality</li> </ul> <h3 id="additionalgeometryandmaterialsupport"><strong>Additional Geometry and Material Support</strong></h3> <p>We now support additional geometry and material types, such as:</p> <ul> <li>Landscape Terrain (Measured Performance on a 2080Ti in KiteDemo: ~2ms geometry update time and ~500MB video memory)</li> <li>Hierarchical Instanced Static Meshes (HISM) and Instanced Static Meshes (ISM)</li> <li>Procedural Meshes</li> <li>Transmission with SubSurface Materials</li> <li>World Position Offset (WPO) support for Landscape and Skeletal Mesh geometries</li> </ul> <h3 id="multi-bouncereflectionfallback"><strong>Multi-Bounce Reflection Fallback</strong></h3> <p>We&#39;ve improved support for multi-bounce Ray Traced Reflections (RTR) by falling back to Reflection Captures in the scene. 
This means that intra-reflections (reflections inside of reflections) that would otherwise display black, either beyond the last traced bounce or past the max reflection distance you&#39;ve set, will fall back to these raster techniques instead.</p> <p>This can subtly improve the quality of reflections without tracing multiple ray-traced bounces, greatly increasing performance.</p> <div class="asyncgif"> <div style="text-align: center;"><img src="" /><em>1 - Single RTR Bounce; 2 - Two RTR Bounces; 3 - Single RTR Bounce with Reflection Capture Fallback for last bounce</em></div> </div> <h2 id="new:virtualtexturing_beta_"><strong>New: Virtual Texturing (Beta)</strong></h2> <p>With this release, Virtual Texturing beta support enables you to create and use large textures for a lower and more constant memory footprint at runtime.</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3 id="streamingvirtualtexturing"><strong>Streaming Virtual Texturing</strong></h3> <p>Streaming Virtual Texturing uses Virtual Texture assets to offer an alternative way to stream textures from disk compared to existing Mip-based streaming. Traditional Mip-based texture streaming works by performing offline analysis of Material UV usage; at runtime, it then decides which Mip levels of textures to load based on object visibility and camera distance. 
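</p> <p>As a rough illustration of the distance-driven part of that decision, a streamer can pick the Mip level whose resolution matches an object&#39;s on-screen coverage. The helper below is an illustrative approximation in Python, not the engine&#39;s actual heuristic.</p>

```python
import math

def select_mip(texture_size, screen_coverage_px):
    """Pick the Mip level whose resolution best matches the texture's
    on-screen coverage in pixels. Illustrative only: a real streamer
    also weighs UV density, field of view, and visibility."""
    max_mip = int(math.log2(texture_size))  # Mip 0 is full resolution
    if screen_coverage_px <= 0:
        return max_mip                      # off-screen: smallest Mip
    ratio = texture_size / screen_coverage_px
    mip = math.floor(math.log2(ratio)) if ratio > 1 else 0
    return min(max_mip, mip)
```

<p>A distant object covering 256 pixels needs only Mip 2 of a 1024-pixel texture resident, roughly 1/16 the memory of Mip 0.</p> <p>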
For Virtual Textures, all Mip levels are split into tiles of a small fixed size; the GPU then determines which Virtual Texture tiles are accessed by all visible pixels on the screen.</p> <p>Streaming Virtual Texturing can reduce texture memory overhead and increase performance when using very large textures (including Lightmaps and UDIM textures); however, sampling from a Virtual Texture is more expensive than sampling a regular texture.</p> <p>For full details, see the <a href="" target="_blank">Streaming Virtual Texturing</a> documentation.</p> <h3 id="runtimevirtualtexturing"><strong>Runtime Virtual Texturing</strong></h3> <p>Runtime Virtual Texturing uses a Runtime Virtual Texture asset with a volume placed in the level. It works similarly to traditional texture mapping except that it&#39;s rendered on demand using the GPU at runtime. Runtime Virtual Textures can be used to cache shading data over large areas, making them a good fit for Landscape shading.</p> <p>For full details, see the <a href="" target="_blank">Runtime Virtual Texturing</a> documentation.</p> <h2 id="new:unrealinsights_beta_"><strong>New: Unreal Insights (Beta)</strong></h2> <p>Unreal Insights (currently in Beta) enables developers to collect and analyze data about Unreal Engine&#39;s behavior in a uniform fashion. This system has two main components:</p> <ul> <li>The <strong>Trace System API</strong> gathers information from runtime systems in a consistent format and captures it for later processing. 
Multiple live sessions can contribute data at the same time.</li> <li>The <strong>Unreal Insights Tool</strong> offers interactive visualization of data processed through the Analysis API, providing developers with a unified interface for stats, logs, and metrics from their application.</li> </ul> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <p>You can connect to one or more live sessions, or select live or pre-recorded session data to view, in the Trace Sessions window (under the Unreal Insights tab).</p> <p>Once you have selected the session data you want to examine, you can use the Timing Insights or Asset Loading Insights tabs to browse through it.</p> <h2 id="new:hololens2nativesupport_beta_"><strong>New: HoloLens 2 Native Support (Beta)</strong></h2> <p>Developers can now begin creating for HoloLens 2. You&#39;ll have access to APIs for the platform&#39;s unique features, such as streaming and native deployment, finger tracking, gesture recognition, meshing, voice input, spatial anchor pinning, and more. Build an AR game or an enterprise application. And with robust emulator support, it doesn&#39;t matter if you have a device already or are still eagerly awaiting delivery: you can get started right away with UE4 and HoloLens 2 development.</p> <div style="text-align: center;"><img src="" /></div> <div><img src="" /></div> <p>For more information, see <a href="" target="_blank">Microsoft HoloLens 2 Development</a>.</p> <h2 id="new:virtualproductionpipelineimprovements"><strong>New: Virtual Production Pipeline Improvements</strong></h2> <p>Unreal Engine continues to lead the way with advancements in what is possible in a virtual production pipeline! 
Virtually scout environments and compose shots, use the virtual world to light the real world, connect live broadcast elements with digital representations to build a seamless experience, and control it all remotely using custom-built interfaces.</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3 id="in-cameravfx_beta_"><strong>In-Camera VFX (Beta)</strong></h3> <p>Using improvements for In-Camera VFX, you can achieve final shots live on set that combine real-world actors and props with Unreal Engine environment backgrounds, using an LED wall that can either display an Unreal Engine scene, or a digital greenscreen for real-time compositing in UE4.</p> <p>Camera frustum-based rendering enables real-world actors and props to receive lighting and reflections from the CG environment, and in some cases eliminates post-production workflows, significantly accelerating overall production. Save time by quickly placing greenscreens digitally on an LED wall with a click of a button instead of physically setting them up on stage. The entire solution can scale to LED walls of virtually any size or configuration thanks to nDisplay multi-display technology.</p> <h3 id="vrscoutingforfilmmakers_beta_"><strong>VR Scouting for Filmmakers (Beta)</strong></h3> <p>The new VR Scouting tools give filmmakers in virtual production environments new ways to navigate and interact with the virtual world in VR, helping them make better creative decisions.</p> <div><img src="" /></div> <p>Directors and DOPs can easily find locations, compose shots, set up scene blocking, and get an accurate representation of the filming location, while artists and set designers can experience the location in VR while building it, using measurement and interaction tools to check distances and modify the world. 
You can capture images from the virtual world, helping the whole production team track the decisions made during the VR session. Controllers and settings can be customized in Blueprints, without needing to go into C++ and rebuild the Engine.</p> <p>For details, see <a href="" target="_blank">Virtual Scouting</a>.</p> <h3 id="livelinkdatatypesanduximprovements"><strong>Live Link Datatypes and UX Improvements</strong></h3> <p>The Live Link Plugin now handles more kinds of information, and it is easier to apply the synchronized data to scene elements in Unreal! You can now drive character animation, cameras, lights, and basic 3D transforms dynamically from other applications and data sources in your production pipeline.</p> <p>You can assign a <em>role</em> to each scene element, which determines the information that Live Link synchronizes for that element. The Live Link Plugin offers built-in roles for character animation, cameras, lights, and basic 3D transforms. You can also drive an Actor in your Unreal Engine Level more easily from any Live Link source by assigning the Actor a new Live Link controller Component.</p> <p>Additional improvements:</p> <ul> <li>Pre-process the data coming through Live Link before it gets applied to your scene (for example, to apply axis conversions) and control how incoming data is transformed when you map a source from one role to another. 
You can also create and assign your own custom pre-processor and translator classes.</li> <li>Combine multiple sources into a Virtual Subject, which lets you drive a single scene element in Unreal based on information coming through multiple Live Link sources.</li> <li>Save and load presets for Live Link setups that you need to return to often.</li> <li>Status indicators show you at a glance which Live Link sources are currently sending data to the Unreal Engine.</li> <li>We have added support for ART tracking through Live Link, enabling you to leverage ART technology for various tracking purposes in applications such as VR, Augmented Reality, and Motion Capture.</li> </ul> <p>For details, see the <a href="" target="_blank">Live Link Plugin</a> documentation.</p> <h3 id="remotecontroloverhttp_beta_"><strong>Remote Control over HTTP (Beta)</strong></h3> <p>You can now send commands to Unreal Engine and Unreal Editor remotely over HTTP!</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <p>This makes it possible to create your own customized web user interfaces that trigger changes in your project&#39;s content. 
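</p> <p>For example, a control panel might set an Actor property with a single HTTP call. The Python sketch below builds such a request; the port, endpoint path, and payload keys are assumptions for illustration, so check the Web Remote Control documentation for the actual schema.</p>

```python
import json
from urllib import request

def build_set_property_request(host, object_path, prop, value):
    """Build an HTTP request that sets a property on a UE4 object via
    the remote control interface. The endpoint path and payload keys
    here are illustrative assumptions, not a documented contract."""
    payload = json.dumps({
        "objectPath": object_path,
        "propertyName": prop,
        "propertyValue": {prop: value},
    }).encode("utf-8")
    return request.Request(
        url=f"http://{host}/remote/object/property",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="PUT",
    )
```

<p>Sending the returned request with <code>urllib.request.urlopen</code> (against a hypothetical editor endpoint) would update the property live in the running session.</p> <p>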
You can control Unreal Engine from any browser or custom app that supports modern web standards, and you can integrate your controls into other custom panels that you use to control other applications in your environment.</p> <p>By calling different endpoints provided by the remote control interface, you can set and get properties on Actors and Assets and call any function that is exposed to Blueprints and Python.</p> <p>For details, see <a href="" target="_blank">Web Remote Control</a>.</p> <h2 id="new:ndisplaywarpandblendforcurvedsurfaces"><strong>New: nDisplay Warp and Blend for Curved Surfaces</strong></h2> <p>You can now use nDisplay to project your Unreal Engine content onto a wider variety of physical installations, including curved and spherical screens, and scenarios that involve complex blending between overlapping projections.</p> <div style="text-align: center;"><img src="" /></div> <p>nDisplay offers built-in integrations for two common ways of expressing how to project and blend 2D images from multiple projectors to warped and curved surfaces:</p> <ul> <li>The Scalable Mesh File (.smf) format, developed by <a href="" target="_blank">Scalable Display Technologies</a>.</li> <li>The Multiple Projection Common Data Interchange (MPCDI) standard, developed by <a href="" target="_blank">VESA</a>.</li> </ul> <p>In addition, third-party developers can now customize the way nDisplay renders UE4 content by implementing an extended set of rendering interfaces.</p> <p>For more information on nDisplay, see <a href="" target="_blank">Rendering to Multiple Displays with nDisplay</a>.</p> <h2 id="new:niagaraimprovements_beta_"><strong>New: Niagara Improvements (Beta)</strong></h2> <h3 id="integrationintochaosphysics"><strong>Integration into Chaos Physics</strong></h3> <p>Niagara particle systems can now be generated by the physics simulation in Chaos! 
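</p> <p>The integration follows a listener pattern: Chaos raises events such as break, collision, or trailing, and the particle systems registered for them respond. A hypothetical Python sketch of that fan-out, not the actual UE4 API:</p>

```python
def make_destruction_listener():
    """Hypothetical event dispatch in the spirit of the Chaos destruction
    listener: physics events (break, collision, trailing) fan out to the
    particle-system handlers registered for them."""
    handlers = {}

    def on(event_type, handler):
        handlers.setdefault(event_type, []).append(handler)

    def emit(event_type, **data):
        # Forward the event payload to every registered handler.
        return [h(data) for h in handlers.get(event_type, [])]

    return on, emit
```

<p>A dust-and-smoke emitter would subscribe to break events and spawn particles at each payload&#39;s fracture location.</p> <p>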
Whenever an object fractures, you can generate smoke and dust as well as additional tiny fractured bits that enhance the physics simulation&#39;s visuals. There is now a Chaos destruction listener, which sends events to associated particle systems and provides information about Chaos interaction events such as break, collision, or trailing. Examples are available in the Chaos Destruction Content Example hallway.</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3 id="gpusimulationimprovements"><strong>GPU Simulation Improvements</strong></h3> <p>Performance of GPU simulation has been significantly improved to reduce idle time by providing better data management and more explicit synchronization primitives between compute jobs. This enables overlapping significant amounts of GPU work, which greatly increases throughput and allows for more opportunities to run compute shaders in parallel with other computing work.</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h3 id="gpusupportforstaticandskeletalmeshsampling"><strong>GPU Support for Static and Skeletal Mesh Sampling</strong></h3> <p>GPU simulations can now sample the surface of a mesh, grab the UV coordinates, and sample from a texture, then use that capability to drive complex simulation logic. This narrows the gap between what is possible on the CPU and on the GPU, enabling effects like the fairies in the GDC Troll demo, which spawned 600,000 particles a second.</p> <h3 id="raytracingsupportforniagaraspriteemitters"><strong>Ray Tracing Support for Niagara Sprite Emitters</strong></h3> <p>Niagara simulations can now generate geometry that is used by reflections and shadows when ray tracing, such as the fairies in the GDC Troll demo, which contribute to the 
reflections in the water as they fly over.</p> <div class="note"> <p>Currently only sprite emitter particles are supported.</p> </div> <h3 id="emitterinheritance"><strong>Emitter Inheritance</strong></h3> <p>When you create an emitter in Niagara, you can now inherit from an existing emitter in your project. This makes it easier for artists to reuse content, since you can create inheritance trees of functionality for emitters with similar purposes.</p> <p>As an example, if you create an emitter for a basic muzzle flash used in weapon effects, you might then want to have a heavy and light variant of that muzzle flash. Through inheritance, you can create new emitters for light and heavy variations, both of which inherit the same base functionality and renderers as the original muzzle flash.</p> <h3 id="compilingandcookingimprovements"><strong>Compiling and Cooking Improvements</strong></h3> <p>You can now flush stale data and avoid unnecessary recompilation of Niagara Assets during load and cook operations using two new console commands:</p> <ul style="margin-left: 40px;"> <li><em>fx.PreventSystemRecompile</em> flushes data for a single system and all emitters it depends on.</li> <li><em>fx.PreventAllSystemRecompiles</em> finds every Niagara System in the project and flushes those Systems and the Emitters they depend on.</li> </ul> <p>After flushing and resaving the affected Assets, your load and cook processes will be faster and free from potential failures that stem from stale data.</p> <h3 id="improvederrorreporting"><strong>Improved Error Reporting</strong></h3> <p>A new Niagara Message Log panel in the Script and Emitter/System editor displays messages, warnings, and errors from the last compile and allows you to navigate to the nodes or pins in your Script Graphs that cause warnings or errors.</p> <p>This is particularly useful for technical and VFX Artists who write new functionality and behaviors through Niagara Scripts, or who composite 
Niagara Scripts into Systems, as it improves your ability to rapidly iterate in the Script Editor. Moreover, these messages act as hyperlinks to the source of the error, which enables you to quickly find and resolve issues in Scripts or Systems.</p> <h3 id="staticswitches"><strong>Static Switches</strong></h3> <p>Niagara now supports Static Switch nodes to reduce compile time and improve runtime performance by dropping instructions and parameters that don&#39;t affect the graph&#39;s final output. Static Switch nodes support several unique features, such as value propagation, metadata support, parameter culling, and compiler constant evaluation.</p> <div><img src="" /></div> <h3 id="increasedfeatureparitywithcascade"><strong>Increased Feature Parity with Cascade</strong></h3> <p>This release increases the feature parity between Cascade and Niagara. Here are some features Cascade has that are now available in Niagara:</p> <ul style="margin-left: 40px;"> <li><strong>Sprite cutouts</strong> enable artists to reduce overdraw by creating a smaller bounding shape for the particle instead of rendering a full quad.</li> <li><strong>GPU sorting</strong> enables transparent objects to be sorted appropriately within the emitter.</li> <li><strong>AnimNotify events</strong> enable you to create particles in animation tracks and manage their lifetime appropriately.</li> <li><strong>Standardized Activate/Deactivate</strong> makes it easier to drop Niagara effects into existing pipelines.</li> <li><strong>Set Static Mesh and Skeletal Mesh targets in Blueprints</strong> makes it easier to drop Niagara effects into existing pipelines.</li> <li><strong>Sampled Bone and Socket transforms</strong> enable effects that use the Skeletal Mesh to run on lower-end target hardware.</li> </ul> <h2 id="new:platformextensions"><strong>New: Platform Extensions</strong></h2> <p>We have added the concept of Platform Extensions, which are similar in nature to plugins in that they encapsulate all code for an 
individual platform in a single location, decoupling it from the main engine code. This has changed some fundamental ways that the engine and its build system work, and may affect you, depending on how much you have modified the engine. See <a href="" target="_blank">this guide</a> for more details on Platform Extensions.</p> <h2 id="new:skinweightprofiles"><strong>New: Skin Weight Profiles</strong></h2> <p>The new Skin Weight Profile system enables you to override the original Skin Weights that are stored with a Skeletal Mesh and improve visual fidelity on certain platforms where dynamic character parts are disabled for performance reasons. You can import a profile from the Skeletal Mesh Editor; you will need to provide the FBX file containing the different Skin Weights to use, a name for the profile, and optionally an LOD index. You can also assign a Skin Weight Profile to a Skinned Mesh Component (or any child class) using the new Blueprint-exposed API.</p> <p>For more information, please see the Skin Weight Profiles documentation.</p> <h2 id="new:animationstreaming_experimental_"><strong>New: Animation Streaming (Experimental)</strong></h2> <p>Animations can now be streamed for improved memory usage and memory management, which is particularly useful when playing back animation data in long cinematics.</p> <p>The Animation Compression system can now compress animations into chunks to make them stream-friendly, while the new Animation Stream Manager manages the loading and holding of the streamed animation data in memory, and a new UAnimStreamable asset represents an animation that can be streamed.</p> <h2 id="new:dynamicanimationgraphs_experimental_"><strong>New: Dynamic Animation Graphs (Experimental)</strong></h2> <p>Dynamic Anim Graphs provide dynamic switching of sub-sections of an Animation Graph via Layers, which are separate Graphs that can be plugged into the Anim Graph via Layer Nodes. 
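</p> <p>Conceptually, this resembles swapping interface implementations at runtime. A loose Python analogy follows; the class names are illustrative only, not engine types.</p>

```python
from abc import ABC, abstractmethod

class LocomotionLayer(ABC):
    """Stand-in for an Animation Layer Interface: any graph plugged into
    the layer node must provide this entry point (names are made up)."""
    @abstractmethod
    def evaluate(self, pose):
        ...

class WalkLayer(LocomotionLayer):
    def evaluate(self, pose):
        return pose + ["walk"]

class InjuredWalkLayer(LocomotionLayer):
    def evaluate(self, pose):
        return pose + ["limp"]

class AnimGraph:
    """The layer node holds whichever implementation is currently linked
    and can be re-pointed at runtime without touching the rest of the graph."""
    def __init__(self, layer: LocomotionLayer):
        self.layer = layer

    def set_layer(self, layer: LocomotionLayer):
        self.layer = layer

    def tick(self):
        return self.layer.evaluate(["base"])
```

<p>Swapping the layer changes only that sub-section of the evaluated pose; the rest of the graph is untouched.</p> <p>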
This enables multi-user collaboration and potential memory savings for functionality that is no longer needed.</p> <p>Layers are defined by an <strong>Animation Layer Interface</strong> asset, which is an Animation Blueprint that has restricted functionality and is analogous to a <strong>Blueprint Interface</strong>. It defines the number of Layers, their names, which group they belong to, and their inputs.</p> <p>Additionally, Sub Instance nodes have gained the ability to have their running class dynamically switched using SetSubInstanceClassByTag in UAnimInstance. Sub Instance nodes can also now expose multiple input poses from sub-graphs.</p> <p>Both the Animation Blueprint and Sub Animation Blueprint(s) have to implement an Animation Layer Interface to create Layer graphs and instantiate Layer Nodes.</p> <h2 id="new:opensoundcontrol_beta_"><strong>New: Open Sound Control (Beta)</strong></h2> <p>The Open Sound Control (OSC) plugin provides a native implementation of the Open Sound Control standard, a common framework for inter-application and inter-machine communication for real-time parameter control over UDP.</p> <div><img src="" /></div> <h2 id="new:wavetablesynthesisplugin"><strong>New: Wavetable Synthesis Plugin</strong></h2> <p>We added a new monophonic Wavetable synthesizer that leverages the curve editor in Unreal Engine to author the time-domain wavetables, enabling a wide range of sound design capabilities that can be driven by gameplay parameters. The table index and all other parameters are controllable from C++ and Blueprint.</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h2 id="new:csvtosvgtool_beta_"><strong>New: CSVToSVG Tool (Beta)</strong></h2> <p>CSVToSVG is a new command-line tool that helps visualize performance data by building vector graphics image files (.SVG) from .CSV files. 
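</p> <p>The core transformation is simple enough to sketch: read a numeric column, normalize it, and emit an SVG polyline. The Python toy below is a stand-in for the workflow, not the actual tool.</p>

```python
import csv, io

def csv_column_to_svg(csv_text, column, width=400, height=100):
    """Plot one numeric CSV column as an SVG polyline.
    A toy stand-in for the CSVToSVG workflow, not the real tool."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    values = [float(r[column]) for r in rows]
    top, bottom = max(values), min(values)
    span = (top - bottom) or 1.0          # avoid divide-by-zero on flat data
    step = width / max(1, len(values) - 1)
    points = " ".join(
        f"{i * step:.1f},{(top - v) / span * height:.1f}"  # SVG y grows down
        for i, v in enumerate(values)
    )
    return (f'<svg xmlns="http://www.w3.org/2000/svg" '
            f'width="{width}" height="{height}">'
            f'<polyline fill="none" stroke="black" points="{points}"/></svg>')
```

<p>Feeding it a frame-time column produces a graph you can open directly in a browser.</p> <p>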
The tool handles data from any numerical stat, and supports smoothing, budget lines, and stacked graphs. Developers can use this to produce graphs of any data output by the performance tools included with Unreal Engine, or data from their own tools.</p> <div><img src="" /></div> <h2 id="new:sequencercurveeditorimprovements_beta_"><strong>New: Sequencer Curve Editor Improvements (Beta)</strong></h2> <p>The Sequencer Curve Editor has been significantly improved with several highly requested features and extensibility support! Artists and developers can now extend the Curve Editor by adding new tool modes, new toolbar buttons, and custom data filters (such as smoothing) without modifying engine code. In addition, the Curve Editor can now be docked separately from Sequencer and includes a transform tool that supports scaling, as well as a key retiming tool.</p> <p>To use the updated Curve Editor, just open Sequencer or a UMG (Unreal Motion Graphics) Animation and click the Curve Editor button. The new Transform and Retime tools are provided through a plugin that is enabled by default. You can now undock the Curve Editor window and dock it anywhere you want; the position is saved when you close Sequencer or the Editor, and restored when you reopen Sequencer.</p> <h3 id="extendingthecurveeditor"><strong>Extending the Curve Editor</strong></h3> <p>The updated Curve Editor can be extended by plugins without modifying the engine code. There are three main ways to extend the Curve Editor:</p> <ul style="margin-left: 40px;"> <li><strong>Tool Modes - </strong>Tool modes are exclusive modes that intercept user input to the Editor first, which allows you to create a wide variety of tools. There are two example tools: FCurveEditorTransformTool, and FCurveEditorRetimeTool. 
To create a new custom tool, implement ICurveEditorToolExtension and register your extension with the Curve Editor module (see FCurveEditorToolsModule::StartupModule() for an example).</li> <li><strong>Toolbar Buttons - </strong>Toolbar Buttons allow you to insert new buttons into the Curve Editor toolbar, and these buttons can manipulate the Curve Editor in any way you want. There is an example which implements additional options for focusing on data (instead of just the default Focus Selected). To create a new toolbar button, implement ICurveEditorExtension and register it with the Curve Editor module (see FCurveEditorToolsModule::StartupModule() for an example).</li> <li><strong>Data Filters</strong> - Data Filters provide new ways to manipulate selected curves as a single action with user-specified settings. We provide three data filters by default which operate on your current selection: <ul> <li><strong>Bake</strong> (UCurveEditorBakeFilter)</li> <li><strong>Simplify</strong> (UCurveEditorReduceFilter)</li> <li><strong>Smoothing</strong> (UCurveEditorFFTFilter)</li> </ul> </li> </ul> <p>A new filter can be implemented by deriving a class from UCurveEditorFilterBase. No registration with the module is needed. New implementations will automatically appear inside the <strong>User Filter</strong> dialog. Any UPROPERTY marked as EditAnywhere will automatically appear as well, enabling you to adjust its value before applying the filter.</p> <h3 id="viewmodes"><strong>View Modes</strong></h3> <p>View your curves in new and exciting ways! Visible curves can be selected in the tree view on the left side of the Curve Editor, and you can then choose how to view the selected curves by clicking the View Mode button as shown below. 
There are three View Modes:</p> <ul style="margin-left: 40px;"> <li><strong>Absolute</strong> <strong>View</strong> shows all of your curves plotted in absolute space; this matches the legacy behavior that everyone is used to.</li> <li><strong>Stacked View</strong> normalizes each curve and then draws them non-overlapping.</li> <li><strong>Normalized View</strong> normalizes each curve (like Stacked View), but draws them all on top of each other.</li> </ul> <div class="asyncgif"> <p> </p> <div><img src="" /></div> </div> <h3 id="newfiltermenu"><strong>New Filter Menu</strong></h3> <p>The new Filter menu applies different filters to your selection with a full Details panel and a wide range of settings. The menu persists after you click the Apply button, so you can easily iterate on your settings. You can add new options in this dialog by implementing a function on a new class which derives from UCurveEditorFilterBase.</p> <h3 id="multipleframingoptions"><strong>Multiple Framing Options</strong></h3> <p>Several framing options are now available.</p> <ul style="margin-left: 40px;"> <li><strong>Frame Selection</strong> (or <strong>Frame All</strong> if there is no selection) is built-in and defaults to <strong>F</strong>.</li> <li><strong>Frame on Playback Cursor</strong> and <strong>Frame on Playback Range</strong> are new options. These are good examples of how you can extend the Toolbar using a plugin.</li> <li>The old <strong>Frame Horizontal/Frame Vertical</strong> options have been replaced with <strong>Focus All</strong> followed by <strong>ALT + SHIFT + RMB</strong> to zoom each axis independently. You can also follow Focus All with <strong>ALT + RMB</strong> to scale proportionally.</li> </ul> <div class="asyncgif"> <div><img src="" /></div> </div> <h3 id="retimingtool"><strong>Retiming Tool</strong></h3> <p>The Retiming Tool creates a one-dimensional lattice and lets you adjust the timing of your keys. It supports grid snap, multi-select, and more! 
This tool is provided by the default plugin.</p> <div class="asyncgif"> <div><img src="" /></div> </div> <h3 id="transformtool"><strong>Transform Tool</strong></h3> <p>The Transform tool supports translating selected keys and scaling on both X and Y axes (where appropriate). Press ALT to change the anchor point from the opposite edge to the center.</p> <h2 id="new:sequencerusabilityimprovements"><strong>New: Sequencer Usability Improvements</strong></h2> <p>A number of workflow and usability improvements have been added, including:</p> <ul style="margin-left: 40px;"> <li><strong>Filter and Search for Tracks </strong>- You can now filter and search for tracks and actors.</li> <li><strong>Stretch/Shrink Frames </strong>- You can now add or reduce the amount of time between sections and keys as long as there are no overlapping elements. <div class="asyncgif"> <div style="text-align: center;"><img src="" /></div> </div> </li> <li><strong>Adding and Editing Multiple Tracks </strong>- You can now group-add and edit multiple tracks by using Shift or Control. <div class="asyncgif"> <div style="text-align: center;"><img src="" /></div> </div> </li> <li><strong>Blend Audio Sections on Multiple Rows</strong> - You can now blend Audio sections, which enables you to achieve crossfading effects. This functions similarly to blending Transform sections.</li> <li><strong>Restore State or Keep State for Skeletal Animation Poses</strong> - This option uses the When Finished property to either restore the Skeletal Mesh to its bind pose after the animation section has been evaluated, or keep the animation pose of the last animation section.</li> <li><strong>UMG Multibinding </strong>- This feature existed prior to 4.23; however, you can now add multiple widget bindings to an existing binding. 
This lets you animate one widget and have other widgets use the same animation.</li> </ul> <h2 id="new:datavalidationextensibility"><strong>New: Data Validation Extensibility</strong></h2> <p>The Data Validation plugin has been extended to support C++, Blueprint, and Python-based rules for asset validation. This enables tech artists who are working in Blueprints or Python to create asset validation scripts, instead of requesting C++ implementations. For developers, default engine classes are now able to go through a validation path.</p> <h2 id="new:debugcameracontrollerimprovements"><strong>New: DebugCameraController Improvements</strong></h2> <p>In this release, we have added new features to the DebugCameraController:</p> <ul style="margin-left: 40px;"> <li><strong>Orbit</strong> functionality, which allows you to orbit about a selected position or the center of a selected actor so you can review assets more thoroughly.</li> <li><strong>Buffer visualization</strong> overview, with an option to select buffers for fullscreen view, which makes it possible to examine the contents of the graphics card buffers.</li> <li><strong>View mode cycling</strong>, which gives you the ability to examine the different types of scene data being processed.</li> </ul> <p>The new features augment in-game debugging capabilities when you are using the Debug Camera Controller in Play-In-Editor (PIE). 
The ability to examine view modes and graphics buffers helps you diagnose unexpected scene results in-game.</p> <p>To open the debug camera controller in PIE, enter ToggleDebugCamera on the console command line, or use the new semicolon hotkey.</p> <h2 id="new:multi-usereditingimprovements_beta_"><strong>New: Multi-User Editing Improvements (Beta)</strong></h2> <p>Multi-User Editing is significantly improved with the goal of making it a better fit for real-world production use cases.</p> <ul style="margin-left: 40px;"> <li>We&#39;ve streamlined the user interface to remove unnecessary dialogs, and to bring all the information and controls you need to manage your sessions into one place.</li> </ul> <p>(The Multi-User Editing icon in the Toolbar is now hidden by default. You can open the new <strong>Multi-User Editing Browser</strong> from the <strong>Window > Developer Tools</strong> menu, or re-enable the Toolbar icon in the <strong>Project Settings</strong> dialog. See the <a href="" target="_blank">Multi-User Editing Reference</a>.)</p> <ul style="margin-left: 40px;"> <li>Reliability of the transaction system across all session participants is greatly improved.</li> <li>We have minimized the possibility of accidentally losing session data through user error by prompting the owner of a session to persist the changes made in the session before they leave. The server now also automatically archives all live sessions it&#39;s currently running when it shuts down, and you can recover the data from that archive later, if needed.</li> <li>We&#39;ve improved the Asset locking system to make it easier for multiple users to work on the same Assets. You&#39;ll now receive a notification when you try to modify an Asset that is already locked by another user in your session. 
In addition, if that other user releases the lock, you&#39;ll automatically get a lock on that Asset until the next time you save it.</li> </ul> <p>For more information on using the Multi-User Editing system, see <a href="" target="_blank">the documentation</a>.</p> <h2 id="new:disasterrecovery_experimental_"><strong>New: Disaster Recovery (Experimental)</strong></h2> <p>The new opt-in Disaster Recovery system augments the Editor&#39;s existing Auto-Save system to increase your confidence that you&#39;ll be able to recover changes you make in your project content, even in the event of an unexpected shutdown.</p> <p>As you work, it records the edits you make, using the same transaction system that underlies the <a href="" target="_blank">Multi-User Editing</a> system. If the Editor shuts down with unsaved changes, then the next time you open your project you&#39;ll be shown a list of all the transactions you made since the last save. You can restore all the listed changes, or all changes up to any recovery point you select.</p> <h2 id="new:draganddroptofillarray"><strong>New: Drag and Drop to Fill Array</strong></h2> <p>You can now select multiple Assets from the Content Browser and drag and drop them onto the array header to fill the array. Select any Assets of the same type that you want as the array, and drag them from the Content Browser into the array header. This works for any array that stores Assets, and simplifies the workflow when working with large arrays.</p> <p>See the <a href="" target="_blank">Array Control</a> page for more information.</p> <h2 id="new:editconditionsmetadataimprovements_beta_"><strong>New: EditConditions Metadata Improvements (Beta)</strong></h2> <p>You can now use simple boolean expressions to enable or disable property editing in the Details panel using the new expression parser for the EditCondition metadata of the UPROPERTY system. 
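</p> <p>As a sketch, a condition-driving property and the property it gates might be declared together as follows; the float property name here is illustrative, not taken from the engine:</p> <pre class="prettyprint"> <code>// The integer below drives the edit condition of the float property.
UPROPERTY(EditAnywhere, Category=EditCondition)
int32 IntegerEditCondition;

// Editable in the Details panel only while IntegerEditCondition >= 5.
// "GatedFloat" is a hypothetical name used for illustration.
UPROPERTY(EditAnywhere, Category=EditCondition, meta=( EditCondition="IntegerEditCondition >= 5" ))
float GatedFloat;</code></pre> <p>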
Support for enums and numeric types is also included, so programmers can write relatively complicated expressions with minimal overhead.</p> <p>Programmers writing classes that are intended to be edited by users, such as Actors or Components, where certain properties are only conditionally valid can quickly create more intuitive UIs without needing to write a fully detailed customization class.</p> <p>The EditCondition meta tag now accepts nearly any valid C++ expression. For example, here is a valid property definition after this change:</p> <pre class="prettyprint"> <code>UPROPERTY(EditAnywhere, Category=EditCondition, meta=( EditCondition="IntegerEditCondition >= 5" ))</code></pre> <p>Additional examples of valid expressions:</p> <ul style="margin-left: 40px;"> <li><code>MyInteger > 5 && MyFloat <= 10</code></li> <li><code>MyEnum == EnumType::B</code></li> <li><code>MyBool == true || MyInteger == MyFloat + 5</code></li> </ul> <h2 id="new:editorutilityblueprintsupdates_beta_"><strong>New: Editor Utility Blueprints Updates (Beta)</strong></h2> <p>Editor Utility Blueprints have been updated to improve the Editor&#39;s extensibility with Blueprints, bringing Blueprints extensibility more in line with Python and C++ extensibility. Editor Utility Widgets are now available to enable deeper customization of UIs than the previous auto-generated buttons for Editor Utility Blueprints.</p> <p>Improvements include:</p> <ul style="margin-left: 40px;"> <li>Parent classes can be any editor-only class that isn&#39;t a widget.</li> <li>Can create instances of Editor Utilities that need instancing for function calls.</li> <li>The ability to add a Run function to any Editor Utility Blueprint or Editor Utility Widget which will run scripts on Editor start. 
This will create an instance and allow for binding to stateful editor events.</li> </ul> <h2 id="new:hideunrelatednodes"><strong>New: Hide Unrelated Nodes</strong></h2> <p>Users can now select a node and use the Hide Unrelated feature to dim all other nodes not linked to the selected node, so Materials and Blueprints can be debugged and understood in a much cleaner, more direct way.</p> <h2 id="new:landscapesplinesimprovements"><strong>New: Landscape Splines Improvements</strong></h2> <p>You can now use <strong>ALT + Left Mouse Button (LMB) Drag</strong> to create a new spline control point and segment. This is in addition to the existing <strong>CTRL + Left-Click</strong> action to add new splines. The advantage of <strong>ALT + LMB Drag</strong> is that as you drag the cursor, you can see what the new spline will look like. You can also split splines with this action.</p> <p>To add a new control point, select an existing control point and <strong>ALT + LMB Drag</strong> in the direction you want to place the new point. To split an existing spline, select a spline point on either side of the segment and <strong>ALT + LMB Drag</strong> the cursor towards an existing segment to split the path.</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <h2 id="new:editorperformanceimprovements"><strong>New: Editor Performance Improvements</strong></h2> <p>If you regularly work with very large scenes, you should notice some significant performance improvements in the Unreal Editor aimed at making your work faster and smoother. 
It&#39;s much faster to select and de-select hundreds or thousands of separate Actors at a time, to show and hide Layers that contain thousands of Actors, to work with a selected Actor that is a parent of thousands of other Actors, to load Levels that contain thousands of Actors, and more.</p> <h2 id="new:materialeditorupdates"><strong>New: Material Editor Updates</strong></h2> <p>Material Editor and Material Instance Editor workflows are improved, and we added increased scriptability for Materials and Material Instances.</p> <div><img src="" /></div> <ul style="margin-left: 40px;"> <li>There is now a Hierarchy button on the Material Editor toolbar that displays a menu showing all the immediate children, and enables quick access to edit them.</li> <li>The Hierarchy menu in the Material Instance Editor now shows all immediate children in addition to the parent chain of Materials and Material Instances.</li> <li>The following nodes were added to the material scripting library, for use in Editor Utility Widgets, Editor Utility Blueprints, Python, and C++. <ul> <li>For material instances, GetStaticSwitchParameterValue.</li> <li>For materials, GetMaterialDefaultScalarParameterValue, GetMaterialDefaultVectorParameterValue, GetMaterialDefaultTextureParameterValue, GetMaterialDefaultStaticSwitchParameterValue, HasMaterialUsage (for checking whether or not a material has a given usage flag), GetChildInstances, Get___ParameterNames (to get an array of scalar, vector, texture, or static switch parameter names), and Get___ParameterSource (to find the source asset, whether function or material, where the parameter is defined; again, there is one each for Scalar, Vector, and so on).</li> </ul> </li> <li>When you mouse over a parameter name in a Material Instance, you will see the name of the Asset where that parameter was defined. 
This makes it easier to work with Materials that have many levels of nested functions.</li> </ul> <h2 id="new:umgaccessibilityscreenreadersupport_experimental_"><strong>New: UMG Accessibility Screen Reader Support (Experimental)</strong></h2> <p>UE4 now supports third-party screen readers for Windows or VoiceOver on iOS, which enables you to make sure your game UI is accessible and helps you comply with CVAA standards. Screen readers, such as NVDA and JAWS, allow a software application&#39;s UI to be narrated to the user. This is a critical feature that enables those who are visually impaired to use and navigate software applications.</p> <p>As of 4.23, there are now APIs included in UE4 to allow the use of third-party screen readers to read UI text. This supports a number of common UMG widgets, such as Text Block, Editable Text Box, Slider, Button, and Checkbox. This built-in functionality removes the need to implement custom text-to-speech technology, making screen readers easier to support.</p> <p>To enable screen reader support, go into either your project or Engine console variable configuration file and add the variable Accessibility.Enable=1.</p> <p>For more information, see <a href="" target="_blank">Supporting Screen Readers</a>.</p> <h2 id="new:wacomtabletsupport_experimental_"><strong>New: Wacom Tablet Support (Experimental)</strong></h2> <p>Programmers working on features like a painting plugin or modeling tools can now take advantage of stylus inputs, such as pen pressure and tilt, using a new plugin that exposes the additional inputs that Wacom-style tablet and stylus systems provide.</p> <div class="note"> <p>Not all tablets support all possible values that the API supports; the subsystem has intentionally been written to expose a superset of all supported values. 
It is up to the user of the subsystem to determine what values they are interested in.</p> </div> <p>This is an experimental feature, and it is not yet used by any Unreal Engine tools.</p> <h2 id="new:umgwidgetdiffing"><strong>New: UMG Widget Diffing</strong></h2> <p>We have expanded and improved Blueprint Diffing to support Widget Blueprints as well as Actor and Animation Blueprints! The new tools also show changes made to the structure of the Blueprint, including property and function flags, class settings, the parent class, and added Components, in addition to default property values (which now include the default properties of Widgets and Widget Slots) and changes to Blueprint Graphs.</p> <div style="text-align: center;"><img src="" /></div> <p>If you have written a custom Blueprint subclass that you would like to diff, you can override the FindDiffs function, which enables you to list specific changes you want to show, and to request subobjects for diffing.</p> <h2 id="new:non-destructivelandscapeediting_experimental_"><strong>New: Non-Destructive Landscape Editing (Experimental)</strong></h2> <p>Landscape Heightmaps and paint layers can now be edited with non-destructive layers. You can add multiple layers to your landscape that can be edited independently from each other. These new layers function as a foundation for sculpting and painting a landscape, which allows you to manipulate and maintain landscapes more efficiently.</p> <p>As you add layers, you can lock layers you don&#39;t want to change and focus on editing one layer at a time. You can also hide layers to help focus on a specific layer, or see what the landscape looks like without a certain layer. Lastly, when you enable Layer Contribution, the layer is highlighted in the viewport and you can see all of the sculpting and painting within that layer, even as you add to the layer. 
There are other new options, such as ordering layers and adjusting the layer alpha blending.</p> <p>Landscape spline terrain deformation and painting can now be a non-destructive process by reserving a layer for splines. This means that you can now edit, change, and move splines non-destructively, and the landscape will update along the roads or paths you route through it.</p> <p>For more information, see <a href="" target="_blank">Non-Destructive Landscape Layers and Splines</a>.</p> <h2 id="new:placeinteractiveactorswithfoliagetool"><strong>New: Place Interactive Actors with Foliage Tool</strong></h2> <p>The Foliage Tool now supports populating a scene with interactive Actors in addition to Static Meshes. Actors placed by the Foliage Tool will behave the same way that Static Meshes do, automatically updating when you sculpt the terrain or move the Static Meshes that you painted them on.</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <p>For example, in this video, the scattered trees are Blueprint Actors that contain interaction logic along with their Static Mesh Components. And when the terrain is later modified, these Tree Actors are automatically updated to match the new height of the terrain.</p> <div class="note"> <p>Foliage Actors are not rendered as Instanced Meshes; they are treated by the renderer as if they were individually placed by hand into the Level.</p> </div> <p>For additional information, read more about <a href="" target="_blank">Foliage Mode</a>.</p> <h2 id="new:hdribackdropactor"><strong>New: HDRI Backdrop Actor</strong></h2> <p>The new HDRI Backdrop Actor makes it faster and easier to create a realistic background and lighting environment for your Level from a single HDRI image. 
Place the Actor into your Level and assign it the HDRI texture you want. You get your image projected onto a backdrop, a Sky Light automatically set up to provide ambient light drawn from the image, accurate reflections, and a floor surface that captures shadows cast from your Level content.</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <p>For details on the workflow, and on all the settings that you can use to control the projection, see <a href="" target="_blank">HDRI Backdrop</a>.</p> <h2 id="new:dualheightfogforexponentialheightfog"><strong>New: Dual Height Fog for Exponential Height Fog</strong></h2> <p>We improved control over the fog with additional parameters for Fog Density, Height Falloff, and Height Offset of an additional fog layer when using an Exponential Height Fog Volume.<img src="" /></p> <div class="asyncgif"> <p style="text-align: center;"><em>1 - No Exponential Height Fog; 2 - Exponential Height Fog; 3 - Dual Exponential Height Fog</em></p> </div> <h2 id="new:dynamicshadowbiasimprovement"><strong>New: Dynamic Shadow Bias Improvement</strong></h2> <p>We improved shadow biasing for movable lights in this release by adding some new parameters that can be set per-light or globally per light type using a console variable. In addition to the constant Shadow Bias parameter, we&#39;ve added the Slope Bias parameter to help resolve some (but not all) issues with shadow artifacts and acne. 
For Directional Lights, we added an extra depth bias parameter with Shadow Cascade Bias Distribution to control the bias strength across cascades.</p> <div class="asyncgif"> <div style="text-align: center;"><img src="" /></div> <p style="text-align: center;"><em>1 - Before Slope Bias and Shadow Cascade Bias Distribution; 2 - After Adjustments to Slope Bias and Shadow Cascade Bias Distribution</em></p> </div> <p>There are two parts to consider: the Slope Bias is used during shadow map rendering, and the Receiver Bias during the shadow map fetching. The Slope Bias is controllable per-light and is proportional to the Shadow (Constant) Bias. With these two parameters (the Constant and Slope Bias), there is a trade-off in shadow quality versus accuracy that will resolve some artifacts.</p> <p>To control the Receiver Bias, use the console variables under r.Shadow.* with a value between 0 (better accuracy with more artifacts) and 1 (less accuracy with fewer artifacts) to set the receiver bias globally per light type. You can also set any light type&#39;s constant and slope bias globally using console variables.</p> <h2 id="new:pre-skinnedlocalboundsmaterialexpression"><strong>New: Pre-Skinned Local Bounds Material Expression</strong></h2> <p>We&#39;ve added a new Material Expression to return the Pre-Skinned Local Bounds of Skeletal Meshes. This expression enables you to calculate a relative position inside the bounds of an Actor through the Material. Since these bounds don&#39;t change, even when animated, it enables you to apply a material in a consistent or fixed manner, such as a pattern like a decorative wrap applied to a weapon or vehicle. 
In the Animation Editors, you can visualize the Pre-Skinned Bounds of any mesh from the Character drop-down under the Mesh category.</p> <h2 id="new:storingcustomper-primitivedata"><strong>New: Storing Custom Per-Primitive Data</strong></h2> <p>We now support storing data on a per-primitive level instead of per-Material Instance, enabling primitives to be automatically considered for dynamic instance drawing.</p> <p>Storing data with Use Custom Primitive Data per-primitive has the advantage of lowering the number of draw calls required for similar geometry, even if each primitive has its own custom data.</p> <h2 id="new:frommaterialexpressionshadingmodel"><strong>New: From Material Expression Shading Model</strong></h2> <p>We&#39;ve added support for multiple shading models to be used in a single Material using the new From Material Expression Shading Model!</p> <div style="text-align: center;"><img src="" /></div> <p>Previous workflows using the Material Output and Material Attributes node are supported, and common workflows using If-Statements, BlendMaterialAttributes, Static Switches, and Material Instances are also fully supported.</p> <p>This can be an opportunity to optimize and reduce the number of draw calls: if a single asset uses two separate Materials, that equals two draw calls for each instance of that asset in the level. 
Using a single Material with two shading models can reduce this to a single draw call.</p> <p>For additional details, see the <a href="" target="_blank">From Material Expression</a> documentation.</p> <h2 id="new:iesprofileimprovements"><strong>New: IES Profile Improvements</strong></h2> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <p>With this release, we&#39;ve made some improvements to working with IES Profiles in the Editor and with different light types:</p> <ul> <li>Selected Point and Spot lights with an assigned IES Texture now provide a 3D visualization of photometric data.</li> <li>There is better support for Type C IES files, improving axial symmetry display without artifacts.</li> <li>IES Texture icons in the Content Browser now give a preview of the photometric data.</li> </ul> <p>For additional details, see <a href="" target="_blank">IES Light Profiles</a>.</p> <h2 id="new:renderdependencygraph"><strong>New: Render Dependency Graph</strong></h2> <p>The Rendering Dependency Graph (RDG) — or simply, "Render Graph" — is designed to take advantage of modern graphics APIs to improve performance through the use of automatic asynchronous compute scheduling, as well as more efficient memory and barrier management.</p> <p>Unreal Engine&#39;s renderer is being actively ported to RDG, which is the primary API for authoring render passes for new features going forward. 
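</p> <p>At a high level, authoring a pass with RDG follows a setup-then-execute pattern: you create a builder, declare resources and passes, and then execute the graph. The sketch below is illustrative only; the commented resource and pass declarations are placeholders rather than engine-verified signatures:</p> <pre class="prettyprint"> <code>// Illustrative sketch of the RDG authoring pattern; assumes engine headers.
void AddMyRenderPasses(FRHICommandListImmediate&amp; RHICmdList)
{
    FRDGBuilder GraphBuilder(RHICmdList);

    // Resources created through the builder are tracked by the graph,
    // so lifetimes and barriers can be scheduled automatically.
    // FRDGTextureRef MyTexture = GraphBuilder.CreateTexture(/* descriptor */, TEXT("MyTexture"));

    // Each pass declares its parameters up front, plus a lambda that
    // records the actual RHI work when the graph is compiled and run.
    // GraphBuilder.AddPass(RDG_EVENT_NAME("MyPass"), PassParameters, ERDGPassFlags::Compute,
    //     [PassParameters](FRHICommandList&amp; InRHICmdList) { /* dispatch work here */ });

    GraphBuilder.Execute();
}</code></pre> <p>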
The implementation is still in early development and lacks many of the performance benefits right now, but the API is stable enough to use in production.</p> <p>For additional information, see Render Dependency Graph.</p> <h2 id="new:composureimprovements"><strong>New: Composure Improvements</strong></h2> <p>In this release, we have added some additional options for Composure:</p> <ul> <li><strong>AlphaHoldOut Blend Mode</strong> - This new Material Blend Mode enables objects to hold out the alpha in the Material, punching a hole through objects behind it.</li> <li><strong>Color Grading</strong> - Composure now supports color grading and white balance using floating point post processing lookup tables (LUTs).</li> <li><strong>Composure Layer inherits Post Process Parameters from Scene Camera</strong> - Composure Layers can now inherit Post Processing parameters from the scene camera, enabling color correction controls to be optionally enabled separately for each CG Layer.</li> </ul> <p>The Composure sample available from the Learn Tab in the launcher has also been updated to reflect the latest workflows for compositing in Unreal Engine!</p> <h2 id="new:pythonimport/exportfbx"><strong>New: Python Import/Export FBX</strong></h2> <p>Python scripting now supports importing and exporting FBX animations.</p> <h2 id="new:provideocodecs"><strong>New: Pro Video Codecs</strong></h2> <p>The Unreal Engine now supports additional video codecs, making it easier to integrate Unreal into professional video production pipelines and workflows.</p> <p>You can now export Pro Media files to <a href="" target="_blank">Apple ProRes Encoder</a>:</p> <ul> <li>All formats of the codec are supported: 4444 XQ, 4444, 422 HQ, 422, 422 LT, 422 Proxy</li> <li>Multiple frame rates and resolutions.</li> <li>Embedded timecode track is supported.</li> <li>No embedded audio. 
Audio is mixed down and exported to a separate .wav file.</li> <li>Supported on Windows platforms only.</li> </ul> <p>For details, see the new how-to guide, <a href="" target="_blank">Exporting Pro Media Files to Apple ProRes</a>.</p> <p>The Media Framework can now play back files encoded with <a href="" target="_blank">HAP Codecs</a>.</p> <ul> <li>All formats of the codec are supported: HAP, HAP Alpha, HAP Q, HAP Q Alpha</li> <li>Supports playback for a <strong>1x 4K 60 FPS</strong> movie or <strong>2x 4K 30 FPS</strong> movie, which can be stretched to <strong>2x 4K 60 FPS</strong> movies.</li> <li>Full support for alpha channels.</li> <li>Multiple frame rates and resolutions.</li> <li>No embedded audio or timecode support.</li> <li>8K and 16K are not supported at this time.</li> </ul> <p>For details, see the HAP Codec Playback Support section in <a href="" target="_blank">Media Framework Technical Reference</a>.</p> <h2 id="new:stereopanoramiccapturetoolimprovements_experimental_"><strong>New: Stereo Panoramic Capture Tool Improvements (Experimental)</strong></h2> <p>With the updates to the Stereo Panoramic Capture tool, it&#39;s much easier to capture high-quality stereoscopic stills and videos of the virtual world in industry-standard formats, and to view those captures in an Oculus or GearVR headset. 
You have expanded control over render settings, bit depth, and quality; you can also choose the graphics buffers you want to capture, making it possible to post-process and composite the images in other applications.</p> <h2 id="new:platformsdkupgrades"><strong>New: Platform SDK Upgrades</strong></h2> <p>In every release, we update the Engine to support the latest SDK releases from platform partners.</p> <ul style="margin-left: 40px;"> <li><strong>IDE Version the Build farm compiles against</strong> <ul> <li>Visual Studio - Visual Studio 2017 v15.9.11 toolchain (14.16.27023) and Windows 10 SDK (10.0.16299.0) <ul> <li>Minimum Supported versions <ul> <li>Visual Studio 2017 v15.6</li> </ul> </li> <li>Requires .NET 4.6.2 Targeting Pack</li> </ul> </li> <li>Xcode - Xcode 10.3</li> </ul> </li> <li><strong>Android</strong> <ul> <li>Android NDK r14b (New CodeWorks for Android 1r7u1 installer will replace previous CodeWorks on Windows and Mac; Linux will use 1r6u1 plus modifications)</li> </ul> </li> <li><strong>ARCore</strong> <ul> <li>1.7</li> </ul> </li> <li><strong>HTML5</strong> <ul> <li>Emscripten 1.38.31</li> </ul> </li> <li><strong>Linux "SDK" (cross-toolchain)</strong> <ul> <li>v14_clang-8.0.1-centos7 - downloadable from <a href="" target="_blank"></a> (now with the installer!)</li> </ul> </li> <li><strong>Oculus Runtime</strong> <ul> <li>1.37</li> </ul> </li> <li><strong>OpenXR</strong> <ul> <li>1.0</li> </ul> </li> <li><strong>Google Stadia</strong> <ul> <li>1.34</li> </ul> </li> <li><strong>Lumin</strong> <ul> <li>0.19.0</li> </ul> </li> <li><strong>Steam</strong> <ul> <li>1.42</li> </ul> </li> <li><strong>SteamVR</strong> <ul> <li>1.5.17</li> </ul> </li> <li><strong>Switch</strong> <ul> <li>SDK 8.3.0 + optional NEX 4.6.3 (Firmware 7.x.x-x.x)</li> <li>Supported IDE: Visual Studio 2017, Visual Studio 2015</li> </ul> </li> <li><strong>PS4</strong> <ul> <li>6.508.001</li> <li>Firmware Version 6.510.011</li>
<li>Supported IDE: Visual Studio 2017, Visual Studio 2015</li> </ul> </li> <li><strong>Xbox One</strong> <ul> <li>XDK: July 2019 QFE-9</li> <li>Firmware Version: May 2019 10.0.18362.3055</li> <li>Supported IDE: Visual Studio 2017</li> </ul> </li> <li><strong>macOS</strong> <ul> <li>SDK 10.14</li> </ul> </li> <li><strong>iOS</strong> <ul> <li>SDK 12</li> </ul> </li> <li><strong>tvOS</strong> <ul> <li>SDK 12</li> </ul> </li> </ul> <h1 id="upgradenotes"><strong>Upgrade Notes</strong></h1> <h2 id="editor"><strong>Editor</strong></h2> <h3 id="matinee"><strong>Matinee</strong></h3> <ul style="margin-left: 40px;"> <li>With the release of Unreal Engine 4.23, Matinee is no longer supported and will be removed from the engine in an upcoming release. Once removed, you will no longer be able to access or open Matinee files. Please use the <a href="" target="_blank">Matinee to Sequencer Conversion Tool</a> to convert any Matinee sequences to Sequencer sequences as soon as possible.</li> </ul> <h3 id="vreditor"><strong>VR Editor</strong></h3> <ul style="margin-left: 40px;"> <li>The VR Mesh Editor presented at GDC 2017 is no longer supported. The Mesh Editor plugin will be removed from the engine in an upcoming release.</li> </ul> <h2 id="platforms"><strong>Platforms</strong></h2> <h3 id="html5"><strong>HTML5</strong></h3> <ul style="margin-left: 40px;"> <li>HTML5 platform support will be migrated to GitHub as a community-supported <a href="" target="_blank">Platform Extension</a> and will no longer be officially supported by Epic in upcoming releases.</li> </ul> <h3 id="ios"><strong>iOS</strong></h3> <ul style="margin-left: 40px;"> <li>Support for iOS 10 has been removed. iOS 11 is now the minimum supported version.</li> <li>OpenGL on iOS will be removed in an upcoming release, potentially as early as 4.24.
Once removed, Metal will be the only rendering path for iOS devices.</li> </ul> Jeff Wilson | Wed, 04 Sep 2019 12:00:00 GMT

lineup at Unreal Engine User Group kick-starts Epic’s SIGGRAPH week

If you were part of the great crowd at the iconic Orpheum Theatre for this year’s Unreal Engine User Group, you know what an amazing lineup of presentations the speakers delivered. Unable to be there in person? Watch the recording here.

If you were part of the great crowd at the iconic Orpheum Theatre on Monday, we’re sure you’ll agree it was an awesome way to start the week. An amazing lineup of storytellers and visual effects experts presented their inspirational work, topped off by a fireside chat with a very special guest: director, producer, actor, and writer <a href="" target="_blank">Jon Favreau</a>. <br /> <img alt="blog_body_favreau_img1.jpg" height="auto" src="" width="auto" /><br /> For those of you who were not able to attend in person, or if you’d just like to relive the experience, the event is now available for viewing online.  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> After opening statements by General Manager of Unreal Engine <strong>Marc Petit</strong>, we kicked things off with Epic’s Worldwide Creative Director <strong>Donald Mustard</strong>—who just wrapped the Fortnite World Cup in New York—to hear his vision on the convergence of games, storytelling, and linear content.<br /> <br /> Then, CTO <strong>Kim Libreri</strong> provided a peek at the future roadmap for Unreal Engine, culminating in a video showing how next-generation virtual production tools coming to Unreal Engine 4.23 are transforming the art of filmmaking.
We partnered with <strong>Magnopus</strong>, <strong>Lux Machina</strong>, <strong>Quixel, Profile Studios</strong>, <strong>ARRI</strong>, and DP <strong>Matt Workman</strong> to show how greenscreens and post-production VFX are fast becoming old-school techniques, with a demonstration of our new in-camera VFX workflow, nDisplay-powered LED walls, and collaborative VR scouting tools.<br />   <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <br /> Next, we got a fascinating insight into developer Matt Workman’s UE4-based realistic cinematography visualization tool, <a href="" target="_blank">Cine Tracer</a>, which is designed from the ground up for people without 3D software experience, like your average director or DP.<br /> <br /> <strong>Doug Roble</strong>, Sr. Director of Software R&D at <strong>Digital Domain</strong>, was next on stage, where he introduced his virtual twin, <a href="" target="_blank">Digital Doug</a>. This breathtakingly realistic digital human has been a multiyear labor of love for Doug and his team, and has resulted in groundbreaking advances in machine learning.  <br /> <img alt="blog_body_digital_doug_img1.jpg" height="auto" src="" width="auto" /><br /> <strong>Stargate Studios</strong> CEO and Founder <strong>Sam Nicholson, ASC</strong> is equally forward-thinking. He showed how his studio is already shooting <a href="" target="_blank">VFX in-camera</a>, a technique that most studios are only now starting to realize is possible. 
This presentation was complemented by a demo in the lobby where attendees could try it for themselves.<br /> <br /> Another hands-on experience in the lobby was <strong>ILMxLAB</strong>’s <a href="" target="_blank">Vader Immortal</a>, and we were honored to have Executive in Charge <strong>Vicki Dobbs Beck</strong> take to the stage and give us some insights into the philosophy behind its creation. <br /> <br /> Star Wars fans are eagerly anticipating the opening of the new <a href="" target="_blank">Millennium Falcon: Smugglers Run</a> attraction at the end of August. But Unreal Engine User Group attendees got an advance viewing, courtesy of Technical Studio Executive <strong>Bei Yang</strong>, who joined us from <strong>Walt Disney Imagineering</strong>.<br /> <br /> <a href="" target="_blank">Game of Thrones</a> is the <a href="" target="_blank">most-watched HBO TV series</a> of all time, and received a historic <a href="" target="_blank">32 Emmy nominations</a> this year. While some of the storylines in the final season may have been controversial, the exceptional quality of the visuals has never been disputed. Virtual Production Supervisor <strong>Kaya Jabar</strong> from <strong>The Third Floor</strong> took us behind the scenes to explain how some of the most ambitious shots and sequences were achieved.<br /> <img alt="blog_body_ThirdFloor_img.jpg" height="auto" src="" width="auto" /><br /> Just when the presentation seemed complete, who should pop in for a fireside chat but <strong>Jon Favreau</strong>? Between directing, producing, acting, and writing, we were thrilled that he found the time to drop by. His IMDb profile has a long list of credits that goes back 25 years.
He directed and produced <a href="" target="_blank">Iron Man</a> and <a href="" target="_blank">Iron Man 2</a>, was Executive Producer on all of the Avengers films, and of very recent note, played the role of Happy Hogan in <a href="" target="_blank">Spider-Man: Far from Home</a> and <a href="" target="_blank">Avengers: Endgame</a>, and directed <a href="" target="_blank">The Lion King: Live Action</a>—all currently in theaters. He also directed and stars in Netflix’s <a href="" target="_blank">The Chef Show</a>.<br /> <br /> He’s considered one of the pioneering creative forces behind modern virtual production workflows. Favreau sat down with Epic’s Business Development Manager <strong>Miles Perkins</strong> to share his thoughts on how real-time technology is changing storytelling. <br /> <br /> We’d like to thank all our amazing presenters, and all of you who showed up to make it such a great afternoon. We hope you enjoyed this year’s Unreal Engine User Group as much as we did.<br /> <br /> Inspired by what you saw? <a href="" target="_blank">Download Unreal Engine</a> and start creating your own amazing experiences today.<br /> Wed, 31 Jul 2019 15:00:00 GMT

Virtual Production Field Guide: a new resource for filmmakers

Virtual production is impacting the way film and episodic television content is made. Find out what it’s all about, who’s using it, and how it blurs the lines between pre- and post-production while giving filmmakers more creative freedom than ever.

On the <a href="" target="_blank">Virtual Production hub</a>, Epic Games has been keeping you up to date with some of the latest and most creative advancements in virtual production.
As part of our continuing effort to make virtual production (VP) accessible to all, we’ve just published <a href="" target="_blank">The Virtual Production Field Guide</a>, a downloadable PDF intended as a foundational information document for anyone either interested in VP or already leveraging these techniques in production. <br /> <br /> <em>Virtual production</em> is a broad term for a wide range of computer-aided filmmaking methods that are intended to enhance creativity and save time. The use of real-time tools can turn the traditional linear pipeline into a parallel process where the lines between pre-production, production, and post-production are blurred, making the entire pipeline more fluid and collaborative. <br /> <a href="" target="_blank"><img alt="blog_body_field_guide_CTAv2-(1).jpg" height="auto" src="" width="auto" /></a><br /> Virtual production workflows can be faster, more creative, more iterative, and more collaborative, giving a better sense of the final shots and scenes much earlier in the production process. With a virtual set built ahead of time, a director and department heads can scout locations in VR, explore shots, and lay out exact camera angles and moves. Instead of green screen, LED walls allow for in-camera effects, where sets can be changed in real time while providing more accurate real-world lighting and ultimately informing performances. These are just a few of the ways filmmakers are using real-time technology to be more creative in the moment, having ultimate creative control over the story they are telling.<br /> <br /> <em>The</em> <em>Virtual Production Field Guide </em>explores these techniques, and many more. You’ll gain a basic understanding of how VP works, discover details of projects which have already used it, and develop an understanding of how the latest workflows introduce an element of creative spontaneity that is sometimes missing in modern filmmaking. 
<br /> <img alt="blog_body_ThirdFloor_img1.jpg" height="auto" src="" width="auto" /><br /> The guide also outlines how each film department can leverage VP workflows, and how those with experience in traditional methodologies can transfer their knowledge to this new creative sandbox. Lastly, we feature interviews with a number of film professionals who discuss how they’ve used virtual production workflows, including director/actor Sir Kenneth Branagh, Oscar-winning visual effects supervisor Ben Grossman, and cinematographer Bill Pope, ASC.<br /> <img alt="blog_body_KG_BenG_img.jpg" height="auto" src="" width="auto" /><br /> <a href="" target="_blank">The Virtual Production Field Guide</a> is free to download and share. Get your copy today and find inspiration for your own projects, then visit our <a href="" target="_blank">Virtual Production hub</a> for more case studies, podcasts, and insights.<br /> Miles Perkins | Thu, 25 Jul 2019 15:30:00 GMT

simplifies real estate with immersive VR archviz solution

Built in Unreal Engine, ZipView offers a high-quality virtual look into residential and commercial buildings via a fully immersive 3D VR environment. In this guest blog, Olim Planet’s Director of Platform Team Mark Kim talks about how the product is shaking up Korea’s real estate industry.

<em>In this guest blog, we invite Mark Kim, Director of Platform Team at Korean technology company <a href="" target="_blank">Olim Planet</a>, to talk about the company’s brainchild, ZipView, an Unreal Engine-powered real estate platform.</em><br /> <br /> <br /> ZipView literally means “viewing the house”—zip is the Korean word for house.
But ZipView goes beyond the act of just physically seeing a place; it encompasses activities like visiting showrooms before making a purchase, receiving consultations from a realtor, buying and selling real estate, signing leases or rental contracts, and browsing interior designs. <br /> <br /> The platform acts as a new distribution ecosystem that connects businesses with clients, and leverages advanced technology for all stages of real estate transactions including sales, brokerage, leasing, development, management, and residential services. Our goal is to make transactions between businesses and clients more accessible and reliable.<br /> <br /> Using ZipView, clients can easily and conveniently make real estate transactions online or offline, when and where they want, and with significant time and cost savings; the immersive 3D VR environment provides a virtual showroom that offers the same realistic experience for viewing residential and commercial buildings as physically being on site.<br /> <img alt="Spotlight_OlimPlanet_blog_body_devices_img.jpg" height="auto" src="" width="auto" /> <h3><img alt="Spotlight_OlimPlanet_blog_body_VR_img.jpg" height="auto" src="" width="auto" /><br /> <img alt="Spotlight_OlimPlanet_blog_body_customize_img.jpg" height="auto" src="" width="auto" /><br /> <strong>Unreal Engine and Datasmith drastically reduce time and costs</strong></h3> In order to meet the client’s expectations, the immersive real estate content has to be as realistic as the real thing, to eliminate the need for an actual viewing. ZipView’s VR content, powered by Unreal Engine’s photorealistic real-time rendering, immerses the user in the experience of being in an actual house, with sunlight shining through the windows and inviting furniture placed in the virtual rooms. The exceptional quality gets people asking us if the service uses photographs. Many of them are surprised to hear that it is entirely CG.
<br /> <br /> Unreal Engine was the natural choice to achieve this level of hyperrealistic quality, to manage large assets, and to maintain stable, high frame rates for VR. Unreal Engine also greatly enhanced the productivity of our content creation pipeline. Prior to using Unreal, creating the final output required complicated pipelines and long wait times. Unreal Engine’s real-time rendering enables us to quickly view the final pixels in real time, and even offers the option to export images and videos for various marketing materials. With the introduction of Datasmith (bundled with Unreal Engine in Unreal Studio), we are able to directly import 3ds Max and CAD data into the Unreal Editor, significantly reducing the overall production time. <h3><img alt="Spotlight_OlimPlanet_blog_body_editor_img1.jpg" height="auto" src="" width="auto" /><img alt="Spotlight_OlimPlanet_blog_body_editor_img2.jpg" height="auto" src="" width="auto" /><br /> <strong>Enhancing the user experience with Unreal Engine</strong></h3> Most importantly, Unreal Engine is extremely accessible. ZipView’s 3D designers, who were accustomed to other software packages, were able to quickly learn Unreal from tutorials and templates. Also, the Unreal community offers tips and know-how from around the world, coupled with high-quality assets that are freely available. <br /> <br /> Unreal’s features—such as large-scale collaborative build capabilities, data management, and profiling—boost our work effectiveness and help us manage multiple projects. In addition, we continue to improve productivity in ZipView through various proprietary automation plugins added onto Unreal Engine. 
<br /> <br /> Last but not least, Unreal Engine enables us to keep our platform up to date with the latest real-time technology by offering features like <a href="" target="_blank">real-time ray tracing</a> and virtual production tools, which consistently improve user experiences and help keep ZipView at the forefront of the industry. <h3><img alt="Spotlight_OlimPlanet_blog_body_room_img.jpg" height="auto" src="" width="auto" /><br /> <img alt="Spotlight_OlimPlanet_blog_body_outdoor_img2.jpg" height="auto" src="" width="auto" /><br /> <strong>Saving time and cost for construction companies and clients</strong></h3> With ZipView’s immersive 3D VR environment, clients can experience all aspects of houses before they are constructed including the interior space, the view of the surrounding environment, and the building exterior. At the same time, construction companies can operate online and offline digital showrooms throughout the country based on ZipView’s VR content and real estate distribution ecosystem, reducing both the construction fee for mock-ups and the indirect cost of promoting content. <br /> <br /> By offering the option to customize various interactive content like the view on each floor—using a physical single-floor apartment space with fake windows made of LED screens—the balcony structure, the furniture, the interior design, and more, ZipView enhances the marketability of real estate and delivers a unique, immersive experience to clients. There have been cases of sales being closed on multi-purpose residential buildings simply through ZipView, prior to building mock‑ups. <br /> <img alt="Spotlight_OlimPlanet_blog_body_window_img.jpg" height="auto" src="" width="auto" /><br /> Since commercial facilities like shops, malls, hotels, resorts, and other business complexes are difficult to replicate, ZipView is the sole provider of a virtual viewing experience in the Korean market. 
Customers can easily view houses by accessing the online service across various platforms that include PC and mobile, or through offline businesses that use ZipView. <h3><img alt="Spotlight_OlimPlanet_blog_body_user_img.jpg" height="auto" src="" width="auto" /><br /> <strong>Popularizing high-quality, immersive 3D VR content through proprietary technologies</strong></h3> Having created Korea&#39;s first immersive 3D VR interior space experience in 2015, we welcomed Daelim Industrial Co., Ltd, one of the leading construction companies in the country, as our first partner to install ZipView at the sales site. They received many accolades for their launch of the first VR showroom. <br /> <br /> Last year, ZipView was well-received at the exhibition hall of Acro Seoul Forest, which had the highest sales price across the nation. ZipView is now being used in the real estate sales market by top apartment brands and construction companies in Korea, enabling clients to experience Korea’s iconic landmark complexes such as Hannam The Hill, Signiel Residence, Trimage, and the Galleria Foret. ZipView is expanding into the global market, starting off in Vietnam in the first half of 2019 with plans to enter markets in Malaysia and China.<br /> <br /> In addition, ZipView owns numerous proprietary technologies and patents on VR technology as a result of the team’s R&D efforts. The company registered a patent for its immersive 3D VR interior space information web viewer technology, which was awarded the grand prize for Best Patent Products in the latter half of 2016. ZipView also has two patents relating to the easy creation and distribution of 3D objects in interior virtual spaces. <br /> <br /> ZipView significantly reduced the time and cost of creating and distributing existing VR real estate content by developing a modularized automation system, and has contributed to the popularization of high-quality, immersive 3D VR content. 
Currently, ZipView has the most VR real estate data in Korea.<br />  <br /> <br /> Want to create your own high-quality, immersive 3D VR content? <a href="" target="_blank">Download the free Unreal Studio beta</a>, which includes Unreal Engine and Datasmith, today. <br /> Mark Kim, Director, Olim Planet | Thu, 19 Sep 2019 15:30:00 GMT

Turtle Rock Studios' Journey of the Gods leverages Blueprints to bring designers into the coding process

Journey of the Gods uses everything Turtle Rock Studios has learned about VR thus far to become one of the medium&#39;s best games.

Perhaps most known for being the studio behind the original <em>Left 4 Dead</em>, Turtle Rock Studios has been at the forefront of modern VR development. The California-based company worked on VR titles that include <a href="" target="_blank">The Well</a> and the <a href="" target="_blank">Face Your Fears</a> series. The studio&#39;s newest release, <em>The Legend of Zelda</em>-inspired <a href="" target="_blank">Journey of the Gods</a>, encompasses everything they have learned about the medium, with the developer setting out to make a substantial, in-depth VR game. With a beautiful and wondrous world to explore, <em>Journey of the Gods</em> released to rave reviews and currently has a <a href="" target="_blank">near five-star rating</a> on the Oculus store. To see how Turtle Rock Studios accomplished its monumental goal, we interviewed eight team members across art, engineering, and more. <br /> <br /> Considering you play as both a giant god and a human in the game, the team talks about how they were able to play with scale in VR. The developers also elaborate on how they designed the game&#39;s unique and challenging enemies and balanced the combat for the medium.
In addition, the studio discusses how it optimized performance for the mobile Oculus Quest headset and explains how the game&#39;s beautiful minimalist aesthetic helped make it more performant without sacrificing visual fidelity. Finally, Turtle Rock talks about the advantages of working with UE4 and shares some of the benefits they were able to glean from Blueprints. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong>Journey of the Gods has drawn a lot of comparisons to The Legend of Zelda. Was that franchise influential in the project? What other titles might have inspired the game?<br />  <br /> Designer Chris Ashton: </strong>The original pitch for <em>Journey of the Gods</em> was more of an old-school god game. We were referencing games like <em>Black and White</em>. The idea was that you could summon a god and then command it through gestures, effectively remote controlling it to solve puzzles by moving large objects, carrying villagers, using fire, etc. The hook for VR was seeing these giant gods manipulating the world while under your command.<br /> <img alt="DeveloperInterview_Journey_of_the_gods_004.jpg" height="auto" src="" width="auto" /><br /> As we were building the prototype, we found it was more fun to actually become the god and play it in first-person (as opposed to remote controlling it). The world looks small from that point of view, you feel very powerful, and you get to manipulate the world directly.<br />  <br /> This made the god mode feel great, and in comparison, the human mode felt a little lackluster. We decided to try making the human player responsible for fighting most of the enemies in the world and using the god mode for puzzle solving.
The two sides would work together, almost like you&#39;re playing both sides of a co-op game.<br />  <br /> That&#39;s when we started to reference <em>The Legend of Zelda</em>, primarily as an example of good action, adventure, and pacing.<br />  <br /> Zelda also made us feel comfortable having speech bubbles. Originally, we planned to have no talking in the game, but we soon discovered that some of the gestures and mission goals were extremely difficult to make intuitive. Sometimes you just need to tell the player why they are here or how to use an ability. It&#39;s kind of an old-school way to communicate to the player, but Zelda proves that it still works today, even in big-budget games.<br />  <br /> <strong>Journey of the Gods features a beautiful minimalist look. How did the studio come up with the visual style of the game? <br />  <br /> Artist Justin Cherry: </strong>From an art perspective, we wanted to make sure the player experience had a sense of wonder. The minimalist style allowed us to insert otherwise unbelievable features (disconnecting limbs, having floating objects, etc.) without having to make that a part of the fiction.<br />  <br /> We broke down all identifiable parts of characters or environments and distilled those into archetypal shapes, to mimic the glyph-like nature of the text and overall game narrative.<br /> <img alt="DeveloperInterview_Journey_of_the_gods_005.jpg" height="auto" src="" width="auto" /><br /> <strong>Considering Journey of the Gods runs and looks great on the Quest, how did you optimize performance around Oculus&#39; mobile hardware? <br />  <br /> Engineer Ryan Adams: </strong>We started by identifying the type of platform we were attempting to release on and picked an art style, using minimal textures and unlit materials, that we knew would allow us to be able to present the experience and game world we wanted. 
Then, using our set of profiling tools to identify the areas of the game we needed to improve upon, we began iterating. We made sure to leverage all of the assets and features that Unreal makes available to optimize the game, such as <a href="" target="_blank">level streaming</a>, <a href="" target="_blank">LODs</a>, instancing, etc. We also made sure to take performance into consideration as we designed each part of the game from start to finish, rather than trying to address it at the end of the project.<br />  <br /> <strong>In the game, players can quickly swap between using a sword and a shield or a crossbow, all of which are fun to use and are upgradeable throughout the course of the game. How did the studio ultimately decide on those weapon sets?</strong><br /> <img alt="DeveloperInterview_Journey_of_the_gods_011.jpg" height="auto" src="" width="auto" /><br /> <strong>Ashton: </strong>We wanted the weapons and combat to start very simple and get more interesting over time. All weapons can usually be lumped into one of two categories - melee or ranged. If we had one of each, players could easily cycle between the two. There would be no need for an inventory and elaborate weapon-selection mechanic.<br />  <br /> Depth would be added through different enemies that must be defeated in different ways and weapon upgrades that would allow the player to choose from an assortment of strategies while still only wielding the two sets of weapons.<br />  <br /> A sword and a shield are classic, and in VR, they feel especially great, so that was an easy choice. For our ranged option, we actually started with a slingshot, then moved to a bow and arrow and ultimately wound up with the crossbow. 
In playtests, players cited the crossbow as feeling the coolest and easiest to aim, so that&#39;s what we stuck with.<br /> <img alt="DeveloperInterview_Journey_of_the_gods_012.jpg" height="auto" src="" width="auto" /><br /> <strong>How did you approach designing sword combat so that players couldn&#39;t just lackadaisically swing their swords back and forth, yet didn&#39;t require such force that players would quickly grow tired?<br />  <br /> Engineer Gary Kroll: </strong>We detected average linear velocity over time instead of checking the speed each frame. This prevents tiny but fast movements from activating the sword. We also measured the velocity closer to the hilt than the tip of the sword, which made it easy to reach the required speed when moving your hand and the hilt while making it hard to just waggle the sword since that action only moves the hilt of the sword slightly. Then we kept adjusting until people [thought it felt right].<br />  <br /> <strong>Players can turn into a behemoth-sized god to solve environmental puzzles or to easily take down enemies. How did you come across this size-shifting concept?<br />  <br /> Ashton:</strong> As mentioned earlier, god mode came out of the original pitch, but it was realized in a way we didn&#39;t expect. In VR, the world has depth because each eye sees the world from its own perspective. Add these two different perspectives together and our brain perceives a 3D world.<br />  <br /> Behind the scenes, the game engine is rendering from two different cameras. If you change how far apart these cameras are, you can make the world feel really big or really small. What it&#39;s simulating is the distance between your eyes, as if your head was human-sized or giant. When you first see the Kraken boss, you&#39;re human sized and he&#39;s huge, but when you go into god mode, you&#39;re looking down on the Kraken and he seems small. 
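Kroll's swing-detection approach above (averaging the hilt's linear velocity over a window of frames rather than reacting to per-frame speed) can be sketched as follows. This is a minimal, hypothetical illustration, not Turtle Rock's code; the class name, window size, and threshold are all invented:

```python
# Hypothetical sketch of the swing detection Kroll describes: average
# the hilt's linear velocity over a short window instead of reacting to
# instantaneous per-frame speed, so tiny fast "waggles" don't register.
from collections import deque

class SwingDetector:
    """Activate the sword only when the *average* hilt speed over a
    window of recent frames exceeds a threshold. Positions are 1D here
    for brevity; a real game would use 3D vectors. All numbers are
    illustrative, not tuned gameplay values."""

    def __init__(self, window_frames=10, threshold_m_per_s=1.5):
        # Track hilt (not tip) positions: a wrist waggle barely moves
        # the hilt, while a real swing moves it a lot.
        self.positions = deque(maxlen=window_frames + 1)
        self.threshold = threshold_m_per_s

    def update(self, hilt_position, dt):
        """Feed one frame's hilt position; returns True while swinging."""
        self.positions.append(hilt_position)
        if len(self.positions) < 2:
            return False
        # Average linear speed across the window: total distance the
        # hilt travelled divided by the elapsed time.
        pts = list(self.positions)
        dist = sum(abs(b - a) for a, b in zip(pts, pts[1:]))
        elapsed = (len(pts) - 1) * dt
        return dist / elapsed >= self.threshold
```

Because the average is taken over the whole window, a brief high-speed waggle contributes very little total hilt travel and stays below the threshold, while a sustained swing accumulates enough distance to register.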
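Ashton's camera-separation trick can be put in numbers: if the two render cameras sit N times farther apart than human eyes, the whole world reads as 1/N of its size. A toy sketch of that relationship (the constant and function names are hypothetical, not UE4 API):

```python
# Hypothetical illustration of the stereo-scale trick Ashton describes:
# perceived world scale is inversely proportional to the separation
# between the two render cameras (the simulated inter-eye distance).
HUMAN_IPD_CM = 6.4  # typical adult inter-pupillary distance, assumed value

def perceived_scale(camera_separation_cm):
    """If the virtual cameras are N times farther apart than human
    eyes, the world appears 1/N of its true size to the viewer."""
    return HUMAN_IPD_CM / camera_separation_cm

def apparent_height_m(true_height_m, camera_separation_cm):
    """Apparent size of an object under a given camera separation."""
    return true_height_m * perceived_scale(camera_separation_cm)
```

Under this model, a tenfold camera separation in god mode makes a 30 m Kraken read as roughly 3 m tall, which is exactly the sense of looking down on it.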
The results are spectacular and only something you can really experience in a 3D medium.<br /> <img alt="DeveloperInterview_Journey_of_the_gods_010.jpg" height="auto" src="" width="auto" /><br /> <strong>Journey of the Gods features a wide array of enemies, each with their own unique attack patterns, strengths, and weaknesses. This includes gigantic boss battles. How did you come up with enemy designs both from an aesthetic and gameplay perspective?<br />  <br /> Ashton: </strong>We usually come up with a gameplay design first and then we pitch it to the other team leads in a "kick-off" meeting. Everyone has a chance to react, give feedback, and pitch their own ideas, which sometimes results in design modifications. Once we&#39;re happy with the design, the concept team explores what that character might look like and we go from there.<br />  <br /> On the design side, our ideal is that each enemy should add value to the player experience. So we add a lot of variety in how they attack, how they move, how the player has to respond to their actions, etc. We introduce players to each new enemy in isolation so they can learn how to defend against and defeat it. If the player&#39;s interaction with each enemy is unique, then it starts to get really interesting when you face combinations of enemies. Now the player has to be very strategic and approach different combinations in different ways. It adds a lot of depth to the combat.<br />  <br /> <strong>With rolling hills, life-size waterfalls, massive treetops to walk across, and expansive vistas, the world in Journey of the Gods is gorgeous and plays with elevation change more than most VR games. How did you approach designing the game&#39;s environments?<br />  <br /> Artist Brenton Hesse:</strong> Great question. The design team first starts with ironing out a space that would feel interesting to navigate and engage in combat with enemies. 
While this goes on, we have our concept artist put together a look and feel of what the biome could be. Then the environment modeling team takes both the concept art and design framework and works with the art director to make a space that both looks good to run around in and gives the player interesting navigation and gameplay.<br /> <img alt="DeveloperInterview_Journey_of_the_gods_001.jpg" height="auto" src="" width="auto" /><br /> <strong>The game allows players to play seated, standing, or with room-scale. How important was it for the studio to facilitate all these modes of play?<br />  <br /> Producer Chloe Skew:</strong> With <em>Journey of the Gods</em> alternating frequently between exploration and combat, we wanted to take full advantage of the untethered Oculus Quest headset, allowing players to freely explore their environments and react instinctively (for example, to enemies approaching from behind) without worrying about getting tangled or losing controller tracking at crucial moments. At the same time, we know that players have different preferences for how they play in VR, so we wanted to ensure that the seated experience would be just as satisfying.<br />  <br /> <strong>What made UE4 a good fit for the game?<br />  <br /> Ashton:</strong> In addition to PC and consoles, Unreal runs on <a href="" target="_blank">Android</a> and supports <a href="" target="_blank">Oculus</a>, making it a one-stop shop for all projects in our studio. It allows us to share tech and knowledge between different dev teams. 
Best of all, from a game designer&#39;s perspective, the awesome <a href="" target="_blank">Blueprints</a> tools allow designers to effectively write game code, which meant that all of our levels could have different gameplay modes and objectives (something that was impossible on previous non-UE4 projects).<br /> <img alt="DeveloperInterview_Journey_of_the_gods_007.jpg" height="auto" src="" width="auto" /><br /> <strong>Does the studio have any favorite UE4 features or tools?<br />  <br /> Designer Chris Holmes:</strong> Blueprints is an extensive and versatile tool that allows our designers to be a direct part of the development process from prototype to final product. We&#39;re able to take a large load off our engineers to allow them to focus on core features like AI, combat, player interactions, etc. by putting a lot of experimental prototype work and level scripting on designers using Blueprints. This is especially valuable for a smaller team. When a designer has an idea for a feature, puzzle, world interaction, or anything really, they can spend a day or sometimes just a couple hours setting up a test to demonstrate it. If it sticks, we&#39;ll iterate on it, and depending on the needs, we may eventually task an engineer to create the final shipping version or just polish the Blueprints version.<br />  <br /> Blueprints is incredibly powerful and flexible for level scripting. We&#39;re able to create and iterate on any sequence of events necessary for a given level based on conditions, timing, triggers, etc. 
On <em>Journey of the Gods</em>, we were able to essentially create whole new game modes for many of our maps using <a href="" target="_blank">Level Blueprints</a>, not to mention setting up complex puzzle logic, creating various enemy spawning systems for different encounters, game state tracking for checkpoints, and even unique scripted events for <a href="" target="_blank">AI</a> characters and real-time cinematics!<br />  <br /> <strong>Having worked on several VR titles, including <a href="" target="_blank">Face Your Fears</a>, The Well, and Journey of the Gods, what have you learned about the medium thus far and what do you think of its future?<br />  <br /> Ashton: </strong>We have been fortunate to ease into VR and learn a lot of good lessons along the way.<br />  <br /> <em>Face Your Fears</em> taught us that VR amplifies your reactions. A scary game is more scary in VR. A beautiful game is more beautiful in VR. It feels like anything we build in VR has more impact than it would in another medium. Your work carries more weight and results in a bigger payoff, which is super cool and rewarding as a game developer.<br />  <br /> We are constantly learning. Back when we were working on <em>Face Your Fears</em>, the consensus was that movement in VR was a non-starter, that you couldn&#39;t do it without players getting sick. That&#39;s why teleporting was the early go-to movement mechanic. But we started to experiment with movement in <a href="" target="_blank">The Well</a> and learned that acceleration and deceleration are key factors in player comfort. We had confidence that we could make free movement work, so that became a requirement for <em>Journey of the Gods</em>. And through building that game, we have learned even more.<br /> <img alt="DeveloperInterview_Journey_of_the_gods_002.jpg" height="auto" src="" width="auto" /><br /> We&#39;re also learning that the VR community is unique. Immersion, for example, is more important than ease of use. 
Opening a door, for instance, can be super simple and easy in a console game, requiring the player only to look towards the door and press the "use" key. In VR, the preferred method would be for the player to reach out, grab the door knob, rotate it, and push the door to open it. What&#39;s important isn&#39;t the actual door mechanic; it&#39;s that you&#39;re using your hands to manipulate the world around you in a manner that makes sense and makes the world more believable.<br />  <br /> <strong>The game has a <a href="" target="_blank">near five-star rating</a> on the Oculus store with many fans asking for a sequel. What has it been like to see such a positive player reaction and might we see more Journey of the Gods content moving forward?<br />  <br /> Ashton:</strong> The positive reviews are a huge boon for the team and really make us feel like we are on the right track. We believe that VR gamers are starving for real games with substance. Games that are interesting for longer periods of time, that leverage more than a few game mechanics. Games that feel like they are a complete experience. That was the goal for us on <em>Journey of the Gods</em>, and a lot of the reviews are backing up that dev philosophy.<br />  <br /> I know the dev team would love to work on a sequel. When developing a new IP, you spend a lot of time and resources figuring out what the game is. If we get the chance to work on a sequel, we get a head start and we can focus on doing crazier stuff with more content.<br />  <br /> <strong>Thanks for your time. 
Where can people learn more about Journey of the Gods?<br />  <br /> Community Manager Alissa Barry-Toth:</strong> You can join us on our <a href="" target="_blank">official Discord</a>, where a bunch of the devs are available to chat, or check out the <a href="" target="_blank">store page</a>.<br /> Games | Art | Blueprints | Design | VR | Turtle Rock Studios | Journey of the Gods | Jimmy Thang | Tue, 17 Sep 2019 11:30:00 GMT<br /> <br /> <strong>Zero Density delivers live broadcast virtual production solutions</strong><br /> Zero Density brings the paradigm of traditional VFX-style node-based compositing into live production broadcast studios with its Unreal Engine-implemented Reality Engine solution. Founded in Istanbul in 2014, <a href="" target="_blank">Zero Density</a> is an international technology company dedicated to developing creative products for the broadcast, augmented reality, live event, and e-sports industries. The founders, who shared a common background in broadcast and media, came together on a mission to use their creativity to develop innovative virtual production solutions for live broadcast. The company’s headquarters remain in Turkey, but it now has an extensive network of clients all over the world. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> Zero Density provides unique and visually stunning broadcast experiences to millions of viewers with <a href="" target="_blank">Reality Engine</a>, its photorealistic virtual studio and augmented reality solution powered by Unreal Engine. Core to Zero Density’s technical initiative are its advanced real-time node-based compositing tools and Reality Keyer, its proprietary keying technology. From the outset, the team saw that it was pivotal to have this technology deeply integrated with Epic’s Unreal Engine—all running on the GPU. 
<br /> <img alt="VP_ZeroDensity_blog_body_studio_img.jpg" height="auto" src="" width="auto" /><br /> Traditional post-production artists find Reality Engine’s user interface and approach very familiar, and not unlike Autodesk’s Flame Batch tool. But unlike Flame, Zero Density’s virtual production solution is designed to work on air in a broadcast environment, which necessitates being frame-accurate and ensuring a frame is never dropped.<br /> <br /> The system takes the sophistication and flexibility of a post-production workflow and provides it in a robust broadcast solution—something that has not previously been done. Traditionally, graphics packages have only supplied fill and key signals and assumed a basic composite in the vision switcher. Zero Density provides a much more powerful system that understands gamma, linear workflows, and LUTs. It adds back into a live environment the controls normally reserved for offline solutions. <br /> <img alt="VP_ZeroDensity_blog_body_election_img.jpg" height="auto" src="" width="auto" /><br /> After two years of intensive R&amp;D, the team succeeded in integrating game-industry technology with the rigors of live-to-air broadcasting. The company first released the resulting product, which enables multiple Unreal Engine cameras to be set up on the virtual set, at the National Association of Broadcasters (NAB) conference in 2016. The release marked something of a milestone: Zero Density became the first company to use Unreal Engine in broadcast. The product went on to win a set of major awards at the International Broadcasting Convention (IBC) in Amsterdam just months later. <br /> <br /> Because Epic gives full access to the Unreal Engine source code, all of Zero Density’s code lives in Unreal Engine, making use of shader compilers, C++, and Unreal Engine’s cross-platform functionality. For example, the company’s central Reality Keyer is actually implemented as a shader in the GPU code. 
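The gamma-aware compositing mentioned above is worth a concrete illustration. The sketch below wraps a linear-light "over" blend in the standard sRGB decode/encode; it is generic color math, not Zero Density&#39;s code:

```cpp
#include <cmath>

// Compositing in linear light. Blending gamma-encoded sRGB values directly
// darkens mixes and edges; decoding to linear first gives photometrically
// correct results. These are the standard sRGB transfer functions.
float SrgbToLinear(float c) {
    return c <= 0.04045f ? c / 12.92f : std::pow((c + 0.055f) / 1.055f, 2.4f);
}

float LinearToSrgb(float c) {
    return c <= 0.0031308f ? c * 12.92f : 1.055f * std::pow(c, 1.0f / 2.4f) - 0.055f;
}

// "Over" composite of one channel: fg over bg with matte alpha, in linear light.
float CompositeOver(float fgSrgb, float bgSrgb, float alpha) {
    float lin = SrgbToLinear(fgSrgb) * alpha + SrgbToLinear(bgSrgb) * (1.0f - alpha);
    return LinearToSrgb(lin);
}
```

A naive gamma-space blend of white over black at 50 percent alpha would return 0.5; the linear-light result encodes to roughly 0.74, which is why gamma-naive compositing looks visibly wrong on air.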
<br /> <br /> In terms of its keying conceptualization, Reality Keyer is not dissimilar to the Foundry’s advanced Image Based Keyer (IBK), which is found in the company’s Nuke compositing solution. Reality Keyer works with a clean plate and combines this with the system’s tracking functionality to produce a mesh representation of any studio green screen <a href="" target="_blank">cyclorama</a>. The 3D representation is generated by the program, allowing for cleaner keying and dynamic garbage masking to produce a clean and realistic keyed final image, even with sweeping pans and tilts. The system’s use of projection mapping of the clean plate assists the keying and makes the system much more advanced than just a normal chroma keyer. It is the first and only real-time image-based keyer with such advanced clean plate technology, yet it is implemented as a shader inside UE4.<br /> <img alt="VP_ZeroDensity_blog_body_cricket_img.jpg" height="auto" src="" width="auto" /><br /> FOX Sports, which adopted Reality Engine in February 2019, makes extensive use of the keying technology in its NASCAR Race Hub. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> Reality Engine is designed to work with a wide range of industry tracking technology. Zero Density takes in tracking data from most of the major solutions, along with lens information if those systems provide it. If no lens information is available, the software starts with the zoom and focus information from the last calibration and then estimates lens curvature and other properties such as field of view—all in Unreal Engine. <br /> <br /> While the company is very well known for its work with broadcasters such as FOX Sports, Zero Density is now bringing in clients who are producing episodic TV such as children’s programming. 
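Conceptually, an image-based key with a clean plate derives the matte from how far each live pixel strays from the projected clean-plate pixel at the same screen position, rather than from distance to a single chroma color. The function below is a deliberately simplified, hypothetical illustration of that idea (the tolerance parameters are invented); it is not Reality Keyer&#39;s actual math:

```cpp
#include <cmath>

// Illustrative clean-plate difference key (hypothetical, simplified).
// alpha rises from 0 (pure background) to 1 (pure foreground) as the live
// pixel diverges from the clean-plate pixel at the same location.
struct RGB { float r, g, b; };

// coreTol: differences below this are treated as pure background.
// softTol: differences above this are treated as pure foreground.
float CleanPlateKey(const RGB& live, const RGB& plate, float coreTol, float softTol) {
    float dr = live.r - plate.r;
    float dg = live.g - plate.g;
    float db = live.b - plate.b;
    float diff = std::sqrt(dr * dr + dg * dg + db * db);
    // Linear ramp between the two tolerances, clamped to [0, 1].
    float a = (diff - coreTol) / (softTol - coreTol);
    if (a < 0.0f) a = 0.0f;
    if (a > 1.0f) a = 1.0f;
    return a;
}
```

Because the clean plate is projection-mapped from the tracked camera, the comparison stays valid even while the camera pans and tilts, which is what a fixed-color chroma keyer cannot do.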
These projects shoot on green screen and then use the recorded live output to immediately go to final editorial, without the need to engage in traditional offline rendering and post-production visual effects.<br /> <br /> In this episodic workflow, the software can output not only the final composite, but also various key layers such as garbage masks. In the future, the team would like to move from this level to an even more advanced version that would support a real-time workflow that also outputs additional data to enable the whole final comp to be tweaked and re-edited in post. <br /> <br /> One huge improvement in recent times has been the addition of real-time ray tracing using NVIDIA RTX graphics cards. At NAB in April 2019, Zero Density previewed version 2.8 of the Reality Engine with ray tracing implemented, demonstrating how video screens on the virtual sets can now reflect onto other relevant parts of the set, together with improved shadows and many other optical enhancements. The company released version 2.8 in August. The latest version, 2.9, will be previewed at <a href="" target="_blank">IBC</a> in Sept 2019. <br /> <br /> Reality Engine is a powerful, robust, high-end solution, and Zero Density not only provides the main system, but also the necessary automation, monitoring, and controlling interfaces. The system is designed to run on multiple Unreal Engine instances that are all controlled and managed as one. Case in point, Turkish arts and culture channel TRT2 broadcasts seven different shows with its virtual studio, which is powered by three Reality Engines.  
<div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> With the move to virtual production, Zero Density is also showing how broadcast workflows with tight on-air requirements can provide valuable tools to a variety of programs and deliver complex visuals to increasingly demanding live broadcasts.<br /> <br /> <br /> This article is part of our <a href="" target="_blank">Visual Disruptors</a> series. Visit our <a href="" target="_blank">Virtual Production</a> hub for more interviews, articles, insights, and resources.<br /> Broadcast | Enterprise | Film And Television | Virtual Production | Zero Density | Reality Engine | Mike Seymour | Wed, 11 Sep 2019 17:30:00 GMT<br /> <br /> <strong>Unreal Dev Days 2019</strong><br /> Taking place in Seattle and NYC, Unreal Dev Days offers presentations and strategic info about Unreal Engine to help developers be as successful as possible. Plus, get the details on our first-ever Unreal Indie Dev Days!<br /> <br /> Epic is pleased to announce Unreal Dev Days 2019, taking place at the Sheraton Grand Seattle on <strong>October 8, 2019</strong> from <strong>10 AM - 6 PM</strong>, with check-in starting at <strong>9 AM</strong>.<br /> <br /> We’re planning a fantastic day of content featuring Epic presenters and partners, designed to provide insight into the Unreal Engine roadmap, a preview of our open-world features, workflows for large-scale teams and projects, and the latest from our Seattle-based XR team. The full event agenda will be coming soon.<br /> <img alt="Events_DevDays2019_blog_body_image.jpg" height="auto" src="" width="auto" /><br /> We will also be hosting Unreal Dev Days in NYC on <strong>November 5</strong> for our East Coast developers, where we’ll deliver the same content lineup as in Seattle. The specific venue information will be announced soon. 
Those interested in attending can register for updates below.<br /> <br /> In addition to Unreal Dev Days, a separate Unreal Indie Dev Days event will take place this year at the Sheraton Grand Seattle on <strong>October 9, 2019</strong> from <strong>10 AM - 6 PM</strong>, with check-in starting at <strong>9 AM</strong>. This event will also feature Epic presenters and partners and is designed to provide smaller dev teams and those new to Unreal with strategic info about the Unreal Engine ecosystem, best practices, and the latest features in UE 4.23.<br /> <img alt="Events_DevDays2019_blog_body_indie_image.jpg" height="auto" src="" width="auto" /><br /> Registration is $25 per person, per event. This includes breakfast, lunch, and an invite to our evening networking mixer where drinks and light hors d&#39;oeuvres will be served. <br /> <br /> Specific presentation agendas for both Unreal Dev Days 2019 and Unreal Indie Dev Days 2019 will be announced soon, but you can secure your attendance for either (or both) of the Seattle events and register for updates regarding our NYC event below right now.<br /> <br /> <strong>Unreal Dev Days 2019 (Seattle) | October 8</strong> - <a href="" target="_blank">REGISTER NOW</a><br /> <strong>Unreal Indie Dev Days 2019 (Seattle) | October 9</strong> - <a href="" target="_blank">REGISTER NOW</a><br /> <strong>Unreal Dev Days 2019 (NYC) | November 5</strong> - <a href="" target="_blank">REGISTER FOR UPDATES</a><br /> <br /> We hope to see you there!<br /> Unreal Dev Days | Events | Unreal Indie Dev Days | Unreal Dev Days 2019 | Daniel Kayser | Tue, 10 Sep 2019 15:00:00 GMT<br /> <br /> <strong>MotoGP 19 - A lovingly crafted motorcycle racing experience</strong><br /> With AI that’s fueled by machine learning, coupled with a new online infrastructure and tons of historical content, MotoGP 19 is the series’ best installment yet. 
Developed by <a href="" target="_blank">Milestone srl</a>, the MotoGP series has long been revered as one of the best motorcycle racing game franchises ever created. <a href="" target="_blank">MotoGP 19</a> really kicks it into high gear by being the Italian developer’s best iteration yet. This installment leverages machine learning to create realistic AI that learns to race like us, which ensures that the AI is both ruthless and fair. This follow-up also revamps MotoGP’s online infrastructure by introducing dedicated servers and, for the first time, allows players to customize the look of their riders. The fact that MotoGP 19 includes a plethora of historical challenges, classic race courses, and legendary riders is icing on the track. <br /> <br /> To see how the studio developed such an impressive game, we interviewed Producer Michele Caletti. He talks about how they balanced making a super fast and realistic hardcore racing simulator that’s also accessible to newcomers, elaborates on how the studio used reference images to strive for a photorealistic look, and discusses how the team faithfully recreated all the real-world tracks and motorcycles in the game.  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong>Thanks for your time! What do you think makes MotoGP 19 the most advanced iteration of the long-running franchise to date?<br /> <br /> Producer Michele Caletti: </strong>Thanks for having us! There&#39;s a number of technical features that make MotoGP 19 stand out, from the Neural AI, which is a breakthrough that’s incredibly challenging and realistic, to the cool editors, which allow players to create their own helmets, racing numbers, and patches for the first time. There&#39;s also the new online system, which uses dedicated servers. 
2019 is a great year for the franchise!<br />  <br /> <strong>How did you balance making a hardcore racing simulator that is also welcoming to newcomers?<br /> <br /> Caletti: </strong>The trick is to offer a deep simulator at the core, but be able to ease up on it for newcomers. We have a number of options that beginners can adjust, which include tire wear and joint brakes. We also have assist options that include auto braking and simplified physics. There are also numerous race options that welcome all players. In fact, we have two kinds of gamers. One crowd is super hardcore and likes to max out the AI, engage in the lengthier races, and disable assist systems. The other is made up of more casual gamers who prefer shorter races and easier AI. Both seem quite pleased with the game.<br /> <img alt="DeveloperInterview_MotoGP_19_23.jpg" height="auto" src="" width="auto" /> <br /> <strong>For MotoGP 19, the studio leveraged machine learning to create what you&#39;re referring to as Neural AI. Can you speak to this implementation and what it adds to the experience?<br /> <br /> Caletti:</strong> It&#39;s a completely different approach. Instead of making the AI run on a given trajectory, with explicit rules, we&#39;ve created neural agents. They are in control of the bike, just like players, and can "see" and perceive the environment: the track, the other riders, the tire grip, and everything else. From there, they go through massive training sessions, hosted on servers, that grow them from total beginners into fully fledged MotoGP riders. Everything they do is self-learned, shaped by rewarding them for the best lap times and behaviors. This creates a fair yet aggressive approach to AI. <br />  <br /> <strong>The game seems like a love letter to MotoGP history with the inclusion of historical challenges, which include 50 legendary riders and classic race tracks. 
How important was it for the studio to include this aspect in MotoGP 19?<br /> <br /> Caletti: </strong>It is important because MotoGP is a story of heritage. There&#39;s the new, shiny, contemporary MotoGP dominated by Marquez, but there are also decades of sport, challenge, heroes, and rivalries, and players love to dig deep into this vast aspect. So we not only give players things to play with, but also provide historical context so they can learn something in the process. People seem to really love it. <br /> <img alt="DeveloperInterview_MotoGP_19_25.jpg" height="auto" src="" width="auto" /> <br /> <strong>With 19 real-world tracks in the game, can you talk about how you faithfully recreated them?<br />  <br /> Caletti:</strong> We have 19 tracks from 2019, but also some historical ones, namely Donington, Laguna Seca, and an older layout of Catalunya. Our recreation pipeline starts with a drone scan, and a huge set of pictures of all the details, backdrops, and buildings. From the drone scan data, we recreate a "point cloud" and then a game-usable 3D asset that we complete with all the required props. It&#39;s a complex pipeline, but the precision achieved is top notch.<br />  <br /> <strong>With dozens of motorcycles in the game, how did the team set about accurately modeling them?<br /> <br /> Caletti: </strong>We have four categories for 2019: MotoGP, Moto2, Moto3, and MotoE, which is something new to both the series and the real world, and then we have the historical 500cc and MotoGP ones. For the contemporary bikes, we work with multiple sources, from detailed pictures taken at the first round in Qatar to precise part measurements. 
There&#39;s always some secrecy about these prototypes, so it&#39;s not possible to laser scan them, but the accuracy level is very high.<br />  <br /> Older bikes are much harder to make: some of them are hardly available to the public now, but museums, older material from Dorna, and lots of cross-checking allow for precise reconstructions even 20 years after they&#39;ve been off the tracks. <br /> <img alt="DeveloperInterview_MotoGP_19_21.jpg" height="auto" src="" width="auto" /><br /> <strong>For the first time in the series&#39; history, MotoGP 19 features an extensive graphics editor, which lets users customize new helmets and outfit designs. Can you speak to this feature and why it was included? <br /> <br /> Caletti: </strong>Players entering the career mode or online need to create a personal image, and our experience developing Ride 3 tells us that they love to create, exchange, and vote for other player creations. So, for MotoGP 19, we decided to enrich this area with a helmet editor. We have some 26,000 helmets created to date, which is growing at a healthy pace of 500 per day! Then there&#39;s the racing number editor, so your personal number on the bike won&#39;t just be a bland font, but complex artwork, just like what’s on the real riders. Finally, the butt patch: your nickname will stand out and be a clear message to those who follow. We have also integrated a sticker editor to make complex designs easier, and to be able to exchange logos, brands, and patterns.<br />  <br /> <strong>With impressive visuals that feature distant mirage effects on hot days juxtaposed against realistic-looking rainy nights, how did you deliver the game&#39;s great graphics?<br /> <br /> Caletti:</strong> We tried matching real pictures against the game, which is something that&#39;s not easy to do. Sometimes in games, reality is softened, contrasts are tamed, colors are saturated, but we wanted to match what you see on TV during a race weekend. 
We found that a faithful approach can achieve visuals that are stunning and realistic at the same time. The goal was to convey feelings, from the extremes of a hot summer day in Mugello to the hard weather of a stormy Sepang. <br />  <br /> We chose picture sets, analyzed colors and temperatures, spotted those details that make every track unique, and iterated until we were satisfied with the result. <br /> <img alt="DeveloperInterview_MotoGP_19_28.jpg" height="auto" src="" width="auto" /> <br /> <strong>MotoGP 19 features an incredible sense of speed and makes you feel like you&#39;re actually out there on the race tracks with a palpable sense of danger. How did you manage to make the game feel so real? <br /> <br /> Caletti: </strong>That&#39;s right, the sense of danger. Some games miss it, and try to create the sense of speed with tricks like excessive blur, but we went in the opposite direction. When you&#39;re on the track, time flows differently: on a straight, you&#39;re fast, but the track is wide and you feel still. Then the corner approaches, and time moves faster, and you need to catch the braking point to [slow] the bike. Reference points flow fast at your side, and you feel the risk. Into the corner, speeds are not [overwhelming], but you’re at the edge of control. The sense of speed is not only visual, it&#39;s multi-sensorial and comes from the sense of risk, of challenge, from the sounds and from all the little cues that build up to generate what it feels like to be on a motorcycle. <br />  <br /> <strong>Lots of people are praising MotoGP 19 for its realistic physics engine. How did you create it?<br /> <br /> Caletti: </strong>It&#39;s the result of years of experience. We start from some real measurements; again, MotoGP bikes are super-secret, but some data is available. Then we create a generic model that takes into account all the forces and all the reactions. 
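One of the terms such a generic model has to account for is the gyroscopic reaction of the spinning wheels. The sketch below is textbook rigid-body mechanics, not Milestone&#39;s proprietary code: the reaction torque is the cross product of the frame&#39;s angular velocity with the wheel&#39;s angular momentum.

```cpp
// Generic gyroscopic moment of a spinning wheel: M = omega_frame x (I * omega_spin).
// Leaning or steering the bike (rotating the frame) while a wheel spins about
// its axle produces a perpendicular reaction torque. Standard physics, shown
// here only to illustrate the kind of term a bike model must include.
struct Vec3 { float x, y, z; };

Vec3 Cross(const Vec3& a, const Vec3& b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}

// frameOmega: angular velocity of the bike frame (rad/s)
// spinOmega:  wheel spin angular velocity about the axle (rad/s)
// spinInertia: wheel moment of inertia about the axle (kg·m²)
Vec3 GyroscopicTorque(const Vec3& frameOmega, const Vec3& spinOmega, float spinInertia) {
    // Angular momentum of the wheel about its axle.
    Vec3 L { spinInertia * spinOmega.x, spinInertia * spinOmega.y, spinInertia * spinOmega.z };
    return Cross(frameOmega, L);
}
```

Because this torque grows with wheel speed, a bike that steers easily in the pit lane resists lean changes at 300 km/h, which is part of what the tuning pass described below has to capture.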
Gyroscopic forces play a key role and are very hard to model correctly, for example. Then we fine-tune a single class or bike, and we cross-check speeds along the track, braking points, and all that&#39;s available using onboard laps and timings, while keeping in mind that we want to offer a great sensation for players.<br />  <br /> Sometimes simulators go by the rule, "the harder, the more realistic," but that&#39;s not necessarily true. Racing bikes are designed to be very usable and consistent up to 95 percent of their performance - then comes the challenge. That&#39;s what we want to convey: bikes that feel real, look real, and are a blast to play with.<br /> <img alt="DeveloperInterview_MotoGP_19_17.jpg" height="auto" src="" width="auto" /><br /> <strong>How large was the game&#39;s development team?<br /> <br /> Caletti: </strong>We have a core team of programmers, artists, and designers reporting to leads. These individuals are relatively few, around 20 people overall. Then, depending on the production phase, the team expands considerably, adding both developers and outsourcers that are used for assets like bikes. It’s worth mentioning that Milestone is also a publisher, so we have in-house professionals who work on marketing, PR, promotional content, and social media. We can top 100 people at peak capacity.<br />  <br /> <strong>What made UE4 a good fit for the MotoGP franchise?<br /> <br /> Caletti: </strong>Unreal Engine has clear strengths, from the straightforward <a href="" target="_blank">multiplatform</a> support to the powerful tool sets available to artists and designers. It was a winning choice for MotoGP as it offers great visuals, ease of integration for third-party plugins, and a well-known pipeline for outsourcers, among other advantages. 
Overall, the package is powerful, stable, and flexible, and allows us to reach great quality quickly and on budget.<br /> <img alt="DeveloperInterview_MotoGP_19_19.jpg" height="auto" src="" width="auto" /><br /> <strong>Does the team have any favorite UE4 tools or features?<br /> <br /> Caletti: </strong>Generally speaking, it would be the speed at which you see things in the editor and have them realized in the game. Being able to have a clear idea of complex assets, <a href="" target="_blank">lighting</a>, and <a href="" target="_blank">animations</a>, and being one click away from seeing all of those aspects in motion together is a powerful concept that’s brilliantly executed.<br />  <br /> <strong>For the first time in the franchise&#39;s history, MotoGP features dedicated servers. Can you speak to your efforts in bolstering the game&#39;s online functionality?<br /> <br /> Caletti:</strong> It&#39;s been a mandatory step for us. Players wanted a stable, high-quality experience, and only the use of servers, provided by Amazon, allowed that to become a reality. Our framework is very complex: different platforms, complex races with flexible sessions and regulations, the need for precise physics and collisions, but also the use of player-generated content, and then we have the eSport season. There&#39;s a lot of ground to cover. We had an R&amp;D team allocated for the whole development cycle, and some MotoGP team developers working together to build the infrastructure and the game essentially at the same time. It&#39;s been quite a challenge, but it paid off in the end. <br />  <br /> <strong>Thanks again for your time. 
Where can people learn more about the game?<br /> <br /> Caletti:</strong> You can find out more at <a href="" target="_blank"></a>.<br /> Milestone srl | MotoGP 19 | Games | Jimmy Thang | Mon, 09 Sep 2019 11:00:00 GMT<br /> <br /> <strong>Evil Eye Pictures uses latest ray tracing features to create branded visuals for UE4</strong><br /> Commissioned by Epic Games to create a package of branded visuals for SIGGRAPH 2019, Evil Eye Pictures took the opportunity to show off the latest ray tracing features in Unreal Engine with the short film “emergence,” and a brand-new UE4 logo animation. Here, they reveal how they did it.<br /> <br /> For SIGGRAPH 2019, Epic Games commissioned <a href="" target="_blank">Evil Eye Pictures</a> to create a package of branded visuals, which included an ambient animated piece to play while attendees took their seats at the Unreal Engine User Group. The result was the beautiful short film <em>emergence</em>, together with a brand-new Unreal Engine logo animation, both of which are entirely rendered in Unreal Engine. Here we take a look at the company behind the project, and find out how they used the latest UE4 features to pull it off. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> Founded in 2004 by contributors to the award-winning visuals on <em>The Matrix</em> film trilogy, <a href="" target="_blank">Evil Eye Pictures</a> continued to produce imagery for the VFX industry for the next decade. In 2013, right around the time that mobile 360/AR and 6DoF VR were taking off, the company began working in conjunction with <a href="" target="_blank">Google Spotlight Stories</a> to produce short, immersive films that showcased the tech giant’s technology in that space. 
The first of these, <a href="" target="_blank">Pearl</a>, a VR film directed by Patrick Osborne, was nominated for an Oscar for Best Animated Short Film and won an Emmy for Outstanding Innovation in Interactive Storytelling, and its 360° version has garnered over three million views on YouTube. The film’s popularity propelled the company into real-time work.<br /> <br /> Initially, the team was working with Google’s real-time engine and pipeline, the <a href="" target="_blank">Story Development Kit</a>. By the end of 2018, they started experimenting with Unreal Engine, hoping to prove that they could recreate their photoreal visual effects in real time. “Our Associate VFX Supervisor Steve DeLuca used a digital asset we were working on at the time as a test case to build a photoreal environment in UE4,” says Dan Rosen, the company’s Co-Founder and Creative Director. “We took that build up to show the director on a VR headset, and it blew him away.” <h2>The concept behind “emergence”</h2> When Epic approached Evil Eye Pictures about the SIGGRAPH brand visuals project, the team was excited by the challenge. “We saw this as a wide-open opportunity to help brand Unreal using the software to create a visual beyond what you might typically see,” says Rosen. For the ambient introduction, they planned to create an infinitely looping short, rendered entirely in engine, and incorporating the latest UE4 features including <a href="" target="_blank">real-time ray tracing</a>.<br /> <br /> Finding the right design lead was crucial. “Each and every project for us is a design challenge for both art and science,” says Rosen. “For the art on this one, I knew that I wanted to work with Conor Grebel. 
He’s an amazing designer and motion artist who not only works in Cinema 4D but also uses Unreal Engine.” Grebel was quick to accept the job of Lead Designer on the project.<br /> <br /> “Based on our initial conversations with Epic Games, there was a lot of talk about integrating ‘motion graphics’ design and appeal into this project,” Grebel says. “It was really serendipitous that Evil Eye reached out to me, as I was in the process of switching careers from motion graphics to game art and design. I was excited about the idea of using my newfound love for Unreal Engine to create art that appealed to other artists of a similar design background.”<br /> <br /> Grebel came up with a concept that explores the journey from creation to collapse. The ambient short <em>emergence </em>is just one element of that broader concept, which has the potential to be further developed in the future.<br /> <img alt="Spotlight_Emergence_blog_body_forms_img.jpg" height="auto" src="" width="auto" /><br /> “I wanted to use parametric design methods to create a seemingly infinitely complex form,” he says. “Something that combined man-made structures and organic forms, inspired by the complexity and repetition of nature. There is something so undeniably inorganic about brutalist architecture. It is the summation of centuries of engineering and artistic progress, culminating in an oppressive and minimal masterpiece. Its angles, materials, and shapes are completely inorganic. Taking that aesthetic and reinterpreting it in the context of organic and fractal growth was, we thought, a very interesting design contrast. <br /> <br /> “I was constantly asking myself questions during the initial design phase of this project: ‘Is this inspiring design? Will UE4 artists be impressed that this was done in engine? Will motion graphics artists be impressed with UE4?’,” he explains. “I was constantly checking my progress with these prompts. 
I wanted the design alone to be fresh and mind-bending, but I also wanted it to be impressive within the context of the game engine. Ultimately the team was motivated by the idea of triggering a slew of ‘How did they make this?’ reactions.”<br /> <br /> Goal achieved. As well as those congratulating the team on the beauty of the piece, many of the comments on the YouTube posting of the video ask simply “How?” The team is happy to divulge the details. <h2>Creating light, shadow, and fractal patterns</h2> Acting as Technical Supervisor on the project, DeLuca explains that the three megastructures seen in <em>emergence</em> consist of eight octants, each with identical animation inversely scaled in the appropriate axial planes to create an object that has perfectly mirrored motion in three dimensions. <br /> <img alt="Spotlight_Emergence_blog_body_octant.jpg" height="auto" src="" width="auto" /><br /> Each octant is made up of a series of branches, with each branch containing smaller ornaments, which are themselves composed of individual kit pieces. <br /> <br /> “The title <em>emergence</em> speaks to this, as any single piece on its own is dull and fairly simple, but when combined with all the other parts becomes something extraordinary and complex,” he says. “Creating a single octant by hand allowed us to art-direct the animation of a branch, an ornament, or even a single kit piece.” <br /> <br /> The very first step in the design process was creating a <a href="" target="_blank">kitbash</a> set of brutalist architecture parts. Starting in Maya, Grebel created about 25 modular pieces that could snap together in various ways on a predetermined grid size. <br /> <img alt="Spotlight_Emergence_blog_body_kitbash.jpg" height="auto" src="" width="auto" /><br /> Grebel then quickly sketched layouts using the mograph tools in Cinema 4D. 
<br /> <img alt="Spotlight_Emergence_blog_body_Cinema4D_layout.jpg" height="auto" src="" width="auto" /><br /> After finding a few layouts the team liked, he imported the individual low-poly pieces into ZBrush to use as a base for sculpting the detailed normals. He then created some custom aged-cement materials in Substance Painter using the high-poly sculpt from ZBrush. The base cell blocks and cluster formations were laid out in an overall structure that was now complete in its stationary form.<br /> <img alt="Spotlight_Emergence_blog_body_Cinema4D_structure.jpg" height="auto" src="" width="auto" /><br /> Next, it was the turn of Euisung Lee, Cinematographer and Animation Lead on the project. Working in Maya, Lee started adding motion both at the cluster level and at the level of the overall structure.<br /> <br /> “Imagine a Christmas tree with moving branches and animated ornaments, which is a simple metaphor to understand the process and also the methodology of importing the animated rig into UE,” explains Lee. “I exported one FBX file that contained only the branch motion joints and another with the cluster motion loop. Once we had the rig set up, it was painless to iterate on animation.” <br /> <img alt="Spotlight_Emergence_blog_body_animated_rig_img.jpg" height="auto" src="" width="auto" /><br /> DeLuca takes up the story. “Once we had a single octant animated in Maya, we brought it into UE4 and it was added to a Blueprint that would assign a geometry shader, mirroring that animation in the remaining seven octants,” he says.  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> It was only at this point that the team could truly see the final animation of each megastructure. Each of the three animations was about five minutes in length from initial formation, through transformation, to collapse. 
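The octant mirroring described here can be sketched outside the engine. The following is a hypothetical Python illustration of the idea only (the studio's actual implementation was a Blueprint-assigned geometry shader, not this code): each animated point in the hand-authored octant is duplicated into the remaining seven octants by flipping the sign of each axis, which yields motion perfectly mirrored in all three dimensions.

```python
# Hypothetical sketch of the octant-mirroring idea, not the studio's
# actual Blueprint/shader: one hand-animated octant is duplicated into
# the remaining seven by flipping the sign of each axis.
from itertools import product

def mirror_into_octants(point):
    """Return eight copies of a 3D point, one per octant (all sign combos)."""
    x, y, z = point
    return [(sx * x, sy * y, sz * z) for sx, sy, sz in product((1, -1), repeat=3)]

# A toy two-frame "animation" for a single joint in the source octant;
# regenerating the mirrors each frame keeps all eight octants in sync.
frames = [(1.0, 2.0, 3.0), (1.5, 2.0, 2.5)]
mirrored_frames = [mirror_into_octants(p) for p in frames]
```

Because only the source octant is ever animated by hand, a change to a single branch, ornament, or kit piece propagates automatically to all eight copies.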
<br /> <br /> “We did a rough pass of lighting on all the structures and then began the process of creating camera moves that either traveled around and through the structures for the full five minutes or discovered unique compositions hidden in the complexity,” says DeLuca. “At the end of this process we had over three hours of ‘footage’, and our editor Matt McDonald combed through the dailies and assembled our final five-minute edit. After we’d reviewed it several times, we revisited each shot’s lighting; not only to refine it, but also to create an evolution over the piece from a neutral palette to a hyper-realized sunset full of color, then back again to a simple noir scheme.”  <h2>Overcoming challenges</h2> Given the complexity of the geometry and the newness of the preview-release Unreal Engine code the team was using, it’s not surprising that they faced—and overcame—some challenges along the way.<br /> <br /> “We were constantly exceeding the boundaries of the possible during the creation process, and Epic was enthusiastically encouraging us to break things so they could fix them,” says Producer Yovel Schwartz.<br /> <br /> “The elegance of the engine can be frustratingly complex, allowing for multiple approaches that get you tantalizingly close to your goal before revealing a better way to get there,” he explains. “We were working with the absolute latest features and pushing them to their limits the entire time, which naturally created a constant stream of technical hurdles to overcome. But those challenges are at the heart of what makes the engine so powerful and its final results so impressive. 
The extensive abilities of the engine just make you want to push it farther.” <br /> <img alt="Spotlight_Emergence_blog_body_illumination_img2.jpg" height="auto" src="" width="auto" /><br /> Another challenge the team had to overcome was handling the texture sizes required for the extreme close-up shots.<br /> <br /> “Originally I had divided up the pieces into groups of four or five that shared a single UV space; however, this proved problematic once we viewed everything in UE4,” explains Grebel. “The pieces get instanced thousands of times in the scene, and although the majority of their screen space is very small, there are constant shifts in scale in the film that put the camera very close to some of the geometry. Eventually I was forced to give each piece of geometry its own UV space and 4K texture. While it was overkill for most shots, it dramatically helped the fidelity of the close-ups.” <br /> <br /> Lee explains how the team had to learn to design their camera work in Unreal Engine’s <a href="" target="_blank">Sequencer</a>, rather than doing it in Maya as they were accustomed to, since the final masked and mirrored rig could only be seen in the engine. “We got used to it quickly, and in most cases it turned out to be a better workflow for the shot exploration and composition, thanks to the fact that you see the final lighting and effects,” he says.<br /> <br /> “The migration of our animated assets from Maya to UE4 proved to be the trickiest bit due to our desire to use ray-traced global illumination (GI) and shadows in the engine,” says DeLuca. 
“Since nothing uses pre-baked lighting, everything required fairly high samples to produce quality results at our 4K deliverable resolution.”<br /> <br /> To reduce the scene complexity enough to enable them to use a higher number of GI samples, the team had to import the animated ornaments as static meshes rather than skeletal meshes, and manually attach them to the joint sockets of the base skeletal mesh for the megastructure.<br /> <br /> In the end, though, the effort was worth it. “I think the most essential feature that tied this film together was the ray-traced global illumination,” says Grebel. “The moment we enabled ray tracing, the scene just came alive. It was that missing piece that turned our designs from previs to art piece. The light bounces and accurate shadows were such a dramatic improvement that the focus of the project shifted towards highlighting the ray tracing feature.”<br /> <img alt="Spotlight_Emergence_blog_body_illumination2.jpg" height="auto" src="" width="auto" /> <h2>Designing a new Unreal Engine logo animation</h2> Another element of the package the team was tasked with delivering was a new Unreal Engine logo animation, to be used for video intros. On this project, Rosen provided the creative direction, while Grebel once again led the design process, DeLuca supervised the final look, and Lee handled camera and cinematography. <br /> <br /> The team considered this another unique opportunity to help brand Unreal Engine, and to show off the ray tracing feature—the piece is rendered in real time in engine. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> Grebel created a number of concepts from which the team selected one that reflected long light bars, or “light lines,” as Grebel termed them. 
Working in Substance and Cinema 4D, he created shaders that juxtaposed fully reflective and fully rough surfaces, and animated between the two to create the visual treatment, which he called “Defrost”. He was then able to migrate the assets to Unreal Engine.<br /> <br /> “We worked together to strike a balance of design that felt original and authentically Unreal,” says Rosen. “Since it was running in real time in UE4, we could make tweaks to all of it and see results instantaneously. <br /> <br /> “What’s so dynamic about Unreal is that we could break off a shot or two from the main logo scene to create separate deliverables,” he continues. “I was able to change camera, lighting, and color on a couple of shots to provide the Unreal logo as a speaker backdrop for the SIGGRAPH show. Then Steve took another angle of it to provide a print for the Epic booth. The turnaround time was incredibly fast.”<br /> <img alt="Spotlight_Emergence_blog_body_logoanimation_img.jpg" height="auto" src="" width="auto" /> <h2>Looking ahead for real-time technology</h2> Asked what excites them about the future of real-time technology in their industry, the team has plenty to say.<br /> <br /> Evil Eye Pictures’ Co-Founder and Editor Matt McDonald shares his thoughts about the technology’s potential role in a new world where traditionally separate media and entertainment verticals are starting to converge, and where interactive experiences are customizable for the individual.<br /> <br /> “These types of hybrid spaces could be narrative—creating experiences that overlap between games and film—or musical, between musician, composer, and even audience,” he says. “Customization can happen directly through the interaction with objects or simply through using one&#39;s movement in a space. 
This overlap is now possible across countless combinations of visual, audio, and even tactile media.<br /> <br /> “Relatedly, I&#39;m excited that real-time technology is blurring the lines of collaboration both as a producer and consumer, and even what those previously distinct roles mean. While I have traditionally created more fixed and finite experiences for others, real-time technology affords the capability to empower an audience or player to be the creator of their own experiences and even save their own versions to share with others or to re-experience later.” <br /> <br /> Rosen also looks forward to continuing to use real-time technology across the convergence of movies, games, VR, and AR. “Whether VR lives or dies, it was exciting to stand inside our creations and interact with them,” he says. “The software has caught up, but now we want the hardware that can match it in the same seamless way. Using Unreal Engine for any number of screens, projections, and visuals is an exciting way to share our images with all types of audiences.”<br /> <br /> Schwartz offers yet another angle on the topic. “I think that we’re on the cusp of a really exciting convergence between traditional narratives, games, and other interactive experiences,” he says. “When you can design, light, render, composite, and edit in a single technological space, the creative freedom is almost overwhelming. From virtual production to motion graphics, I can’t imagine a corner of the image-making industry that won’t be affected.” <br /> <br /> Grebel lends his perspective from his background in motion graphics and design. “The ability to achieve ray-traced lighting, shadows, and reflections at 60+ fps is something I am majorly excited about,” he enthuses. “Once Octane and Redshift took hold of the 3D design community, making ray-traced rendering an industry standard, it changed the game. High-quality design became more easily accessible, and a wave of new ideas flooded the internet. 
Now, a similar power is awakening inside Unreal Engine. Combining the interactive capabilities of UE4 with the visual beauty of ray tracing will be another game changer for sure.” <br /> <img alt="Spotlight_Emergence_blog_body_logo_anim_img.jpg" height="auto" src="" width="auto" /><br /> For DeLuca, it’s a question of real-time rendering freeing artists across all graphics industries to be more creative and less burdened by technical hurdles. “Having been witness to overnight renders for a few frames of a hard-surface object at near photoreal quality, to now be able to relight an entirely photoreal ray-traced scene while it plays back at 30 fps with full lensing effects is staggering!” he exclaims.<br /> <br /> Cinematographer Lee agrees. “Real-time graphics overall has made incredible strides lately and UE is clearly at the forefront—I’m very excited that such technology is available to everyone. I have no doubt that it is encouraging more collaboration and sharing across the industry and propelling progress at a faster rate than it would be in an alternate reality where everything is proprietary and costly,” he says. “On a personal note, I love that the real-time workflow gives you a whole new perspective on your own creative process and possibilities. It feels intimidating at times, but it’s a good jolt of stimuli I’ve been missing after a long time of doing the same things over and over.” <br /> <br /> Schwartz has the final word. “As a producer, for me everything is in service of artistic collaboration,” he says. “How can we tell the best story? Create an immersive experience? Whatever the art form, there will always be toolsets that help you get there. When you add real-time technology as a resource, your ability to iterate creatively throughout the entire production pipeline is staggering—it’s a revelation. 
<br /> <br /> “If you’re a writer, and halfway through your novel you realize that it would be better if there were some change back in chapter 2, your decision to rework the idea is basically free of consequences. That’s never really been true for necessarily collaborative art forms like film, games, or now VR / AR experiences. There was a time when asking to see what a CG scene looks like at night, for example, instead of the daytime look you’d been developing for weeks or months, could create an existential crisis. Now we can explore those creative changes so comparatively fast that we’re all going to need a lot less therapy!”<br /> <br /> <br /> Want to find out what real-time technology can bring to your industry? <a href="" target="_blank">Download Unreal Engine</a> today.<br /> <br />  emergence | Enterprise | Evil Eye Pictures | Film And Television | SIGGRAPH 2019 | Blueprints | John Jack | Fri, 06 Sep 2019 19:30:00 GMT

FOX Sports sets new record with FIFA Women’s World Cup broadcast: Reaching over a billion total viewers over the course of 30 days, learn how broadcaster FOX Sports, along with Drive Studio and Dekogon, utilized Unreal Engine for on-air graphics and photoreal environments during the 2019 FIFA Women’s World Cup.

The 2019 FIFA Women’s World Cup captivated fans around the world over the course of its 30-day broadcast, reaching over one billion total viewers for the first time in the tournament’s history. 
Broadcaster FOX Sports and its creative vendors <a href="" target="_blank">Drive Studio</a> and <a href="" target="_blank">Dekogon</a> all utilized Unreal Engine for various components of the broadcast, from set design through to on-air graphics, ensuring photoreal environments and the ability to dynamically change graphics content on the fly.<br />   <div style="text-align: center;"><iframe allow="autoplay; fullscreen" allowfullscreen="" frameborder="0" height="360" src="" width="640"></iframe></div> <p><br /> Following their successful contribution to the <a href="" target="_blank">2018 FIFA Men’s World Cup broadcast</a>, Drive Studio was once again tapped by FOX Sports, this time to design the Parisian set. Knowing that the local set would be based in Cafe de l’Homme, in the Trocad&eacute;ro area facing the iconic Eiffel Tower, the Drive Studio team created a VR model in Unreal based on true-to-scale physical dimensions of the space. Exploring the venue in advance in VR allowed designers to get a sense of not only the available space for the on-air team to work with, but also how it would integrate with the surrounding environment and which camera angles and movements would work most effectively in the live broadcast.</p> <div style="text-align: center;"><img alt="Spotlight_FIFA-Womens-World_Cup_Timelapse02.gif" height="auto" src="" width="auto" /></div> “The main goal was for everyone to stage themselves within the set ahead of time so that key decisions could be made regarding physical set design. This allowed us to determine optimal camera placement that would best capture the surrounding Eiffel Tower environment and areas for fans to gather,” explained Marco Bacich, Founder and Technical Director at Drive Studio. 
“With Unreal, everyone could interactively explore the set in real time and visualize the sight lines and the best placement and spacing for physical set elements.”<br /> <br /> With the set design in place, Dekogon then worked with FOX Sports to fully build out a photoreal environment asset for the areas surrounding the set, namely the Trocad&eacute;ro and the Eiffel Tower. Using Unreal to test out different lighting and materials, as well as different virtual camera movements within the environment, Dekogon ultimately delivered 10 different shots, each at 96 frames per second. This enabled the FOX team to utilize the environment assets live on air, incorporating a suite of graphic overlays as needed.<br /> <br /> “We are longtime Unreal users, but this was actually our first time delivering broadcast content, and new features within Unreal such as dynamic global illumination and high frame rate support allowed us to easily make the transition,” shared Clinton Crumpler, Founder and Creative Director, Dekogon. 
“The key needs for us and ultimately FOX were the ability to make changes in real time on air depending on the local weather conditions and to insert dynamic graphics, and with <a href="" target="_blank">Blueprints</a> we could easily set up the environments to change the time of day and corresponding lighting on the fly.”<br /> <br /> Michael Dolan, SVP of Design for FOX Sports Creative Services, added: “By creating the Trocad&eacute;ro environment as a real-time graphics scene, Dekogon allowed us to get flyover shots and other perspectives that we never would have been able to get in the physical world with the restrictions around the Eiffel Tower.” <div style="text-align: center;"><img alt="Spotlight_FIFA-Womens-World_Cup_Timelapse01.gif" height="auto" src="" width="auto" /></div> <div>Lastly, FOX Sports’ in-house design team went to work creating a range of customizable branded on-air graphics that could be dynamically updated and displayed within the photoreal Trocad&eacute;ro environment. The FOX team, which has heavily leveraged Unreal Engine to power its state-of-the-art <a href="" target="_blank">NASCAR virtual studio</a> in North Carolina, continues to iterate and find new ways to utilize the engine’s real-time capabilities for broadcast with each new sporting event. For the FIFA Women’s World Cup, FOX created a variety of graphics elements specific to the tournament, including brackets, player bios, matchups, lineups, standings, and more. The team customized its pipeline to pull in the latest stats and information from Vizrt; native Unreal templates were then populated and displayed within one of the desired scene cameras around the Trocad&eacute;ro environment.</div> <div><br /> Dolan concluded: “The future is definitely real-time – once you’re able to utilize this type of technology in a production environment, the possibilities are incredible. 
The fidelity we can get right out of Unreal is spectacular; it truly made Paris look photoreal – we wouldn’t have been able to do this a couple of years ago. Based on the great experiences we’ve had using Unreal Engine on our NASCAR virtual set and with major broadcasts like the 2018 and 2019 FIFA World Cups, we’re continuing to make Unreal a bigger part of our workflow and look forward to what the future holds.”</div> FIFA | FIFA Women's World Cup | FOX Sports | Blueprints | Andy Blondin | Fri, 06 Sep 2019 17:30:00 GMT

Balinese Temple: Telling a powerful story through UE4 environments. Follow along as SCAD student Brian Lesiangi walks us through his development process in creating Balinese Temple, an environment created in Unreal Engine.

Hi everyone! My name is <a href="" target="_blank">Brian Lesiangi</a> and I am from Bali, Indonesia. I recently completed my thesis project, <em>Balinese Temple</em>, as part of my master&#39;s degree submission at <a href="" target="_blank">Savannah College of Art and Design</a> in Savannah, GA. Although this project was mainly a solo effort, I had the help of my fellow students Meghana Reddy, Kylie Gay, and Sunny Chan during the pre-production process. <br />   <div style="text-align: center;"><iframe allow="autoplay; fullscreen" allowfullscreen="" frameborder="0" height="273" src="" width="640"></iframe></div> <br /> Before we start breaking down the project, I’d like to thank Epic Games for the opportunity to share my work. I hope I can help you learn a trick or two through this blog post! <h2>How the project started</h2> <em>Balinese Temple</em> focuses on creating a real-time cinematic experience through environment storytelling. This project was highly inspired by the <a href="" target="_blank">Gnomon talk</a> (also available <a href="" target="_blank">here</a>) from 2017. 
The idea is that environment art can be used as a powerful tool to tell stories without the presence of a character. To make it more believable, stories are often grounded in real-life cultural sources, such as myths and legends.<br /> <br /> Franchises like <em>Lord of the Rings</em>, <em>Harry Potter</em>, <em>Legend of Zelda</em>, <em>God of War</em>, and <em>Assassin’s Creed </em>are a few examples that show a direct influence of mythology on popular culture. Although some of these mythologies (namely Japanese, Greek, Persian, and Norse) are more recognized than others, it does not mean that they are the only references an environment artist can draw on. <br /> <img alt="Tech-Blog_Balinese-Temple_Blog-body-2.jpg" height="auto" src="" width="auto" /><br /> Recent movies like <em>Moana </em>(Polynesian) and <em>Coco </em>(Mexican) show a shift away from mainstream mythology sources. As someone who comes from a third-world country, this trend encouraged me to bring an under-exposed culture, Balinese, into my environment artwork.  <h2>Art Direction</h2> Before starting my project, I knew that I wanted to create an environment that told a story about my hometown. Thus, I began by spending a bit of time gathering references for a style to support the narrative setting. Games like <em>Uncharted 4</em>, <em>Horizon Zero Dawn</em>, and <em>Shadow of the Tomb Raider </em>are a few examples that resonate strongly with my idea.<br /> <img alt="Tech-Blog_Balinese-Temple_Blog-body-3.jpg" height="auto" src="" width="auto" /><br /> Inspired by the quality of that work, I then started brainstorming and fleshing out ideas. At this point, I knew that my concept would evolve as I continued to work on it. I therefore wanted to leave room for future ideas that could improve my project. 
<br /> <img alt="Tech-Blog_Balinese-Temple_Blog-body-4.jpg" height="auto" src="" width="auto" /><br /> <img alt="Tech-Blog_Balinese-Temple_Blog-body-5.jpg" height="auto" src="" width="auto" /><br /> Once I had the basic idea nailed down, I created a series of mood boards that focused on color, light, and composition.<br /> <img alt="Tech-Blog_Balinese-Temple_Blog-body-6.jpg" height="auto" src="" width="auto" /> <h2>Blockout</h2> The first blockout was quite simple, but it helped me a lot to get the overall feel of the environment. I usually start with rough blockouts that I make in Maya to define the basic shapes before I lay them out in Unreal. The idea is to create modular proxy assets that I can substitute with more detailed versions later on.<br /> <img alt="Tech-Blog_Balinese-Temple_Blog-body-7.jpg" height="auto" src="" width="auto" /><br /> One thing I noticed was that the proxy geo I used to build the environment did not completely represent my sculpted version. Thus, I decided to export the blockout assets from Unreal as a base proxy and sculpt the whole environment in ZBrush.<br /> <img alt="Tech-Blog_Balinese-Temple_Blog-body-8.jpg" height="auto" src="" width="auto" /><br /> <img alt="Tech-Blog_Balinese-Temple_Blog-body-9.jpg" height="auto" src="" width="auto" /><br /> Although the decimated blockout represented the concept quite well, the amount of time needed to create a production-ready asset was too high. In the end, I decided to use the blockout as a guide and proceeded to use photogrammetry for asset creation. <h2>Asset Production</h2> Due to the amount of detail needed to create Balinese-inspired assets, photogrammetry has become a huge part of my workflow. I first learned about this technique during my internship at <a href="" target="_blank">Turn 10 Studios</a>. I used <a href="" target="_blank">RealityCapture</a> for this project. 
<br /> <img alt="Tech-Blog_Balinese-Temple_Blog-body-10.jpg" height="auto" src="" width="auto" /><br /> <img alt="Tech-Blog_Balinese-Temple_Blog-body-11.jpg" height="auto" src="" width="auto" /><br /> Once the scanned asset was generated, all I had to do was clean up the mesh and create a low-poly version with the UV unwrapped to bake textures. Low-poly assets were created using either Maya for foreground objects or <a href="" target="_blank">Instant Meshes</a> for background objects.<br /> <img alt="Tech-Blog_Balinese-Temple_Blog-body-12.jpg" height="auto" src="" width="auto" /><br /> <img alt="Tech-Blog_Balinese-Temple_Blog-body-13.jpg" height="auto" src="" width="auto" /><br /> In addition to the assets I scanned on my own, I heavily used <a href="" target="_blank">Quixel Megascans</a> to quickly populate the environment. The result was a new environment consisting of scanned assets laid out over the previously decimated blockout.<br /> <br /> The next step in the project was to apply materials to the environment. The material setup I used was quite simple: a layered material function with an option to add procedural surface detail like moss or grunge. There is also an option to input a texture mask file.<br /> <img alt="Tech-Blog_Balinese-Temple_Blog-body-14.jpg" height="auto" src="" width="auto" /><br /> <img alt="Tech-Blog_Balinese-Temple_Blog-body-15.jpg" height="auto" src="" width="auto" /><img alt="Tech-Blog_Balinese-Temple_Blog-body-16.jpg" height="auto" src="" width="auto" /> <h2>Lighting</h2> Coming from a more traditional CG background, I tend to follow the general rule of three-point lighting. As the name implies, this principle is based on the placement of three different sources of light: the key light, fill light, and rim light. I usually use a <strong>Directional Light</strong> as the <em>Key</em>, a <strong>Sky Light </strong>as the <em>Fill</em>, and any light type as the <em>Rim</em> for the initial setup. 
<div style="text-align: center;"><img alt="Tech-Blog_Balinese-Temple_Blog-body-17.gif" height="auto" src="" width="auto" /></div> <em><strong>TIP: </strong>Start with a grey material setup. In Unreal, you can do this easily by enabling the Lighting Only or the Detailed Lighting mode. </em><br /> <br /> <em>Image Based Lighting</em> (<strong>IBL</strong>) is another common technique I use. From this point, I switched the <strong>Sky Light </strong><em>Source Type </em>to <em>SLS Specified Cubemap </em>and plugged in an HDRI that I got from <a href="" target="_blank">HDRI Haven</a>.<br /> <img alt="Tech-Blog_Balinese-Temple_Blog-body-18.jpg" height="auto" src="" width="auto" /><br /> <em><strong>TIP: </strong>Set the exposure level in the Post Process Volume (<strong>Unbound</strong>) to 1.0 for both the min and max settings. This way, the engine won’t try to adapt the exposure automatically, which can be quite annoying when setting up a primary lighting setup. You can turn auto exposure off in the Project Settings as well.</em><br /> <br /> Additionally, I used a combination of <em>Exponential Height Fog </em>and <em>Volumetric Fog</em>, which was introduced back in Unreal Engine version 4.16. However, I ended up creating a few light shafts using a custom translucent material applied to a simple cylinder. This way, I have better control over defining the look of the environment, especially in terms of the general shape, intensity, and placement of each ray.<br /> <img alt="Tech-Blog_Balinese-Temple_Blog-body-19.jpg" height="auto" src="" width="auto" /><br /> <img alt="Tech-Blog_Balinese-Temple_Blog-body-20.jpg" height="auto" src="" width="auto" /><img alt="Tech-Blog_Balinese-Temple_Blog-body-21.jpg" height="auto" src="" width="auto" /><br /> <em><strong>TIP: </strong>Create multiple light shaft materials, each with a different Noise and Depth Opacity value. You can also add a Panner node to add motion to the material. 
This way you get a sense of depth with each light shaft layer you place in the scene.</em><br /> <br /> Here are a few examples of my lighting process. I usually start by defining the primary light, typically a directional light, with a sky light for the fill. Unlike in a more traditional render engine, I have to place my own bounce lights in Unreal.  <div style="text-align: center;"><img alt="Tech-Blog_Balinese-Temple_Blog-body-22.gif" height="auto" src="" width="auto" /></div> <div style="text-align: center;"><img alt="Tech-Blog_Balinese-Temple_Blog-body-23.gif" height="auto" src="" width="auto" /></div> <em><strong>TIP: </strong>Use color temperature when creating a lighting scenario. It gives you a more natural and accurate result. </em> <h2>Environment Animation and FX</h2> One thing often overlooked by artists when creating an environment is subtle background animation. Although it does not need to be extravagant, the dynamic vibrancy adds a lot to the story and overall feeling of the environment.<br /> <img alt="Tech-Blog_Balinese-Temple_Blog-body-24.jpg" height="auto" src="" width="auto" /> <div style="text-align: center;"><img alt="Tech-Blog_Balinese-Temple_Blog-body-25.gif" height="auto" src="" width="auto" /></div> <em><strong>TIP: </strong>Epic provides a good example project on Particles that shows off a variety of particle systems such as fire, water, and snow. </em><br /> <br /> Although the animation itself is a small part of the environment, it has a fundamental influence on the environment design. The added weathering effect can help reinforce the idea of a quiet, overgrown living environment. Depending on where they are placed, animated FX can also create interesting lighting situations without drawing too much attention. 
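The color temperature tip above can be made concrete. The sketch below is a rough Kelvin-to-RGB approximation in Python (a curve fit in the style of Tanner Helland's; the constants are approximations and this is not UE4's internal conversion), useful for sanity-checking what tint a given temperature should produce:

```python
import math

def kelvin_to_rgb(kelvin):
    """Approximate an sRGB tint for a black-body color temperature in Kelvin.

    Rough curve fit (constants are approximations), good enough for
    eyeballing light colors -- not a colorimetric conversion.
    """
    t = max(1000.0, min(40000.0, kelvin)) / 100.0

    if t <= 66:
        red = 255.0
        green = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        red = 329.698727446 * (t - 60) ** -0.1332047592
        green = 288.1221695283 * (t - 60) ** -0.0755148492

    if t >= 66:
        blue = 255.0
    elif t <= 19:
        blue = 0.0
    else:
        blue = 138.5177312231 * math.log(t - 10) - 305.0447927307

    clamp = lambda v: int(max(0.0, min(255.0, v)))
    return clamp(red), clamp(green), clamp(blue)

# Warm candlelight (~1,850 K) skews orange; an overcast sky (~7,500 K) skews blue.
warm, neutral, cool = kelvin_to_rgb(1850), kelvin_to_rgb(6600), kelvin_to_rgb(7500)
```

In practice you would type the temperature directly into a light's Temperature field in UE4; the point here is only to see why a low Kelvin value reads as a warm key and a high one as a cool fill.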
<div style="text-align: center;"><img alt="Tech-Blog_Balinese-Temple_Blog-body-26.gif" height="auto" src="" width="auto" /></div> <div style="text-align: center;"><img alt="Tech-Blog_Balinese-Temple_Blog-body-27.gif" height="auto" src="" width="auto" /></div> <h2>Post Processing</h2> The final step in the pipeline is <em>Compositing</em> or <em>Post Processing</em>. Unreal Engine provides a <em>Post Process Volume</em>, which allows artists to tweak the overall look of the scene, for example by adding bloom, chromatic aberration, and color grading. The most immediate benefit is the ability to give the render a more cinematic look.  <div style="text-align: center;"><img alt="Tech-Blog_Balinese-Temple_Blog-body-28.gif" height="auto" src="" width="auto" /></div> <div style="text-align: center;"><img alt="Tech-Blog_Balinese-Temple_Blog-body-29.gif" height="auto" src="" width="auto" /></div> <div style="text-align: center;"><img alt="Tech-Blog_Balinese-Temple_Blog-body-30.gif" height="auto" src="" width="auto" /></div> <em><strong>TIP: </strong>I created a <a href="" target="_blank">lookup table (LUT)</a> in Nuke for my color grading, but you can also use Photoshop or any other photo editing software. </em> <h2>Challenges</h2> One of the biggest challenges I faced during production was creating color harmony for the environment. Seeing this <a href="" target="_blank">GDC talk</a> by Dan Cox, in which he discusses the <a href="" target="_blank">60-30-10 Rule</a>, helped me tremendously on the project. <br /> <img alt="Tech-Blog_Balinese-Temple_Blog-body-31.jpg" height="auto" src="" width="auto" /><br /> <em><strong>TIP: </strong>The psychology of color is a good way to convey a specific <a href="" target="_blank">narrative theme</a>. 
I mainly use green as the focal point of the environment color, as it reinforces the idea that Balinese art must be infused with life (nature).</em><br /> <br /> Another thing that was hard to get right was the value for <em>Depth of Field (DoF)</em>. I had a tendency to overuse this function in my earlier versions, creating a really shallow DoF in my renders. I learned that having other people look at my project with a fresh perspective is necessary, especially for a project as ambitious as mine. <br /> <br /> <em><strong>TIP: </strong>Aside from getting direct feedback from mentors I met in real life, there are a lot of online communities where you can get critiques. I found myself posting on <a href="" target="_blank">r/vfx</a>, <a href="" target="_blank">r/unrealengine</a>, <a href="" target="_blank">Unreal Engine Dev Community</a>, <a href="" target="_blank">10K</a>, <a href="" target="_blank">Level Up</a>, and <a href="" target="_blank">80 Level</a> a lot. </em> <h2>Afterword</h2> Thank you very much for reading. I hope you enjoyed this breakdown! If you have any questions or feedback, you can always find me on <a href="" target="_blank">Artstation</a> and <a href="" target="_blank">LinkedIn</a>.<br />   <h3><strong>Credits</strong></h3> Concept Artist - <a href="" target="_blank">Meghana Reddy</a><br /> Storyboard Artist - <a href="" target="_blank">Kylie Gay</a><br /> Previs Artist - <a href="" target="_blank">Sunny Wai Yan Chan</a><br /> Sound Designer - <a href="" target="_blank">H. Albert Holguin</a><br /> Thesis Supervisor - <a href="" target="_blank">Brett Rutland</a> | <a href="" target="_blank">Bridget Gaynor</a> | <a href="" target="_blank">Charles Shami</a><br />  EducationBalinese TempleBrian LesiangiFri, 06 Sep 2019 13:30:00 GMTFri, 06 Sep 2019 13:30:00 GMT Real-time ray tracing in Unreal Engine - Part 1: the evolution Since its debut at GDC in March 2018, real-time ray tracing in Unreal Engine has come a long way, as has the hardware required to run it. 
Here, we trace its path from prototype code to the latest UE 4.23 release, and see what’s coming next. First <a href="" target="_blank">unveiled</a> at GDC in March 2018, <a href="" target="_blank">real-time ray tracing</a> in Unreal Engine has come a long way in the past 18 months, and it continues to evolve. In this overview of its journey from prototype code to freely available, battle-tested feature, we’ll take a look at that evolution and at what still lies ahead for the technology. <h2>What is ray tracing, anyway?</h2> But first, let’s quickly define ray tracing, and review at a high level how it differs from rasterization, the rendering method traditionally used in game engines prior to the advent of real-time ray tracing.<br /> <br /> <strong>Ray tracing </strong>is a method of determining the color of each pixel in a final render by tracing the path of light from the camera as it bounces around a scene, collecting and depositing color as it goes. Because it mimics the physical behavior of light, it delivers much more photorealistic results than rasterization, with effects such as soft, detailed shadows and reflections of off-screen objects.<br /> <br /> <strong>Rasterization</strong>, on the other hand, works by drawing the objects in a scene from back to front, mapping 3D objects to the 2D plane via transformation matrices. It determines the color of each pixel based on information (color, texture, normal) stored on the mesh, combined with the lighting in the scene. It is generally much faster than ray tracing, but cannot portray effects that rely on bounced light, such as true reflections, translucency, and ambient occlusion.  <h2>The technology catalyst</h2> Ray tracing has been used in feature films for decades by those striving for the most realistic imagery. But until very recently, ray tracing a single frame at film resolution could take several hours, even days. The idea of rendering ray-traced frames at 24 frames per second (fps) or higher was hard to imagine. 
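To make the per-pixel description above concrete, here is a deliberately tiny, self-contained ray tracing sketch in Python (not Unreal code; the sphere, camera, and light values are hypothetical hard-coded choices). It traces one primary ray per pixel against a single sphere and shades hits with simple Lambertian falloff:

```python
import math

def trace_pixel(x, y, width, height):
    """Trace one primary ray from a pinhole camera at the origin (looking
    down -Z) against a single hard-coded sphere; return an (r, g, b) color."""
    # Build the camera ray through this pixel on an image plane at z = -1.
    u = 2.0 * (x + 0.5) / width - 1.0
    v = 1.0 - 2.0 * (y + 0.5) / height
    length = math.sqrt(u * u + v * v + 1.0)
    d = (u / length, v / length, -1.0 / length)

    # Sphere at (0, 0, -3) with radius 1: solve |t*d - c|^2 = r^2 for t.
    cx, cy, cz, radius = 0.0, 0.0, -3.0, 1.0
    b = -2.0 * (d[0] * cx + d[1] * cy + d[2] * cz)
    c = cx * cx + cy * cy + cz * cz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return (40, 40, 60)                     # ray missed: background color

    t = (-b - math.sqrt(disc)) / 2.0            # nearest intersection distance
    hit = (t * d[0], t * d[1], t * d[2])
    normal = ((hit[0] - cx) / radius, (hit[1] - cy) / radius, (hit[2] - cz) / radius)

    # Lambertian shading from one directional light.
    light = (0.577, 0.577, 0.577)
    diffuse = max(0.0, sum(n * l for n, l in zip(normal, light)))
    return tuple(int(255 * diffuse * ch) for ch in (1.0, 0.4, 0.3))
```

A production ray tracer spawns further rays at each hit (shadow, reflection, refraction rays), which is where both the realism and the cost come from; a rasterizer, by contrast, never follows light past the first visible surface.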
<br /> <br /> Then came two concurrent game-changing technology advances: Microsoft’s <a href="" target="_blank">DXR (DirectX Raytracing) framework</a>, and NVIDIA’s <a href="" target="_blank">RTX platform</a>. Working with NVIDIA and ILMxLAB, Epic Games was able to show real-time ray tracing for the first time at GDC 2018 with our <a href="" target="_blank">Reflections</a> tech demo, which featured characters from <em>Star Wars: The Last Jedi</em>. <br /> <img alt="News_Real-Time_raytracing_blog_body_starwars_img2.jpg" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>Image from “Reflections”</em></div> <br /> The piece demonstrated textured area lights, ray-traced area light shadows, ray-traced reflections, ray-traced ambient occlusion, cinematic depth of field, and NVIDIA GameWorks ray tracing denoising—all running in real time. Because of Unreal Engine’s hybrid approach, passes that don’t benefit from ray tracing continued to be rasterized, boosting overall performance.<br /> <img alt="News_Real-Time_raytracing_blog_body_area_shadows_img.jpg" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>Ray-traced area light shadows</em></div> <br /> These were early days. The prototype Unreal Engine code was not yet available to users; the hardware required to run the demo in real time consisted of an NVIDIA DGX Station with four Tesla V100 GPUs, with a price tag somewhere in the region of $100,000—so neither the software nor the hardware was exactly what you could call accessible to the average user. <h2>Democratizing real-time ray tracing</h2> Five months later, at SIGGRAPH 2018, again working with NVIDIA and joined this time by Porsche, we launched “<a href="" target="_blank">The Speed of Light” Porsche 911 Speedster Concept</a>, a real-time cinematic experience. 
With Moore’s Law in full effect, this demo only required two NVIDIA Quadro RTX cards—still quite an investment at over $6,000 each at the time, but significantly less expensive than the earlier setup.<br /> <img alt="News_Real-Time_raytracing_blog_body_speedoflight_img.jpg" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>Image from “The Speed of Light”</em></div> <br /> In addition to the previously demonstrated features, <em>The Speed of Light</em> showed off ray-traced translucent and clear coat shading models, plus diffuse global illumination. As well as producing the cinematic, the team created an <a href="" target="_blank">interactive demo</a> to illustrate the real-time nature of the scene.<br /> <br /> Fast forward to GDC 2019 in March of this year, and we were ready to make the ray tracing functionality available as Beta features in Unreal Engine 4.22. To show how far the technology had come within 12 months, our partners <a href="" target="_blank">Goodbye Kansas</a> and <a href="" target="_blank">Deep Forest Films</a> created <a href="" target="_blank">Troll</a>, a real-time cinematic featuring a digital human.<br /> <img alt="News_Real-Time_raytracing_blog_body_troll_img.jpg" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>Image from “Troll”</em><br />  </div> <em>Troll </em>is an order of magnitude more complex than <em>Reflections</em>. While both pieces run at 24 fps, <em>Troll </em>has 62M triangles compared to <em>Reflections’</em> 5M, 16 lights compared to four, and the ray-traced passes were rendered at full 1080p resolution, as opposed to <em>Reflections</em>, where they were rendered at half resolution and scaled up. <em>Troll </em>also required the ray tracing of particles, and interoperability with other fundamental components of the Unreal Engine cinematographic pipeline including Alembic geometry cache, hair, and skin. 
Despite all this, <em>Troll</em> was rendered on a single NVIDIA RTX 2080 Ti, with a current list price of $1,199.<br /> <img alt="News_Real-Time_raytracing_blog_body_reflection_img.jpg" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>Comparison between screen-space reflections (L) and ray-traced reflections (R) in “Troll”</em></div> <br /> Today, with the software available for free in Unreal Engine, and graphics card requirements that won’t break the bank, real-time ray tracing is accessible to everyone. <h2>Ray tracing in Unreal Engine 4.23</h2> When we were working on the initial implementation of ray tracing in Unreal Engine, we were writing our code at the same time as Microsoft was putting the finishing touches on the DXR framework, and NVIDIA was working on new hardware and drivers. In addition, in order to support ray tracing, we undertook a major rendering pipeline refactor. In Unreal Engine 4.23, previewed at SIGGRAPH 2019, we’ve been able to devote effort to addressing instabilities caused by both of these factors.<br /> <br /> Performance has also been improved in many areas, including the sky pass, global illumination, and denoising. Denoising is critical to real-time ray tracing since the budget for the number of rays we can trace is very limited.<br /> <img alt="News_Real-Time_raytracing_blog_body_area_illunimation_img.jpg" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>Global illumination was put to good use in “emergence”, a short film by Evil Eye Pictures</em></div> <br /> We’ve also implemented intersection shaders, enabling you to support your own primitive types and render complex geometry such as hair or fractals. We’ve begun work on supporting multiple GPUs; the low-level support added in 4.23 offers control over which GPU will render a certain pass. 
And we’re continuing to support additional materials and geometry types, including landscape geometry, instanced static meshes, procedural meshes, and Niagara sprite particles. <h2>The road ahead for real-time ray tracing in UE4</h2> We’re not done yet! The team at Epic is still hard at work on improving our implementation. While we’ve made big strides in performance and stability with 4.23, there’s still more work to do in those areas before the feature gets the official “production ready” stamp. Then again, try telling that to the many users who are already happily using the current Beta code in production. We love that they’re stress-testing the code for us, along with our internal Fortnite cinematics team, who are certainly keeping us on our toes.<br /> <br /> As we mentioned earlier, we added low-level multi-GPU support in 4.23, but we’re looking to make that much more “plug and play.” And we’re always looking to support more geometry types. One of the most challenging examples is foliage with world position offset—required for effects such as the animation of leaves blowing in the wind—and we’re working on developing some ideas there. <br /> <br /> If you’d like to get a more in-depth view of our work so far, check out this <a href="" target="_blank">Tech Talk</a> from SIGGRAPH 2019. <br /> <br /> <br /> We’d like to thank everyone who’s already tried out real-time ray tracing in Unreal Engine and is actively sending us their valuable feedback. If you haven’t taken it for a test drive yet, why not <a href="" target="_blank">download the latest Unreal Engine</a> today? 
We look forward to hearing what you think.<br />  EnterpriseArchitectureAutomotiveFilm And TelevisionTraining And SimulationRay TracingJuan CañadaThu, 05 Sep 2019 17:00:00 GMTThu, 05 Sep 2019 17:00:00 GMT Offworld Industries brings realistic infantry training to the simulation community The moddable battle game “Squad” has earned accolades for its realistic scenarios, visuals, and sound effects that mirror real conflicts and combat. Now it’s been turned into a framework to support the creation of the next generation of infantry virtual training applications.<h2>The future of infantry training</h2> Army personnel might already be familiar with <a href="" target="_blank">Squad</a>, a moddable first-person shooter game where teams compete against each other in modern, realistic environments. <br /> <br /> <em>Squad’s </em>creators, <a href="" target="_blank">Offworld Industries (OWI)</a>, are working on an Unreal Engine-based VR framework for a custom army virtual training solution. Army organizations in the USA, UK, and abroad are exploring the use of this upcoming framework to develop immersive VR training, offering their personnel safe instruction for unsafe environments.  <br /> <img alt="blog_body_img_tank1.jpg" height="auto" src="" width="auto" /><br /> The ever-evolving need for training in a safe, controllable environment has driven innovation in the field of training and simulation (T&S) for many years. But the two sides of the T&S coin—an understanding of how personnel learn and retain knowledge, and the computer graphics technology to make use of this understanding—haven’t always advanced at the same pace.<br /> <br /> Only ten years ago, training techniques for military applications were ahead of the technology curve. Independent of the larger computer graphics industry, the defense sector continued to develop applications for its own needs. 
The evolution of operational doctrines and tactics has driven innovation in CGFs (computer generated forces), SAFs (semi-automated forces), and geospatial intelligence software, and continues to do so to this day. <br /> <br /> Paradoxically, traditional virtual training software, while focusing on application layers, has incurred a technology debt at its core level over the last decade. Compared to what we could call traditional training solutions, the next era of digitization will be less hardware-intensive and more software-intensive, and it will incur even more technology debt if the issue of software is not addressed. To bridge the gap between the opportunities presented by high-performance hardware and the inherent limitations of older T&S software, game engines have emerged as a solution. <br /> <br /> Experienced T&S experts have identified phases, or eras, in the evolution of virtual training, going from monolithic prime-delivered solutions to defense-industry-grown COTS solutions. Because of the immense technology and content-creation effort borne by all the actors in the simulation niche, the industry never had a chance to change its mindset and its business model. We are now at the beginning of a new era—not only is the technology ready, but new actors who can reach a larger audience are joining the training and simulation efforts as well, leading to a strong change in the business model.<br /> <br /> In this new era of training, companies like Offworld Industries, leveraging their expertise in Unreal Engine, can bridge these two worlds and generate training experiences that mix a high level of trainee involvement with the accuracy of a well-defined curriculum. Based on the same philosophy as Unreal Engine, OWI solutions are open and allow for an unlocked content pipeline, putting end users back in charge of their own destiny and their own data.   
<h3><strong>Birth of a military sim game</strong></h3> It all started with <em>Battlefield 2</em>, a 2005 first-person shooter game where players form teams and use modern military weapons and tactics. Chris Greig, Business Development Lead at Offworld Industries, and a colleague, Will Stahl, had developed a realism mod for <em>Battlefield 2</em> called <em>Project Reality </em>that not only provided additional landscapes, maps, and weapons, but also placed greater emphasis on teamwork and communication than the original game.<br /> <br /> Stahl wanted to take the ethos and team-working portions of <em>Project Reality </em>and turn them into a game that could also serve as a training and simulation product for the military. Stahl eventually became CEO and Co-Founder of Offworld Industries.<br /> <br /> It wasn’t long after <em>Squad’s </em>alpha release in 2015 that military personnel started discussing the game on their own forums, lauding it for its realistic situations, graphics, and audio.<br /> <img alt="blog_body_img_tank2.jpg" height="auto" src="" width="auto" /> <h3><strong>Squad: more than a game</strong></h3> <em>Squad, </em><a href="" target="_blank">currently available as an alpha</a>, is technically a game. But with its focus on duplicating real-life military situations, it is also a scalable solution for military training that lets users stay focused during a fire-fight and make critical decisions in the heat of combat. <br /> <br /> "What we&#39;re most interested in is a human side, and that&#39;s getting the reaction, that real, true, honest reaction out of a player," says Greig. 
"We work with a lot of ex-military, and we make sure that the crack of that bullet, or the smoke, or the explosion, really does simulate what they experienced when they were in conflicts."<br /> <br /> This attention to realism, and Offworld’s background in storytelling, has naturally led to military and police organizations taking an interest in <em>Squad </em>as a customizable solution for their own training. Organizations from the USA, UK, and Ukraine have reached out to Offworld to help them develop VR simulations to train their own personnel.<br /> <img alt="blog_body_img_terrain.jpg" height="auto" src="" width="auto" /><br /> “What we hear from our clients is that they want to spend most of their time running experiments, but that they end up spending most of their time in preparation,” says Greig. “The goal of these tools, and of using Unreal Engine for them, is that they spend less time preparing and more time actually executing their experiments.”<br /> <br /> On the data end, Offworld couples Unreal Engine’s realistic graphics output with its <a href="" target="_blank">Blueprint visual scripting</a> system to handle any body of data that needs to be incorporated into the training. Between Offworld’s attention to realism and their deep knowledge of Unreal Engine, they are able to produce flexible, effective, custom simulation environments that match their clients’ physical environments while also maximizing training transfer.<br /> <br /> “When we meet with stakeholders for the first time, we explain that we can quickly prototype a system that meets their needs, to show them the fidelity,” says Greig. “The system has access to the <a href="" target="_blank">source code</a>, which means whatever data they want to collect is limitless. 
There’s also a lot of flexibility as to sources of assets, which doesn’t lock them into any one provider.” <h3><strong>Becoming a key defense player </strong></h3> The popularity and success of<em> Squad</em> generated a multitude of requests from defense end users to add functionality that went beyond a modifiable game. These requests prompted Offworld to create a framework beyond the game itself.<br /> <br /> The framework is a set of plugins, assets, vehicles, and human representations dedicated to defense applications, a true-to-life sound effects generation system, and map terrains representative of field operation stages. All are geared toward creating an applicative software layer that can be used to create training exercises. The framework is built on top of Unreal Engine, which is Offworld’s core platform. <br /> <br /> The plugins themselves are written in C++ with Unreal Engine’s Blueprint visual scripting system, making them easily modifiable and customizable to any military application.<br /> <br /> “Right now, our focus is on proving the framework—prototype quickly, prove the capabilities, then look at the future where we can develop platforms together using Unreal Engine,” says Greig. “We want to teach them to fish rather than fishing for them."<br /> <br /> “The goal here is to be able to grow it, and grow it, and grow it, and eventually provide all the tools that anybody might need to do military research on top of Unreal Engine.”<br /> <br /> Offworld chose Unreal Engine for its realism, robust toolset, and flexibility in producing real-time scenarios. “The realism we can get out of Unreal is unparalleled by any other real-time engine,” says Greig. “That, plus the breadth of tools we can create and customize within the engine, made Unreal the natural choice for military applications.”<br /> <br /> <br /> Interested in finding out what UE4 can do for your training and simulation requirements? 
<a href="" target="_blank">Get in touch</a> and let’s start that conversation.<br />  EnterpriseGamesOffworld IndustriesSquadTraining And SimulationVRDefenseBlueprintsSébastien LozéThu, 05 Sep 2019 16:30:00 GMTThu, 05 Sep 2019 16:30:00 GMT Get in the director’s seat and join the first-ever Unreal Film Jam for a chance at an assortment of prizes and to connect with the experts at Blur Studio. Have you been looking for a chance to show off your storytelling skills? Well, here it is: join the first-ever Unreal Film Jam! <br /> <br /> Between now and October 19, create a short film or animated sequence based on the theme, “Oh, the places you’ll go” and enter for your chance at a package that includes: $5,000, personalized project reviews with Blur Studio (creators of <em>Love, Death & Robots</em>) and Epic Games, a feature on Unreal Engine’s blog, and more fantastic prizes.<br /> <br /> Each submission will be reviewed by a panel of judges, consisting of members from both Blur Studio and Epic Games, on narrative, use of theme, overall aesthetics, and originality. <br /> <br /> Many thanks to our sponsors <a href="" target="_blank">Blur Studio</a>, <a href="" target="_blank">SideFX</a>, and <a href="" target="_blank">Quixel</a>.<br /> <br /> Get full contest details and submit your entry on the <a href="" target="_blank">Unreal Film Jam page</a>. Good luck!Film And TelevisionNewsCommunityEnterpriseAmanda SchadeWed, 04 Sep 2019 18:00:00 GMTWed, 04 Sep 2019 18:00:00 GMT Fans enjoyed more than 100 Unreal Engine games at gamescom 2019 From <em>Gears 5</em> to <em>A Juggler’s Tale</em>, the halls of gamescom 2019 were packed with Unreal Engine-powered titles from teams of all sizes. Gaming enthusiasts from around the world recently gathered in Cologne, Germany for <a href="" target="_blank">gamescom 2019</a> — a massive multi-day event featuring over 200,000 square meters of exhibitor space with 370,000 attendees from more than 100 different countries. 
Of course, they were all there to witness and experience great games, and over 100 Unreal Engine-powered titles were on display throughout the show. <div style="text-align: center;"> </div> From AAA blockbusters like The Coalition’s <a href="" target="_blank">Gears 5</a> and Square Enix’s <a href="" target="_blank">Final Fantasy VII Remake</a>, to the unveiling of <a href="" target="_blank">Disintegration</a> and <a href="" target="_blank">Everspace 2</a>, as well as innovative indie projects such as <a href="" target="_blank">Dead Static Drive</a>, <a href="" target="_blank">Raji: An Ancient Epic</a>, <a href="" target="_blank">Last Oasis</a>, and <a href="" target="_blank">A Juggler’s Tale</a>, there was an immense variety of Unreal Engine developers demonstrating their projects at this year’s show.<br />   <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div>   <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div>   <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div>   <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <br /> Below is a list of the UE-powered projects that had a presence at gamescom 2019. 
Congratulations to all in attendance on an impressive showing — we can’t wait to see how these teams and their games progress!<br /> <br /> <a href="" target="_blank">A Juggler&#39;s Tale</a> | A Juggler&#39;s Tale Team<br /> <a href="" target="_blank">A Year of Rain</a> | Daedalic Entertainment<br /> <a href="" target="_blank">Ancestors: The Humankind Odyssey</a> | Panache Digital Games, Private Division<br /> <a href="" target="_blank">Astroneer</a> | System Era Softworks<br /> <a href="" target="_blank">Be:Brave</a> | One Dream<br /> <a href="" target="_blank">Biomutant</a> | Experiment 101, THQ Nordic<br /> <a href="" target="_blank">Blair Witch</a> | Bloober Team<br /> <a href="" target="_blank">Blazing Sails</a> | Get Up Games<br /> <a href="" target="_blank">Bleeding Edge</a> | Ninja Theory, Xbox Game Studios<br /> <a href="" target="_blank">Borderlands 3</a> | Gearbox Software, 2K Games<br /> <a href="" target="_blank">Boundary</a> | Surgical Scalpels<br /> <a href="" target="_blank">Bramble - The Mountain King</a> | Dimfrost Studio<br /> <a href="" target="_blank">Bus Simulator</a> | Still Alive Studios, Astragon Entertainment<br /> <a href="" target="_blank">Cainwood</a> | Invisible Walls<br /> <a href="" target="_blank">Chernobylite</a> | The Farm 51<br /> <a href="" target="_blank">Close To The Sun</a> | Storm in a Teacup, Wired Productions<br /> <a href="" target="_blank">Code Vein</a> | BANDAI NAMCO Studios <br /> <a href="" target="_blank">Cold Comfort</a> | Gamma Minus<br /> <a href="" target="_blank">Comanche</a> | Nukklear, THQ Nordic<br /> <a href="" target="_blank">Concrete Genie</a> | PixelOpus, Sony Interactive Entertainment<br /> <a href="" target="_blank">Creature In The Well</a> | Flight School Studio<br /> <a href="" target="_blank">Darksiders Genesis</a> | Gunfire Games, THQ Nordic<br /> <a href="" target="_blank">Dead Static Drive</a> | Fanclub <br /> <a href="" target="_blank">Deconstruction Corp</a> | Frogsong Studios<br /> <a href="" 
target="_blank">Deep Rock Galactic</a> | Ghost Ship Games, Coffee Stain Studios<br /> <a href="" target="_blank">Deliver Us The Moon</a> | KeokeN Interactive, Wired Productions<br /> <a href="" target="_blank">Destroy All Humans</a> | Black Forest Games, THQ Nordic<br /> <a href="" target="_blank">Devil&#39;s Hunt</a> | Layopi Games, 1C Publishing<br /> <a href="" target="_blank">Disintegration</a> | V1 Interactive, Private Division<br /> <a href="" target="_blank">Dragon Ball Z Kakarot</a> | CyberConnect2, BANDAI NAMCO Entertainment<br /> <a href="" target="_blank">Dreadout 2</a> | Digital Happiness<br /> <a href="" target="_blank">DOLMEN</a> | Massive Work Studio <br /> <a href="" target="_blank">Dreamo</a> | Hypnotic Ants Studio, Carbon Studio<br /> <a href="" target="_blank">Drone Champions League</a> | Drone Champions League, THQ Nordic<br /> <a href="" target="_blank">ESPIRE 1: VR Operative</a> | Digital Lode, Tripwire Interactive<br /> <a href="" target="_blank">Everspace</a> | ROCKFISH Games<br /> <a href="" target="_blank">Everspace 2</a> | ROCKFISH Games<br /> <a href="" target="_blank">Faith + Honor: Barbarossa</a> | encurio <br /> <a href="" target="_blank">Final Fantasy VII Remake</a> | Square Enix<br /> <a href="" target="_blank">Fishing Barents Sea</a> | Misc Games, Astragon Entertainment<br /> <a href="" target="_blank">Gates of Mirnah</a> | Number Mill<br /> <a href="" target="_blank">Gears 5</a> | The Coalition, Xbox Game Studios<br /> <a href="" target="_blank">Genesis Noir</a> | Feral Cat Den, Fellow Traveller<br /> <a href="" target="_blank">Ghostrunner</a> | One More Level, All in! 
Games<br /> <a href="" target="_blank">Grand Guilds</a> | Drix Studios<br /> <a href="" target="_blank">Groundhog Day: Like Father Like Son</a> | Tequila Works, Sony Pictures Virtual Reality<br /> <a href="" target="_blank">Hell Pie</a> | Sluggerfly<br /> <a href="" target="_blank">Hoverloop</a> | Not A Company, Cronos Interactive<br /> <a href="" target="_blank">Hubris</a> | Cyborn<br /> <a href="" target="_blank">HYPERCHARGE: UNBOXED</a> | Digital Cybercherries <br /> <a href="" target="_blank">HyperParasite</a> | Troglobytes Games, Hound Picked Games<br /> <a href="" target="_blank">In The Black</a> | Impeller Studios<br /> <a href="" target="_blank">INNER</a> | KillaSoft<br /> <a href="" target="_blank">Insurgency</a> | New World Interactive, Focus Home Interactive<br /> <a href="" target="_blank">KAMILE VR</a> | Kamile VR Team, Gluk Media, UPĖ Media<br /> <a href="" target="_blank">Killsquad</a> | Novorama <br /> <a href="" target="_blank">Kine</a> | Gwen Frey<br /> <a href="" target="_blank">King&#39;s Bounty II</a> | 1C Entertainment <br /> <a href="" target="_blank">Kingdom Hearts 3</a> | Square Enix Division 3, Square Enix<br /> <a href="" target="_blank">Last Oasis</a> | Donkey Crew<br /> <a href="" target="_blank">Life Is Strange 2</a> | DONTNOD Entertainment, Square Enix<br /> <a href="" target="_blank">Little Nightmares 2</a> | Tarsier Studios, BANDAI NAMCO Entertainment<br /> <a href="" target="_blank">Lost Hero</a> | Gold Knight<br /> <a href="" target="_blank">Main Assembly</a> | Bad Yolk Games<br /> <a href="" target="_blank">Man Of Medan</a> | SUPERMASSIVE Games, BANDAI NAMCO Entertainment<br /> <a href="" target="_blank">Maneater</a> | Tripwire Interactive<br /> <a href="" target="_blank">Marvel vs. 
Capcom: Infinite</a> | Capcom<br /> <a href="" target="_blank">Maze Slaughter</a> | Giant Gun Games<br /> <a href="" target="_blank">MediEvil</a> | Other Ocean, Sony Interactive Entertainment<br /> <a href="" target="_blank">Metamorphosis</a> | Ovid Works, All in! Games<br /> <a href="" target="_blank">Minecraft Dungeons</a> | MOJANG, Microsoft Studios<br /> <a href="" target="_blank">Moons Of Madness</a> | Rock Pocket Games, Funcom<br /> <a href="" target="_blank">Mortal Kombat 11</a> | NetherRealm Studios, Warner Bros. Interactive Entertainment<br /> <a href="" target="_blank">Nyx The Awakening</a> | Black Sail Games, All in! Games<br /> <a href="" target="_blank">OMNI</a> | North 3D Studio <br /> <a href="" target="_blank">Omno</a> | Studio Inkyfox <br /> <a href="" target="_blank">Orcs Must Die! 3</a> | Robot Entertainment<br /> <a href="" target="_blank">Out Of Place</a> | Bagpack Games<br /> <a href="" target="_blank">Pagan Online</a> | Madhead Games<br /> <a href="" target="_blank">Partisans 1941</a> | Alter Games<br /> <a href="" target="_blank">Predator: Hunting Grounds</a> | Illfonic, Sony Interactive Entertainment<br /> <a href="" target="_blank">Project Shelter</a> | Atomicom<br /> <a href="" target="_blank">Project Witchstone</a> | Spearhead Games <br /> <a href="" target="_blank">Protocore</a> | IUMTEC <br /> <a href="" target="_blank">RAD</a> | Double Fine Productions, BANDAI NAMCO Entertainment<br /> <a href="" target="_blank">Raji: An Ancient Epic</a> | Nodding Heads Games, SUPER.COM<br /> <a href="" target="_blank">Ready Set Heroes</a> | Robot Entertainment, Sony Interactive Entertainment<br /> <a href="" target="_blank">Remothered: Tormented Fathers</a> | Stormind Games, Darril Arts<br /> <a href="" target="_blank">Robo Recall: Unplugged</a> | Drifter Entertainment, Epic Games, Oculus<br /> <a href="" target="_blank">Roboquest</a> | RyseUp Studios<br /> <a href="" target="_blank">Sea Of Thieves</a> | Rare, Microsoft Studios<br /> <a href="" 
target="_blank">Session</a> | Crea-ture Studios Inc.<br /> <a href="" target="_blank">Shadows of Larth</a> | Zine FALOUTI <br /> <a href="" target="_blank">Silver Chains</a> | Cracked Heads Games, Headup Games<br /> <a href="" target="_blank">Slice Back</a> | Atomic Kraken<br /> <a href="" target="_blank">Space Cows</a> | Happy Corruption, All in! Games<br /> <a href="" target="_blank">SpongeBob SquarePants Battle for Bikini Bottom - Rehydrated</a> | Purple Lamp, THQ Nordic<br /> <a href="" target="_blank">Spyro Reignited Trilogy</a> | Toys for Bob, Activision Publishing<br /> <a href="" target="_blank">Stalin vs. Martians 4</a> | KREMLINCORP<br /> <a href="" target="_blank">Stray Blade</a> | Point Blank Games<br /> <a href="" target="_blank">SYNCED: Off-Planet</a> | NEXT Studios<br /> <a href="" target="_blank">TAUCETI Unknown Origin</a> | BadFly Interactive<br /> <a href="" target="_blank">The Cycle</a> | Yager Development<br /> <a href="" target="_blank">The Red Solstice 2</a> | Ironward<br /> <a href="" target="_blank">The Suicide Of Rachel Foster</a> | 101 Percentage, DAEDALIC Entertainment<br /> <a href="" target="_blank">The Waylanders</a> | GATO STUDIO<br /> <a href="" target="_blank">Tropico 6</a> | Limbic Entertainment, Kalypso Media Group<br /> <a href="" target="_blank">Vigor</a> | Bohemia Interactive<br /> <a href="" target="_blank">WARTILE</a> | Playwood Games, Deck13 Interactive, Whisper Games<br /> <a href="" target="_blank">Weakless</a> | Punk Notion<br /> <a href="" target="_blank">Westworld Awakening</a> | Survios, Warner Bros. 
Interactive Entertainment, HBO<br /> <a href="" target="_blank">Xevorel</a> | Amberaxe<br /> Daniel Kayser | Tue, 03 Sep 2019 18:00:00 GMT Behaviour Digital puts players first in the Deathgarden: BLOODHARVEST reboot. One year and one reboot later, Behaviour Digital continues to improve Deathgarden: BLOODHARVEST with the help of community feedback. Launched into Early Access in August 2018, Deathgarden has seen a lot of changes since its debut. Leaning heavily on input from the community alongside learning their own lessons along the way, developer Behaviour Interactive has strived to make the game everything a fan of asymmetric titles would want to play. On May 30, 2019, the game known simply as Deathgarden received a full reboot and was reborn as <a href="" target="_blank">Deathgarden: BLOODHARVEST</a>. The revamped version of the game introduced new gameplay, more characters, and a darker direction for the game’s aesthetic.<br /> <br /> Set in an unforgiving dystopian future, Deathgarden: BLOODHARVEST puts players in the role of either a ruthless hunter or a desperate scavenger while incorporating asymmetric gameplay. Every scavenger strives to enter the Enclave, a safe haven in a world where only a select few live in comfort while the rest of humanity suffers in the slums. The only way into the Enclave, however, is through the Hunter in a deadly game of cat and mouse.<br /> <br /> Chatting with Behaviour Digital Design Director Matt Jackson and Lead Developer Yohann Martel, we learn what’s changed since Deathgarden’s launch one year ago, how Unreal Engine has got them to where they are now, and how the game is going to keep changing as the team moves ever closer to their 1.0 launch. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong>Thanks for chatting with us. 
Can you tell us a little bit about Behaviour Interactive and some of the games you&#39;re most proud of?</strong><br /> <br /> <strong>Design Director Matt Jackson:</strong> Montreal-based Behaviour Interactive was founded in 1992 and is now one of North America’s largest independent game developers with close to 550 employees and over 70 million games sold on every platform. To date, the company’s mobile games have reached over 200 million players worldwide. In 2019, its most successful IP, the award-winning <a href="" target="_blank">Dead by Daylight</a>, celebrated 12 million players. Every game we have worked on throughout these years allowed us to learn and gain experience, whether working with clients or developing our own IPs. Picking a favorite game is like picking a favorite child - it’s impossible!<br /> <br /> <strong>What was the catalyst behind creating Deathgarden? Were there any specific inspirations?<br /> <br /> Jackson:</strong> Coming off the success of Dead by Daylight, we felt that the asymmetric genre had been somewhat underserved in the industry. There have been a few titles over the past decade or so that have tried their hand at it, but there hasn’t been a groundswell of new games in the genre. We know players are hungry for new multiplayer experiences and felt that with our experience creating asymmetric games, we were in a unique position to deliver one of these new experiences. Obviously, we also really love the genre and wanted to expand it in a meaningful way with Deathgarden. <br /> <br /> The seed of the idea for our game was a simple question, “What if we took the concepts of Dead by Daylight and expanded them into a fast-paced FPS/third-person experience?” It’s a challenging, yet rewarding, design problem to solve so that’s kept us very engaged. 
<br /> <img alt="DeveloperInterview-Deathgarden-Blog-body-replacement.png" height="auto" src="" width="auto" /><br /> <strong>Deathgarden is an asymmetric title pitting five runners against an unstoppable hunter. In contrast to many other asymmetric games, the hunter cannot be brought down, which really changes the gameplay. How and when did this twist on the game design come around? <br /> <br /> Jackson:</strong> In our first internal iteration of the game, we had a type of Hunter “death,” but they would respawn after a short time. What we found, however, was that this design promoted two unwanted side effects. First, these deaths made players who were struggling with the Hunter role have an even harder time against the Scavengers, which often promoted "trolling" of the Hunter character. <br /> <br /> The second effect, and arguably the more salient point, is that it damaged the Hunter fantasy of being a badass killer who is “in charge” of the Garden. We also wanted Scavengers to “feel the fear” of knowing that they have literally no power against the Hunter and embrace the unique and powerful emotions that come from that dynamic.<br /> <br /> <strong>The first arena in Deathgarden is set in beautiful British Columbia, Canada, across daytime, foggy-day, and night-time settings. How did Unreal Engine 4 help you most when creating such a lush environment? <br /> <br /> Lead Developer Yohann Martel:</strong> It helped us in a few ways. The <a href="" target="_blank">automatic LOD generation</a> certainly helped for the performance. We also developed a tool in <a href="" target="_blank">Blueprints</a> to create an impostor billboard for some assets directly in Unreal, which gave us even better performance and view distance. 
Using the <a href="" target="_blank">profiling tools</a> and different buffer visualizations also helped us quickly identify bottlenecks.<br /> <br /> Another thing that helped us achieve interesting results was the way hardware instancing was exposed through the hierarchical instanced static meshes. Without the level of control of hardware instancing, having dense and detailed procedurally generated environments running at 60 FPS would have been nearly impossible. <br /> <img alt="DG_Bloodharvest-(4).png" height="auto" src="" width="auto" /><br /> <strong>The lore behind Deathgarden is quite deep, giving an interesting backdrop to its world. Many games in this genre fail to capitalize on any sort of narrative. How important is it to you to incorporate your lore into the gameplay experience, and how do you pull it off? <br /> <br /> Jackson:</strong> I think it’s very important. Providing a strong context for the player experience sets the tone when you start to play, as well as provides continuing reasons to come back to the game. We worked hard at not only writing strong lore, but also visual elements that set this tone. For example, every time you log into Deathgarden, the “Locker Room” area for each role (Hunter/Scavenger) tells a story. The Scavengers inhabit a worn-out, dark, foreboding location. It reflects their lot in life: ever repeating this Garden contest to prove their worth and potentially enter the “Enclaves.” In contrast, the Hunter “Locker Room” is a much brighter, welcoming place that reminds players of the power and privilege that comes with being a Hunter. <br /> <br /> <strong>If the team had to pick their favorite tool in Unreal Engine 4, what would it be and why? 
<br /> <br /> Martel: </strong>I asked the team and we came up with the following:<br />   <ul style="margin-left: 40px;"> <li><a href="" target="_blank">Material Editor</a> - When the team asks for a new feature in the master material and it takes us just a few minutes to achieve, they are always impressed and satisfied with the result.</li> <li>The skeleton sharing and retargeting system is good because it removes a lot of the hassle of creating a specific animation for each skeleton.</li> <li>The <a href="" target="_blank">Graph Editor</a> that is at the heart of many core features of Unreal (Blueprint scripts, materials, <a href="" target="_blank">animations</a>, <a href="" target="_blank">particles</a>, <a href="" target="_blank">UI</a>, etc.) offers a large amount of control and flexibility. It’s also very accessible. Everybody on the project, whether they’re a game designer, an animator, or a 3D artist, would make use of it. During the development of Deathgarden, we saw the arrival of new features like <a href="" target="_blank">Niagara</a> that made this system even more powerful. </li> </ul> <img alt="DG_Bloodharvest-(19).png" height="auto" src="" width="auto" /><br /> <strong>The arenas in Deathgarden are procedurally generated, which I would imagine is usually a lengthy if not complicated feature to implement. Was there anything specific in Unreal Engine 4 that made this process easier for you? <br /> <br /> Martel:</strong> We were able to do a procedurally generated map prototype in just a few days by using Blueprints. It really helped define what kind of system we would need to code once we were ready to implement the real system in <a href="" target="_blank">C++</a>. <br /> <br /> Also, almost everything in our game is instantiated, which means we cannot bake many elements and we needed a way to optimize the game. Unreal has a great way of helping artists by creating the <a href="" target="_blank">LODs for the assets automatically</a>. 
That saved us an incredible amount of time compared to making them all by hand. <br /> <br /> <strong>Deathgarden launched into Early Access August 14th of last year. How much has feedback from your community influenced your path at this point? <br /> <br /> Jackson:</strong> It has influenced our path 100 percent. Games are a product and an experience meant to be “touched,” interacted with, and invested into by players. Deathgarden is meant to be malleable and improve over time with the help and passion of our fans. We are not here to dictate changes; we are here to guide our vision into a desired package that players will enjoy and want to come back to. Without players, there are no games. We value the feedback we receive very highly and hope it continues as much as it has. <br /> <img alt="DG_Bloodharvest-(15).png" height="auto" src="" width="auto" /><br /> <strong>What changes has the game seen since the initial launch into Early Access? <br /> <br /> Jackson:</strong> Upon the initial Early Access release, we received a lot of positive feedback regarding certain elements of the game, as well as requests for improvements to other areas. While the movement, controls, and fast-paced gameplay were highly praised, our game modes left something to be desired - there was too much reliance on strict team play for many. It made the game somewhat impenetrable to new players. <br /> <br /> We also lacked a meaningful progression system for players to keep investing time into. We took this feedback to heart, worked hard on removing restrictions by softening team play, and added a robust progression system. Revisiting the game also allowed us to invest in other areas that we wanted to explore, namely lore, fantasy, and characters. We continue to make improvements to the game, both major and minor, and hope that players will continue to take this journey with us. 
<br /> <br /> <strong>Considering the sheer number of games under Behaviour Digital&#39;s belt, what advice would you give to a developer learning UE4 for the first time? <br /> <br /> Martel:</strong> Here are the major pieces of advice I would give a developer learning Unreal Engine 4:<br />   <ul style="margin-left: 40px;"> <li>Start learning the C++ API to the same level as Blueprints, because while each of these has its limitations, when you mix them together, it becomes a wonderful and flexible system for the whole team. </li> <li>Also, when working with Blueprints or Materials, try to be as generic as possible and keep it simple. Make reusable functions; it will keep your graphs clean and save you time in the long run.</li> </ul> <img alt="DG_Bloodharvest-(6).png" height="auto" src="" width="auto" /><br /> <strong>Where are all the places people can go to learn about Deathgarden? </strong><br /> <br /> We are present on many platforms and always happy to chat with our players!<br />   <ul style="margin-left: 40px;"> <li><a href="" target="_blank">Facebook</a> </li> <li><a href="" target="_blank">Twitter</a> </li> <li><a href="" target="_blank">Instagram</a> </li> <li><a href="" target="_blank">Reddit</a> </li> <li><a href="" target="_blank">Discord</a> </li> <li><a href="" target="_blank">Twitch</a> </li> <li><a href="" target="_blank">YouTube</a> </li> <li><a href="" target="_blank">Forum</a></li> </ul> Behaviour DigitalDeathgarden: BLOODHARVESTGamesBlueprintsShawn Petraschuk | Tue, 03 Sep 2019 11:00:00 GMT Free Marketplace Content - September 2019. Craft an interactive narrative with a scripting and dialogue system, get bombastic with a collection of explosion FX, create a frozen winter wonderland with icy cool materials, and so much more! In partnership with Unreal Engine Marketplace creators, select content will be available for free to the Unreal community each month to give artists, designers, and programmers access to even more 
resources at no additional cost.<br /> <br /> See this month’s great selection below! <h2><strong>September&#39;s Featured Free Content:</strong></h2> <h2><a href="" target="_blank">Blueprint Dialogues</a> | <a href="" target="_blank">Alain Bellemare</a>  </h2> <div style="text-align: center;"><img alt="News_UESC_SEPT2019_Blog1.jpg" src="" /><br /> <em>This versatile dialogue and scripting system enables quick dialogue creation, with a myriad of inline text formatting tools and configurable interface options.</em></div> <h2><a href="" target="_blank">Ice Cool</a> | <a href="" target="_blank">Krystian Komisarek</a></h2> <div style="text-align: center;"><img alt="News_UESC_SEPT2019_Blog2.jpg" src="" /><br /> <em>Create icy ground, ice cubes, icebergs, crystals, and icicles using this advanced master material designed and optimized for mobile, VR, and stylized PC and console games.</em></div> <h2><a href="" target="_blank">OwnIcon - Create your own Icons</a> | <a href="" target="_blank">SchmidtGames</a></h2> <div style="text-align: center;"><img alt="News_UESC_SEPT2019_Blog3.jpg" src="" /><br /> <em>Quickly and easily make unique icons using existing meshes from your Unreal Engine project.</em></div> <h2><a href="" target="_blank">Skeleton Crew Bundle</a> | <a href="" target="_blank">Bitgem 3D</a></h2> <div style="text-align: center;"><img alt="News_UESC_SEPT2019_Blog4.jpg" src="" /><br /> <em>Five fully-animated, low-poly skeletons are ready to drop in and fight for their life.</em></div> <h2><a href="" target="_blank">VFX Grenade Pack</a> | <a href="" target="_blank">Gentleman Fred FX</a></h2> <div style="text-align: center;"><img alt="News_UESC_SEPT2019_Blog5.jpg" src="" /><br /> <em>Detonation just got easier with this collection of grenade explosion effects for a variety of surface types, including concrete, dirt, grass, ice, and more.</em></div> <h2><strong>New Permanently Free Content:</strong></h2> <h2><a href="" target="_blank">Horror Engine</a> | <a href="" 
target="_blank">tansuergene</a></h2> <div style="text-align: center;"><img alt="News_UESC_SEPT2019_Blog6.jpg" src="" /></div> <div style="text-align: center;"><em>Use this robust system including inventory, questing, UI elements, and more to build the spookiest game one can conjure.</em></div> <h2><a href="" target="_blank">SuperGrid Starter pack</a> | <a href="" target="_blank">ZeOrb</a></h2> <div style="text-align: center;"><img alt="News_UESC_SEPT2019_Blog7.jpg" src="" /><br /> <em>Rapidly prototype new levels with loads of modular meshes and materials.</em></div> <br /> The monthly content will only be available through the end of September. Download it before new, free content arrives in October!<br /> <br /> Are you a Marketplace creator interested in sharing your content for free with the community? Visit <a href="" target="_blank"></a> to learn how you could be featured!<br />  CommunityLearningMarketplaceNewsAmanda Schade | Sun, 01 Sep 2019 14:00:00 GMT Helping brain surgeons practice with real-time simulation. Human anatomy is complex, and so is brain surgery. Find out how a team at The University of Tokyo developed a medical simulation package for brain surgeons with Unreal Engine. Brain surgery remains one of the most intricate and challenging tasks a surgeon might have to perform, and requires extensive training and practice. While the medical field has long used CGI to visualize the human body inside and out, many applications are limited to opaque shaders on rigid objects, which don’t accurately depict the many translucent, thin, or flexible parts of the brain’s structure. 
<br /> <br /> In their 2018 paper <a href="" target="_blank">Enhancement Techniques for Human Anatomy Visualization</a>, Hirofumi Seo and Takeo Igarashi state that “Human anatomy is so complex that just visualizing it in traditional ways is insufficient for easy understanding…” To address this problem, Seo has proposed a practical approach to brain surgery using real-time rendering with Unreal Engine. <br /> <br /> Now Seo and his team have taken this concept a step further with their 2019 paper <a href="" target="_blank">Real-Time Virtual Brain Aneurysm Clipping Surgery</a>, where they demonstrate an application prototype for viewing and manipulating a CG representation of a patient’s brain in real time. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> As part of the User Interface Research Group, Igarashi Laboratory, Graduate School of Information Science and Technology at The University of Tokyo, Seo and his team are working on a real-time visualization and training application for brain surgery that more accurately portrays the brain’s structure and how it deforms during surgery. The software prototype, made possible with a grant (Grant Number JP18he1602001) from <a href="" target="_blank">Japan Agency for Medical Research and Development (AMED)</a>, helps surgeons visualize a patient’s unique brain structure before, during, and after an operation.<br /> <img alt="Blog_Body_Image_4.jpg" height="auto" src="" width="auto" /> <h3><strong>Addressing the challenges of aneurysm surgery</strong></h3> The brain aneurysm or cerebral aneurysm—a bulge like a balloon on a brain artery—is present in about 3% of the adult population worldwide. Aneurysms can rupture and cause the artery to bleed out internally, which causes death in 40% of cases, and permanent neurological damage in 66% of survivors. 
A ruptured aneurysm is also a leading cause of hemorrhagic stroke.<br /> <br /> One of the most effective treatments for an aneurysm is <a href="" target="_blank">clipping</a>, where a surgeon places a small clip across the neck of the bulge. Clipping prevents further blood flow to the aneurysm and effectively holds the artery closed. <br /> <br /> Any clipping procedure involves entering an opening in both the skull and at least one sulcus (groove) in the brain. The <a href="" target="_blank">transsylvian approach</a> to clipping pulls and opens the <a href="" target="_blank">Sylvian fissure</a>, a deep sulcus between the frontal lobe and the temporal lobe of the brain. <br /> <img alt="Blog_Body_Image_5.jpg" height="auto" src="" width="auto" /><br /> Within the Sylvian fissure are several blood vessels connected across the frontal and temporal lobes. To safely open the Sylvian fissure during surgery, neurosurgeons must pull each vessel aside to its dominant region. Choosing the correct direction for each vessel is important, as failure to do so could cause instability of the blood vessel, or hemorrhage.<br /> <br /> When the surgeon can see these vessels directly, making this determination is relatively straightforward. However, during the surgery the visible area is very limited—only partial segments of the blood vessels are visible.<br /> <img alt="Blog_Body_Image_6.jpg" height="auto" src="" width="auto" /><br /> “Neurosurgeons all over the world performing aneurysm surgery want some kind of pre-surgical simulation, practice, or check, because the actual surgical view is very limited and the surgery itself is very difficult,” says Seo. “They also know that the dominant region of each blood-vessel branch is easily predictable if they can see the whole brain and the blood vessels. 
So many neurosurgeons have wanted to use 3D CG for a long time, but they don’t know how to implement it.”<br /> <img alt="Blog_Body_Image_3.jpg" height="auto" src="" width="auto" /> <h3><strong>Creating an app</strong></h3> About two years ago, Seo’s Igarashi Laboratory was asked to collaborate with the Department of Neurosurgery at The University of Tokyo Hospital to develop a CG tool to help surgeons visualize the transsylvian approach in real time, and as realistically as possible.<br /> <br /> In their aforementioned paper Real-Time Virtual Brain Aneurysm Clipping Surgery, Seo and his fellow authors propose the approach of creating a deformable CG brain from patient data, with intelligent algorithms to automatically determine the dominant region for each blood vessel. The model includes automatically synthesized virtual trabeculae (strands of connective tissue) to represent the thin strings that connect the brain and vessels. In the application, the user can “pull” on the brain and the blood vessels to deform and open them at the sulcus, with the visuals updating instantly in real time to show the result.<br /> <img alt="Blog_Body_Image_1.jpg" height="auto" src="" width="auto" /><br /> With a real-time 3D visualization, the surgeon can load a model of a patient’s brain from the individual’s MRI and 3D Rotational Angiography (3DRA) data, look at it from any angle, pull apart the CG sulcus to see inside, and even make the lobes invisible to better see the blood vessels. The user controls everything via simple mouse cursor movements or via multi-touch, making the app easy and accessible for surgeons without technical experience.<br /> <img alt="Blog_Body_Image_2.jpg" height="auto" src="" width="auto" /><br /> In developing the application, Seo’s team chose Unreal Engine as the underlying real-time technology because of its graphics and programming tools. 
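The core geometric decision the paper automates—assigning each visible vessel segment to its dominant region—can be illustrated with a simple nearest-centroid test. The sketch below is engine-free plain C++ (a small `Vec3` stands in for Unreal's `FVector`); the type and function names are illustrative assumptions, not the authors' published algorithm, which uses more sophisticated criteria than raw distance:

```cpp
#include <cassert>
#include <vector>

// Minimal 3D vector, standing in for Unreal Engine's FVector.
struct Vec3 {
    double X, Y, Z;
};

// Squared distance between two points (avoids an unnecessary sqrt,
// since we only compare distances).
static double DistSquared(const Vec3& A, const Vec3& B) {
    const double dx = A.X - B.X, dy = A.Y - B.Y, dz = A.Z - B.Z;
    return dx * dx + dy * dy + dz * dz;
}

enum class Lobe { Frontal, Temporal };

// Assign each vessel-segment midpoint to whichever lobe centroid is
// nearer. Illustrative only: a stand-in for the kind of per-vessel
// dominant-region classification described in the article.
std::vector<Lobe> AssignDominantRegions(const std::vector<Vec3>& SegmentMidpoints,
                                        const Vec3& FrontalCentroid,
                                        const Vec3& TemporalCentroid) {
    std::vector<Lobe> Result;
    Result.reserve(SegmentMidpoints.size());
    for (const Vec3& P : SegmentMidpoints) {
        const bool NearerFrontal =
            DistSquared(P, FrontalCentroid) <= DistSquared(P, TemporalCentroid);
        Result.push_back(NearerFrontal ? Lobe::Frontal : Lobe::Temporal);
    }
    return Result;
}
```

In an actual UE4 implementation, `Vec3` and `DistSquared` would be replaced by `FVector` and `FVector::DistSquared`, the math APIs the team credits below.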
“Unreal Engine has powerful mathematical <a href="" target="_blank">C++ APIs</a> such as FVector, FMath, and UKismetMathLibrary, so we find it to be a suitable platform for research on 3D CG geometry,” says Seo.<br /> <br /> Due to the need to implement a super-fast physics simulation, speed was also a factor. The real-time app Seo’s team developed runs at 40-50 frames per second, something the medical industry is unaccustomed to. “Real-time deformation of the brain is a big surprise to people discovering our applications,” says Seo. “The beautiful rendering quality is also very new to the medical field.”<br /> <br /> Epic Games is pleased to support this innovative use of real-time rendering. As also evidenced in <a href="" target="_blank">virtual reality orthopedic surgery</a>, the ability to realistically portray anatomy in real time gives the medical community enhanced methods to train surgeons in not only the practical aspects of surgery, but also in the decision-making process it inevitably includes.<br /> <br /> <br /> Interested in finding out how you could use Unreal Engine for medical simulation? <a href="" target="_blank">Get in touch</a> and we’ll be glad to start that conversation.<br />  BlueprintsEnterpriseProgrammingThe University of TokyoTraining And SimulationVisualizationMedicalSébastien Lozé | Fri, 30 Aug 2019 16:00:00 GMT From SketchUp to Twinmotion, build your archviz scene in minutes: webinar replay. Missed our latest webinar on Twinmotion? Now you can watch it on demand! Learn how to build an archviz scene from scratch, importing from SketchUp using the new direct link. We recently hosted the live webinar <strong>From SketchUp to Twinmotion, build your archviz scene in minutes</strong>. If you missed it, no problem! 
The replay is available right here.<br /> <br /> In this webinar, Martin Krasemann, Twinmotion Product Specialist at Epic Games, takes you through the building of an archviz scene from the beginning, explaining the process step by step.<br /> <br /> He uses the newly available SketchUp direct link for Twinmotion, as well as the latest collection of grass assets.<br /> <br /> Watch this webinar replay and learn how to create high-quality images, panoramas, and standard or 360° VR videos!<br /> <br /> <strong>You’ll learn:</strong><br />   <ul style="margin-left: 40px;"> <li>How to use the new SketchUp direct link, a faster and easier way to work with SketchUp data in Twinmotion</li> <li>How to use the Twinmotion feature set to bring your project to life</li> <li>How to output the finished media</li> </ul> <br /> You can watch the replay here: <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> For an introduction to the tool, take a look <a href="" target="_blank">at this webinar</a>, in which we discuss how it fits into a workflow for fast real-time architectural, construction, urban planning, and landscaping visualization.<br /> <br />  ArchitectureCommunityDesignEnterpriseLearningNewsVisualizationTwinmotionWebinarFri, 30 Aug 2019 15:30:00 GMT Supermassive Games explains how The Dark Pictures Anthology refines the formula for twisted tales. The developer of horror game Until Dawn speaks to how they incorporated two-player and couch co-op for their ambitious The Dark Pictures Anthology. When <a href="" target="_blank">Supermassive Games</a> released horror game <a href="" target="_blank">Until Dawn</a> in 2015, the interactive drama quickly became a fan favorite. Its engaging branching narrative made it incredibly popular on streaming sites. 
In an attempt to refine their craft, the British developer set out to create The Dark Pictures Anthology, a collection of horror games that tackle different genre archetypes. <br /> <br /> To see how Supermassive Games is pushing the horror genre forward with <a href="" target="_blank">Man of Medan</a>, the first entry in the series, we interviewed Game Director Tom Heaton. He talks about what they learned while developing Until Dawn that they’re building on for The Dark Pictures Anthology. For instance, Supermassive noticed that groups of people would often huddle around the TV as one person played the survival thriller, so the studio incorporated a couch co-op mode for Man of Medan that lets friends pass around a single gamepad to control different in-game characters. Heaton also talks about designing their new two-player online co-op mode, which created difficult challenges for the developer to overcome in terms of keeping the action not only engaging, but in sync between two players who go down divergent paths. <br /> <br /> In our discussion, Heaton also explains how Man of Medan has been designed with replayability in mind, making it the studio&#39;s biggest branching game yet. Finally, Heaton expounds on his approach to making a compelling horror game, talks about how they achieved Man of Medan’s gorgeous visuals, and elaborates on the benefits of using UE4. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong>Considering Until Dawn had a passionate following, what did you learn from developing that choice-driven horror game that you&#39;re building on for The Dark Pictures Anthology? <br /> <br /> Game Director Tom Heaton: </strong>We learnt so much from making Until Dawn and watching people playing it. One of the key things was how people liked to play the game in groups. 
It was part of the inspiration to bring multiplayer modes to <a href="" target="_blank">The Dark Pictures Anthology</a>. And we loved how people related to the characters and really got into their relationships. We boosted our whole relationship system because of this.<br /> <br /> <strong>Can you tell us more about your vision for The Dark Pictures Anthology and explain where Man of Medan fits in with it?<br /> <br /> Heaton:</strong> The Dark Pictures Anthology is a series of intense, stand-alone cinematic horror games that combine powerful storytelling with film-like graphics to create a terrifying new gaming experience. The anthology format has a great heritage in horror and we thought it would work very well in games, too. It gives us the opportunity to tell new stories using different sub-genres of horror. We actually counted 39 different sub-genres of horror, so we have plenty of ideas to play with!<br /> <br /> Man of Medan is the first game in The Dark Pictures Anthology. Five friends set sail on a holiday diving trip, with a rumored WWII wreck to find and plenty of on-deck partying to be had. Our group gets ready for what should be the dive trip of a lifetime. As the day unfolds, and a storm rolls in, their trip soon changes into something much more sinister and terrifying. Put it this way, things go wrong very quickly!<br /> <img alt="DeveloperInterview_Man_of_Medan_02.jpg" height="auto" src="" width="auto" /><br /> <strong>With numerous dialogue options, how did the studio craft branching pathways that would ensure a cohesive yet unique experience for players? <br /> <br /> Heaton:</strong> It’s a fun challenge for sure! Man of Medan is the most branching game we’ve ever made. We built an in-house tool that allows us to map out the game and track all the branches. We can play the game at a very basic level and see what works straight away. 
With the two-player Shared Story, it’s key that both players enjoy a great story no matter where they are in the game. <br /> <br /> <strong>The Curator’s Cut will allow players to experience the game with new choices from the perspective of different characters. Can you talk about how you implemented that mode?<br /> <br /> Heaton: </strong>The Curator’s Cut allows you to play an alternative version of the game, and is different to, say, a traditional Director’s Cut as it gives the player new ways to directly influence the story and its outcomes. Players will see and change events and relationships from the perspective of characters that weren’t under their control in the first playthrough. It adds a whole new layer to how the player understands and interacts with the story in a way that’s never been done before, and we’re excited for people to experience it. <br /> <img alt="DeveloperInterview_Man_of_Medan_04.jpg" height="auto" src="" width="auto" /><br /> <strong>With 69 different ways for protagonists to die in the game, was replayability a core tenet of the game’s design?<br /> <br /> Heaton:</strong> Absolutely! We know that players love to replay our games to try and save the characters that got killed the first time around. We built the game with replayability in mind and you can have a very different experience over several playthroughs. For instance, we estimate you would need to play the game nine times just to see all the deaths!<br /> <br /> <strong>Man of Medan will offer both two-player online co-op and couch co-op, the latter of which allows up to five people to pass a single controller around to inhabit the game’s five in-game characters. Can you explain the thought process behind implementing these novel modes? <br /> <br /> Heaton:</strong> We wanted to bring something new and compelling to the genre, so Man of Medan – like all the games in the Dark Pictures Anthology – has been designed with multiplayer in mind from the start.
We thought two people playing online in the same story would be a really exciting new way to experience a narrative horror game. Also, we were inspired by how people played Until Dawn socially and that helped us decide to do the pass-the-pad Movie Night mode, too.<br /> <img alt="DeveloperInterview_Man_of_Medan_05.jpg" height="auto" src="" width="auto" /><br /> <strong>With two gamers playing together simultaneously online and going down different paths, how do you keep the narrative consistently engaging for both players while keeping it in sync in real time?<br /> <br /> Heaton:</strong> It’s a challenge, maybe the biggest challenge in designing games of this type. We have a few tricks up our sleeves to keep everything in sync. But the main thing is that we play the game over and over again, from early in development, to make sure that both players have an interesting and engaging experience.<br /> <br /> <strong>In your opinion, what is the formula for making a gripping scary game?<br /> <br /> Heaton:</strong> You have to have a great idea for a core concept and then you need to develop a gripping narrative featuring interesting and distinctive characters that the audience will really care about. There’s no magic button you can press that will give you that, so there’s a lot of hard work and false steps before you finally get there.<br /> <img alt="DeveloperInterview_Man_of_Medan_08.jpg" height="auto" src="" width="auto" /><br /> <strong>With amazing lighting and captivating visual effects, Man of Medan features extremely life-like visuals. The aesthetics are so impressive that we nominated it for our <a href="" target="_blank">E3 Eye Candy award</a>. How did you achieve the game&#39;s high-fidelity graphics?<br /> <br /> Heaton: </strong>Delivering a brilliant cinematic look is really important to us. We want players to feel like they’re playing a horror movie. 
We have brilliant production design and art teams that give us fantastic environments that our players can explore. And lighting is really important to the look we achieve, so we also have dedicated lighting and camera teams. And we use best-in-class facial scanning and performance capture.<br /> <br /> <strong>Considering Man of Medan is a highly cinematic experience with movie-like shots and hand-held camera movements, did <a href="" target="_blank">Sequencer</a> play a big role in the game’s development? <br /> <br /> Heaton: </strong>Yes. Sequencer is a core part of our workflow and lets us integrate elements like <a href="" target="_blank">animation</a>, <a href="" target="_blank">audio</a>, <a href="" target="_blank">lighting</a>, and <a href="" target="_blank">cameras</a>. It allows us to have very fine-grained control over all elements of a shot, and to make director-level edits quickly and safely.<br /> <img alt="DeveloperInterview_Man_of_Medan_10.jpg" height="auto" src="" width="auto" /><br /> <strong>With excellent performances, can you detail how you mo-capped the actors for the game? <br /> <br /> Heaton:</strong> We use top-quality Hollywood talent, so it’s important to us that we capture every nuance of their performance. Shawn Ashmore is excellent as Conrad in Man of Medan. We have created a process that is centered on the actors and gives them the freedom to develop the characters and deliver brilliant performances without us getting in the way. The actors wear lightweight head rigs for the shoot, and they are free to move around and interact with other actors in the scene.<br /> <br /> <strong>Considering Until Dawn used a different game engine, what made UE4 a good fit for Man of Medan?<br /> <br /> Heaton: </strong>Our games use a bespoke version of the Unreal Engine, which is carefully tailored to suit our needs. 
Unreal allows us to make a fantastic-looking product across multiple platforms and we are really happy with the results.<br /> <img alt="DeveloperInterview_Man_of_Medan_13.jpg" height="auto" src="" width="auto" /><br /> <strong>What has been the biggest challenge making the game and how did you overcome it?<br /> <br /> Heaton:</strong> Man of Medan is the most branching game we have ever made. Every time a player does something, the game branches, sometimes in small ways, but sometimes opening up whole new narrative paths or killing off a major character. Keeping track of all that branching is a massive challenge, especially when you throw two-player co-op into the mix. It also means we have to keep track of a large quantity of data. But it’s worth it because players really respond positively to how the game changes depending on what actions they take.<br /> <br /> <strong>Thanks for your time! Where can people learn more about Man of Medan and The Dark Pictures Anthology?<br /> <br /> Heaton:</strong> Thanks! You can find out more about Man of Medan and The Dark Pictures Anthology on our official website or social media.<br /> <br /> The Dark Pictures Anthology: Man of Medan releases August 30, 2019 on PC, PlayStation 4, and Xbox One.<br /> Art | Design | Games | Mocap | Supermassive Games | The Dark Pictures Anthology: Man of Medan | Jimmy Thang | Thu, 29 Aug 2019 10:00:00 GMT Unreal Engine at SIGGRAPH 2019—all recordings now available. From the User Group and our next-gen virtual production demo to the booth presentations, demonstrations, and Tech Talks, there was so much for Unreal Engine users to catch at SIGGRAPH 2019. Couldn’t see everything live?
We have all of the recordings here. At the recent SIGGRAPH 2019 convention in Los Angeles, we were honored to work with an amazing lineup of guest presenters; our in-house team also shared a lot of information on the latest Unreal Engine tools and workflows. Here’s a recap of everything you can see on our <a href="" target="_blank">post-event page</a>. <h2>Unreal Engine User Group</h2> The stage at the Orpheum Theatre was brought to life by presentations from <strong>Matt Workman </strong>from <strong>Cine Tracer</strong>, <strong>Doug Roble </strong>from <strong>Digital Domain</strong>, <strong>Sam Nicholson</strong>, <strong>ASC </strong>from <strong>Stargate Studios</strong>, <strong>Vicki Dobbs Beck </strong>from <strong>ILMxLAB</strong>, <strong>Bei Yang </strong>from <strong>Walt Disney Imagineering</strong>, and <strong>Kaya Jabar </strong>from <strong>The Third Floor</strong>, topped off by a fireside chat with the incredible director, producer, actor, and writer <strong>Jon Favreau</strong>. We’ve already shared the recording, but in case you missed it, there’s another chance to check it out.<br /> <img alt="Siggraph_all_content_blog_body_Disney_img.jpg" height="auto" src="" width="auto" /> <h2>Booth presentations</h2> Presenters from some of the same companies also joined us at our booth, along with additional speakers, to share their latest Unreal Engine projects in a packed three-day schedule. They included <strong>Digital Domain</strong>, <strong>ILMxLAB</strong>, <strong>Framestore</strong>, <strong>Fox VFX Lab</strong>, <strong>Halon Entertainment</strong>, <strong>Cine Tracer</strong>, <strong>Capacity</strong>, <strong>GiantStep</strong>, <strong>HOK</strong>, <strong>Magic Leap</strong>, <strong>Sequin</strong>, and <strong>Quixel</strong>. They kept their audiences engaged with a fascinating array of real-time projects—digital humans, virtual production, next-gen rides, VR experiences, cinematic lighting, and much more.
<br /> <img alt="Siggraph_all_content_blog_body_DD_img.jpg" height="auto" src="" width="auto" /><br /> Also at our booth, we offered a number of demonstrations from our in-house team of experts. Visitors got the lowdown on the <strong>fundamentals of ray tracing</strong>, using <strong>UE4 for design visualization</strong>, <strong>PBR workflows</strong>, <strong>real-time archviz</strong>, using <strong>Sequencer for linear animation</strong>, and creating <strong>visualizations with Twinmotion</strong>.  <h2>Tech Talks</h2> To get a more in-depth understanding of some of our latest Unreal Engine technology, you need look no further than our quartet of Tech Talks. The people behind the technology shared an insider view on new <strong>in-camera VFX</strong>, the evolution of <strong>ray tracing </strong>in UE4, our new <strong>Chaos physics and destruction </strong>system, and the latest <strong>virtual production </strong>tools and workflows. <br /> <img alt="Siggraph_all_content_blog_body_Chaos_img.jpg" height="auto" src="" width="auto" /> <h2>Next-gen virtual production</h2> We’re also able to share a behind-the-scenes look at our demonstration of in-camera VFX for next-gen virtual production, a project we embarked upon with partners <strong>Lux Machina</strong>, <strong>Magnopus</strong>, <strong>Profile Studios</strong>, <strong>Quixel</strong>, <strong>ARRI</strong>, and <strong>Matt Workman</strong>. Attendees got a brief glimpse of this at the User Group.<br /> <br /> Thanks to all of you who did come by and see us. Whether you were able to be among them or not, there’s no reason to miss out on any of the Unreal action. Check out our post-event page for all the recordings.  
<div style="text-align: center;"><a href="" target="_blank"><img alt="see_the_video_button.png" height="auto" src="" width="auto" /></a></div> Enterprise | Film And Television | Virtual Production | SIGGRAPH 2019 | Wed, 28 Aug 2019 15:00:00 GMT Save up to 50% off during the Marketplace 5-year Anniversary Sale. Shop now and celebrate the Marketplace&#39;s 5-year anniversary with up to 50% off products during the sale! It’s officially the 5-year anniversary of the <a href="" target="_blank">Unreal Engine Marketplace</a>!<br /> <br /> Celebrate with us by saving up to 50% on select products. Over 4,000 fantastic products have been discounted, ranging from environments, Blueprints, props, characters, and beyond!<br /> <br /> The sale runs now through September 3 at 11:59 PM EDT. <br /> <br /> Thank you to all the amazing Marketplace creators who have contributed to the Unreal development community over the last five years by offering outstanding content and support. <div style="text-align: center;"><br /> <a href="" target="_blank">Happy shopping!</a></div> Community | News | Marketplace | Amanda Schade | Tue, 27 Aug 2019 13:30:00 GMT Panache Digital Games talks about creating the ambitious epic Ancestors: The Humankind Odyssey. The ambitious open-world game Ancestors: The Humankind Odyssey, which tracks the evolution of human history across eight million years, comes from the creative director who worked on Prince of Persia: The Sands of Time and several Assassin’s Creed games.  With an esteemed background working on <em>Prince of Persia: The Sands of Time</em> and several <em>Assassin’s Creed</em> games, former Ubisoft Creative Director Patrice D&eacute;silets set out to create a new historical epic beginning nine million years ago. To fully execute on his vision, he co-founded Panache Digital Games to create <a href="" target="_blank">Ancestors: The Humankind Odyssey</a>.
As the name suggests, the highly ambitious open-world title charts the evolution of human history across eight million years. <br /> <br /> The studio started out with a handful of people, but has grown to 35 developers who are collectively able to punch above their weight. To see how <a href="" target="_blank">Panache Digital Games</a> is creating a game of AAA quality as a new indie developer, we reached out to D&eacute;silets and Ancestors Development Director Frederic Laporte. The two talk about how they balanced making a game that is historically accurate and fun, share how they unconventionally recreated ancient Africa, and discuss how they designed a game with no predetermined narrative. The duo also elaborates on how they created a gameplay loop that doesn’t hold your hand and provides tips to developers who are thinking of starting their own studio.  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong>With the game charting the course of human evolution across eight million years throughout Africa, Ancestors certainly has a novel premise. How did you come up with the concept and setting for the game? </strong><br /> <br /> <strong>Cofounder and Creative Director Patrice D&eacute;silets:</strong> When we founded Panache, we needed to create a first game that could serve as a toolbox for all future games at Panache and have our three Cs: character, camera, and controls. I also needed to find a game in the same vein as what I had done before, [which is] historical games. And then one night I had this idea: a prehistoric game about our ancestors. We wouldn’t have to create cities, civilizations, or combat systems with swords and stuff. But I also didn’t want to approach the subject like it has always been done. I wanted to go back all the way to the very beginning of the evolution of our kind, 10 million years ago.
The game design followed.<br /> <br /> <strong>Considering the game is built on the core principles that players will need to "explore, expand, and evolve," how did you come up with Ancestors&#39; unique gameplay loop?<br /> <br /> D&eacute;silets:</strong> I always study the subject of my games and it is through the subject that I identified the pillars of the game design. Human evolution was done through exploration and we needed a clan to survive. In order to evolve, you also need to make babies and befriend outsiders to have a stronger clan and pass generations. Survival is obviously an important aspect. We need to eat, drink, and sleep to survive. But if you stay too long in one location, resources will deplete and that’s why you need to explore and expand your territory. You then encounter all sorts of dangers and you learn skills. So, the gameplay loop is based on human nature itself. In the end, the most important thing was to let the player make their own decisions and evolve their own way.<br /> <img alt="DeveloperInterview_Ancestors_05.jpg" height="auto" src="" width="auto" /><br /> <strong>How much historical and scientific research went into the production of the game? <br /> <br /> D&eacute;silets: </strong>The first two years consisted of studying the subject, reading, watching documentaries, and meeting specialists, but then I needed to distance myself from all the scientific facts in order to create a fun game. So there was a lot of research put into the game, but it is not a scientifically accurate game as we took liberties to focus on the pure fun of playing.<br /> <br /> <strong>How did you walk the line between developing a game that would be historically accurate yet fun to play?<br /> <br /> D&eacute;silets:</strong> Being historically accurate is what we tried at first, but we soon realized it was a little boring.
So I decided that we were going to put all the ingredients in the game, right from the start, and let the players make their own choices. So, someone might actually discover some tool way before what science tells us. But that’s okay! [That just means players] were curious and intelligent enough to make this new discovery. <br /> <img alt="DeveloperInterview_Ancestors_01.jpg" height="auto" src="" width="auto" /><br /> <strong>With dangerous predators to fend off, including crocodiles, sabertooth tigers, and giant centipedes, how did you decide which creatures to design and include in the game?<br /> <br /> D&eacute;silets:</strong> We put some classic predators in there such as the sabertooth, snakes, etc. Then, through research, we learned about some animals and predators that we had never heard of before so we included some of them. But again, we took some liberties. And what is important to remember is that the world hasn’t changed that much since then. 10 million years might seem like a lot, but regarding Earth’s story, it’s quite short. <br /> <br /> On another note, what we found most interesting is that many plants we thought were indigenous African plants weren’t! <br /> <br /> <strong>The studio has been upfront about Ancestors being a survival game that doesn&#39;t hold players&#39; hands. It doesn&#39;t feature an inventory system or minimap, and even allows users to turn off the in-game HUD. What was the studio&#39;s reasoning behind this approach?<br /> <br /> Development Director Frederic Laporte:</strong> Immersion was the key here. Spending time in menus reminds the player he’s playing a game and our goal was to get players lost in Africa 10 million years ago. This philosophy pushed us to make the game understandable without any in-game HUD by providing visual and/or audio feedback for everything that usually requires HUD feedback. Playing with the HUD is considered a bonus, not a necessity.
<br /> <br /> <strong>D&eacute;silets:</strong> On top of that, it was about real life. Our ancestors, those hominids, did not have maps or anyone to show them the way. They did it on their own, using their instinct. And that’s why I’m asking players, “Hey, you Homo Sapiens, can you survive like our ancestors did?”<br /> <br /> <strong>The studio has mentioned that the game doesn&#39;t have a predetermined narrative and that "players&#39; own curiosity will drive the show." Was this philosophy challenging to design around? <br /> <br /> D&eacute;silets:</strong> Yes and no. Obviously, we always think of the players when designing. We always ask ourselves, “Will players be curious enough for that?” However, there are different types of players, so we had to make sure to find the right balance. Some might be overwhelmed with the freedom at first, but once you get the hang of it, it’s pretty awesome.<br /> <br /> <strong>With a motto that asserts "extinction is the norm, survival is the exception" coupled with all of the deadly creatures and environmental hazards that the game has to offer, how do you set about balancing the game&#39;s difficulty?<br /> <br /> D&eacute;silets:</strong> Once again, it’s all about the players and how they will play the game. We have to make sure to balance for one way of playing, but keep in mind that all players will play differently. People will get better. You can actually get good at it! The more you play, the better you will get. And not having a narrative makes the replayability infinite. <br /> <img alt="DeveloperInterview_Ancestors_06.jpg" height="auto" src="" width="auto" /><br /> <strong>With vast and realistic-looking jungles, plains, and lakes, the game&#39;s various biomes look like they were crafted with care and attention to detail. 
How did you set about recreating ancient Africa?<br /> <br /> Laporte:</strong> We first spent a lot of time doing research and built ourselves a huge reference library to ensure we could get as authentic as possible and all have a common vision. That being said, when building virtual worlds, devs usually start with a set of character metrics and the environment is first grey-boxed by level designers and later covered up with art. We didn’t do that. We started with art right away. This forced us to adapt what our character could do to the environment and not the other way around. This also generated gameplay we didn’t initially anticipate, which was a nice bonus!<br /> <br /> <strong>The concept of a skill tree seems like a perfect fit for Ancestors. Can you elaborate on how you’re designing a system that allows players to evolve throughout the course of the game?<br /> <br /> D&eacute;silets:</strong> This is the most scientific aspect of the game I would say. We started by deconstructing what makes us hominids and separated those elements across four categories: sense, intelligence, communication, and reflexes. So we track how players play and depending on their actions, they unlock new abilities. <br /> <br /> Neuronal energy is accumulated through babies, so the number of babies in your clan will have an effect on the speed at which you will evolve. That neuronal energy can then be used to lock abilities when passing a generation. <br /> <br /> <strong>How many developers worked on Ancestors and how long did it take? <br /> <br /> Laporte:</strong> Ancestors started with a handful of people and is now, four years later, 35 strong.<br /> <br /> <strong>D&eacute;silets:</strong> And I’m blown away at the scale of what we created with such a small team. I’m so proud!<br /> <img alt="DeveloperInterview_Ancestors_02.jpg" height="auto" src="" width="auto" /><br /> <strong>What has been the biggest challenge making the game and how did the studio overcome it?
<br /> <br /> D&eacute;silets:</strong> The biggest challenge was to create an open world that makes sense, with characters that don’t speak. The subject matter of our game made it impossible to make a conventional game, however, we managed to make a game that is fun, unique, and mind-blowing. Unreal’s tech was a great asset in creating with a small team.<br /> <br /> <strong>Considering Ancestors will be the first title from Panache Digital Games, how has development been different as an indie studio?<br /> <br /> Laporte:</strong> Big teams obviously have advantages, but being agile is not one of them. By having a small team, we have been able to quickly change direction whenever it was needed. Knowing everyone around you also makes interdisciplinary work easier as you’re always aware of who does what.<br /> <br /> Being smaller also forces everyone to wear multiple hats and step out of their comfort zone. While it’s sometimes uncomfortable, it’s also super exciting to do more than what’s in their job description. In the end, it makes everyone more versatile.<br /> <br /> <strong>D&eacute;silets:</strong> At Panache, we also decided to get rid of all the middle management stuff. This allows everyone to concentrate on their craft and to create the game. <br /> <img alt="DeveloperInterview_Ancestors_04.jpg" height="auto" src="" width="auto" /><br /> <strong>As an independent studio working on their first game, can you provide any tips to developers that are thinking about starting their own team?<br /> <br /> D&eacute;silets:</strong> Don’t be afraid to be bold. And stay passionate about your game! That might sound clich&eacute;, but it really is the key. Also keep in mind that your initial vision might change and that’s okay. You should embrace change. <br /> <br /> <strong>What made UE4 a good fit for the game?<br /> <br /> Laporte:</strong> Being a small studio means we obviously have limited resources and UE4 provided us a solid tool base and rendering engine. 
This allowed us to focus mostly on gameplay and less on non-game-related issues. Plus, it allowed us to test our ideas from the very beginning and make design decisions accordingly.<br /> <br /> <strong>Does the studio have any favorite UE4 tools or features?<br /> <br /> Laporte:</strong> The <a href="" target="_blank">Blueprints</a> system empowers people to prototype ideas and thus quickly validate or invalidate game concepts. While we can’t often ship those prototypes as is, they still provide us with a lot of freedom and save us engineering time.<br /> <br /> <strong>Thanks for your time. Where can people learn more about the game?</strong><br /> Ancestors | The Humankind Odyssey | Panache Digital Games | Patrice Désilets | Jimmy Thang | Mon, 26 Aug 2019 16:00:00 GMT