<h1><strong>Unreal Engine 4.20 Released!</strong></h1> <h1><strong>What’s New</strong></h1> <p>Unreal Engine 4.20 delivers on our promises to give developers the scalable tools they need to succeed. Create a future-focused mobile game, explore the impact of Niagara, breathe life into compelling, believable digital humans, and take advantage of workflow optimizations on all platforms.</p> <p>You can now <strong>build life-like digital characters and believable worlds</strong> with unparalleled realism. Take your visual effects to the next level with Unreal Engine’s new Niagara particle editor to add amazing detail to all aspects of your project. Use the new Digital Humans technology powering the “Meet Mike” and “Siren” demos to raise the bar on realism. With the new Cinematic Depth of Field, you can achieve cinema quality camera effects in real-time.</p> <p>Unreal Engine empowers you to <strong>make things your way</strong> by giving you the tools to customize the creation process to your preferred style and workflow. With the new Editor Scripting and Automation Libraries, you can create completely customized tools and workflows. Make the lives of designers and artists easier by adding new actions to apply to Actors or assets thanks to scripted extensions for Actor and Content Browser context menus.</p> <p>Battle-tested mobile and console support means you can <strong>create once and play on any device</strong> to deliver experiences anywhere users want to enjoy them. Epic has rallied around the mobile release of Fortnite to optimize Unreal Engine for mobile game development. We have made tons of performance improvements, including implementing both hardware and software occlusion queries to limit the amount of work the hardware needs to do. Proxy LOD is now production-ready and can further reduce the complexity of the geometry that needs to be rendered at any time.</p> <p>In addition to all of the updates from Epic, this release includes <strong>165 improvements submitted by the incredible community of Unreal Engine developers on GitHub</strong>! Thanks to each of these contributors to Unreal Engine 4.20:</p> Adam Moss (adamnv), Akihiro Kayama (kayama-shift), Alan Edwardes (alanedwardes), Alan Liu (PicaroonX), Andrew (XenonicDev), Andrew Haselgrove (Dimpl), Anton Rassadin (Antonrr), arkiruthis, Begounet, Brandon Wilson (Brandon-Wilson), c4tnt, Changmin (cmheo), Christian Loock (Brainshack), Clinton Freeman (freemancw), Daniel Assuncao (dani9bma), David Payne (dwrpayne), Deep Silver Dambuster Studios (DSDambuster), Derek van Vliet (derekvanvliet), Eduard Gelbling (NachtMahr87), frankie-dipietro-epic, Gautier Bo&euml;da (Goutye), George Erfesoglou (nonlin), Giovanny Guti&eacute;rrez (bakjos), Gregor Gullwi (ggsharkmob), Hannah Gamiel (hgamiel), Hyuk Kim (Hybrid0), Ibraheem Alhashim (ialhashim), Ilya (ill), Jacob Nelson (JacobNelsonGames), Jaden Evanger (cyberblaststudios), Jared Taylor (Vaei), Jesse Yeh (jesseyeh), Jia Li (shrimpy56), J&oslash;rgen P. Tjern&oslash; (jorgenpt), June Rhodes (hach-que), Junichi Kimura (junkimu), Kalle H&auml;m&auml;l&auml;inen (kallehamalainen), kinolaev, Kory Postma (korypostma), krill-o-tron, Kryofenix, Lallapallooza, Layla (aylaylay), Lee Berger (IntegralLee), Leon Rosengarten (lion03), Lirrec, malavon, Marat Radchenko (slonopotamus), Marat Yakupov (moadib), Mathias L.
Baumann (Marenz), Matt Hoffman (LordNed), Matthew Davey (reapazor), Maxime Turmel (maxtunel), Michael Allar (Allar), Michael K&ouml;sel (TheCodez), Michael Puskas (Mmpuskas), Mikayla Hutchinson (mhutch), mimattr, Mitsuhiro Koga (shiena), Muhammad A.Moniem (mamoniem), nakapon, Nicolas Lebedenco (nlebedenco), Paul Eremeeff (PaulEremeeff), Phillip Baxter (PhilBax), projectgheist, Rama (EverNewJoy), redfeatherplusplus, Rei-halycon, Robert Khalikov (nbjk667), Roman Chehowski (RChehowski), S-Marais, Sam Bonifacio (Acren), Satheesh (ryanjon2040), Scott Freeman (gsfreema), SculptrVR, Sebastian Aaltonen, S&eacute;bastien Rombauts (SRombauts), Seokmin Hong (SeokminHong), Serta&ccedil; Ogan (SertacOgan), stephenwhittle, Temaran, Thomas Miller (tmiv), Trond Abusdal (trond), TWIDan, Tyler (tstaples), Usagi Ito (usagi), yama2akira, Yang Xiangyun (pdlogingithub), yehaike, Zachary Burke (error454) <h1><strong>Major Features</strong></h1> <h2><strong>New: Optimizations and Improvements for Shipping on Mobile Platforms</strong></h2> <p><strong>Unreal Engine 4.20 brings well over 100 mobile optimizations developed for Fortnite on iOS and Android</strong>, marking a major shift in developers’ ability to more easily ship games and seamlessly optimize gameplay across platforms. Major enhancements include improved Android debugging, mobile landscape improvements, and occlusion queries on mobile.</p> <div style="text-align: center;"> <img height="336" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/iPhoneXFNBRMobile_04.jpg?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=d640f4l8sFAkjXxA%2FDimYKaZdvN1GRKA80Of%2FytwgzA%3D" style="border: none; transform: rotate(0rad);" width="670" /></div> <p><strong>Hardware and Software Occlusion Queries on Mobile</strong></p> <p>Hardware Occlusion Queries are now supported on high-end iOS and Android devices that support ES 3.1 or Vulkan, using the GPU. They are enabled by default for any device that supports them.</p> <p>Software Occlusion Queries are an experimental feature that uses the CPU to cull primitive components from the scene. Because it takes a conservative approach, it can be used on any mobile device.</p>
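<p>If you want these settings applied at startup rather than typed into the console, the same console variables can be set from config. Below is a minimal sketch for a project&#39;s DefaultEngine.ini (the [SystemSettings] section applies console variables at launch); it mirrors the first two of the three steps listed below, so verify the values against your own project setup.</p> <pre><code>; Hedged sketch - mirrors the Software Occlusion Queries steps listed below.
; Verify the section and variable behavior against your engine version.
[SystemSettings]
r.Mobile.AllowSoftwareOcclusion=1
r.AllowOcclusionQueries=0
; Optional, for debugging in the Mobile Previewer:
; r.SO.VisualizeBuffer=1
</code></pre>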
<table align="center" cellpadding="1" cellspacing="1"> <tbody> <tr> <td><img src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/OcclusionCulling1_01.png?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=Kd352%2BuBESHX7PmYMrJGomX0snQM4BctsfiIc6wsdYw%3D" style="border: none; transform: rotate(0rad); width: 375px; height: 211px;" /> </td> <td><img src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/OcclusionCulling2_01.png?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=Yqet4w7nMup5WuUmKx7pUpJYB33ZF6pt%2FjCk6HAbtI0%3D" style="border: none; transform: rotate(0rad); width: 375px; height: 211px;" /> </td> </tr> </tbody> </table> <p style="text-align: center;"><em>Left - r.Mobile.AllowSoftwareOcclusion 1, r.SO.VisualizeBuffer 1; Right - Render frozen showing occluded parts</em></p> <p>To enable Software Occlusion Queries, follow these steps:</p> <ol> <li>Set r.Mobile.AllowSoftwareOcclusion to 1.</li> <li>Set r.AllowOcclusionQueries to 0.</li> <li>Enable any primitive to act as an occluder by setting <strong>LOD for Occluder Mesh</strong> to true in the Static Mesh Editor.</li> </ol> <p>You can visualize the results in the <a href="https://docs.unrealengine.com/en-us/Platforms/Mobile/Previewer" target="_blank">Mobile Previewer</a> when using <strong>High-End Mobile</strong> by then enabling r.SO.VisualizeBuffer 1.</p> <p><strong>Platform Material Stats</strong></p> <p><strong>Quickly profile and optimize your Materials</strong> using the new Platform Stats window inside the Material Editor! You can now see stats for multiple shader platforms and quality levels. For mobile platforms, we use an offline shader compiler to give more accurate instruction and Texture usage information.</p> <p style="text-align: center;"><img height="479" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/PlatformStats_01.png?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=0TbzDOLp7tzA0vhFpfPzbxqgGa0MijoBIytT6zreyfs%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="931" /></p> <p><strong>Improved Android Debugging</strong></p> <p><strong>Iterate and debug on Android without having to repackage the UE4 project!</strong> When compiling for Android, we now generate a Gradle project file which can be opened in Android Studio. You can place breakpoints in C++ and Java code and use Android Studio to launch a debug session. You can also make changes to C++ source code and recompile. If you start a new debug session, Android Studio will notice the change and quickly upload the new shared library to your device.</p> <p><strong>Mobile Landscape Improvements</strong></p> <p>Make your terrains on mobile more interesting now that you can have <strong>unlimited Landscape Material layers on mobile devices</strong>!
While three layers remains the best-optimized case, any number of Landscape layers is now supported, provided sufficient Texture Samplers are available.</p> <p>You can now use the Feature Level Switch Material nodes in Landscape Materials, enabling you to <strong>create a single Landscape Material for all platforms</strong>.</p> <p style="text-align: center;"><img height="354" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/MobileLandscape_01.gif?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=Jl5EwXNq%2BSaAR%2BMaRfqAhzLPbdD7POnGIvr0GCLKzz4%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="654" /></p> <p style="text-align: center;"><em>1 - Mobile Landscape; 2 - PC Landscape</em></p> <p id="mobileImprovements"><strong>Miscellaneous Mobile Improvements</strong></p> <p>The following improvements were made to ship Fortnite on mobile and brought into Unreal Engine 4.20 to benefit all developers:</p> <ul> <li>Minimum Static Mesh LOD per platform</li> <li>Minimum Skeletal Mesh LOD per platform</li> <li>Hardware occlusion improvements</li> <li>HLOD tools and workflow optimizations</li> <li>Audio quality node</li> <li>Audio variation culling</li> <li>Audio downsampling per platform</li> <li>Audio compression quality per platform</li> <li>Shading model tweaks to better match PC</li> <li>Reflection capture brightness fix</li> <li>Landscape support for four layers</li> <li>Landscape tessellation improvements</li> <li>No memory cost for unused LODs, including: <ul> <li>Static Meshes</li> <li>Skeletal Meshes</li> <li>Material quality levels</li> <li>Grass and foliage</li> <li>High detail components and meshes</li> <li>High detail emitters in Cascade</li> </ul> </li> <li>Settings based on device memory</li> <li>Material memory reduction</li> <li>Editor scriptability for bulk asset changes</li> <li>Particle component pooling</li> <li>Material parameter collection update cost</li> </ul> <h2><strong>New: Optimizations and Improvements for Shipping on Nintendo Switch</strong></h2> <p>We have significantly improved Nintendo Switch development by releasing <strong>tons of performance and memory improvements built for Fortnite on Nintendo Switch to all Unreal Engine developers</strong>!</p> <p style="text-align: center;"><img height="524" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/BR04_Social_SwitchAnnounce01.jpg?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=WtB%2FnttakcEqyGuEZjU2HOJSPQXTkyJlPYeBIo5hX7E%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="931" /></p> <p>This includes the following:</p> <ul> <li>Support for Dynamic Resolution and Temporal Upsampling</li> <li>Low Latency Frame Syncing for Controller Input</li> <li>Significant CPU Rendering Optimizations</li> <li>Improvements to Threading</li> <li>Better Texture Compression</li> <li>Support for Memory Profiling</li> <li>Backbuffer support for 1080p while in docked mode</li> <li>And many other fixes!</li> </ul> <h2><strong>New: Proxy LOD Improvements</strong></h2> <p>The new <strong>Proxy LOD tool</strong> has graduated from “<strong>Experimental</strong>” to <strong>production-ready</strong>! This tool provides performance advantages by reducing rendering cost due to poly count, draw calls, and material complexity, which results in <strong>significant gains when developing for mobile and console platforms</strong>.
This tool provides an alternative to the third-party package Simplygon and can be used in conjunction with the <strong>Level of Detail (LOD)</strong> systems in Unreal Engine.</p> <p>The Proxy LOD tool produces a simpler representation by creating a proxy in the form of a single low-poly parameterized mesh and associated textures that visually approximate a collection of more complex source geometry models. This proxy can then be displayed at runtime when a reduction in model quality is acceptable - for example, when geometry only occupies a small number of pixels on screen.</p> <p><strong>Note:</strong> The Proxy LOD tool is currently only available in Unreal Editor on Windows.</p> <p style="text-align: center;"><img height="524" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/ProxyLOD_OverviewShot_01.jpg?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=WOn01dIaW%2BFXQQ5b%2FOL%2BJuIp6rc8W8m7tbPv3msNDmM%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="931" /></p> <p style="text-align: center;"><em>The above image shows the buildings and parking lots in <strong>Fortnite Battle Royale</strong> constructed using the <strong>Proxy LOD tool</strong>, where both <strong>Gap-Filling</strong> and <strong>Hard-Edge Splitting</strong> were in use.</em></p> <p>The production-ready version of the Proxy LOD tool has several enhancements over the Experimental version found in 4.19. In particular, it offers improved user control over the Normals on the Proxy Geometry and the ability to generate much simpler proxies by using gap-filling to automatically close doors and windows.</p> <p><strong>Improved Normal Control: Hard-Edge Split Normal</strong></p> <p>The extreme constraints on Fortnite memory usage call for highly efficient use of LODs. For most proxies, very small base color textures are generated and no Normal map is used; this approach requires the highest possible quality Normals on the proxy mesh itself.</p> <p style="text-align: center;"><img height="529" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/HardEdge_01.gif?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=ltoYb3mL8Sf0x3itYzJ4HELiTbkHLyMDl%2ByM26hGoig%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="582" /></p> <p style="text-align: center;"><em>1 - Hard Edge Angle = 80; 2 - Hard Edge Angle = 0</em></p> <p>The above gif shows the effect of hard-edge splitting for vertex normals. Image 2 shows smooth vertex normals, as calculated in the 4.19 Experimental version of the Plugin - the dark regions near the bottom of the house are indicative of the shortcomings. Compare this with image 1, which shows hard-edge vertex normal splitting with a user-supplied hard-edge cutoff angle.</p> <p>In addition to the hard-edge cutoff angle, the user may now specify the method used in computing the vertex normal by selecting between <em><strong>Angle Weighted</strong></em>, <em><strong>Area Weighted</strong></em>, and <em><strong>Equal Weighted</strong></em>.</p> <p><strong>Gap Filling</strong></p> <p>For watertight geometry, the Proxy system automatically discards any inaccessible structures (for example, interior walls or furniture within a closed house). For ideal results, source geometry should be constructed or altered with this in mind, but due to game production constraints that isn’t always feasible.
To facilitate the generation of efficient Proxy LODs from source geometry that is <em>nearly watertight</em>, the Proxy LOD tool can optionally use the level set-based techniques of dilation and erosion to close gaps. The intended use case is primarily doors and windows in distant buildings.</p> <p style="text-align: center;"><img height="496" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/GapFill_01.gif?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=8V3ZtrpM2MNzGP6VyZZwPHf0STLjBt2GvUNMJGsI2Ck%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="471" /></p> <p style="text-align: center;"><em>1 - Original Mesh; 2 - No Gap Filling; 3 - Gap Filling</em></p> <p>The above gif shows the effect of using Gap Filling. All images were constrained to use a fixed, small amount of texture space. Image 2 is the result of Proxy LOD on a building without using Gap Filling, in which case the LOD includes the interior of the building (at the cost of unseen triangles and texels). Image 3 is the same building with Gap Filling used to automatically close the doors and windows of the building, resulting in fewer total triangles and better use of the limited texture resource.</p> <h2><strong>New: Cinematic Depth of Field</strong></h2> <p>The new <strong>Cinematic Depth of Field (DoF)</strong> enables you to achieve your vision of rendering <strong>cinema quality scenes in a real-time environment</strong>! This new method is designed as a higher-quality replacement for the Circle DoF method and is faster than most other DoF methods, such as Bokeh. With Cinematic DoF, the depth of field effect is cleaner, providing a cinematic appearance with the use of a procedural Bokeh simulation. This new DoF implementation also supports the alpha channel, remains stable under dynamic resolution, and includes settings to scale it down for console projects.</p> <p style="text-align: center;"><img height="524" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/StarWarsCineDOF_01.png?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=MzoPtr20f9fBihocKfXqF1kQN5Z982MB0Ia11bccej0%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="931" /></p> <table align="center" border="0" cellpadding="10" cellspacing="0"> <tbody> <tr> <td><img src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/GroundGame_withDOF.png?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=jnLR0H6KH2c32yQWXnoWjIvqd6TPdwBlDe1L192raXY%3D" style="border: none; transform: rotate(0rad); width: 375px; height: 211px;" /> </td> <td><img src="https://lh6.googleusercontent.com/G6xzzW1cdHXG0PUpVdO03U8W45H14MWnZCIcziGh8F3wgKSEgQ3T5p90iFXmrxztu_-lrqD1qt-kYkqkl9aliqkVX6k7PV2s78xub4zvANNZf25-00G31GtbkFj4Dk2xCEYeybvI" style="border: none; transform: rotate(0rad); width: 375px; height: 211px;" /></td> </tr> </tbody> </table> <p style="text-align: center;"><em>1 - Cinematic Depth of Field enabled; 2 - Depth of Field disabled</em></p> <p>Cinematic Depth of Field is enabled by default and replaces the current selection for the <strong>Circle DoF</strong> method in the <strong>Camera</strong> and <strong>Post Process</strong> settings.</p> <ul> <li>Cinematic DoF supports the following Platforms: <ul> <li>D3D11 SM5, D3D12 SM5, Vulkan SM5, PlayStation 4, Xbox One, and Mac.</li> </ul> </li> <li>The procedural Bokeh simulation supports the following features: <ul> <li>Configuring the number of blades for the
Diaphragm.</li> <li>Configuring the curvature of the blades directly with the Lens’ largest aperture (Minimal F-stop).</li> <li>Configurable controls available in the <strong>Camera</strong> settings of the Post Process Volume, Camera Actor, and Cine Camera Actor.</li> </ul> </li> <li>Many customizable scalability settings are available through r.DOF.* console variables, letting you scale the effect according to your project’s needs on hardware with finite resources.</li> </ul> <p>For additional information, please see the <a href="https://docs.unrealengine.com/en-us/Engine/Rendering/PostProcessEffects/DepthOfField/CinematicDOFMethods" target="_blank">Depth of Field</a> documentation.</p>
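<p>As a rough illustration of driving these settings from code, the sketch below configures the procedural Bokeh simulation on a camera&#39;s post-process settings in C++. The FPostProcessSettings field names used here (DepthOfFieldFocalDistance, DepthOfFieldFstop, DepthOfFieldMinFstop, DepthOfFieldBladeCount) are assumptions inferred from the settings described above, so verify them against your engine version.</p> <pre><code>// Minimal sketch: Cinematic DoF via a camera's post process settings.
// Field names are assumptions inferred from the settings listed above.
#include "Camera/CameraComponent.h"

void SetupCinematicDoF(UCameraComponent* Camera)
{
    FPostProcessSettings&amp; PP = Camera-&gt;PostProcessSettings;

    PP.bOverride_DepthOfFieldFocalDistance = true;
    PP.DepthOfFieldFocalDistance = 450.0f;  // distance to the focus plane

    PP.bOverride_DepthOfFieldFstop = true;
    PP.DepthOfFieldFstop = 2.8f;            // current aperture

    PP.bOverride_DepthOfFieldMinFstop = true;
    PP.DepthOfFieldMinFstop = 1.2f;         // lens' largest aperture; shapes blade curvature

    PP.bOverride_DepthOfFieldBladeCount = true;
    PP.DepthOfFieldBladeCount = 5;          // number of diaphragm blades
}
</code></pre> <p>The same values are exposed in the <strong>Camera</strong> settings of the Post Process Volume, Camera Actor, and Cine Camera Actor, so code like this is only needed when you want to drive them dynamically.</p>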
<h2><strong>New: Niagara Visual Effects Editor (Early Access)</strong></h2> <p>The <strong>Niagara visual effects (VFX) Editor</strong> is now available as an <strong>Early Access</strong> plugin! Try out an early access version of the all-new visual effects tool that will eventually replace Unreal Cascade. <a href="https://youtu.be/mNPYdfRVPtM" target="_blank">Watch this GDC talk</a> for a deeper dive on the vision for Niagara.</p> <p><strong>Note:</strong> The early access nature of this feature means that we are far enough along in development that we want to share it with our customers and get as much feedback as possible before it becomes a standard UE4 feature. Early Access <strong>does not</strong> mean that Niagara is production-ready; we still have quite a bit of performance optimization and bug fixing to do before you can consider using this tool for production. However, we hope that effects developers begin investing in learning Niagara and work with us to make it the best VFX editor that it can be.</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/s3qHFgP8t_E" width="100%"></iframe></div> <p>For an overview of Niagara, please watch the GDC 2018 presentation <a href="https://www.unrealengine.com/en-US/events/gdc2018/programmable-vfx-with-unreal-engine-s-niagara" target="_blank">Programmable VFX with Unreal Engine’s Niagara</a> and read the <a href="https://docs.unrealengine.com/en-us/Engine/Niagara" target="_blank">Niagara</a> documentation.</p> <p><strong>Improvements to Effect Design and Creation</strong></p> <p style="text-align: center;"><img height="216" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/Niagara_Improvements_01.png?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=te73x4bv9oKsotnRUMTJm4Os6lRBe7t3DV7OU0kusPc%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="931" /><br /> <em>Left - Particle system utilizing dynamic input module; Right - Dynamic input module</em></p> <ul> <li><strong>Skeletal Meshes</strong> can specify their emission from the surface, driven by either Material name or a named bone influence region.</li> <li>Specifying default values in Modules has been improved, allowing a wide variety of behaviors from calling functions to using default dynamic inputs.</li> <li>Mesh particles now support Angular Velocity.</li> <li>Beams support has been added to the Ribbon renderer with new corresponding Modules.</li> <li>Dependencies between Modules can now be defined, enabling the user to be informed when they are putting the stack in a bad configuration; users are also given options to <strong>auto-fix</strong>.</li> <li>Multiple improvements have been made to merging System Emitters and Base Emitters, enhancing overall stability.</li> <li>Modules can now be moved up and down the stack via drag-and-drop. Inherited Modules cannot be moved, because doing so complicates merging.</li> <li>Modules can now be enabled/disabled within the stack. This also works with inheritance.</li> <li><strong>Sequencer</strong> and <strong>Blueprint</strong> support for setting Niagara User Namespace variables has been added.</li> <li>You can drive parameters by custom <strong>HLSL Expressions</strong>, <strong>Dynamic Inputs (graph snippets)</strong>, links to other <strong>variables</strong>, or <strong>by value</strong>.</li> <li>Optionally, particles can now have a <strong>Persistent ID</strong>, which is guaranteed to be unique for that emitter.</li> <li>Multiple renderers of each type can be applied to an emitter. Each instance can adjust where it gets the values for a given parameter. For example, an emitter could have two sprite renderers, one pulling its position from a particle’s position and the other pulling its position from a particle’s offset position.</li> <li>The <strong>Niagara Extras Plugin</strong> also contains a debug Material that routes various per-particle parameters to a dialog-like display.</li> <li>Houdini has provided a simple CSV importer to Niagara, enabling demo content for GDC 2018.</li> <li>A wide variety of functionality for Niagara has been added under the <strong>Automated Testing</strong> system.</li> </ul> <p><strong>Updated User Interface</strong></p> <p>The Niagara interface has been designed to make complex effects intuitive to create. It uses a stack metaphor as its primary method of combining pieces of script logic together. Inside the stack, you will find a <strong>Timeline</strong> to control aspects of the effect over time, a <strong>Parameters Panel</strong> for easy access to variables available in the effect, and an <strong>Attribute Spreadsheet</strong> to quickly find and react to information as the effect is running.</p> <p style="text-align: center;"><img height="522" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/NiagaraUI_01.png?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=00iF04lvefOqHT%2Bffa41kKsF12hPJ12AVy1h5ztV8u4%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="929" /></p> <p><strong>New Modules</strong></p> <p>All of Niagara’s Modules have been updated or rewritten to support commonly used behaviors in building effects for games and to adhere to a consistent set of coding standards. New UI features have also been added for the Niagara stack that mimic the options developers have with UProperties in C++, enabling inline enable/disable or variable display based on the state of another variable.</p> <p><strong>GPU Simulation</strong></p> <p>Niagara now has support for <strong>GPU</strong> Simulation when used on <strong>DX11</strong>, <strong>PS4</strong>, <strong>Xbox One</strong>, <strong>OpenGL (ES3.1)</strong>, and <strong>Metal</strong> platforms. There are plans for <strong>Vulkan</strong> and <strong>Switch</strong> to support GPU Simulation in a future release.
Current limitations and known issues with GPU simulation are described below:</p> <ul> <li>Full support for Niagara requires the ability to read back data from the GPU. Currently, only our <strong>DX11</strong> and <strong>PS4</strong> rendering interfaces support this functionality; <strong>OpenGL</strong> and <strong>Metal</strong> support is in progress.</li> <li><strong>Collision</strong>, <strong>Curves</strong>, and <strong>Curl Noise Fields</strong> are supported on the GPU. <strong>Meshes</strong>, <strong>Skinned Meshes</strong>, <strong>Spline Components</strong>, and more specialized data interfaces are not yet supported. The API for GPU shaders to interact with <strong>UNiagaraDataInterfaces</strong> has been redesigned as well.</li> <li><strong>Sprite</strong> and <strong>Instanced Static Mesh</strong> rendering from particles is supported on GPU simulations. At this time, <strong>Light Generation</strong> from Particles and Ribbons from Particles do not work on the GPU.</li> <li>Events only work on the CPU and will be undergoing significant changes after Unreal Engine 4.20.</li> </ul> <p><strong>CPU Simulation &amp; Compilation</strong></p> <p>Niagara <strong>CPU</strong> Simulation now works on <strong>PC</strong>, <strong>PS4</strong>, <strong>Xbox One</strong>, <strong>OpenGL (ES3.1)</strong>, and <strong>Metal</strong>. At this time, <strong>Vulkan</strong> and <strong>Switch</strong> are <strong>not</strong> supported.</p> <ul> <li>The CPU virtual machine (VM) now compiles its contents to the DDC on a background thread, significantly improving overall compilation speed and team efficiency. Further work is required to make the final and expensive VM optimization step occur in ShaderCompileWorker, because it depends on non-thread-safe libraries. Compilation dependencies are properly tracked across Modules, clearly identifying when we need to recompile certain parts of the stack.</li> <li>Physics simulation on the CPU should properly model the <strong>Physics Material</strong> values for friction and restitution (bounciness).</li> <li>Emitters will now simulate in parallel on worker threads.</li> </ul> <h2><strong>New: Digital Human Improvements</strong></h2> <p>As part of Epic’s character explorations to develop Digital Humans that started with the <a href="https://docs.unrealengine.com/en-us/Resources/Showcases/PhotorealisticCharacter" target="_blank">Photorealistic Character</a> bust, many rendering improvements have been made to develop realistic, believable characters that come to life.</p> <p style="text-align: center;"><img alt="MeetMike_Image2.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-4-20-released%2FMeetMike_Image2-1920x1080-eb35c371772fe8455f48cfce50dc42db307cadf1.jpg" width="100%" /></p> <p>While developing these characters, the following rendering improvements have been made for Skin, Eyes, Lighting, and Subsurface Scattering:</p>
<ul> <li>Added a new Specular model with the Double Beckman Dual Lobe method.</li> <li>Light Transmission using Backscatter for Subsurface Profiles.</li> <li>Better contact shadowing for Subsurface Scattering with Boundary Bleed Color.</li> <li>Short Distance Dynamic Global Illumination through Post Process Materials.</li> <li>Added detail for eyes using a separate normal map for the Iris.</li> </ul> <p>For additional information, see <a href="https://docs.unrealengine.com/en-us/Resources/Showcases/DigitalHumans" target="_blank">Digital Humans</a>.</p> <h2><strong>New: Rectangular Area Lights</strong></h2> <p>Rectangular Area Lights enable you to create more realistic lighting for environments containing large light sources, such as fluorescent overhead lights, televisions, lit signs, and more! Rectangular Area Lights are accessible from the Modes panel along with the other light types.</p> <p style="text-align: center;"><img height="517" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/AreaLight_01.jpg?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=mY1lHCAqnQz9YGmOJG3arvfviWraQ9YQmjPISJsgFl8%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="926" /></p> <ul> <li>Currently only supported by the Deferred Renderer.</li> <li>Acts mostly like a Point Light, except it has Source Width and Height settings to control the area emitting light.</li> <li>With Static and Stationary mobility, shadowing treats the light as a true area source; dynamic shadowing for Movable lights currently behaves more like a point light with no area.</li> </ul> <p>Performance Considerations:</p> <ul> <li>Rectangular Area Lights are more expensive overall than Point or Spot Lights, with the dominant cost incurred when the light is Movable and casting shadows; shadowing itself generally costs about the same as for those light types.</li> <li>Lights with Stationary mobility or lights that do not cast shadows can be up to double the cost of their Point or Spot equivalents, with the cost scaling depending on the platform. Static Lights are baked, so they remain free at runtime.</li> </ul>
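<p>For reference, here is a small C++ sketch of configuring one of these lights at runtime. It assumes the URectLightComponent class and its SetSourceWidth/SetSourceHeight setters (names inferred from the Source Width and Height settings above), so treat it as illustrative rather than definitive.</p> <pre><code>// Illustrative sketch only; setter names are assumptions inferred from
// the Source Width / Source Height settings described above.
#include "Components/RectLightComponent.h"

void ConfigurePanelLight(URectLightComponent* Panel)
{
    Panel-&gt;SetIntensity(5000.0f);   // inherited from the light component base class
    Panel-&gt;SetSourceWidth(128.0f);  // width of the emitting area, in world units
    Panel-&gt;SetSourceHeight(64.0f);  // height of the emitting area

    // Stationary avoids the dominant cost case (Movable + shadow casting).
    Panel-&gt;SetMobility(EComponentMobility::Stationary);
}
</code></pre>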
<h2><strong>New: Mixed Reality Capture (Early Access)</strong></h2> <p><strong>Create compelling spectating experiences for mixed reality applications</strong> using the new Mixed Reality Capture functionality, which makes it easy to <strong>composite real players into a virtual play space</strong>!</p> <p style="text-align: center;"><img height="409" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/MixedReality_03.gif?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=4KPTCZY1NcBzb%2Bu5wReQRsdpDDytBVKvrJl3x6ckBSI%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="726" /></p> <p>The early access Mixed Reality Capture support has three components: video input, calibration, and in-game compositing. We have a list of supported webcams and HDMI capture devices that enable you to pull real-world green-screened video into the Unreal Engine from a variety of sources. If you have a Vive Tracker or similar tracking device, Mixed Reality Capture can match your camera location to the in-game camera to make shots more dynamic and interesting. Setup and calibration are done through a standalone calibration tool that can be reused across Unreal Engine 4 titles. Once you set up your filming location, you can use it across all applications.</p> <p>While feature support is in early access, we’re looking forward to getting feedback as we continue to improve the system. More information about Mixed Reality Capture setup can be found in the Mixed Reality Development <a href="https://docs.unrealengine.com/en-us/Platforms/MR" target="_blank">documentation</a>.</p> <h2><strong>New: nDisplay Flexible, Multi-Display Rendering</strong></h2> <p>Effortlessly <strong>create video walls for large visualization installations</strong> using the new <strong>nDisplay</strong> system! Automatically launch any number of Unreal Engine instances - locked firmly together, with deterministic content and frame-accurate time synchronization - across any number of host computers, each instance driving its own projector or monitor display. Use active or passive stereoscopic rendering to enhance the viewer’s sense of immersion in the 3D scene and built-in VRPN support to drive the system from mobile VR controllers.</p> <p style="text-align: center;"><img height="699" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/nDisplay_01.JPG?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=ZnFwypFbSJlLpeS7sl4pgIcyXqMcMr0Hc3qmPZvd6ps%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="931" /></p> <p>For more information, please see the <a href="https://docs.unrealengine.com/en-us/Engine/Rendering/Rendering-to-Multiple-Displays-with-nDisplay" target="_blank">documentation</a>.</p> <h2><strong>New: Submix Audio Recording</strong></h2> <p>In the new audio engine, we’ve added the ability to record Engine output - or any individual Submix’s output - to a *.wav file or SoundWave Asset.</p> <p style="text-align: center;"><img height="436" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/AudioSubmixRN_1.png?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=1IaybOI%2B96R0BM%2Bbr1Vap33FhGFMRsxkgi%2BorzsabQ8%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="725" /></p> <p style="text-align: center;"><em>Exporting Submix output to a SoundWave Asset.</em></p> <p style="text-align: center;"><img height="395" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/AudioSubmixRN_2.png?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=h5PqNFbjQcbPIvHApOkycLrqCV0VDMJFXLCJLuuNcGI%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="729" /><br /> <em>Exporting Submix output to a *.wav file.</em></p> <h2><strong>New: Shared Skeletal Mesh LOD Setting</strong></h2> <p><strong>Set LOD settings once and reuse them across multiple Skeletal Mesh assets</strong> using the new LOD Settings asset! Inside the Asset Details panel for a Skeletal Mesh, under LOD Settings, you can now select an LOD Settings asset to use, or you can generate a new asset based on the current settings.</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/oivIYM7PimE" width="100%"></iframe></div> <p>Please see the <a href="https://docs.unrealengine.com/en-US/Engine/Animation/Persona/MeshDetails#sharinglodsettings" target="_blank">Sharing LOD Settings</a> section of the Skeletal Mesh Asset Details page for more information.</p> <p>You can also assign the LOD setting and regenerate LODs from Blueprint using a Blutility.
</p> <h2><strong>New: Streaming GeomCache and Improved Alembic Importer (Experimental)</strong></h2> <p>We continue to make stability and performance improvements to the geometry cache system, as noted in the following:</p> <ul> <li>Individual vertex animation frames are now compressed using an intra-frame codec based on Huffman encoding. Compressed data is streamed from disk, enabling playback of longer sequences with a low amount of memory overhead. <em>The new implementation is still very experimental and is not ready for use in production.</em></li> <li>The Alembic importer has been changed to iteratively import frames rather than importing all frames in bulk. This should improve the PCA pipeline and overall stability and speed.</li> </ul> <h2><strong>New: Scripted Extensions for Actor and Content Browser Context Menu</strong></h2> <p><strong>Easily create in-context tools and workflow enhancements without writing a line of code</strong> by extending the context menus for Actors and assets in the Content Browser using Blueprint Utilities, or Blutilities.</p> <ul> <li>Create a new Blutility using one of the new parent classes: AssetActionUtility (for Content Browser extensions) or ActorActionUtility (for Actor extensions).</li> <li>Specify which types of Actors or assets the actions apply to with the GetSupportedClass function.</li> <li>Add logic in events (or functions) with no return value, marking them as “Call In Editor” so they show up in the context menu. A pop-up dialog will display when the event is triggered so you can fill in values for any parameters you define on your events.</li> </ul> <h2><strong>New: Animation Retarget Manager Improvements</strong></h2> <p>Animation Retarget Manager now supports saving and loading of the mapping data, so you can <strong>save and reuse mapping data on multiple meshes</strong>. You can also quickly save multiple sets of rig data for different animations and reuse them with this feature.</p> <p style="text-align: center;"><img height="536" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/Retarget_01.gif?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=8DHUiVsf6KlbNXa56AsSTII3Bc0iUktdVShU9tphMuY%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="679" /></p> <p>Please see the <a href="https://docs.unrealengine.com/en-us/Engine/Animation/Persona/BasePoseManager#setuprig" target="_blank">Retarget Manager</a> page for more information.</p> <h2><strong>New: RigidBody Anim Node Improvements</strong></h2> <p>Simulated bodies can now follow the Skeletal Mesh Component as it moves around the world when using ‘Local Space’ simulation, which offers greater stability for your simulation. We have added options to look at the linear velocity and acceleration of the component in world space and apply them (scaled and clamped) to the local space simulation.
</p> <p>We also added the option for any joint to be the base of the simulation, and added support for easily resetting dynamics.</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/zldxTJ6NPgw" width="100%"></iframe></div> <h2><strong>New: Clothing Improvements</strong></h2> <p>Physics Assets now support tapered capsules for collision in clothing simulation.<br /> <iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/fyUK7NjBnec" width="100%"></iframe></p> <p><strong>Note:</strong> These are not supported for collisions in rigid body simulations.</p> <p>You can also now copy Skeletal Mesh vertex colors to any selected Clothing Parameter Mask.</p> <h2><strong>New: Garbage Collection Improvements</strong></h2> <p>Garbage collection performance has been optimized, <strong>reducing some operations by as much as 13x</strong>! Specifically, we made the following improvements:</p> <ul> <li>The “Mark” phase has been optimized and is now multithreaded. On machines with multiple cores, the cost of marking Objects as unreachable has been <strong>reduced from 8 ms to 0.6 ms for approximately 500,000 Objects</strong>.</li> <li>The “BeginDestroy” phase (unhashing Objects) now runs across multiple frames, using <strong>no more than 2 ms per frame</strong>. The cost of unhashing Objects will no longer be included in the same frame as the “Mark” phase and reachability analysis.</li> <li>Garbage Collection assumption verification, which runs in development builds, now uses the same multithreaded code as reference-gathering. As a result, development builds will see an improvement in Garbage Collection times. In Epic&#39;s tests, sample timings for about 500,000 Objects <strong>reduced from over 320 ms to under 80 ms</strong>.</li> </ul> <h2><strong>New: Visual Studio 2017</strong></h2> <p>UE4 now uses the Visual Studio 2017 compiler, and the Engine will generate project files for Visual Studio 2017 by default. Visual Studio 2015 is still supported but <a href="https://docs.unrealengine.com/en-us/Programming/Development/VisualStudioSetup" target="_blank">requires some configuration</a>. Additionally, we’ve added support for the Windows 10 SDK.</p> <p><strong>Note:</strong> Visual Studio 2017 supports the installation of multiple compiler versions side-by-side.</p> <p>See our <a href="https://docs.unrealengine.com/en-us/GettingStarted/RecommendedSpecifications" target="_blank">Hardware &amp; Software Specifications</a> for more information.</p> <h2><strong>New: Development Streams on GitHub</strong></h2> <p>Unreal Engine development streams are now updated live on <a href="https://github.com/EpicGames/UnrealEngine" target="_blank">GitHub</a>. If you want the latest version of development code, you can now pull these streams directly, without waiting for Epic to merge changes from the development teams into our main branch. Note that these streams are live and have not been vetted by our QA team, as our binary releases and the main branch typically are.
</p> <p>To learn more, check out our <a href="https://www.unrealengine.com/en-US/blog/development-branches-now-available-on-github" target="_blank">blog post</a>.</p> <h2><strong>New: UMG Safe Zone Improvements</strong></h2> <p>The Screen Sizes you select in UMG and Play-In-Editor (PIE) settings are now linked with Device Profiles and also take into account the Mobile Content Scale Factor, meaning that the final resolution and DPI scale will change based on the selected device screen size.</p> <p style="text-align: center;"><img height="426" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/SafeZones_01.png?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=D2%2B9X65BFzfKDbsy%2BzYNszxoEquPrwRZn6mambtWcs4%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="926" /></p> <table align="center" border="0" cellpadding="1" cellspacing="1" style="width:770px;"> <tbody> <tr> <td><img src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/SafeZones_UMGDisabled_01.png?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=p%2BaCkXm6gjM98jODoVCey4964jhENJrXudWhsTZTa5I%3D" style="border: none; transform: rotate(0rad); width: 375px; height: 215px;" /> </td> <td><img src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/SafeZones_UMGEnabled_01.png?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=SNHC2qBVY%2FT6ez8pc8uCyyV1L%2BR6YTtzkfSZw6CbxRA%3D" style="border: none; transform: rotate(0rad); width: 375px; height: 215px;" /></td> </tr> </tbody> </table> <p>The following improvements have been made for the UMG Safe Zone workflow:</p> <ul> <li>Safe Zone previewing is now automatically enabled for the Debug Title Safe Zone when using a value less than 1 to test screen sizes for TVs and monitors.</li> <li>Use r.MobileContentScaleFactor to scale phone and tablet resolutions in UMG previews and PIE modes.</li> <li>Non-uniform safe zones are now supported for devices like the iPhone X, where parts of the screen are inaccessible.</li> <li>Safe Zones, Scale Boxes, and Common Border Widgets react correctly to non-uniform safe zones and UMG Designer sizes.</li> <li>UMG now displays the selected device, its screen size, and uniform scaling factor for easy reference in the Designer Graph.</li> </ul> <p>For additional information, see <a href="https://docs.unrealengine.com/en-us/Engine/UMG/UserGuide/UMGSafeZones" target="_blank">UMG Safe Zones</a>.</p> <h2><strong>New: Curve Atlases in Materials</strong></h2> <p>Materials can now use a Curve Atlas asset to store and access linear color curve data, with additional support provided through Blueprint. The Curve Atlas uses the same linear color curves as before, except you can now store as many of them as the size of your specified Atlas allows.</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/RlEDza_3TC4" width="100%"></iframe></div> <p>To create a new Curve Atlas, use the Content Browser to select <strong>Add New > Miscellaneous</strong> and select <strong>Curve Atlas</strong>.</p> <p>When you open a Curve Asset Editor, you’ll be able to adjust the Hue, Saturation, Brightness, Vibrance, and Alpha clamps of any individual curve.
Additionally, the Preview thumbnails in the Content Browser will display the gradient set by the curve.</p> <p>For additional information, see <a href="https://docs.unrealengine.com/en-us/Engine/Rendering/Materials/CurveAtlasesInMaterials" target="_blank">Curve Atlases in Materials</a>.</p> <h2><strong>New: Mesh Description Mesh Format</strong></h2> <p>UE4 is moving to a new, higher-level intermediate format which can represent any type of mesh asset in the Engine. This is a gradual process that will improve workflow and enable us to provide some great new features.</p> <p>The goals of moving to a new mesh format are:</p> <ul> <li>All meshes (Static, Skeletal, and potentially other mesh-like objects such as terrain and BSP) can have the same internal representation, with some degree of interchangeability.</li> <li>Most UE4 geometry tools will work on any type of mesh based on the geometry format.</li> <li>Any mesh using the new format can be examined and modified using a standard API enabling runtime, native, or scripted modification, opening up many possibilities for procedurally generated content.</li> <li>Meshes will be imported directly to the format, with the ability to preserve higher-level mesh representations such as quads or edge hardness. Currently, these are lost when importing a Static or Skeletal Mesh.</li> <li>The new mesh format is structured internally so that modifications can be made in real-time, even to the most complicated meshes. This forms the basis of a work-in-progress mesh editing feature, which is also scriptable, that will be developed for a future release.</li> </ul> <p>In this release, only Static Meshes have been converted to use the new mesh format. Users will not notice any difference in their everyday workflow, and the assets themselves will not change. Currently, the new data is automatically created from the old format and cached in the DDC.</p> <h2><strong>New: Label Saved Colors in Color Picker</strong></h2> <p>Colors saved in your Theme Bar or Theme Menu can now have labels for identification purposes! Labels can easily be set by right-clicking the saved color swatch and entering a name for the saved color.</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/qCtrqlFCYHA" width="100%"></iframe></div> <p>For additional information, see <a href="https://docs.unrealengine.com/en-us/Engine/UI/ColorPicker" target="_blank">Color Picker</a>.</p> <h2><strong>New: Recently Opened Filter in Content Browser</strong></h2> <p>Quickly find recently viewed Assets in the Content Browser using the new <strong>Recently Opened</strong> filter! This filter lists the 20 most recently opened assets.</p> <p style="text-align: center;"><img height="520" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/RecentlyOpenedFilter.png?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=ySQW%2B90QhnUPqS%2BSWFCx3TR7JBy%2Ba3wiNzAL6uWe3HA%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="928" /> </p> <p>You can find the Recently Opened filter in the <strong>Filters</strong> list under <strong>Other Filters</strong>.
You can change the number of recently opened assets listed in <strong>Editor Preferences > Content Browser</strong> with <strong>Number of Assets to Keep in the Recently Opened Filter</strong>.</p> <p>For additional information, see <a href="https://docs.unrealengine.com/en-US/Engine/Content/Browser/UserGuide/Filters" target="_blank">Content Browser Filters</a>.</p> <h2><strong>New: Shotgun Integration (Early Access)</strong></h2> <p>Streamline your production pipeline using the new Shotgun integration for Unreal Engine 4! </p> <p style="text-align: center;"><img height="504" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/Shotgun_01.png?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=0YXMBF8gRo7YZjEKzNZaEn0HvCkr7z%2FoRcEf6Vy8jc8%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="931" /></p> <p>Features include: </p> <ul> <li>It adds the Unreal Editor to your Shotgun launcher, so artists can reliably open the right version of Unreal for the Shotgun project.</li> <li>You can open the Shotgun panel in the Unreal Editor interface, so you can stay up to date with the activity in the Shotgun project as you work.</li> <li>It hooks into the Shotgun loader, so you can easily bring assets into your Unreal Project, and control where they end up in your Content Browser.</li> <li>It even adds Shotgun interaction commands to the contextual menus you get when you right-click Actors in a Level, or assets in the Content Browser.</li> </ul> <p><strong>Note:</strong> We&#39;re working out the last details before we can share our integration on GitHub. Check back soon for updates and documentation!</p> <h2><strong>New: Editor Scripting and Automation Libraries</strong></h2> <p>The <strong>Editor Scripting Utilities</strong> <strong>Plugin</strong> is now available to all Unreal Engine users. This Plugin offers simplified interfaces for scripting and automating the Unreal Editor, working with assets in the Content Browser, working with Actors in the current Level, editing the properties of Static Mesh assets, and more.</p> <p>For details, see <a href="https://docs.unrealengine.com/en-us/Editor/Scripting-and-Automating-the-Editor" target="_blank">Scripting and Automating the Editor</a>.</p> <h2><strong>New: Import Asset Metadata through FBX</strong></h2> <p>When you import an FBX file into Unreal, any FbxProperty data that is saved in that file is now imported as well. You can access this metadata in Blueprint or Python scripts that you run in the Unreal Editor. This can help you customize your own asset management pipelines for Unreal based on information about your assets that comes from your content creation tools. </p> <p>For details, see <a href="https://docs.unrealengine.com/en-us/Editor/Content/FBX/FBX-Asset-Metadata-Pipeline" target="_blank">FBX Asset Metadata Pipeline</a>.</p> <h2><strong>New: Improved Script Access to Static Meshes for LODs and Collisions</strong> </h2> <p>Blueprint and Python scripts that you run in the Unreal Editor can now modify more properties of your Static Mesh assets. This allows you to automate some of the tools offered by the user interface of the Static Mesh Editor. For example:</p> <ul> <li>You can now auto-generate Levels of Detail (LODs) for your geometry, which increases the rendering performance of your scene by using progressively less detailed versions of your geometry as the distance from the camera viewpoint to the geometry increases. 
See <a href="https://docs.unrealengine.com/en-us/Editor/Content/FBX/FBX-Asset-Metadata-Pipeline" target="_blank">Creating Levels of Detail in Blueprints and Python</a>.</li> <li>You can now auto-generate collision meshes that represent your Static Mesh assets in the physics simulation. See <a href="https://docs.unrealengine.com/en-us/Editor/Scripting-and-Automating-the-Editor/Editor-Scripting-How-Tos/Setting-up-Collision-Properties-in-Blueprints-and-Python" target="_blank">Setting up Collisions with Static Meshes in Blueprints and Python</a>.</li> </ul> <h2><strong>New: Blueprint Bookmarks</strong></h2> <p>The Blueprint Bookmarks feature provides the ability to create named Bookmarks in any function graph in the Blueprint Editor. Bookmarks you create will be listed in a new UI window, where you can click them to restore the position and zoom level of the Viewport (as well as the active tab you were viewing). In addition to the Bookmarks you create, you can also quickly jump to any Comment node in your Blueprint by selecting the comment from a separate list. Bookmarks are stored locally on your machine, so they won&#39;t affect the Blueprints themselves, and syncing content will not overwrite your Bookmarks with those of another user.</p> <h2><strong>New: Blueprint Watch Window</strong></h2> <p>The <strong>Blueprint Watch Window</strong> is designed to speed up debugging by giving you access to the variables and nodes that you want to watch, even across multiple Blueprints. Watch data from every Blueprint that you open in the Editor, and that is part of the current call stack, is consolidated into a single list, enabling you to inspect variables and function outputs and to jump between Blueprints with ease. You can click on an entry in the "Node Name" column to go to the named node in any Blueprint, while selecting entries in the "Object Name" column will select the instance of the object associated with that entry. Arrays, Sets, Maps, and other data structures can be expanded, making a drill-down examination of any data they contain quick and convenient.</p> <p style="text-align: center;"><img height="563" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/watches.png?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=sah0Gvw5jz4BVjUyrlQKXwiS9mWP6%2BNprTqBRLBDuMQ%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="931" /></p> <h2><strong>New: Navigation System Code Moved to a Module</strong></h2> <p>Most Navigation System-related code has been moved out of the Engine code and into a new Navigation System Module. Game-specific code using navigation system functionality might need to be updated.</p> <p>A <a href="https://epicgames.box.com/s/537wc3x95udak7uqq8wja19fyvbgz7ck" target="_blank">Python (3.5) script</a> is available to parse your project’s source code and point out lines that need updating. Optionally, the script can perform the changes itself, but make sure to use this option with caution and backed by a version control system. Script options can be found at the top of the file.</p> <p>Please see the Programming Upgrade Notes section for details on upgrading your project to work with these changes.</p>
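<p>As a concrete example of the kind of update the script performs, the sketch below shows a typical pattern for looking up the navigation system after the move. It assumes the renamed UNavigationSystemV1 class and the NavigationSystem module described in the upgrade notes; check the Programming Upgrade Notes for the exact names that apply to your code.</p> <pre><code>// Sketch of a typical 4.20 navigation-code update (assumed class and
// module names; see the Programming Upgrade Notes for specifics).
// Also add "NavigationSystem" to your module's Build.cs dependencies.
#include "NavigationSystem.h"

void QueryNavSystem(UWorld* World)
{
    // Pre-4.20 code typically called World-&gt;GetNavigationSystem().
    UNavigationSystemV1* NavSys = UNavigationSystemV1::GetCurrent(World);
    if (NavSys)
    {
        // ... use NavSys as before
    }
}
</code></pre>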
<h2><strong>New: Improved Mobile Specular Lighting Model</strong></h2> <p><strong>Mobile specular response</strong> has been changed to use the <strong>GGX Lighting Model</strong> by default. This improves mobile specular quality and better matches <strong>SM5</strong>, but adds a small cost to shader processing time.</p> <p style="text-align: center;"><img height="433" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/MobileSpecular_01.gif?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=LxPm7b%2BMHevR%2B8b7ULeuBrrdeF2GlpNBrwQl1p1%2Fx58%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="800" /></p> <p><em>1 - 4.20 Default GGX Specular; 2 - 4.19 Spherical Gaussian Specular</em></p> <p>The previous <strong>Spherical Gaussian Specular</strong> model is still accessible via the <strong>‘Use legacy shading mode’</strong> project option, which can be found under <strong>Rendering > Mobile</strong>.</p> <h2><strong>New: Mobile Skylight Reflections</strong></h2> <p>The Mobile Renderer now uses a <strong>Skylight Cubemap</strong> for <strong>Specular Reflections</strong> when no <strong>Reflection Captures</strong> are relevant.</p> <p style="text-align: center;"><img height="472" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/Reflections_01.gif?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=b%2FGgJkKf%2BX4sciiFJiEicYM1DsRrzKRQO8SiiYdoMt0%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="486" /></p> <p style="text-align: center;">1 - Mobile, no reflection captures; 2 - PC, no reflection captures</p> <h2><strong>New: Replication Driver / Replication Graph</strong></h2> <p>The Replication Graph Plugin provides a replication system optimized for games with large Actor and player counts. The system works by building a series of customized nodes that can centralize data and computation. These nodes persist across multiple frames and can be shared by client connections, cutting down on redundant CPU work and enabling Actors to be grouped together in nodes based on game-specific update rules. We may make changes to the API, so this is considered Experimental in 4.20, but it is in use in Fortnite Battle Royale and will become a fully supported feature.</p> <h2><strong>New: Steam Authentication</strong></h2> <p>Steam Authentication has been added! Games can now add a packet handler component that interfaces with Steam’s authentication APIs, enabling them to advertise their servers properly, handle VAC/publisher bans, and provide better validation of clients. If enabled, clients joining a server now have to be authenticated by Steam before being allowed into gameplay. By default, clients who fail authentication are kicked from the server.</p>
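<p>Wiring up the packet handler component is done through config. A minimal sketch of the relevant DefaultEngine.ini entries is shown below; the component class path is an assumption based on the Online Subsystem Steam setup, so verify it against the documentation before shipping.</p> <pre><code>; Hedged sketch of DefaultEngine.ini entries for Steam Authentication.
; The component path below is an assumption; verify it against the
; Online Subsystem Steam documentation for your engine version.
[OnlineSubsystemSteam]
bEnabled=true
; Valve's public test AppId; replace with your own AppId.
SteamDevAppId=480

[PacketHandlerComponents]
+Components=OnlineSubsystemSteam.SteamAuthComponentModuleInterface
</code></pre>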
<h2><strong>New: Virtual Camera Plugin</strong></h2> <p>New to 4.20, the Virtual Camera Plugin enables a user to drive a Cine Camera in Unreal Engine 4 (UE4) using an iPad Pro in a virtual production environment. With ARKit, a Vive Tracker, or an optical motion capture system such as Vicon or Optitrack, the position and rotation of the iPad is broadcast wirelessly to the PC, with the PC sending video back to the iPad.<br /> <br /> Camera settings such as focal length, aperture, focus distance, and stabilization can be adjusted using touch input. Additionally, the virtual camera can be used for taking high-res screenshots, setting waypoints, recording camera motion, and other tasks related to virtual production.</p> <p>On the Learn tab of the Epic Games Launcher under the <strong>Engine Feature Samples</strong> section, there is a <strong>Virtual Camera</strong> project which includes a sample scene and project set up for use with the Virtual Camera Plugin.</p> <p>For more information, please see the <a href="https://docs.unrealengine.com/en-us/Engine/Plugins/VirtualCameraPlugin" target="_blank">Virtual Camera Plugin</a> documentation.</p> <h2><strong>New: Frame Accuracy Improvements for Sequencer</strong></h2> <p>Sequencer now stores all internal time data as integers, enabling robust support of frame accuracy in situations where it is a necessity. Keys, section bounds, and other data are now always locked to the underlying user-controllable sequence resolution; this can be as fine or as coarse as the use case demands. Very high resolutions support greater fidelity of key placement and sub-frames, at the cost of a reduced overall sequence range.</p> <h2 style="text-align: center;"><strong><img height="243" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/FrameAccuracy_01.jpg?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=VjEd2cbLL2T4ad5Zk3dXR%2BmRcYxeqwSYma8WCsLgg9c%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="924" /></strong></h2> <p>Key Updates:</p> <ul> <li>The time cursor in Sequencer is now represented as a block that spans the full range of the currently evaluated Tick, showing very clearly which keys are evaluated and which are not for any given frame.</li> <li>“Force Fixed Frame Interval” playback has been rebranded as “Frame Locked”, setting the Engine max FPS to the Sequence’s display rate and locking time to whole frame numbers (no sub-frame interpolation).</li> <li>Sub-frame evaluation remains fully supported for situations where frame accuracy is not a consideration (such as UMG animation).</li> <li>Various time sources are now supported for runtime evaluation, such as the Engine clock (supporting world-pause), the audio clock, and the platform clock.</li> <li>The UI can now be viewed in Non Drop Frame (NDF) Timecode and Drop Frame (DF) Timecode. NDF Timecode is available for all frame rates and directly converts the frame number to hours, minutes, seconds, and remaining frames. DF Timecode is only supported for NTSC rates (23.976, 29.97, 59.94). The display format can be changed with the Ctrl + T keyboard combination or with the framerate UI menu.</li> </ul> <p>Please see the new <a href="https://docs.unrealengine.com/en-us/Engine/Sequencer/Workflow/SequencerTimeRefactorNotes" target="_blank">Sequencer Time Refactor Notes</a> page for more information.</p>
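<p>To make the integer time representation concrete, here is a small standalone sketch (plain C++, not engine code; the tick resolution and frame numbers are purely illustrative) of why a resolution that divides evenly by the display rate keeps every key on an exact integer, with no floating-point drift no matter how long the sequence runs:</p> <pre>
#include &lt;cstdint&gt;
#include &lt;cstdio&gt;

int main()
{
    // An illustrative resolution of 24000 ticks per second divides evenly
    // by common display rates (24, 30, 60), so keys land on exact ticks.
    const int64_t TicksPerSecond = 24000;
    const int64_t DisplayRate    = 24;                           // frames per second
    const int64_t TicksPerFrame  = TicksPerSecond / DisplayRate; // 1000 ticks

    // A key on display frame 101 is stored as one exact integer tick.
    const int64_t KeyTick = 101 * TicksPerFrame;                 // 101000

    std::printf("frame 101 -&gt; tick %lld (%.3f s)\n",
                (long long)KeyTick, (double)KeyTick / (double)TicksPerSecond);
    return 0;
}
</pre>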
<h2><strong>New: Media Track for Sequencer</strong></h2> <p>Sequencer has a new track for playing media sources. It is like the audio track, but for movies. Simply drag-and-drop a <strong>Media Source</strong> asset into the track view or create a <strong>Media Track</strong> from the <strong>Add Track</strong> menu. This feature currently works best with Image Sequences, especially EXR. Image Sequences in the Media Track will accurately sync frames with rendered output.</p> <h2 style="text-align: center;"><strong><img height="429" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/MediaTrack_01.png?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=1PBi8WiVjOdt87%2FU0xQKDRhyEjZsKOCpzj0uzuxmS40%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="586" /></strong></h2> <p>Please see the <a href="https://docs.unrealengine.com/en-us/Engine/Sequencer/HowTo/Using-Media-Tracks" target="_blank">Using Media Tracks</a> page for more information.</p> <h2><strong>New: Sequencer Curve Editor and Evaluation Enhancements</strong></h2> <p>Several enhancements have been made to the <strong>Curve Editor</strong> and <strong>Evaluation</strong> in Sequencer, including:</p> <p>Weighted tangents are now supported on float curves.</p> <p style="text-align: center;"><img height="260" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/WeightedCurves_03.gif?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=Xj1PMRbWGgRmCNX3GS4ZSo%2FMSHKF5%2Bl0QkFY7TQSNK8%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="926" /></p> <p style="text-align: center;"><em>Using weighted curves in the Sequencer curve editor</em></p> <p>Support has been added for continuous Euler Angle changes when changing rotations. Euler angles are no longer limited to the -180 to 180 range, which is necessary to avoid flips in animation; for example, a rotation animated from 170 to 190 degrees no longer snaps back through -170.</p> <p>You can now turn on Quaternion Rotation on a 3D Transform Section via the track’s Properties menu to use quaternion interpolation to smoothly interpolate between two rotations. This is similar to the feature previously available in Matinee.</p> <h2><strong>New: Animating Variables on Anim Instances in Sequencer</strong></h2> <p>It is now possible to animate variables on <strong>Anim Instances</strong> through possessables, enabling direct control of Anim Blueprint variables, functions, and other content. To add an Anim Instance binding to Sequencer, look for its name in the [+Track] button for Skeletal Animation Components. Any variables that are exposed to cinematics will be shown on its track picker.</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/xTEDgK3j8Gg" width="100%"></iframe></div> <p>Please see the <a href="https://docs.unrealengine.com/en-US/Engine/Sequencer/HowTo/ControlAnimInstances" target="_blank">Controlling Anim Instances with Sequencer</a> page for more information.</p> <h2><strong>New: Final Cut Pro 7 XML Import/Export in Sequencer</strong></h2> <p>Sequencer movie scene data can now be exported to and imported from the Final Cut Pro 7 XML format. This can be used to round-trip data to Adobe Premiere Pro and other editing software that supports FCP 7 XML. You can trim and offset shots in editing software and map those back to Sequencer automatically during import.</p> <p><strong>Note:</strong> Audio is not supported at this time.</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/lq5uhsZXidc" width="100%"></iframe></div> <h2><strong>New: Sequence Recorder Improvements</strong></h2> <p><strong>Sequence Recorder</strong> now supports a profile system that is stored in the Persistent Level.
Recording profiles enable you to store which actors you wish to record and their settings, as well as the output path for the recorded data. Sequence Recorder also now supports recording multiple takes for each of the selected actors.</p> <p>Please see the <a href="https://docs.unrealengine.com/en-us/Engine/Sequencer/Workflow/SequenceRecorder" target="_blank">Sequence Recorder</a> page for more information.</p> <h2><strong>New: Sequencer Track Usability Improvements</strong></h2> <p>Several updates have been made to improve the usability of <strong>Tracks</strong> within Sequencer. Tracks, Actors, and Folders can now be reordered; Event Track names are displayed next to the event keyframe; and you can resize sections to their source duration, mask individual transform channels, create Pose Assets from the blended pose, and more.</p> <p>Please see the new <a href="https://docs.unrealengine.com/en-us/Engine/Sequencer/HowTo/WorkingWithTracks" target="_blank">Working with Tracks in Sequencer</a> page for more information.</p> <h2><strong>New: Translucency Support for Instanced Stereo Rendering</strong></h2> <p>We’ve taken the improvements to the Instanced Stereo Rendering (ISR) path that we made for Robo Recall and extended them to work across more features in the engine. Unreal Engine 4.20 adds support for performing the translucency rendering pass using Instanced Stereo Rendering, which can significantly reduce CPU cost on translucency-heavy scenes. No content changes are needed; any project with Instanced Stereo enabled in the project settings will automatically get the benefits of Instanced Stereo Rendering.</p>
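<p>Instanced Stereo itself is controlled by a standard renderer setting. If you want to confirm it outside the editor UI, the flag is stored in <strong>DefaultEngine.ini</strong> (this reflects the usual location of the setting; double-check it against your project’s <strong>Project Settings > Rendering</strong> panel):</p> <pre>
; Enables Instanced Stereo Rendering, including the new translucency pass
[/Script/Engine.RendererSettings]
vr.InstancedStereo=True
</pre>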
<h2><strong>New: Magic Leap One™ Early Access Support</strong></h2> <p>At GDC, we announced Early Access support for Magic Leap One™: Creator Edition, a software toolkit for early development of experiences for Magic Leap&#39;s personal spatial computing platform, as part of a larger partnership between the two companies. As of Unreal Engine 4.20, you can develop for the Magic Leap One™ using the fully supported release of Unreal Engine. </p> <p style="text-align: center;"><strong><img height="583" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/MagicLeap_01.png?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=YG9X2LEk8EMzOUakdIydUQ1oc6%2FNFd8u7KMeaGEj%2Bsg%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="931" /></strong></p> <p>Unreal Engine 4 support for Magic Leap One uses our built-in frameworks for things like camera control, world meshing, motion controllers, and forward and deferred rendering. We’ve also added more robust support for features like eye tracking and gestures.</p> <p>Developers can download the Magic Leap software development kit and simulator at <a href="http://developer.magicleap.com" target="_blank">developer.magicleap.com</a>. For those developers with access to hardware, Unreal Engine 4.20 can deploy and run on the device in addition to supporting Zero Iteration workflows through Play In Editor. </p> <h2><strong>New: Apple ARKit 2.0 Support</strong></h2> <p>We’ve added support for Apple’s ARKit 2.0, which includes better tracking quality, support for vertical plane detection, face tracking, 2D image detection, 3D object detection, persistent AR experiences, and shared AR experiences. Support for these new features enables you to place AR objects on more surfaces, track the position and orientation of a face, recognize and bring 2D images to life, detect 3D objects, and facilitate new types of collaborative AR experiences.</p> <h2><strong>New: Google ARCore 1.2 Support</strong> </h2> <p>We’ve added support for Google’s ARCore 1.2, which includes support for vertical plane detection, Augmented Images, and Cloud Anchors. Support for these new features enables you to place AR objects on more surfaces, recognize and bring images to life, and facilitate new types of collaborative AR experiences.</p> <h2><strong>New: Platform SDK Upgrades</strong></h2> <p>In every release, we update the Engine to support the latest SDK releases from platform partners. </p> <p style="text-align: center;"><img height="524" src="https://dnnrz1gqa.blob.core.windows.net/portals/0/Images/Builds/4_20/SDK_420.jpg?sv=2017-04-17&amp;sr=b&amp;si=DNNFileManagerPolicy&amp;sig=gNHj%2B0XJ0UlbH%2FAr0e9NY40m7E%2BEr9vYmWMlrWjR54g%3D" style="border: none; transform: rotate(0.00rad); -webkit-transform: rotate(0.00rad);" width="931" /></p> <ul> <li><strong>IDE Version the Build farm compiles against</strong> <ul> <li><strong>Visual Studio:</strong> Visual Studio 2017 v15.6.3 toolchain (14.13.26128) and Windows 10 SDK (10.0.16299.0) <ul> <li>Minimum supported versions <ul> <li>Visual Studio 2017 v15.6</li> <li>Visual Studio 2015 Update 3</li> </ul> </li> </ul> </li> <li><strong>Xcode:</strong> Xcode 9.4</li> </ul> </li> <li><strong>Android:</strong> <ul> <li>NDK 12b (New CodeWorks for Android 1r6u1 installer will replace previous CodeWorks for Android 1R5 before release, still on NDK 12b)</li> </ul> </li> <li><strong>HTML5:</strong> Emscripten 1.37.19</li> <li><strong>Linux:</strong> v11_clang-5.0.0-centos7</li> <li><strong>Lumin:</strong> 0.12.0</li> <li><strong>Steam:</strong> 1.39</li> <li><strong>SteamVR:</strong> 1.39</li> <li><strong>Oculus Runtime:</strong> 1.25</li> <li><strong>Switch:</strong> <ul> <li>SDK 4.5.0 + optional NEX 4.2.1 (Firmware 4.1.0-1.0)</li> <li>SDK 5.3.0 + optional NEX 4.4.2 (Firmware 5.0.0-4.0)</li> <li>Supported IDE: VS 2015 / 2017</li> </ul> </li> <li><strong>PS4:</strong> <ul> <li>SDK 5.508.031</li> <li>Firmware Version 5.530.011</li> <li>Supported IDE: Visual Studio 2015, Visual Studio 2017</li> </ul> </li> <li><strong>Xbox One (XB1, XB1-S, XB1-X):</strong> <ul> <li>XDK: April 2018</li> <li>Firmware Version: April 2018 (version 10.0.17133.2020)</li> <li>Supported IDE: Visual Studio 2017</li> </ul> </li> <li><strong>macOS:</strong> SDK 10.13</li> <li><strong>iOS:</strong> SDK 11</li> <li><strong>tvOS:</strong> SDK 11</li> </ul> To view the full list of release notes, visit our <a href="https://forums.unrealengine.com/unreal-engine/announcements-and-releases/1502911" target="_blank">forum</a> or <a href="https://docs.unrealengine.com/en-US/Builds/4_20" target="_blank">docs</a> pages.communityfeaturesnewsJeff WilsonMon, 16 Jul 2018 13:30:00 GMThttps://www.unrealengine.com/blog/unreal-engine-4-20-releasedhttps://www.unrealengine.com/blog/unreal-engine-4-20-releasedIntel® VR Arena and Unreal Engine: Making a Fan ExperienceWhen <a href="https://www.intel.com/content/www/us/en/homepage.html" target="_blank">Intel</a>® set out to showcase the power of the Intel® Core™ i9 processor for a virtual reality project, they wanted to go big. As in, arena big.
Not just with the structure itself, but with its most striking feature - the thousands of enthusiastic fans inside it.<br /> <br /> To show how its processors could handle such a computationally-intensive project, Intel created <a href="https://youtu.be/o6z8o_df-zg" target="_blank">Project Arena</a>, a large virtual facility complete with a playing area, arena seating, lobby, and food court. The experience also includes thousands of virtual 3D fans who not only walk, sit, and move around like a real crowd, but also stand up and cheer at the click of a button.<br /> <br /> <strong>Inside the Virtual Reality Experience</strong><br /> <br /> In the VR experience, visitors can use a virtual dashboard to jump to different spots in the arena, change the colors of fans’ jerseys, make basketballs appear on the court, and even produce fireworks over the playing field. <br /> <br /> But perhaps the best part is the interaction with virtual fans, where you can mill around in a crowd of moving people in real time. The experience includes more than 2500 animated characters engaged in normal arena activities—getting in and out of seats, chatting with friends in the food court, waving, and generally acting like people do at a large venue. The unique motions for each character, coupled with crowd sound effects in each area, make for a truly immersive experience.<br /> <br /> Using a VR display, you can sit next to a virtual person in the stands, get close to the players milling around the sidelines, or position yourself center court and command a standing ovation, complete with a roaring cheer from the entire crowd. <br /> <img alt="Blog-body-img1.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fintel-vr-arena-and-unreal-engine-making-a-fan-experience%2FBlog-body-img1-1640x1000-a4ef93151446faaf8254ef4b370e4bd36f08e47f.jpg" width="100%" /> <div style="text-align: center;">Crowd seated in the VR arena.</div> <br /> <strong>Creating a Multi-Million-Polygon VR Experience</strong><br /> <br /> To make the arena experience, Intel brought together several companies. Design firm <a href="http://www.hok.com/" target="_blank">HOK</a> provided a 3ds Max model of the arena, and <a href="http://www.agilelens.com/" target="_blank">Agile Lens</a> brought it into <a href="https://www.unrealengine.com/en-US/studio" target="_blank">Unreal Engine</a> using Datasmith. <a href="http://www.supersymmetric.com/" target="_blank">Supersymmetric</a> managed the crowd simulation with characters and motions from <a href="https://www.axyz-design.com/" target="_blank">AXYZ design</a>, and <a href="http://gamedevsource.com/" target="_blank">Octopus Games</a> provided custom programming. <a href="https://glimpse-consulting.com/" target="_blank">Glimpse Consulting</a> managed the project, fitting together all the pieces these companies contributed.<br /> <br /> To convert a scene of this scale to Unreal Engine, Datasmith proved to be key to the process. “Converting 3ds Max to Unreal Engine used to be a cumbersome process,” says Alex Coulombe, Creative Director at Agile Lens. 
“The new workflow using Datasmith is a breath of fresh air.”<br /> <img alt="Blog-body-img2.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fintel-vr-arena-and-unreal-engine-making-a-fan-experience%2FBlog-body-img2-1640x1000-56935f6538558a5fa763bd7296e294900b10f619.jpg" width="100%" /> <div style="text-align: center;">The arena is built to scale with attention to architectural detail.</div> <br /> The arena itself has 6.5 million polygons, with much of the scene originally created in Autodesk Revit and then imported into Autodesk 3ds Max. For scene cleanup in 3ds Max prior to export, Agile Lens optimized by combining objects. “There was a ton of small geometry in the scene that it didn&#39;t make sense to instance, for example every chair leg on every bar stool,” says Coulombe. “I combined them to form larger meshes, then instanced these meshes in Unreal Engine.” <br /> <br /> Agile Lens also greatly reduced a large number of near-identical materials by creating a master material in Unreal Engine and instancing it with different textures and parameters. The characters in the scene added another two million polygons. To optimize the geometry for the 2500 animated people in the scene, Agile Lens generated LOD meshes with <a href="http://www.instalod.com/" target="_blank">InstaLOD</a>. <br /> <img alt="Blog-body-img3.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fintel-vr-arena-and-unreal-engine-making-a-fan-experience%2FBlog-body-img3-1640x1000-1100a7faa69b453056ef506b92b5e3d677f806f7.jpg" width="100%" /> <div style="text-align: center;">Virtual 3D crowds moving naturally in public areas make the arena a truly immersive experience.</div> <br /> These optimizations ultimately paid off in the final VR experience. “We were able to get 40 to 90 fps, depending on the hardware used,” says Rod Recker, General Manager at Glimpse.<br /> <br /> <strong>Built for Speed</strong><br /> <br /> The team used Unreal Editor to pull together animation, building data, and user interface elements, and to test interactions. They also used it to build the dataset and the final executable of the VR experience. <br /> <br /> Coulombe credits his hardware setup as being integral to the efficient processing of such a large dataset. "When using the i9 with the Optane SSD drive, we saw a remarkable productivity increase in our Unreal Engine workflow,” he says. “Loading scenes and importing Datasmith files was lightning fast. Lightmass generated in a fraction of the time it did with even the most powerful i7."<br />  <br /> The VR dashboard also includes the option to perform a render of any view as a 360-degree panorama with Chaos Group V-Ray. “It was icing on the cake to discover the speed at which we could generate V-Ray 360 renders directly from within the VR experience,” says Coulombe.<br /> <img alt="Blog-body-img4.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fintel-vr-arena-and-unreal-engine-making-a-fan-experience%2FBlog-body-img4-1600x976-0370a63d6a03cabe08ec0305f6812be716637842.jpg" width="100%" /> <div style="text-align: center;">The VR dashboard controls the experience, including the ability to jump to different areas of the arena.</div> <br /> Coulombe also names Datasmith as a great time-saver on the project. “On the projects I’ve used Datasmith with so far, I’ve seen at least a 500% increase in productivity,” he says. “It’s by far the most robust and frictionless workflow for architectural visualization on the market today.
The i9 with Optane, coupled with Unreal Engine, turned out to be an even more powerful combination than we anticipated.”<br /> <br /> Coulombe, who has a theater background, appreciates that Datasmith leaves more room for creativity. “With the time I save with Datasmith, I’m able to spend more time focusing on the experience,” Coulombe says. “Once all the materials, lights, geometry and cameras are in Unreal Engine and looking good, I can spend more time experimenting with subtler elements and focusing on making the experience the best it can be.”<br /> <br /> Want to create your own VR experience? Get the free <a href="https://www.unrealengine.com/en-US/studio" target="_blank">Unreal Studio beta</a> today!<br /> <br /> <strong>Tech Specs</strong><br /> <br /> Machines used during production: <ul> <li>Intel® Core™ i9 processor, 3.3 GHz, 10 cores hyperthreaded, 16GB, 960GB Intel® Optane™ SSD</li> <li>Intel® Core™ i7 processor, 3.60 GHz, 6 cores hyperthreaded, 16GB, 960GB Intel Optane™ SSD</li> </ul> <br /> Software used: <ul> <li>Autodesk® Revit®</li> <li>Autodesk® 3ds Max®</li> <li>Chaos Group® V-Ray®</li> <li>Epic Games® Unreal Engine</li> </ul> enterprisevrdesignnewsKen PimentelFri, 17 Aug 2018 18:00:00 GMThttps://www.unrealengine.com/blog/intel-vr-arena-and-unreal-engine-making-a-fan-experiencehttps://www.unrealengine.com/blog/intel-vr-arena-and-unreal-engine-making-a-fan-experienceBalancing Blueprint and C++When designing the architecture for an Unreal Engine 4 game, one of the primary questions you’ll have to answer is whether gameplay logic and data should be implemented in Blueprints, C++, or both. There is no single answer to this question, as it varies from one project to the next. In fact, a mix of both approaches is often in order for a project.<br /> <img alt="ARPG_Screenshot_02.png" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fbalancing-blueprint-and-c%2FARPG_Screenshot_02-1376x773-de27ec1030fedf636df841e8abfb4aeb2911d8d9.png" width="100%" /><br /> While the question of when and where to use Blueprints or C++ depends on your specific needs, one thing is clear: having access to both is crucial to developing and shipping your project.<br /> <br /> With Blueprints, you can rapidly prototype and ship playable interactive content without touching a line of code. With complete C++ source code, you can study, customize and debug the entire engine, and ship your project without obstruction.<br /> <br /> To help guide you in choosing the best method for your project, we recently published a section of our documentation called <a href="https://docs.unrealengine.com/en-US/Resources/SampleGames/ARPG/BalancingBlueprintandC-" target="_blank">Balancing Blueprint and C++</a>. On this page, we explain options for implementing gameplay logic, provide a comparison of Blueprint and C++ classes, and discuss performance considerations. You’ll also find information on converting between Blueprints and C++.<br /> <br /> The documentation is part of our section on the new sample <a href="https://docs.unrealengine.com/en-us/Resources/SampleGames/ARPG" target="_blank">Action RPG Game</a> that we provide as a learning tool. Action RPG is a third-person “hack-and-slash” game designed specifically for developer education.
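<p>To give a flavor of how the two layers typically meet (the class and function names below are invented for illustration, not taken from Action RPG), a common pattern is to keep authoritative logic in C++ while exposing knobs and cosmetic hooks to Blueprint:</p> <pre>
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "PickupBase.generated.h"

// C++ owns the core gameplay rules; a Blueprint subclass scripts the rest.
UCLASS(Blueprintable)
class APickupBase : public AActor
{
    GENERATED_BODY()

public:
    // Designers tweak this per-asset in the editor, no recompile needed.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Pickup")
    float RespawnDelay = 5.0f;

    // Performance-sensitive or authoritative logic stays in C++,
    // but remains callable from Blueprint graphs.
    UFUNCTION(BlueprintCallable, Category = "Pickup")
    void Collect();

    // Cosmetic reaction (effects, sound) is implemented in Blueprint.
    UFUNCTION(BlueprintImplementableEvent, Category = "Pickup")
    void OnCollected();
};
</pre> <p>In this arrangement, C++ calls <strong>OnCollected</strong> from inside <strong>Collect</strong> after applying the gameplay rules, and a designer overrides the event in the Blueprint subclass without ever touching code.</p>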
<br /> <br /> To learn more about the use of Blueprints and C++ in game development, check out all the resources for <a href="https://docs.unrealengine.com/en-us/Resources/SampleGames/ARPG" target="_blank">Action RPG</a>.blueprintscommunityeducationfeaturesgameslearningnewsprogrammingtutorialsSam DeiterFri, 17 Aug 2018 17:30:00 GMThttps://www.unrealengine.com/blog/balancing-blueprint-and-chttps://www.unrealengine.com/blog/balancing-blueprint-and-cDive Head First into the Wacky World of Iguanabee’s HeadsnatchersWe’ve all seen them, those videos of crazy Japanese game shows making their players do the wackiest of tasks. Taking inspiration from such shows as Takeshi&#39;s Castle, MXC, and AKBingo!, Chilean indie developer IguanaBee decided to make a video game based on the quirky Japanese variety show concept. <br /> <br /> In <a href="https://www.iceberg-games.com/headsnatchers/" target="_blank">Headsnatchers</a> you certainly won’t find anything as crazy (or as gross!) as the AKBingo! <a href="https://youtu.be/r6PecCB6Vs8?t=1m43s" target="_blank">‘Blowing Cockroach’</a> game, but you&#39;ll find 25 unique arenas that allow you to do everything from using your opponents’ heads as bowling balls to flushing their noggins down a toilet. It’s an absolute riot to play with your friends, vividly brought to life with Unreal Engine 4.<br /> <br /> The game launched into Early Access on July 24, and you can hit up <a href="https://store.steampowered.com/app/797410/Headsnatchers/" target="_blank">Steam</a> to take a look for yourself. In the meantime, we had a chance to interview Daniel Winkler, Co-founder of IguanaBee. The Headsnatchers Lead Programmer discusses everything from inspiration to his most effective and favorite tools within the Unreal Engine 4 suite. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/YGEtJ2I0ofU" width="100%"></iframe></div> <strong>IguanaBee is a small but talented indie studio based out of Chile. Tell us what brought the team together and what you hope to achieve as you develop your games.</strong><br /> <br /> We’re hungry to make unique games. That’s the formula that brought us together. In spite of the inherent difficulties of being an indie dev team coming from a Latin country like Chile, we have been working hard and have a huge passion to deliver amazing experiences to our players. We seek to push the limits of our talents and skills with every game we make.<br />  <br /> <strong>Headsnatchers appears to be strongly inspired by Japanese game shows. What can you tell us about this inspiration? Were there any shows particularly inspiring to you?</strong><br /> <br /> In recent years we have been traveling to Japan and we love the country. That inspired us to mix Japanese culture into our games. Indeed, Japanese game and variety shows have been a source of great inspiration for Headsnatchers, especially in terms of the ridiculous tasks contestants need to perform.<br /> <img alt="Headsnatchers-Screenshot-10.png" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fdive-head-first-into-the-wacky-world-of-iguanabee-s-headsnatchers%2FHeadsnatchers-Screenshot-10-1600x900-4abcfe992694c55811bcf0aa57d2e24ae20d11ff.png" width="100%" /><br /> <strong>Keeping with this Japanese game-show inspiration, how well do you feel this translated into video game form?</strong><br /> <br /> Our main goal was to create a game that would be fun for four players.
Japanese game shows are the epitome of fun, and taking inspiration from them opened up a lot of possibilities to create all kinds of crazy situations.  <br />  <br /> <strong>When creating so many varied arenas and games, which Unreal Engine 4 tool was the most useful to you?</strong><br /> <br /> Well, for the levels themselves, <a href="https://docs.unrealengine.com/en-us/Engine/Sequencer/Overview" target="_blank">Sequencer</a> was a great help, letting our animators produce interesting in-game intros in a comfortable way. Also, for creating the 100+ unique heads the way we desired (with physical animations), the <a href="https://docs.unrealengine.com/en-us/Engine/Physics/PhAT" target="_blank">PhAT</a> was a huge help.<img alt="Headsnatchers-Screenshot-2.png" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fdive-head-first-into-the-wacky-world-of-iguanabee-s-headsnatchers%2FHeadsnatchers-Screenshot-2-1600x900-a14997536080ebccecd9ba9bed61e0d6971b00c8.png" width="100%" /><br /> <strong>What was the creation process like when coming up with so many different games and arenas?</strong><br /> <br /> We follow Chef Gusteau’s (of Pixar’s Ratatouille!) philosophy that "anyone can cook". We have brainstorming sessions where anyone can come up with their own ideas on how to make the game funnier. In those sessions, we receive suggestions for new levels and then work on shaping them into the form you end up playing. Even during development, if someone comes up with an interesting and fun idea on how to improve a level, we evaluate and potentially implement it.<br />  <br /> <strong>With each game having its own set of rules and logic, did you have to start from scratch on each one or were you able to use some Unreal Engine 4 tricks? </strong><br /> <br /> Thanks to the <a href="https://docs.unrealengine.com/en-us/Engine/Blueprints" target="_blank">Blueprints</a> tool, we were able to reuse a lot of the actors and other classes we created. So, when making a new level, we always look at the already-created code and Blueprints and reuse them in a smart way whenever possible. <br /> <img alt="Headsnatchers-Screenshot-9.png" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fdive-head-first-into-the-wacky-world-of-iguanabee-s-headsnatchers%2FHeadsnatchers-Screenshot-9-1600x900-91b7ce5388c3418f8a03055c0dc8b618c3622e93.png" width="100%" /><br /> <strong>Did you have a favorite tool from Unreal Engine 4? What was it and why?</strong><br /> <br /> The <a href="https://docs.unrealengine.com/en-us/Engine/Animation/AnimBlueprints" target="_blank">Animation Blueprint</a> is a very complete tool that helped us focus on what is really important, while allowing us to improve the game by adding cool stuff using its capabilities. The Animation Blueprint is far better than the animation tools of other engines. The other very useful tool was <a href="https://docs.unrealengine.com/en-us/Gameplay/DataDriven" target="_blank">Data Tables</a>. Data Tables are a great way to maintain structures in an ordered way, while making it very easy to tweak values without needing to recompile.<br />
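<p>For readers who haven’t used them: a Data Table’s rows are defined by a USTRUCT derived from FTableRowBase, so designers edit values in the editor (or import them from CSV) while code and Blueprints look rows up by name. Here is a minimal sketch with invented names, in the spirit of the game’s unlockable heads:</p> <pre>
#include "CoreMinimal.h"
#include "Engine/DataTable.h"
#include "HeadStats.generated.h"

// One row per head; values are edited in the Data Table editor
// and tweaked without triggering a code recompile.
USTRUCT(BlueprintType)
struct FHeadStatsRow : public FTableRowBase
{
    GENERATED_BODY()

    UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "Head")
    float Weight = 1.0f;

    UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "Head")
    float BounceFactor = 0.5f;
};
</pre>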
<br /> <strong>In the trailer, I noticed mention of winning prizes on the Headsnatchers Show! Is this a component of online play and what can you tell us about it?</strong><br /> <br /> The Headsnatchers Show is a local multiplayer game mode where you are part of a TV show with a host. There, the players compete to win a “car” or “what is inside the mystery box”. Of course, the mystery box allows you to unlock really fun in-game content.<br /> <img alt="Headsnatchers-Screenshot-5.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fdive-head-first-into-the-wacky-world-of-iguanabee-s-headsnatchers%2FHeadsnatchers-Screenshot-5-1600x900-eca8824ce8cf8b7d8e606e8c8953d3668ea0658e.jpg" width="100%" /><br /> <strong>Headsnatchers has released into Early Access, meaning that there&#39;s more on the way before you hit that 1.0 mark. What else do you have in store for players who jump into the game?</strong><br /> <br /> We’re currently working on adding online-mode support to more and more levels, and improving the game using feedback from the people playing it.<br /> <br /> <strong>If you could offer any piece of advice to someone jumping into Unreal Engine 4 for the first time, what would it be?</strong><br /> <br /> Learn about the tools that Unreal Engine 4 provides. They are very complete and robust, so don’t try to reinvent the wheel!<br /> <img alt="Headsnatchers-Screenshot-8.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fdive-head-first-into-the-wacky-world-of-iguanabee-s-headsnatchers%2FHeadsnatchers-Screenshot-8-1600x900-241c9b8b42898a8dcdef3d5f913b9a41a50f3fee.jpg" width="100%" /><br /> <strong>Where can people go to stay on top of everything Headsnatchers and IguanaBee?</strong><br /> <br /> We can be found on <a href="https://steamcommunity.com/app/797410" target="_blank">Steam</a>, <a href="https://www.reddit.com/r/Headsnatchers/" target="_blank">Reddit</a>, <a href="https://twitter.com/HeadsnatchersKO" target="_blank">Twitter</a>, <a href="https://www.facebook.com/headsnatchers/" target="_blank">Facebook</a> and of course our <a href="https://www.iceberg-games.com/headsnatchers/" target="_blank">official website</a>, but the most direct line of communication is through <a href="https://discord.io/headsnatchers" target="_blank">our Discord</a>.gamescommunityshowcaseShawn PetraschukThu, 16 Aug 2018 14:25:47 GMThttps://www.unrealengine.com/blog/dive-head-first-into-the-wacky-world-of-iguanabee-s-headsnatchershttps://www.unrealengine.com/blog/dive-head-first-into-the-wacky-world-of-iguanabee-s-headsnatchersUnreal Engine 4 Instructor Guides are Now AvailableWhile summer vacation welcomes a relaxing break from the rigors of teaching, this is also the perfect time for educators to add valuable Epic-approved content to their lectures. With Unreal Engine developers in such high demand throughout the world, more and more academic institutions are looking to bring UE4 into their classrooms. Whether it’s learning the fundamentals of game development, developing virtual reality games, apps, and movies, or learning Blueprints, Epic has got you covered!
Introducing our official <a href="https://www.unrealengine.com/en-US/educators/resources" target="_blank">Unreal Engine Instructors’ Guides</a> - free to use for all educators.<br /> <img alt="educator-blog-header2.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-4-instructor-guides-are-now-available%2Feducator-blog-header2-1920x960-14b70e0ca7bd6aaef797499dbce49e532777916b.jpg" width="100%" /> <h2><strong>Teach Your Students How to Use Unreal Engine 4</strong></h2> <br /> The Unreal Engine 4 Instructors’ Guides were designed for a seamless transition from textbook to lecture, as they cover the same tools used by professional developers to ship successful games. Instructors can now access these professionally-authored courses to add to their current teaching materials, all for free! These courses include lectures, quizzes, and tests. You can start teaching Unreal Engine 4 this coming academic term with these fully-featured Instructors’ Guides. <h2><strong>What’s Inside the Guides?</strong></h2> <img alt="whats-inside-the-guides-image.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-4-instructor-guides-are-now-available%2Fwhats-inside-the-guides-image-1640x1200-737851b88d899be291d61b67036587420885425d.jpg" width="100%" /><br /> These guides include:<br /> <br /> • <strong>Exercises</strong> to promote active learning and engagement<br /> • Complete <strong>lectures</strong> and <strong>class notes</strong> in PowerPoint that can be added into your own lectures<br /> • Ready-to-use Unreal Engine 4 <strong>game project files</strong> and <strong>example content</strong><br /> • Summative and formative <strong>assessments</strong>: premade weekly quizzes plus midterm and final exams, complete with answer keys<br /> • An <strong>Unreal Engine terminology section</strong> - a glossary for each unit that covers the need-to-know terminology and concepts to ensure lectures can be taught effectively <h2><strong>How Do I Use the Guides?</strong></h2> <strong>Usage Rights</strong><br /> <br /> Usage rights are under <a href="https://creativecommons.org/" target="_blank">Creative Commons</a> licensing. You are free to use these guides according to these <a href="https://creativecommons.org/licenses/by-nc/4.0/legalcode" target="_blank">licensing terms</a>.
For more information, please see the Creative Commons summary of the <a href="https://creativecommons.org/licenses/by-nc/4.0/" target="_blank">license</a>.<br /> <br /> <strong>Incorporate Into Your Own Curriculum</strong><br /> <img alt="incorporate-curriculum-img.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-4-instructor-guides-are-now-available%2Fincorporate-curriculum-img-1640x1200-f1aba70d35c295a4952e78babfe5276aa5ee874b.jpg" width="100%" /><br /> While using the Unreal Engine 4 Instructors’ Guides, you are free to:<br /> <br /> • Deploy them directly into the classroom and/or incorporate them into your own curriculum<br /> • Share - copy and redistribute the material in any medium or format<br /> • Adapt - remix, transform, and build upon the material<br />   <h2><strong>The Guides</strong></h2> <div style="text-align: center;"><img alt="Unreal-Engine-VR-CookBook-Original-Cover.png" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-4-instructor-guides-are-now-available%2FUnreal-Engine-VR-CookBook-Original-Cover-1024x1024-241c19419a1f5dd5ff329b99862da473917da642.png" width="100%" /></div> <strong>Unreal Engine 4 VR Cookbook - Available Now</strong><br /> <br /> This guide is based on the book Unreal Engine VR Cookbook: Developing Virtual Reality with UE4 by Mitch McCaffrey. It features 12 lectures with accompanying quizzes, terminology sheets, and associated project files, and teaches the fundamentals of VR development from an Unreal Engine-centric approach. <div style="text-align: center;"><img alt="Game-Development-with-Unreal-Engine-4-Cover.png" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-4-instructor-guides-are-now-available%2FGame-Development-with-Unreal-Engine-4-Cover-1024x1024-1daafe1f4f075b86c08682a17eff26ce5c0b4765.png" width="100%" /></div> <strong>Game Development with Unreal Engine 4 - Available Now</strong><br /> <br /> Written by Epic Games&#39; Chris Murphy, this guide includes 12 detailed lectures and a complete teacher&#39;s guide with notes, terminology, and sample content. This guide will be very helpful for building your game-development courses.<img alt="Sams-Teach-Yourself-Cover.png" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-4-instructor-guides-are-now-available%2FSams-Teach-Yourself-Cover-1024x1024-ac12c32a027769b439e82e7f6f8378e8023b545c.png" style="text-align: center;" width="100%" /><br /> <br /> <strong>Teach Yourself Unreal Engine 4 - Available Now</strong><br /> This guide is based on the book Sams Teach Yourself Unreal Engine 4 Game Development in 24 Hours by Aram Cookson, Ryan DowlingSoka and Clinton Crumpler. The guide features 22 lectures with corresponding quizzes, tests, and content, covering Unreal Engine&#39;s fundamental features.
<div style="text-align: center;"><img alt="Unity-Cover.png" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-4-instructor-guides-are-now-available%2FUnity-Cover-1024x1024-52429d6973eb3566e0274b69cfc7c796db81f080.png" width="100%" /></div> <strong>Unreal Engine 4 for Unity Users - Coming Soon</strong> <div style="text-align: center;"><img alt="Unreal-Engine-4-for-Design-and-Visualization-Cover.png" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-4-instructor-guides-are-now-available%2FUnreal-Engine-4-for-Design-and-Visualization-Cover-1024x1024-5d1b051ed57d9a135189d4b40ca876b41548955d.png" width="100%" /></div> <strong>Unreal Engine 4 for Design and Visualization - Coming Soon</strong><br /> <strong>Unreal Engine 4 Blueprints Guide - Coming Soon</strong><br />  <br /> Visit the <a href="https://www.unrealengine.com/en-US/educators/resources" target="_blank">educator’s resource section</a> of our site to download the available guides today!educationcommunitynewstutorialslearningMelissa RobinsonThu, 16 Aug 2018 14:00:00 GMThttps://www.unrealengine.com/blog/unreal-engine-4-instructor-guides-are-now-availablehttps://www.unrealengine.com/blog/unreal-engine-4-instructor-guides-are-now-availableStream MotionBuilder Animation to Unreal Engine with Live LinkMotionBuilder, known to its users as MoBu, has long been a favorite of character animators, whether working with motion capture data or animating by hand. Now, with the MotionBuilder Live Link plugin, there’s no need to export/import MoBu data to Unreal Engine while you’re working - you can stream MoBu motion capture and animation directly into Unreal Engine and preview it in real time.<br /> <img alt="Blog-share-img.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fstream-motionbuilder-animation-to-unreal-engine-with-live-link%2FBlog-share-img-1200x630-847d20818c2494d5b937f98ad05fb325e8d75a8f.jpg" width="100%" /><br /> The free Live Link plugin has been production-tested and is ready to use with Unreal Engine. “We&#39;ve used the plugin on four of our virtual production shoots in the last couple of months, all of them streaming our mocap into Unreal Engine,” says Richard Graham, CaptureLab Studio Manager at creative house <a href="https://www.framestore.com/" target="_blank">Framestore</a>. “If you&#39;re used to using MotionBuilder with live mocap, it&#39;s very easy; you&#39;re just using the HIK tool to push motion from source to target. It just works.”<br /> <br /> Visualization house <a href="http://thethirdfloorinc.com/" target="_blank">THE THIRD FLOOR</a> uses MotionBuilder Live Link to control multiple Unreal Engine sessions at a time from a single MotionBuilder machine, generating a live composite, projecting accurate shadows, and controlling lighting elements. “The ability to retarget animation and blend clip changes while seeing the high-quality render in Unreal Engine improves the overall animation workflow and turnaround time,” says Addison Bath, Global Head of R&D at THE THIRD FLOOR. 
“Live Link is a great platform to empower our rapid visualization workflow.”<br /> <img alt="Blog-body-img2.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fstream-motionbuilder-animation-to-unreal-engine-with-live-link%2FBlog-body-img2-1640x1000-a3854566017935b3cfc933db3768b52920e66fa6.jpg" width="100%" /> <div style="text-align: center;"><em>MotionBuilder Live Link in action</em><br />  </div> The MotionBuilder Live Link plugin works with Unreal Engine 4.19 and above, and is offered free of charge. To get the plugin, <a href="https://github.com/ue4plugins/MobuLiveLink" target="_blank">download MotionBuilder Live Link from GitHub</a>. For information on installing and using the plugin, see the topic <a href="https://docs.unrealengine.com/en-us/Engine/Animation/Live-Link-Plugin/ConnectingLiveLinktoMobu" target="_blank">Connecting Live Link to MoBu</a> in the Unreal Engine documentation.<br /> <br /> Live Link is a framework for which MotionBuilder is just one integration. If you have custom external software or hardware that you want to interface with Unreal Engine, you can write your own plugin that integrates Live Link and get all the same benefits. See our <a href="https://docs.unrealengine.com/en-us/Engine/Animation/Live-Link/Live-Link-Plugin-Development" target="_blank">Live Link Plugin Development Guide</a> for details.
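<p>To give a sense of the shape of such a plugin: a custom source implements the engine’s ILiveLinkSource interface and pushes subject frames to the client it is handed. The outline below is only a rough sketch based on the 4.20-era interface - the class is hypothetical, and the exact method signatures should be verified against the Live Link Plugin Development Guide before you build on them:</p> <pre>
#include "ILiveLinkSource.h"
#include "ILiveLinkClient.h"

// Hypothetical source that would forward frames from custom hardware.
class FMyHardwareLiveLinkSource : public ILiveLinkSource
{
public:
    // Live Link hands the source a client connection and an identifying GUID.
    virtual void ReceiveClient(ILiveLinkClient* InClient, FGuid InSourceGuid) override
    {
        Client = InClient;
        SourceGuid = InSourceGuid;
        // A background reader would now build frames from the hardware
        // stream and push them to the client under a subject name.
    }

    virtual bool IsSourceStillValid() override { return Client != nullptr; }
    virtual bool RequestSourceShutdown() override { Client = nullptr; return true; }

    virtual FText GetSourceType() const override { return NSLOCTEXT("MyHW", "Type", "My Hardware"); }
    virtual FText GetSourceMachineName() const override { return NSLOCTEXT("MyHW", "Machine", "Local"); }
    virtual FText GetSourceStatus() const override { return NSLOCTEXT("MyHW", "Status", "Active"); }

private:
    ILiveLinkClient* Client = nullptr;
    FGuid SourceGuid;
};
</pre>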
communityenterprisefeaturesfilm and televisionnewsprogrammingDavid HibbittsWed, 15 Aug 2018 13:30:00 GMThttps://www.unrealengine.com/blog/stream-motionbuilder-animation-to-unreal-engine-with-live-linkhttps://www.unrealengine.com/blog/stream-motionbuilder-animation-to-unreal-engine-with-live-link3Lateral Reveals Digital Human and Character Performance Advancements in UE4Earlier this year at the Game Developers Conference, we partnered with <a href="http://www.3lateral.com/" target="_blank">3Lateral</a> to reveal a stunning development in real-time, human-driven digital characters, featuring Andy Serkis. Powered by Unreal Engine and 3Lateral’s Meta Human Framework, we were able to create both a <a href="https://www.unrealengine.com/en-US/enterprise/blog/epic-games-and-3lateral-introduce-digital-andy-serkis" target="_blank">digital Andy</a> and the digital character Osiris Black using volumetric data capture of Serkis’ "Macbeth" performance, achieving unprecedented levels of fidelity and realism. These demos show how one set of performance capture data can drive the creation of two vastly different digital characters.<br /> <br /> Today 3Lateral has unveiled the next iteration in this story. Now, you can watch as Osiris Black seamlessly morphs into digital Andy while being driven by the same Macbeth performance. 3Lateral’s comprehensive technology framework uses Unreal Engine to power 4D volumetric capture, and empowers much more than just extraction of data for animation – it provides detailed information about the facial deformations and motions to build fully valid animation curves, a next-generation process compared to 2D and 3D facial tracking and solving. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/3kxZPyUPtIc" width="100%"></iframe></div> This new demo illustrates 3Lateral&#39;s Rig Logic and Gene Splicer technologies, components of the Meta Human Framework, that enable the transition between alien and human digital DNAs. Furthermore, 3Lateral has developed a unique file format for comprehensive and memory-light digital representation of distinct characters. The morph video also shows how multiple different characters along the transition from alien to human are driven with the same animation curves. This is one more proof point of the empowering nature of 3Lateral’s technologies. <br /> <br /> This gathering of annotated data and this level of photorealism have wider implications for machine learning, and are applicable in entertainment, biometrics, medicine, and many other fields. We’re blown away by 3Lateral’s capabilities, and look forward to seeing what they&#39;re able to achieve next.<br />  enterpriseeventsfeaturesnewsfilm and televisionDana CowleyTue, 14 Aug 2018 17:06:04 GMThttps://www.unrealengine.com/blog/3lateral-reveals-digital-human-and-character-performance-advancements-in-ue4https://www.unrealengine.com/blog/3lateral-reveals-digital-human-and-character-performance-advancements-in-ue4Announcing Unreal Engine Online Learning<p>Epic is happy to introduce the <a href="https://academy.unrealengine.com/" target="_blank">Unreal Engine Online Learning</a> platform, the new home for all our training series and video tutorials on Unreal Engine. Our extensive online training illustrates common workflows in a series of detailed yet easy-to-follow videos so you can master Unreal Engine for your own projects.</p> <p><img alt="UEOL_Announce_Hompage_v3.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fannouncing-unreal-engine-online-learning%2FUEOL_Announce_Hompage_v3-1595x826-e700ed33d9e6b7a77dccb73507da9b364ff6778a.jpg" width="100%" /></p> <p>Unreal Engine Online Learning content is split into several tracks:</p> <ul> <li>Game Development</li> <li>Architecture</li> <li>Industrial Design</li> <li>Media and Entertainment</li> </ul> <img alt="UEOL_Pic1.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fannouncing-unreal-engine-online-learning%2FUEOL_Pic1-1600x976-77e76dca28ea1e8f65e40d679de7556ef7878c44.jpg" width="100%" /> <p>Additional tracks sort content by job roles like Designer or Programmer, and each series is labeled with levels from Getting Started to Master Level. Videos are available on demand, and series are broken into short chunks for convenient learning anytime, anywhere.</p> <p>This new platform includes a lot of the great video content you’ve seen on our website in the past, plus dozens of new videos on common workflows, new features, and a whole lot more! The learning platform is open to everyone, and offered free of charge. More content will be added regularly.</p> <p>You can access the Unreal Engine Online Learning platform through the Video Tutorials option under the Learn tab at the Unreal Engine website.
<a href="https://academy.unrealengine.com/">Check out the videos</a> and get started on your journey to mastering Unreal Engine!</p> communityeducationenterpriselearningnewstutorialsvisualization and trainingDana CowleyTue, 14 Aug 2018 13:32:52 GMThttps://www.unrealengine.com/blog/announcing-unreal-engine-online-learninghttps://www.unrealengine.com/blog/announcing-unreal-engine-online-learningPorsche, NVIDIA and Epic Games Reveal ‘The Speed of Light’ for Porsche 911 Speedster ConceptIn a breakthrough demonstration today at the SIGGRAPH conference, NVIDIA CEO Jensen Huang revealed "The Speed of Light," a real-time cinematic experience utilizing NVIDIA Turing architecture, RTX technology and new Unreal Engine rendering advancements, featuring the Porsche 911 Speedster Concept. The forward-looking technology unveiled onstage is the culmination of a joint development effort to unlock offline-quality ray-traced rendering in a game engine. <br /> <br /> "The Porsche 911 Speedster Concept is the first car to be visualized with interactive real-time ray tracing," said Francois Antoine, Director of HMI at Epic Games, and creative director and VFX supervisor on the project. "In concert with NVIDIA we’re accelerating the adoption of real-time ray tracing across many industries.”  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/Z85aPqqJzs0" width="100%"></iframe></div> “NVIDIA RTX technology is purpose-built to provide a generational leap in the quality of real-time computer graphics, and ‘The Speed of Light’ perfectly demonstrates that we are delivering on our promises,” said Tony Tamasi, SVP of Content and Technology at NVIDIA. “With Unreal Engine it’s possible to create physically-accurate, realistic scenes that feature a level of fidelity and detail we’ve previously only been able to imagine.”<br /> <br /> "Porsche&#39;s collaboration with Epic and NVIDIA has exceeded all expectations from both a creative and technological perspective," said Christian Braun, Manager of Virtual Design at Porsche. "The achieved results are proof that real-time technology is revolutionizing how we design and market our vehicles."<br /> <img alt="Speed_of_Light_Hero_1_1920.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fporsche-nvidia-and-epic-games-reveal-the-speed-of-light-for-porsche-911-speedster-concept%2FSpeed_of_Light_Hero_1_1920-1920x873-0006adad2a4df33615d426901e38ae89a7a5d25a.jpg" width="100%" /><br /> Porsche is known for experimenting with the most advanced design techniques, and Braun has always envisioned a pipeline featuring a single toolset at its core. The enabling of real-time ray tracing with ray-traced diffuse global illumination in Unreal Engine has made it possible for Porsche to visualize this latest concept car, which will be made available to consumers. 
The experimental real-time ray tracing features used to create “The Speed of Light” will be built into a future Unreal Engine 4 release to benefit the entire development community.<br /> <img alt="Speed_of_Light_Silhouette_1920.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fporsche-nvidia-and-epic-games-reveal-the-speed-of-light-for-porsche-911-speedster-concept%2FSpeed_of_Light_Silhouette_1920-1920x873-5aa652de8a3a9dd3228cddb63472a56bfa4f5bbd.jpg" width="100%" /><br /> “When you see the quality of the cinematic, it’s remarkable to note that no baking or lightmaps were required. There is no precomputed lighting—it’s all fully dynamic for both objects and light,” said Antoine. “When we’re designing cars, we have to explore every option from every angle, and having the ability to review fully visualized renders in real time has completely transformed the way we ideate.”<br /> <img alt="Speed_of_Light_Wheel_Detail_1920.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fporsche-nvidia-and-epic-games-reveal-the-speed-of-light-for-porsche-911-speedster-concept%2FSpeed_of_Light_Wheel_Detail_1920-1920x1164-c32bb31fbc81f9882203607ebc1fe5ca5411f3bb.jpg" width="100%" /><br /> Bringing dynamic global illumination and ray-traced lighting to Unreal Engine is a critical development for users who work with very large datasets, and the advancements will benefit creators in game development, filmmaking, architecture, design, manufacturing, AR/VR and simulation. <br /> <br /> “The Speed of Light” demo debuted running on two NVIDIA Quadro RTX cards. New Unreal Engine features demonstrated include: <ul> <li>Ray-traced translucency</li> <li>Ray-traced rectangular area light shadows</li> <li>Ray-traced reflections</li> <li>Ray-traced diffuse global illumination</li> <li>Dynamic textured area lights</li> </ul> “With Turing architecture NVIDIA have shattered the photorealism barrier that current-generation rasterizing techniques have presented until now,” said Kim Libreri, CTO, Epic Games. “Just as we saw with the movie business over a decade ago, ray tracing is going to revolutionize the realism of real-time applications, cinematic experiences and high-end games. Now, we will see artists and designers using Unreal Engine technology to create, view and interact with content that is indistinguishable from reality.”newscommunityenterprisefilm and televisionDana CowleyMon, 13 Aug 2018 22:00:00 GMThttps://www.unrealengine.com/blog/porsche-nvidia-and-epic-games-reveal-the-speed-of-light-for-porsche-911-speedster-concepthttps://www.unrealengine.com/blog/porsche-nvidia-and-epic-games-reveal-the-speed-of-light-for-porsche-911-speedster-conceptThe Future Group and The Weather Channel Create Lightning and Tornadoes with Unreal Engine<em>Weather can be dangerous.</em> That’s what <a href="https://weather.com/tv" target="_blank">The Weather Channel</a> has been telling us for years through videos and stories on their TV channel and website. We’ve been able to witness the most perilous types of weather—hurricanes, tornadoes, tsunamis—through videos, commentary from meteorologists, and abstract maps overlaid with arrows and text.<br /> <br /> While such coverage can provide important information, it doesn’t necessarily give you the full sense of what’s happening and, more importantly, how you can keep yourself safe from lightning, falling power lines, flying debris, and other storm hazards.
These aren’t abstract concepts like maps with arrows and text - these are painfully real consequences.<br /> <br /> <strong>Making it Real (or Unreal)</strong><br /> <br /> This is where augmented reality (AR) and mixed reality (MR) come in, combining live footage and photoreal, computer-generated objects (trees, cars, etc.) to create an immersive, three-dimensional experience that can mimic any real-life environment. Additional visuals like signs, arrows, and text can also be added. Such technology can take us inside a storm to move around at will, dodging flood waters, lightning, and flying objects, and spying pop-up safety tips about the hazards. This transforms an abstract conversation into one that drives home the risks and consequences of these life-threatening events.<br /> <img alt="Blog-body-img1.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fthe-future-group-and-the-weather-channel-create-lightning-and-tornadoes-with-unreal-engine%2FBlog-body-img1-1640x1000-123df3fa5ae45a1969c3d6f1a2936dff0fab66a7.jpg" width="100%" /> <div style="text-align: center;"><em>How fast and hot is lightning? Mike Betts tells you all about it while mixed reality shows its effects.</em></div> <br /> To move toward such a future, The Weather Channel has partnered with <a href="https://www.futureuniverse.com/" target="_blank">The Future Group</a> to incorporate mixed reality into their broadcasts. Two videos from this partnership were recently shown on The Weather Channel, the first of many custom experiences that The Future Group will provide to TWC using their Frontier technology powered by Unreal Engine for real-time rendering.<br /> <br /> In the first video, which aired on The Weather Channel in June 2018, TWC Meteorologist Jim Cantore narrates the effects of a virtual tornado that appears to be approaching the studio. As the tornado gets close enough to pull up trees and throw around large objects, a fizzing power line falls at Cantore’s feet, and up pops safety information about avoiding such a hazard. The pi&egrave;ce de r&eacute;sistance comes at the end, when a virtual car is thrown by the tornado and flies into the studio, crashing at Cantore’s feet to billow smoke and dust from its mangled body.<br />   <div style="padding:56.25% 0 0 0;position:relative;"><iframe allowfullscreen="" frameborder="0" mozallowfullscreen="" src="https://player.vimeo.com/video/276077931" style="position:absolute;top:0;left:0;width:100%;height:100%;" webkitallowfullscreen=""></iframe></div> <div style="text-align: center;"><em>Jim Cantore dodges tornado hazards, including a flying car.</em></div> <br /> In a more recent immersive MR video, TWC meteorologist Mike Betts explains why trees explode when struck by lightning. Betts stands just a few feet away from a virtual lightning strike as it replays in slow motion. Photoreal, real-time computer graphics are what make these illusions possible. <br />   <div style="padding:56.25% 0 0 0;position:relative;"><iframe allowfullscreen="" frameborder="0" mozallowfullscreen="" src="https://player.vimeo.com/video/282676985" style="position:absolute;top:0;left:0;width:100%;height:100%;" webkitallowfullscreen=""></iframe></div> <script src="https://player.vimeo.com/api/player.js"></script> <div style="text-align: center;"><em>Mike Betts explains lightning and its dangers.</em></div> <br /> The hallmark of MR, as opposed to traditional technology like compositing, is that the graphics and effects are generated in real time. 
As the camera moves around, the perspective of the computer-generated elements changes as well, cementing the illusion that the virtual elements are actually there. Storm-related phenomena that are too dangerous to capture up close on video can be simulated and navigated in real time, and integrated seamlessly with live video and text even with a moving camera.<br /> <br /> While The Weather Channel’s videos could have also been created with traditional rendering and compositing techniques, matching up all the perspectives for live video and pre-rendered sequences would have taken orders of magnitude longer than it did with real-time mixed reality techniques. Using a real-time process means a change, tweak, or do-over takes minutes instead of weeks, even with the photoreal objects and special effects shown in the videos.<br /> <br /> <strong>The Future of MR</strong><br /> <br /> The Future Group, focused on immersive mixed-reality experiences, anticipates providing The Weather Channel with many hyper-real storms, tornadoes, and hurricanes in the future. <br /> <br /> “We’re changing the landscape of what can be done with broadcast,” says Rob DeFranco, VP of Sales and Development at The Future Group. “Mixed reality is providing the ability to tell new stories, and it can be used by any business that wants to engage their audience with more emotional stories.”<br /> <img alt="Blog-body-img2.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fthe-future-group-and-the-weather-channel-create-lightning-and-tornadoes-with-unreal-engine%2FBlog-body-img2-1640x1000-674a8fe94ea798b4d4072d2c0b62dd047464ae1c.jpg" width="100%" /> <div style="text-align: center;"><em>Watch out, Jim! Tornadoes can carry large, heavy objects over great distances.</em></div> <br /> The Future Group, which aims to be the number one provider of real-time visual effects, appreciates the power of Unreal Engine to achieve their goals. “Besides the powerful real-time capabilities of Frontier powered by UE4, The Future Group leverages our skilled team of VFX, gaming, broadcast tech and creative artists who truly understand how to create breakthrough experiences for broadcast,” says Lawrence Jones, EVP of the Americas/ECD of The Future Group.  <br /> <br /> Oystein Larsen, VP Virtual at The Future Group, adds that when it came down to pure rendering quality and ease of use, Unreal Engine 4 was the obvious choice. “Our selection of the engine for real-time visual effects gave us the tools to really push the edge of live production,” he says.<br /> <br /> “Epic’s team really understands our needs. This gives our team not only the support and tools needed, but also the option to extend the amazing toolset for each and every project.”<br /> <br /> Want to create your own mixed reality experience?
<a href="https://www.unrealengine.com/en-US/studio" target="_blank">Get Unreal Studio today</a> to import your scenes to Unreal Engine!<br /> <br /> <em>enterprise, film and television, news, ar, showcase | Ken Pimentel | Fri, 10 Aug 2018 16:00:00 GMT | https://www.unrealengine.com/blog/the-future-group-and-the-weather-channel-create-lightning-and-tornadoes-with-unreal-engine</em> <h1><strong>Epic Games, Unreal Engine and the Academy Software Foundation</strong></h1> Today sees the launch of the <a href="http://www.aswf.io/" target="_blank">Academy Software Foundation</a>, a non-profit organization designed to provide funding, structure and infrastructure for the open source community in the motion picture industry. We at Epic couldn’t be more pleased to be a founding member. Details can be found in the official announcement below.<br /> <br /> <img alt="FB_ASF.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fepic-games-unreal-engine-and-the-academy-software-foundation%2FFB_ASF-1200x630-18f1a22f14fb2d156eaab08c8450a43bcb9d1cd4.jpg" width="100%" /> <div style="text-align: center;"><strong>Academy of Motion Picture Arts and Sciences and The Linux Foundation Launch the Academy Software Foundation</strong></div>   <div style="text-align: center;"><em>Founding members include Animal Logic, Autodesk, Blue Sky Studios, Cisco, DNEG, DreamWorks, Epic Games, Foundry, Google Cloud, Intel, SideFX, The Walt Disney Studios, and Weta Digital</em></div>  <br /> SAN FRANCISCO and LOS ANGELES, August 10, 2018 – The <a href="http://www.oscars.org/" target="_blank">Academy of Motion Picture Arts and Sciences</a> and <a href="http://www.linuxfoundation.org/" target="_blank">The Linux Foundation</a> today launched the <a href="http://www.aswf.io/" target="_blank">Academy Software Foundation (ASWF)</a> to provide a neutral forum for open source software developers in the motion picture and broader media industries to share resources and collaborate on technologies for image creation, visual effects, animation and sound.<br />  <br /> “We are thrilled to partner with The Linux Foundation for this vital initiative that fosters more innovation, more collaboration, more creativity among artists and engineers in our community,” said Academy CEO Dawn Hudson. “The Academy Software Foundation is core to the mission of our Academy: promoting the arts and sciences of motion pictures.”<br />  <br /> “Open Source Software has enabled developers and engineers to create the amazing visual effects and animation that we see every day in the movies, on television and in video games,” said Jim Zemlin, Executive Director of The Linux Foundation. “With the Academy Software Foundation, we are providing a home for this community of open source developers to collaborate and drive the next wave of innovation across the motion picture and broader media industries.”<br />  <br /> The ASWF is the result of a two-year investigation by the Academy’s Science and Technology Council into the use of Open Source Software (OSS) across the motion picture industry. The survey found that more than 80% of the industry uses open source software, particularly for animation and visual effects.
However, this widespread use of OSS has also created challenges, including siloed development, managing multiple versions of OSS libraries (“versionitis”), and varying governance and licensing models, all of which need to be addressed to ensure a healthy open source ecosystem.<br />  <br /> “Developers and engineers across the industry are constantly working to find new ways to bring images to life, and open source enables them to start with a solid foundation while focusing on solving unique, creative challenges rather than reinventing the wheel,” said Rob Bredow, SVP, Executive Creative Director and Head of Industrial Light & Magic and member of the Academy’s Science and Technology Council Open Source Investigation Committee. “We are very excited to launch the Academy Software Foundation and provide a home for open source developers to collaborate, regardless of where they work, and share best practices, which we believe will drive innovation across the industry.”<br />  <br /> <strong>ASWF Mission: </strong>The mission of the ASWF is to increase the quality and quantity of open source contributions by developing a governance model, legal framework and community infrastructure that lowers the barrier to entry for developing and using open source software. Developers interested in learning more or contributing can sign up to join the mailing list at <a href="http://www.aswf.io/community" target="_blank">www.aswf.io/community</a>.<br />  <br /> <strong>The ASWF Goals are to:</strong><br />  <br /> ●       Provide a neutral forum to coordinate cross-project efforts, establish best practices and share resources across the motion picture and broader media industries.<br /> ●       Develop an <a href="http://www.aswf.io/community" target="_blank">open continuous integration (CI) and build infrastructure</a> to enable reference builds from the community and alleviate issues caused by siloed development.<br /> ●       Provide individuals and organizations with a clear path for participation and code contribution.<br /> ●       Streamline development for build and runtime environments through the sharing of open source build configurations, scripts and recipes.<br /> ●       Provide better, more consistent licensing through a shared licensing template.<br />  <br /> “In the last 25 years, software engineers have played an increasing role in the most successful movies of our time,” said David Morin, Project Lead for the Academy Open Source Investigation. “The Academy Software Foundation is set to provide funding, structure and infrastructure for the open source community, so that engineers can continue to collaborate and accelerate software development for moviemaking and other media for the next 25 years.”<br />  <br /> For more information about the Academy Software Foundation, visit <a href="http://www.aswf.io/" target="_blank">http://www.aswf.io/</a>.<br />  <br /> <strong>SIGGRAPH 2018</strong><br /> <br /> The ASWF will have a significant presence at <a href="https://s2018.siggraph.org/" target="_blank">SIGGRAPH 2018</a>.
Interested developers and engineers can learn more about the ASWF by attending the <a href="https://s2018.siggraph.org/keynote-session/" target="_blank">keynote session</a> presented by Rob Bredow, SVP, Executive Creative Director and Head of ILM, on Monday, August 13, from 2:00 - 3:15 pm PT, or by following the keynote livestream on <a href="http://www.youtube.com/user/ACMSIGGRAPH" target="_blank">YouTube</a> or <a href="https://www.facebook.com/SIGGRAPHConferences" target="_blank">Facebook</a>.<br /> <br /> Conference attendees are also invited to participate in the Academy Software Foundation Birds of a Feather (BoF) session on Tuesday, August 14, from 9:00 - 10:00 am PT in the Vancouver Convention Centre, East Building, Room 11.<br /> <br /> <strong>Academy Software Foundation Supporting Premier Partners</strong><br /> <br /> Founding members of the ASWF include: Academy of Motion Picture Arts and Sciences, Animal Logic, Autodesk, Blue Sky Studios, Cisco, DNEG, DreamWorks, Epic Games, Google Cloud, Intel, Walt Disney Studios, and Weta Digital at the Premier level, and Foundry and SideFX at the General level.<br />  <br /> <strong>Animal Logic</strong><br /> “Animal Logic is proud to be an inaugural member of this industry initiative to support foundational software on which future innovations will be developed. Combined with our open-source contribution through AL_USDMaya, this signifies our commitment to giving back to the community that has provided us with so much.”<br /> <em>-        Darin Grant, CTO, Animal Logic Group</em><br />  <br /> <strong>Autodesk</strong><br /> “Autodesk understands and appreciates the diversity of our media and entertainment customers’ workflows and the critical role open source projects play in making them successful. We are very excited to collaborate with the Academy and The Linux Foundation to build this industry initiative to strengthen the projects and ecosystem we all rely on.”<br /> <em>-        Guy Martin, director of Open Source, Autodesk</em><br />  <br /> <strong>Blue Sky Studios</strong><br /> “Blue Sky Studios is thrilled to be a part of the Academy Software Foundation launch. Open source technologies give filmmakers the wonderful opportunity to collaborate and give back to the community. As a studio, we are committed to creating films that bring meaning to the world, and are excited when other filmmakers do the same. With the ASWF we have the opportunity to help each other in this goal: to create technology for ourselves and each other, so we can craft beautiful and compelling stories that make us all proud.”<br /> <em>-        Hank Driskill, Chief Technology Officer, Blue Sky Studios</em><br />  <br /> <strong>Cisco</strong><br /> “Cisco is joining the Academy Software Foundation to help fuel technology innovation in the creation and production of film and TV. Linux and OSS have become de facto standards across a wide number of industries, and we are realizing the distinct benefits for pre- and post-production as Hollywood and Silicon Valley continue to explore best practices for technology transfer.”<br /> <em>-        Dave Ward, Senior Vice President, CTO of Engineering and Chief Architect, Cisco</em><br />  <br /> <strong>DNEG</strong><br /> “Open source software projects have contributed significantly to the success and growth of the Visual Effects industry. DNEG recognises the importance of these projects in the service of our key goal: participating in the telling of inspiring and innovative stories through technology.
As such, we are delighted to support the ASWF and are excited by the prospect of collaborating on the software that will form the basis of the future of our industry.”<br /> <em>-        Graham Jack, CTO, DNEG</em><br />  <br /> <strong>DreamWorks</strong><br /> “DreamWorks is pleased to be a founding Premier member of the Academy Software Foundation. The formation of the Foundation underscores how vital open source technology is in the production industry, and how essential it is to future innovation and advancements. As an active open source contributor, and a dedicated consumer, we at DreamWorks Animation recognize the value of an oversight organization to ensure continued resources and momentum for software development.”<br /> <em>-        Andrew Pearce, VP of Global Technology, DreamWorks</em><br />  <br /> <strong>Epic Games</strong><br /> “Epic Games is committed to providing a new generation of real-time technology for the film and television industry based on standard technology and practices. We are strong supporters of open source software and open platforms, and are thrilled to be part of the new Academy Software Foundation.”<br /> <em>-        Marc Petit, General Manager, Unreal Engine Enterprise at Epic Games</em><br /> <br /> <strong>Google Cloud</strong><br /> “Google Cloud is excited to be a Premier member of the Academy Software Foundation and to help contribute to the open source standards for industry software. Our core belief in an open platform for cloud users echoes the sentiments of the VFX industry and we look forward to collaborating on projects that will benefit all users.”<br /> <em>-        Todd Prives, Product Manager, Google Cloud</em><br />  <br /> <strong>Intel Corporation</strong><br /> “As a founding member of the Academy Software Foundation, Intel will join a vibrant community of studios, and software and hardware technology leaders to spur developer innovations and enable top-quality motion picture content that is better, faster and less expensive to make. Intel’s just-announced Intel® Rendering Framework, including the software libraries Embree and OSPRay and the new OpenImageDenoise library, can be used by ASWF members to develop compelling motion pictures and visual effects with all the benefits of open source solutions on Intel® architecture-based systems.”<br /> <em>-        Lynn Comp, Vice President, Data Center Group, and General Manager, Visual Cloud Division, Intel Corporation</em><br />  <br /> <strong>The Walt Disney Studios, including Walt Disney Animation Studios, Industrial Light & Magic, Pixar Animation Studios and Marvel Studios</strong><br /> “The creation of the Academy Software Foundation is an important and exciting step for the motion picture industry. By increasing collaboration within our industry, it allows all of us to pool our efforts on common foundational technologies, drive new standards for interoperability and increase the pace of innovation.”<br /> <em>-        Nick Cannon, Chief Technology Officer, Walt Disney Animation Studios</em><br />  <br /> <strong>Weta Digital</strong><br /> “We are proud to help establish the Academy Software Foundation as a Founding member. Thoughtful curation and dedicated financial commitment to open source platforms and standards is vital for our industry as we mature. The pace of innovation in visual effects is supported by a reliable platform of technology contributions by many companies and groups.
The Academy Software Foundation will provide a unique collaboration environment that marries the interests of all parties and ensures the industry’s history of experimentation and creativity will continue well into the future.”<br /> <em>-        Joe Letteri, Sr. Visual Effects Supervisor, Weta Digital</em><br />  <br /> <strong>About The Linux Foundation</strong><br /> The Linux Foundation is the organization of choice for the world’s top developers and companies to build ecosystems that accelerate open technology development and industry adoption. Together with the worldwide open source community, it is solving the hardest technology problems by creating the largest shared technology investment in history. Founded in 2000, The Linux Foundation today provides tools, training and events to scale any open source project, which together deliver an economic impact not achievable by any one company. More information can be found at <a href="http://www.linuxfoundation.org/" target="_blank">www.linuxfoundation.org</a>.<br />  <br /> <strong>About the Academy</strong><br /> The Academy of Motion Picture Arts and Sciences is a global community of more than 8,000 of the most accomplished artists, filmmakers and executives working in film. In addition to celebrating and recognizing excellence in filmmaking through the Oscars, the Academy supports a wide range of initiatives to promote the art and science of the movies, including public programming, educational outreach and the upcoming Academy Museum of Motion Pictures, which is under construction in Los Angeles.<br />   <div style="text-align: center;"># # #</div>  <br /> The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page: <a href="https://www.linuxfoundation.org/trademark-usage/" target="_blank">https://www.linuxfoundation.org/trademark-usage</a>. Linux is a registered trademark of Linus Torvalds.<br /> <br /> <em>film and television, enterprise, news, community | Dana Cowley | Fri, 10 Aug 2018 14:00:00 GMT | https://www.unrealengine.com/blog/epic-games-unreal-engine-and-the-academy-software-foundation</em> <h1><strong>Unreal Engine Developers Earn Top Honors in the 2018 ‘Rookie of the Year’ Awards</strong></h1> <a href="http://www.therookies.co/" target="_blank">The Rookies</a> recently revealed the winners of its International Student Awards and we are pleased to announce that a student using Unreal Engine 4 took home the winning title of “Rookie of the Year” in the Game Development category.<br /> <br /> For those unfamiliar, The Rookies is an annual awards and mentor platform where creative students in games, animation, film, VR, motion graphics and architecture visualization compete for the coveted title of “Rookie of the Year”. The stakes are high as each project is evaluated by a panel of all-star judges. The Rookies is an incredible way for students to showcase their creative projects to industry leaders and transition from student to professional. <br /> <br /> The Rookies received 9,971 projects created by 2,914 students from 581 different schools in over 87 countries.
Students from around the world submitted their unique digital projects to compete for the top titles, such as Rookie of the Year (Feature Animation, Digital Illustration, Visual Effects, Game Development, Virtual Reality, 3D Motion Graphics, Product Visualization, Architecture Visualization), Film of the Year (2D Animation, 3D Animation, and Visual Effects) and Game of the Year (PC & Console, Mobile, and Virtual Reality). This year, students were also able to compete for Studio Internships and University Scholarships.<br /> <br /> We would like to congratulate all the students who submitted these amazing UE4 projects. Let’s check out some of the outstanding UE4 winners. <h2>Rookie of the Year — Game Development — Winner 2018</h2> <img alt="Peyton-Varney.png" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-developers-earn-top-honors-in-the-2018-rookie-of-the-year-awards%2FPeyton-Varney-432x424-8891dffc7cc5ba99bff5ee6c8c8401ea474fa345.png" style="width: 25%; height: 25%; float: left;" /><br /> <a href="http://www.therookies.co/entrants/peyton-varney/" target="_blank"><strong>PEYTON VARNEY</strong></a><br /> <strong>United States, Sarasota</strong> | Ringling College of Art and Design, Game Art and Design <br /> <br /> <strong>UE4 Projects</strong>: Protege, Protege: Conservatory Exterior, Protege: Conservatory Interior, Workshop, Smokey Mountains Biome. <br /> <br /> <strong>Awards</strong>: Excellence Award | Protege<br /> <img alt="peyton-varney-protege-conservatory-interior-74-1525071125.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-developers-earn-top-honors-in-the-2018-rookie-of-the-year-awards%2Fpeyton-varney-protege-conservatory-interior-74-1525071125-1160x494-ec2db1c0a7ec395a31d88d333d5eea79d7df1643.jpg" width="100%" /> <div style="text-align: center;">Protege: Conservatory Interior, Peyton Varney. Ringling College of Art + Design, Game Art and Design</div> <div>  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/iLbWA03gDqI" width="100%"></iframe></div> </div> <div style="text-align: center;">Protege, by Max Frorer, Peyton Varney, & Steven Hong.
Ringling College of Art + Design, Game Art and Design</div>   <h2>Rookie of the Year — Game Development — Runner Up 2018</h2> <img alt="Abigail-Jameson.PNG" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-developers-earn-top-honors-in-the-2018-rookie-of-the-year-awards%2FAbigail-Jameson-365x271-8690b26356ec7e86d57934ee125a6db64615fb50.PNG" style="width: 25%; height: 25%; float: left;" /><br /> <a href="http://www.therookies.co/entrants/abigail-jameson/" target="_blank"><strong>ABIGAIL JAMESON</strong></a><br /> <strong>United Kingdom, Middlesbrough</strong> | Teesside University, Computer Games Art<br /> <br /> <strong>UE4 Projects</strong>: Inhotim Lighting, Harbour Wall, Archvis Corridor, 1960’s Subway, Cadillac Motel<br /> <br /> <strong>Awards</strong>: Excellence Award | 1960’s Subway<br /> <br /> <img alt="abigail-jameson-1960s-subway-15-1523099357.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-developers-earn-top-honors-in-the-2018-rookie-of-the-year-awards%2Fabigail-jameson-1960s-subway-15-1523099357-1160x463-d306d81f1f6917c76c8ceba5fe4c46007c360902.jpg" width="100%" /> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/JBDIeZFb-_g" width="100%"></iframe></div> <h2>Rookie of the Year — Virtual Reality — Runner Up 2018</h2> <br />  <img alt="Brooke-Routh.png" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-developers-earn-top-honors-in-the-2018-rookie-of-the-year-awards%2FBrooke-Routh-396x390-33f70cd30ff95494683cd5c20f6765bb67ca54e8.png" style="width: 25%; height: 25%; float: left;" /><br /> <br /> <a href="http://www.therookies.co/entrants/brooke-routh/" target="_blank"><strong>BROOKE ROUTH</strong></a><br /> <strong>United States, Tampa</strong> | Ringling College of Art and Design, Game Art and Design <br /> <br /> <strong>UE4 Projects</strong>: Weeaboo Basement, Dreary Ally, Witch’s Hovel, Victorian Trophy Room<br /> <br /> <strong>Awards</strong>: Excellence Award | Victorian Trophy Room<br /> <div style="padding:56.25% 0 0 0;position:relative;"><iframe src="https://player.vimeo.com/video/252286293" style="position:absolute;top:0;left:0;width:100%;height:100%;" frameborder="0" webkitallowfullscreen mozallowfullscreen allowfullscreen></iframe></div><script src="https://player.vimeo.com/api/player.js"></script><br />   <div style="text-align: center;">Victorian Trophy Room by Brooke Routh, Ringling College of Art and Design, Game Art and Design</div>   <h2>Internship Finalists</h2> <div>Many of the Internship Finalists were also students using Unreal Engine. Students were judged in the categories of Overall Impression, Creative Skills, Technical Skills, Range of Skills, Complexity, Presentation, and Industry Employability.
Below are the Internship Finalists who used UE4 to create their projects.<br /> <br /> <strong>United States — Game Development — Internship Winners</strong><br />  <br /> <a href="http://www.therookies.co/entrants/beck-michaels/" target="_blank">BECK MICHAELS</a> - Savannah College of Art and Design Game Development <br /> <br /> <a href="http://www.therookies.co/entrants/giselle-valenzuela/" target="_blank">GISELLE VALENZUELA</a> - Ringling College of Art and Design Game Development<br /> <br /> <a href="http://www.therookies.co/entrants/hannah-lawler/" target="_blank">HANNAH LAWLER</a> - Gnomon Game Development<br /> <br /> <a href="http://www.therookies.co/entrants/julian-elwood/" target="_blank">JULIAN ELWOOD</a> - Gnomon Game Development<br /> <br /> <a href="http://www.therookies.co/entrants/kat-gray/" target="_blank">KAT GRAY</a> - University of Southern California School of Cinematic Arts Game Development<br /> <br /> <a href="http://www.therookies.co/entrants/kat-tamburello/" target="_blank">KAT TAMBURELLO</a> - Gnomon Game Development<br /> <br /> <a href="http://www.therookies.co/entrants/mackenzie-patrick/" target="_blank">MACKENZIE PATRICK</a> - Savannah College of Art and Design Game Development<br /> <br /> <a href="http://www.therookies.co/entrants/martin-pietras/" target="_blank">MARTIN PIETRAS</a> - Rochester Institute of Technology Game Development<br /> <br /> <a href="http://www.therookies.co/entrants/sean-graefen/" target="_blank">SEAN GRAEFEN</a> - Savannah College of Art and Design Game Development<br /> <br /> <a href="http://www.therookies.co/entrants/yvonne-lee/" target="_blank">YVONNE LEE</a> - Champlain College Game Development<br /> <br />  <br /> <strong>United Kingdom — Game Development — Internship Winners</strong><br />  <br /> <a href="http://www.therookies.co/entrants/aidan-vangrysperre/" target="_blank">AIDAN VANGRYSPERRE</a> - Howest University - Digital Arts and Entertainment<br />  <br /> <a href="http://www.therookies.co/entrants/allan-de-paepe/" target="_blank">ALLAN DE PAEPE</a> - Howest University - Digital Arts and Entertainment Game Development<br />  <br /> <a href="http://www.therookies.co/entrants/aristeidis-chrysikopoulos/" target="_blank">ARISTEIDIS CHRYSIKOPOULOS</a> - Staffordshire University Game Development<br />  <br /> <a href="http://www.therookies.co/entrants/guillaume-menot/" target="_blank">GUILLAUME MENOT</a> - New3dge Game Development<br />  <br /> <a href="http://www.therookies.co/entrants/jonathan-hemmens/" target="_blank">JONATHAN HEMMENS</a> - Falmouth University Game Development<br />  <br /> <a href="http://www.therookies.co/entrants/jonathon-ivall/" target="_blank">JONATHON IVALL</a> - University of Hertfordshire Game Development<br />  <br /> <a href="http://www.therookies.co/entrants/ramon-schauer/" target="_blank">RAMON SCHAUER</a> - Darmstadt University of Applied Sciences Game Development<br /> <br /> <strong>Oceania — Game Development — Internship Winners</strong><br /> <br /> <a href="http://www.therookies.co/entrants/shanshan-he/" target="_blank">SHANSHAN HE</a> - Gnomon - Game Development <br />  <br /> To view all of the incredible UE4 entries and projects submitted for The Rookies 2018:<br />  </div> <ul> <li><a href="http://bit.ly/2mp7OwU" target="_blank">Highest ranked entries that used UE4</a></li> <li><a href="http://bit.ly/2msWU9f" target="_blank">Highest ranked projects that used UE4</a></li> </ul> <div> <br /> To view the full
list of winners and to see the amazing entries, please visit:</div> <ul> <li><a href="https://therookies.us11.list-manage.com/track/click?u=b8688534a82e76e4efd89a4d0&amp;id=da5ea5cea6&amp;e=f79a85b067" target="_blank">http://www.therookies.co/2018-results/</a></li> </ul> <em>news, education, community, enterprise, vr | Melissa Robinson | Thu, 09 Aug 2018 15:09:07 GMT | https://www.unrealengine.com/blog/unreal-engine-developers-earn-top-honors-in-the-2018-rookie-of-the-year-awards</em> <h1><strong>Unreal Engine Receives Major Update to Magic Leap Support, 'Creator Edition' Hardware Now Available for Purchase</strong></h1> Today, Magic Leap <a href="https://www.magicleap.com/magic-leap-one" target="_blank">opened up orders for Magic Leap One, Creator Edition</a>, inviting developers to get their hands on the new spatial computing platform. Accompanying the announcement, we’ve released UE4 support for the latest Magic Leap SDK and features alongside updated resources for Magic Leap development!<br /> <img alt="Unreal-_Blessed_-Magic-Leap-Image-1080x720_NEW.png" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-receives-major-update-to-magic-leap-support-creator-edition-hardware-now-available-for-purchase%2FUnreal-_Blessed_-Magic-Leap-Image-1080x720_NEW-1080x720-5ed53c2ff7a2e2bd01ff8059ad654aa4bc6abc15.png" width="100%" /><br /> Developers can access the latest UE4 support for the platform by selecting “Magic Leap” from the “Add Version” dropdown menu in the Epic Games launcher. Source code is also available via our <a href="https://github.com/EpicGames/UnrealEngine/tree/dev-vr" target="_blank">Dev-VR branch</a> or the <a href="https://github.com/magicleap-ue4/UnrealEngine/tree/4.20-mlsdk-0.16.0" target="_blank">Magic Leap-maintained fork</a> of the Unreal Engine GitHub repository. Resources for getting started, including a rich sample project, are also available in our <a href="http://epic.gm/pssdocs" target="_blank">Magic Leap documentation</a>. Expect these updates and more to make their way into the 4.21 release of Unreal Engine, due out later this year. <div style="text-align: center;"><img alt="ML_016_Download.png" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-receives-major-update-to-magic-leap-support-creator-edition-hardware-now-available-for-purchase%2FML_016_Download-521x266-7978a15c8c748d568d9a6336276931c3bcbbb159.png" style="width: 60%;" /></div> We’re thrilled to bring these updates to the developer community as a follow-up to our early access support for the Magic Leap One and cannot wait to see the great UE4 content built for the platform.
<br /> <br /> Grab UE4 with the latest Magic Leap support and order your <a href="https://www.magicleap.com/magic-leap-one" target="_blank">Magic Leap One, Creator Edition</a> today.<br /> <br /> <em>ar, news, community, features, enterprise | Chance Ivey | Wed, 08 Aug 2018 12:08:00 GMT | https://www.unrealengine.com/blog/unreal-engine-receives-major-update-to-magic-leap-support-creator-edition-hardware-now-available-for-purchase</em> <h1><strong>Immersing Users in Virtual Worlds with Mixed Reality Capture</strong></h1> Over the past few years, our development community has pushed Unreal Engine&#39;s <a href="https://www.unrealengine.com/en-US/blog/category/vr" target="_blank">Virtual Reality (VR)</a> and <a href="https://www.unrealengine.com/en-US/blog/category/ar" target="_blank">Augmented Reality (AR)</a> platforms to new horizons. With high-quality VR and AR experiences being developed and released by passionate and skilled developers around the globe, we&#39;re excited to continue supporting all of the creativity that we&#39;re seeing around us.<br /> <br /> To support the efforts of XR devs everywhere, we’ve been building a solution for compositing real-world video onto virtual world space in Unreal Engine 4. Mixed Reality Capture, available in Early Access as of Unreal Engine 4.20, equips you with the tools you need to project yourself (or any tracked object) into your virtual experience.<br /> <br /> <img alt="MixedReality_03.gif" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fimmersing-users-in-virtual-worlds-with-mixed-reality-capture%2FMixedReality_03-c97512c8c7631f3bea53ae042c388c395a5a1003.gif" width="100%" /> <div style="text-align: center;"><em>Here, I’m using a drone to fire a laser at a robot in Robo Recall! </em></div> <br /> <strong>SETUP</strong><br /> Getting the capture space set up is pretty straightforward, though there are a few real-world things you’ll need to get started. For recording <a href="http://www.roborecall.com/" target="_blank">Robo Recall</a>, we used a green screen draped between two tripods, a stationary mounted camera, a capture device to snag the camera feed, three Oculus sensors, and the Rift + Touch controllers to play.<br /> <img alt="MixedRealityCapture_Montage.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fimmersing-users-in-virtual-worlds-with-mixed-reality-capture%2FMixedRealityCapture_Montage-800x600-19f3fe24971fc7a15160ab2fa8044dc4c11c9b60.jpg" width="100%" /><br /> Once the environment is set up, we use a <a href="https://docs.unrealengine.com/en-us/Platforms/MR/HowToCaptureCalibrationTool" target="_blank">calibration app</a> that creates spatial mappings between objects in the virtual world and the real world. <br /> <img alt="MixedRealityCapture_Pic3.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fimmersing-users-in-virtual-worlds-with-mixed-reality-capture%2FMixedRealityCapture_Pic3-648x365-d2740bec221a5dedd9d15a66e56785187075fa07.jpg" width="100%" /><br /> After everything is properly calibrated, we can play through the game and record the composited video using the screen recording software of our choice (for Robo Recall, we used <a href="https://obsproject.com/" target="_blank">OBS</a>).<br /> <br /> A quick tip: you’ll want to experiment with what you render, and how you render it, while capturing with Mixed Reality Capture to best represent your project. Notice that the guns in my back holsters are not rendered and the hand models are also hidden. We turned them off for this recording as their scale and location are tuned for the player’s experience, not the spectator’s, and they didn’t quite feel right as they were out of the box. Like everything, test and iterate often!<br /> <img alt="MixedRealityCapture_Screen2.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fimmersing-users-in-virtual-worlds-with-mixed-reality-capture%2FMixedRealityCapture_Screen2-750x421-f159d6a815fce2b5e645f499e50b05a9d7c013bb.jpg" width="100%" />
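<br /> <br /> If you want to try the same kind of toggle in your own project, the snippet below is a minimal UE4 C++ sketch. The AMyVRPawn class, the SpectatorHiddenMeshes array, and SetCaptureActive() are hypothetical names used for illustration and are not part of the Mixed Reality Capture API; SetHiddenInGame() is standard engine API.
<pre>
// MyVRPawn.h -- minimal sketch: hide player-only cosmetic meshes while a
// mixed reality capture session is recording. Class, property, and function
// names are hypothetical; SetHiddenInGame() is standard UE4 API.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "Components/StaticMeshComponent.h"
#include "MyVRPawn.generated.h"

UCLASS()
class AMyVRPawn : public APawn
{
    GENERATED_BODY()

public:
    // Cosmetic meshes (holstered guns, hand models) whose scale and placement
    // are tuned for the player's view and look wrong from the spectator camera.
    UPROPERTY(EditAnywhere, Category = "Capture")
    TArray<UStaticMeshComponent*> SpectatorHiddenMeshes;

    // Call when a capture recording starts (true) or stops (false).
    void SetCaptureActive(bool bCapturing)
    {
        for (UStaticMeshComponent* Mesh : SpectatorHiddenMeshes)
        {
            if (Mesh)
            {
                // Hidden for everyone while recording; restored afterwards.
                Mesh->SetHiddenInGame(bCapturing);
            }
        }
    }
};
</pre>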
<br /> <br /> <strong>MORE INFO</strong><br /> To learn more about using the Mixed Reality Capture feature, including how you can calibrate supported devices and composite users into a virtual environment, read through our early access <a href="https://docs.unrealengine.com/en-us/Platforms/MR" target="_blank">MRC Development documentation</a> to get started with your own immersive experiences. <br /> <br /> Is your video capture device not listed? We’ve built the system in a modular way to facilitate new capture methods easily. If you’re interested in integrating a different video capture solution, have a look at the “Media” module in the engine to find a series of interfaces to implement. Currently, we’re using the WmfMedia module for our playback.<br /> <br /> We’re eager to see how people leverage this feature to show off their projects in a new way and are looking for feedback on the tool. Drop us a line in the MRC <a href="https://forums.unrealengine.com/development-discussion/vr-ar-development/1512350" target="_blank">feedback forum thread</a> to share your thoughts!<br /> <br /> <em>ar, community | Mike Beach | Tue, 07 Aug 2018 17:00:00 GMT | https://www.unrealengine.com/blog/immersing-users-in-virtual-worlds-with-mixed-reality-capture</em> <h1><strong>How NEP Leveraged Unreal Engine to Deliver Standout Live Broadcast Events For the Winter Olympics, Formula 1</strong></h1> <a href="https://www.nepgroup.com/" target="_blank">NEP</a> helps its clients develop and deliver standout live broadcast events. With combined creative and IT expertise, the company delivers services that transform the way global video entertainment is created, managed and distributed. Over the past two years, NEP has built a dedicated Unreal Engine team to service growing client demands for real-time graphics in broadcast scenarios.<br />  <br /> We spoke with NEP Senior Unreal Engineer Roel Bartstra to get a closer look at their work. <img alt="NEP_The_Netherlands_HighresScreenshot00002.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fhow-nep-leveraged-unreal-engine-to-deliver-standout-live-broadcast-events-for-the-winter-olympics-formula-1%2FNEP_The_Netherlands_HighresScreenshot00002-1920x1052-a5b191529aa01ab4e3e7454b22c93794133869ad.jpg" width="100%" /><br /> <strong>When did NEP The Netherlands start using Unreal Engine?</strong><br /> <br /> NEP’s first project in Unreal Engine was a couple of years ago, before I joined the company.
The team started working in Unreal to build an augmented reality-enhanced virtual set for the broadcast of a local game show called “I Bet That I Can Do It.” After that project, NEP decided to invest more heavily in hardware, software, people, and R&D.<br />  <br /> <strong>Can you describe what augmented reality in broadcast production means?</strong><br /> <br /> We add value by creating things that are physically impossible, for example, in scenarios where there’s limited space or the desired studio is booked. Our virtual set customers are typically broadcasting from green screen sets, which are often very small spaces. Sometimes there is a call for a virtual set extension, where there is a physically built set or set pieces in the foreground, and augmented graphics and props are placed in the background. Other scenarios have no physical props and everything behind and in front of the talent is computer graphics.<br />  <br /> In production, this is referred to as ‘augmented reality’ because the talent can interact with these dynamically generated real-time graphics, and the magic bringing these images together is happening in Unreal Engine, often during a live broadcast. We can build data-driven graphics into these scenarios as well, updating content related to news and sports stats in real time. This was heavily used in the <a href="https://platform.vixyvideo.com/tiny/5gezy" target="_blank">Winter Olympics broadcast by Eurosport</a>, and also by <a href="https://platform.vixyvideo.com/tiny/8zmtq" target="_blank">Liberty Global’s Ziggo Sport as part of their Formula 1 coverage</a>. We also have a client preparing to launch a show with a virtual floor, creating a look that would not be achievable in a physical set environment.<br /> <img alt="NEP_The_Netherlands_HighresScreenshot00012_stitch.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fhow-nep-leveraged-unreal-engine-to-deliver-standout-live-broadcast-events-for-the-winter-olympics-formula-1%2FNEP_The_Netherlands_HighresScreenshot00012_stitch-1920x1012-8e2832cc5efe4a3b9e995e09a0ddf90697dd3742.jpg" width="100%" />
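<br /> To make “data-driven” concrete, here is a minimal UE4 C++ sketch of the general pattern: poll a stats feed over HTTP and push the value into a text element in the virtual set. This is not NEP’s or Frontier’s actual implementation; the endpoint URL, the AStatsBoard actor, its StatsText component, and the “MedalCount” field are all hypothetical, while FHttpModule and the Json utilities are standard engine modules (add "Http" and "Json" to your Build.cs dependencies).
<pre>
// StatsBoard.cpp -- sketch only. Assumes AStatsBoard is an actor declaring a
// UTextRenderComponent* StatsText member and these two methods; all names
// here are hypothetical except the engine APIs themselves.
#include "HttpModule.h"
#include "Interfaces/IHttpRequest.h"
#include "Interfaces/IHttpResponse.h"
#include "Dom/JsonObject.h"
#include "Serialization/JsonReader.h"
#include "Serialization/JsonSerializer.h"
#include "Components/TextRenderComponent.h"

void AStatsBoard::RefreshStats()
{
    // Fire an async GET; the callback runs when the response arrives.
    TSharedRef<IHttpRequest> Request = FHttpModule::Get().CreateRequest();
    Request->SetURL(TEXT("https://example.com/api/medal-count")); // hypothetical feed
    Request->SetVerb(TEXT("GET"));
    Request->OnProcessRequestComplete().BindUObject(this, &AStatsBoard::OnStatsReceived);
    Request->ProcessRequest();
}

void AStatsBoard::OnStatsReceived(FHttpRequestPtr Request, FHttpResponsePtr Response, bool bSucceeded)
{
    if (!bSucceeded || !Response.IsValid())
    {
        return; // on a live broadcast, keep showing the last good value
    }

    TSharedPtr<FJsonObject> Json;
    TSharedRef<TJsonReader<>> Reader = TJsonReaderFactory<>::Create(Response->GetContentAsString());
    if (FJsonSerializer::Deserialize(Reader, Json) && Json.IsValid())
    {
        // Push the new value straight into the virtual set's text element.
        StatsText->SetText(FText::AsNumber(Json->GetIntegerField(TEXT("MedalCount"))));
    }
}
</pre>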
<strong>Are you providing services for hire for other production companies, or are you doing the integration for existing production/graphics teams?</strong><br /> <br /> It depends on the client. We may see that it’s an addition to our multicam operation, but the end-to-end managed graphics services can be delivered separately as well. Sometimes we are involved in the design phase, and for other clients we execute on existing designs and figure out how to make them feasible. Many of these clients are accustomed to providing a ray-traced image and getting an idea of how a show will look; in those cases, we work on making that possible in real time in UE4.<br />  <br /> <strong>Why did NEP choose to invest in AR?</strong><br /> <br /> It was Ralf van Vegten, Managing Director of NEP The Netherlands, who really believed that AR would be the future of dynamic virtual set production in broadcast.<br />  <br /> My background is in the gaming industry, and I started using Unreal Engine fifteen years ago when I did level design for games and modding in Unreal Tournament. I joined NEP almost two years ago to expand the Unreal Engine team and integrate gaming-industry know-how to meet the current needs of broadcasting.<br />  <br /> In the early days, there was a level of technical expertise and custom development required that made this very expensive, but NEP has invested heavily in people and technology to make use of AR in broadcast production much more cost effective. We are continuing to revise and improve our methods, research what aspects of production we can replace with AR, and make the process more practical.<br />  <br /> With Unreal Engine, the graphics are very photorealistic, so we really can replace entire studio builds with AR as opposed to the build-out of a physical set. Clients have been really happy with the results so far. We started out with one AR show a month, and now we have four or five running simultaneously at any given time, with one or two going live every week.<br /> <img alt="NEP_The_Netherlands_HighresScreenshot00008.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fhow-nep-leveraged-unreal-engine-to-deliver-standout-live-broadcast-events-for-the-winter-olympics-formula-1%2FNEP_The_Netherlands_HighresScreenshot00008-1920x1050-542b95fc12669652cfe2ebd5e94215d9df339f22.jpg" width="100%" /> <strong>Why do you maintain a dedicated Unreal Engine team?</strong><br /> <br /> Unreal Engine allows us to produce photo-real graphics at high frame rates, which is important as we’re broadcasting at 50 frames per second. Other game engine technologies we tested weren’t able to achieve that level of image fidelity with real-time performance. UE also has an insane number of pre-built tools that are very solid and make it easy to get started quickly. It is also well integrated with tools from our partners at Zero Density, and both companies are working together to give us more creative freedom to do even more in the engine ourselves.<br />  <br /> <strong>How do broadcasters manage the investment required in turning part of their offices into virtual studios?</strong><br /> <br /> It’s all about explaining the added value: AR makes literally everything you can think of possible, which is far more cost-efficient than using different sets, props, et cetera. For example, for Eurosport they took a small 5 x 5 meter break room and turned it into a small green screen stage. By adding skylights to the ceiling, they ensured that enough light could come into the space, and the costs were minimal. At NEP we are also working on creating a green box studio so that people can come in and rent a fully operational virtual studio setup (pre-configured with all necessary hardware and software) for one-off shows at several of our existing studios in the Netherlands.<br /> <img alt="NEP_The_Netherlands_HighresScreenshot00004.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fhow-nep-leveraged-unreal-engine-to-deliver-standout-live-broadcast-events-for-the-winter-olympics-formula-1%2FNEP_The_Netherlands_HighresScreenshot00004-1920x1052-17744566818429441a926c14e8e5ce8ad7eac661.jpg" width="100%" /><strong>What are the key pieces of gear and software involved in these build-outs?</strong><br /> <br /> This varies per client. For many clients we provide all of the hardware and software in our studios, along with AR camera operators and graphics administrators to control what’s being shown on virtual screens built into the sets.
We can also train our clients’ internal teams to set up and operate similar systems once we’ve handed off a graphics package that lives in Unreal Engine.<br />  <br /> Most importantly, our people make the difference. NEP The Netherlands is a very people-driven organization. Our team is continuously trained on new technologies, workflows and specializations. We stimulate growth, innovation and ambition, because we believe that the highest level of knowledge is needed in the rapidly changing media industry.<br />  <br /> <strong>Is your UE team made up more of artists or technologists?</strong><br /> <br /> It’s a mix. We have a few 3D artists who are designers, but they come from the game industry. We also have some technical artists who do a bit of modeling, but are mostly creating Blueprint scripts and custom materials, doing compositing inside of Reality: all of the processes on the tech side of implementing a model in a real-time virtual set environment.<br /> <img alt="NEP_The_Netherlands_ScreenShot00017_stitch_processed.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fhow-nep-leveraged-unreal-engine-to-deliver-standout-live-broadcast-events-for-the-winter-olympics-formula-1%2FNEP_The_Netherlands_ScreenShot00017_stitch_processed-1920x1154-7738a4e0e035ac53721489b96e29559892172fe3.jpg" width="100%" /> <strong>What’s the average number of assets you’re building for clients?</strong><br /> <br /> It depends on the nature of the show, so for smaller news broadcasts in a small studio, it may be around 10-50 elements, but for something like Formula 1, we’re producing around 100 assets. Those are generally combined with pre-existing assets created by their in-house teams, so there are probably closer to 1,000 for that show.<br />  <br /> <strong>Are there different considerations when building digital assets for virtual sets than other types of photo-real CG productions?</strong><br /> <br /> When we create assets digitally, we want them to look as lifelike as possible. To make the sets look ‘real,’ we actually have to build models with slight flaws, such as seams between wood panels and misaligned edges; otherwise, they are too perfect to be convincing. <br />  <br /> <strong>Are broadcasters embracing Unreal Engine to save time and money or because they want more flexibility in the visual look of their shows?</strong><br /> <br /> Flexibility is something that we’re working on for sure, but generally this is deployed for cost and time savings. This is especially the case if a show uses a lot of LEDs on set. LEDs are practically free to create digitally, so compared to a physical LED set, building virtual screens is much less expensive.<br />  <br /> <strong>Does using game engines have the chance to change the way things are done in broadcast?</strong><br /> <br /> Yes, Unreal Engine is on its way to becoming a true game changer in virtual production. It’s no longer necessary to build physical sets, as it’s now possible to generate, iterate and revise virtual set design and assets dynamically and in real time.
Unreal Engine is a transformative technology for news broadcasts, sports commentary, game shows and any type of live or recorded production that benefits from real-time, data-driven computer graphics, whether stylized or photo-real.<br /> <br /> <em>community, enterprise, news, film and television, ar, showcase | Andy Blondin | Thu, 02 Aug 2018 15:00:00 GMT | https://www.unrealengine.com/blog/how-nep-leveraged-unreal-engine-to-deliver-standout-live-broadcast-events-for-the-winter-olympics-formula-1</em> <h1><strong>Free Paragon Assets Get an Update</strong></h1> <p>During GDC 2018, Epic Games <a href="https://www.unrealengine.com/en-US/blog/epic-games-releases-12-million-worth-of-paragon-assets-for-free" target="_blank">released over 20 AAA-quality characters and more than 1,500 environment components from Paragon</a> to the development community for free. Today, we’ve updated all the character assets currently available on the <a href="https://www.unrealengine.com/paragon" target="_blank">Unreal Engine Marketplace</a> to include animation Blueprints, making it even easier to implement and utilize these characters.<br /> <br /> To demonstrate how to get started with these characters, Lead Animator Jay Hosfelt joined the <a href="http://twitch.tv/unrealengine">Unreal Engine Livestream</a> for a series dedicated to working in animation Blueprints. Over the course of the livestreams, he walked through retargeting animations, adding a strafe locomotion system and blendspaces, utilizing sync markers, creating additive animation and more!</p> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/ffuq5k-j0AY?start=290" width="100%"></iframe></div> <p>More character assets, with their animation Blueprints, will become available in the future.<br /> <br /> Download the Paragon assets at <a href="http://www.unrealengine.com/paragon" target="_blank">unrealengine.com/paragon</a> and get started today!</p> <em>news, marketplace, community | Amanda Bott | Wed, 01 Aug 2018 15:00:00 GMT | https://www.unrealengine.com/blog/free-paragon-assets-get-an-update</em> <h1><strong>NVIDIA Edge Program Recipients - July 2018</strong></h1> <p>In an effort to recognize developers who produce visually outstanding work in Unreal Engine and reward them with top-of-the-line hardware, Epic Games partners with NVIDIA to offer the <a href="http://unrealengine.com/nvidiaedge" target="_blank">NVIDIA Edge Program</a>, through which selected teams receive a <a href="https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1080-ti/" target="_blank">GTX 1080 Ti</a>.<br /> <br /> Congrats to this month’s recipients! Remember, if you’ve entered in the past but your project hasn’t been selected, we encourage you to re-enter at any time.</p> <h2>Zero Caliber VR - XREAL Games</h2> <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/p7H2a5zdg1c" width="100%"></iframe></div> <p>Set to release in September, <a href="https://store.steampowered.com/app/877200/Zero_Caliber_VR/" target="_blank">Zero Caliber</a> is an incredibly realistic military FPS, developed exclusively for VR by the team at <a href="http://www.xrealgames.com/a-techcybernetic" target="_blank">XREAL Games</a>.
To jump in on the action now, visit their <a href="https://t.co/b5jKB0ZgX0" target="_blank">Discord channel</a> to participate in their alpha tests.</p> <h2>Barcelona Loft - Steven Bracki</h2> <img alt="BarcelonaLoft_StevenBracki_770.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fnvidia-edge-program-recipients---july-2018%2FBarcelonaLoft_StevenBracki_770-770x433-0dbedc659690ddf1b87f7820fe503739098bac56.jpg" /> <p>Steven Bracki brought to life a small studio loft design in just two short weeks with <a href="http://www.artstation.com/artwork/15DWX" target="_blank">Barcelona Loft</a>. The simple yet modern feel really makes the scene pop. Be sure to check out his <a href="https://www.artstation.com/stevenbracki" target="_blank">beautiful portfolio</a>, which features other incredible projects. </p> <h2>The Crossing - Conecow Studio</h2> <img alt="TheCrossing_ConecowStudio_770.jpg" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fnvidia-edge-program-recipients---july-2018%2FTheCrossing_ConecowStudio_770-770x433-fd55d928a07186d8a8a068806eee02b54c5121fe.jpg" /> <p>Set in an indeterminate future, The Crossing leads players on a metaphysical journey about existence. The narrative adventure, in development by <a href="https://twitter.com/ConecowS" target="_blank">Conecow Studio</a>, leverages machine learning for its procedural environments. Follow along with the game’s progress on their <a href="https://conecowstudio.wordpress.com/" target="_blank">development page</a>.</p> -- <p>Wow! There is so much incredible talent on display from the Unreal Engine community. Thank you so much for sharing it with us.</p> <h2>Want to submit your project?</h2> <p>Head over to the <a href="https://www.unrealengine.com/en-US/programs/nvidia-edge">NVIDIA Edge hub</a> for the details on how to submit your work. If you’ve entered previously but haven’t won thus far, we encourage you to re-apply to the program.</p> <p>Good luck!</p> <em>news, community | Amanda Bott | Wed, 01 Aug 2018 14:00:00 GMT | https://www.unrealengine.com/blog/nvidia-edge-program-recipients---july-2018</em> <h1><strong>Don’t Miss These Unreal Tech Talks at SIGGRAPH 2018</strong></h1> It’s been a particularly exciting year of growth and innovation for the Unreal community. We’re seeing more and more creative teams in an ever-broadening range of design industries taking their work to the next level with Unreal Engine and Unreal Studio. Film and TV producers, architects, product designers, game developers, marketing teams, automotive manufacturers, and many more are using our tools to push the boundaries of possibility within their respective industries.<br /> <br /> With <a href="https://www.unrealengine.com/en-US/events/siggraph-2018" target="_blank">SIGGRAPH 2018</a> just around the corner, we’re thrilled to be able to share the latest cutting-edge Unreal Engine developments and showcase some of the amazing projects our dedicated Enterprise users are building with our real-time platform.<br /> <br /> At the expo, you’ll find us at <strong>Booth 1401</strong> with lots of demos and Enterprise customer guest speakers, but we also don’t want you to miss out on the info-packed day of tech talks the Unreal team has put together for you! On <strong>Wednesday, August 15</strong>, our team of experts will be giving in-depth talks on a wide range of Unreal-related topics in the <strong>Vancouver Convention Centre East Building, Meeting Room 16</strong>. Please note, a SIGGRAPH badge is required to attend.
<br /> <br /> You won’t want to miss this! Here’s a quick look at the tech talks we have planned:<br /> <img alt="Blog-body-img1.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fdon-t-miss-these-unreal-tech-talks-at-siggraph-2018%2FBlog-body-img1-1640x1000-807b76655750ef993de40022fb0f2a288a651af4.jpg" width="100%" /> <h2><strong>Real-time Ray Tracing Advances in Unreal Engine </strong></h2> (9:30 a.m. - 10:30 a.m.)<br /> <br /> The holy grail of real-time rendering is finally here! Join Juan Canada and Fran&ccedil;ois Antoine in exploring the latest exciting ray tracing advancements in Unreal Engine. We’ll break down the roadmap of our new path-tracer and progressive lightmapper, then reveal our latest real-time ray-traced showcase.<br /> <img alt="Fortnite_CinematicTrailer.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fdon-t-miss-these-unreal-tech-talks-at-siggraph-2018%2FFortnite_CinematicTrailer-1280x720-54466b60a9574356ac89f895d3a4f3186763aa0e.jpg" width="100%" /> <h2><strong>Fortnite - Advancing The Animation Production Pipeline</strong></h2> (11:00 a.m. - 12:00 p.m.)<br /> <br /> Looking to improve your understanding of real-time animation production in Unreal Engine? Follow Epic’s Brian Pohl and Ryan Mayeda as they chart the course of new workflow and pipeline improvements to UE4 while dissecting the Fortnite cinematic trailer one year after its initial release. <br /> <br /> In this lecture, you&#39;ll learn more about what we&#39;ve improved, changed, and added to UE4 to dramatically streamline your animation pipeline. We&#39;ll examine Perforce setup, source control, Python integration, Sequencer improvements, better production management techniques through UE4&#39;s Shotgun integration capabilities, and more!<br /> <img alt="Blog-body-img4.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fdon-t-miss-these-unreal-tech-talks-at-siggraph-2018%2FBlog-body-img4-1640x1000-634c29ce6d006d7fc41fa8428f1b01998531e9a8.jpg" width="100%" /> <h2><strong>Real-time Motion Capture in Unreal Engine</strong></h2> (12:30 p.m. - 1:30 p.m.)<br /> <br /> Motion capture technology has come a long way over the years, expanding from its early roots as a medical tool for biomechanics to being widely used in Hollywood blockbuster films.<br /> <br /> Ready to experience the next wave in motion capture innovation? Join us to see for yourself how Unreal Engine is helping to break boundaries and bring the power of real-time motion capture to film, games, theater, and beyond!<br /> <img alt="Blog-body-img3.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fdon-t-miss-these-unreal-tech-talks-at-siggraph-2018%2FBlog-body-img3-1640x1000-05ff242ce1050d16c94816223c3bd93c94c306d0.jpg" width="100%" /> <h2><strong>Virtual Production with Unreal Engine 4.20</strong></h2> (2:00 p.m. - 3:00 p.m.)<br /> <br /> With the release of <a href="https://www.unrealengine.com/en-US/blog/unreal-engine-4-20-released" target="_blank">Unreal Engine 4.20</a>, creative teams in the film and television worlds have an array of new tools designed to support the virtual production process. These tools are defining a new wave of production workflows that integrate real-time visual effects with live action footage. <br /> <br /> Join us as we take a look under the hood of UE4&#39;s new virtual camera, genlock, timecode, Sequence Recorder, and video I/O capabilities that are destined to alter the way you make visual content. 
<br /> <img alt="Blog-body-img5.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fdon-t-miss-these-unreal-tech-talks-at-siggraph-2018%2FBlog-body-img5-1640x1000-0a01c0a66edcd53a6ed69853211fd5d67c7ca888.jpg" width="100%" /> <h2><strong>Mixed Reality Production using Unreal Engine 4.20</strong></h2> (3:30 p.m. - 4:30 p.m.)<br /> <br /> Curious about what it takes to build a mixed reality entertainment attraction using Unreal Engine 4? Join us to learn more about the pipeline and processes of designing themed mixed-reality experiences for a local stage.<br /> <br /> This tech talk will delve into mixed reality production processes using UE4, VR hardware, real lighting and effects integration using DMX/Midi/Artnet, and motion capture systems. You’ll also learn to build live 3D characters with motion tracking IK, which helps in crafting immersive experiences for live actors and players alike. <br /> <br /> We hope to see you at our tech talks in the <strong>Vancouver Convention Centre East Building, Meeting Room 16</strong> <strong>at SIGGRAPH 2018</strong> on <strong>Wednesday, August 15th</strong>! Don’t forget to also stop by <strong>Booth 1401</strong> and check out our demos and see Enterprise customer presentations, too! For more information, visit our <a href="https://www.unrealengine.com/en-US/events/siggraph-2018" target="_blank">SIGGRAPH 2018 event page</a>.<br />  enterprisenewseventsKen PimentelTue, 31 Jul 2018 20:00:00 GMThttps://www.unrealengine.com/blog/don-t-miss-these-unreal-tech-talks-at-siggraph-2018https://www.unrealengine.com/blog/don-t-miss-these-unreal-tech-talks-at-siggraph-2018Platforming, Puzzles and Stealth Take Center Stage in PLANET ALPHAThe <a href="https://www.unrealengine.com/en-US/unrealdevgrants" target="_blank">Unreal Dev Grant</a> program, a $5,000,000 dollar development fund created by Epic back in 2015, provides a no-strings-attached monetary boost given out to select developers and creators every year. Among the winners of the inaugural offering was Adrian Lazar and his ambitious project, PLANET ALPHA. Crediting the honor with being the catalyst he needed to turn his part-time passion project into a full-time development studio, Adrian is on the precipice of releasing PLANET ALPHA for PC, PlayStation 4, Xbox One and Nintendo Switch on September 4, 2018.<br /> <br /> Employing a unique mechanic of controlling the cycle of day to night and back again, PLANET ALPHA is a mixed bag of quick platforming, puzzle solving and stealth mechanics. Colorful environments brought to life by Unreal Engine 4 will see players tackle everything from lush forests alive with organisms of all sizes, mysterious caverns rife with danger, and magical skyborne lands complete with massive flying creatures of myth and legend. It’s a visual feast that’s full of life at every turn.<br /> <br /> As the title prepares for launch, we caught up with Adrian Lazar to learn more about PLANET ALPHA’s growth from being a personal project to becoming a highly-anticipated initial offering for a newly-found studio. With a ton of experience on his resume, Adrian imparts his advice on what burgeoning developers should learn and tells us about the tools in Unreal Engine 4 that served the studio best over the past four years. 
<br />   <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/ITZW-f_Mj80" width="100%"></iframe></div> <strong>You&#39;ve put together a small, talented team for PLANET ALPHA. Tell us a little bit about your inspiration for jumping into indie development.</strong><br />  <br /> At 32, when I started my game development studio, I had worked for other people for about 14 years in eight different companies of all sizes and I really, really wanted to try something new.<br />  <br /> After two years I managed to finance the project and at the beginning of 2016, I hired two people onto the core team with whom I had worked with before in other companies. In total, seven people worked on developing PLANET ALPHA.<br />  <br /> Together we took the game to a level I never thought possible and I’m looking forward to making more games with them in the future!<br />  <br /> <strong>For anyone unfamiliar with PLANET ALPHA, please tell us about the game&#39;s premise.</strong><br />  <br /> PLANET ALPHA is an adventure that takes place in a living alien world where you have the unique ability of being able to control the day and night cycle. It combines fast platforming, puzzles and stealth elements with a unique art-style to create an unforgettable experience.<br />  <br /> So that’s the elevator pitch, but what I hope is for the game to be received as a fresh take on a genre that is almost as old as the industry itself, an experience that will carry the player to a place that fascinates and intrigues at the same time. <img alt="Screenshot_02.png" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fplatforming-puzzles-and-stealth-take-center-stage-in-planet-alpha%2FScreenshot_02-1280x720-aeb789e9583d94933dafea30a750e09e6b3c8570.png" width="100%" /><br /> <strong>One of the first things that struck me while watching the PLANET ALPHA trailer was how alive the world felt. How difficult was it for you to bring your environments together into something very much moving and breathing?</strong><br />  <br /> We did countless iterations on the game, including a complete change of direction, but the feeling of being stranded on a living alien planet was one of the few things that stayed true from start to finish. From the beginning, we built the team and the workflow with this target in mind and every asset that we created had to contribute to this feeling.<br />  <br /> Being a small team means that each of us needs to wear different hats most of the time, which worked great for the type of experience we’re building — a diverse game both in looks and gameplay.<br />  <br />  <br /> <strong>It looks like PLANET ALPHA has a broad mix of platforming crossed with some unique puzzle play as well. What can you tell us about the gameplay mechanics in PLANET ALPHA?<br />  </strong><br /> Because the game started as a personal project, we skipped a few steps. For example, we never had a proper game design document. Instead, we relied on the game to develop itself in an organic way and we let ideas generate other ideas. 
This is true for the story, the environments, and the gameplay — each one feeding the other.<br />  <br /> We never planned to have platforming, puzzles, stealth, and exploration from the beginning, but different situations require different approaches, and that is how the mix of gameplay mechanics came to be.<br />  <br /> For example, the player was originally equipped with a gun and able to fight, but we removed this about halfway into production. Of course, that left us needing to find another way for the player to overcome the enemies — that’s when stealth and the ability to use the environment to your advantage came to be.<br />  <br /> At the same time, it was important for us to keep the different mechanics as natural as possible, to avoid having them feel forced. To give you an example of this, in the dark alien jungle you should be more worried about what could eat you rather than falling to your death. On the other hand, while on the unstable floating islands, it’s more important to watch your step because it’s a long way down.<br /> <img alt="Screenshot_03.png" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fplatforming-puzzles-and-stealth-take-center-stage-in-planet-alpha%2FScreenshot_03-1280x720-b8efb8cda6b326e4815932726e13603f142075d2.png" width="100%" /><br /> <strong>How much did it mean to the team to be a recipient of the Unreal Dev Grant? How has that extra help aided you in the development of PLANET ALPHA?</strong><br />  <br /> The Unreal Dev Grant was what made our studio possible in the first place, and I can’t overstate how beneficial it was.<br />  <br /> I started working on PLANET ALPHA in 2013 and for almost two years I kept trying to finance the production of the game without success. But at the beginning of 2015, I received the Unreal Dev Grant and things snowballed from there.<br />  <br /> It allowed me to submit a prototype to the Indie Prize Singapore, where it won three awards: Best In Show, Most Promising Game in Development, and Best Game Art. The awards helped me to find an investor that financed the production of PLANET ALPHA and made it possible for me to establish the game development studio.<br />  <br /> After two years of trying, the Unreal Dev Grant made it possible to go from working on the game part-time to starting a development studio, all in a matter of months.<br />   <br /> <strong>Now that you&#39;ve had some serious time with Unreal Engine 4, what would you say your favorite tool has been and why?</strong><br />  <br /> With its node-based approach, the Unreal Engine workflow fit me perfectly, and by far my favorite tool is the Blueprint system.<br />  <br /> The funny thing is that the game was started in Unity, but after a while my limited C# skills started to take their toll on the production quality, so I abandoned the project. However, after a few months, Unreal Engine came out of private beta and I was able to try Blueprints Visual Scripting.<br />  <br /> Many were skeptical that visual scripting could be a viable way of building a game, myself included, but I kept pushing forward, waiting for the moment when it would stop working.
Now, four years later, we have a game that is 90% Blueprints, the main exception being the hero locomotion system, which was built in C++ by our talented freelance programmer, Fernando Castillo.<br />  <br /> The visual scripting system’s gentle learning curve means that each of us can build and prototype different systems quickly and on our own, which is another great benefit for a team of our size. <br /> <img alt="Screenshot_07.png" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fplatforming-puzzles-and-stealth-take-center-stage-in-planet-alpha%2FScreenshot_07-1280x720-704e636f7cc01fb5e9ed46b0931ef7e786fca3df.png" width="100%" /><br /> <strong>One of the mechanics I&#39;ve seen mentioned for PLANET ALPHA is the ability to control the switch from day to night. How does this affect gameplay for the player?</strong><br />  <br /> The planet rotation mechanic came early on, when I was looking for a feature to tie all the other mechanics together but also to help the game stand out.<br />  <br /> Having moved to Scandinavia a few years earlier, I was fascinated by the long and colorful twilights when the sun lowers below the horizon and the sky is scattered with vivid shades. It’s a huge and often dramatic transformation that, as an artist, I wanted to replicate in the game, so I implemented a day and night cycle.<br />  <br /> On the gameplay side, the day and night cycle affects every gameplay mechanic: platforming, puzzles, stealth, and exploration. The player will need to take advantage of the changes to advance in their quest - it can be anything from mushrooms rising to catch the sunlight in the dark forests, which can be used as platforms, to plants that spit acid bombs when they feel threatened, which can be used to distract or even destroy enemies.<br /> <img alt="Screenshot_05.png" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fplatforming-puzzles-and-stealth-take-center-stage-in-planet-alpha%2FScreenshot_05-1280x720-47dea753532c94a17851b9076583d0677ba840a0.png" width="100%" /><br /> <br /> <strong>If you had a chance to offer advice to someone picking up development on Unreal Engine 4 for the first time, what would it be?</strong><br />  <br /> By the time I started using Unreal Engine 4, I already had experience with countless other software packages, so I found it very easy to learn the engine. But I can understand that a newcomer to game development might feel intimidated by Unreal Engine, and my advice is that while it might look overly complex at the start, that same depth will later allow you to create games better and faster, so think of it as an advantage.<br />  <br /> <strong>Where are all the places people can go to keep up on PLANET ALPHA?</strong><br />  <br /> We’re pretty much everywhere! You can learn more about the game and subscribe to our newsletter via<a href="https://www.planetalpha-game.com/" target="_blank"> our official website</a>. Of course, you can follow up on social media as well via <a href="https://twitter.com/PlanetAlpha" target="_blank">Twitter</a>, <a href="https://www.facebook.com/planetalpha/" target="_blank">Facebook </a>and <a href="https://www.instagram.com/planet.alpha.game/" target="_blank">Instagram</a>.
You can also catch up on our latest videos via our <a href="https://www.youtube.com/c/planetalpha" target="_blank">YouTube channel</a>.<br />  newscommunitygamesshowcaseShawn PetraschukTue, 31 Jul 2018 14:00:00 GMThttps://www.unrealengine.com/blog/platforming-puzzles-and-stealth-take-center-stage-in-planet-alphahttps://www.unrealengine.com/blog/platforming-puzzles-and-stealth-take-center-stage-in-planet-alphaEmbarking on an Emotional Journey in the Student Game “Hollowed”Hello, we are Leah Augustine, Douglas Halley, Paul Salas, Charley Choucard, Jerrick Flores, Brandon Kidwell and Erin Marek - otherwise known as Project Polish Productions, a team of graduate students at the Florida Interactive Entertainment Academy. We recently completed our Capstone game, Hollowed, and we wanted to give you more insight into the game&#39;s creative process and its overall development in UE4. <h2>How the Project Started</h2> Hollowed was our team’s “Capstone” game for our Master’s program, which made it equivalent to a Master’s thesis. Through this Capstone process, our group of student game developers — with minimal game development experience — would come together and develop a game through an entire project cycle. The Capstone process was slightly game-ified for our cohort. Development would be under constant evaluation by our professors who, acting as pseudo-stakeholders, had the ability to “cut” a project, ending its development and requiring the students on that team to join another project. With the intricacies of the Capstone, the stakes were high - develop an incredible game, or get cut.<br /> <br /> The intimidating context was fuel for us, and we took to the stage with fervor. Our team had one core ambition above all else - we were determined to make a polished game that could be released and viewed as more of an “indie” game rather than a “student” game. This goal of polish made us more considerate of any possibility where our game could get cut and further informed our decision-making when it came to scope, design, and many other aspects of Hollowed’s development. It even became the basis for our team’s name: Project Polish (pō-lish) Productions.<br /> <h2>Art </h2> <h3><strong>Style</strong></h3> It took a lot of iteration for us to find our way to our final style. Initially, we were heavily influenced by the game Ori and the Blind Forest and the films of Studio Ghibli. This led us to try hand-painted textures for our assets. We did a few early experiments with this workflow in Substance Painter, which can be seen in our <a href="https://youtu.be/gptMCkXAUNA?t=1h54m10s" target="_blank">vertical slice demo</a> of the game from February 2017.<br /> <br /> <img alt="Style-1---Initial-Vertical-Slice.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fembarking-on-an-emotional-journey-in-the-student-game-hollowed%2FStyle-1---Initial-Vertical-Slice-1080x608-7ee2039bb912e694c5c2144adba9a5377ee92974.jpg" width="100%" /><br /> <br /> After the rush to create the vertical slice, we took a step back and really analyzed the game again. We realized a few things. The textures felt muddy and things were getting lost. In a 3D side-scrolling platformer, readability is crucial for players. Another issue was time.
As students with a relatively small team and a short development window, we found the effort needed to create quality hand-painted textures was too great.<br /> <br /> These constraints heavily influenced the style we landed upon, although we still took influence from Ori and the Blind Forest and Studio Ghibli for their use of color, fantastical elements, and mood/themes. We decided to also look at other games in similar genres. Games like Inside and Journey, along with the work of artist Mikael Gustafsson, became huge influences. Clean silhouettes, exaggerated lighting, color, and contrast became the tentpoles of our style.<br />  <br /> <img alt="Style-2---Gustafsson-Reference.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fembarking-on-an-emotional-journey-in-the-student-game-hollowed%2FStyle-2---Gustafsson-Reference-1080x608-4fa09233d59c1c830fc98d91f740ebdc09ef2874.jpg" width="100%" /><br /> With a little more direction, we went back to our vertical slice and made changes to try and reflect our new style. This was a test to feel out a new and improved workflow for our asset creation and lighting.<br /> <img alt="Style-3---Revised-Vertcal-Slice.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fembarking-on-an-emotional-journey-in-the-student-game-hollowed%2FStyle-3---Revised-Vertcal-Slice-1080x608-ce7799c35414f683c4577e69571ad9ebebfaf638.jpg" width="100%" /><br /> Although we ended up reworking this scene before the release of the game, this proved a few things for us. First, asset creation was indeed faster. We could now focus mostly on shape, and textures could be more simplified. We could skip a high-poly-to-low-poly workflow for most assets and utilize flat colors for things in the distance that would only appear silhouetted. Secondly, this felt closer to the mood we were trying to nail down for the game. Things felt a lot more fantastical and otherworldly, while at the same time reading a bit more cleanly to players. <br /> <br /> It was a bit easier to proceed once our style was nailed down, even though this wasn&#39;t the end of our iterations. Throughout almost the entire project, we wanted Halia&#39;s journey to go from daylight to nighttime to dawn. This meant that early parts of our game needed the lighting to reflect that. When we completed the first chapter of the game, we took a step back and looked at it. Although we felt we were in a better place than the vertical slice, we still weren&#39;t satisfied. Our team noticed that with the daytime lighting there was a lack of contrast between the real world and the underworld (where Halia would spend most of the game). Narratively and design-wise, we needed these two worlds to feel and look different; however, both worlds looked dull. The underworld didn&#39;t look appealing because it did not have the saturated and exaggerated colors we were able to get in our reworked vertical slice.<br />   <br /> <img alt="Style-4---Tree-of-Life-(Day).jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fembarking-on-an-emotional-journey-in-the-student-game-hollowed%2FStyle-4---Tree-of-Life-%28Day%29-1080x608-4d5ce12ee31bc1b45ec5272ac873d20c8daf5ad4.jpg" width="100%" /><br /> We decided to make the entire game take place during the night, leading into dawn at the end of the game. Because of that, we were able to really push the light and colors.
Not only did this give us more contrast between the real world and the underworld, but it also provided a stronger narrative parallel to Halia&#39;s journey into darkness and arriving at a new dawn.<br /> <img alt="Style-5---Tree-of-Life-(Night).jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fembarking-on-an-emotional-journey-in-the-student-game-hollowed%2FStyle-5---Tree-of-Life-%28Night%29-1080x608-d12bf13fbdb27a69cedee285160ece5a761d2d81.jpg" width="100%" /><br /> During the process, we experienced a lot of back and forth when it came to art. For future projects, we believe more time in pre-production would be key. Unfortunately, with our very short development phase, we couldn&#39;t devote too much time before we had to move into production. All in all, we appreciate all the hard work our artists put into the game and we are extremely proud of where we were able to take it. Prior to completion, we ended up having to redo textures, lighting, and set dressing several times to get where we needed to be.<br /> <img alt="Style-6---Grove-of-Acceptance.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fembarking-on-an-emotional-journey-in-the-student-game-hollowed%2FStyle-6---Grove-of-Acceptance-1080x608-9f7734e72d9de4fb4635f9370c217dd49c8f7f44.jpg" width="100%" /> <h3><strong>Tech Art</strong></h3> To achieve the visual style for Hollowed, it was important to standardize how our Materials were set up. Initially, functionality had to be copied and pasted across multiple Materials. Material Functions allowed us to reuse and edit functionality in one place and have the changes propagate outward. Using Material Instances, Parameters, and Material Functions, we were able to create Materials that non-artist team members could use. This saved us time, since we could tweak Materials in-engine without having to go to an external tool to edit and then re-import them.<br /> <br /> Material Instances were also created and manipulated within Blueprints in order to animate textures. By using Lerps and time within a Blueprint, we were able to control Parameters in a Material&#39;s behavior, resulting in moving textures. Examples of this can be seen in Hollowed when the lover disintegrates and when the god’s arm appears and disappears in the intro. <br /> <img alt="Gif.gif" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fembarking-on-an-emotional-journey-in-the-student-game-hollowed%2FGif-085e6d61c63b999b97b6b152f475bc47284de4ea.gif" width="100%" /><br /> UE4’s Material Editor is so powerful that we are able to access aspects of how a given scene is rendered. This allowed us to build post-processing and rendering tricks that make the visual aesthetic unique. Since UE4 ships with full engine source code, our programmers modified how UE4’s renderer operates in order to control how they reveal a hidden world within the levels of Hollowed.
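<p>For readers curious how this looks outside of Blueprints, here is a minimal C++ sketch of the same parameter-animation pattern: create a Dynamic Material Instance, then drive one of its Scalar Parameters from Tick. This is an illustration of the general technique rather than code from Hollowed; the "DissolveAmount" parameter name and the two-second fade are assumptions made for the example.</p> <pre><code>// Minimal sketch (not from Hollowed): animate a Material Parameter over time.
// Assumes the mesh's Material exposes a scalar parameter named "DissolveAmount".

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "DissolvingActor.generated.h"

UCLASS()
class ADissolvingActor : public AActor
{
    GENERATED_BODY()

public:
    ADissolvingActor()
    {
        PrimaryActorTick.bCanEverTick = true;
    }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        if (UStaticMeshComponent* Mesh = FindComponentByClass<UStaticMeshComponent>())
        {
            // Per-instance copy of the mesh's Material that can be edited at runtime.
            DynamicMaterial = Mesh->CreateAndSetMaterialInstanceDynamic(0);
        }
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        Elapsed += DeltaSeconds;
        if (DynamicMaterial)
        {
            // The Blueprint "Lerp over time" equivalent: ramp from 0 to 1 across two seconds.
            DynamicMaterial->SetScalarParameterValue(
                TEXT("DissolveAmount"), FMath::Clamp(Elapsed / 2.0f, 0.0f, 1.0f));
        }
    }

private:
    UPROPERTY()
    UMaterialInstanceDynamic* DynamicMaterial = nullptr;

    float Elapsed = 0.0f;
};</code></pre> <p>The Blueprint version the team describes is the same idea: Create Dynamic Material Instance, then Set Scalar Parameter Value driven by elapsed time.</p>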
By applying a Material Function to an object, our programmers were able to create a stencil and either hide or reveal that object to the player: <br /> <img alt="gif2.gif" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fembarking-on-an-emotional-journey-in-the-student-game-hollowed%2Fgif2-b1b0441dbe27b4b21ae8255eaf2b046266071456.gif" width="100%" /> <h2>Design</h2> <h3><strong>Gameplay</strong></h3> As a team, Project Polish was fascinated by how movies and video games could convey powerful, lasting, and interpretive emotions through storytelling, visuals, and imagery. Ghibli movies like Spirited Away and games like Inside were the base of our inspiration. Spirited Away’s beautiful imagery and Inside’s storytelling without a single word were the foundation for our game.<br /> <br /> Once we had our inspirations, we started to interpret and adapt them to create a new vision and design for our game. It was clear to us that we wanted to create a game that would convey the feeling of melancholy. As an emotion that lies between happiness and sadness, melancholy can be very personal, with each individual interpreting it differently. We built a story and a world that could reflect that feeling. This ultimately led to the stages of grief becoming a clear foundation for our story, world, and mechanics. <br /> <img alt="Gameplay-1---Melancholy.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fembarking-on-an-emotional-journey-in-the-student-game-hollowed%2FGameplay-1---Melancholy-1080x608-ad07bddc88b379ec8636bbc08f6acc675d557a9d.jpg" width="100%" /><br /> The key to basing the mechanics around the stages of grief was to create a strong foundation to ensure that the player would never feel lost. Each ability state had to create an interaction that affected both the heroine and the spirit. From there, our team would interpret what each stage meant for them and how it could be adapted into a mechanic. With Denial, for example, we interpreted this stage as the inability to face the reality of the world. This inspired the reveal mechanic, a way of showing objects that were not there in the first place. This was a satisfying way for us to portray how the heroine was unable to clearly perceive the world for what it was, and only with the help of her spirit could she temporarily face her reality.<br /> <img alt="gif3.gif" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fembarking-on-an-emotional-journey-in-the-student-game-hollowed%2Fgif3-7389eb6e46ab3a2b317b78895f8e7b8b793b4b36.gif" width="100%" /><br /> Overall, like Inside, it was clear to us that our game was meant to be a journey of discovery for our players. We strove to make sure that nothing was ever on the nose. We wanted players to interpret our game as they see fit. For us, if a player creates their own satisfying story during their playthrough, even one vastly different from what we had in mind during development, then the experience becomes all the richer for it. <h3><strong>Level Design</strong></h3> The tools associated with the Level Editor were a boon, allowing quick iteration on blockmesh maps and offering several ways to create the base shape of a level. UE4 provides us with basic shape Static Meshes, but also the ability to use “brush shapes” for more customizable blockmesh geometry. Using a Box Brush, it was easy to create a box, scale it up, then hollow it out.
From there, it was easy to duplicate that box, scale it down into the shape of a door, and cut out as many doorways as needed. In a matter of seconds, the base shape of a room was created.<br /> <img alt="Level-Design-1---Box-Editor-Piece.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fembarking-on-an-emotional-journey-in-the-student-game-hollowed%2FLevel-Design-1---Box-Editor-Piece-671x747-3ad79584a5a529a763c05512d808799c9fc3b5e3.jpg" width="100%" /><br /> It was also possible to take time to create more complex geometry inside a room, then convert the brush to a Static Mesh for an artist to pull out for scale. UE4 provides many of the basic tools needed from 3D modeling programs and places them at the fingertips of the designer. Iteration became simple because whatever needed to be altered was easily cut, modified, or reshaped into what it needed to represent. Combining these tools with the Material Editor and the Level Editor, our team was able to move expeditiously while creating an accurate representation of the intended level.<br /> <br /> <img alt="Level-Design-2---Whitebox.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fembarking-on-an-emotional-journey-in-the-student-game-hollowed%2FLevel-Design-2---Whitebox-1080x545-f3b67408bfaabf3cc4b0dc7d92fb35279204d71f.jpg" width="100%" /><br /> <img alt="Level-Design-2---Polished-Room.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fembarking-on-an-emotional-journey-in-the-student-game-hollowed%2FLevel-Design-2---Polished-Room-1080x608-7fecac7c488afc76ae2c8c7119ae10a72181d5b0.jpg" width="100%" /><br /> Thank you for reading. We hope this recap of the work put into Hollowed encourages or inspires you in some way. To learn more about the project, please visit <a href="https://projectpolish.wordpress.com/" target="_blank">Project Polish Productions</a>.<br /> <br /> Hollowed is available for free on <a href="https://store.steampowered.com/app/669630/Hollowed/" target="_blank">Steam</a>. You can also follow Project Polish Productions on <a href="https://twitter.com/TeamPolish" target="_blank">Twitter</a> and <a href="https://www.youtube.com/channel/UCFtRxgVS5OqDoLW3dAErlcg" target="_blank">YouTube</a>. <h2>The Team</h2> <h3><strong>Intel University</strong></h3> <h3>Project Polish Productions<img alt="The-Team-1---Project-Polish-Productions.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fembarking-on-an-emotional-journey-in-the-student-game-hollowed%2FThe-Team-1---Project-Polish-Productions-1080x720-fda6acdf98717e0fc7b8c9825282570a0ab03fc5.jpg" style="font-size: 13px;" width="100%" /></h3> Erin Marek - Project Lead<br /> Charley Choucard - Lead Designer<br /> Leah Augustine - Lead Artist<br /> Louis Hofer - Lead Programmer<br /> Paul Salas - Character Artist<br /> Matthew Trupiano - Character Artist<br /> Melissa Almirall - Animator<br /> Will Perez-Valines - Animator<br /> Douglas Halley - Art Manager / Technical Artist<br /> Yunhao Huo - VFX / Technical Artist<br /> Anthony German Ballinas - Environmental Artist<br /> Brandon Kidwell - Writer / Level Designer<br /> Gabi Capraro - Level Designer <br /> Cameron M.
Schwach - UI/AI Technical Designer<br /> Jerrick Flores - Audio/Technical Designer<br /> Kayla Garand - Audio Designer<br /> Martin Holtkamp - Gameplay/Graphics Programmer<br /> Siddharth Suresh - Gameplay/Animation Programmer <h3><strong>Special Thanks</strong></h3> Richard Hall<br /> Tom Carbone<br /> Ron Weaver<br /> Bailey Steggerda<br /> Jackson King<br /> <br />  communitynewseducationProject Polish ProductionsTue, 31 Jul 2018 11:00:00 GMThttps://www.unrealengine.com/blog/embarking-on-an-emotional-journey-in-the-student-game-hollowedhttps://www.unrealengine.com/blog/embarking-on-an-emotional-journey-in-the-student-game-hollowedUnreal Engine Powers VR Interviews with the Teenage Mutant Ninja Turtles at Comic-Con 2018Nickelodeon debuted “Rise of the Teenage Mutant Ninja Turtles VR Interview Experience” at San Diego Comic-Con 2018, where users could step inside the world of the Teenage Mutant Ninja Turtles to conduct a live interview with the cast of <a href="http://www.nick.com/rise-of-the-teenage-mutant-ninja-turtles/" target="_blank">Rise of the Teenage Mutant Ninja Turtles</a> in virtual reality. This one-of-a-kind interview allowed users to have a conversation with Mikey or Donnie - voiced live on the scene by series voice talent Brandon Mychal Smith and Josh Brener, respectively. We spoke with Chris Young, Entertainment Lab Senior VP at Nickelodeon, to get the inside scoop.<br /> <br /> <img alt="rtmnt_img7.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-powers-vr-interviews-with-the-teenage-mutant-ninja-turtles-at-comic-con-2018%2Frtmnt_img7-1920x1090-d71d623582e01a25b90044b0208eb849bb615928.jpg" width="100%" /><br /> <br /> <strong>What inspired you to use VR as the medium for the TMNT Comic-Con press junket?</strong><br /> <br /> The idea originated from a meeting that we had at NAB in Las Vegas with Epic Games, Adobe and NewTek. We had put together a pipeline to stream Adobe Character Animator into UE4 using NewTek’s NDI technology and were looking to do a live cartoon in a game engine; VR was the obvious way to get up close and experience it.  <br />   <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="https://www.youtube.com/embed/yj-LYFN7P5g" width="100%"></iframe></div>  <br /> <strong>How did you devise a live two-way conversation in VR with Turtles Mikey (Michelangelo) and Donnie (Donatello)?</strong><br />  <br /> We had done a lot of exploration around real-time puppetry along with full-body and facial performance capture streaming into Unreal, and when the opportunity to do something with the Turtles came about, it seemed like an innovative approach to allow journalists to speak directly to our characters in a real-time experience.<br />  <br /> <strong>Was mo-cap integrated into the live experience?</strong><br />  <br /> Since the Turtles in the reimagined Rise of the Teenage Mutant Ninja Turtles series have a 2D design style, we relied on Adobe’s Character Animator tool to handle the animation, which allowed us to create on-model, art-directed character rigs that used actual poses and cycles from the show.
We have a large motion capture volume where we are working on several projects that use more traditional performance capture, with a pipeline for streaming into UE4.<br /> <br /> <img alt="rtmnt_image01.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-powers-vr-interviews-with-the-teenage-mutant-ninja-turtles-at-comic-con-2018%2Frtmnt_image01-1920x1040-2e40d6383049a5f3f2343b78fdea9c3ace331141.jpg" width="100%" /><br /> <br /> <strong>Please describe the workflow (hardware and software, headset, capture) from creation of the initial CG/environment assets, through to the final produced interviews. </strong><br />  <br /> The virtual reality experience was developed using Unreal Engine. Building the experience in a game engine made real-time interactions and conversations possible. The experience streams in the Mikey and Donnie puppets using NewTek NDI technology, which carries high-quality video streams over a shared network. The Mikey and Donnie puppets were created and are driven live using Adobe Character Animator. MIDI keyboards with mapped animation cycles are used to trigger the various poses Mikey and Donnie can strike; these keyboards are played live during the interviews. NDI allows the puppets to be streamed out of Adobe Character Animator and picked up by various other machines, which use these live animations for the experience and compositing. <br /> <br /> <img alt="rtmnt_img10.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-powers-vr-interviews-with-the-teenage-mutant-ninja-turtles-at-comic-con-2018%2Frtmnt_img10-1920x1090-0a40e49b2c6b3f57a43aebb6c89f1b9de2dc0a4a.jpg" width="100%" /><br /> <em>Mikey and Donnie in Nickelodeon&#39;s Rise of the Teenage Mutant Ninja Turtles VR Interview Experience.</em><br />  <br /> As the user stands in front of a green screen, they are composited into the New York City rooftop, with a Nickelodeon character avatar in place of their head. To maintain performance for the VR viewer and to create high-quality outputs, compositing was accomplished with a networked spectator version of the experience. This spectator version separates foreground and background layers, which are picked up over NDI. These layers are then surfaced in Resolume and composited in real time with the live-action footage. That final composition is sent out of Resolume via NDI. <br />  <br /> The final composition of the live-action footage, the POV of the VR player, and close-up and wide-angle shots of the two Turtles are all recorded with a TriCaster. The TriCaster allows for real-time editing as the interview is occurring, with the ability to hand ISO records and the final program edit to journalists on a thumb drive upon completion of the interview. <br />  <br /> We used a combination of 2D planes for background elements and CG meshes in the foreground to get correct perspective for the VR viewer. In order to replicate the 2D look in a 3D space, we adopted a workflow for taking deformed meshes from Maya into UE4 until we felt that we had captured the correct pushed, warped perspective for a stylized hand-drawn environment. Back in Maya, from the VR player’s POV, we projected flat planes on the background geometry and painted textures on those flat UVs using the actual Photoshop brushes from the show. This perfectly captured the hand-drawn, painted style of the backgrounds.
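<p>A note on the spectator setup described above: the networked spectator build is Nickelodeon’s own pipeline, so it is not reproduced here. For simpler cases, UE4 also ships a built-in, single-machine alternative, the VR Spectator Screen, which can show a custom composited texture to onlookers while the headset wearer sees the normal stereo view. A minimal sketch, where CompositeTexture is assumed to be a texture (for example, a render target) you have already composited into:</p> <pre><code>// Stock UE4 VR Spectator Screen (not the networked NDI pipeline described above).
// Requires the HeadMountedDisplay module in the project's Build.cs dependencies.
#include "HeadMountedDisplayFunctionLibrary.h"
#include "Engine/Texture.h"

void ShowCompositeToSpectators(UTexture* CompositeTexture)
{
    // Replace the spectator view with a custom texture...
    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenMode(ESpectatorScreenMode::Texture);
    // ...and point it at the live composite.
    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenTexture(CompositeTexture);
}</code></pre>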
<br />  <br /> <strong>Does this experience use similar technology to the Nickelodeon experience created for IMAX VR centers? How is this Comic-Con experience an extension of that technology?</strong><br />  <br /> We were able to merge a lot of our code for handling VR pawns and VoIP, along with some of the CG avatar assets we created for <a href="https://www.unrealengine.com/en-US/blog/nickelodeon-s-slimezone-multi-player-vr-experience-comes-to-imax-centers" target="_blank">SlimeZone VR</a>, into the Turtles experience. <br />  <br /> <strong>Why did you choose to build this experience in Unreal Engine?</strong><br />  <br /> We’ve had a lot of success rapidly going from prototypes to fully functional builds in UE4. The knowledge we’ve acquired for authoring VR in UE4, mixed with the ability to mock up ideas in Blueprints, allows us to iterate quickly and gives us freedom to focus on creating cool things. <br /> <br /> <img alt="rtmnt_image02.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Funreal-engine-powers-vr-interviews-with-the-teenage-mutant-ninja-turtles-at-comic-con-2018%2Frtmnt_image02-1920x1040-1cf528b4087c1bc321e534e27b64cba83812e323.jpg" width="100%" /><br />  <br /> <strong>Which version of UE did you use, and was there frequent use of any favorite UE features?</strong><br />  <br /> We used 4.19. Since our development tends to be experimental and cycles are short, we are pretty aggressive about updating to the latest version of the engine. We relied heavily on NewTek’s NDI plugin with alpha support for streaming video onto UE4 materials. <br />  <br /> <strong>Did the animators from the series collaborate with the creators of the VR interview activation for Comic-Con?</strong><br />  <br /> We worked closely with the creators and the animation team to bring the look of the show and the animation style into the VR experience. A lot of attention went into getting the 2D look to translate to VR.  <br />   <br /> <strong>What was the biggest challenge in pulling this off?</strong><br />  <br /> The live aspect required a lot of pre-planning: keeping audio, video, and game sources in sync meant a network of machines all sending and receiving data to one another, all of which had to be trucked from our studio in Burbank and built on-site at Comic-Con. <br />  <br /> <strong>How are the production considerations different when designing a VR experience that requires live updates on the fly?</strong><br />  <br /> This was a hybrid of animation, game development, live-action performers, puppeteers, and virtual cinematography techniques, all on location in a live-broadcast setting. It was everything rolled into one, and like nothing we had ever done before!film and televisionvrenterpriseeventsshowcaseDaniel KayserMon, 30 Jul 2018 16:30:00 GMThttps://www.unrealengine.com/blog/unreal-engine-powers-vr-interviews-with-the-teenage-mutant-ninja-turtles-at-comic-con-2018https://www.unrealengine.com/blog/unreal-engine-powers-vr-interviews-with-the-teenage-mutant-ninja-turtles-at-comic-con-2018Learn How to Develop High-End Mobile Games with the Action RPG Sample Project<p> Recently shipping alongside Unreal Engine 4.20 is the Action Role Playing Game (Action RPG or ARPG) sample project. As the name suggests, Action RPG is a fast-paced, third-person hack-and-slash game that was built from the ground up to help developers learn more about how to use UE4 to develop high-end mobile games for both Android and iOS.
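</p> <p>To give a flavor of the first topic on the list below, mixing C++ and Blueprints, here is a minimal, hypothetical sketch of the usual pattern: performance-critical logic lives in a C++ base class and is exposed so designers can call and extend it from Blueprint subclasses. The class and member names are illustrative, not taken from the sample.</p> <pre><code>// Hypothetical sketch of the C++/Blueprint split (names are illustrative,
// not from the Action RPG sample).

#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "MyRPGCharacter.generated.h"

UCLASS()
class AMyRPGCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    // Implemented in C++, callable from any Blueprint graph.
    UFUNCTION(BlueprintCallable, Category = "Abilities")
    bool CanUseAbility(float ManaCost) const
    {
        return Mana >= ManaCost;
    }

    // Declared in C++, implemented in a Blueprint subclass (e.g. to play effects).
    UFUNCTION(BlueprintImplementableEvent, Category = "Abilities")
    void OnAbilityUsed(float ManaCost);

protected:
    // Tunable per Blueprint subclass without touching code.
    UPROPERTY(EditDefaultsOnly, BlueprintReadWrite, Category = "Abilities")
    float Mana = 100.0f;
};</code></pre>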
<img alt="ActionRPGSampleProject_Title.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Flearn-how-to-develop-high-end-mobile-games-in-ue4-with-the-action-rpg-game-sample-project%2FActionRPGSampleProject_Title-1200x675-4cca69e224c8376d2e6909b13f503d90be8d46b6.jpg" width="100%" /><br /> <p> Inside the Action RPG sample project and accompanying documentation, you will find a wide range of topics that any UE4 developer will find useful. Some of the topics this sample covers include: </p> <ul> <li>Utilizing C++ and Blueprints together in a UE4 project.</li> <li>Setting up and using certain aspects of UE4&#39;s Ability system.</li> <li>How to support multiple platforms like Android, iOS, PC, Mac, and consoles.</li> </ul> <p> Things to Consider: </p> <ul> <li>Due to the complexity of UE4&#39;s Ability system, ARPG only utilizes a small subsection of the available features.</li> <li>Certain aspects of this project also require that you have a good understanding of how C++ works.</li> </ul> <p> Related Links: </p> <ul> <li><a href="https://play.google.com/store/apps/details?id=com.EpicLRT.ActionRPGSample" target="_blank">Google Play</a></li> <li><a href="https://itunes.apple.com/us/app/action-rpg/id1411473790?ls=1&amp;mt=8" target="_blank">iOS</a></li> <li><a href="https://www.unrealengine.com/marketplace/action-rpg" target="_blank">Unreal Marketplace</a></li> </ul> <p> We hope you enjoy this sample project and we look forward to seeing the projects you develop!<br /> </p> <img alt="FB_ActionRPGSampleProject.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Flearn-how-to-develop-high-end-mobile-games-in-ue4-with-the-action-rpg-game-sample-project%2FFB_ActionRPGSampleProject-1200x675-555c51d61fc8276009080bab27f74deacf5fabda.jpg" width="100%" />newscommunitymobilefeaturesSam DeiterFri, 27 Jul 2018 15:00:00 GMThttps://www.unrealengine.com/blog/learn-how-to-develop-high-end-mobile-games-with-the-action-rpg-sample-projecthttps://www.unrealengine.com/blog/learn-how-to-develop-high-end-mobile-games-with-the-action-rpg-sample-projectSketchUp + Unreal Engine + Unreal Studio = Your Design Potential RealizedAre you ready to remove the obstacles hampering your design creativity?<br /> <br /> Making 3D modeling and real-time visualizations easier and more accessible for architectural designers and engineers is essential for taking design productivity to the next level.
In our upcoming webinar, we’re going to show you the exciting creative potential of combining the power of Unreal Studio with the intuitive flexibility of SketchUp for real-time design in Unreal Engine.<br /> <br /> Get ready for an eye-opening look at ways to streamline and enhance your design workflows in our webinar on July 31, <strong>Great Things Happen in Unreal Engine with Unreal Studio and SketchUp</strong>, which you can register for below.<br /> <br /> <br /> <img alt="Blog_share_1200x630Option3.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fsketchup-unreal-engine-unreal-studio-your-design-potential-realized%2FBlog_share_1200x630Option3-1200x630-78b30cf2743aa0e569c281f8a43a61a31527ac79.jpg" width="100%" /><br /> <br /> Join Aaron Dietzen, Customer Success Manager at SketchUp, as he demonstrates how working with SketchUp and Unreal Engine can make the process of creating dynamic 3D visualizations faster and simpler than ever before!<br /> <br /> <br /> <strong>You’ll learn:</strong> <ul> <li>Faster, easier workflows for 3D modeling and real-time design</li> <li>How SketchUp makes 3D design more accessible</li> <li>Tips for getting the most out of your real-time design visualizations</li> <li>How to dramatically speed up design data import with Datasmith</li> </ul> <br /> <strong>About our presenter:</strong><br /> <br /> Aaron started his career in software development and architectural design while still in high school, working as a software tester for roof truss design software. Over the next few decades, he learned the ins and outs of both industries in positions ranging from design team lead to software product manager. After using SketchUp for nearly a decade, Aaron joined the Trimble team as a Sales Engineer.<br /> <br /> Don’t miss out on this free online webinar. <br /> <br /> Register here: <a href="https://event.on24.com/wcc/r/1803451/77F6B58FBA7EE54C8D62B314CAAEC9A2" target="_blank">EMEA</a> / <a href="https://event.on24.com/wcc/r/1804385/A2B62B1141E8968C832FE96D4345F7F6" target="_blank">AMER</a>enterprisedesignlearningeducationprogrammingvisualization and trainingKen PimentelFri, 27 Jul 2018 13:00:00 GMThttps://www.unrealengine.com/blog/sketchup-unreal-engine-unreal-studio-your-design-potential-realizedhttps://www.unrealengine.com/blog/sketchup-unreal-engine-unreal-studio-your-design-potential-realizedCCP Games Chooses Unreal Engine for All Upcoming Projects<a href="https://www.ccpgames.com/" target="_blank">CCP Games</a>, the company behind the deep and uniquely player-driven spaceship MMO game EVE Online, has confirmed that Unreal Engine 4 is the exclusive development tool for all of CCP Games’ currently unannounced projects. CCP’s commitment to Unreal stems from the engine’s stability, quick prototyping, and solid cross-platform support for both established and new hardware. <br /> <br /> CCP’s development teams around the globe are taking advantage of the powerful and easy-to-use functionality of UE4, from fluid integration of third-party systems with plug-ins and modules to improved networking and cross-platform support.
CCP’s game developers have embraced the exclusive use of UE4, and specifically the Unreal Editor, which leads the way in lighting and rendering solutions, world composition, landscape sculpting, and Blueprint prototyping.<br /> <br /> “Although CCP Games has used versions of Unreal Engine for more than 10 years on various projects, Unreal Engine 4.20 has massive improvements,” said Bing Xi, Development Director at CCP Shanghai. “UE4’s latest enhancements, like the new ProxyLOD mesh reduction tool and the new Niagara VFX editor, are among the many features that make game development a more efficient process.”<br /> <br /> <img alt="FB_CCP_UE420_Logos.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Fccp-games-developing-highly-anticipated-new-games-exclusively-with-epic-games-unreal-engine-4%2FFB_CCP_UE420_Logos-1200x630-5a04e93d54af767fc80ffc18e716385d7d0524b8.jpg" width="100%" /><br /> <br /> Using Unreal Engine 4 in the development of all their upcoming new games allows CCP’s studios to plan for the future by cultivating a shared in-house knowledge of the engine. Coupled with the engine’s ability to ease the transition of games to different platforms, UE4 enables CCP to stay platform-relevant earlier and for longer.<br /> <br /> “Working with Epic Games using their engine source code, which is open to all Unreal developers, is great for both AAA and indie studios alike, thanks to the huge amount of support options available,” said James Dobrowski, Executive Producer on the unannounced action-MMO being created at CCP London. “UE4’s Blueprints system allows us to prototype and iterate quickly, and its world-class AAA pipelines allow us to focus on crafting great gameplay and stunningly beautiful worlds.”<br /> <br /> CCP Games is one of many major players in the industry whose development teams trust Epic Games’ UE4, from traditional gaming companies including Microsoft, Nintendo and Sony, to newer technology and experience makers, such as Magic Leap and Oculus.<br /> <br /> “The fact that Epic uses its own engine to develop games like Fortnite is a great example of the trust they have in their own product,” said Snorri &Aacute;rnason, Project Nova’s Game Director at CCP Reykjavik. “The product’s push for cinematic rendering quality is something that we will aim to implement in our future projects.” <br /> <br />  newscommunitygamesGeorge Kelion Thu, 26 Jul 2018 13:00:00 GMThttps://www.unrealengine.com/blog/ccp-games-chooses-unreal-engine-for-all-upcoming-projectshttps://www.unrealengine.com/blog/ccp-games-chooses-unreal-engine-for-all-upcoming-projectsArchitect Places as Finalist with First Unreal Studio Project<p>When architect Pawel Mielnik set out to tackle the <a href="https://cabins.ronenbekerman.com/" target="_blank">Ronen Bekerman CABINS Challenge</a>, he decided to try something new: real-time rendering. Mielnik, who specializes in architectural visualization at Poland-based <a href="http://www.essentium.pro">Essentium Studio</a>, took a step away from his usual workflow to try out Unreal Studio for the first time.</p> <p>Despite never having used the software before, Mielnik created several stunning images of a Fogo Island cabin that landed him as a finalist in the competition.
Most of the competing imagery used offline renderers.</p> <p>Mielnik’s work caught our attention when he posted it on the <a href="https://forums.unrealengine.com/development-discussion/architectural-and-design-visualization/1477912-tower-studio-my-first-archviz-with-unreal" target="_blank">Unreal Studio forum</a>.</p> <img alt="body-img1.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Farchitect-places-as-finalist-with-first-unreal-studio-project%2Fbody-img1-1640x1000-15ebaa7c3c6701ef8c82c8fb2880c1d2a5b5b181.jpg" width="100%" /> <div style="text-align: center;"><em>Tower Studio in Unreal Engine</em></div> <em> </em><strong>Let the training begin</strong> <p>Mielnik has long used 3ds Max as his primary DCC tool with V-Ray for rendering. After watching a number of YouTube tutorials on Unreal Studio by other visualization professionals, he decided to give it a try. “I’ve seen people achieve really good results fairly quickly,” says Mielnik, “and the way they did it didn’t seem too difficult.”</p> <p>Before starting his project, Mielnik browsed the Unreal Studio forum to find out how others achieved similar results, and watched all the <a href="https://www.unrealengine.com/en-US/video-tutorials" target="_blank">Unreal Studio training videos</a>.</p> <img alt="tutorial-img.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Farchitect-places-as-finalist-with-first-unreal-studio-project%2Ftutorial-img-1640x1000-4db9f2e4f096f725b9f5384540e62784eacf86e2.jpg" width="100%" /> <center><i>Unreal Studio training video</i></center> <p>Next, he chose a project: <a href="http://saunders.no/work/fogo-island-tower-studio/" target="_blank">Tower Studio</a>, one of the iconic Fogo Island cabins by Saunders Architecture. Fogo Island, off the coast of Newfoundland in Canada, is well known for its striking coastline. “I like the shape of the Tower Studio,” says Mielnik, “and was amazed by the landscape.”</p> <strong>Bring on Datasmith</strong> <p>Mielnik started by creating the model in 3ds Max, then exporting it to Unreal Engine with the Datasmith plugin included with Unreal Studio. “Using Datasmith was so easy,” he says. “It’s really just a matter of clicking one button. Prepare your model, set up some simple UVs, and export.”</p> <p>Next, he turned his attention to the environment. Mielnik started by creating a simple landscape and sculpting it with Unreal Engine’s tools. He then made a landscape material with four layers using <a href="https://megascans.se/" target="_blank">Quixel Megascans</a> and <a href="https://www.allegorithmic.com/products/substance-designer" target="_blank">Substance Designer</a> textures, and started painting.</p> <p>Mielnik also used Megascans for the foliage, setting up a master material and then painting and tweaking until he was happy with the results.</p> <p>“Building the master materials was a bit of a challenge,” says Mielnik, “but being able to paint with the layers was easy and fun.”</p> <img alt="panoramic-img.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Farchitect-places-as-finalist-with-first-unreal-studio-project%2Fpanoramic-img-1640x820-a50e782bdf2e963aa23202084871fed66b92c7ba.jpg" width="100%" /> <center><i>Panoramic view of the site. You can navigate this panorama in the artist’s <a href="https://www.artstation.com/artwork/Ykm6q" target="_blank">Artstation portfolio</a></i>.</center> <p>He also appreciates being able to use Unreal Engine to test multiple lighting scenarios. 
“Testing goes very quickly. You don’t have to press ‘Render’ and wait for the results every time you make a change.”</p> <p>To add to his still renderings, Mielnik tried his hand at a 360-degree panorama with Unreal Engine. “This process takes hours when using offline render engines,” he says, “but with the <a href="https://docs.unrealengine.com/en-us/Engine/Plugins/Ansel/Overview" target="_blank">NVIDIA Ansel plugin</a>, it literally took two minutes. It’s amazing how much time it can save you.”</p> <p>He adds that he didn’t do any touch-up on his images with outside packages—all his imagery came straight from Unreal Engine.</p> <strong>Watching the scene unfold</strong> <p>In creating the project, Mielnik learned not only a new workflow, but a new way of seeing the scene progress. When he began painting, the real-time results gave him a very different experience from working in a DCC viewport with a separate rendering step. “It took me a while to get used to seeing the work in progress rendered right there in real time,” Mielnik says, “but now I very much prefer it over the previous way that I worked.”</p> <img alt="body-img4.jpg" height="auto" src="https://cdn2.unrealengine.com/Unreal+Engine%2Fblog%2Farchitect-places-as-finalist-with-first-unreal-studio-project%2Fbody-img4-1640x1000-89283a3b8f8c659b024fe77c18e38653f2f73656.jpg" width="100%" /> <center><i>Painted landscape for Tower Studio</i></center> <p>“I love how easy it is to achieve certain effects that are much more time-consuming with other engines used for arch viz,” he says. “A great example is volumetric fog. You just drag and drop into your scene and work a little on the values, and it’s done.”</p> <p>Mielnik is excited about using Unreal Engine for future projects. “I’m just starting to appreciate how quick it is to create animations, and haven’t even touched VR and walkthroughs yet.”</p> <p>You can see more of Mielnik’s work on his <a href="http://www.visometria.com" target="_blank">personal website</a> and on his <a href="https://www.artstation.com/pjmielnik" target="_blank">Artstation page</a>.</p> <strong>Save time and make something great</strong> <p>Want to try out Unreal Studio yourself? <a href="https://www.unrealengine.com/en-US/studio" target="_blank">Join our free beta</a> today, and post your results on the forum!</p>  communitydesignenterpriselearningKen PimentelThu, 26 Jul 2018 03:47:56 GMThttps://www.unrealengine.com/blog/architect-places-as-finalist-with-first-unreal-studio-projecthttps://www.unrealengine.com/blog/architect-places-as-finalist-with-first-unreal-studio-project