Unreal Engine 4.22 released
What's New
Unreal Engine 4.22 continues to push the boundaries of photorealism in real-time environments whether you are making immersive and engaging games, broadcasting live television, visualizing groundbreaking products, or creating the next blockbuster film. We don't believe significant advances in technology should result in increases in development time for you to take advantage of them, so we have once again set our sights on making workflows for users from all disciplines even faster and more accessible.
Unreal Engine delivers unbridled power to build realistic worlds with the most accurate real-time lighting and shadowing effects - including dynamic global illumination, pixel-perfect reflections and physically accurate refraction - thanks to real-time ray tracing on NVIDIA RTX graphics cards. Soft area shadows and ambient occlusion provide the finishing touches to ground your scenes firmly in reality.
Our vast suite of virtual production features enables you to speed up your workflow on set with the ability to capture and record complex live performances and composite them in real-time. Entire teams can work in concert to orchestrate and direct scenes live using the new multi-user editing feature.
Every second spent waiting to see your latest creation come to life has a cost - a cost to you, a cost to your users, a cost to your vision - so with each release we strive to make it easier and faster to go from iteration to iteration in Unreal Engine, leaving you more time to tweak and polish the experience for consumers. Live Coding brings Live++ support to Unreal Engine so you can go from idea to reality in seconds while you are running your project. Build times have been optimized across the board, making iteration times for incremental builds up to 3x faster and freeing up valuable resources in your pipeline.
This release includes 174 improvements submitted by the incredible community of Unreal Engine developers on GitHub! Thanks to each of these contributors to Unreal Engine 4.22:
0xmono, Adam Moss (adamnv), Ahsan Muzaheed (muzaheed57), Alessio Sgarro (cmp-), Alexander Stevens (MilkyEngineer), AlexTimeFire, AlSafty, Andrzej K. Haczewski (ahaczewski), Anton Rassadin (Antonrr), Ben Peck (bpeck), BinaryRK, Branislav Grujic (grujicbr), Cameron Angus (kamrann), Cengiz Terzibas (yaakuro), Chris Conway (Koderz), Chris Gallegos (Chrispykins), Clinton Freeman (freemancw), Cristiano Carvalheiro (ccarvalheiro), Dan Ogles (dogles), Daniele Benegiamo (kafumanto), David Aylaian (davidaylaian), David Nadaski (GlassBeaver), David Sauve (notanumber), Deep Silver Dambuster Studios (DSDambuster), Dmitriy Donskoy (sentik), doodoori2, Dorgon Chang (dorgonman), Doug Moscrop (dougmoscrop), Doug Richardson (drichardson), Dzuelu, Erik Dubbelboer (erikdubbelboer), H1X4Dev, Hargreawe, hkozachkov2, Ilyin Aleksey (IlinAleksey), improbable-valentyn, Ivan Popelyshev (ivanpopelyshev), IvanKuzavkov, James Cahill (Prouser123), Jan Kaniewski (getnamo), Jin Hyung Ahn (zenoengine), jkinz3, Joe Best-Rotheray (cajoebestrotheray), joemmett, Josef Gluyas (Josef-CL), Kalle Hämäläinen (kallehamalainen), Kartik Saranathan (Kartiku), korkuveren, Kory Postma (korypostma), Leon Rosengarten (lion03), locus84, lotodore, Marat Radchenko (slonopotamus), Marcel (Zaratusa), Mark Whitty (Mosel3y), mastercoms, Mathias Hübscher (user37337), Michael Kösel (TheCodez), Michael Samiec (m-samiec), Mike Bell (MichaelBell), Mike Slegeir (tehpola), Mimus1, Mitsuhiro Koga (shiena), momboco, Morva Kristóf (KristofMorva), Muhammad A. Moniem (mamoniem), Nick (eezstreet), Nick Edwards (nedwardsnae), Nick Pruehs (npruehs), Ondrej Hrusovsky (skylonxe), Paul Hampson (TBBle), Philippe Chaintreuil (Philippe23), Phillip Baxter (PhilBax), projectgheist, RDIL, Riley Labrecque (rlabrecque), Roman K. (CrispMind), Robin Zoň (ZonRobin), roidanton, ruzickajason, ryugibo, Sam Hocevar (samhocevar), Satheesh (ryanjon2040), Scott Fries (ScottFries), Sébastien Rombauts (SRombauts), ShalokShalom, spoiltos, stanley1108, Stephen A. Imhoff (Clockwork-Muse), stkrwork, sturcotte06, Takashi Suzuki (wankotank), tgroll, Thang To (thangt), Tim Lincoln (Ratherbflyin), TommyTesla, Vladimir Ziablitskii (rainlabs), whoisfpc, YanaPIIDXer, Yannick Comte (demonixis), yhase7, Zeblote
Major Features
New: Real-Time Ray Tracing and Path Tracing (Beta)
With this release, we're excited to announce beta support for Ray Tracing and Path Tracing on the Windows 10 RS5 update, taking full advantage of DirectX 12 and DirectX Raytracing (DXR) with NVIDIA RTX series cards.
Real-Time Ray Tracer
The Ray Tracing features are composed of a series of ray tracing shaders and ray tracing effects. With each of these, we're able to achieve natural, realistic-looking lighting effects in real time that are comparable to modern offline renderers for shadowing, ambient occlusion, reflections, and more. We introduced a number of ray tracing features and will continue to expand the feature set in upcoming versions of Unreal Engine. Some of the features in this release include:
- Soft area shadowing for Directional, Point, Spot, and Rect light types.
- Accurate reflections for objects in and outside of the camera frustum.
- Soft ambient occlusion to ground objects in the scene.
- Physically correct refraction and reflection results for translucent surfaces.
- Indirect lighting from dynamic global illumination from light sources.
- And more!
Path Tracer
In addition to the Ray Tracer, we've included an unbiased Path Tracer with a full global illumination path for indirect lighting that creates ground-truth reference renders right inside the engine. This can improve your workflow by letting you compare content in your scene against a reference directly in Unreal, without needing to export to a third-party offline path tracer.
For additional details, see Path Tracer.
New: High-Level Rendering Refactor
In this release, we have completely rewritten mesh drawing in Unreal Engine to improve drawing performance and to support Real-Time Ray Tracing. In the future, we'll continue to move additional rendering work to the GPU.
Mesh Drawing Pipeline Refactor
With the new mesh drawing pipeline, drawing information for static scene elements is cached more aggressively than before, and automatic instancing merges draw calls where possible. This enables new mesh pass implementations to be written in four to six times fewer lines of code!
This refactor mostly affects mesh drawing inside the Renderer; custom scene proxies and the interface to the Renderer are only marginally affected. Because backwards compatibility was not possible with a change this large, any custom Drawing Policies will need to be rewritten as FMeshPassProcessors in the new architecture (see the sketch below).
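To make the migration concrete, the following is a minimal, hypothetical sketch of an FMeshPassProcessor subclass; the class name FMyCustomPassProcessor is invented for illustration, and the constructor arguments and AddMeshBatch signature shown here are assumptions that should be checked against MeshPassProcessor.h in your engine version.

// Hypothetical sketch of porting a custom drawing policy to the new architecture
// (lives inside the Renderer module; signatures below are assumptions).
class FMyCustomPassProcessor : public FMeshPassProcessor
{
public:
	FMyCustomPassProcessor(const FScene* Scene, const FSceneView* InViewIfDynamicMeshCommand, FMeshPassDrawListContext* InDrawListContext)
		: FMeshPassProcessor(Scene, Scene->GetFeatureLevel(), InViewIfDynamicMeshCommand, InDrawListContext)
	{
	}

	// Called once per mesh batch the pass may draw. Instead of setting render state per draw
	// (as the old drawing policies did), filter the batch, select shaders, and emit cached
	// FMeshDrawCommands through the draw list context.
	virtual void AddMeshBatch(const FMeshBatch& RESTRICT MeshBatch, uint64 BatchElementMask, const FPrimitiveSceneProxy* RESTRICT PrimitiveSceneProxy, int32 StaticMeshId = -1) override
	{
		// 1. Reject batches this pass does not care about (material domain, blend mode, and so on).
		// 2. Resolve the material and vertex factory, and select the pass's shaders.
		// 3. Build and submit draw commands for the batch elements enabled in BatchElementMask.
	}
};

Because the draw commands produced this way can be cached and merged, a pass written in this style is also what enables the automatic instancing described above.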
New: C++ Iteration Time Improvements
Live Coding (Experimental)
We've licensed Molecular Matters' Live++ for all developers to use on their Unreal Engine projects, and we've integrated it as the new Live Coding feature. With Live Coding, you can make C++ code changes in your development environment, then compile and patch them into a running editor or standalone game within seconds. Unlike the legacy hot reload mechanism, Live Coding patches individual functions without requiring any special consideration for object re-instancing, making it much more reliable and scalable for large projects. To use it, check the Live Coding (Experimental) option from the drop-down next to the Compile button in the editor, and press Ctrl+Alt+F11 to compile and apply your changes. To enable it from a cooked game, type "LiveCoding" into a console window.
Notes:
- Modifying class layouts while the engine is running is not supported. We intend to address this in a future release.
- Only Windows is currently supported.
Build Times
We optimized UnrealBuildTool and UnrealHeaderTool to make C++ iteration times up to 3x faster!
Full build (UE4Editor Win64 Development):
| | Unreal Engine 4.21 | Unreal Engine 4.22 | Improvement |
|---|---|---|---|
| Total Build Time | 436.90 | 326.81 | 30% faster |
| Compiling UnrealHeaderTool | 46.12 | 46.30 | |
| Generating headers | 25.05 | 15.50 | 60% faster |
| Compiling UE4Editor | 323.15 | 257.97 | 25% faster |
| UnrealBuildTool overhead | 42.58 | 7.04 | 600% faster |
Incremental build (UE4Editor Win64 Development):
| | Unreal Engine 4.21 | Unreal Engine 4.22 | Improvement |
|---|---|---|---|
| Total Build Time | 7.47 | 2.14 | 340% faster |
| Compiling UE4Editor | 1.19 | 1.08 | |
| UnrealBuildTool overhead | 6.28 | 1.06 | 590% faster |
No code changes (UE4Editor Win64 Development):
| | Unreal Engine 4.21 | Unreal Engine 4.22 | Improvement |
|---|---|---|---|
| UnrealBuildTool overhead | 5.38 | 1.03 | 520% faster |
Additionally, we significantly improved the accuracy of dependency checking for incremental builds, including detecting files being added and removed from a project and using compiler-driven include dependency checking.
New: Virtual Production Pipeline Improvements
Unreal Engine continues to lead the way for real-time virtual production with significant improvements to all aspects of the pipeline.
Real-Time Compositing with Composure (Beta)
The Composure compositing system has been significantly improved to make it easier than ever to composite images, video feeds, and CG elements directly within the Unreal Engine.
With built-in compositing, you can easily visualize the results of combining green-screened scenes with your Unreal Engine Level content in real time. This can be helpful for pre-visualization, giving directors on set a good sense of what the final scene will look like after the captured film has been enhanced by digital content from Unreal Engine and the green backgrounds replaced. These in-engine results can also be a valuable reference for compositors working offline in other third-party software. Improvements include:
- Use Materials to drive layer blending.
- Leverage the post-processing pipeline for effects like "Light Wrap".
- Use the Media Framework and Professional Video I/O systems to capture video input and render output.
- Use built-in render passes for chroma keying, despill, tonemapping, and more, or build your own custom passes.
OpenColorIO (OCIO) Color Profiles (Experimental)
You can now use OpenColorIO (OCIO) color profiles to transform the color space of any Texture or Composure Element directly within the Unreal Engine. This can help you keep the colors of your video and computer generated elements consistent from initial capture, through compositing, to final output.
Hardware-Accelerated Video Decoding (Experimental)
On Windows platforms, you can now use your GPU to speed up the processing of H.264 video streams. This reduces the strain on the CPU when playing back video streams, which may result in smoother video and may allow you to use higher resolution movie files and more simultaneous input feeds.
To use hardware decoding, enable the Hardware Accelerated Video Decoding (Experimental) setting in the Plugins - WMF Media section of the Project Settings window.
New Media I/O Formats
We added even more Professional Video I/O input formats and devices:
- 4K UHD inputs for both AJA and Blackmagic.
- Supports both 8bit and 10bit inputs.
- Supports single-link, dual-link, and quad-link.
- Supports AJA Kona 5 devices.
- HDMI 2.0 input.
- UHD at high frame rates (up to 60fps).
nDisplay Improvements
This release adds several new features that make the nDisplay multi-display rendering system more flexible, handling new kinds of hardware configurations and inputs.
- Each cluster node's application window can now contain multiple viewports at defined screen-space coordinates. This allows a single Unreal Engine instance, running on a single computer, to handle multiple offset displays.
- In previous releases, the only way to provide input to the nodes in an nDisplay system was through VRPN. This release adds a new communication mechanism called cluster events, which you can use to trigger synchronized responses on all connected computers.
- More of the possibilities of the input subsystem have been exposed in the nDisplay configuration file, allowing you to change attributes and mappings without repackaging your Project.
- If you're already using nDisplay, your configuration file may need some adjustments to match the new schema. However, the nDisplay Launcher can now upgrade your configuration file automatically.
New: HoloLens Remote Streaming Support (Beta)
Unreal Engine 4 now supports Holographic Remoting through the Windows Mixed Reality plugin! This allows Unreal applications to run on a Windows desktop PC and stream the rendered result wirelessly to HoloLens over a Wi-Fi connection in real time.
New: Audio System Improvements
TimeSynth (Beta)
TimeSynth is a new audio component focused on providing sound designers with sample-accurate starting, stopping, and concatenation of audio clips. TimeSynth enables precise and synchronous audio event queuing, which is essential for interactive music applications.
Layered Sound Concurrency
The Sound Concurrency System now allows multiple concurrency settings or groups to be observed. If a sound object (AudioComponent, SoundBase, or SynthComponent) does not satisfy all of the requirements specified in its ConcurrencySet property, the new sound will not play. In addition, if the new sound satisfies all concurrency set resolution rules and begins playback, one or more existing sounds may be stopped.
Spectral Analyzer for Submixes (New Audio Engine)
Designers can now analyze the spectral energy of a Submix during gameplay and use the frequency content of the currently playing audio to drive modulations in gameplay, Materials, or any number of other destinations.
Baked Spectral Analysis Curves and Envelopes on Sound Waves
Sound waves can now be pre-analyzed for envelope and spectral energy to drive Blueprints during playback. This allows sound designers to create compelling audio-driven systems, while offloading the spectral analysis work to improve runtime performance. In addition, analysis data from a proxy sound wave can be substituted for a sound wave's analysis data, allowing designers to spoof isolated sound events when trying to drive gameplay.
Improvements to Sound Asset Importing
We have made significant improvements to sound asset importing. In addition to our current support for multi-channel WAV files, the Unreal Audio Engine now supports importing a wide variety of sound file formats, including AIFF, FLAC, and Ogg Vorbis.
Improvements to MIDI Device Plugin
The MIDI Device Plugin now allows users to send MIDI messages to external MIDI devices in addition to processing MIDI Device input. In addition, improvements to the Plugin's Blueprint API simplify parsing MIDI messages. These changes allow sound designers to more effectively integrate MIDI I/O into their project.
New: Sequencer Improvements
The industry-leading Sequencer linear animation toolset once again received significant updates with new tools and enhancements that benefit virtual production projects, along with a host of quality-of-life improvements.
Sequencer Take Recorder
Take Recorder enables fast iteration when recording performances and quickly reviewing previous takes for virtual production workflows! Building upon the foundations of Sequence Recorder, we improved what you can record and how the data is handled, and made the system extensible to fit the needs of different use cases. You can easily record animations from motion capture linked to characters in the scene, as well as the actual Live Link data for future playback. By recording Actors into subsequences and organizing them by Take metadata, productions of all sizes and any number of takes can easily be accommodated.
Composure Sequencer Track
The new Composure track enables you to easily export a Sequence as layers defined in Composure. Multiple tracks can be added to export more than one layer at a time. You can drag layers straight from Composure into Sequencer to generate tracks for those layers.
Layered Animation Workflows in Sequencer
You can now create layered animations using multiple weighted sections in a single track. Layered animations are supported in Transform tracks as well as several other property tracks.
Live Link Sequencer Track
You can now record incoming Live Link data onto Sequencer tracks and play it back. Live Link data that comes with timecode and multiple samples per engine tick can be saved at a resolution higher than the Sequencer playback rate.
Object Binding Sequencer Track
You can now change the Object (Static Mesh, Skeletal Mesh, Material, and so on) assigned to a property on an Actor in Sequencer. Once the Actor is added to Sequencer, these Object reference properties are available like other properties in the Track menu for that Actor.
Network Replication in Sequencer
Level Sequence Actors that are set to replicate using the Replicate Playback property now synchronize their playback time between server and clients.
Enhanced Python Support for Sequencer
Python support for Sequencer has been improved for movie scene functionality:
- Movie scene section and key data manipulation
- Movie scene capture
- You can now assign multiple objects to a single track.
- You can now copy/paste sections from multiple tracks at once.
- You can now mark frames with a label and quickly jump to that mark to play the sequence. A color can be set for each mark to make them easily identifiable.
- Tracks and Sections are now exposed to Blueprints, enabling uses such as runtime toggling of subsections and takes.
New: Animation Budgeting System (Beta)
The new Anim Budgeter tool enables you to set a fixed budget per platform (milliseconds of work to perform on the game thread), and it works out whether it can do all of the requested animation work or needs to cut back. It works by measuring the total cost of animation updates and calculating the cost of a single work unit. If work needs to be cut to meet the budget, it does so based on significance, targeting several areas: stop ticking and use the Master Pose Component, update at a lower rate, interpolate (or not) between updates, and so on. The goal is to dynamically adjust load to fit within a fixed game-thread budget.
New: Animation Sharing Plugin
Added a new Animation Sharing plugin that reduces the overall amount of animation work required for a crowd of actors. It is based upon the Master-Pose Component system, while adding blending and additive Animation States. The Animation States act as buckets for which animation instances are evaluated, and the resulting poses are then transferred to all child components that are part of the bucket.
New: Support for Long Filenames (Experimental)
We added support for long file paths for users with the Windows 10 Anniversary Update! Historically, paths in Windows have been limited to 260 characters, which can cause problems for projects with complex naming conventions and deep hierarchies of assets. The Windows 10 Anniversary Update adds support for much longer filenames, on the condition that the user and each application opt in to it. To enable long file paths in Windows 10:
- Ensure you're running Windows 10 version 1607 or later.
- Enable support for long paths via the group policy setting or the registry (see https://docs.microsoft.com/en-us/windows/desktop/FileIO/naming-a-file#maxpath).
- In the Unreal Editor, check the "Enable Long Paths Support" option in the Experimental settings dialog.
New: Blueprint Indexing Optimizations
Changes to how we index Blueprint search data have significantly improved Editor and Play-In-Editor startup times. We now defer search data updates until a Find-in-Blueprint tab is opened, perform updates asynchronously, and separate Blueprint re-indexing from the Asset loading process.
New: Improved Steamworks Support
Using UE4 with Steam has never been easier! We've made several improvements to usability and quality of life for developers of multiplayer games on Steam.
- Dedicated Servers on Steam can now receive custom names (up to 63 characters) with the new "-SteamServerName" launch argument.
- Projects can now override the Steam network layer by deactivating the "bUseSteamNetworking" configuration value and setting their NetDriver configurations to the preferred underlying network layer.
- We have greatly improved the usability of Steam NetDrivers with UE4 Beacons in addition to standard game networking.
- You can now set certain required Steam values, such as dedicated server names, or the application ID, in your project's Target.cs file. Making changes to these values will no longer require recompiling the engine.
New: Preview Scene Settings Improvements
We added functionality to Preview Scene Settings that enables you to hide the Environment Cubemap (Show Environment) without disabling lighting. See the Use Sky Lighting property in the Preview Scene Settings panel.
New: Skeletal Mesh LOD Reduction
Use the Skeletal Mesh Reduction Tool to generate versions of a Skeletal Mesh with reduced complexity for use as levels of detail (LODs), all within Unreal Editor! You no longer need to rely on external Digital Content Creation (DCC) programs or third-party tools that can be very time consuming and error prone. Create accurate, high-quality levels of detail and see the results immediately in the viewport. For additional information, see Skeletal Mesh Reduction Tool.
New: Per Platform Properties Improvements
Per Platform Properties have been extended to allow setting values based on the Target Platform in addition to Platform Groups.
New: Gauntlet Automation Framework Improvements
The Gauntlet automation framework received several improvements focusing on usability, documentation, and examples to learn from.
Expanded Documentation and Samples
- Documentation about Gauntlet architecture and getting started
- Additional ActionRPG and Profile Guided Optimization examples
- Example of tracking editor load and PIE times
iOS Support
Gauntlet now supports installing and running IPA files on iOS (requires a Mac host). This brings our device support to PC, Mac, PS4, XB1, Switch, Android, and iOS.
Profile Guided Optimization
Added an example script for automating Profile Guided Optimization (PGO) file generation on PS4, Xbox One, and Switch for your project.
Report Creation
Added HTML and Markdown builders for creating custom reports as part of automation scripts.
New: Visual Studio 2019 Support
Support for Visual Studio 2019 has been added. To use Visual Studio 2019 by default, select "Visual Studio 2019" as the IDE in the editor's Source Code settings. We've also added support for switching to newer C++ standard versions. To change the version of the C++ standard that your project supports, set the CppStandard property to one of the following values in your .target.cs file.
| Version | Value |
|---|---|
| C++14 | CppStandardVersion.Cpp14 |
| C++17 | CppStandardVersion.Cpp17 |
| Latest | CppStandardVersion.Latest |
At the same time, we've deprecated support for Visual Studio 2015. If you want to force your project to compile with the Visual Studio 2015 compiler, you can set WindowsPlatform.Compiler = WindowsCompiler.VisualStudio2015 in your project's .target.cs file. Note that the version of the engine downloaded from the Epic Games Launcher does not support Visual Studio 2015, and we no longer test it internally.
New: Subsystems
Subsystems are automatically instanced classes with managed lifetimes that provide easy-to-use extension points without the complexity of modifying or overriding engine classes, while getting Blueprint and Python exposure out of the box.
Currently Supported Subsystem Lifetimes
Engine
class UMyEngineSubsystem : public UEngineSubsystem { ... };
When the Engine Subsystem's module loads, the subsystem will Initialize() after the module's Startup() function has returned. The subsystem will Deinitialize() after the module's Shutdown() function has returned.
These subsystems are accessed through GEngine:
UMyEngineSubsystem* MySubsystem = GEngine->GetEngineSubsystem<UMyEngineSubsystem>();
Editor
class UMyEditorSubsystem : public UEditorSubsystem { ... };
When the Editor Subsystem's module loads, the subsystem will Initialize() after the module's Startup() function has returned. The subsystem will Deinitialize() after the module's Shutdown() function has returned.
These subsystems are accessed through GEditor:
UMyEditorSubsystem* MySubsystem = GEditor->GetEditorSubsystem<UMyEditorSubsystem>();
Note: These Editor-only subsystems are not accessible to regular Blueprints; they are only accessible to Editor Utility Widgets and Blutility classes.
GameInstance
class UMyGameSubsystem : public UGameInstanceSubsystem { ... };
This can be accessed through UGameInstance:
UGameInstance* GameInstance = ...;
UMyGameSubsystem* MySubsystem = GameInstance->GetSubsystem<UMyGameSubsystem>();
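As an end-to-end illustration of the pattern, here is a minimal, hypothetical GameInstance subsystem; the class UMyScoreSubsystem, its AddScore function, and the include paths are example assumptions rather than engine-provided names.

// MyScoreSubsystem.h - hypothetical example of a GameInstance-lifetime subsystem.
#pragma once

#include "CoreMinimal.h"
#include "Subsystems/GameInstanceSubsystem.h"
#include "MyScoreSubsystem.generated.h"

UCLASS()
class UMyScoreSubsystem : public UGameInstanceSubsystem
{
	GENERATED_BODY()

public:
	// Called when the owning UGameInstance is created.
	virtual void Initialize(FSubsystemCollectionBase& Collection) override
	{
		Super::Initialize(Collection);
		Score = 0;
	}

	// Called when the owning UGameInstance shuts down.
	virtual void Deinitialize() override
	{
		Super::Deinitialize();
	}

	// Standard UFUNCTION() markup controls what the subsystem exposes to Blueprints.
	UFUNCTION(BlueprintCallable, Category = "Score")
	void AddScore(int32 Delta) { Score += Delta; }

private:
	int32 Score = 0;
};

Gameplay code can then retrieve it with GameInstance->GetSubsystem<UMyScoreSubsystem>(), and AddScore appears in Blueprints automatically through the context-aware subsystem nodes described below.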
LocalPlayer
class UMyPlayerSubsystem : public ULocalPlayerSubsystem { ... };
This can be accessed through ULocalPlayer:
ULocalPlayer* LocalPlayer = ...;
UMyPlayerSubsystem* MySubsystem = LocalPlayer->GetSubsystem<UMyPlayerSubsystem>();
Accessing Subsystems from Blueprints
Subsystems are automatically exposed to Blueprints, with smart nodes that understand context and don't require casting.
You're in control of what API is available to Blueprints with the standard UFUNCTION() markup and rules.
Accessing Subsystems from Python
If you are using Python to script the editor, you can use built-in accessors to get at subsystems:
my_engine_subsystem = unreal.get_engine_subsystem(unreal.MyEngineSubsystem)
my_editor_subsystem = unreal.get_editor_subsystem(unreal.MyEditorSubsystem)
Note: Python is currently an experimental feature.
New: Editor Utility Widgets
Editor Utility Widgets enable you to extend the functionality of Unreal Editor with new user interfaces created entirely using the UMG UI Editor and Blueprint Visual Scripting logic! These are Editor-only UI panels that can be selected from the Windows menu like other Unreal Editor panels. To create an Editor Utility Widget, right-click in the Content Browser and select Editor Utilities > Editor Widget. To edit the Blueprint, double-click the Editor Widget Asset. Once you've edited the Blueprint for your Editor Widget Asset, right-click the Editor Widget and select Run Editor Utility Widget to open the UI in a tab. The tab is only dockable with Level Editor tabs. It appears in the Level Editor's Windows dropdown, under the Editor Utility Widgets category. This is an experimental feature.
New: Material Analyzer
The Material Analyzer enables you to get a holistic view of parameter usage in Materials and Material Instances so you can quickly find opportunities to consolidate and optimize your Material Instances, minimizing render state switching and saving memory. The Material Analyzer can be found under Window > Developer Tools. Materials are listed in a tree along with suggestions that show groups of Material Instances sharing the same set of static overrides, so you can make optimizations. You can also place all the related instances into a local collection, so you can easily find and update them.
New: Select Child and Descendant Actors
You can now extend your selection to all the immediate children or all the descendants of your selected Actor using the context menu in the World Outliner and the Level Viewport, making it easier to work with large, complex scene hierarchies.
New: Scaled Camera Zoom and Pan
When you have one or more objects selected in the Level Viewport, the sensitivity of camera zoom and pan operations now scales automatically with the distance between the objects and the camera. This makes your camera movements feel more natural, especially when you're working with objects at extreme sizes, such as tiny mechanical parts or large landscapes. You can return to the previous behavior by disabling the new Use distance-scaled camera speed setting in the Level Editor > Viewports section of the Editor Preferences window.
New: Orbit Around Selection
You can now make the camera orbit around the pivot of the selected objects - as opposed to orbiting around the center of the screen - when one or more objects are selected in the Level Viewport. To activate this mode, enable the new Orbit camera around selection setting in the Level Editor > Viewports section of the Editor Preferences window.
New: Toggle Multiple Layer Visibility
You can now toggle the visibility of multiple Layers at the same time. Hold CTRL and click each Layer to build your selection. Then click the eye icon next to any of those selected Layers to toggle visibility of all selected Layers.
New: Multi-User Editing (Beta)
Multiple level designers and artists can now connect multiple instances of Unreal Editor together to work collaboratively in a shared editing session, building the same virtual world together in real time.
- A dedicated server keeps track of all the modifications made by all users, and synchronizes the state of the Editor between all connected computers.
- When you make changes in your Levels and Sequences on one computer, the changes are automatically mirrored, live, to all other computers that are part of the same session.
- When you make changes to other types of Assets, like Materials, the changes are replicated to all other computers when you save those Assets.
- Before leaving an editing session, each user can choose whether they want to apply the changes made during that session to their local copy of the Project.
New: Preview Rendering Level Improvements
The Mobile Previewer workflow has been improved when working with different devices and platform shading models so that the same shading model is used consistently across all Editor viewports, and so you can instantly switch between the default Shader Model 5 (SM5) and a selected Preview Rendering Level. Start by selecting a Preview Rendering Level from the Settings drop-down on the main toolbar to compile shaders for a platform. Once compiled, use the new Preview Mode button in the main toolbar to toggle the view mode. For additional details, see Mobile Previewer.
New: Dynamic Spotlight Support on Mobile
We now support non-shadow-casting dynamic Spot Lights on high-end mobile devices. You can enable them under Project Settings > Rendering > Mobile Shader Permutations by setting Support Movable Spotlights to true.
For additional information, see Lighting for Mobile Platforms.
New: SaveGame System iCloud Support
UE4 now supports saving games to iCloud using the ISaveGameSystem interface, on both iOS and tvOS. You can enable saving games to iCloud by going to Project Settings > Platforms > iOS > Online and enabling the Enable Cloud Kit Support option. Then, from the iCloud save files sync strategy option, you can select the sync strategy that works best for your project. The currently available iCloud sync options are as follows:
- Never (Do not use iCloud for Load/Save Game)
- At game start only (iOS)
- Always (Whenever LoadGame is called)
New: Device Output Window Improvements
Major improvements have been made to the Device Output Log window, bringing it out of the Experimental state. You can use the Device Output Log window to send console commands to iOS devices from the PC. To access the Device Output Log, from the main menu click Window > Developer Tools > Device Output Log.
New: HTML5 Platform Improvements (Experimental)
We have added experimental multithreaded support for HTML5 projects. Please note you need access to the Unreal Engine 4 source code to enable this functionality. Some browsers will need special flags enabled in order to run in multithreaded mode. See https://github.com/emscripten-core/emscripten/wiki/Pthreads-with-WebAssembly for more information.
- For Chrome, run Chrome with the following flags: --js-flags=--experimental-wasm-threads --enable-features=WebAssembly,SharedArrayBuffer. These can alternatively be enabled or disabled in chrome://flags/#enable-webassembly-threads as "WebAssembly threads support".
- In Firefox Nightly, SharedArrayBuffer can be enabled in about:config by setting the javascript.options.shared_memory preference to true.
New: iOS Preferred Orientation
You can now set the preferred orientation to be used as the initial orientation at launch for iOS devices when both Landscape Left and Landscape Right orientations are supported.
New: Niagara Vector Field Data Interface
The Vector Field Data Interface now works the same for both CPU and GPU particles! You can use the Sample Vector Field module to sample vector fields. It exposes the following inputs:
- VectorField: This is the Vector Field Data Interface instance, containing the static vector field object itself, and per-axis tiling flags.
- SamplePoint: This is the point where the vector field is sampled. This defaults to Particles.Position, but this can be customized.
- Intensity: This scales the sampled vector.
- ApplyFalloff: Check this to apply a falloff function to the sampled vector, so the influence of the vector field approaches zero towards the edges of the vector field's bounding box.
- UseExponentialFalloff: Check this to make the falloff function be exponential instead of linear.
- FalloffDistance: When applying a falloff function, this parameter determines how far from the bounding box edges the falloff applies.
- FieldCoordinates: This makes it possible to override the Emitter's Localspace parameter. It has three options:
- Simulation: Uses the Emitter.Localspace parameter.
- World: This overrides the position and transform of the vector field so that it is always relative to the world origin, regardless of the Emitter.Localspace parameter.
- Local: This overrides the position and transform of the vector field so that it is always relative to the System itself, regardless of the Emitter.Localspace parameter.
- FieldTranslate: This offsets the vector field relative to the origin as defined by FieldCoordinates.
- FieldRotate: This reorients the vector field relative to the origin as defined by FieldCoordinates.
- FieldScale: This rescales the vector field.
Note: The input expected here will be relative to the volume of the vector field itself, as no transformations are applied for you.
An example for easily visualizing and using a vector field is included, called VectorFieldVisualizationSystem.
New: Niagara Curl Noise Data Interface
The Curl Noise Data Interface now generates procedural curl noise based on an underlying simplex noise function, and the results are identical for both CPU and GPU emitters. We recommend using the SampleCurlNoiseField module to generate curl noise for your particles. This module exposes the following inputs:
- Strength: This scales the output vector generated by the module.
- Length Scale: This describes the approximate size of the vortices generated by the curl noise.
- Offset: This is used to pan the noise field.
- Noise Field: This is the Data Interface object itself, primarily used for adjusting seeds.
- Sample Point: This specifies where to sample from. Defaults to Particles.Position, but other values can also be used.
New: Deterministic Random Number Generation in Niagara
We added support for deterministic random number generation for both CPU and GPU Niagara emitters. The behavior of the random numbers generated can be controlled globally from the Emitter Properties module, with the following options:
- Determinism: A flag to toggle between deterministic or non-deterministic random numbers for the entire emitter.
- Random Seed: A global seed used by the deterministic random number generator.
- Min: This defines the lower bound of the random numbers generated. It can be any integer or float type.
- Max: This defines the upper bound of the random numbers generated. It can be any integer or float type.
- RandomnessMode: This is an enum controlling the determinism mode of the random number generator, and it can be:
- Simulation Defaults: This is the default behavior; it inherits the value of Emitter.Determinism.
- Deterministic: Uses the deterministic random number generator.
- Non-deterministic: Uses the non-deterministic random number generator.
- OverrideSeed: This determines whether or not to override the seed specified by Emitter.GlobalSeed.
- Seed: This value is used to override Emitter.GlobalSeed if OverrideSeed is enabled.
New: Additional Inputs for Niagara Math Operations
Many of the script math operations now support an arbitrary number of input pins, which can be added by clicking the Add (+) button or by connecting to the pin next to the Add button.
New: Support for Deprecating Niagara Scripts
Scripts for modules, dynamic inputs, and functions can now be marked as deprecated in the script editor. Emitters and systems using deprecated scripts will now display errors in the UI, and deprecated scripts will not show up in the menus used to add them.
New: Niagara Initialization Modules
New modules have been added which expose the most common attributes used when initializing particles.
New: Select by Simulation Target Node for Niagara
The new Select by Simulation Target node enables you to execute different logic depending on whether an emitter is running in the CPU vector machine or in a GPU compute script. In general, most scripts should run identically on both simulation targets. However, this is not always possible, especially when making data interface calls. In cases where exact parity isn't available, this new node gives the module author more tools to build consistent behavior. For an example of how this is used, see the new collision response module.
New: Collision System for Niagara
Niagara collisions have been completely rewritten to support ray-trace-based CPU collisions, CPU+GPU analytical plane collisions, GPU scene depth, and distance field collisions. Additional features include:
- Stability has been vastly improved across the board, in comparison to previous Niagara and Cascade implementations.
- CPU collisions support the incorporation of the scene's physical material characteristics, such as restitution and friction coefficients, and offers several integration schemes.
- The system has been written as a single module to improve usability.
- Collisions now work in combination with all renderers.
- A configurable "rest" state allows particles to remain stable in particularly challenging situations.
- The equations are physically based/inspired, and work with mass and other system properties.
- A number of advanced options have been exposed, including static, sliding and rolling friction.
- Collision radii are automatically calculated for sprites and meshes. Optionally, you can specify this parameter directly.
New: Platform SDK Upgrades
In every release, we update the Engine to support the latest SDK releases from platform partners.
- IDE Version the Build farm compiles against
  - Visual Studio - Visual Studio 2017 v15.6.3 toolchain (14.13.26128) and Windows 10 SDK (10.0.16299.0)
    - Minimum Supported version: Visual Studio 2017 v15.6
    - Requires .NET 4.6.2 Targeting Pack
  - Xcode - Xcode 10.1
- Android
  - Android NDK r14b (New CodeWorks for Android 1r7u1 installer will replace previous CodeWorks on Windows and Mac; Linux will use 1r6u1 plus modifications)
  - Note: Permission requests are now required on a per-feature basis (for example, RECORD_AUDIO, CAMERA). For more information, see Android Upgrade Notes.
- HTML5
  - Emscripten 1.37.19
- Linux "SDK" (cross-toolchain)
  - Internal: v13_clang-7.0.1-centos7
    - sync from //depot/CarefullyRedist/HostWin64/Linux_x64/v13_clang-7.0.1-centos7
  - Public: v12_clang-6.0.1-centos7
    - downloadable from http://cdn.unrealengine.com/CrossToolchain_Linux/v13_clang-7.0.1-centos7.exe (now with the installer!)
- Lumin
  - 0.19.0
- Steam
  - 1.39
- SteamVR
  - 1.0.16
- Oculus Runtime
  - 1.32
- Switch
  - SDK 7.3.0 + optional NEX 4.4.2 (Firmware 7.x.x-x.x)
  - SDK 6.4.0 + optional NEX 4.6.2 (Firmware 6.x.x-x.x)
  - Supported IDE: Visual Studio 2017, Visual Studio 2015
- PS4
  - 6.008.061
  - Firmware Version 6.008.021
  - Supported IDE: Visual Studio 2017, Visual Studio 2015
- XboxOne
  - XDK: July 2018 QFE-4
  - Firmware Version: December 2018 (version 10.0.17763.3066)
  - Supported IDE: Visual Studio 2017
- macOS
  - SDK 10.14
- iOS
  - SDK 12
- tvOS
  - SDK 12