Unreal Engine 4.14 Released!
November 15, 2016

By Alexander Paschall

This release includes hundreds of updates from Epic and 71 improvements submitted by the incredible community of Unreal Engine developers on GitHub! Thanks to each of these contributors to Unreal Engine 4.14:

Adam Moss (adamnv), Alan Edwardesa (alanedwardes), Andreas Axelsson (judgeaxl), Andreas Schultes (andreasschultes), Andrew Armbruster (aarmbruster), Artem V. Navrotskiy (bozaro), Audiokinetic Inc. (audiokinetic), BaxterJF, CA-ADuran, Cameron Angus (kamrann), Cengiz Terzibas (yaakuro), Christian Hutsch (UnrealEverything), CodeSpartan, Cuo Xia (shrimpy56), Damir Halilovic (DamirHalilovic), dcyoung, Deniz Piri (DenizPiri), Dennis Wissel (dsine-de), Dominic Guana (jobs-git), Dorgon Chang (dorgonman), dsine-de, Filip Brcic (brcha), Hakki Ozturk (ozturkhakki), Hannah Gamiel (hgamiel), Hao Wang (haowang1013), Jarl Gullberg (Nihlus), Jason (Abatron), Jeff Rous (JeffRous), Jeremy Yeung (jeremyyeung), Jørgen P. Tjernø (jorgenpt), Josh Kay (joshkay), jpl-mac, KashiKyrios, Kory Postma (korypostma), Kyle Langley (Vawx), Laurie (Laurie-Hedge), Lei Lei (adcentury), Leszek Godlewski (inequation), Marat Radchenko (slonopotamus), Matthew Davey (reapazor), Matthias Huerbe (MatzeOGH), Matthijs Lavrijsen (Mattiwatti), mbGIT, Michael Geary (geary), Michail Nikolaev (michail-nikolaev), Moritz Wundke (moritz-wundke), Narendra Umate (ardneran), Nelson Rodrigues (NelsonBilber), null7238, Paul Evans (paulevans), PjotrSvetachov, projectgheist, Rama (EverNewJoy), rcywongaa, rekko, Ryan C. Gordon (rcgordon), sangpan, Sébastien Rombauts (SRombauts), Shihai (geediiiiky), stfx, straymist, Theodoros Ntakouris (Zarkopafilis), tmiv, ungalyant, Webster Sheets (Web-eWorks), x414e54, yehaike, YossiMHWF, Yukariin, Zachary Burke (error454), Zhiguang Wang (zhiguangwang)

What’s New

Unreal Engine 4.14 introduces a new forward shading renderer optimized for VR, enabling crisp multi-sampled anti-aliasing in your games. The new Contact Shadows feature renders beautifully detailed shadows for intricate objects. We've also introduced a new automatic LOD generation feature for static meshes that does not require a third-party library.

We’ve streamlined the animation tools to help you be more productive, and added many new features to Sequencer (UE4’s non-linear cinematic tool), as well as improvements to vehicles, clothing and animation Blueprints.

For mobile developers, Vulkan support is ready to use on compatible Android devices! And, we've added various new mobile rendering features such as reading from scene color and depth, and the ability to draw 3D objects on top of your UI.

On the Windows platform, C++ programmers can now use Visual Studio "15" for development. Visual Studio 2015 is still supported.

Major Features

New: Forward Shading Renderer with MSAA

The new forward shading renderer combines high-quality UE4 lighting features with Multisample Anti-Aliasing (MSAA) support! MSAA and the option to enable per-material optimizations make the forward renderer well suited for VR.


The forward renderer works by culling lights and reflection captures to a frustum-space grid. Each pixel in the forward pass then iterates over the lights and reflection captures affecting it, shading the material with them. Dynamic shadows for stationary lights are computed beforehand and packed into channels of a screen-space shadow mask allowing multiple shadowing features to be used efficiently. Enable ‘Forward Shading’ in the Rendering Project settings and restart the editor to use the forward renderer.
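
The culling step can be sketched outside the engine. The toy below (Python, with illustrative names, and a 2D screen grid standing in for the frustum-space grid) bins lights into cells so each pixel only iterates the lights registered in its cell:

```python
# Toy sketch of forward light culling: bin lights into a screen-space grid,
# then shade each pixel using only its cell's light list. Not engine code;
# all names, sizes, and the 2D simplification are illustrative.

GRID = 4      # 4x4 cells over a 16x16 "screen"
SCREEN = 16

def cell_of(x, y):
    return (x * GRID // SCREEN, y * GRID // SCREEN)

def build_light_grid(lights):
    """lights: list of (cx, cy, radius, intensity) in screen space."""
    grid = {(i, j): [] for i in range(GRID) for j in range(GRID)}
    cell_size = SCREEN / GRID
    for li, (cx, cy, r, _) in enumerate(lights):
        # conservatively register the light with every cell its bounds overlap
        x0 = max(0, int((cx - r) // cell_size)); x1 = min(GRID - 1, int((cx + r) // cell_size))
        y0 = max(0, int((cy - r) // cell_size)); y1 = min(GRID - 1, int((cy + r) // cell_size))
        for i in range(x0, x1 + 1):
            for j in range(y0, y1 + 1):
                grid[(i, j)].append(li)
    return grid

def shade_pixel(x, y, lights, grid):
    # each pixel iterates only the lights binned to its cell
    total = 0.0
    for li in grid[cell_of(x, y)]:
        cx, cy, r, intensity = lights[li]
        if (x - cx) ** 2 + (y - cy) ** 2 <= r * r:
            total += intensity
    return total
```

The real renderer does this per-frustum-cell on the GPU and also bins reflection captures, but the shape of the loop is the same.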

Forward Shading

Supported forward rendering features include:

  • Full support for stationary lights, including dynamic shadows from movable objects which blend together with precomputed environment shadows
  • Multiple reflection captures blended together with parallax correction
  • Planar reflections of a partial scene, composited into reflection captures
  • D-Buffer decals
  • Precomputed lighting and skylights
  • Unshadowed movable lights
  • Capsule shadows
  • Instanced stereo compatible

Some features are not yet supported with Forward Shading:

  • Screen space techniques (SSR, SSAO, Contact Shadows)
  • Shadow casting Movable Lights
  • Dynamically shadowed translucency
  • Translucency receiving environment shadows from a stationary light
  • Light functions and IES profiles
  • Alpha to Coverage
  • MSAA on D-Buffer decals, motion blur, dynamic shadows and capsule shadows


The forward renderer supports both multisample anti-aliasing (MSAA) and temporal anti-aliasing (TAA). In most cases TAA is preferable because it removes both geometric aliasing and specular aliasing. In VR, however, the constant sub-pixel movement caused by head tracking introduces unwanted blurriness with TAA, making MSAA a better choice.

Projects that choose to use MSAA will want to build content to mitigate specular aliasing. The ‘Normal to Roughness’ feature can help reduce specular aliasing from detailed normal maps. Automatic LOD generation for static meshes can flatten features on distant meshes and help reduce aliasing from small triangles.

In our tests, using MSAA instead of TAA increases GPU frame time by about 25%. Actual cost will depend on your content.

To use MSAA, set the default Anti-Aliasing Method in the Rendering project settings:

The console variable ‘r.MSAACount’ controls how many MSAA samples are computed for every pixel. ‘r.MSAACount 1’ has special meaning and falls back to Temporal AA, which allows for convenient toggling between anti-aliasing methods.
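
One place to set these variables is your project’s DefaultEngine.ini (they can also be entered in the in-game console); the values below are illustrative:

```ini
[SystemSettings]
r.MSAACount=4    ; 4x MSAA samples per pixel
; r.MSAACount=1  ; special value: fall back to Temporal AA
```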


The forward renderer can be faster than the deferred renderer for some content. Most of the performance improvement comes from features that can be disabled per material. By default, only the nearest reflection capture will be applied without parallax correction unless the material opts-in to High Quality Reflections, height fog is computed per-vertex, and planar reflections are only applied to materials that enable it.

Leveraging these options in Epic’s new VR game, Robo Recall, the forward renderer is about 22% faster than the deferred renderer on an NVIDIA GTX 970.

New: Contact Shadows

Contact shadows allow for highly detailed dynamic shadows on objects.

The ivy below is only a few flat cards, but it self-shadows in a very convincing way because its material outputs Pixel Depth Offset.

The Contact Shadows feature casts a short ray in screen space against the depth buffer to determine whether a pixel is occluded from a given light. This helps provide sharp, detailed shadows at the contact point of geometry. Other shadowing algorithms can produce missing or blurry contact shadows, typically due to a lack of resolution or a depth bias. Regardless of the cause, the new Contact Shadows feature can fill in the gap very well for a small cost.

Contact shadows can be used by setting the Contact Shadow Length property on your light. This controls the length of the ray cast in screen space, where 1 is all the way across the screen. Large values can degrade quality and performance, so try to keep the length to the minimum that achieves your desired look.
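
The screen-space march can be illustrated with a small sketch (Python; function names, the fixed step count, and the depth-sampling callback are all assumptions for illustration, not engine code):

```python
# Conceptual contact-shadow ray: march from the pixel toward the light in
# screen space; the pixel is occluded if any depth sample along the ray is
# closer to the camera than the pixel itself.

STEPS = 8  # illustrative; real implementations tune step count vs. cost

def contact_shadow(pixel_uv, pixel_depth, light_dir_uv, length, sample_depth):
    """sample_depth(u, v) -> stored depth at that screen position
    (smaller = closer to camera)."""
    for i in range(1, STEPS + 1):
        t = i / STEPS
        u = pixel_uv[0] + light_dir_uv[0] * length * t
        v = pixel_uv[1] + light_dir_uv[1] * length * t
        if sample_depth(u, v) < pixel_depth:
            return True   # something closer blocks the light
    return False
```

Note how a larger `length` covers more of the screen per ray, which is exactly why long Contact Shadow Length values cost more and degrade quality.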

Another use case of contact shadows is to get self-shadowing from the parallax occlusion mapping from arbitrary lights. This requires outputting pixel depth offset in the material. This animation shows a parallax occlusion mapped surface with contact shadow length set to 0.1.

New: Automatic LOD Generation

Unreal Engine now automatically reduces the polygon count of your static meshes to create LODs!

The above animation shows five LODs that were generated automatically. Each has half as many triangles as the previous one.

Automatic LOD generation uses what is called quadric mesh simplification. The mesh simplifier will calculate the amount of visual difference that collapsing an edge by merging two vertices would generate. It then picks the edge with the least amount of visual impact and collapses it. When it does, it picks the best place to put the newly merged vertex and removes any triangles which have also collapsed along with the edge. It will continue to collapse edges like this until it reaches the requested target number of triangles.
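
The collapse loop described above can be sketched as follows. This is a toy version: real quadric simplification uses quadric error matrices and solves for the optimal merged vertex position, whereas this sketch substitutes edge length for "visual difference" and the midpoint for the merged position:

```python
# Toy greedy edge-collapse simplifier (illustrative stand-in for quadric
# mesh simplification; data layout and names are not the engine's).

def simplify(vertices, triangles, target_tris):
    verts = dict(enumerate(vertices))        # id -> (x, y, z)
    tris = [tuple(t) for t in triangles]
    while len(tris) > target_tris:
        # gather current edges and a collapse cost for each
        edges = {}
        for a, b, c in tris:
            for u, v in ((a, b), (b, c), (c, a)):
                e = (min(u, v), max(u, v))
                edges[e] = sum((verts[e[0]][k] - verts[e[1]][k]) ** 2 for k in range(3))
        if not edges:
            break
        u, v = min(edges, key=edges.get)     # edge with least "visual impact"
        # place the merged vertex (here: midpoint; the real algorithm solves
        # for the minimum-error position)
        verts[u] = tuple((verts[u][k] + verts[v][k]) / 2 for k in range(3))
        # redirect v to u and drop triangles that collapsed to a line
        tris = [tuple(u if i == v else i for i in t) for t in tris]
        tris = [t for t in tris if len(set(t)) == 3]
    return verts, tris
```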

This mesh simplifier maintains UVs (including generated lightmap UVs), normals, tangents, and vertex colors. Because UVs are maintained, the same materials can be used and all LODs can share the same lightmap.

The high level settings for controlling the generated LODs are in the static mesh viewer under LOD Settings.

"LOD Group" provides a list of presets. These can be changed per project in BaseEngine.ini under [StaticMeshLODSettings]. We encourage you to set up good categories for your project and mostly use LOD groups instead of controlling the details of every LOD.

An important setting to note is "Auto Compute LOD Distances". Because the algorithm knows how much visual difference every edge collapse adds, it can use this information to determine the distance at which that amount of error becomes acceptable. That means it will automatically calculate the screen size to use for each LOD as well.

If you wish to muck with the details of auto generation for each LOD they can be found under Reduction Settings. Note that this feature currently only works with static meshes and that mesh proxy LOD generation is not yet supported.

New: Precomputed Lighting Scenarios

We now support precomputing lighting for multiple lighting setups with the same geometry! This is especially important for use cases such as VR and architectural visualization where you need the highest possible quality at the fastest possible performance.

In the above example the directional light, sky light and skybox have been placed in a Lighting Scenario level called DayScenario. The streetlights have been placed in NightScenario.

To use Lighting Scenarios:

  • Right click on a sublevel in the Levels window and change it to Lighting Scenario. When a Lighting Scenario level is made visible, its lightmaps will be applied to the world.
  • Change the level streaming method to Blueprint on the Lighting Scenario level
  • Place meshes and lights into this level and build lighting
  • In the BeginPlay of your persistent level’s Level Blueprint, execute a Load Stream Level on the Lighting Scenario level that you want active.


  • Only one Lighting Scenario level should be visible at a time in game.
  • When a Lighting Scenario level is present, lightmap data from all sublevels will be placed inside it so that only the DayScenario lightmaps are loaded when it’s daytime. As a result, lightmaps will no longer be streamed by sublevel.
  • A Reflection Capture update is forced when making a Lighting Scenario level visible, which can increase load time.

New: Improved Per-Pixel Translucent Lighting

In the deferred renderer, the new forward shading functionality can now be used on translucent surfaces to get specular highlights from multiple lights and image-based reflections from parallax-corrected reflection captures!

New: Full Resolution Skin Shading

UE4 now supports full resolution skin shading for the Subsurface Profile shading model. This provides high-fidelity lighting for surface details such as pores and wrinkles.

Checkerboard rendered skin (left), Full resolution skin (right) (Note: 3D head model by Lee Perry-Smith)

Surface detail - checkerboard (left), full resolution (right)

Previously, lighting on skin was represented using a checkerboard pattern, where half the pixels contained diffuse lighting and the other half, specular lighting. The lighting was recombined during a final subsurface profile fullscreen pass. That approach gave good results for subsurface lighting (which is low-frequency by nature), but it could result in lower fidelity lighting for surface details.

With the new approach, every pixel contains diffuse and specular lighting information, packed into an RGBA encoding. This allows us to reconstruct full-resolution lighting during the final subsurface profile fullscreen pass, giving better results for surface details and more stable behavior with temporal antialiasing.

Compatibility

Full resolution skin shading requires at least a 64-bit scene color format with a full alpha channel. The default FloatRGBA scene color format works fine, but 32-bit representations such as FloatRGB are not supported. If the scene color format is not compatible with full resolution skin, we fall back to checkerboard-based lighting. This behavior can be overridden using the r.SSS.Checkerboard console variable. The possible values are:

0: Checkerboard disabled (full resolution)

1: Checkerboard enabled (old behavior)

2: Automatic (default) - full resolution lighting will be used if the scene color pixel format supports it

Limitations

It’s worth noting that full-resolution skin shading is an approximation. It works well in the vast majority of cases, but certain material features can be problematic due to the encoding method. In particular:

  • Metallic materials
  • Emissive materials

These features will work, but you may notice differences in output compared to checkerboard due to the packed RGBA diffuse/specular encoding. It is possible to work around particular issues when authoring materials by setting the opacity to 0 in areas where skin shading is not desirable. Pixels with an opacity of zero are treated as default lit for the purposes of shading.

Note: Masking non-opaque pixels in this way is also worthwhile for performance reasons, since these pixels are bypassed by the subsurface postprocess.

Performance Considerations

If your title uses a 64-bit scene color format, full resolution subsurface lighting will typically be faster than checkerboard due to the reduced number of texture fetches. However, if your title uses a 32-bit scene color format, the bandwidth savings of the smaller format will likely outweigh the benefits of full resolution lighting (although this is hardware dependent).

New: Reflection Capture Quality Improvements

When you use Reflection Captures, the engine mixes the indirect specular from the Reflection Capture with indirect diffuse from lightmaps. This helps to reduce leaking, since the reflection cubemap was only captured at one point in space, but the lightmaps were computed on all the receiver surfaces and contain local shadowing.

(With lightmap mixing on the left, without on the right)

Mixing works well for rough surfaces, but for smooth surfaces the reflections from Reflection Captures no longer match reflections from other methods, like Screen Space Reflections or Planar Reflections.

Lightmap mixing is no longer done on very smooth surfaces. A surface with roughness .3 will get full lightmap mixing, fading out to no lightmap mixing by Roughness .1 and below. This allows Reflection Captures and SSR to match much better and it's harder to spot transitions.
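
The stated fade can be written as a simple clamped ramp. (The engine’s exact curve may differ; this sketch only matches the endpoints described above.)

```python
# Lightmap mixing factor vs. roughness: full mixing at roughness 0.3 and
# above, none at 0.1 and below, linear ramp in between (illustrative).

def lightmap_mix_factor(roughness):
    t = (roughness - 0.1) / (0.3 - 0.1)
    return min(1.0, max(0.0, t))
```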

The below shot shows mirror surface reflections before and after. Note the difference in the reflection of the wall between SSR and reflection captures. The artifact is especially noticeable in motion, because it will move with your camera due to SSR limitations.

This affects existing content - in cases where you had reflection leaking on smooth surfaces, that leaking will be much more apparent. To solve this, place additional reflection probes to reduce the leaking. Levels should have one large spherical capture at a minimum. You can also revert to the old lightmap mixing behavior with a rendering project setting:

New: Visual Studio "15" Support

Unreal Engine 4.14 now supports the upcoming Visual Studio "15" out of the box. Visual Studio 2015 is still supported as well. Visual Studio “15” is currently available in “Preview” from Microsoft’s Visual Studio web site.

If you have multiple versions of Visual Studio installed, you can select which to use through the ‘Source Code’ section in ‘Editor Preferences.’

New: Create Static Mesh from Actors

You can now right-click actor(s) in the level viewport and convert their current state to a new Static Mesh asset. This even works with skeletal meshes, so you can capture a mesh from posed characters.

New: NVIDIA Ansel Support

Unreal Engine 4.14 adds support for NVIDIA Ansel Photography! Ansel is a new tool from NVIDIA that enables players to take in-game screenshots. While in Ansel mode the game will pause and players will have camera control to compose shots and apply various screen effects. It can also capture a variety of screenshots, from HDR to 360 stereo. See NVIDIA’s website for more details.

Ansel support is now exposed as a new UE4 plugin. After enabling the plugin in your project, you can access Ansel in a standalone game session.

(Viewing an Ansel 360 capture in a web browser)

We have also exposed functions on the Player Camera Manager class so your games can customize Ansel capture behavior. Games may wish to limit the distance of camera movement, disable UI elements, disable/enable certain lighting or post processing effects, etc. Thanks to Adam Moss and NVIDIA for providing the implementation. To get started using this feature, check out the ‘Ansel_integration_guide.html’ document under the Ansel plugin folder. Official UE4 documentation for Ansel will be coming soon.

New: Improved Cable Component

The Cable Component plugin has been updated with new features, including collision support and sockets for attaching objects or effects.

Cable Component now includes these new features:

  • Simple collision, including friction settings
  • Stiffness setting, which tries to reduce bending
  • Sockets at each end of the cable
  • Ability to set either end to ‘free’

New: UI Font Outlines

Fonts for UMG and Slate now have an optional outline that can be applied to them.

Any widget that specifies a font can change the outline setting, color, or material to be used with the outline.

A font material specified for an outline can be used in the same way as any other font material, except that it applies only to the outline. Outline materials can be used to create many different effects.

New: Editable Map and Set Properties

We now support editing Map and Set properties from within the Details Panel!

Sets are similar to Arrays, but you can never have the same element in a Set twice, and the order of elements is not guaranteed. However, it’s extremely quick to look up whether a Set contains an element.

Maps have a key and a value, and you can edit both within the Details panel. Like Sets, all keys must be unique, and the order of elements is not guaranteed to persist. However, it’s very quick to look up an element’s value as long as you know its key.
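
These semantics mirror hash-based containers in most languages; as a quick illustration (Python’s set and dict standing in for UE4’s TSet and TMap):

```python
# Set: duplicates are ignored, membership tests are O(1) on average.
s = set()
s.add("apple")
s.add("apple")            # second add is a no-op
assert len(s) == 1
assert "apple" in s

# Map: unique keys, fast value lookup by key; re-adding a key overwrites.
m = {}
m["health"] = 100
m["health"] = 75          # key must stay unique, so the value is replaced
assert m["health"] == 75
```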

New: Vector Noise in Materials

The Noise material graph node includes several functions useful for procedural shading that produce a single-valued (scalar) result.

(Pictured, left to right: Cellnoise, Vector Noise, Gradient, Curl, Voronoi)

The new Vector Noise node adds several more with 3D or 4D vector results. Due to the run-time expense of these functions, it is recommended that once a look is developed with them, all or part of the computation be baked into a texture using the Draw Material to Render Target Blueprint feature introduced in 4.13. These material graph nodes allow procedural looks to be developed in engine on final assets, providing an alternative to creating procedurally generated textures with an external tool to apply to assets in the engine. The new functions are:

1. Cellnoise: Returns a random color for each cell in a 3D grid (i.e. from the mathematical floor operation applied to the node input). The results are always consistent for a given position, so can provide a reliable way to add randomness to a material. This Vector Noise function is extremely cheap to compute, so it is not necessary to bake it into a texture for performance.

2. Perlin 3D Noise: Computes a version of Perlin Simplex Noise with 3D vector output. Each output component is in the range -1 to 1. Computing three channels of noise output at once is cheaper than merging the results from three scalar noise functions.

3. Perlin Gradient: Computes the analytical 3D gradient of a scalar Perlin Simplex Noise. The output is four channels, where the first three (RGB) are the gradient, and the fourth (A) is the scalar noise. This is useful for bumps and for flow maps on a surface.

4. Perlin Curl: Computes the analytical 3D curl of a vector Perlin Simplex Noise (aka Curl Noise). The output is a 3D signed curl vector. This is useful for fluid or particle flow.

5. Voronoi: Computes the same Voronoi noise as the scalar Noise material node. The scalar Voronoi noise scatters seed points in 3D space and returns the distance to the closest one. The Vector Noise version returns the location of the closest seed point in RGB, and the distance to it in A. Especially coupled with Cellnoise, this can allow some randomized behavior per Voronoi cell. Below is a simple stone bed material using the distance component of the Vector Noise / Voronoi to modulate some surface bumps and blend in moss in the cracks, and the seed position together with Vector Noise / Cellnoise to change the color and bump height per rock.
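
As a rough illustration of the Cellnoise idea, here is a minimal sketch: hash the floor of the 3D position to a deterministic pseudo-random color. (The hash constants and channel packing are assumptions for illustration, not the engine’s implementation.)

```python
import math

def cellnoise(x, y, z):
    """Deterministic pseudo-random RGB color per integer 3D cell."""
    ix, iy, iz = math.floor(x), math.floor(y), math.floor(z)
    # mix the cell coordinates with illustrative hashing primes
    h = (ix * 73856093) ^ (iy * 19349663) ^ (iz * 83492791)
    h &= 0xFFFFFF
    # split the 24-bit hash into three channels in [0, 1)
    return ((h >> 16) / 256.0, ((h >> 8) & 0xFF) / 256.0, (h & 0xFF) / 256.0)
```

Because the result depends only on the cell, any position inside the same unit cell returns the same color, which is what makes it a reliable source of per-cell randomness.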


Perlin Curl and Perlin Gradient can be added together in octaves, just as regular Perlin noise can. For more complex expressions, it is necessary to compute the gradient of the result of the expression. To help with this, place the expression to compute into a material function and use it with the helper nodes Prepare3DDeriv, Compute3DDeriv, and either GradFrom3DDeriv or CurlFrom3DDeriv. These use four evaluations of the base expression spaced in a tetrahedral pattern to approximate these derivative-based operations. For example, this network uses the gradient to compute bump normals from a bump height function.

New: PhysX 3.4 Upgrade

Unreal Engine now uses the latest version of NVIDIA PhysX, which is 3.4. This brings improved performance and memory usage for rigid bodies and scene queries (especially multi-core performance).

This version of PhysX allows for Continuous Collision Detection (CCD) on kinematic objects, which allows for accurate collisions between very fast moving rigid bodies! In the animation below from a Robo Recall test level, a player is swiping a weapon to impact an oncoming bullet!

New features available to use in UE4 right away:

  • Continuous Collision Detection (CCD) support for kinematic objects (shown in the animation above!)
  • Faster updating of kinematic objects
  • Faster convex hull cooking

In future releases, we’ll expose more new physics features available in the latest version of PhysX.

New: Animation Editor Revamp

Animation-related tools have been split into separate asset editors rather than using one editor with multiple modes.

Many other improvements have been made as well. Functionality that is common to each of the editors is now generally found in the viewport and the improved Skeleton Tree.

  • The Skeletal Mesh editor has had modifications to its layout and to the asset details panel; in particular, the materials and LOD sections have been overhauled.
  • The Skeleton editor has had its layout tweaked and the skeleton tree itself has been polished.
  • The Animation editor has had its layout tweaked and the asset browser has gained the ability to optionally add and remove its columns.
  • The Animation Blueprint editor has had its layout tweaked to more closely follow that of the standard Blueprint editor. The Anim Preview Editor can now optionally apply changes that are made to the preview’s properties to the class defaults.

Asset Shortcut Bar

You can jump between related animation assets that share a skeleton using the improved Asset Shortcut Bar.

Recording Moved to Transport Controls

Recording used to be performed via a button in the toolbar. Now it has been moved to a recording button in the transport controls, similar to Sequencer.

Preview Scene Setup

The objects in the scene and their animation can be modified in each of the editors via the "Scene Setup" menu. This allows preview animations to be applied, different preview meshes to be set (this is either specified for the skeleton or for individual animations) and additional meshes to be attached. Additional meshes are now specified as separate editor-only assets that define a set of skeletal meshes that are driven as slaves of the main mesh.

New: Animation Curve Window

You can now easily tweak Animation Curves using the new dedicated window for this in the Animation Editor. Curves are previewed live as you edit them.

Previously you could only configure curves on the animation assets themselves, but now you’ll set these for the skeleton instead.

New: Child Actor Templates

Child Actor Components added to a Blueprint can have their properties customized via Child Actor Templates.

Once you add a Child Actor Component, you will see an expandable template in the Details panel of the owning Actor's Blueprint Editor. From here, you can access all the properties of the Child Actor, including public variables. For example, if you have Blueprint_A containing a PointLight Component with a public variable driving its color, and then make that Blueprint a Child Actor Component within Blueprint_B, you can now adjust that color variable from within Blueprint_B's Details panel!

This is a dramatic improvement over previous behavior, wherein users were restricted to the default properties of the Child Actor Component and could only make updates via gameplay script.

New: Default Animation Blueprint

You can now assign a default animation Blueprint to a skeletal mesh that will always run after any animation Blueprint assigned in the component. This allows you to set up anim dynamics or other controllers that will always be applied, whether that mesh is viewed in the animation tools, a Sequencer cinematic, or simply placed in a level.

This allows for dynamics, controllers, IK or any other anim Blueprint feature to be related to a mesh and not have to be duplicated in every animation Blueprint intended to be used on that mesh.

‘Post process’ animation Blueprints also have their own native and Blueprint update step so parameters can be read or calculated for use in the animation graph.

New: Landscape Editing in VR

You can now create and sculpt terrain and paint landscape materials using motion controllers in VR!

You can summon the Landscape Editing tools from the "Modes" panel on your Quick Menu. Then choose a brush from the UI and start painting! If you hold the “Modifier” button on the motion controller, you can erase instead of painting.

New: Improved Support for Vehicles

We’ve changed where tire forces are applied. Previously, tire forces were applied at the vehicle’s center of mass. We now apply force at the tire’s center of mass which makes it easier to achieve load sway in cars.

We’ve also added Simple Wheeled Vehicle Movement Component which provides wheel suspension and tire friction without the complexities of engine and drivetrain simulation. This component allows you to easily apply torque to individual tires. All components inheriting from Wheeled Vehicle Movement Component can now be used on arbitrary components, and you no longer have to rely on the Wheeled Vehicle actor.

Existing content will automatically have Deprecated Spring Offset Mode set to true which will maintain the old behavior. You can tune this behavior further by changing Suspension Force Offset.

New: Improved Vulkan support on Android

Unreal Engine 4.14 is ready for shipping games with Vulkan support!

  • UE4 supports Android 7 (Nougat) devices with Vulkan drivers as well as the Samsung Galaxy S7 running a recent OTA update.
  • Many rendering issues have been fixed with the UE4 Vulkan renderer on Android devices.
  • The renderer will automatically fall back to OpenGL ES when launched on Android devices that are not Vulkan-capable.
  • Vulkan support on specific devices and driver versions can now be enabled or disabled using device profiles, with fallback to ES 3.1 and ES 2. This allows UE4 games to disable Vulkan support and use OpenGL ES on phones with incomplete or broken Vulkan implementations.

New: Support for Custom Depth on Mobile

Custom Depth is now supported in the mobile rendering path. Custom post-process materials can now sample from Scene Depth and Custom Depth, as well as Scene Color.

As it requires post-processing, Mobile HDR must be enabled, and the feature does not currently work while Mobile MSAA is enabled.

New: Scene Capture Improvements on Mobile

When rendering scene captures, the Scene Capture Source settings that output Inverse Opacity and Depth values are now supported on mobile.

  • The "SceneColor (HDR) in RGB, Inv Opacity in A" option can be used to render objects with translucency into a texture which can then be alpha-blended over a scene or widget blueprint.
  • Similarly, the depth value can be used as a mask when using the resulting texture.
  • Generating the opacity data has some cost, so use "SceneColor (HDR) in RGB, 0 in A" for improved performance if you do not need opacity
  • Scene captures now work correctly on devices that do not support floating point targets, such as Galaxy S6 prior to Android 6.

New: Improved Cloth Skinning

We have added the ability to calculate our own mesh-to-mesh skinning data for clothing within the engine, so rather than using the render data exported in an .apx or .apb file we now use the render data UE4 already has. We take the simulation mesh from the APEX-exported asset and reskin our render data onto that mesh. This means that the final data should look as good as the data you originally imported.

This brings a few benefits: previously, normals could appear incorrect (see image below), and you were restricted to one UV channel. Both of these issues are solved with the new skinning system.

New: Material Attribute Nodes

As part of an ongoing update to improve the extensibility of material properties, working with material attributes is now easier to read and less error-prone.

  • GetMaterialAttributes - This node is a compact replacement for BreakMaterialAttributes
  • SetMaterialAttributes - This node is a compact replacement for MakeMaterialAttributes
  • BlendMaterialAttributes - This is a new node to allow easier blending of Material Attributes structures.

The main improvement for the Get and Set nodes is that pins are optionally added unlike the Break and Make nodes which expose all attributes by default. This allows graphs to avoid the old workflow that required manually connecting every attribute pin. Selecting a node shows the list of current pins in the details panel which can be expanded or removed. For an example, the material function below takes a set of attributes then blends the Base Color and Roughness to a shiny, red surface.

As well as reducing clutter in the graphs, these nodes take advantage of many backend changes to be forward-compatible with any custom material attributes that a project may need to add. Sharing materials between projects is more viable, as missing attributes are automatically detected and users are given a chance to handle the errors. Any attribute not explicitly listed on a node is passed through with the main Material Attributes pin, including any that are added after the material graph is created. With the Make and Break nodes, a new pin would be added and all graphs would need manual updating.

The new Blend node is intended to allow blending of multiple sets of attributes using a mask, a common operation when working with detailed layers of materials. The example below evenly blends Red and Green materials (defined as functions) then has a node that applies a clear-coat to the result:

By default the Blend node performs a linear interpolation (lerp) for all material attributes using the Alpha input. The node has checkboxes to opt-out of blending on a per-vertex/pixel level to allow easier control when using vertex-only or pixel-only mask data. Similarly to the new Get and Set nodes above, the Blend node will automatically handle new attributes being added or removed and allows programmers to specify custom blending behavior when registering attributes.

New: Pre-Skinned Local Position in Materials

Materials now have access to a skeletal mesh’s reference pose position for use in per-vertex outputs. This allows localized effects on an animated character. The node can also be used with static meshes, for which it returns the standard local position. The example graph below creates a grid pattern in local space which remains relative to the skeletal mesh during animation:
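The grid pattern itself is simple to express: divide the local-space position by a cell size, take the fractional part, and threshold it. A standalone sketch of the idea for one coordinate (the cell size and line width are illustrative values, not taken from the example graph):

```cpp
#include <cmath>

// Grid mask from a local-space coordinate, mimicking what the material
// graph does with the Pre-Skinned Local Position node: the fractional
// part repeats every cell, and thresholding it draws the grid lines.
float GridLine(float Coord, float CellSize, float LineWidth)
{
    float Cells = Coord / CellSize;
    float Frac = Cells - std::floor(Cells);   // 0..1, repeats per cell
    return (Frac < LineWidth) ? 1.0f : 0.0f;  // 1 on a grid line, else 0
}
```

Because the input is the pre-skinned position, the pattern stays glued to the mesh surface as it animates instead of swimming in world space.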

New: Improved Sequencer Shot Import/Export

Movie recording now supports frame handles per shot: master sequences can be rendered with extra frames at the start and end of each shot. An Edit Decision List (EDL) cuts into and out of these extra frames, and can be used in an external video editing package to adjust the cuts between shots.

New: Improved Camera Rig Crane

We’ve tweaked the camera rig crane behavior so that it mimics the movement of a physical crane.

  • The roll and yaw of the camera crane mount are now fixed at 0.
  • Added toggles to lock the mount pitch/yaw to the crane. By default they are unlocked, so the camera stays level with the ground.

New: Sequencer Audio Recording

You can now record audio from a microphone while recording into a sequence.

New: Pose Driver Improvements

The Pose Driver node allows a bone to drive other aspects of animation, based on a set of ‘example poses’. In this release, it can now drive bone transforms as well as morph targets, for example driving a shoulder pad bone based on arm rotation. We have also added an option to use the translation of the driving bone instead of its orientation. Debug drawing has been improved to show each ‘target’ pose and how close the bone is currently considered to it.
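In spirit, the node weights each example pose by how close the driving bone currently is to it, then blends the driven values accordingly. A simplified one-dimensional sketch using a Gaussian falloff (the engine's actual interpolation is more sophisticated, and all names here are illustrative):

```cpp
#include <cmath>
#include <vector>

// Each example pose pairs a driving-bone value (a single angle here, for
// simplicity) with a driven value such as a morph target weight.
struct FExamplePose
{
    float DriverAngle;
    float DrivenValue;
};

// Weight each example by its closeness to the current driver angle using
// a Gaussian falloff, then return the normalized weighted average.
float EvaluatePoseDriver(const std::vector<FExamplePose>& Poses,
                         float Angle, float Sigma)
{
    float WeightSum = 0.0f;
    float ValueSum = 0.0f;
    for (const FExamplePose& P : Poses)
    {
        float D = Angle - P.DriverAngle;
        float W = std::exp(-(D * D) / (2.0f * Sigma * Sigma));
        WeightSum += W;
        ValueSum += W * P.DrivenValue;
    }
    return (WeightSum > 0.0f) ? ValueSum / WeightSum : 0.0f;
}
```

Halfway between two example poses, the driven value lands halfway between their target values, which is the smooth falloff the improved debug drawing visualizes.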

New: Virtual Bones

We’ve added the ability to add ‘virtual bones’ to a skeleton. Virtual bones are not skinnable, but are constrained between two existing bones on the skeleton, and data is automatically generated for them for each animation on the skeleton. For example, you could add a joint that is a child of the hand but constrained to a palm joint. Unlike a socket, this joint can then be used as a target in an Animation Blueprint - e.g. an IK or look-at target - or modified in the AnimBP for later use.

This helps improve character iteration time. Previously, if you changed your target joint hierarchy for IK or aiming, you had to make the change outside the engine in a DCC tool and reimport all of the animations so the animation data included the new joint. Virtual bones let you skip that and do all of the work in the engine, although the animation data must be recompressed to include the new joint. Virtual bones also make it easier to retarget or change reference frames for controllers, and are used for orientation and slope warping in Paragon. For more practical usage, see "Animation Techniques used in Paragon".

New: Morph Target Debug View Mode

The new Morph Target View Mode makes it easy to see which vertices are affected by each morph target.

New: Child Animation Montages

You can now create a Child Montage based on a parent Montage, allowing you to replace animation clips whilst maintaining overall timing. This is useful for adding variations to a move whilst guaranteeing it won’t affect gameplay.

New: MIDI Device Plugin

This release contains a new "MIDI Device" plugin for interacting with music hardware.

This is a simple MIDI interface that allows you to receive MIDI events from devices connected to your computer. Currently only input is supported. In Blueprints, here's how to use it:

  • Enable the "MIDI Device" plugin using the Plugins UI, then restart Unreal Editor.
  • Look for "MIDI Device Manager" in the Blueprint RMB menu.
  • Call "Find MIDI Devices" to list the connected devices and choose the one you want.
  • Break the "Found MIDI Device" struct to see what's available.
  • Then call "Create MIDI Device Controller" for the device you want. Store that in a variable. (It’s really important to store the reference to the object in a variable, otherwise it will be garbage collected and won’t receive events!)
  • On your MIDI Device Controller, bind your own Event to the "On MIDI Event" event. This will be called every game Tick when there is at least one new MIDI event to receive.
  • Process the data passed into the Event to make your project do stuff!
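The event data you receive is raw MIDI: the status byte packs the message type into its high nibble and the channel into its low nibble (note-on is type 0x9, note-off is 0x8). A standalone sketch of that decoding, which is plain MIDI bit manipulation rather than plugin code:

```cpp
#include <cstdint>

// Decoded form of a MIDI status byte (illustrative struct, not plugin types).
struct FMidiStatus
{
    uint8_t Type;    // 0x9 = note on, 0x8 = note off, 0xB = control change...
    uint8_t Channel; // 1-16, as usually displayed to users
};

// Split a raw status byte: message type in the high nibble,
// zero-based channel in the low nibble (shown 1-based here).
FMidiStatus DecodeStatus(uint8_t Status)
{
    return { static_cast<uint8_t>(Status >> 4),
             static_cast<uint8_t>((Status & 0x0F) + 1) };
}
```

So a status byte of 0x93 means "note on, channel 4", which you can branch on when processing the values passed into your bound Event.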

New: Landscape Rotation Tool

The Landscape Mirror tool can now also flip the reflected geometry parallel to the mirror plane (equivalent to a 180-degree rotation), which is useful for creating diagonally-opposed multiplayer maps.

New: Improved Mesh Material Slot Importing

The material workflow has been changed to give more control over, and information about, how each material is used by static and skeletal meshes, and to fix material ordering inconsistencies when reimporting meshes.

Each element in the list is a material slot with the following information:

  • Name of the slot
    • The name of the slot is used to match up the material on reimport. When a mesh is reimported, the importer looks for this name in the FBX file to determine which sections should map to existing materials. Previously this relied on index ordering, which could easily fall out of sync.
    • Meshes imported before this change will have their material slot names set to None. Meshes imported after this change will default to the imported material name.
  • Material asset reference
  • The original imported material name (in the tooltip)

In Blueprints and C++ it is now possible to use the material slot name instead of a hard-coded index to retrieve a material slot. Call Set Material By Name to set a dynamic material on a Skeletal Mesh or Static Mesh component. Using a name lookup instead of an index ensures gameplay code still works properly if the order of materials on a mesh changes.
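The robustness argument is easy to see in isolation: resolving a slot by name keeps working no matter how a reimport reorders the list, while a stored index silently points at the wrong material. A simplified standalone sketch (plain C++ stand-ins, not the engine's component types):

```cpp
#include <string>
#include <vector>

// Resolve a material slot by name rather than by a hard-coded index, so
// gameplay code keeps working if a reimport reorders the mesh's materials.
int FindSlotIndexByName(const std::vector<std::string>& SlotNames,
                        const std::string& Name)
{
    for (size_t i = 0; i < SlotNames.size(); ++i)
    {
        if (SlotNames[i] == Name)
        {
            return static_cast<int>(i);
        }
    }
    return -1; // slot not found
}
```

If "Glass" moves from slot 1 to slot 0 after a reimport, a name lookup still finds it; code that cached index 1 would now be touching "Body" instead.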

New: Platform SDK Upgrades

In every release, we update the engine to support the latest SDK releases from platform partners.

  • Xbox One: Upgraded to August 2016 QFE 2
  • PlayStation 4: Upgraded to SDK 4.008.061
  • HTML5: Upgraded to Emscripten 1.36.13
  • macOS: Now supports 10.12 Sierra, Xcode 8.1
  • iOS/tvOS: Now supports iOS10/tvOS10, Xcode 8.1

New: Blueprint Library for Mobile Downloading/Patching

The new Mobile Patch Utilities Blueprint library contains all the functionality a mobile game needs to download and install game content and patches from a cloud website, instead of distributing everything as part of the initial download from the App Store.

There is functionality to determine whether updated game content is available, initiate the download, track progress, handle any errors, and finally install the content paks that download successfully. Functions to check for sufficient storage space and WiFi connectivity are also available, so the Blueprint can warn the user in such cases. Both Android and iOS are supported.

New: Amazon GameCircle Plugin for Kindle Fire

A new Online Subsystem GameCircle plugin is now included!

GameCircle Achievements, Leaderboards, and Friends are supported, as well as Amazon In-App Purchases. Enabling the plugin adds a new Amazon GameCircle project settings panel under the Plugins category, where changes to the AndroidManifest.xml for Fire TV may be enabled.

New: Live GPU Profiler

UE 4.14 includes a real-time GPU profiler that provides per-frame stats for the major rendering categories. To use it, enter the console command ‘stat gpu’. You can also bring these stats up in the editor via the ‘Stat’ submenu of the Viewport Options dropdown.

The stats are cumulative and non-hierarchical, so you can see the major categories without having to dig down through a tree of events. For example, shadow projection is the sum of all the shadow projections for all lights (across all the views).

The on-screen GPU stats provide a simple visual breakdown of the GPU load while your title is running. They are also useful for measuring the impact of changes instantly; for example, when changing console variables, modifying materials in the editor, or modifying and recompiling shaders on the fly (with ‘recompileshaders changed’).

The GPU stats can be recorded to a file when the title is running for analysis later. As with existing stats, you can use the console commands ‘stat startfile’ and ‘stat stopfile’ to record the stats to a ue4stats file, and then visualize them by opening the file in the Unreal Frontend tool.

Profiling the GPU with UnrealFrontend. Total, postprocessing and basepass times are shown

Stats are declared in code as float counters, e.g.:

DECLARE_FLOAT_COUNTER_STAT(TEXT("Postprocessing"), Stat_GPU_Postprocessing, STATGROUP_GPU);

Code blocks on the rendering thread can then be instrumented with SCOPED_GPU_STAT macros which reference those stat names. These work similarly to SCOPED_DRAW_EVENT. For example:

SCOPED_GPU_STAT(RHICmdList, Stat_GPU_Postprocessing);

GPU work that isn’t explicitly instrumented is included in a catch-all [unaccounted] stat. If that stat gets too high, it indicates that additional SCOPED_GPU_STAT events are needed to account for the missing work. It’s worth noting that, unlike draw events, GPU stats are cumulative: you can add multiple entries for the same stat, and they are aggregated across the frame.

In certain CPU-bound cases, the GPU timings can be affected by CPU bottlenecks (bubbles) where the GPU is waiting for the CPU to catch up, so please consider that if you see unexpected results when draw thread time is high. On PlayStation 4 we correct for those bubbles by excluding the time between command-list submissions from the timings. In future releases we will extend that functionality to other modern rendering APIs.

New: Improved Merge Actor Texture Atlas (experimental)

We’ve improved how texture space is utilized when merging actors together and combining materials, by introducing a new option to generate a weighted (binned) atlas texture, instead of an atlas texture in which each material is equally weighted.

(Left: Equally weighted materials. Right: Binned method.) The new functionality first calculates the importance of each material according to the largest texture it samples. These values are then used to calculate the amount of atlas space each material should occupy and to iteratively add each texture to the atlas.