Unreal Engine 4.13 Released!
September 1, 2016

By Chance Ivey

This release brings hundreds of updates for Unreal Engine 4, including 145 improvements submitted by the community of Unreal Engine developers on GitHub! Thanks to all of these contributors to Unreal Engine 4.13:

alk3ovation, Allegorithmic (Allegorithmic), Alwin Tom (alwintom), Andreas Axelsson (judgeaxl), Andrew Scheidecker (AndrewScheidecker), Andrian Nord (NightNord), ArnoB (ABeekhub), Artem (umerov1999), Artem V. Navrotskiy (bozaro), Błażej Szczygieł (zaps166), Brent Scriver (FineRedMist), Cedric Neukirchen (eXifreXi), Céleste (CelPlays), Chris Conway (Koderz), Chris528, Christoph Becher (chbecher), Christopher P. Yarger (cpyarger), DanielDylan, DaveC79, Derek van Vliet (derekvanvliet), DevVancouver, Douglas Lassance, Eric-Ketchum, Eugene (grisevg), Franco Salas (SupremeNinjaMaster), gameDNA (gameDNAstudio), ghost, Joel McGinnis (joelmcginnis), Jonathan Johansson (DualCoder), Jørgen P. Tjernø (jorgenpt), Joshua Kaiser (JoshuaKaiser), korkuveren, Kory Postma (korypostma), Krish Munot (KrishMunot), Kuts Alexey (krunt), Lars Jørgen Solberg (larsjsol), Lectem, Lee Reilly (leereilly), Lukasz Baran (iniside), madsystem, Manny (Manny-MADE), Marat Radchenko (slonopotamus), Markus Breyer (pluranium), Martin Gerhardy (mgerhardy), Marvin Pohl (pampersrocker), massanoori, Mateusz Polewaczyk (matii1509), Matthias Huerbe (MatzeOGH), Matthijs Lavrijsen (Mattiwatti), MaximDC, mfortin-bhvr, Michael Allar (Allar), Michael Schoell (MichaelSchoell), mik14a, Miłosz Kosobucki (MiKom), mkirzinger, Moritz Wundke (moritz-wundke), Nachtmahr (NachtMahr87), Narendra Umate (ardneran), NaturalMotionTechnology, Oleksandr Kuznietsov (Teivaz), OWIAdmin, Patryk Stępniewski (kodomastro), Paul Evans (paulevans), pfranz, Piotr Bąk (Pierdek), PistonMiner, projectgheist, Rama (EverNewJoy), Ricardo Maes (kukiric), Rick Deist (dreckard), Robert Segal (robertfsegal), RobertTroughton, Rohan Liston (rohanliston), Saffron (SaffronCR), Sajid (sajidfarooq), salamanderrake, Samuel Maddock (samuelmaddock), Sébastien Rombauts (SRombauts), Tanner Mickelson (DarthCoder117), Thomas Mayer (tommybear), tmiv, Tyler Thompson (Bogustus), Victor Polevoy (vityafx), Webster Sheets (Web-eWorks), Wesley Hearn (wshearn), yehaike, Yohann Martel (ymartel06), Yong-Quan Chen (wingedrobin), Yu He (yuhe00), Zachary Burke (error454)

What’s New

Unreal Engine 4.13 has arrived! In this version you'll find numerous improvements across the board.

Many new rendering features have been added, including mesh decals, Blueprint drawing to render targets, GPU morph targets, refraction improvements, and high-quality, optimized noise functions for materials. Shadow map caching allows for more shadow-casting dynamic lights in a scene than ever before!

Sequencer, our new non-linear cinematic editor, has been updated with a slew of new features for high-end cinematography. Live recording from gameplay has been significantly improved, and you can now transfer shots and animations back and forth between Sequencer and external applications. You can see these features in our SIGGRAPH Real-Time Live! 2016 demonstration.

Alembic support allows you to import complex and interesting vertex animations. And the new Physical Animation Component lets your characters respond realistically to physical forces by driving their skeletal animation through motors.

For mobile developers, dynamic shadows have been optimized, full-precision materials are supported, and custom post-processing is now possible. OpenGL ES 3.1 can now be used on Android, and binary shader caching will improve your iteration times.

VR games can now use capsule shadows, and stereo instancing has been optimized. Oh, and check out the new VR Template Project! It's a great example of how to use motion controllers to interact and navigate in your VR game.

Want to build your levels while in VR? Unreal's VR Editor has been improved with support for mesh and foliage painting, a new transform gizmo, and VR color picking. Finally, you can now instantly play your game right from VR! To try it out, turn on "Enable VR Editing" in the Experimental section of your Editor Preferences.

Major Features

New: Sequencer Live Recording

Sequencer has been updated with new Live Recording improvements, as shown at SIGGRAPH Real-Time Live! 2016.

The Sequence Recorder allows you to capture live gameplay, including all animation, audio and effects, into a standalone asset that you can edit using Sequencer! New features in this release:

  • Quickly record selected actors and automatically create a Camera Cuts track if a camera is recorded.

  • Ability to specify arbitrary components and properties to record.

  • Optionally record data to an actor possessed in the level.

  • Record transforms in world space when an actor is attached but not recorded.

New: Shadow Map Caching for Movable Lights

When a point or spot light is not moving, we can store off the shadow map for that light and reuse it next frame. This is now done automatically and makes shadow-casting movable point and spot lights much more affordable in games where the environment is often not moving.

The above image shows 33 dynamic shadow-casting point lights, with very minimal overhead:

Performance results on a GTX 970 at 1920x1200

  • 33 shadow-casting point lights without caching: 14.89ms to render Shadow Depths.

  • With cached shadow maps: 0.9ms (about 16 times faster!)

  • Note that it still costs 2ms to render the contributions of the 33 point lights, which can be optimized in other ways but is not affected by this change.

  • Memory used by the cache can be seen under ‘Stat ShadowRendering’ and was 25.6 MB in this scene.

  • Max memory used by the cache can be controlled with ‘r.Shadow.WholeSceneShadowCacheMb’.
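If you want to adjust that cache budget from game code rather than from an ini file or the console, a minimal C++ sketch might look like this (the helper function is hypothetical; the budget value is up to your project):

    #include "HAL/IConsoleManager.h"

    // Minimal sketch: raise or lower the cached shadow map memory budget at runtime
    // via the r.Shadow.WholeSceneShadowCacheMb console variable mentioned above.
    void SetShadowCacheBudgetMb(int32 BudgetMb)
    {
        if (IConsoleVariable* CVar =
                IConsoleManager::Get().FindConsoleVariable(TEXT("r.Shadow.WholeSceneShadowCacheMb")))
        {
            CVar->Set(BudgetMb); // e.g. 150 for a 150 MB budget; tune per project
        }
    }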

Limitations

  • By default, caching can only happen when:

    • Primitives have mobility set to Static or Stationary

    • Materials used do not use World Position Offset

    • The light is a point or spot light that casts shadows, with mobility set to Movable but not currently moving

  • Materials that use animated Tessellation or Pixel Depth Offset can cause artifacts, as their shadow depths are cached.

New: Voronoi Noise Materials

We’ve added a new Voronoi noise option available for the Noise material node. Voronoi noise, also sometimes called Worley or Cellular noise, can be useful for procedural material creation.

Voronoi noise can be used to generate patterns for familiar physical materials such as marble, as seen on the statue below.

This example uses a Voronoi noise with a technique called ‘gradient mapping’ to achieve a marble look.

From Left to Right:

1) Standard Voronoi Noise, 1 octave

2) Voronoi with "Gradient" noise added to input position, set to 0.05

3) Gradient noise multiplied by 0.3 before adding to Voronoi input position

4) Using result of step 3 as texture coordinates for a random tiling texture

The Voronoi noise has four quality levels, with decreasing grid artifacts at the higher levels at the cost of significantly increased shading time.

Also, performance has been improved for several of the Noise Material Node features, with a more detailed description of the performance tradeoffs in the function selection tooltips. Most of these can be slow for runtime use, so baking the results into a texture is encouraged.

New: Blueprint Drawing to Render Targets

Blueprint functions can now be used to draw materials into render targets. This enables a huge variety of game-specific rendering effects to be implemented without having to modify source code.

This is a fluid surface simulation implemented entirely in Blueprint and Material graphs. Characters and projectiles can push the fluid around!

The image above shows a simple heightfield painter made entirely in Blueprint, which accumulates a height value at projectile impacts.

The new Blueprint function Draw Material to Render Target draws a quad filling the destination render target with the Emissive Color input in the material.

Upon starting the game, Begin Play is called and the render target is filled with blue. You can then right-click on the render target and save it as a static Texture which can be compressed.
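For C++ projects, the same drawing functionality is exposed through UKismetRenderingLibrary. Here is a minimal sketch (the helper function is hypothetical, and the render target and material are assumed to come from your own assets):

    #include "Kismet/KismetRenderingLibrary.h"
    #include "Engine/TextureRenderTarget2D.h"
    #include "Materials/MaterialInterface.h"

    // Minimal sketch: clear a render target to blue, then fill it with a material's
    // Emissive Color output. WorldContext can be any object that lives in the world,
    // such as an actor.
    void PaintRenderTarget(UObject* WorldContext,
                           UTextureRenderTarget2D* RenderTarget,
                           UMaterialInterface* Material)
    {
        UKismetRenderingLibrary::ClearRenderTarget2D(WorldContext, RenderTarget, FLinearColor::Blue);
        UKismetRenderingLibrary::DrawMaterialToRenderTarget(WorldContext, RenderTarget, Material);
    }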

For more advanced drawing to a render target, use Begin Draw Canvas to Render Target and End Draw Canvas to Render Target. These allow multiple efficient draws to a subset of the render target, as well as access to font drawing methods through the Canvas object.

Being able to render off-screen from Blueprint enables a ton of rendering features to be implemented quickly without a graphics programmer. It's also very easy to tank the GPU by doing a lot of pixel shader work with many passes to large render targets. These draw calls show up after typing 'ProfileGPU' in the debug console (under the WorldTick event).

Limitations

  • You cannot draw to a render target that is being sampled as a texture by the material you specify. Either use alpha blending to modify a render target in-place, or ping-pong between two different render targets.

  • Only the Emissive Color and Opacity outputs of the material are valid when drawing to a render target; lighting is not supported, and nodes like WorldPosition may have unexpected values.

  • Emissive Color is clamped to be positive by default, but you can output negative values by enabling the material property ‘AllowNegativeEmissiveColor’.

Check out the BlueprintRenderToTarget map in the ContentExamples project for working examples!

New: Alembic Importer for Vertex Animation (Experimental)

Alembic animation import is now supported! Alembic allows for complex animations to be authored offline, then rendered in real-time inside UE4! This feature is still considered experimental, but please try it out and send us feedback.

We allow importing Alembic caches in different ways:

  • Static Mesh. A single frame from the Alembic animation will be imported as a static mesh asset (no animation).

  • Geometry Cache. This is a new type of animation asset that allows playback of vertex-varying sequences. The imported Alembic animation will be played back as a flipbook of frames. Performance will scale with your mesh’s complexity and may not be optimal in all cases.

  • Skeletal Mesh. This is the most efficient way to play back an Alembic animation, as long as the vertex count doesn’t change. During import, your animation sequence will be compressed using a PCA scheme, in which common poses (bases) are extracted and weighted to compose the original animation during playback. The percentage or fixed number of bases used can be set during import to tweak the level of compression.

New: Mesh Preview Scenes

New functionality has been added to set up the scene used to preview Static and Skeletal Meshes.

A new Preview Scene Settings panel has been added to the Static Mesh and Skeletal Mesh editors. Here, you can set up multiple profiles (scenes) to preview your meshes, and each profile allows for changing:

  • Directional light (color, intensity, rotation)

  • Sky light (HDRI environment map, intensity, rotation)

  • Post processing settings (identical to post process volume)

We’ve also added some showcase functionality:

  • Manually rotate environment (hold K key) and directional light (hold L key)

  • Automatically rotate lighting rig (directional light and environment)

  • Easily hide the floor (O key) and environment (I key)

New: Mesh Decals

The new Mesh Decals feature lets you efficiently splat materials on top of your static meshes, effectively allowing you to smoothly layer different materials on top of one another. You can think of it as a second mesh that sits above the underlying geometry’s profile with its own topology and materials.

The above pillars are each created as a base mesh overlaid with a single detailed break mesh, as shown in the wireframe below. The left pillar uses a mesh decal to allow smoothly blended color, normals and roughness. The right-most pillar uses a masked material, just for comparison.

Unlike deferred decals there is no projection involved, so a typical decal mesh which is tightly coupled to the underlying surface may need to include a surface offset in the material. Also, be careful if you have LODs where the mesh decal geometry would interpenetrate the LOD mesh.

New: Widget Interaction Component

Using the widget interaction component, you can now simulate hardware input events with widget components in the world.

You can attach it like a laser pointer to any object in the world to interact with widgets; there are also some other options available for more customized behavior. When standard input comes to the player controller, you’ll instruct the interaction component to simulate a particular hardware input, such as Left Mouse Down/Up, over whatever widget the user happens to be hovering over at that time.

For users who were previously relying on clicking directly on Widget Components in the world using the mouse, that path is no longer supported. Users will need to attach an Interaction Component (In Mouse Mode) to their player, then forward input to the Interaction Component when the player receives it.
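As a rough C++ sketch of that forwarding (the helper function is hypothetical; you would call it from your own input handlers):

    #include "Components/WidgetInteractionComponent.h"
    #include "InputCoreTypes.h"

    // Rough sketch: forward a hardware click from the player to a widget interaction
    // component, simulating Left Mouse Down/Up over the widget it currently points at.
    void ForwardClick(UWidgetInteractionComponent* WidgetInteraction, bool bPressed)
    {
        if (bPressed)
        {
            WidgetInteraction->PressPointerKey(EKeys::LeftMouseButton);
        }
        else
        {
            WidgetInteraction->ReleasePointerKey(EKeys::LeftMouseButton);
        }
    }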

New: VR Project Template

We’ve added a new project template designed for Virtual Reality on desktop and console.

To access this, simply choose the new VR option in the New Project template selection window.

This Blueprint project has settings that are optimized to run in VR up to 90 frames per second. To support different types of controllers the template includes two methods of locomotion, split into two distinct Pawn Blueprints. The first is designed for gamepads while the second supports motion controllers. When using motion controllers, you can teleport to different locations and grab and throw objects. A C++ version of this template will follow in a future update. Also, mobile VR templates will be coming later too.

New: Custom Post-Process for Mobile

Custom Post-Process materials now can be used on Mobile devices! Here is a "TV Static" effect displayed using the mobile renderer.

  • This feature requires the "Mobile HDR" option to be enabled in your Project Settings.

  • Supports fetching from PostProcessInput0 (SceneColor) with the blendable locations ‘Before Tonemapping’ and ‘After Tonemapping’.

  • This feature does not currently work on older Android devices which require the ‘mosaic’ mode for HDR rendering.

  • Pixel depth information is not yet supported.

New: Lighting Channels on Mobile

Lighting channels now work in the mobile renderer! They allow you to selectively influence objects with specific lights, which is great for cinematics or advanced lighting rigs.

  • Multiple directional lights are supported in different channels.

  • Each primitive can be affected by only one directional light. The first lighting channel set on a primitive determines which directional light will affect it.

  • CSM shadows from stationary or movable directional lights are cast only on primitives with matching lighting channels.

  • Dynamic point lights fully support lighting channels.

New: Shader Model 5 Rendering for Mac

Mac Metal now has initial Shader Model 5 support enabled by default. This exposes all the available features of Metal on Mac OS X 10.11.6 that are applicable to Unreal Engine 4.

  • Implements the RHI thread & parallel translation features to parallelize render command dispatch.

  • Exposes support for Metal compute shaders.

  • Exposes asynchronous compute support on AMD GPUs.

  • Enables high-end rendering features previously unavailable on Mac, including:

    • High quality dynamic exposure (a.k.a. Eye Adaptation).

    • Compute-shader reflection environments - only available on discrete GPUs for 4.13.

    • DistanceField Ambient Occlusion - only available on discrete GPUs for 4.13.

    • DistanceField Shadowing - only available on discrete GPUs for 4.13.

New: Physical Animation Component (Experimental)

We’ve added a physical animation component that allows you to easily drive skeletal mesh animation through physical motors!

The component allows you to set motor strengths directly, as well as use pre-configured physical animation profiles which can be created and edited inside PhAT. The new "Physical Animation Profiles" feature in PhAT provides a powerful way to customize character physics in different game contexts, as well as fine-tune it for special animations.

You can create and edit different profiles within the PhAT tool, and then change between them easily at runtime. Check out the new "Apply Physical Animation Profile" and “Apply Physical Animation Settings” functions in Blueprints, which allow you to change physical animation behavior dynamically.
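As a hedged C++ sketch of an equivalent setup (the helper function, bone name and profile name are placeholders, and the C++ calls are assumed to mirror the Blueprint functions above):

    #include "PhysicsEngine/PhysicalAnimationComponent.h"
    #include "Components/SkeletalMeshComponent.h"

    // Rough sketch: bind the physical animation component to a skeletal mesh, apply a
    // profile authored in PhAT to every body below a given bone, and let those bodies
    // simulate physics.
    void EnablePhysicalAnimation(UPhysicalAnimationComponent* PhysicalAnimation,
                                 USkeletalMeshComponent* Mesh,
                                 FName RootBone = TEXT("pelvis"),       // placeholder bone name
                                 FName ProfileName = TEXT("HitReaction")) // placeholder profile name
    {
        PhysicalAnimation->SetSkeletalMeshComponent(Mesh);
        PhysicalAnimation->ApplyPhysicalAnimationProfileBelow(RootBone, ProfileName, /*bIncludeSelf=*/true);
        Mesh->SetAllBodiesBelowSimulatePhysics(RootBone, true);
    }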

New: Procedural Mesh Slicing

There is a new utility in Procedural Mesh Component which will ‘slice’ a Procedural Mesh at runtime using a plane.

After slicing, we support adding ‘capping’ geometry, and creating a second Procedural Mesh Component for the ‘other half’ if desired.

Also, Procedural Mesh now supports simple collision, so physics simulation can be enabled! Finally, we added a utility to copy data from a Static Mesh to a Procedural Mesh (‘Allow CPU Access’ flag must be set on the Static Mesh for this to work in cooked builds.)
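Here is a hedged C++ sketch of a runtime slice (the helper function and cap option are assumptions; the library calls are assumed to mirror the Blueprint slicing utility):

    #include "KismetProceduralMeshLibrary.h"
    #include "ProceduralMeshComponent.h"

    // Rough sketch: slice a procedural mesh along a plane, cap the cut, and keep the
    // other half as a new component that simulates physics (now possible thanks to
    // simple collision support). CapMaterial may be null if no cap material is needed.
    void SliceMesh(UProceduralMeshComponent* ProcMesh,
                   const FVector& PlanePosition,
                   const FVector& PlaneNormal,
                   UMaterialInterface* CapMaterial)
    {
        UProceduralMeshComponent* OtherHalf = nullptr;
        UKismetProceduralMeshLibrary::SliceProceduralMesh(
            ProcMesh, PlanePosition, PlaneNormal,
            /*bCreateOtherHalf=*/true, OtherHalf,
            EProcMeshSliceCapOption::CreateNewSectionForCap, CapMaterial);

        if (OtherHalf)
        {
            OtherHalf->SetSimulatePhysics(true);
        }
    }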

New: Mesh Painting in VR

You can now paint on textures and mesh vertices using the VR Editor.

This allows you to use motion controllers to paint on static meshes while immersed in VR. To use this feature, open the "Modes" window in VR, then click the “Mesh Paint” tab. Now simply select an object in the world, then aim and pull your trigger to paint! Pressure sensitivity is supported on your controller’s trigger, and you can hold the ‘Modifier’ button to erase instead of paint.

New: Foliage Painting in VR

In this release, you can use motion controllers to spray down foliage instances while in VR.

Select a foliage type, aim your laser and pull the trigger to paint foliage! You can hold the modifier button to erase foliage, and pressure sensitivity on the trigger is supported. Additionally, the Foliage Editor has been improved to be a bit more functional while in VR, although some features, such as the lasso and select tools, are still unavailable.

New: Color Picker in VR

The Color Picker window is now available in VR, so you can change color properties on lights and other Actors in your level. You can also use the Color Picker to select colors for Vertex Painting and Texture Painting in VR.

New: Play from VR Editor

To easily prototype your project it is now possible to play your game in VR from within the VR Editor. Press the "Play" button on the quick menu to start playing in VR! To instantly go back to the VR Editor, hold the Grip buttons on both controllers and squeeze both trigger buttons.

New: Improved VR Transform Gizmo

The VR Editor’s transform gizmo has been improved with better usability and new features!

Translating and rotating objects feels much more natural, and you can now uniformly scale objects or translate them along a single 2D plane in VR. We’ll continue to make improvements to VR gizmos in future releases.

New: VR Editor Flashlight

Using the Quick Menu, you can now add a flashlight to your controller, to light up dark parts of your scene or see how light interacts with different Materials.

New: Screenshots from VR Editor

You can now take screenshots right from VR!

New: Automatic Entry to VR Editing Mode

When the VR Editor is enabled, you can now enter and leave VR editing mode without having to use the VR button or escape manually! As long as the editor is in the foreground, you will automatically enter VR editing mode when you put on the headset, and leave it when you take the headset off.

There is a setting under VR in the Experimental section of Editor Settings that will allow you to turn off auto-entry if you prefer.

New: Sequencer Import/Export

Sequencer can now import and export CMX EDL files for interchange with non-linear editing packages.

The above shot shows a sequence that was exported to Adobe Premiere Pro. Each shot in a sequence will be written to a separate movie file which is referenced by the EDL file. Any sequencing changes made in Premiere can then be imported back into UE4’s Sequencer!

Sequencer’s ability to export HDR data in OpenEXR files has been expanded to give the user a choice of the color gamut used to encode the HDR data.

Finally, Sequencer now supports importing FBX animation directly to an object or track. You can also export animated tracks to FBX!

New: Sequencer Burn-ins on Renders

When rendering out your movie sequence, you can now configure a "Burn-in" for the exported images. This is very useful in dailies for identifying and tracking shots.

New: Media Framework Overhaul

The Media Framework API has been completely overhauled, with many new features! Media Framework allows you to embed live video and audio into your projects.

  • Playlist assets for playing multiple media sources in a row

  • Audio playback support has been added

  • Improved media file import workflow

  • Improved Blueprint integration

  • Performance improvements on several platforms

  • Pixel format conversion on the GPU

  • Support for dynamically changing video dimensions on some platforms

Android

  • Support for multiple audio tracks

  • HTTP Live Streaming (HLS) on devices supporting it (m3u8)

PlayStation 4

  • HTTP Live Streaming (HLS)

  • Improved playback controls (Pause, SetRate, etc.)

  • Media files can be pre-cached to memory

  • Opening media from FArchive

  • Ability to play multiple videos at once (may require increased memory pool settings)

Windows

  • H.264 is now supported

  • Better support for HTTP(S) and RTSP streams

  • Better error handling and logging

  • Stability and usability improvements

  • Graceful handling of non-standard & unsupported codecs

Notes

  • Early experimental macOS/iOS support (AvfMedia plug-in)

  • Experimental Linux support (via VlcMedia plug-in on Github)

  • Experimental Video-Over-IP support (via NdiMedia plug-in on Github)

  • Xbox One (MfMedia) and HTML5 are not supported yet

  • Integration with Sequencer/Video Recording is scheduled for 4.14

  • PlatformMediaSource asset is not implemented yet

New: Platform SDK Updates

In every release, we update the engine to support the latest SDK releases from platform partners. Also in this release, you can remote compile iOS/tvOS projects from Windows in the binary version of UE4!

  • iOS/tvOS: Code projects are now supported in the binary version of UE4 on Windows via the Unreal Engine Launcher. (You do need a Mac somewhere to remote compile on.)

  • Xbox One: Upgraded to August 2016 XDK

  • PlayStation 4: Upgraded to PS4 SDK 3.508.201

  • Oculus Rift: Updated to the Oculus 1.6 Runtime

  • SteamVR: Updated to OpenVR 1.0.2

  • Google VR: Added Google VR (Cardboard) support for iOS

  • OSVR: Updated to v0.6-1194-g0c54f5e

  • Android: Google Play Games native C++ SDK updated to 2.1

  • Android: Google Play Services updated to 9.2.0

  • Android: Supports running on Android 7.0 (Nougat)

  • Vulkan API: Updated SDK to 1.0.17.0 (for Android and Windows)

New: Improved Landscape Tessellation

Hardware tessellation on landscape is now much faster! Landscape will now only render hardware tessellation on the highest level-of-detail (LOD), fading out as the landscape approaches the second LOD. Subsequent LODs will no longer have tessellation enabled. This significantly improves performance of enabling tessellation for displacement or extra detail up close to the camera.

In the visualization above, the highest LOD (white) is rendered with tessellation enabled, while the other LODs (colors) are rendered without tessellation.

New: Animation Pose Assets

We have added a new type of animation asset called a Pose Asset. This contains a set of named bone poses, which you can blend additively, in a similar manner to blend shapes for vertices.

One use for this is to support facial animation, where either FACS (Facial Action Coding System) or viseme curves can drive poses. However, you can also use this system to create new animations by blending multiple poses.

Currently, you create a Pose Asset from an Anim Sequence using the Content Browser context menu or the Create Asset menu in Persona. When you create a Pose Asset, the pose names will be automatically generated. After that, you can rename each pose manually, or use clipboard paste to rename all of them at once.

Poses are driven by normal animation curves; as long as those curves exist in the animation, you can see them. In Persona, to preview a pose from a curve, you need to set the current Preview Pose Asset.

In the AnimGraph, you can use a Pose Blender Node (or Pose By Name) to output the pose based on the incoming curves.

To support this system, we have improved how we handle scale with additive blending. In the future we would like to support curves on other assets (e.g. Sound Waves) that can be used to drive Pose Assets and Morph Targets.

New: Pose Driver Animation Node (Experimental)

We have added a new Pose Driver node, which lets you drive a curve value (such as a morph target weight) based on a bone’s movement.

This uses an RBF (Radial Basis Function) to interpolate driven values based on the orientation of a target bone. You use a PoseAsset to define the target poses for the bone, and the desired curve values at each pose. This node can be used as a Pose Space Deformer, to drive corrective blend shapes based on the orientation of a bone.

New: Animation Node Pose Watching

Anim graph nodes can now be "watched" in Persona.

This allows you to see a representation of the pose being generated at any point in the anim graph dynamically. Multiple watches can be active at once, allowing you to compare poses at different points and find the exact point at which any errors in your current pose are introduced. This can be very useful for debugging complex Animation Blueprints, and previously would only have been achievable by connecting the node you wanted to view directly to the root node and recompiling the Blueprint.

New: Improved Scene Capture

Scene Captures have been improved to be more useful with the new ‘Blueprint Drawing to Render Targets’ feature!

  • Orthographic projections are now supported.

  • When Blueprints update a scene capture through the Capture Scene function, it happens immediately, allowing subsequent image processing with Draw Material to Render Target.

  • Opacity is now captured in alpha, which allows partial rendering in a scene capture and compositing into another scene later.

  • Various GBuffer attributes are now available to be captured, including depth.

  • Added ‘Hidden Actors’ and ‘Show Only Actors’ arrays which can be used to easily control what is rendered into a scene capture.

Game-specific effects like fog of war can be implemented by rendering meshes as visibility shapes into an orthographic scene capture, and then doing image processing with Draw Material To Render Target.
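A hedged C++ sketch of that kind of on-demand capture (the helper function, ortho width and actor list are placeholder assumptions):

    #include "Components/SceneCaptureComponent2D.h"
    #include "GameFramework/Actor.h"

    // Rough sketch: point an orthographic scene capture at a set of 'visibility shape'
    // actors and capture once on demand, rather than every frame.
    void CaptureVisibility(USceneCaptureComponent2D* Capture, const TArray<AActor*>& VisibilityActors)
    {
        Capture->ProjectionType = ECameraProjectionMode::Orthographic;
        Capture->OrthoWidth = 8192.0f;              // assumed world-space coverage
        Capture->ShowOnlyActors = VisibilityActors; // render only the visibility shapes
        Capture->bCaptureEveryFrame = false;        // we capture explicitly instead
        Capture->CaptureScene();                    // happens immediately, like the Blueprint node
    }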

New: Improved Refraction Shaders

There’s a new "Pixel Normal Offset" refraction mode which uses the vertex normal as a reference, and computes the refraction offset from how different the per-pixel normal is from the vertex normal. This is non-physical but allows refraction to be used on flat water surfaces.

Left: simple scene with no refraction

Center: default refraction method which causes an undesired constant offset for a water surface

Right: Pixel Normal Offset refraction method, which distorts based on the normal map difference

New: Texture Coordinates from Line Traces

We have added a project setting to support texture coordinate (UV) info from line traces.

The option is under Project Settings -> Physics -> Optimizations. When this is enabled, you can use the ‘Find Collision UV’ function to take a Hit Result and find the UV info for any UV channel at the point of impact. Enabling this feature does use extra memory, as a copy of UV information must be stored in CPU memory.
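For example, a C++ sketch of reading UVs back from a line trace might look roughly like this (the helper function is hypothetical, and it assumes the project setting above is enabled):

    #include "Kismet/GameplayStatics.h"

    // Rough sketch: trace against the world and read back the UVs at the hit location.
    // Complex (per-triangle) collision data is needed for the face index used by the lookup.
    bool GetHitUV(UWorld* World, const FVector& Start, const FVector& End, FVector2D& OutUV)
    {
        FHitResult Hit;
        FCollisionQueryParams Params;
        Params.bTraceComplex = true;
        Params.bReturnFaceIndex = true;

        if (World->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility, Params))
        {
            return UGameplayStatics::FindCollisionUV(Hit, /*UVChannel=*/0, OutUV);
        }
        return false;
    }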

New: Spline Editing Improvements

Editing Spline Component defaults in the Blueprint Editor

Now it's possible to set Spline Component points in the Blueprint Editor using the standard spline visualizer editing features. New instances of the Blueprint will be created with these defaults, although these too can be overridden on a per-instance level once placed.

The "Reset to Default" context action on the spline visualizer will set an instance back to the Blueprint default. Any changes made to the Blueprint default will be propagated to any instance whose spline points have not been subsequently edited.

Numerical editing of spline points in the Details panel

Previously, there was no way to precisely place spline points and assign scale, roll or tangents to them. Now these properties are exposed in the Details panel for selected spline points:

New spline point properties

Spline points can now be defined with distinct arrive and leave tangents, and an arbitrary input key. The former allows for splines with discontinuities, while the latter allows for greater control of interpolation speed between points. This allows for greater versatility when designing spline paths.

In the spline visualizer, in the Spline Component details, there is an option in the context menu to allow the arrive and leave tangents to be edited separately instead of locked together:

As a consequence of being able to set arbitrary input keys per point, there is now also a way to specify the input key of the Loop Position for closed splines:

If input keys or a loop position are not specified, they will default to starting at 0.0 and incrementing by 1.0 for each point, as before.

Defer spline update in Blueprints

Sometimes it's desirable to build splines procedurally in a Blueprint construction script. Previously, every operation on a spline point would cause the spline to be rebuilt, but now, for optimization purposes, it's possible to specify whether the spline should be rebuilt following a spline point operation. There's also an explicit Update Spline node.
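The same deferral is available from C++; here is a rough sketch (the helper function is hypothetical):

    #include "Components/SplineComponent.h"

    // Rough sketch: rebuild a spline from an array of world-space points, deferring the
    // spline update until every point has been added, then updating once at the end.
    void RebuildSpline(USplineComponent* Spline, const TArray<FVector>& Points)
    {
        Spline->ClearSplinePoints(/*bUpdateSpline=*/false);
        for (const FVector& Point : Points)
        {
            Spline->AddSplinePoint(Point, ESplineCoordinateSpace::World, /*bUpdateSpline=*/false);
        }
        Spline->UpdateSpline(); // rebuild the spline once, instead of once per point
    }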

Input Spline Points to Construction Script

Sometimes it's useful to be able to hand-edit a spline with the spline visualizer, and then refine it with a Blueprint construction script. An example might be a Blueprint which locks all edited points to the surface of a sphere, like this:

This is now achievable by checking the "Input Spline Points to Construction Script" property:

New: Sub Animation Blueprints

You can now share animation logic by using a ‘Sub Anim Instance’ node within your Animation Blueprint to reference another Sub Animation Blueprint. This also allows you to break up large Animation Blueprints into separate assets, for example into ‘locomotion’ and ‘physics’ parts.

Member variables of the Sub Blueprint can be exposed as input pins on the node. The Sub Animation Blueprint must use the same Skeleton as the outer Animation Blueprint.

New: Animation Curve Viewer

We removed the Skeleton Curve tab from Persona and moved that functionality into the improved Animation Curves tab. Here you can now rename and delete curves, as well as preview curve data.

You can see all curves that belong to the current Skeleton, or only the currently active curves from the preview asset, and you can also filter by specific types of curves if you only want to see active curves. Please note that the default curve type is called "Attribute", so animation curves will be Attribute curves by default.

If you want to modify a curve value, you can either turn off the Auto check box or just type in the value.

New: Sprites in UMG Widgets

You can now use Paper2D Sprites as Brush inputs for UMG and Slate Widgets. In addition to allowing users to reference UI art that may have been developed as a sprite sheet, this permits widgets to be rendered more efficiently on platforms where the draw call budget is tight. Sprites that are part of the same texture atlas can be batched together in Slate, provided they all share the same layer when rendered.

New: Optimized Instanced Stereo Rendering for VR

There have been a number of improvements to instanced stereo rendering, including moving the velocity pass to use instanced stereo rendering. Multi-view support has also been enabled on the PS4, which leads to significant performance improvements when using the ISR path.

New: GPU Morph Targets

Projects can now enable calculating morph targets on the GPU on Shader Model 5 level hardware. This frees the CPU from performing those calculations.

New: Optimized Landscape Shader Memory

Landscape now compiles dramatically fewer shader combinations for landscape materials, improving first-load shader compile times, improving editor iteration and reducing memory usage and shader cache size.

New: Shadow Optimizations for Mobile

Optimizations have been added to the Combined Static and CSM shadow mode added in 4.12. In this mode, a stationary directional light casts static shadows from static objects and CSM shadows from dynamic objects. In 4.13, the appropriate shader is now automatically selected based on the bounds of the dynamic objects casting CSM shadows, and there is no longer any need to manually tag each primitive that will receive combined static and CSM shadows.

New: Landscape Import Plug-ins

A new plugin API has been added for landscape file formats, allowing developers to create plugins that add support for different heightmap and weightmap file formats to Landscape. The existing raw and PNG support has been converted to the new API; the PNG support in particular makes a good reference for implementing new landscape file format plugins.

New: Automation Testing for Android

The Project Launcher is now able to package and launch your project onto multiple Android devices simultaneously. The app running on each device will communicate back to your host PC over the USB cable and will appear in the Session Frontend window.

You can then launch Automated Tests on all the devices and see the results in the Session Frontend.

New: OpenGL ES 3.1 on Android

While UE4 has long supported many OpenGL ES 3.0 and 3.1 features on Android, you can now specifically target ES 3.1 on Android. This brings feature parity with Metal and Vulkan to higher-end Android devices and gives you access to 16 texture samplers as well as improved performance through the use of uniform buffers.

You can choose to package both ES 2.0 and ES 3.1 shaders for the same project, and the device will choose the best shader platform based on its capabilities.

New: Mobile Packaging Wizard

We have added a Mobile Packaging Wizard to help with packaging for mobile, where a minimal app without any content is uploaded to an app store and the rest of the content is downloaded from the cloud.

  • This type of packaging is common with larger games that have regular DLC updates

  • It also allows the user to download only the specific content required for their device, such as the texture or shader format their device needs

  • The wizard can be accessed from the Project Launcher window as an option when creating a new profile

New: Full Precision Materials on Mobile

Materials now have an option to use full precision in the pixel shader (the default is medium precision) when used on mobile devices. For example, this helps in cases where a material uses world coordinates in its computations.

New: Binary Shader Caching for Android

Compiled shaders will be stored on the disk on first use and then reused on subsequent application runs.

  • Requires GL_OES_get_program_binary extension

  • Disabled by default, can be enabled only on Android devices (r.UseProgramBinaryCache=1)

New: Localized Text Formatting Improvements

We’ve improved our localized text formatting to allow your translations to be more accurate.

Plural Forms:

  • Plural forms allow you to use different text based upon a numeric variable given to your text format. Plural forms may be cardinal, e.g. "There is 1 cat", "There are 4 cats", or ordinal, e.g. "You came 1st!", "You came 2nd!", etc.

  • Plural forms are specified as key->value pairs, and support any of the following keywords (as defined for your culture by the CLDR data): zero, one, two, few, many, other. Values are an optionally quoted string that may also contain format markers.

  • Cardinal Format Example: "There {NumCats}|plural(one=is,other=are) {NumCats} {NumCats}|plural(one=cat,other=cats)"

  • Ordinal Format Example: "You came {Place}{Place}|ordinal(one=st,two=nd,few=rd,other=th)!"

Gender Forms:

  • Gender forms allow you to use different text based upon an ETextGender value given to your text format, e.g. "Le guerrier est fort", "La guerrière est forte".

  • Gender forms are specified as a list of values in the order of masculine, feminine, neuter (neuter is optional). Values are an optionally quoted string that may also contain format markers.

  • Format Example: "{Gender}|gender(Le,La) {Gender}|gender(guerrier,guerrière) est {Gender}|gender(fort,forte)"

Hangul Post-Positions:

  • Hangul post-positions help you deal with the grammar rules present in Korean, and will insert the correct glyph(s) based upon whether the value being inserted ends in a consonant or a vowel, e.g. "사람은", "사자는".

  • Hangul post-positions are specified as a list of values in the order of consonant, vowel. Values are an optionally quoted string.

  • Format Example: "{Arg}|hpp(은,는)"

To allow you to pass in the numeric/gender values needed for plural/gender form support, all of the FText::Format(...) family of functions now take their values as FFormatArgumentValue rather than FText. This can be implicitly constructed from any numeric type, ETextGender, or FText.
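For example, a minimal C++ sketch of the cardinal plural format shown above (the localization namespace, key and helper function are placeholder assumptions):

    #include "Internationalization/Text.h"

    // Minimal sketch: plural-aware formatting using the cardinal example above.
    FText MakeCatCountText(int32 NumCats)
    {
        FFormatNamedArguments Args;
        Args.Add(TEXT("NumCats"), NumCats); // numeric types convert implicitly to FFormatArgumentValue

        return FText::Format(
            NSLOCTEXT("Example", "CatCount",
                "There {NumCats}|plural(one=is,other=are) {NumCats} {NumCats}|plural(one=cat,other=cats)"),
            Args);
    }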

The ability to set these value types in Blueprints has been exposed using wildcard pins on the "Format Text" node:
