Filmmaker Tim Richardson’s short film, Neon Rapture, takes Iris van Herpen’s futuristic fashion line into a stunning virtual world, blurring the line between couture and cinema. Learn how Unreal Engine helped make it happen.
Take a look at how three filmmakers used Unreal Engine to create three very different animated shorts that were featured in Short Nite, Fortnite’s in-game film festival.
Ready to unlock the secrets to animating realistic facial performances? Learn how to create real-time facial animation for your MetaHumans with Faceware Analyzer and Retargeter! Start these two partnered courses today.
Jeremiah Grant, Technical Product Manager at Epic Games, explains the mentality, implementation, and tools used to rig and animate ‘The Ancient One,’ the massive robot encountered at the end of the UE5 Early Access project, Valley of the Ancient.
What’s blue and lives in Xanadu? Cory Strassburger’s alter ego, Blu! Blu is busy building his empire in the Metaverse and documenting the process on YouTube, thanks to Unreal Engine, MetaHuman Creator, some amazing performance capture tech, and Cory’s very, very vivid imagination.
With the finish line just weeks away, real-time technology came to the rescue and enabled REALTIME to deliver two and a half hours of additional cinematic cutscenes in record time on Codemasters’ new F1 2021 game.
Prepare yourself for the most photorealistic pop performance yet. Using virtual production and digital human tech, Madison Beer, Sony Music, and Magnopus have set a new bar for virtual concerts, thrilling fans who had no choice but to stay at home.
With digital fashion opening up new doors for designers, find out how Epic MegaGrant recipients Gary James McQueen and Moyosa Media created Guiding Light, an all-digital fashion show, in Unreal Engine.
Epic Games is working with studio partners like Netflix to develop new collaborative filmmaking methods during this extended work-from-home period. The resulting process is a groundbreaking form of remote virtual production not previously possible with traditional digital creative tools.
Do more with Control Rig using the latest updates in 4.26. This sample project includes the new Female Mannequin, the ability to convert animation sequences to Control Rig animation, and more.
Discover how Mr Jingles, Lil’ Rey, and the rest of the characters in The Jolliest Elf were created by a team working remotely using iPhone facial capture in a real-time animation pipeline based on Unreal Engine.
Skydance Interactive Co-Founder and Chief Technologist Peter T. Akemann provides an overview of The Walking Dead: Saints & Sinners’ combat system and details some of the tricks, techniques, and key insights that the VR developer employed.
Animation features The Life of Our Lord and Golden Panda leveraged an innovative pipeline using the real-time technology of Unreal Engine. Learn more about the production process and the value Unreal Engine brings.
Cubic Motion, a leading provider of automated performance-driven facial animation technology, is joining the Epic family, extending our commitment to advancing the state of the art in the creation of digital humans.
Archiact Marketing Manager Renee Klint and Technical Art Director John Cruz walk users through how to create 360-degree game trailers that effectively portray the immersiveness of VR on 2D screens.
Real-time performance capture can transfer an actor’s entire performance to a digital character on the fly. Get an understanding of how these systems work, and which one is right for you, in our new white paper “Choosing a real-time performance capture system.”
Digital platform TwinSite is a virtual interactive world set in and around construction sites. Providing immersive VR and display-based training, it’s estimated to be up to four times more cost-efficient than traditional offsite training.
Ever dreamed of having your own performance capture rig? Learn about two facial capture systems you can pair up with body mocap for a complete in-house system.