VLAST is at the forefront of innovation within the entertainment industry in Korea. The agency creates real-time rendered virtual IP that merges advanced digital technology with traditional content formats.
As real-time 3D content creation technology evolves, the market for digital humans has rapidly expanded. In South Korea, PLAVE—a new and enormously popular virtual boy band—has been riding this wave for a little over a year.
PLAVE’s five members, Yejun, Noah, Bamby, Eunho, and Hamin, do their own songwriting, composition, and choreography. Since debuting in March 2023, they’ve become the first virtual idol group to reach number one on a major K-pop music show, and they recently held a sold-out two-day concert at Jamsil Indoor Stadium in Seoul, South Korea.
In this article, we’ll explore how Unreal Engine is used to produce the band’s music videos, livestream shows, and real-world concerts.
Real-time integrated pipeline for cinematic content
(Left) Traditional cinematic content production pipeline with embedded DCC tools / (Right) Unreal Engine-integrated cinematic content production pipeline.
VLAST, the agency behind PLAVE, recognized early on the potential of Unreal Engine to create experiences driven by virtual humans. With a vision to build a seamless virtual IP pipeline, VLAST set out to leverage Unreal Engine for both cinematic and live PLAVE content.
Initially, the team opted for a hybrid approach, using a mix of DCC applications and Unreal Engine to produce the music video for PLAVE’s debut song, ‘Wait for you.’ Unreal Engine handled level design, lighting, and rendering, while DCC tools covered effects and outfit simulations.
Because this hybrid approach ultimately slowed down overall content production, VLAST decided to bring more of the pipeline into Unreal Engine. Character modeling and rigging would still be done in DCC applications, but simulation and effects would now be produced in UE.
PLAVE's fifth single, ‘WAY 4 LUV’
Once they’d shifted the pipeline focus towards Unreal Engine, the virtual avatars’ outfits and hair could be simulated in real time, replicating the look and feel of performance in a live broadcast.
The team used Unreal Engine’s Take Recorder to capture sequences of the virtual avatars performing, and Niagara for visual effects.
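To give a sense of how lightweight those effect triggers can be, here’s a minimal sketch of firing a one-shot Niagara system from C++ during a performance. The helper and the petal-burst example are hypothetical; only UNiagaraFunctionLibrary::SpawnSystemAtLocation is engine API.

```cpp
#include "CoreMinimal.h"
#include "NiagaraFunctionLibrary.h"  // Requires the Niagara plugin module
#include "NiagaraSystem.h"

// Hypothetical helper: fire a one-shot stage effect (say, a petal burst).
// Only SpawnSystemAtLocation below is engine API; the wrapper is illustrative.
static void PlayStageEffect(UObject* WorldContext, UNiagaraSystem* EffectSystem, const FVector& Location)
{
    if (WorldContext && EffectSystem)
    {
        // Spawns an auto-activating, auto-destroying Niagara component at Location.
        UNiagaraFunctionLibrary::SpawnSystemAtLocation(WorldContext, EffectSystem, Location);
    }
}
```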
By the time the team was working on the band’s fifth single, ‘WAY 4 LUV’, all aspects of the music video—animation, look development, and more—were being rendered in real time in Unreal Engine, enhancing both workflow efficiency and visual quality.
Motion capture for authentic live performances
(Left) First broadcast / (Right) Recent broadcast
VLAST has spent a lot of time honing their expertise in creating authentic PLAVE performances for live broadcast.
They’ve been particularly interested in making the artists’ movements appear as natural as possible, developing various solutions to process the raw motion capture data streamed from Unreal Engine’s Live Link plugin.
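VLAST’s filters are their own, but a typical first step in this kind of processing is damping jitter in the streamed pose. The sketch below, a simple exponential smoothing of bone transforms using standard engine math types, is an illustrative assumption rather than VLAST’s implementation.

```cpp
#include "CoreMinimal.h"

// Illustrative mocap clean-up step: blend each incoming bone transform toward
// the previous frame to damp jitter. Alpha near 1 trusts the new frame;
// Alpha near 0 favors the previous pose.
static FTransform SmoothBoneTransform(const FTransform& Previous, const FTransform& Incoming, float Alpha)
{
    FTransform Result;
    Result.SetLocation(FMath::Lerp(Previous.GetLocation(), Incoming.GetLocation(), Alpha));
    Result.SetRotation(FQuat::Slerp(Previous.GetRotation(), Incoming.GetRotation(), Alpha).GetNormalized());
    Result.SetScale3D(Incoming.GetScale3D());
    return Result;
}
```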
VLAST built a number of tools in Unreal Engine to deliver smoother animated performances, including a solution for seamless animation transitions, a calibration tool that corrects marker position shifts, and an interference avoidance system to resolve mesh overlap.
Before and after the real-time interference avoidance solution
They introduced a dynamic FK/IK solution for fluid transitions between accurate poses and natural interactions with surroundings, and a foot IK solution that fine-tunes the width of the characters’ hips, sole depth, and foot width for realistic ground contact.
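To illustrate the kind of query a foot IK setup like this depends on, the hedged sketch below traces downward from a foot bone to measure the offset needed for clean ground contact. The helper and its parameters are assumptions; only the engine’s LineTraceSingleByChannel call is real API.

```cpp
#include "CoreMinimal.h"
#include "Engine/World.h"
#include "Engine/HitResult.h"

// Hypothetical ground probe for foot IK: trace down past the foot bone and
// return the vertical offset that would place the sole on the surface.
static float ComputeFootGroundOffset(UWorld* World, const FVector& FootLocation, float TraceHalfRange = 50.f)
{
    if (!World)
    {
        return 0.f;
    }

    FHitResult Hit;
    const FVector Start = FootLocation + FVector(0.f, 0.f, TraceHalfRange);
    const FVector End   = FootLocation - FVector(0.f, 0.f, TraceHalfRange);

    if (World->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility))
    {
        // Positive values lift the foot onto the surface; negative lets it drop.
        return Hit.ImpactPoint.Z - FootLocation.Z;
    }
    return 0.f;
}
```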
An integrated real-time pipeline
VLAST was using Unreal Engine for two key content streams: producing cinematic videos for album releases and holding weekly livestreams for fans.
Along the way, they realized there was no need to differentiate between the two pipelines. Whether the artists’ performances were shown in real time or in a recording, both pipelines were fundamentally the same, in that they started with live motion capture.
That discovery led VLAST to develop an integrated pipeline in which all content begins with live motion capture.
A fully integrated pipeline for cinematic and live content creation
To make the PLAVE character rigs equally suited to pre-recorded content and live performance, VLAST drew inspiration from MetaHuman character design and created a modular, unified character system that delivers both cinematic quality and live-performance flexibility.
The system separates skeletal meshes into a structured hierarchy, enabling each skeletal mesh component to switch seamlessly between modes: one for playing animations in Sequencer and another for real-time motion capture and simulation via Live Link.
Live performance-specific functions are modularized as separate components, allowing for flexible customization based on the live broadcast scenario. The system can switch smoothly between cinematic and live modes on demand.
(Left) Hierarchical structure and modular skeletal mesh components / (Right) Flexible mode switching at the component level
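In engine terms, a per-component switch like this can be as simple as changing which Animation Blueprint drives each skeletal mesh. The sketch below assumes two Anim Blueprint classes, one evaluating Live Link data and one for playback; both references are placeholders, not VLAST’s actual setup.

```cpp
#include "CoreMinimal.h"
#include "Components/SkeletalMeshComponent.h"
#include "Animation/AnimInstance.h"
#include "Templates/SubclassOf.h"

// Illustrative mode switch: both modes run an Animation Blueprint, and only
// the class differs. The two anim BP classes are hypothetical placeholders.
static void SetCharacterMode(USkeletalMeshComponent* Mesh,
                             TSubclassOf<UAnimInstance> LiveLinkAnimBP,
                             TSubclassOf<UAnimInstance> PlaybackAnimBP,
                             bool bLiveBroadcast)
{
    if (Mesh)
    {
        Mesh->SetAnimationMode(EAnimationMode::AnimationBlueprint);
        Mesh->SetAnimInstanceClass(bLiveBroadcast ? LiveLinkAnimBP : PlaybackAnimBP);
    }
}
```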
The team leveraged Unreal Engine’s Take Recorder feature to build a motion capture recording pipeline that enables them to go from motion capture to cinematic entirely within Unreal Engine. This eliminates the need to hop from one application to another to create animation.
Using Take Recorder, the team can see the results immediately and react to feedback directly in the field in real time—a boon for planning and directing.
(Left) Previous motion capture workflow / (Right) Integrated motion capture workflow with Take Recorder
To simplify and automate the process of coordinating multiple software programs on virtual production shoots for PLAVE performances, VLAST developed a tablet app called Virtual Slate, powered by Unreal Engine. This app connects all studio devices and software, enabling remote, synchronized control.
With a single tap of the record button, it starts recording in Unreal Engine, triggers the motion capture software, and captures multi-channel video and audio simultaneously. Once recording is complete, Virtual Slate automatically organizes file names and storage locations, and generates a preview scene.
(Left) Traditional operator communication / (Right) Enhanced operator communication
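The wire protocol behind Virtual Slate is VLAST’s own, but one plausible shape for the trigger path is a simple UDP command listener inside the engine. The sketch below is an assumption-heavy illustration: the port, the “REC” command, and the listener itself are invented, while FUdpSocketBuilder and FUdpSocketReceiver are real engine networking types.

```cpp
#include "CoreMinimal.h"
#include "Common/UdpSocketBuilder.h"   // Requires the Networking module
#include "Common/UdpSocketReceiver.h"  // Requires the Sockets module
#include "Interfaces/IPv4/IPv4Endpoint.h"

// Hypothetical Virtual Slate-style trigger: listen on UDP and react when the
// tablet sends a "REC" command.
static FUdpSocketReceiver* StartSlateListener(uint16 Port)
{
    FSocket* Socket = FUdpSocketBuilder(TEXT("VirtualSlateListener"))
        .AsNonBlocking()
        .BoundToPort(Port)
        .Build();

    if (!Socket)
    {
        return nullptr;
    }

    FUdpSocketReceiver* Receiver =
        new FUdpSocketReceiver(Socket, FTimespan::FromMilliseconds(100), TEXT("SlateReceiver"));

    Receiver->OnDataReceived().BindLambda(
        [](const FArrayReaderPtr& Data, const FIPv4Endpoint& /*Sender*/)
        {
            // Copy the datagram and null-terminate it so it reads as a string.
            TArray<uint8> Bytes(Data->GetData(), Data->Num());
            Bytes.Add(0);
            const FString Command = UTF8_TO_TCHAR(reinterpret_cast<const char*>(Bytes.GetData()));

            if (Command.StartsWith(TEXT("REC")))
            {
                // Here the real app would kick off Take Recorder, the mocap
                // software, and multi-channel A/V capture in lockstep.
            }
        });

    Receiver->Start();
    return Receiver;
}
```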
When it came to real-world live concerts, a big focus for VLAST was delivering an immersive experience for the audience. Another key objective was accurately capturing scale and perspective to make the audience feel they were watching real artists on stage.
To achieve this, the team had to reliably ascertain how their LED projections would look on the big screen on stage. They built a digital replica of the concert venue within Unreal Engine, matching its exact dimensions.
By projecting the final images onto the virtual LED screens and viewing them through VR from the audience’s perspective, the team could accurately adjust the scale and depth. This approach enabled VLAST to create a sense of presence from every seat even in the massive venue, making it feel as though the members of PLAVE were truly on stage.
(Left) VR simulation / (Right) A scene from the actual live concert
Unlocking creativity beyond 2D screens
VLAST set out to transform the concert experience from projecting visuals onto a flat, 2D LED screen to building an all-encompassing 3D environment.
This approach unlocked new creative possibilities, with the band singing while moving to and from impossibly high vantage points (think the top of a tall tower), seamlessly entering the stage on a motorcycle, or appearing and disappearing in a flurry of petals.
These are experiences only virtual artists can deliver. Beyond visual effects, combining digital elements with physical set pieces (like theatrical fog effects) helped ground the experience in the physical venue and enhance the sense of audience immersion.
PLAVE live concert fight scene.
The elaborate lighting and visual effects in PLAVE’s live concerts were powered by Unreal Engine’s DMX plugin. In collaboration with Metalocat, specialists in Unreal Engine concert lighting, the team at VLAST pre-configured the concert lighting layout within Unreal Engine. They used the DMX plugin to simulate and fine-tune lighting effects, creating a seamless blend of real and virtual stage lighting.
Stage lighting design in collaboration with Metalocat
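As a flavor of what driving fixtures from the engine looks like, here’s a hedged sketch using the DMX plugin’s fixture patch API. The attribute names depend on the fixture type (“Dimmer” and “Pan” are just examples), and the helper itself is an illustration rather than VLAST’s or Metalocat’s setup.

```cpp
#include "CoreMinimal.h"
#include "Library/DMXEntityFixturePatch.h"  // Requires the DMX Engine plugin

// Illustrative DMX control: push attribute values to a patched fixture.
// Attribute names ("Dimmer", "Pan") are fixture-type-specific examples.
static void SetMovingHead(UDMXEntityFixturePatch* Patch, int32 Dimmer, int32 Pan)
{
    if (!Patch)
    {
        return;
    }

    FDMXAttributeName DimmerAttr;
    DimmerAttr.Name = TEXT("Dimmer");

    FDMXAttributeName PanAttr;
    PanAttr.Name = TEXT("Pan");

    TMap<FDMXAttributeName, int32> Attributes;
    Attributes.Add(DimmerAttr, Dimmer);
    Attributes.Add(PanAttr, Pan);

    // Sends the values onto the patch's universe/channels via the DMX plugin.
    Patch->SendDMX(Attributes);
}
```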
VLAST’s work with Unreal Engine proves what’s possible for entertainment and live events using real-time technology. Game engines can power everything from music videos to online shows and real-world concert experiences, with the assets used for one element of a multi-faceted project seamlessly reusable in the next. For IP holders, the opportunities to impress and delight fans and audiences have never been greater.