In August, Hyundai Motor Group’s KIA Motors held “Carnival on AR,” its fourth-generation Carnival AR launch event, to showcase its latest vehicle lineup online, and Vive Studios powered the event using Unreal Engine’s AR capabilities. Vive Studios, a global storytelling studio based in South Korea, has delivered captivating digital experiences across VR, AR, and VFX, earning praise for works such as the VR documentary “Meeting You” and K-pop group IZ*ONE’s XR concert “ONEIRIC THEATER.” With the AR premiere livestream concluding successfully, Vive Studios has delivered once again.
The Carnival on AR project was planned as an online event to introduce customers to KIA Motors’ latest Carnival lineup, the model’s first new release in six years. Traditional offline launch events let physical cars be viewed and experienced in person, but given current social-distancing measures, KIA Motors opted to show the Carnival’s features and highlights in a unique way.
The successful virtual launch was livestreamed in real time to viewers around the world, breaking South Korea’s top retention record with an all-time high of 155,000 total views and up to 3,300 concurrent viewers. The event also showcased the possibilities of online AR launches powered by real-time technology and delivered a novel experience to customers through top-notch, realistic CG.
Vive Studios credits Unreal Engine as the foundation of the online launch’s success. Let’s take a deep dive into how Vive Studios used Unreal Engine to produce the Carnival on AR launch event.
Project concept
For the Carnival on AR project, Vive Studios created a new creative solution to showcase the energy of a live show, the interaction between the presenter and the vehicles, and the product’s brand philosophy. The Carnival’s philosophy, “Connecting Hub,” aims to connect spaces, people, information, experiences, and generations. To infuse this philosophy into the livestream, Vive Studios put the concept into practice by bringing a physical vehicle, an AR vehicle, and a VR vehicle into a single recording-studio space, connecting the three realities. Unreal Engine was then used to create the augmented-reality and virtual-reality components.
Technical goal
Vive Studios set out to achieve three main technical goals for the launch. The first was to render the vehicles with photorealistic quality so that they looked indistinguishable from the real cars. The second was to present a grand event to viewers and customers through AR and VR. To make a vast, empty space seem brimming with action, everything except the one physical vehicle was rendered in real time. Alongside the finely detailed, photorealistic AR vehicle, a third space was visualized in real time to extend the stage beyond the virtual LED walls.
Image courtesy of Vive Studios
The third goal was to create a control interface optimized for AR. AR production requires complex, large-scale collaboration because it relies on technology and equipment not commonly used in general filmmaking, such as tracking devices, real-time compositing programs, and synchronized PCs. It can also bring a number of unpredictable on-set factors, like conflicts between devices, changes in production direction, and presenter mishaps. A professional control interface was therefore needed to respond swiftly to urgent issues on set and keep production running smoothly. Vive Studios saw value in how Unreal Engine offers the freedom to build on existing features and respond quickly to on-set needs, and decided to develop V2Xr, an in-house AR solution based on Unreal Engine. This was made possible by Unreal Engine’s real-time rendering, synchronization features, and support for compositing external footage, along with the various plugins offered by Epic Games.
Composition of AR Solution, V2Xr
Vive Studios’ proprietary AR solution, V2Xr, comprises LiveComp and XR Player. LiveComp is a remote-control interface that commands the overall system and manages operations. XR Player is a UE-based visualization program for AR and LED walls. LiveComp and XR Player constantly exchange statuses and commands over the network to control the visualized shots. XR Player, in particular, leans on Unreal Engine’s robust real-time rendering to achieve the project’s technical goals.
Image courtesy of Vive Studios
XR Player
XR Player is a UE-based visualization program for AR compositing, LED wall graphics, and more. Various tasks are executed within XR Player, such as receiving control signals from LiveComp to change the Sequencer’s position and compositing footage for AR. XR Player is divided into two types according to purpose: AR Player and SC Player. AR Player composites the camera footage and CG imagery in real time. SC Player, on the other hand, instantly outputs real-time rendered graphics or general footage onto LED walls or projection walls. Each player runs independently on a separate rendering PC, and more can be added depending on the number of AR cameras or LED walls.
Image courtesy of Vive Studios
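To make the first of those tasks concrete, below is a minimal sketch of how a cue from LiveComp might drive a Level Sequence inside XR Player. The FLiveCompCue struct and FSequenceCueHandler class are illustrative assumptions; V2Xr’s actual message format and class layout have not been published.

```cpp
// Illustrative sketch: driving a Level Sequence in XR Player from a LiveComp cue.
// FLiveCompCue and FSequenceCueHandler are assumptions; V2Xr's real protocol is not public.

#include "LevelSequence.h"
#include "LevelSequencePlayer.h"
#include "LevelSequenceActor.h"

struct FLiveCompCue            // assumed message: where to start and whether to play
{
    float StartSeconds = 0.f;
    bool  bPlay = true;
};

class FSequenceCueHandler
{
public:
    void Initialize(UWorld* World, ULevelSequence* Sequence)
    {
        ALevelSequenceActor* SequenceActor = nullptr;
        Player = ULevelSequencePlayer::CreateLevelSequencePlayer(
            World, Sequence, FMovieSceneSequencePlaybackSettings(), SequenceActor);
    }

    // Called whenever LiveComp sends a cue over the network.
    void OnCueReceived(const FLiveCompCue& Cue)
    {
        if (!Player) { return; }

        // Jump the sequence to the requested position, then play or hold.
        // JumpToSeconds() is the UE 4.2x-era call; newer versions use SetPlaybackPosition().
        Player->JumpToSeconds(Cue.StartSeconds);
        if (Cue.bPlay) { Player->Play(); } else { Player->Pause(); }
    }

private:
    ULevelSequencePlayer* Player = nullptr;
};
```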
AR Player
The AR Player receives camera-movement information from external tracking devices and applies it to the CG camera, which is then composited in real time with the physical camera footage. During this process, the compositing method and the lens distortion calibration feature depend on the tracking device’s communication protocol. V2Xr supports Stype’s proprietary protocol as well as the universal FreeD protocol. With Stype’s protocol, compositing and lens calibration are applied through Stype’s Unreal Engine plugin. With the FreeD protocol, compositing and lens calibration are based on Unreal Engine’s Composure plugin.
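For context on the FreeD path, here is a rough, standalone sketch of decoding a FreeD Type D1 camera packet, assuming the commonly documented 29-byte layout. The scale factors and axis conventions vary by tracking vendor and rig, so the constants are placeholders to verify against the actual tracking system rather than a definitive implementation.

```cpp
// Illustrative FreeD "Type D1" decoder, assuming the commonly documented 29-byte packet.
// Scale factors (1/32768 degree, 1/64 mm) and axis conventions differ between vendors,
// so the constants below are placeholders to be calibrated against the tracking system.

#include <cstddef>
#include <cstdint>
#include <optional>

struct FreeDPose
{
    float   PanDeg, TiltDeg, RollDeg;   // camera rotation in degrees
    float   X_mm, Y_mm, Z_mm;           // camera position in millimetres
    int32_t ZoomRaw, FocusRaw;          // raw encoder values (lens-specific mapping)
};

// Reads a 24-bit big-endian signed integer starting at p.
static int32_t ReadS24(const uint8_t* p)
{
    int32_t v = (p[0] << 16) | (p[1] << 8) | p[2];
    if (v & 0x800000) { v -= 0x1000000; }   // sign-extend from 24 bits
    return v;
}

std::optional<FreeDPose> ParseFreeD_D1(const uint8_t* data, size_t len)
{
    if (len < 29 || data[0] != 0xD1) { return std::nullopt; }   // not a D1 message

    FreeDPose pose{};
    pose.PanDeg   = ReadS24(data + 2)  / 32768.0f;
    pose.TiltDeg  = ReadS24(data + 5)  / 32768.0f;
    pose.RollDeg  = ReadS24(data + 8)  / 32768.0f;
    pose.X_mm     = ReadS24(data + 11) / 64.0f;
    pose.Y_mm     = ReadS24(data + 14) / 64.0f;
    pose.Z_mm     = ReadS24(data + 17) / 64.0f;
    pose.ZoomRaw  = ReadS24(data + 20);
    pose.FocusRaw = ReadS24(data + 23);
    return pose;
}
```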
Footage Compositing and Lens Distortion Calibration
If the tracking device doesn’t have an Unreal Engine plugin, or a custom feature is needed, the FreeD protocol can be used to receive the camera status information and implement the compositing feature directly. Lens distortion calibration is also a necessary feature; for this, the Lens Distortion and Composure plugins included with Unreal Engine are essential to real-time AR compositing.
Image courtesy of Vive Studios
Using calibration data to correct lens distortion
An effective calibration method to correct lens distortion in AR compositing is critical and can be difficult to achieve, especially with B4-mount broadcast lenses. Their angle of view varies greatly with zoom, and significant lens distortion occurs, especially at wide angles. The focus position also influences the appearance of distortion, so calibration data must be collected across these conditions and the appropriate calibration values applied during compositing. To do this, AR Player uses the Lens Distortion plugin to create render targets that encode the distortion calibration values for each frame, and Composure uses the resulting images to warp the camera footage’s UVs and correct the distortion.
Image courtesy of Vive Studios
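The core of that workflow is a per-frame UV displacement derived from the lens calibration data. The sketch below illustrates the underlying idea with a simplified Brown-Conrady distortion model; in the actual setup this displacement is baked into a render target by the Lens Distortion plugin and consumed by Composure, rather than evaluated in C++ as shown here.

```cpp
// Simplified Brown-Conrady distortion, standing in for the per-frame displacement map
// that the Lens Distortion plugin writes to a render target for Composure to consume.
// K1/K2/P1/P2 come from the per-zoom, per-focus calibration data described above.

#include "Math/Vector2D.h"

struct FLensCalib
{
    float K1 = 0.f, K2 = 0.f;                           // radial coefficients
    float P1 = 0.f, P2 = 0.f;                           // tangential coefficients
    FVector2D PrincipalPoint = FVector2D(0.5f, 0.5f);   // normalized image center
};

// Maps an undistorted UV to the distorted UV at which the camera plate should be sampled.
FVector2D DistortUV(const FVector2D& UV, const FLensCalib& Calib)
{
    const FVector2D Centered = UV - Calib.PrincipalPoint;
    const float R2 = Centered.SizeSquared();
    const float Radial = 1.f + Calib.K1 * R2 + Calib.K2 * R2 * R2;

    const float X = Centered.X;
    const float Y = Centered.Y;
    const FVector2D Tangential(
        2.f * Calib.P1 * X * Y + Calib.P2 * (R2 + 2.f * X * X),
        Calib.P1 * (R2 + 2.f * Y * Y) + 2.f * Calib.P2 * X * Y);

    return Calib.PrincipalPoint + Centered * Radial + Tangential;
}
```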
CG Color Correction
On an AR set, color correction that lets the CG blend into the environment is very important. The tone of the virtual imagery should be matched to the physical lighting on set to achieve seamless, natural compositing. To do this, Vive Studios developed a feature that reads the color-tone information for each frame and immediately adjusts the basic post-process options available in Unreal Engine, such as white balance, gamma, and exposure. When this feature is used, keys are created on the LiveComp timeline and the color information is applied within Unreal Engine in real time, which let Vive Studios respond to color issues on set in a more straightforward way.
Image courtesy of Vive Studios
An image of LiveComp is shown on the left, with the Unreal Engine-based XR Player on the right.
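As a rough illustration of the color-correction feature, the sketch below applies a keyed color value to Unreal’s standard post-process settings on the CG camera. FLiveCompColorKey is an assumed message layout; the real LiveComp key format is not public.

```cpp
// Sketch: applying a per-frame color key from LiveComp to Unreal's post-process settings.
// FLiveCompColorKey is an assumed message layout; the real key format is not public.

#include "Camera/CameraComponent.h"

struct FLiveCompColorKey
{
    float WhiteTempKelvin = 6500.f;   // white balance
    float Gamma           = 1.f;      // applied uniformly to RGB here for simplicity
    float ExposureBias    = 0.f;      // EV offset
};

void ApplyColorKey(UCameraComponent* CGCamera, const FLiveCompColorKey& Key)
{
    if (!CGCamera) { return; }

    FPostProcessSettings& PP = CGCamera->PostProcessSettings;

    PP.bOverride_WhiteTemp = true;
    PP.WhiteTemp = Key.WhiteTempKelvin;

    PP.bOverride_ColorGamma = true;
    PP.ColorGamma = FVector4(Key.Gamma, Key.Gamma, Key.Gamma, 1.f);

    PP.bOverride_AutoExposureBias = true;
    PP.AutoExposureBias = Key.ExposureBias;

    CGCamera->PostProcessBlendWeight = 1.f;   // apply the overrides at full strength
}
```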
SC Player
SC Player is used to produce graphics that are projected directly onto large LED walls or projection walls. It is a player for in-camera filming: it receives external camera-movement information and renders the graphics in Unreal Engine in real time. For the KIA AR premiere livestream, however, physical LED walls were not installed; instead, SC Player’s output was fed to AR Player so that a large virtual LED wall could be visualized in augmented reality, making it seem as if the wall actually existed on the physical set.
Image courtesy of Vive Studios
Image courtesy of Vive Studios
When the AR Player’s and SC Player’s cameras are synchronized to move as one, the sense of visual space can be maximized. Like AR Player, SC Player can receive tracking information from external cameras in real time to drive the CG imagery, and it offers various ways to receive that data: it can use plugins provided by the equipment manufacturers or parse FreeD data directly. Another option is to synchronize the cameras by sending data such as camera position, rotation, and angle of view from AR Player as UDP packets over a socket channel, which SC Player then receives, as shown below. UDP socket communication carries less overhead than TCP/IP when streaming data every frame and can be implemented easily through Unreal Engine’s Simple TCP UDP Socket Client plugin. With this method, AR Player’s camera and SC Player’s camera stay fully synchronized and move together in unison.
Image courtesy of Vive Studios
Data transmitted in real time, such as focal length, aperture, and focus distance, is applied to the camera (location and angle excluded).
Image courtesy of Vive Studios
The interpolation function smoothly applies location and angle data.
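To picture the UDP hand-off, here is a rough sketch of AR Player sending its camera state to SC Player using Unreal’s built-in socket API (as an alternative to the Simple TCP UDP Socket Client plugin mentioned above). The packet layout, class names, and port handling are illustrative assumptions, not V2Xr’s actual format.

```cpp
// Sketch of the AR Player -> SC Player camera sync over UDP using Unreal's socket API
// (an alternative to the Simple TCP UDP Socket Client plugin mentioned above).
// The packet layout and class names are illustrative, not V2Xr's actual format.

#include "Common/UdpSocketBuilder.h"
#include "Sockets.h"
#include "SocketSubsystem.h"
#include "Serialization/BufferArchive.h"

struct FCameraSyncPacket
{
    FVector  Location;
    FRotator Rotation;
    float    FieldOfView = 60.f;
};

class FCameraSyncSender
{
public:
    bool Initialize(const FString& TargetIp, int32 TargetPort)
    {
        Socket = FUdpSocketBuilder(TEXT("ARPlayerCamSync")).AsNonBlocking().Build();

        TargetAddr = ISocketSubsystem::Get(PLATFORM_SOCKETSUBSYSTEM)->CreateInternetAddr();
        bool bValidIp = false;
        TargetAddr->SetIp(*TargetIp, bValidIp);
        TargetAddr->SetPort(TargetPort);
        return Socket != nullptr && bValidIp;
    }

    // Called every frame with the tracked AR camera pose.
    void Send(FCameraSyncPacket Packet)
    {
        if (!Socket) { return; }

        FBufferArchive Buffer;
        Buffer << Packet.Location << Packet.Rotation << Packet.FieldOfView;

        int32 BytesSent = 0;
        Socket->SendTo(Buffer.GetData(), Buffer.Num(), BytesSent, *TargetAddr);
    }

private:
    FSocket* Socket = nullptr;
    TSharedPtr<FInternetAddr> TargetAddr;
};
```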
Virtual LED Wall Display
When SC Player’s footage is projected directly onto a physical LED wall, the frustum covering the camera’s filming area must be defined and anamorphic distortion applied to match the physical LED wall’s shape before projection. In this project, however, SC Player’s output footage was delivered directly to AR Player and used as the virtual LED wall, as shown below. The delivered footage was masked into the shape of the LED wall and composited together with the other AR components and the external footage. This virtual LED wall process lets the AR components and the footage on the LED wall respond instantly to the physical camera’s movements, creating the impression of a whole new space stretching beyond the bounds of the LED wall. In addition, two separate spaces with different lighting, fog, and post-process settings can be displayed simultaneously in one scene.
Image courtesy of Vive Studios
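One simple way to picture that hand-off is sketched below: the SC Player’s frame arrives as a texture and is displayed on a wall-shaped mesh in the AR scene through a dynamic material instance. The material and its parameter names are hypothetical; the production setup relies on a more involved Composure graph.

```cpp
// Sketch: showing the SC Player's received frame on a virtual LED wall mesh in the AR
// scene. "LEDWallTexture" and "WallMask" are hypothetical parameter names on a simple
// unlit material; the production Composure setup is more involved.

#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

void SetupVirtualLEDWall(UStaticMeshComponent* WallMesh,
                         UMaterialInterface* WallMaterial,
                         UTexture* SCPlayerFrame,     // frame received from SC Player
                         UTexture* WallShapeMask)     // mask in the shape of the LED wall
{
    if (!WallMesh || !WallMaterial) { return; }

    UMaterialInstanceDynamic* MID =
        UMaterialInstanceDynamic::Create(WallMaterial, WallMesh);

    MID->SetTextureParameterValue(TEXT("LEDWallTexture"), SCPlayerFrame);
    MID->SetTextureParameterValue(TEXT("WallMask"), WallShapeMask);

    WallMesh->SetMaterial(0, MID);
}
```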
Conclusion
During the preparation stage, Vive Studios considered a number of methods to deliver high-quality, Unreal Engine-based AR graphics most effectively for the KIA Carnival AR premiere livestream. The studio focused on operational efficiency with features such as the linked remote-control program LiveComp, Composure-based lens distortion calibration and compositing, and instant on-set color correction. The virtual LED wall display method also created a greater sense of depth in the space. The result was a success despite the short preparation period because Vive Studios actively took advantage of the many features and plugins offered by Unreal Engine, built on that foundation by quickly developing and testing additional features as needed, and iterated on this process.
As a result, Vive Studios delivered an AR/VR solution service to customers around the world through the AR premiere livestream. The team combined the expertise of experienced computer graphics designers and AR/VR R&D engineers and used the momentum to advance its technology to new heights. Because Unreal Engine is used across so many fields, this project has been an invaluable experience that opens up opportunities to provide, and keep improving, this kind of service for streamed events and exhibitions in all types of industries. It also serves as a foundation for Vive Studios to continue innovating moving forward.
Interested in learning more about this project as well as Vive Studios’ creative solutions? Visit vivestudios.com for more information.
For more on how Unreal is helping power virtual reality and augmented reality experiences, check out our VR/AR hub.