April 30, 2020
Emotional documentary explores new compassionate possibilities of VR
In Meeting You, a grieving mother is reunited in virtual reality with her deceased daughter. The documentary, produced by Korean broadcaster Munhwa Broadcasting Corporation (MBC) and aired on February 6, delivered a powerfully emotional journey for the woman and for the audience who witnessed it.
The project started with a question: “What would you say to your loved ones in heaven if you could meet them once again?” After experimenting with different ideas, the production team decided to combine VR with a documentary approach.
The team was seeking participants for the project when they heard the story of Ji-sung Jang, a mother of four. Her third child, 7-year-old Na-yeon, tragically passed away in the fall of 2016. Ji-sung wished for a chance to cook miyeok-guk, a Korean seaweed soup traditionally served on birthdays, for her late daughter, and to tell her that she loves her and thinks about her daily.
After the project was confirmed, Vive Studios helped recreate Na-yeon in VR so that Ji-sung could share a moment with her late daughter.
Thanks to the efforts of the broadcasting team and Vive Studios, Ji-sung was reunited with Na-yeon in MBC’s virtual studio. After spending years missing her daughter, Ji-sung finally had a chance to tell her one more time that she loved her. Na-yeon seemed to delight in her special meal, made a birthday wish, and then said, "I love you, mom." The moment ended as the little girl transformed into a delicate white butterfly and drifted away.
MBC and Vive Studios spent seven months creating the experience for Ji-sung, allowing her to feel like she was able to spend one more precious moment in time with her child.
We spoke to MBC producer Jong-woo Kim and to Vive Studios, a creative studio that provides production technology for VR, AR, VFX, and film, about the project and how it presented new opportunities and a new direction for VR beyond typical entertainment uses.
Combining VR with a documentary was a very original idea. What was the inspiration for producing Meeting You?
MBC Producer Jong-woo Kim: The motif of reuniting with family members who passed away came to mind while I was planning a new program. I was inspired, in part, by photorealistic CG renders created by a game engine. The CG images were very realistic yet transcendental, which seemed to align with the theme of our project.
We decided to develop in VR because linear CG footage wouldn’t be any different from viewing existing videos, no matter how high the quality. VR allows the participant to interact with their counterpart, which delivers a completely different experience. For this reason, we believed it would be possible to give the impression of actually “meeting” someone. We also had to choose between VR and AR, and eventually decided on VR for its immersive nature.
Vive Studios: When the broadcasting station approached us with their project, we thought it would give us the opportunity to go beyond VR technology that [typically] served as a form of entertainment up until now. The project would allow us to create something with which the general public could emotionally resonate. The fact that our technology can comfort and touch viewers was enough of a reason for us to take part in the project, disregarding other factors like revenue. It also allowed us to further pioneer a new frontier in the VR realm, which is heavily characterized by its entertainment purposes.
What was the most challenging part of the production process?
Kim: At first, we lacked the assurance that our work could become a genuine reality for someone, rather than merely an entertaining experiment. Another challenge was that we were unsure whether any family would be brave enough to participate.
After the project progressed to a certain extent, we faced technical limitations. For example, enabling uninhibited facial expressions while the character moves around was harder than we imagined, and synthesizing her voice using deep learning was also a difficult task. Many difficulties arose right up to the day of the VR experience, but in retrospect, the most challenging aspect was technically restoring the unique characteristics of Na-yeon as a person.
Also, we put a lot of thought into how to arrange the encounter and where it would take place. Unlike with films, we were cautious about letting any of the filmmaker's intent get involved. Our team gained the trust of the participating family, aiming to create the experience based solely on the family's memories, even if it meant having less emotional impact. Through this approach, the family also opened up to us.

Vive Studios: At the beginning of the project, we proceeded with our existing methods because it was our first time working on a project that required such a sensitive and emotional approach. We integrated voice recognition and AI to make Na-yeon react automatically, and tested out engaging interactions between Na-yeon and her mother, such as taking a picture or drawing together.
However, several pilot tests led us to recognize the need for an extremely cautious approach. We realized that it isn’t easy for a bereaved mother to stay calm and think logically through various tasks in an unfamiliar virtual environment. We also felt immense pressure knowing that we had just one opportunity to film; capturing the dramatic moment of the mother and daughter's reunion left no possibility of multiple takes.
What did you focus on the most to create a “VR experience tailored for one”?
Kim: A VR experience tailored for one person required an analysis of intimate memories that only that person knows. We focused on analyzing and replicating Na-yeon’s overall impression, behavior, and facial expressions based on the recollections of her mother. Although limited technology and time made it difficult to depict all of these details realistically, we tried our best to create a digital human based on memory, albeit within our limitations.
What was the production process?
Kim: It was important to give the mother the impression that she had entered her own memories. Many aspects of Na-yeon’s image were based on the mother’s memory, and the setting was a park the mother recalled visiting with Na-yeon. The entire point of the story was to show that Na-yeon was alive and well in her mother's memory as well as in heaven, so that we could console the mother even though the experience was artificial.
From a technical perspective, we aimed to give the impression of actually spending time with a person by mixing pauses and interaction into the unfolding story. This is the kind of interactivity all VR filmmakers put a lot of thought into. With more technical advancements, it will be possible to feature much more interaction and more open-ended stories.
Vive Studios: During production, we set three different standards in consideration of the aforementioned difficulties.
First, we tried our best to avoid factors that might interfere with the mother’s emotions. This meant eliminating automated features such as voice recognition or AI-driven character reactions, which were considered in the early stages of planning, because they had a chance of failing and not giving us the results we were looking for. Instead, we used motion capture to create a set of animation clips that portrayed Na-yeon’s everyday behavior in great detail and applied them according to the scenario. Between animations, we inserted idle loops so that we could adjust the timing between idling and progressing through the scenario, depending on the state of Na-yeon’s mother.
Also, we added simple interactions, such as blowing out candles and Na-yeon reacting naturally when her hair is caressed, in place of more complex interactions. Here, eye contact was key. We used an Aim Offset so that Na-yeon makes natural eye contact with her mother while moving through the set of animations.
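In Unreal Engine, this kind of setup is typically driven from an Animation Blueprint whose variables are computed in a C++ UAnimInstance. The sketch below is not Vive Studios' code; it is a minimal illustration, with hypothetical names, of how an operator-gated idle loop and Aim Offset-driven eye contact toward the mother's head position could be wired up.

```cpp
// NayeonAnimInstance.h -- hypothetical names; a minimal sketch, not the production code.
// The Animation Blueprint's state machine reads bHoldIdle to stay in an idle loop until the
// operator advances the scenario, and feeds AimYaw/AimPitch into an Aim Offset node.
#pragma once

#include "CoreMinimal.h"
#include "Animation/AnimInstance.h"
#include "NayeonAnimInstance.generated.h"

UCLASS()
class UNayeonAnimInstance : public UAnimInstance
{
	GENERATED_BODY()

public:
	// World-space position of the mother's HMD, pushed in every frame from gameplay code.
	UPROPERTY(BlueprintReadWrite, Category = "EyeContact")
	FVector MotherHeadLocation = FVector::ZeroVector;

	// Operator-controlled pacing: while true, the state machine keeps looping the idle clip;
	// when set to false, it transitions to the next scenario animation.
	UPROPERTY(BlueprintReadWrite, Category = "Scenario")
	bool bHoldIdle = true;

	// Angles (in degrees, relative to the character's facing) that drive the Aim Offset.
	UPROPERTY(BlueprintReadOnly, Category = "EyeContact")
	float AimYaw = 0.0f;

	UPROPERTY(BlueprintReadOnly, Category = "EyeContact")
	float AimPitch = 0.0f;

protected:
	virtual void NativeUpdateAnimation(float DeltaSeconds) override
	{
		Super::NativeUpdateAnimation(DeltaSeconds);

		const APawn* Pawn = TryGetPawnOwner();
		if (!Pawn)
		{
			return;
		}

		// Rotation needed to look from the character toward the mother's head,
		// expressed relative to the character's current facing.
		const FRotator LookAt = (MotherHeadLocation - Pawn->GetActorLocation()).Rotation();
		const FRotator Delta = (LookAt - Pawn->GetActorRotation()).GetNormalized();

		// Interpolate so the gaze follows smoothly instead of snapping.
		AimYaw = FMath::FInterpTo(AimYaw, Delta.Yaw, DeltaSeconds, 6.0f);
		AimPitch = FMath::FInterpTo(AimPitch, Delta.Pitch, DeltaSeconds, 6.0f);
	}
};
```

Driving the pacing from a human operator rather than voice recognition mirrors the team's decision to remove anything that could fail during the single take.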
The second key point was configuring devices and optimizing the programs in order to deliver a natural VR experience for Na-yeon’s mother. To meet broadcasting standards for graphics quality and performance, we chose HMD equipment powered by high-end PCs, but achieving high rendering performance was not an easy endeavor. Various environment props such as grass had to be rendered across a vast space, and Na-yeon’s skin, hair, outfit, and overall appearance had to look realistic even when viewed up close while remaining lightweight to render. To achieve this, we continuously optimized the modeling and the Unreal Engine material setup, and actively utilized the upsampling feature.
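The interview doesn't specify which upsampling path was used. As an illustration only, the snippet below shows the standard Unreal Engine console variables for rendering at a reduced internal resolution and letting temporal AA upsample to the output resolution (for the HMD view itself, vr.PixelDensity plays the equivalent role).

```cpp
// Illustrative only: standard Unreal Engine console variables for temporal upsampling.
// The exact variables and values the team used are not documented in the interview.
#include "CoreMinimal.h"
#include "HAL/IConsoleManager.h"

void EnableTemporalUpsampling()
{
	// Render the scene at roughly 70% resolution per axis...
	if (IConsoleVariable* ScreenPercentage =
			IConsoleManager::Get().FindConsoleVariable(TEXT("r.ScreenPercentage")))
	{
		ScreenPercentage->Set(70.0f);
	}

	// ...and let temporal AA upsample the result to the display resolution.
	if (IConsoleVariable* TemporalUpsampling =
			IConsoleManager::Get().FindConsoleVariable(TEXT("r.TemporalAA.Upsampling")))
	{
		TemporalUpsampling->Set(1);
	}
}
```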
For the HMD, we used a wireless VR module so that the mother could move around the virtual space untethered, along with special gloves that delivered a sense of warmth when she embraced Na-yeon. An external fan was also programmed in Unreal Engine so that it could be controlled to deliver the experience of a windy outdoor environment.
Finally, we provided various perspectives so that the mother’s experience could be delivered effectively to viewers. The mirrored VR footage from the mother’s point of view wasn’t suitable for the documentary: it was too shaky and too low-resolution, and would easily cause motion sickness. We resolved this by using the Unreal Engine-powered Zero Density solution, which allowed us to show third-person footage of the mother composited with the CG background in real time as the final broadcast footage. The VR footage and the third-person footage were rendered by two separate PCs, which had to show matching scenes from the same position, so we developed a solution to sync the space and internal settings between the two programs. This let the footage from the mother’s perspective and the footage from a separate camera be composited in real time and shared with viewers. To achieve natural camera work, a handheld camera with a spatial tracking sensor was used to film the virtual space designed in Unreal Engine as if it were an actual location.
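The team doesn't detail how the two PCs were kept in sync. As a rough sketch only, with hypothetical names and the assumption of a simple UDP channel, a component like the one below could broadcast the tracked camera's transform every frame from the VR PC so the compositing PC can place its virtual camera at the same position.

```cpp
// CameraSyncSender.h -- hypothetical component, not the production code.
// Broadcasts the owning (tracked camera) actor's transform over UDP every frame so a second
// Unreal instance on the compositing PC can drive its virtual camera from the same position.
// Requires the "Sockets" and "Networking" modules.
#pragma once

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "Sockets.h"
#include "SocketSubsystem.h"
#include "Common/UdpSocketBuilder.h"
#include "CameraSyncSender.generated.h"

UCLASS(ClassGroup = (Custom), meta = (BlueprintSpawnableComponent))
class UCameraSyncSender : public UActorComponent
{
	GENERATED_BODY()

public:
	UCameraSyncSender()
	{
		PrimaryComponentTick.bCanEverTick = true;
	}

	// Address of the compositing PC -- placeholder values.
	UPROPERTY(EditAnywhere, Category = "Sync")
	FString TargetIp = TEXT("192.168.0.2");

	UPROPERTY(EditAnywhere, Category = "Sync")
	int32 TargetPort = 7777;

protected:
	virtual void BeginPlay() override
	{
		Super::BeginPlay();

		Socket = FUdpSocketBuilder(TEXT("CameraSyncSender")).AsNonBlocking().Build();

		RemoteAddr = ISocketSubsystem::Get(PLATFORM_SOCKETSUBSYSTEM)->CreateInternetAddr();
		bool bIsValid = false;
		RemoteAddr->SetIp(*TargetIp, bIsValid);
		RemoteAddr->SetPort(TargetPort);
	}

	virtual void TickComponent(float DeltaTime, ELevelTick TickType,
	                           FActorComponentTickFunction* ThisTickFunction) override
	{
		Super::TickComponent(DeltaTime, TickType, ThisTickFunction);

		if (!Socket || !RemoteAddr.IsValid())
		{
			return;
		}

		// Pack the camera's location and rotation into a fixed-size packet.
		const FTransform CamXf = GetOwner()->GetActorTransform();
		const FVector Loc = CamXf.GetLocation();
		const FRotator Rot = CamXf.Rotator();
		const float Packet[6] = {
			(float)Loc.X, (float)Loc.Y, (float)Loc.Z,
			(float)Rot.Pitch, (float)Rot.Yaw, (float)Rot.Roll
		};

		int32 BytesSent = 0;
		Socket->SendTo(reinterpret_cast<const uint8*>(Packet), sizeof(Packet), BytesSent, *RemoteAddr);
	}

	virtual void EndPlay(const EEndPlayReason::Type EndPlayReason) override
	{
		if (Socket)
		{
			Socket->Close();
			ISocketSubsystem::Get(PLATFORM_SOCKETSUBSYSTEM)->DestroySocket(Socket);
			Socket = nullptr;
		}
		Super::EndPlay(EndPlayReason);
	}

private:
	FSocket* Socket = nullptr;
	TSharedPtr<FInternetAddr> RemoteAddr;
};
```

In production, the same kind of link would also need to carry the lens data and program-state settings the team mentions syncing; this sketch covers only the camera transform.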
Why did you choose Unreal Engine for Meeting You?
Vive Studios: Terrestrial broadcasting has restricted production times and budgets. In order to quickly produce high-quality interactive graphics, we decided that Unreal Engine was a great fit.
We had prior experience with Unreal Engine, so we knew it could create sophisticated real-time graphics and let us author shaders and logic in a highly intuitive way. Another reason we chose Unreal Engine was to shorten production time using free resources offered by Epic Games and assets from the Unreal Engine Marketplace.
Where was Unreal Engine used in production?
Vive Studios: Quixel’s Megascans library was very useful for placing various environment props within the large space to deliver a believable VR experience for Ji-sung.
When restoring the child as a digital human, we had trouble making her skin texture and wrinkles photoreal because our initial results looked too smooth. Unreal Engine’s documentation on digital humans served as a reference that helped us set up the subsurface profile for light scattering through skin and author shaders for minute pore details.
Na-yeon’s movement was built using features like Animation Blueprints and Aim Offsets for a natural blend between animation clips and real-time eye contact with Ji-sung. Unreal Engine was also used to control external devices such as the fans that simulated wind and the devices that simulated body warmth.
What have you learned from this project?
Kim: After the documentary aired, many other families sent us their stories and expressed their desire to meet their deceased loved ones. We were very grateful for their responses, but they also gave us a lot to think about. This project turned out to be far more difficult and larger in scale than we had imagined. Although we’re unsure whether we could turn it into a regular program, we have demonstrated the possibilities of leveraging technology for compelling stories. If we were to work on a similar project, we would want to take it to the next level.
Vive Studios: After Meeting You aired, international press such as Reuters and the BBC covered the project, and we were met with explosive interest and responses that surpassed our expectations. While many sympathized with Na-yeon’s mother, there were also mixed opinions about the scope of the technology’s application and its ethical boundaries.
We not only discovered the scale of influence that can be generated when cutting-edge technology meets human emotion, but also learned that we must approach sensitive topics with great discretion. Now seems to be the time for a public discussion on how to manage and distinguish between real life and virtual reality as such advanced technology becomes the norm in the near future.
What does the future hold for you and what are your next goals?
Vive Studios: Vive Studios will put more focus on technically recreating human experiences through various experiments at the forefront of technology. To achieve this, we will continue our digital human research and development using Unreal Engine. We also plan on developing a real-time virtual production pipeline so that we can use virtual reality technology to produce films and TV series.