How MTV Choreographika Leveraged UE4 to Bring Social Sentiment to Life
Have you ever wondered what social sentiment might look like if it were brought to life? For many, the ebb and flow of social interaction moves like a whimsical dance as users act and react in real time to a global conversation with streams of unique opinions given in varying degrees of intensity.
Well, that concept recently became a reality when a group of artists and technicians were commissioned to bring social sentiment surrounding a particular event to life.
The result was Choreographika - a project that took the social conversation around #MTVEMA and transformed it into an interpretive digital dance and interactive artwork. Through it, keywords, artist mentions and emotions from Twitter were translated into vision and movement in real time.
To find out more, we caught up with Silent, No Ghost and Friends Electric - the teams behind the collaborative artwork that is Choreographika.
Q: How did the original concept for Choreographika come about?
Nathan Prince (Creative Director @ Silent): An agency called Sunshine X was curating an MTV exhibition in Rotterdam that was due to open during the MTV EMA weekend. They approached Silent as one of 10 artists they were looking to commission to make an installation/exhibit.
Our brief was to look at the huge online Twitter conversation around the MTV EMAs and turn the mood and excitement into an exhibit. Our response could be anything from sculpture to sound installation, and we had free rein to make something progressive and ambitious.
Creatively we knew that our response had to react dynamically - in real-time - to a frantic world of Tweets and we knew we wanted the conversation to have an effect in the real, physical world.
This became a data visualisation project, yet the brief also gave us the opportunity to look at human emotion on a global scale. What was important for us was that it had a human, emotive quality, and it felt like dance was the most exciting, expressive response. Dance is a universal language, transcending borders, age, creed and cultures. So the idea was born - we would interpret the #MTVEMA conversation and fans’ emotion through a dancing character that reacts dynamically in real time, responding to the rate of hashtags, emotive keywords, sentiment and artist mentions.
Q: How big was the team that worked on this project and how long did you have to work on it?
Tom Flavelle (Designer @ No Ghost): The crew consisted of the following artists and technicians:
- Silent (Creative Direction and Audio) = Nathan Prince, Liam Paton, Andy Theakstone, Sarah Kelly; Lucy Ridley (dancer/choreographer)
- Friends Electric (Production) = Alex Webster, Dom Thompson-Talbot, Orlaith Turner
- No Ghost (Development) = Tom Flavelle, Jack Straw, Lawrence Bennett, Luke Gibbard
The entire project took five weeks from conception to delivery, including design, original music, motion capture and development.
Q: What were some of the biggest challenges you faced in realizing your vision for the project?
Jack Straw (Developer @ No Ghost): The tight deadline on this project required all of the elements of the production to be worked on in parallel. This would have been impossible if we had gone down the route of using pre-rendered loops, but as it was real time we could quickly integrate updated elements. The unknowable nature of using live data was also a challenge. For example, in our data Justin Bieber was an unexpected source of problems as he was mentioned 10 times more often than any other artist! This meant the dancer was constantly doing a Bieber dance that we had to dial back a bit.
Q: Why did you choose Unreal Engine 4 for this project?
Tom Flavelle: Unreal Engine was the obvious choice for this project. The project was displayed on a wall of nine 42” LED screens, so it needed to be output in as high a resolution as possible to maintain visual clarity when viewed close up. The final installation ran at 60fps in 4K resolution. We’re all originally from a film and animation background, so high-quality lighting is really important to us.
Unreal’s C++ API was instrumental in streaming the metrics data, collected live from Twitter, into the engine itself.
Our experience using motion capture to drive a character in Unreal Engine also helped inform our choice to incorporate different intensities of dance based on the rate of incoming Twitter mentions in real time.
Q: Were there any aspects of UE4 that you found particularly helpful throughout the production process?
Tom Flavelle: Unreal’s new Sequencer is a powerful tool that allowed me to animate tons of cameras in a short amount of time. Normally we’d author cameras in a 3D package and then bring them in, but with all the existing UE camera rigs, all of the controls you need are right there in the engine. Being able to grade and adjust all the post FX per shot was extremely useful for adding that final visual polish.
Q: Has your work on Choreographika inspired other ideas about visualizing data - whether it be social or otherwise?
Nathan Prince: Yes! I love the idea of data storytelling. Generally, I feel we crave meaning and a deeper kind of connection than the frantic world of Tweets and social networks offers, and if we can harness that world and present it in a different, more meaningful way, I find that really inspiring and thought-provoking.
Tom Flavelle: When the ‘Twitch plays Pokemon’ phenomenon happened it got us thinking about how we could engage communities in an interactive way. That idea has kept knocking around in the back of our heads and we want to do more projects in that area. There is a massive untapped potential for data visualisation in Unreal Engine. Hopefully in the future we’ll explore more of the possibilities of worldwide audience interaction.
Q: Where can folks go to learn more about your work?