Hello Internet, I am Russell Grain. My team and I work for the Australian company Opaque Multimedia, where we have been working on the Kinect 4 Unreal plugin for a little over six months. Kinect 4 Unreal is a plugin that gives Unreal Engine 4 access to the full functionality of the Kinect 2 sensor. We recently released version 1.1 of the plugin on our website, along with a collection of associated content examples on the Epic Marketplace.
What is Kinect 2?
For those of you who haven’t gotten the chance to use one before, the Kinect 2 is a sensor suite that uses a combination of colour, infrared and time-of-flight cameras to create a detailed map of the world. It also houses an array of microphones that is used to localise any sound it picks up.
Principally, it is used to track players according to skeletal information that it infers from its various sensors. Skeletal tracking is really just the tip of the iceberg of what the Kinect 2 is capable of, but it should provide a good overview of what it is most commonly used for.
How it All Started
Kinect 4 Unreal started as an internal tool for a number of projects that called for a natural motion interface, that is, an interface where the only input is the user’s body motion. The engineering team systematically pored over every part of the Kinect 2 SDK to ensure that the artists and level designers had access to everything the Kinect 2 could possibly give them, in the way they expected to use it.
I remember when I first sat down to use K4U. I had seen one of our engineers working with the Kinect in Unreal, posing and dancing in his free time, so I took him aside to ask if he could set me up with something to play with, specifically hand states and hand positions. These were very early days, and many of the features we have now did not yet exist, but I did have access to what I needed.
I wanted to be able to throw fireballs.
Based on my experience developing Kinect applications with earlier versions of the technology and in other engines, I set aside the rest of the day to create that demo. To my genuine surprise, half an hour and a dozen nodes later, I had become a wizard with the ability to conjure explosions with my bare hands and throw them around, sending stacks of boxes flying about the level. No other work was done that day in the Opaque Multimedia offices, because, let’s face it, nothing you do in a workplace is quite as awesome as throwing fireballs.
I expected I’d need a day or more, when in reality, it only took me thirty minutes. I didn’t have to do the work because it had already been done.
Blueprint for the Fireball demo.
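For the curious, the whole graph boils down to a per-tick check on one hand: a closed fist “charges” the fireball, and opening the hand releases it with a velocity taken from the hand’s recent movement. Below is a rough C++ sketch of that logic. To be clear, GetHandState(), GetHandLocation(), EKinectHand, EHandState and AFireball are illustrative stand-ins of our own, not the plugin’s actual Blueprint nodes or classes.

```cpp
// Sketch of the fireball demo's per-tick logic. All Kinect accessors here
// are hypothetical stand-ins for K4U's Blueprint nodes, not its real API.
void AFireballPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    const EHandState HandState    = GetHandState(EKinectHand::Right);    // Open / Closed / Lasso
    const FVector    HandLocation = GetHandLocation(EKinectHand::Right); // world-space hand joint

    // A fist-to-open transition is the "throw" gesture.
    if (PreviousHandState == EHandState::Closed && HandState == EHandState::Open)
    {
        // Frame-to-frame hand displacement over delta time approximates throw
        // velocity, so the fireball flies harder the faster you fling your arm.
        const FVector ThrowVelocity = (HandLocation - PreviousHandLocation) / DeltaSeconds;

        if (AFireball* Fireball = GetWorld()->SpawnActor<AFireball>(
                FireballClass, HandLocation, ThrowVelocity.Rotation()))
        {
            Fireball->ProjectileMovement->Velocity = ThrowVelocity;
        }
    }

    PreviousHandState    = HandState;
    PreviousHandLocation = HandLocation;
}
```

In the actual Blueprint, each of those steps is a node or two, which is why the whole demo came together in about a dozen nodes.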
Empowering Users
The development of the fireball demo highlights one of the core principles that underlies every aspect of Kinect 4 Unreal: we wanted to make it accessible to anyone, regardless of coding experience.
As development progressed and we exposed more and more of the Kinect 2’s functionality, we also began taking a closer look at how we could make the data useful for people who want to turn it into games. So we turned to our design team and asked them to build, entirely by themselves, a suite of demo games, and we iterated on their feedback.
Spydrone camera control demo from content examples. A crowd favourite.
Through this process, we gradually refined our interfaces and the means through which we expose raw Kinect data to the user. By the time the designers were able to create whole demos from beginning to end, we were confident that we had created an interface that lets users comfortably and easily access everything the Kinect 2 has to offer.
One of the best examples of this collaborative approach is our avateering system. Originally, we simply exposed the raw skeletal data from the Kinect 2. While the artists had very little trouble using that data to directly control an in-game character, they also pointed out that the output was too jittery.
The Avateering system demonstration showcasing the difference between raw input, processed input and ways to combine avateering input with Unreal animations.
Something a lot of people have been thankful for so far is that we automatically convert the semi-incomprehensible quaternion output of the Kinect 2 directly into Unreal rotations and apply a dynamic smoothing function. The end result is information that is easy to read and highly usable.
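To give a sense of what that processing involves, here is a minimal sketch of one common approach: spherically interpolating from the previous smoothed orientation toward each new raw quaternion sample, then converting the result to the pitch/yaw/roll rotator that Blueprint users work with. The function name, its signature and the smoothing constant are our illustrative assumptions rather than K4U’s internals, though FQuat::Slerp() and FQuat::Rotator() are the standard Unreal Engine calls for this kind of work.

```cpp
// Minimal jitter-smoothing sketch: low-pass filter a joint's orientation by
// slerping toward each raw sample, then convert the quaternion to an FRotator.
FRotator SmoothJointRotation(const FQuat& RawSample, FQuat& SmoothedState, float DeltaSeconds)
{
    // Scaling the interpolation alpha by delta time keeps the filter's
    // behaviour consistent across frame rates. A higher SmoothingSpeed tracks
    // the raw data more tightly; lower values damp jitter more aggressively.
    const float SmoothingSpeed = 10.f;
    const float Alpha = FMath::Clamp(SmoothingSpeed * DeltaSeconds, 0.f, 1.f);

    // Spherical interpolation between last frame's smoothed orientation and
    // this frame's raw sample acts as a simple low-pass filter on rotation.
    SmoothedState = FQuat::Slerp(SmoothedState, RawSample, Alpha);
    SmoothedState.Normalize();

    // The quaternion-to-rotator conversion yields the familiar pitch/yaw/roll
    // values that are far easier to read than raw quaternion components.
    return SmoothedState.Rotator();
}
```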
Moving Forward
The current 1.1 version of the plugin has had a stable set of features for quite some time, mostly because we have already given users practically everything the Kinect 2 currently allows. We will also maintain full feature parity with every official update to the Kinect 2 SDK. If you want to get started, we recommend checking out our 1.1 introduction video, which demonstrates the basic functions of the Kinect and goes into detail about some new features.
There are a number of planned improvements that will roll out over the next few months; some will be optional extensions to the plugin, while others are improvements to the core functionality.
We are also working with the folks at Sirenum Digital, the developer of the upcoming UE4 title Lost Pisces, to create an interface that will enable players to engage with their in-game avatar in ways that have never been done before.
With Unreal Engine 4 now being totally free, we’re looking forward to seeing the many crazy and wonderful things you can do when you rub the Kinect 2 and Unreal Engine together!