Building The Future: How One Project Points to the Convergence of Education and Industry in the U.K.
Hello, my name is Stephen Trimble and I am a Lecturer in Games Design and Media at South West College, Enniskillen. In addition to my primary lecturing and curriculum development responsibilities, I have been afforded several opportunities to engage with industry projects within my area of expertise. South West College is entrepreneurial in spirit and industry-facing, and it aims to empower staff to use their knowledge to engage with industry, developing innovative and robust solutions to modern business problems.
I work within a highly diverse and talented team, with each member having various software and process specialisms. Recently the College has invested in a state-of-the-art creative technologies studio, Image, which contains PCs equipped with Autodesk Maya, Unreal Engine, Nuke, ZBrush and a markerless motion capture facility. Moreover, in the past few years the College has engaged with VR and AR technologies in a significant way. Last year, for example, Paul McGovern, a member of staff, developed a communications platform that allows engineers to collaborate on CAD designs in real time via Oculus technology.
The College curriculum is diverse and this is particularly true within the Technology Department. Recently, Stephen Moss, Curriculum Manager for Technology, used his contacts and specialism in construction to develop a collaborative project opportunity that would see the College fuse together the worlds of game design and construction/architectural technology.
The Quinn Building Products VR Experience project was conceived. The project was delivered for Quinn Building Products to enhance their marketing and product showcase capacity, and the outcome would be displayed at EcoBuild, held at London’s ExCeL Centre from 8th to 10th March. The idea was to have a Virtual Reality experience at their stand to attract interest and encourage interaction from exhibition attendees. Viewers could walk freely around the main room of the house, and walking into Q markers progressed the simulation to the next stage.
Below I have provided an executive overview of the development process.
The Brief
Quinn Building Products contacted South West College, Enniskillen, wanting a new way to show off their products to potential customers at EcoBuild. This idea evolved into a Virtual Reality experience using the Oculus DK2, so that potential customers could visualise how the products are used within the structure of a building while also receiving information and statistics through a narration.
After a period of creative consultation, an outcome was agreed where the viewer could walk around the house and activate a set of stages in order to progress through the simulation. These stages included narrative information about the products, interactivity to pull back layers of the wall’s construction and an informative video further detailing their products.
Constraints & Management
Performance
It was clear at the beginning of the project that the overall aesthetics would have to be constrained, as the PCs used for development and showcasing had certain hardware limitations. After a few tests, this meant foliage had to be left out and the polygon density of assets had to be greatly optimised. The majority of textures used were 512 or 1024 pixels in resolution. Most of these were tiling textures, as it would have taken too much time to develop bespoke texture sets for each asset.
Communication
It was vitally important that constant communication was maintained between the College and Quinn Building Products to ensure as little time was wasted as possible throughout development. Due to travelling distance, it was not always possible to organise weekly meetings to display how the project was progressing. A service called Basecamp was used to connect all parties from both the College and Quinn Building Products, enabling remote communication and project management which in turn allowed for frequent feedback if there were any irregularities or modifications to be made. This meant messages could be sent back and forth throughout the week allowing for quick confirmations, such as the correct materials being assigned to assets and their texture scaling.
Design
Asset Modelling
After some brainstorming of building designs, Quinn Building Products supplied an FBX model, built in Autodesk Revit, that they wanted to be used for the project. Unfortunately, after importing the file into Maya it became clear that a large amount of optimisation would be necessary. A number of assets in the file had very high triangle counts and multiple faces placed on top of each other. It turned out to be faster to re-model the whole scene using the FBX file as reference rather than trying to modify it.
The interchangeable wall sections were split into three different assets: the original wall section, a pulled-back version exposing the brickwork, and a final layer displaying the insulation. All three assets were present in the scene; however, triggers within the simulation controlled their visibility.
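In the actual project this switching was handled in Blueprints, but the idea translates directly to a short C++ sketch. The class, component and function names below are illustrative assumptions, not taken from the project:

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "InterchangeableWall.generated.h"

UCLASS()
class AInterchangeableWall : public AActor
{
    GENERATED_BODY()

public:
    AInterchangeableWall()
    {
        FullWall = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("FullWall"));
        BrickLayer = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("BrickLayer"));
        InsulationLayer = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("InsulationLayer"));
        RootComponent = FullWall;
        BrickLayer->SetupAttachment(RootComponent);
        InsulationLayer->SetupAttachment(RootComponent);
    }

    // One static mesh per construction layer; all three exist in the level at once.
    UPROPERTY(VisibleAnywhere) UStaticMeshComponent* FullWall;
    UPROPERTY(VisibleAnywhere) UStaticMeshComponent* BrickLayer;
    UPROPERTY(VisibleAnywhere) UStaticMeshComponent* InsulationLayer;

    // Called from a trigger: 0 = original wall, 1 = exposed brickwork, 2 = insulation.
    UFUNCTION(BlueprintCallable, Category = "Wall")
    void ShowLayer(int32 LayerIndex)
    {
        FullWall->SetVisibility(LayerIndex == 0);
        BrickLayer->SetVisibility(LayerIndex == 1);
        InsulationLayer->SetVisibility(LayerIndex == 2);
    }
};
```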
UV Unwrapping
The majority of materials used for this project were tileable, which meant the texture file sizes were kept small. As mentioned previously, this was to help with the overall performance of the project. Using this technique meant the UV shells did not need to be limited to the 0-1 UV space, allowing the small texture files to produce a high quality finish.
Several material IDs were used for both windows beside the interchangeable wall. This allowed for the effect of the window being sliced in half, when in reality the only change was an invisible material being applied to that selection of faces. This method was chosen over another mesh swap because all of these objects were static meshes with baked lighting applied to them; having multiple hidden objects in the scene and swapping between them may have affected the overall Lightmass quality around the connecting points.
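As a rough illustration of the technique (the project handled this through material assignments in the editor, so the function, slot index and material names here are hypothetical), the same swap could be expressed in C++ like this:

```cpp
#include "CoreMinimal.h"
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInterface.h"

// Assign an invisible (fully translucent) material to one material ID so half of
// the window mesh disappears without swapping the mesh itself. Which element
// index corresponds to the "sliced" half depends on the asset's material IDs.
void HideWindowHalf(UStaticMeshComponent* Window, UMaterialInterface* InvisibleMaterial, int32 SlicedElementIndex)
{
    if (Window && InvisibleMaterial)
    {
        Window->SetMaterial(SlicedElementIndex, InvisibleMaterial);
    }
}
```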
A second set of UVs was created for each asset for lightmap usage. Due to the large scale of the house, the model was divided into smaller sections, allowing for multiple lightmaps rather than just one covering the whole building. The lightmap resolution was set according to the size of each asset, ranging from 256 for small objects to 2048 for large ones. Scaling these map sizes with the assets saved time when building lighting throughout production.
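The per-asset values were set in the editor, but the same rule of thumb can be sketched in C++; the size thresholds below are illustrative assumptions rather than the exact values used:

```cpp
#include "CoreMinimal.h"
#include "Components/StaticMeshComponent.h"

// Choose a lightmap resolution from the asset's bounding-sphere radius so that
// small props get 256 and the largest building pieces get 2048.
void SetLightmapResolutionForSize(UStaticMeshComponent* Mesh)
{
    if (!Mesh)
    {
        return;
    }

    const float Radius = Mesh->Bounds.SphereRadius;
    int32 Resolution = 256;
    if (Radius > 2000.f)      { Resolution = 2048; }
    else if (Radius > 1000.f) { Resolution = 1024; }
    else if (Radius > 500.f)  { Resolution = 512; }

    Mesh->bOverrideLightMapRes = true;
    Mesh->OverriddenLightMapRes = Resolution;
}
```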
Media Framework
The video sequence to be played on the TV started off as a real problem: the first few tests decreased the framerate so much that the experience was barely playable. After posting a question on the AnswerHub, one of the Epic Games employees swiftly replied with a few pointers on how to improve the performance. Because the Media Framework was still very experimental at the time, it was very particular about file formats and resolutions. The final video was reduced to 720p resolution and rendered out with the .wmv codec.
Another problem appeared once the video was playing properly: the audio playback. The audio channel was full of white noise, so the audio had to be exported as a separate 16-bit .wav file and played at the same time as the video.
Triggered Events
The idea behind the project was to allow the player to navigate freely in the specified area while also having a linear narrative overview. This meant procedures had to be put in place to ensure the event activations were carried out in a certain order; otherwise the narration would not make sense. Booleans and Branch nodes were heavily used to check whether the previous event had finished, and only then could the next event be triggered.
Another point that needed to be constrained was the time each viewer spent in the simulation. A longer time limit meant queues would grow and fewer people would get the chance to try it out. To counter this, a loop system was incorporated so that when the viewer reached the end of the simulation it would return to a Main Menu display, ready for the next person to start. A sketch of this stage-gating and looping logic follows the breakdown below.
A breakdown of the different stages is as follows:
- Main Menu display
- Press Right Trigger to launch
- Activate Events while progressing through the building.
- Interact with the wall section showing how products are used
- Begin video with further information about products
- Video sequence ends, narration ends
- Player returns to Main Menu display
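The project implemented this flow with Blueprint Booleans and Branch nodes; the C++ sketch below is simply my illustration of the same gating and loop-back idea, with all class and function names assumed rather than taken from the project:

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "StageSequencer.generated.h"

UCLASS()
class AStageSequencer : public AActor
{
    GENERATED_BODY()

public:
    // Index of the next stage that is allowed to fire; the Blueprint version
    // used one Boolean per stage, checked through Branch nodes.
    int32 NextStage = 0;

    // Called by each trigger volume with its own stage index. Only the stage
    // matching NextStage may run, so events always fire in narrative order.
    UFUNCTION(BlueprintCallable, Category = "Stages")
    bool TryActivateStage(int32 StageIndex)
    {
        if (StageIndex != NextStage)
        {
            return false; // the previous stage has not finished, ignore this trigger
        }
        ++NextStage;
        return true;
    }

    // Called when the video and narration end: loop back to the Main Menu
    // display so the next visitor can start straight away.
    UFUNCTION(BlueprintCallable, Category = "Stages")
    void ReturnToMainMenu()
    {
        NextStage = 0;
        // Reset the pawn's position and show the Main Menu display here.
    }
};
```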
Navigation
A Q marker, modelled from the Quinn Building Products logo, was created to help players navigate around the scene. The design was kept simple, with the marker spinning in place to give a familiar feeling similar to traditional health or ammo packs. Without the marker, viewers would have had to guess where the next trigger was; with it, they could visualise exactly where they needed to be positioned. The marker also had to be modified to stand out from the rest of the scene, as without an emissive glow it was not always clear where the next marker was. The glow was applied through the material, and a bloom effect was also added to the scene. The marker was then set up in its own simple Blueprint to rotate in place, which allowed it to be quickly reused throughout the scene.
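That rotate-in-place behaviour lived in a simple Blueprint; the equivalent logic in C++ might look something like this (the class name and rotation speed are assumptions):

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "QMarker.generated.h"

UCLASS()
class AQMarker : public AActor
{
    GENERATED_BODY()

public:
    AQMarker()
    {
        PrimaryActorTick.bCanEverTick = true;
    }

    // How fast the marker spins; the emissive glow itself lives in the material.
    UPROPERTY(EditAnywhere, Category = "Marker")
    float DegreesPerSecond = 90.f;

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        // Spin around the vertical axis, like a traditional health or ammo pack.
        AddActorLocalRotation(FRotator(0.f, DegreesPerSecond * DeltaSeconds, 0.f));
    }
};
```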
Lighting
The lighting setup was kept fairly simple overall. A Directional Light was aimed through the windows of the building while a Skylight was utilised to reduce the contrast of the shadows. Point Lights were positioned around the inside of the building to increase the brightness value, while Spotlights were placed in front of the ceiling and floor light assets. These Spotlights were used with a Post Processing Volume to create a Bloom effect.
Due to the hardware limitations, Lightmass settings were kept close to default as it was taking too long to build lighting every time modifications were made.
While testing the interchanging walls it became apparent that correct lighting would be very important to hide the seams between the assets. When building with Preview lighting quality it was worryingly obvious where the connecting areas were. Thankfully, increasing the quality to Production hid the seams. It had always been apparent on the Unreal Engine forums that Production lighting should be used for the final build, but there had been no comparison visualising the differences between the two settings. This change in quality made it clear that Production lighting is far superior.
Testing
Although the mechanics and gameplay were tested throughout development, Game Design students were also given the opportunity to play-test and provide feedback at regular intervals. This was a very important stage of development to test the activation of events as there were quite a few triggers located in a small area. Their detachment from the project allowed them to explore openly and reveal some minor issues which were subsequently fixed through modifications in Blueprints.
The biggest issue arose from trigger overlapping. With the way the stages were created in Blueprints, the trigger points were always live but had to go through a Branch to check whether that stage had been enabled yet. Therefore, if the player was already overlapping the next trigger before the current stage was over, the overlap event had already fired and the next stage would never start. A secondary activation had to be added to the end of each stage to check if the player was already overlapping the next trigger; if this returned true, a one second delay was applied before the next stage would activate, as sketched below.
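Continuing the earlier illustrative sketch (the fix itself was made in Blueprints, so the names here remain assumptions), the end-of-stage check might look like this:

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "GameFramework/Pawn.h"
#include "Engine/TriggerBox.h"
#include "TimerManager.h"

// Called at the end of each stage: if the player is still standing inside the
// next trigger volume, no fresh overlap event will arrive, so wait one second
// and then start the next stage manually.
void CheckNextTriggerOnStageEnd(AActor* Sequencer, ATriggerBox* NextTrigger, APawn* Player,
                                TFunction<void()> ActivateNextStage)
{
    if (Sequencer && NextTrigger && Player && NextTrigger->IsOverlappingActor(Player))
    {
        FTimerHandle Handle;
        Sequencer->GetWorldTimerManager().SetTimer(Handle, MoveTemp(ActivateNextStage), 1.0f, false);
    }
}
```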
Two desktop PCs showcasing the VR experience were being shipped to London, so the final test was to ensure everything worked as expected. Oculus headsets, headphones and gamepads were connected to each PC, and completed builds of the project were copied over to the hard drives. Thankfully everything worked from the beginning and there were no unexpected setbacks.
Reflection
Various lessons about level design were learned during the time spent on the VR stand, largely thanks to the variety of reactions from viewers as they played through the simulation. The following points are some of the most important lessons to come out of the whole experience.
VR Excitement
A number of people at the event had never used an Oculus before, which meant they were initially more interested in exploring the experience than in being guided by the narration. On a few occasions the player was either around the side of the house or already inside it by the time the introduction had finished, and had to be guided back to the front door to trigger the first event. This could have been controlled better by either restricting the player to a starting area using blocking volumes, or by disabling player movement until the introduction had completed. A sketch of the latter approach is shown below.
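This is purely an assumption about how it could be done, not something that was implemented; one option would be to ignore movement input on the player controller for the length of the introduction, leaving head tracking untouched:

```cpp
#include "CoreMinimal.h"
#include "GameFramework/PlayerController.h"
#include "Kismet/GameplayStatics.h"
#include "TimerManager.h"

// Ignore movement input for the duration of the introduction narration,
// then hand control back to the player.
void LockPlayerDuringIntro(UObject* WorldContext, float IntroDurationSeconds)
{
    APlayerController* PC = UGameplayStatics::GetPlayerController(WorldContext, 0);
    if (!PC)
    {
        return;
    }

    PC->SetIgnoreMoveInput(true);

    FTimerHandle Handle;
    PC->GetWorldTimerManager().SetTimer(Handle, [PC]()
    {
        PC->SetIgnoreMoveInput(false); // introduction finished, free the player
    }, IntroDurationSeconds, false);
}
```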
Narrative Triggers
Key words need to be positioned carefully in the narration. For example, some lines of speech asked the player to press the right trigger at the beginning of the audio file rather than at the end, which meant players were continuously pressing the right trigger until the narration had completed.
Composition & Leading the Eye
Some players also did not realise that the block of windows at the front of the house was actually a doorway leading into it, because the doors only opened when the player got close enough to them. At the side of the house there was a walkway to give the player some freedom as to where they wanted to go, and a number of players ended up going down this alleyway thinking it was the path into the house.
Conclusion
The project is demonstrative of the creative, innovative and technically robust capacity of South West College, and serves as a true testament to the College’s commitment to engaging with industry. Furthermore, the project shows in a very positive light how investment in digital creativity can benefit all industries, strengthening their business function. Even traditional businesses not directly involved in game design can embrace such technology to sharpen their competitive edge.