Unreal Engine - News, Developer Interviews, Spotlights, Tech Blogs

Feed containing the latest news, developer interviews, events, spotlights, and tech blogs related to Unreal. Unreal Engine 4 is a professional suite of tools and technologies used for building high-quality games and applications across a range of platforms. Unreal Engine 4’s rendering architecture enables developers to achieve stunning visuals and also scale elegantly to lower-end systems.

Unreal Fest Europe 2019 presentations now available online

Delivered by Epic Games, technical partners, and a variety of studios across Europe, these sessions focus on the needs of coders, artists, animators, designers, and producers.

In April 2019, over 800 game developers from all around Europe and beyond gathered in Prague to celebrate their achievements, connect with like-minded professionals, and learn new ways of using Unreal Engine to create great games. <br /> <br /> This second annual Unreal Fest Europe, which included 56 sessions delivered by Epic Games, technical partners, and 16 studios across Europe, focused on the needs of coders, artists, animators, designers, and producers. <br /> <br /> Today, we’re pleased to announce that you can view available sessions from Unreal Fest Europe 2019 online right <a href="" target="_blank">here</a>. <br /> <img alt="UFE2019_PresentationsReleased_Pic1.jpg" height="auto" src="" width="auto" /><br /> Of course, we were thrilled to have a large number of developers participate as speakers ready to share their knowledge with their peers. Valentin Galea, Lead Programmer from Splash Damage, delivered a talk focused on the developer’s history with Unreal Engine while presenting workflows and systems that continue to empower studios and their developers. "Unreal Fest was valuable for me, both because I met a lot of like-minded engineers and also because I had the opportunity to speak and share experiences," says Galea.
"Hopefully, what we share here will save others time, and overall, bring up the standards of work and promote good practices." <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> As seen in the video above, Unreal Fest Europe 2019 also provided the opportunity for attendees to meet, mingle, and connect with others who are actively developing new experiences with Unreal Engine. There were multiple networking events during the conference days, including a kick-off party at Prague&#39;s own Epic nightclub and a closing main party. <br /> <br /> If you’ve been inspired by what you’ve seen, then be sure to keep an eye out for the Unreal Fest Europe 2020 dates and information. Until then, we hope <a href="" target="_blank">these presentation videos</a> will be of value to you, and we look forward to seeing you there as an attendee (or even a speaker!) next year.<br />

Unreal Fest Europe | Learning | Milena Koljensic | Mon, 20 May 2019 15:30:00 GMT

10 UE4-powered games you can play on the Oculus Quest at launch

From Robo Recall: Unplugged to Vader Immortal, here are 10 amazing UE4-powered games you can play on the Oculus Quest today.

Initially revealed back in 2016 as Project Santa Cruz, the highly anticipated <a href="" target="_blank">Oculus Quest</a> is finally here. The hype is driven by the fact that the VR headset is completely wireless and doesn&#39;t require a PC. Coupled with a new inside-out tracking system known as Oculus Insight, the all-in-one headset also doesn&#39;t require any external sensors or cameras, making it a fully untethered experience. <br /> <br /> Unreal Engine 4 <a href="" target="_blank">fully supports the Oculus Quest</a>, and to commemorate the launch of the groundbreaking device, we&#39;re highlighting 10 awesome UE4-powered titles you can play on the headset today.
<br /> <img alt="blog_body_BOGO.png" height="auto" src="" width="auto" /><br /> <strong><a href="" target="_blank">BOGO</a> | Oculus</strong><br /> <br /> Developed by Oculus as a showcase for what&#39;s possible with the Quest, BOGO is a virtual pet simulator. Not only will you be able to pet the titular BOGO in the experience, but you&#39;ll be able to play fetch with the cute alien, concoct edible meals for it, and engage in a wide variety of mini-games that show off what the wireless headset is capable of.  <div style="text-align: center;"><img alt="Creed_Rise_to_Glory.jpg" height="auto" src="" width="auto" /></div> <strong><a href="" target="_blank">Creed: Rise to Glory</a> | Survios</strong><br /> <br /> In <a href="" target="_blank">Creed: Rise to Glory</a>, you are Adonis Creed, fighting toe-to-toe with the world’s top opponents to establish your boxing legacy. This intense cinematic experience features Survios’ new Phantom Melee Technology for impactful VR melee combat so you can train, fight, and win like Creed. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong><a href="" target="_blank">Dance Central VR</a> | Harmonix</strong><br /> <br /> Dance Central is back, now for Oculus Quest! Get ready to dive back into this beloved series and immerse yourself (literally) in your surroundings. As a day-one launch title for the Oculus Quest, Dance Central was made with the Quest’s standalone capabilities in mind to bring you a truly unique gaming experience.
<div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong><a href="" target="_blank">Dead and Buried II</a> | Oculus</strong><br /> <br /> Set in the deadly western setting established by the first game, Dead and Buried II mixes things up by being a fast-paced multiplayer arena shooter that supports full locomotion this time around. Players will be able to dual wield and pick up a wide variety of weapons that include pistols, shotguns, rocket launchers, and more.  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong><a href="" target="_blank">Moss</a> | Polyarc</strong><br /> <br /> Nominated for more than <a href="" target="_blank">30 awards</a>, Moss is an action-adventure puzzle game from Polyarc that&#39;s tailor-made for VR. It combines classic components of a great game—compelling characters, gripping combat, and captivating world exploration—with the exciting opportunities of VR. The Oculus Quest version of the game features a new chapter that continues the story of Quill, a young mouse with dreams of greatness.  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong><a href="" target="_blank">Oculus First Contact</a> | Oculus</strong><br /> <br /> Developed by the Oculus team that created UE4-powered <a href="" target="_blank">Dreamdeck</a>, <a href="" target="_blank">Farlands</a>, <a href="" target="_blank">Toybox</a>, and more, First Contact was originally developed as a fun showpiece for what&#39;s possible with the Oculus Rift Touch controllers. The ‘80s sci-fi experience has been optimized for the Oculus Quest and its updated Touch controllers. 
In Oculus First Contact, you&#39;ll be able to engage with a cute robot and play a variety of mini-games that allow you to shoot moving targets with a laser pistol, fire little rockets, and more.  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong><a href="" target="_blank">Oculus First Steps</a> | Oculus</strong><br /> <br /> Oculus First Steps is a new user experience developed by Oculus that showcases the freedom of movement that the Quest allows. In it, players will be able to hold hands with a robot and dance completely in 360 degrees. Oculus First Steps also shows off the device&#39;s new Touch controllers and Insight Tracking through a variety of tactile mini-games that allow you to dance, shoot guns, and more.   <br /> <img alt="blog_body_roborecall.jpg" height="auto" src="" width="auto" /><br /> <strong><a href="" target="_blank">Robo Recall</a><a href="" target="_blank">: Unplugged</a> | Drifter Entertainment</strong><br /> <br /> Robo Recall is an action-packed VR first-person shooter with visceral gameplay and an in-depth scoring system. Earn the high score by using creative combat tactics and skill shots as you teleport through city streets and rooftops in an awe-inspiring ballet of bullets. Tear apart your interactive robot foes and use them to fend off the enemy onslaught. Unlock, customize, and test weapons before taking on advanced challenges that put your newfound skills to the test! <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong><a href="" target="_blank">Sports Scramble</a> | Armature Studios</strong><br /> <br /> Welcome to Sports Scramble! Take your favorite sports and mix them together! Play tennis with a golf club. Bowl a strike with a basketball. Smash a home run with a hockey stick!
Each of the three main sports (Tennis, Bowling, and Baseball) has its own single-player training, quickplay, and challenge modes. Show off your skills by competing against other players in online multiplayer matches in the <a href="" target="_blank">free demo</a> currently available!<br /> <br /> Developed exclusively for the untethered freedom of Oculus Quest, Sports Scramble is a fresh new take on multiplayer VR gaming.<br />   <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <br /> <strong><a href="" target="_blank">Vader Immortal: A Star Wars VR Series - Episode I</a> | ILMxLAB</strong><br /> <br /> Step inside a galaxy far, far away with Vader Immortal: Episode I, the beginning of a three-part series that combines immersive cinematic storytelling with dramatic interactive play. You are a smuggler operating near Mustafar, the fiery world Darth Vader calls home. When you are unexpectedly pulled out of hyperspace, you find yourself uncovering an ancient mystery at the behest of the Sith Lord himself. With the help of your droid companion, ZO-E3, you’ll navigate the dangers of the fortress, hone your lightsaber skills, and meet new characters along the way as you discover what Vader is up to. <br />

Jimmy Thang | Tue, 21 May 2019 15:30:00 GMT

Sanzaru Games aims to deliver a next-level 30-hour VR epic with Asgard’s Wrath

Creative Director Mat Kraemer and Technical Director Evan Arnold share how the studio is developing next-gen VR swordplay, provide their thoughts on the future of VR, and more.

Developer Sanzaru Games is no stranger to developing VR games. Being a pioneer of the medium, the Foster City, CA-based studio has released titles such as <a href="" target="_blank">VR Sports Challenge</a>, <a href="" target="_blank">Ripcoil</a>, and <a href="" target="_blank">Marvel Powers United VR</a>.
The company’s latest effort, <a href="" target="_blank">Asgard’s Wrath</a>, releases later this year and builds on the lessons the studio learned from previous projects. Produced in collaboration with Oculus, the fantastical action game boasts an epic 30-hour adventure with top-notch production values that has the potential to alter our perception of what’s possible with a VR game.<br /> <br /> We recently had the chance to interview Sanzaru Games Creative Director Mat Kraemer and Technical Director Evan Arnold to learn how the studio is developing the game’s satisfying swordplay. They also talk about how they’re delivering one of VR’s best-looking games while keeping performance steady. In addition, the duo elaborate on the benefits of using Unreal Engine 4, provide their thoughts on the future of VR, and more.  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong>Thanks for your time! With a fantastical Norse premise coupled with a focus on melee combat, what games influenced Asgard&#39;s Wrath?</strong><br /> <br /> <strong>Creative Director Mat Kraemer:</strong> Asgard&#39;s Wrath is a concept that Sanzaru has always wanted to create and build. It brings together influences from some of our favorite games and genres. Asgard&#39;s Wrath has a combat style influenced by God of War, the puzzles of Zelda, and the exploration and crafting loops from games like The Witcher. All of these influences are taken to the next level in VR utilizing the technology of the Oculus Rift platform.  <br />  <br /> <strong>Asgard&#39;s Wrath was originally going to be a game where you controlled a massive-sized god, but human-scale gameplay was later added to the experience. Can you elaborate on why it was important to incorporate both aspects of gameplay?
</strong><br /> <br /> <strong>Kraemer:</strong> As with all game development, the game and its direction evolved over time. In the original concept, players only played as the god, but would place down pawns that would be AI-controlled. The team discovered it was so much fun switching back and forth between human and god that it became a full feature pillar and drove the game’s future direction. <br /> <img alt="Asgards_Wrath_Screenshot08.jpg" height="auto" src="" width="auto" /><br /> <strong>How did Sanzaru Games leverage VR to play with the sense of scale to differentiate the look and feel of the game when you&#39;re controlling a larger-than-life god?</strong><br /> <br /> <strong>Technical Director Evan Arnold:</strong> In the original design of the game, you played as the god exclusively, manipulating the environment to assist the mortal in accomplishing goals (puzzles, castle defense battles, etc.). Around this same time in development, we were experimenting with scale in Marvel Powers United VR. We knew from some very rough experiments that our brains were quick to accept scale. We just didn’t know if it was compelling and fun. Once we got Rocket Raccoon and The Hulk running around next to each other, we immediately saw some joyful reactions between people experiencing that scale. <br />  <br /> From this starting point, one of our engineers on [Asgard’s] Wrath took this sense of scale to the extreme. Transforming the player into mortal and god-scale experiences creates an environment where players can feel small among giants or tower over mortal minions to help the hero with their adventure. It was immediately fun to see the world from two very different perspectives. It required a fair amount of engineering work and optimization to be able to view the world in this way, but the game is much better for it. 
<br />  <br /> <strong>Kraemer: </strong>What Evan describes is that exciting moment in game development when you have stumbled across something really special. When you see recurring feedback on something you experimented with, and players enjoy it, you know you are onto something good. This was the case for the god and mortal perspective switching. The team is really happy with how this all came together. <br /> <img alt="Asgards_Wrath_Screenshot01.jpg" height="auto" src="" width="auto" /><br /> <strong>By coupling VR controls with physics, Asgard&#39;s Wrath features what people are saying is arguably the best VR melee combat to date. How did you set about achieving this? </strong><br /> <br /> <strong>Arnold:</strong> We’ve been experimenting with user interaction with the environment from our earliest prototypes on VR Sports Challenge. We’ve found time and time again that as we slide along the scale between simulation and canned experience, we generate very different VR experiences for the end user. In Asgard’s Wrath, we have tried the spectrum of experiences for our melee combat from entirely canned responses to entirely simulated physical responses. <br />  <br /> I believe we landed on the perfect balance between what physics would dictate should occur and what the player “meant” to occur. Total physics simulation is fine for a time, but we wanted our combat to be fast-paced, responsive, and always working towards realizing the player’s intent. <br />  <br /> Since people are claiming it’s the best VR melee to date, all of us here feel validated about our many decisions along the way. <br />  <br /> <strong>Kraemer:</strong> It took lots of iteration and a great team to pull it together. As with all our VR experiences, we are always trying to push new features that have been unproven in the VR space. We have done this with our locomotion, full body IK, and throwing/catching.
For Wrath, we wanted to nail combat, and I think our team is doing an excellent job at proving melee combat in VR can be a fun and rewarding experience. <br /> <img alt="Asgards_Wrath_Screenshot04.jpg" height="auto" src="" width="auto" /><br /> <strong>Asgard&#39;s Wrath features numerous weapons that include bows, swords, and throwable axes. How did you go about designing weapons in the game while ensuring they would feel fun and satisfying to use in VR?</strong><br /> <br /> <strong>Arnold: </strong>Creating fun experiences with a variety of weapons has indeed been a challenge. Most notably, players are typically not quite as good as they think they are when it comes to throwing or timing. <br />  <br /> In Asgard’s Wrath, we have scalable assists in place so players can select the amount of help they require in order to have a fun, yet challenging experience. The types of weapons, and the tuning required to get powerful and fun VR experiences, are something that Sanzaru has been working toward for years. We now have a full bag of tools and tricks to create a compelling “feel” to the experience using these weapons. <br />  <br /> <strong>Kraemer:</strong> Working on Marvel Powers United VR really helped pave the way for our weapon variety. We had already done bows with Hawkeye and had a great foundation for melee and projectile combat. I am excited for players to see the unique weapons we have in Wrath, many of which have not been shown yet. <br /> <img alt="Asgards_Wrath_Screenshot03.jpg" height="auto" src="" width="auto" /><br /> <strong>Does Asgard&#39;s Wrath take into account how hard players swing their swords?</strong><br /> <br /> <strong>Arnold:</strong> Absolutely! We have taken examples from a variety of different swordplay ideas and tried to integrate them into the game.
There are some attacks where it’s sufficient to just intercept the incoming attack (put my sword between me and the bad guy), and there are other attacks that require the player to actively swing a counter direction. Without the latter, we found there are many exploitative behaviors that result in a mushy, non-threatening engagement with enemies. <br />  <br /> <strong>With good lighting, high-quality textures, and impressive character models, Asgard&#39;s Wrath features amazing visuals. How did you achieve the look of the game?</strong><br /> <br /> <strong>Kraemer:</strong> A group of talented artists and lots of iteration. Sanzaru has an extremely talented art crew that has experience with the platform’s needs and boundaries. Our game director, Bill Spence, has a really good eye for detail and where to best drive the art team’s efforts. <br /> <img alt="Asgards_Wrath_Screenshot06.jpg" height="auto" src="" width="auto" /><br /> <strong>Considering VR&#39;s steep requirements, how are you keeping performance in check?</strong><br /> <br /> <strong>Arnold:</strong> Here at Sanzaru, we have a world-class art team and an incredible next-generation engine. We also have ambitious worlds, complex interactivity, puzzles, UI, and physics. The short answer is… Profile, profile, and profile. <br />  <br /> It cannot be said enough: put strict asset guidelines in place, and then hold the team accountable for abiding by them. Using an engine like UE4 has made keeping tabs on this enormous world much easier than it would have been with other engines. We make full use of all profiling infrastructure and automation tooling to check our work as we go. We fix issues as they arise and, as much as feasible, try to maintain performance stability. <br />  <br /> <strong>Kraemer:</strong> Evan [Arnold] is a performance guru and stays on top of the team when we go over budget. Many times we want to add more VFX, polys, and complexity, but Evan keeps us in check.
It’s a back-and-forth balancing act over where we spend our performance budget. <br />  <br /> <strong>Having worked on several VR games like Marvel Powers United VR, Ripcoil, and VR Sports Challenge, what have you learned about developing for the medium that you&#39;re bringing into Asgard&#39;s Wrath?</strong><br /> <br /> <strong>Arnold:</strong> Never take anything off the table. This is a totally new medium and we are finding more and more that things need to be tried before they are tabled. Traditional flat-screen ideas sometimes really don’t work in VR. Sometimes something that you feel would be perfect for VR becomes laborious and players get tired (as cool as Minority Report is, most of us don’t have Tom Cruise’s muscles). This has been, in my opinion, the best part about working in VR. It’s uncharted, and the team here at Sanzaru is helping lead the way into standardizing what it means to build games for VR. <br />  <br /> <strong>Kraemer:</strong> I agree with Evan [regarding] the comment, “never take anything off the table.” There have been many times people have said to us, “don’t do that in VR,” but we push back and do our best to make it work. It’s such a new medium with lots of space for growth and exploration. It’s exciting to be at the forefront of the technology, creating things that will drive the VR space for years to come. <br /> <img alt="Asgards_Wrath_Screenshot05.jpg" height="auto" src="" width="auto" /><br /> <strong>As a game that Oculus is using to showcase the Oculus Rift S, were there any special considerations developing for the new VR headset?</strong><br /> <br /> <strong>Arnold:</strong> Oculus has done a fantastic job with the updated Oculus Rift S hardware. The inside-out tracking is stable and robust. We have not had any issues moving our content from Rift to Rift S. Plug in and enjoy the higher resolution.
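The "profile, profile, and profile" workflow Arnold describes maps onto tooling that ships with stock UE4. As a rough illustration only (these are standard UE4 console commands; the annotations after each command are explanatory notes, not console syntax), one pass of an in-game profiling session might look like:

```
stat unit              -- per-frame Game, Draw, GPU, and total frame times
stat fps               -- current frame rate
stat scenerendering    -- draw call and primitive counts for the scene
profilegpu             -- capture and inspect a single-frame GPU profile
stat startfile         -- begin recording a CPU capture...
stat stopfile          -- ...stop, writing a file readable in the Session Frontend profiler
r.ScreenPercentage 80  -- lower render resolution to test whether the frame is GPU-bound
```

If frame times improve when the screen percentage is lowered, the bottleneck is likely on the GPU side; otherwise, the capture produced by `stat startfile`/`stat stopfile` is a reasonable place to start hunting for game-thread costs.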
<br />  <br /> <strong>Does the studio have any favorite UE4 tools or features?</strong><br /> <br /> <strong>Arnold:</strong> Too many to pick. We utilize almost every tool in the toolbox in our regular daily development.<br />  <br /> <strong>Kraemer:</strong> Sanzaru really enjoys using UE4, and it’s smoothed out many pipeline issues. Wrath will be our fifth game shipping on UE4. Sonic Boom: Fire and Ice was our final game developed using our in-house APE engine.  <br /> <img alt="Asgards_Wrath_Screenshot07.jpg" height="auto" src="" width="auto" /><br /> <strong>How helpful has it been having access to UE4&#39;s source code?</strong><br /> <br /> <strong>Arnold:</strong> Extremely helpful. We regularly delve deeply into the codebase to track issues and apply bug fixes from future releases. Having the codebase allows us to have a more thorough understanding of the decisions we need to make and the technologies backing those decisions. <br />  <br /> <strong>With Sanzaru Games being a pioneer in the VR space, what do you think about the medium and where it&#39;s headed?</strong><br /> <br /> <strong>Arnold:</strong> As soon as we tried early prototype versions of what is now Rift, we were convinced that VR is the future. VR remains very exciting, and Oculus is doing a fantastic job growing the ecosystem. As cost comes down and more players try VR, the ecosystem continues to grow. There’s no doubt that VR is an incredible gaming experience. <br />  <br /> <strong>Kraemer:</strong> Since the time we started our work in VR, the medium has evolved immensely. We now have full-body locomotion and games are getting bigger with more complexity. It’s nice to see larger adventure games coming to the platform and not just smaller experiences. Asgard’s Wrath, for example, is a massive adventure game with lots of content. I hope that this trend continues and players get more full-game experiences on the platform. <br />  <br /> <strong>Thanks for your time.
Where can players learn more about the game?</strong><br /> <br /> <strong>Kraemer:</strong> Check out more info on the <a href="" target="_blank">official website</a> or join the discussion on the Sanzaru <a href="" target="_blank">Discord channel</a>. <br />

Asgard's Wrath | Sanzaru Games | VR | Games | Jimmy Thang | Tue, 21 May 2019 11:00:00 GMT

Xenon Racer pays tribute to old-school arcade action while forging its own path

Studio 3DClouds explains how their project was inspired by racing games like Daytona, Wipeout, and F-Zero.

Building on the success of beloved kart racer <a href="" target="_blank">All-Star Fruit Racing</a>, Italian game developer 3DClouds set out to develop <a href="" target="_blank">Xenon Racer</a>, a game that pays homage to arcade classics like Daytona, Wipeout, and F-Zero. While the roughly 20-person studio aimed to create a game that honored old-school arcade racers, they also wanted to inject Xenon Racer with modern visuals and physics.<br /> <br /> To find out how 3DClouds leveraged Unreal Engine 4 to do this, we interviewed Production Director Sergio Rocco, Community Manager Tommaso Valentini, and Lead Programmer Christiano Orlandi. In our discussion, the trio discusses working with esteemed automotive designer Marcello Raeli, elaborates on what the futuristic setting allowed them to do, and shares development tips.  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong>Thanks for your time. Developer 3DClouds previously worked on All-Star Fruit Racing. What has the studio learned from working on the kart racer that you&#39;ve brought to the development of Xenon Racer?</strong><br /> <br /> <strong>Production Director Sergio Rocco:</strong> On All-Star Fruit Racing, we built the data structures, the organization, and the flow that the company would build off of.
With it, we also developed the online support for all the platforms, which served as a starting point for our next projects.<br /> <img alt="Xenon-Racer---Screenshot-1.jpg" height="auto" src="" width="auto" /><br /> <strong>Many are saying that Xenon Racer&#39;s game design feels like a modern remake of old-school arcade racing games. Was this a design goal for the team?</strong><br /> <br /> <strong>Community Manager Tommaso Valentini:</strong> We wanted to re-imagine arcade titles from the 90s. While we wanted to make a game that paid homage to these games, we also wanted to make a product for an often-neglected niche of the market. Xenon Racer takes inspiration from the likes of Ridge Racer, Daytona, Wipeout, and F-Zero while forming, at least in our opinion, its own well-defined and unique soul.<br /> <br /> <strong>Considering the game is set in 2030, how did the studio decide on the near-future theme?</strong><br /> <br /> <strong>Valentini:</strong> With Xenon Racer, the aim was to create an old-school arcade racer where players can have simple fun right from the start. We chose a futuristic theme, which fits the arcade vibe and allowed us to use our imagination and creativity when designing tracks, cars, and buildings. 2030 isn&#39;t too far in the future, and our idea was to follow a realistic evolution of car design. [Automotive designer] <a href="" target="_blank">Marcello Raeli</a> helped us a lot with these concepts and created some marvelous vehicles, which became one of Xenon Racer&#39;s strengths. <br /> <img alt="Xenon-Racer---Screenshot-2.jpg" height="auto" src="" width="auto" /><br /> <strong>Drifting to earn speed boosts is important in Xenon Racer. Can you talk about how you designed these mechanics?</strong><br /> <br /> <strong>Rocco:</strong> It&#39;s been a long time since we felt like we had a true "arcade" racer. So we decided to study older games like Ridge Racer, Daytona, Out Run, and Wipeout in depth.
Xenon re-interprets the game mechanisms of these great arcade classics from the 90s in a modern age. The challenge was to create a game that felt similar to those games but wasn&#39;t bound by the technical limitations of the 90s. So we used real physics. Xenon therefore infuses our passion for those 90s games with the look and technologies of today.<br /> <br /> <strong>Xenon Racer makes players respawn after they&#39;ve taken a certain amount of damage. Can you explain your thought process for implementing this system?</strong><br /> <br /> <strong>Rocco:</strong> As we mentioned, Xenon was born with the idea of being an arcade racing game through and through. Having to respawn the car after totaling it is a classic arcade driving mechanism. It seemed perfect for our project, and thanks to modern technologies, we were able to reproduce that effect with more spectacular VFX.<br /> <img alt="Xenon-Racer---Screenshot-6.jpg" height="auto" src="" width="auto" /><br /> <strong>Can you elaborate on how the studio achieved the game&#39;s great particle effects, cool-looking reflections, and excellent car models? </strong><br /> <br /> <strong>Rocco:</strong> Unreal Engine 4 provides us with a very powerful development environment that facilitates excellent visual effects. It was very important for us to have a clear vision for how we wanted them to look. As for the vehicles, we knew from the beginning that we wanted to get Marcello Raeli involved with the project. He is a professional car designer who works for numerous car manufacturing companies. Marcello was excited to participate in the project and immediately understood the spirit of the vehicles we wanted to create. Because our vehicles come from the mind of a real car designer, they are not only beautiful but also credible: we will probably see cars very similar to those featured in Xenon Racer within the next 10 years, since Marcello designs prototypes like these in real life.
<br /> <br /> <strong>Considering Xenon Racer runs well, how did you keep performance in check? Were there any UE4 tools that were particularly helpful?</strong><br /> <br /> <strong>Orlandi:</strong> To fine-tune performance, we developed a tool that automatically runs through all the tracks (for every platform) and collects data that we can analyze to find problem areas on tracks. We coupled this with Unreal Engine 4&#39;s ability to analyze and modify rendering parameters during runtime through the command prompt, which was incredibly useful to us when we had to troubleshoot bottlenecks. Our artists made good use of the <a href="" target="_blank">Hierarchical Level of Detail</a> (HLOD) and <a href="" target="_blank">Proxy Geometry</a> tools in order to best optimize the tracks. Meanwhile, when it comes to the CPU side of things, we have been using the <a href="" target="_blank">Session Profiler Frontend</a> in order to identify the more problematic lines of code.<br /> <img alt="Xenon-Racer---Screenshot-2.jpg" height="auto" src="" width="auto" /><br /> <strong>Xenon Racer has tracks that take players through Tokyo, Dubai, and more. Can you talk about your approach to locations and track design?</strong><br /> <br /> <strong>Valentini: </strong>We wanted to re-imagine the most iconic cities on the planet. Tokyo and Dubai immediately come to players’ minds, and picking them as key locations allowed us to let our creativity go wild. Each of our locations is faithfully inspired by its real-life counterpart, but we created futuristic visions of them with artistic liberties. The scenery alternates between historical architecture, breathtaking panoramas, and crystal buildings, creating a variety not often seen in this genre.<br /> <img alt="Xenon-Racer---Screenshot-14.jpg" height="auto" src="" width="auto" /><br /> <strong>Xenon Racer features expansive car customization options. Did UE4 make it easier to implement this feature?
</strong><br /> <br /> <strong>Orlandi: </strong>The wide range of vehicle customization is, for the most part, derived from our previous title All-Star Fruit Racing. Using the same approach and UE4 components, it was rather simple to add or modify car parts based on the user’s preferences. The system is heavily data-driven, allowing artists to handle and test the different customizations directly in-editor without having to build the game first. This sped up the development and testing of all the possible car customizations.<br /> <br /> <strong>What made UE4 a good fit for the game?</strong><br /> <br /> <strong>Rocco: </strong>UE4 consistently provides a vast development environment that is complete and powerful in every area. It allows development teams to channel more energy into the game and not into technology R&D.<br /> <br /> <strong>Did the studio leverage Blueprints in any way?</strong><br /> <br /> <strong>Orlandi:</strong> We used <a href="" target="_blank">Blueprints</a> to handle gameplay flow, UI, and menus. For all the game’s areas that weren’t highly performance-dependent, it [allowed for] a quick method of development and prototyping. We also used <a href="" target="_blank">Blutility</a> to create fundamental tools for the rapid development of the track splines and the urban environment surrounding them. <br /> <img alt="Xenon-Racer---Screenshot-11.jpg" height="auto" src="" width="auto" /><br /> <strong>Did the studio use the UE4 Marketplace at all?</strong><br /> <br /> <strong>Orlandi: </strong>We borrowed some plugins from the <a href="" target="_blank">Marketplace</a>, as well as some code and assets that allowed us to speed up the development of our game. 
The Marketplace is really useful for rapidly obtaining the tools to start development of a project or to prototype a feature, which can then be refined later.<br /> <br /> <strong>Did the studio have any favorite UE4 tools or features?</strong><br /> <br /> <strong>Lead Programmer Christian Orlandi:</strong> One thing that helped us tremendously when developing the AI was <a href="" target="_blank">Visual Logger</a>, a tool that records events and data during a race and can later recreate them frame by frame. This made it much easier to analyze and solve any bizarre behaviors that AI-controlled vehicles would exhibit in specific situations. Another interesting UE4 feature is the excellent integration of code and Blueprints. This allowed us to rapidly expose all the parameters and data structures that the artists and designers needed in order to fine-tune every element of our game. This also freed up our programmers to focus on the development of other features.<br /> <br /> <strong>Thanks again for your time. Where can people learn more about Xenon Racer?</strong><br /> <br /> <strong>Valentini:</strong> Thanks for the questions. People can keep up with all of the game&#39;s future updates by following 3DClouds on <a href="" target="_blank">Twitter</a>, <a href="" target="_blank">Facebook</a>, or <a href="" target="_blank">Instagram</a>, or by joining <a href="" target="_blank">Soedesco’s Discord channel</a> to chat directly with the developers!<br />  AIArtBlueprintsDesignMarketplaceXenon Racer3DCloudsJimmy ThangFri, 17 May 2019 13:30:00 GMTFri, 17 May 2019 13:30:00 GMT Online Services roadmap update for more news on our Epic Online Services rollout? We’re publishing our roadmap to keep you up to date with our latest development schedule. 
Bookmark the Trello board today for instant visibility.In December 2018, we announced <a href="" target="_blank">Epic Online Services</a>, a suite of cross-platform game services that has been tested in the heat of battle by 250 million players on seven platforms.<br /> <br /> Since the <a href="" target="_blank">release</a> of the initial set of services in March, we have continued to develop its foundation. Our current efforts are focused on the immediate needs of store partner products that are approaching launch. We’re also working to open up the identity and social services we built for Fortnite to support cross-play in partner games without requiring the use of Epic accounts or other Epic dependencies.<br /> <br /> As a result of our current prioritizations, several roadmap items are being delayed. Most notably, we are rescheduling the release of the Player Reports and Player Data Storage services. To provide more detail, we are moving the roadmap to a <a href="" target="_blank">Trello board</a>, which you can bookmark and revisit at your convenience.<br /> <img alt="blog_body_img.jpg" height="auto" src="" width="auto" /><br /> For more information on Epic Online Services offerings and to download the SDK, visit <a href="" target="_blank"></a> and log into the developer portal. While you’re there, be sure to check the box to sign up for our newsletter to receive updates as new services become available.<br /> <br /> Stay tuned for our next update!<br />  NewsGamesCommunityEpic Online ServicesSimon AllaeysThu, 16 May 2019 14:00:00 GMTThu, 16 May 2019 14:00:00 GMT up with Epic Games at Nordic Game 2019 Games will be at the Nordic Game conference May 22-24 in Malmo, Sweden. We’ll have Unreal Engine tech talks, game showcases, a mixer, and more!Join us at the <a href="" target="_blank">Nordic Game conference</a> for tech talks, a UE mixer, and more! The conference will be held May 22-24 at Slagthuset in Malmo, Sweden. 
<h2>Unreal Presents</h2> <strong>Where:</strong> Round Bar near our A1/A2 booth<br /> <strong>When:</strong> From Wednesday, May 22, 11:00 AM onwards<br /> <br /> The Unreal Presents showcase is your chance to see some amazing UE4 games currently under development in Scandinavia, Germany, and the UK. Come and get an early look at this curated set of not-yet-published games with a wide range of styles and themes, including multiplayer battles, futuristic journeys, Norse magic, mystery “Fjord Noir,” and old-school adventure.<br /> <br /> After a private viewing for publishers and investors, the Unreal Presents showcase is open to all conference attendees starting at 11 AM on Wednesday, May 22. Games from the following indie studios will be on display: <br />   <ul style="margin-left: 40px;"> <li><a href="" target="_blank">Angry Demon Studio</a> - <em>Apsulov: End Of Gods</em> - A single-player, first-person game that takes an adventure approach to horror, featuring a future Viking theme and Norse magic, set in a sci-fi universe with memorable settings.</li> <li><a href="" target="_blank">Bagpack Games</a> - <em>Out of Place</em> - A single-player, third-person action adventure in which young Simon, with the help of his Orb companion, must reactivate an ancient machine using dynamic play styles to ultimately overcome fear and take action for a better world.</li> <li><a href="" target="_blank">Digital Cybercherries</a> - <em>Hypercharge: Unboxed</em> - A first-person turret, trap, and defense-building game in which players take up arms as toy soldiers and battle solo or with up to three friends, online or in local co-op, to protect the Hyper-Core.</li> <li><a href="" target="_blank">Dimfrost Studio</a> - <em>Bramble: The Mountain King</em> - A story-focused, single-player game based on dark Nordic folklore. Impressive, realistic visuals intensify epic monster battles between a small boy and ominous mythical creatures. 
</li> <li><a href="" target="_blank">Frogsong Studios</a> - <em>Deconstruction Corp</em> - A four-player, co-op dungeon crawler in which elevators plunge players to the depths of an artificial planet, where they must scavenge scraps to bring back to the surface.</li> <li><a href="" target="_blank">Invisible Walls</a> - <em>Cainwood </em>(working title) - A third-person cooperative game about trust, deceit, and survival in which one player hides their true identity and, from the shadows, conducts unsavory deeds that actively work against the group&#39;s common goal of escaping a terrible predicament.</li> <li><a href="" target="_blank">Red Thread Games</a> - <em>Draugen</em> - A single-player, first-person psychological "Fjord Noir" mystery set in 1920s Norway in which an American traveler and his ward embark on a dark search for his missing sister.</li> <li><a href="" target="_blank">Sluggerfly</a> - <em>Hell Pie</em> - A single-player platformer in which the player uses an extensive set of swing-based moves to explore a combination of twisted worlds and levels while fighting enemies, collecting weird cake ingredients, and interacting with bizarre characters.</li> </ul> <h2><img alt="blog_body_img_games.jpg" height="auto" src="" width="auto" /><br /> Unreal Engine Talks</h2> Here’s the schedule for the Unreal Engine talks:<br /> <br /> <strong>Tech Talk: Ray Tracing using Unreal Engine 4</strong><br /> Wednesday, May 22, 2:15 PM - 3:00 PM, Room Copenhagen<br /> <br /> As of Unreal Engine 4.22, Unreal’s renderer supports the new DXR API for real-time ray tracing. During this session, you will see how to light a visually appealing environment using the new ray-tracing features in UE4. 
Through a game-focused practical example, Epic’s Sjoerd De Jong will show how to control ray tracing in your scenes, discuss benefits and drawbacks to real-time ray tracing, and cover tips on performance, all focused on helping you make a smooth start with this amazing new tech.<br /> <img alt="blog_body_img_raytracing.jpg" height="auto" src="" width="auto" /><br /> <strong>Tech Talk: Demystifying Niagara</strong><br /> Wednesday, May 22, 3:15 PM - 4:00 PM, Room Copenhagen<br /> <br /> An incredibly customizable system, Niagara gives artists and designers unprecedented control over real-time VFX and particle simulation. This talk focuses on demystifying the new toolset, giving an overview of key concepts and the practicalities of making effects.<br /> <br /> <strong>Fireside chat with Epic Games and The Bearded Ladies</strong><br /> Thursday, May 23, 3:15 PM - 4:00 PM, Unreal Theater<br /> <br /> Please join Mike Gamble, Head of Games Licensing EMEA, and Halli Thormundsson, David Skarin, and Mark Parker of The Bearded Ladies for a light-hearted and engaging chat about their company, its culture, and, of course, their outstanding UE4 title Mutant Year Zero: Road to Eden. <h2>Unreal Mixer</h2> We’ll also be hosting a mixer for UE4 users (and those who want to be!) on Wednesday, May 22, 5:30 PM - 8:30 PM, at <a href="" target="_blank">Skeppsbron 2</a>, Malmo.<br /> <img alt="blog_body_img_mixer.jpg" height="auto" src="" width="auto" /><br /> <br /> Be sure to stop by and see us at our A1/A2 & B1 booth, and find out what’s happening with UE4!<br />  GamesCommunityEventsNewsNordic GameShera D’SpainWed, 15 May 2019 19:00:00 GMTWed, 15 May 2019 19:00:00 GMT in animation and VFX at FMX 2019 FMX 2019, a premier conference for media and entertainment in Europe, five trends emerged for the future of animation and VFX. 
Read about the ones to watch this year.The annual <a href="" target="_blank">FMX conference</a> is one of the premier events for visual effects and animation in Europe, providing not only a showcase for new technology but also fresh approaches to the challenges of production.<br />  <br /> At FMX 2019, which was held Apr 30 – May 3 in Stuttgart, Germany, there were five trends in animation and effects that emerged across the presentations and product demos. Here, we’ll take a tour of these trends, and see where they point for the future of real-time technology.<br />   <h3><strong>Trend #1: Realism</strong></h3> The impact of new hardware such as the NVIDIA RTX cards, in addition to advances in real-time game engines like UE4, was evident across so many aspects of animation and visual effects.<br />  <br /> With the advent of more physically plausible lighting, especially real-time ray tracing, there is a move to higher realism in animation and effects.<br />  <br /> One example is Matt Workman’s <a href="" target="_blank">Cine Tracer</a>, a real-time cinematography simulator, which Matt presented on the second day of FMX. This hybrid game/app combines Matt’s programming knowledge with his 10 years of technical on-set cinematography knowledge. Unlike many pitchvis or previs projects from years past, Cine Tracer enables not only accurate blocking and lensing of projects, but also realistic interactive lighting and depth of field.<br />  <br /> The realism of this UE4 project gives filmmakers opportunities for creative exploration of movement, focus, smoke/haze, and lighting with industry-standard virtual lights, cranes, and cameras. 
With it, “players” can explore real-world-based staging and direction of digital talent/actors in stunning next-gen environments created in Unreal Engine 4.<br />   <h3><strong>Trend #2: Virtual production and digital humans</strong></h3> Real-time collaboration on set and a more refined filmmaking pipeline can lead to a more nonlinear story creation process, which in turn fosters creativity in filmmaking. Several talks highlighted the real-time collaborative benefits of incorporating UE4 into animation and effects pipelines.<br /> <br /> David Morin, head of Los Angeles lab at Epic Games, chaired the virtual production track at FMX, which was dominated by stories of companies improving and innovating production pipelines with real-time technology. For example, Kevin Baillie, Creative Director and Sr. VFX Supervisor at Method Studios, outlined how his team used <a href="" target="_blank">creative UE4 virtual production techniques</a> to bring dolls to life in the Robert Zemeckis film <em>Welcome to Marwen</em>.<br /> <img alt="blog_body_img_marwen2.jpg" height="auto" src="" width="auto" /><br /> <strong>Trend #3: Simulation</strong><br /> Real-time simulation, which increases engagement and further adds to realism, was discussed at multiple levels and in several talks.<br /> <br /> For example, the game <a href="" target="_blank">Robo Recall</a> from Epic Games was on display, giving attendees the opportunity to interact with the new <a href="" target="_blank">Chaos destruction tools highlighted in this year’s UE4 GDC demonstration</a>. The new <em>Robo Recall</em> demo illustrated how players can now interact and directly affect (or destroy) complex scenes during these simulation-heavy portions of the action, rather than being limited to noninteractive cutscenes. 
UE4 has always been a way to render animation in real time, but this demo illustrated the staggering jump in simulation performance that turns players into participants rather than spectators.<br /> <img alt="blog_body_img_chaos.jpg" height="auto" src="" width="auto" /> <h3><strong>Trend #4: Deep learning</strong></h3> One of the techniques animation technical directors and programmers use to enhance realism and produce real-time simulation is the application of machine-learning techniques, such as deep learning. If there was one buzzword heard across the widest variety of presentations, it was deep learning!<br />  <br /> Deep learning, in the context of real-time technology for media and entertainment, involves writing programs that “learn” from vast sets of visual data, and then apply these learnings in real time to a rendering. Some of the most notable examples in this field are in denoising, ray tracing, and—most recently—facial and character animation.<br /> <br /> A case in point is the outstanding work from Digital Domain in facial tracking and animation. Doug Roble presented <a href="" target="_blank">Digital Doug</a>, a virtual copy of Roble’s face which he puppeted in real time, with incredible fidelity and realism, using UE4. This project builds on the work done at Epic Games in recent years and extends it, thanks to new techniques of producing highly detailed training data for facial speech and motion.     <br /> <img alt="blog_body_img_digitaldomain.jpg" height="auto" src="" width="auto" /> <br /> Darren Hendler also presented related digital human work for Marvel’s <a href="" target="_blank">Thanos</a>. Both projects use deep learning as part of the Digital Domain face pipeline.<br />  <br /> Not all the apparent AI advances use deep learning. At the conference, <a href="" target="_blank">Matthias Wittman at Method Studios</a> also demonstrated his “emotional intelligence” research for advanced character animation in UE4. 
This application doesn’t use machine learning (yet), but it enables highly interactive and realistic acting in secondary characters. The system was designed to run not only interactively, but also in VR. Matthias illustrated the technology (and entertained the audience) by poking characters in the face live in VR. <h3><strong>Trend #5: USD development and adoption</strong></h3> At a technical level, this year’s FMX saw important advances in the pipeline integration of UE4 and the interoperability of assets with other standard industry tools. Programs such as Autodesk Maya, Foundry’s Nuke, and Unreal Engine are all moving to support <a href="" target="_blank">USD</a>, the open-source Universal Scene Description format.<br />  <br /> Most promisingly, NVIDIA has been developing its new <a href="" target="_blank">Omniverse</a> tool. It’s still in its early days, but Omniverse may facilitate full, seamless interchange and universal asset updates for a variety of use cases, from an individual artist running multiple applications to an intercontinental company sharing assets across all its offices in real time.<br />  <br /> USD was pioneered at Pixar for wide sharing of animation, models, and assets. Unlike the <a href="" target="_blank">Alembic</a> interchange format, USD includes layering, referencing, and shading variants for individual assets, among other features. The broad adoption of USD would mean tremendous efficiencies in a host of animation and effects pipelines, including Unreal Engine. <br /> <br /> Epic Games remains committed to the open-source movement and to the <a href="" target="_blank">Academy Software Foundation</a>, and works closely with companies such as NVIDIA to not only maximize stunning onscreen imagery, but also to make UE4 productions more efficient. This helps all productions, from animation to virtual production. 
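To make the variant feature concrete, here is a minimal sketch of a USD layer in the text-based .usda format, defining a hypothetical asset with a shading variant set. The prim names and the variant set are purely illustrative, not from any production pipeline mentioned above:

```usda
#usda 1.0
(
    defaultPrim = "Car"
)

def Xform "Car" (
    # Current selection for the variant set (illustrative names).
    variants = {
        string paintColor = "red"
    }
    prepend variantSets = "paintColor"
)
{
    def Sphere "Body"
    {
    }

    # Each variant carries sparse overrides on the shared geometry.
    variantSet "paintColor" = {
        "red" {
            over "Body"
            {
                color3f[] primvars:displayColor = [(1, 0, 0)]
            }
        }
        "blue" {
            over "Body"
            {
                color3f[] primvars:displayColor = [(0, 0, 1)]
            }
        }
    }
}
```

Switching the paintColor selection (for example, from a stronger layer that references this asset) swaps the displayColor override without duplicating the geometry, which is exactly the kind of per-asset variation a flat interchange format such as Alembic cannot express.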
<br /> <br /> <br /> To take advantage of all that real-time technology has to offer, download <a href="" target="_blank">Unreal Engine</a> and get started with virtual production, real-time ray tracing, and more. You can also visit our <a href="" target="_blank">virtual production hub</a> for interviews, videos, and more insights into the expanding use of real-time technology in film and television production.<br />  CommunityEventsFMX 2019NewsGamesVFXFilm And TelevisionVirtual ProductionMike SeymourWed, 15 May 2019 18:30:00 GMTWed, 15 May 2019 18:30:00 GMT Unreal Engine learning portal preview released can try a preview of the new learning experience, now with easier course discovery, progress tracking, automatic bookmarking, assessments, badges, and more. Check it out now!The next evolution of the Unreal Engine learning portal is almost here, and you can try a <a href="" target="_blank">preview of the new learning experience</a> now. We have over 35 courses available, along with some great new content for this much-anticipated update.<br /> <br /> In addition to new content, we’ve added key features you’ve asked for, including: <ul style="margin-left: 40px;"> <li><strong>Improved search:</strong> Browse courses by topic, author, or industry to more easily find the material you’re looking for.</li> <li><strong>Progress tracking:</strong> See how far you’ve progressed in any course, and easily pick up where you left off. Bookmark your place within a module, and add a course to your favorites, so you can easily revisit it later.  
</li> <li><strong>Quizzes:</strong> Check your understanding with built-in course assessments and make sure you are ready to apply what you’ve learned.</li> <li><strong>Badges: </strong>Earn skill badges as you complete courses.</li> </ul> <div style="text-align: center;"><img alt="images.PNG" height="auto" src="" width="auto" /></div> <br /> To access the preview, <a href="" target="_blank">first log in to your Epic Games account</a>, then select your language preference and time zone. Your Epic ID will be linked to your learning portal login. From there, you can begin your Unreal Engine learning journey. <br /> <img alt="Home-Screen.JPG" height="auto" src="" width="auto" /><br /> <img alt="Preview-enter.JPG" height="auto" src="" width="auto" /><br /> We’re extremely excited to show off this preview version of the new learning experience. We’re striving to make it engaging, rewarding, and, of course, informative. Your feedback is critical to the success of this new platform, so please submit your feedback in the learning portal, or visit <a href="" target="_blank">our forum thread</a> to join the conversation! <br />  LearningNewsEducationMelissa RobinsonTue, 14 May 2019 15:30:00 GMTTue, 14 May 2019 15:30:00 GMT and Twinmotion join forces to offer easy, high-quality real-time visualization’re proud to welcome Unreal Engine-powered Twinmotion to the Epic family. By joining forces, we’re able to make a new version of this fast, easy, high-quality solution more accessible to visualization professionals in the architecture, construction, urban planning, and landscaping industries as a free download.We’re excited to announce that Epic Games and Twinmotion have joined forces! Powered by Unreal Engine, <a href="" target="_blank">Twinmotion</a>, a high-quality, easy-to-use real-time visualization solution, is now part of the Epic family. And the best news? 
Our new version of Twinmotion is absolutely free to download and use for all customers until November 2019. <br /> <br /> Twinmotion makes compelling design visualization easy! Whether you’re in the architecture, construction, urban planning, or landscaping industry, its highly intuitive interface enables you to assign PBR materials, set up lighting, and even choose the season and the weather with just a few clicks. Populate your scene from a library of ready-to-use assets, including animated characters and trees whose foliage blows in the wind. With as few as two clicks, you can create paths of walking people or moving cars, and then vary their appearance to suit your scene. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> If you’re a Revit or ARCHICAD user, life is even easier; you can take advantage of our one-click direct synchronization to convert your scenes in the blink of an eye. We also support FBX, SKP, C4D, and OBJ files. <br /> <br /> Once you’ve prepared your scene, Twinmotion makes it effortless to create images, panoramas, standard and 360° videos, and virtual reality experiences, all from the same content.<br /> <img alt="blog_body_img2.jpg" height="auto" src="" width="auto" /><br /> The new Twinmotion has some differences from Twinmotion 2019, the version previously available from Abvent. Apart from now being free, the latest version offers improved performance, enhanced direct synchronization with Revit and ARCHICAD, and new global illumination glow effects. All users can also now access a shared library for multi-user collaboration, a feature previously only available in Twinmotion Team.<br /> <br /> A small number of features have been removed or changed. 
You can find full details of the differences, and answers to other questions you may have, in our <a href="" target="_blank">FAQ</a>.<br /> <br /> High-quality real-time visualization has never been faster, easier, or more fun. So what are you waiting for? <a href="" target="_blank">Download Twinmotion</a> and give it a try today. It’s absolutely free to download until November 2019, and yours to keep using indefinitely after that.<br />  EnterpriseNewsDesignArchitectureVisualizationTwinmotionDana CowleyMon, 13 May 2019 15:00:00 GMTMon, 13 May 2019 15:00:00 GMT Chinese universities introduce UE4 courses to meet global education market demands the demand for Unreal Engine education rises, more and more Chinese universities are creating UE4-specific courses in game development, art and design, and film.All across the globe, educational institutions are adopting Unreal Engine to elevate their skills and improve workflows for their real-time projects. China is no stranger to advances in 3D graphics technology, and they are quickly becoming a force for innovation when it comes to incorporating Unreal Engine into post-secondary education. 
<h2>Shanghai Jiao Tong University introduces for-credit Unreal Engine course</h2> Last fall, Shanghai Jiao Tong University (SJTU), a major research university in China, introduced its new Unreal Engine 4 elective course “Unreal Engine Program Development and Practice.” This is one of the first for-credit courses specifically focused on Unreal Engine in China.<br /> <br /> The Epic-approved curriculum concentrates on teaching basic game-development concepts and provides hands-on experience to instill the core fundamentals of Unreal Engine, including <a href="" target="_blank">Blueprints</a>, Unreal’s node-based visual scripting language.<br /> <br /> When Shanghai Jiao Tong University debuted its Unreal Engine program, “the class reached maximum enrollment as soon as it became available in the university online system,” says Tianmin Xie, business director of sales for Epic Games China.<br /> <br /> With the success of the first fall course, SJTU launched its second Unreal Engine course this past March. <br /> <img alt="Chinese-EDU-2.jpg" height="auto" src="" width="auto" /><img alt="Chinese-EDU-1.jpg" height="auto" src="" width="auto" /> <h2>Central Academy of Fine Arts combines UE4 with art design</h2> The <strong>Central Academy of Fine Arts</strong>, known for being <a href="" target="_blank">one of the top 30 design schools</a> in the world, debuted a compulsory Unreal Engine course this past March. The course covers the basics of Unreal Engine, <a href="" target="_blank">Blueprints</a>, gameplay and materials, animation, and UI. <br /> <br /> The students in this course create their own VR project with Unreal Engine in just seven weeks. 
Vice Dean Jinjun of the Central Academy of Fine Arts thinks it’s a “fabulous idea to mix Unreal Engine technology with art design,” and he is encouraging more teachers to include Unreal in their courses.<br /> <img alt="Chinese-EDU-3.jpg" height="auto" src="" width="auto" /><img alt="Chinese-EDU-4.jpg" height="auto" src="" width="auto" /><img alt="Chinese-EDU-5.jpg" height="auto" src="" width="auto" /> <h2>UE4 makes its debut at the Beijing Film Academy</h2> Unreal Engine continues to spark curiosity in the film industry due to its hyper-efficient <a href="" target="_blank">virtual production</a> pipeline. The <strong>Beijing Film Academy</strong> opened a compulsory Unreal Engine course this past March. Moving forward, the academy will also offer a postgraduate Unreal Engine class. <br /> <br /> <strong>Shanghai Vancouver Film Academy</strong> has also jumped into the world of real-time: it recently opened a Blueprints course and launched an Unreal Engine C++ course this semester.<br /> <img alt="Chinese-EDU-6.jpg" height="auto" src="" width="auto" /><br /> At Epic, we are excited about the increased adoption of real-time technology in educational institutions around the world. The demand for real-time and Unreal Engine skills continues to grow and is significant not only in game development, but also in other disciplines such as architecture, TV/film, industrial design, and computer science. We’re continually partnering with universities to integrate real-time technology more broadly across multiple disciplines to prepare students for careers in these industries.<br /> <br /> Are you looking to incorporate Unreal Engine into your teaching curriculum? 
We have free, Epic-approved instructor guides available for download <a href="" target="_blank">here</a>.<br />  EducationNewsMelissa RobinsonMon, 13 May 2019 14:00:00 GMTMon, 13 May 2019 14:00:00 GMT in a Teacup invites players ‘Close to the Sun’ in its new horror adventure title by the works of Nikola Tesla, Close to the Sun is a compelling indie horror game with a stunning steampunk aesthetic.Italian studio Storm in a Teacup hopes to turn heads with its ambitious fourth title, <a href="" target="_blank">Close to the Sun</a>. A horror adventure loosely centered on the works of Nikola Tesla, Close to the Sun showed immense potential early in development and was a recipient of an <a href=";utm_content=Oktopost-twitter&amp;utm_medium=social&amp;utm_source=twitter" target="_blank">Unreal Dev Grant</a>.<br /> <br /> With its brooding steampunk aesthetic, Close to the Sun could be mistaken for something torn right from the pixels of BioShock, but apart from the similar art deco backdrops, that’s where the similarities end. As there are no weapons, Rose, the game’s largely defenseless protagonist, will have to rely on her wits and puzzle-solving skills if she wants to survive a creepy voyage on Nikola Tesla’s fictional floating ship. Emphasizing intelligent play over straightforward action, Close to the Sun pushes to stand tall on its own.<br /> <br /> Sitting down with Storm in a Teacup founder Carlo Ivo Alimo Bianchi, who has worked at companies like Ubisoft, Crytek, and Square Enix, we discussed his reasons for venturing out on his own as an indie developer, the benefits of developing the game with Unreal Engine 4, and what the Unreal Dev Grant meant to him and his team. 
<div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong>Storm in a Teacup is a relatively young studio with three other games under its belt within a six-year period. Tell us how the studio came together and why you wanted to pursue indie development.</strong><br /> <br /> I had been working abroad for several years with companies like Ubisoft, Metricminds, Crytek, and Square Enix, and I decided to go back home to Rome for family reasons. Once home, I faced several different opportunities, and among them was the chance to open my own game company. I knew it would be difficult and tiring, but I decided that was the path to follow at that specific moment in my life. <br /> <br /> The company was started knowing that the first projects had to be preparation for something bigger. When the time came, after three smaller projects, I decided that the fourth one was going to be the big one. Needless to say, I was scared as hell because Close to the Sun was both huge and expensive. For that reason, we implemented big changes, including relocating to a new office, hiring more people, and restructuring the company with additional managers. After two years of development, through blood and sweat, I can proudly say we made it; not just that, we made it bigger than we had initially anticipated.<br /> <br /> <strong>Those are some big changes indeed! So, with the three smaller games in your rearview mirror, what made Close to the Sun your “big one?”</strong><br /> <br /> Close to the Sun is our attempt to enter what we like to call the “big indie” market with our AA title. It’s a market we feel isn’t too crowded, because indie companies usually tend to develop either very small or very big projects. There’s lots of room left in that middle area. 
<br /> <br /> If you think about the horror/adventure niche and the size of other titles like Outlast or SOMA, you have to admit there aren’t many games out there that fit the same bill. Clearly, that shows that developing a game like Close to the Sun is a gamble on so many levels for a small company like ours. You really need to believe in what you’re doing. <br /> <br /> Taking a look at some of the famous titles that inspired us along the way, we knew we wanted to incorporate some of their elements into our game. We studied in-depth what we wanted, but also looked for what really worked in other titles alongside what didn’t. After a lot of tailoring, we came out with something truly unique. This means, we hope, that you’ll find Close to the Sun different from any other indie horror/adventure you’ve played, while still feeling familiar and resonant. We’ve invented a new universe, but not a new genre, of course.<br /> <img alt="Screenshot_21.jpg" height="auto" src="" width="auto" /><br /> <strong>In comments I&#39;ve read, fans draw a lot of parallels to the BioShock series. Was BioShock an inspiration at all? If not, were there any other games that were?</strong><br /> <br /> Personally, I only played the first couple hours of the first BioShock, and I never finished BioShock Infinite. Our Art Director never played any of the games in the BioShock saga, but in the end, I’d still say there was some inspiration. What we can say for sure is that after a ton of research on art deco, art nouveau, and steampunk, we arrived at stylings that were similar to BioShock. When people started making comparisons internally, our Art Director and I checked out everything we could find online about the series to make sure that our game would be as different a take as possible. 
The result is something that reminds people of BioShock, but definitely is not BioShock in any form.<br /> <br /> <strong>Close to the Sun is also inspired by the works of Nikola Tesla. The game takes place on Tesla&#39;s own fictional ship, so how did you get the idea to add the horror and survival elements?</strong><br /> <br /> At the very beginning, we had months of meetings before we even started to model anything. We literally got everyone involved in the initial creative process. We knew we wanted to make a horror adventure with elements of survival, but we weren’t sure if we had to take the classic “haunted mansion” path or if we wanted something more than that. Ultimately, we decided that since the project was so huge and risky for us, we couldn’t afford to take shortcuts at any point and focused on creating something new and fresh. <br /> <br /> Keeping those thoughts in mind, we moved the action from a standard mansion to a huge ship, and that’s also how we came to add a historical figure to the equation. We knew we needed someone to pull the strings, something or someone that everyone knows, even just by name, and that could reconcile our need for a historical “excuse” with the player’s need for a “deus ex machina.” <br /> <br /> When you make a haunted mansion game, you don’t need logic. You don’t need to explain why objects levitate or why there are monsters; it’s simply accepted as fact. In our game, everything is logical and scientific, so we needed a scientist pulling strings from above, and the choice of Tesla was perfect. I’ll add to this that in my personal and very humble opinion, Tesla was one of the biggest minds in history, and yet he died rather pitifully. I wanted to give him some glory.
Maybe we gave him too much glory, and funnily enough, that’s why the name of the game is inspired by the story of Icarus.<br /> <img alt="Screenshot_17.jpg" height="auto" src="" width="auto" /><br /> <strong>The game&#39;s protagonist, Rose, is mostly defenseless, with no access to weapons in the game. What kind of gameplay mechanics did you rely on to keep things interesting for players when combat isn’t really an option?</strong><br /> <br /> The first thing we created was a universe. We didn’t just place a character in a ship and cross our fingers for it to be entertaining. This means that under every mechanic in the game, we have a structured layer of lore, dead characters, living characters, betrayal, revenge, and so on. This isn’t so much a mechanic, but it’s extremely important to understand that it’s the base of every mechanic present in the game. <br /> <br /> When we started thinking about puzzles and blockers, it all became very clear — the game itself is based on the fact that everything must be believable, so puzzles and other mechanics had to be as well. We ended up with puzzles that don’t quite look like puzzles and mechanics that don’t quite look like mechanics because they’re strongly tied to the environment surrounding the player. This was really interesting because we achieved exactly what we wanted and we did it with our specific style. It was a win-win situation both for us and for players, but it was indeed more complex than just spawning ghosts in a mansion. Very, very complex. So complex that we discovered we couldn’t have a single core mechanic and just use it during the entire game. If we really wanted something believable and strongly tied to the environment, then every single situation in the game had to be different. So, in the end, the game has many levels, all extremely different from each other; every environment has different situations with different problems, and every problem has its own solutions.
<br /> <br /> <strong>Close to the Sun is a recipient of the Unreal Dev Grant. How did that extra funding benefit development, and what did that support mean to you as a studio?</strong><br /> <br /> For a small studio like Storm in a Teacup, the Unreal Dev Grant made a huge difference! We could simply say that the extra monetary infusion helped us recoup a month or so of costs for the production, but it would be very limiting just to say that. Numbers are numbers and we can’t do anything about that. The Dev Grant repaid us one month of production (yes, game development is an expensive business), but there is something more to acknowledge, something much more important than money itself. <br /> <br /> When we won the grant, we understood we were really on a good path with Close to the Sun. Team morale skyrocketed, and we pushed even harder on the production. Passion is a beautiful thing to work with, but knowing that what you’re doing is recognized as being good fuels that passion and allows you to go the extra mile. I suppose this is what the Dev Grant is about. It’s not just about the amount of money, but also about the immediate feedback you get from being acknowledged. I honestly don’t know how to explain how happy we were about the Dev Grant. It meant the world to us at a very stressful moment.<br /> <img alt="Screenshot_10.jpg" height="auto" src="" width="auto" /><br /> <strong>Visually, Close to the Sun has a haunting and gorgeous steampunk aesthetic. How did Unreal Engine 4 help the team create this vivid world?</strong><br /> <br /> We evaluated different engines for the game, and the choice to use Unreal Engine came very easily. In today’s world, where quality is everything and speed is even more important, we had to find a compromise between quality and speed. We found that with Unreal Engine 4, we didn’t have to compromise — we could have quality and speed at the same time. This is quite simply the truth.
Unreal Engine 4 gave us more quality and made it happen faster than other engines. <br /> <br /> I, myself, am a lighting and postfilter artist, and I’ve been working on huge AAA games for more than 15 years now. I know how much work it takes to make a game beautiful. With other engines, it would have been a gamble for us to aim for such quality with the budget and timeframe we had. <br /> <br /> In some ways, Unreal Engine’s graphics engine allowed us to make beautiful graphics with just a few clicks. The <a href="" target="_blank">Blueprints</a> system allowed us to quickly prototype, the native integration with Perforce allowed us to use our favorite versioning system, the debugging was easy, the <a href="" target="_blank">[source] code</a> is open, and we could develop anything we wanted on top of the original engine. There are so many reasons why we chose Unreal Engine, and we would never want to take a step backward. That’s without even mentioning the incredible support from the entire Epic team, on both the marketing and technical sides. I’ve made many choices in my life that I regret; choosing Unreal Engine 4 for Close to the Sun is not one of them.<br /> <img alt="Screenshot_20.jpg" height="auto" src="" width="auto" /><br /> <strong>Do you have a favorite feature in Unreal Engine 4? What was your most valuable tool?</strong><br /> <br /> For me, it’s the <a href="" target="_blank">rendering</a> system. It’s so powerful that at some points in the production, we started asking ourselves if we were really that good or if there was a saint helping us! Let’s say it’s a mixture of both!<br /> <br /> <strong>Were there any Unreal Engine 4 resources you found particularly helpful while developing Close to the Sun?</strong><br /> <br /> Simply put, the open code. We have an engine programmer who comes from a AAA background as well, and he could put his hands into everything.
We’ve developed tons of technology and we’re even selling some on the <a href="" target="_blank">Marketplace</a>. <br /> <br /> <strong>Where are all the places people can go to keep up with both Storm in a Teacup and Close to the Sun?</strong><br /> <br /> Thanks for asking! You can follow us on Twitter at <a href="" target="_blank">@stcware</a>, follow our <a href="" target="_blank">Facebook page</a>, or drop by our websites at <a href="" target="_blank"></a> and <a href="" target="_blank"></a>.<br />  Close to the SunGamesStorm in a TeacupUnreal Dev GrantsShawn PetraschukFri, 10 May 2019 15:00:00 GMT Automotive visualization for self-driving Robocar wins the day for AltSpace. AltSpace’s touch table presentation for Robocar, the world’s first self-driving electric race car, not only delighted automotive enthusiasts but also got the studio on the podium in this year’s Unreal Awards.<img alt="blog_body_img1.jpg" height="auto" src="" width="auto" /><br /> <a href="" target="_blank">AltSpace Studio</a> started out in 2012 with the goal of bringing technical aesthetics to engineering. With offices in London and Moscow, their CGI videos and interactive projects have showcased everything from superyachts to spacecraft. <br /> <img alt="blog_body_img4.jpg" height="auto" src="" width="auto" /><br /> AltSpace specializes in industrial projects in aerospace, yachting, and automotive, with racing cars being one of the most prominent categories. One of their recent projects was for the <a href="" target="_blank">FIA Formula E Championship</a>, a racing series exclusively for electric vehicles.
Another was for <a href="" target="_blank">Roborace</a>, a competition with a significant distinction—Roborace will include both manually-driven and autonomous vehicles.<br /> <img alt="blog_body_img2.jpg" height="auto" src="" width="auto" /><br /> In Roborace, all teams will use the same physical vehicle, the Robocar, and each team will develop their own AI technologies and real-time algorithms to drive it. Naturally, automotive enthusiasts are eager to learn about Robocar’s design inside and out, and AltSpace was just the team to showcase its many facets.<br /> <br /> AltSpace was commissioned to create a touch table presentation where visitors could interactively reposition the car to look at it from any angle, and remove layers to peek under the hood. The interactive experience, powered by Unreal Engine, garnered them <a href="" target="_blank">second place in the Unreal Awards Manufacturing category</a>. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> AltSpace credits the team’s engineering background with giving them the right insights for the project. “We completely recreated the car in digital form from drawings and photographs, which was conducted in parallel with the creation of the car itself,” says Oleg Butov, Art Director at AltSpace. “The viewer can break down the car and study its power structure, the drive train, the sensors, everything all the way down to the NVIDIA processor.”<br /> <img alt="blog_body_img5.jpg" height="auto" src="" width="auto" /><br /> AltSpace sees Unreal Engine as an important component in their business’s shift to automotive. “We always wanted to showcase our achievements and progress in the field of interactive experiences,” says Igor Voloschuk, CEO of AltSpace. 
“Using UE has played a crucial role in AltSpace’s development and opened an opportunity for creating sports car touch tables for Roborace as well as Formula E.”<br /> <img alt="blog_body_img3.jpg" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>Designer Daniel Simon playing with the Roborace touch table</em></div> <br /> With their recent success using Unreal Engine, AltSpace sees themselves going deeper into real-time technology and branching out to new types of presentations. “Our passion for visualization has always led us to the point of outperforming ourselves, and we hope to retain this constant hunger for understanding and shaping what’s next,” says Voloschuk. “In the near future, we plan to explore blending solutions for live-action videos and broadcasts using UE.”<br /> <br /> <br /> Want to create your own real-time interactive experiences? <a href="" target="_blank">Download Unreal Studio</a> today and get Unreal Engine plus import tools, video training, and more.<br />  AltSpaceAutomotiveDesignEnterpriseManufacturingRoboraceVisualizationUnreal AwardsDoug WolffThu, 09 May 2019 14:00:00 GMT Virtual production: motion control and real-time preview at Stiller Studios. Old-school directors meet new-school virtual production technology and giant motion control robots at Stiller Studios—and Unreal Engine provides the full picture, in real time, on set.Located in Sweden, <a href="" target="_blank">Stiller Studios</a> is one of the most technically advanced VFX studios in the world. It can offer its clients advanced motion control and motion capture equipment, housed within a huge pre-lit green-screen space, all tied together with live real-time preview on set, courtesy of Unreal Engine.
What makes this all the more interesting is that CEO Patrik Forsberg describes himself as “an old-school director.”<br /> <img alt="blog_body_img1a.jpg" height="auto" src="" width="auto" /><br /> In this podcast hosted by <a href="" target="_blank">fxguide’s</a> Co-Founder Mike Seymour, Forsberg describes how Stiller Studios helps old-school directors work in a “new-school” environment, giving them the ability to see their live-action footage perfectly in sync with their CG imagery, in real time, on set. You can listen to the full podcast below, or read on for an overview.<br /> <br /> <iframe allowfullscreen="" height="90" mozallowfullscreen="" msallowfullscreen="" oallowfullscreen="" scrolling="no" src="//" style="border: none" webkitallowfullscreen="" width="100%"></iframe><br /> <br /> Stiller Studios focuses on intricate motion control work, where virtual and real camera positions and paths need to be perfectly matched and output in real time as usable data. But how did this all start?<br /> <br /> In the early days, Forsberg was grappling with getting the perfect green-screen shot. He went to <a href="" target="_blank">Mark Roberts Motion Control</a> in London to purchase a small robot for moving his camera a meter or two, and came back with a <a href="" target="_blank">Cyclops</a> motion-control rig weighing in at 4.6 tons and with the ability to reach up to six meters in the air.<br /> <img alt="blog_body_img_cyclops.jpg" height="auto" src="" width="auto" /><br /> “They had this giant beast standing right behind me, and I guess it was something that they thought of, because your age doesn&#39;t really matter—you&#39;re always gonna be a boy,” laughs Forsberg. “And I got home with that 4.6-ton unit.”<br /> <br /> Once the Cyclops was installed in the studio, it was initially hooked up to some Autodesk products. 
However, in the last five or six years, Stiller Studios started working with real-time engines, and today the motion control data goes directly into Unreal Engine. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> “It&#39;s like having a 16-meter track with a six-meter crane on it, but it sends out FBXs and camera positions, lens data, everything, so you can live-mix it in Unreal,” says Forsberg. “You can see live action on top of computer-generated imagery in perfect sync. What we want to provide is a way for the director to see the full picture [...] on the screen, and that includes light directions, shadows, interactions with everything.”  <h3><strong>Real-time on-set review</strong></h3> In order to see that full picture, directors are given an iPad so they can walk around the virtual set and see it fully rendered in UE4. They can even use the iPad as a virtual camera, recording a camera path that can be precisely repeated by the real camera mounted on the Cyclops. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> “We let the director or the DOP move [the iPad] on the set; he will see exactly how the camera is going to move, and then—only seconds later—the 4.6 ton beast does that exact move again,” explains Forsberg. “After 11 years of doing this, it&#39;s still cool to see.”<br /> <br /> This repeatability is a key factor in pulling off some of the more complex shots the studio undertakes, and was used to great effect in the award-winning short film <a href="" target="_blank">Echo</a> by Victor Perez. The time-bending, bullet-time-like mirror sequences that give the film its unique character required shooting the same subject from different points of view on the same set and then combining them. 
For this, Stiller Studios employed a second high-speed motion-control rig, the <a href="" target="_blank">Bolt</a>, alongside the Cyclops.<br /> <img alt="blog_body_img_echo.jpg" height="auto" src="" width="auto" /><br /> “You split the bullet time in some kind of 3D depth, so you would have one camera that gives you live action, the other one would give you bullet time, and they would swap back and forth,” explains Forsberg. “So one of the time layers would bend, but within one frame, and that&#39;s only doable because of the motion controls, because of us being able to program them so they go back and forth in time and in camera speed, and then ultimately sync up again.” <h3><strong>Digital doubles: recreating technical setups in UE4</strong></h3> In addition to the two camera rigs, Stiller Studios also has a computer-controlled motion base for actors, capable of taking up to one ton. This is useful for simulating movement of live actors in a helicopter or plane, or a real person riding on the back of a CG creature. For this kind of shot, UE4 is also used to visualize the technical aspects of the setup, with digital doubles of the rig and camera in the engine.<br /> <br /> “To us, Unreal is very much a techvis, an on-set-vis, and a previs tool,” says Forsberg. “If we put a flying car in there, or a flying carpet, then we can play around with it in real time. We have a miniature copy of the model mover, so we can play around with it like a six-axis joystick and move it around, and we can see that live in Unreal as well.”<br /> <br /> While this highly advanced setup is intended to be used for serious work, the team at Stiller Studios can’t resist having a little fun. <br /> <br /> “It&#39;s one of the best toys in the world,” says Forsberg. “Late Fridays, you can sit on top of that motion base and be flying an F16, or whatever. When you hook it up to the Unreal Engine, it&#39;s like you got the biggest flight simulator in the world. 
It&#39;s the same technology that you would use for big airlines and everything, but we&#39;ve got it hooked up to the film system or to the camera system.”<br /> <br /> It’s not just the rig and the camera that have digital doubles. Stiller Studios has also made virtual copies of their approximately 30 RGB LCD lights. “We can match the 3D and the live action, and that&#39;s really helpful for DOPs and for directors to get a sense of what&#39;s going to come out,” says Forsberg. <h3><strong>Perfectly synced output to post</strong></h3> While the real-time output from some projects is used as final pixels, for many others the on-set experience is more about getting the shots nailed, and there’s still more work to be done in a traditional VFX post-production pipeline. Making sure the assets transition usefully to that final stage is critically important, and perfect synchronization is key.<br /> <img alt="blog_body_img_UE4.jpg" height="auto" src="" width="auto" /><br /> “Let&#39;s say you have the camera on the motion control and then you get your FBXs out of your Unreal,” says Forsberg. “You might be working with virtual actors as well, using the motion capture system. What we take pride in is delivering files that sync perfectly together so the artists can be artists [...] and their job will be to make it more beautiful.” <h3><strong>Greater creative freedom</strong></h3> One advantage of the approach Stiller Studios is taking: because teams can actually visualize the final shot on set, creatives are free to experiment and actors can be more spontaneous.<br /> <img alt="blog_body_img2_UE4.jpg" height="auto" src="" width="auto" /><br /> “It makes the creatives braver, which is good for the film because, back in the days, you would take something that you would be sure would work later,” says Forsberg. “Now that you get the real-time feedback, you can actually test things and push limits.
<br /> <br /> “To me, a shot is so much more than the actor in it. It&#39;s the way the actor interacts with the composition and everything. And that too can be decided upon on set. One of the most important places to have the VFX person is on set, because that&#39;s where we make decisions and that&#39;s where we try to do the right thing.” <h3><strong>Loving the challenges</strong></h3> While Stiller Studios makes combining all of this complicated engineering look easy and fluid, it’s no mean feat. For example, as Forsberg says of calibrating the equipment, “I had one of my technicians doing some calculations, and he told me there were fifteen point something million ways of doing this wrong, and only one way to do it right!”<br /> <br /> But Forsberg shows no sign of losing his enthusiasm for the challenge. “After 11 years of doing it, it&#39;s still like going to the playground every single day,” he says.<br /> <br />  <br /> This podcast interview with Patrik Forsberg is part of our <a href="" target="_blank">Visual Disruptors</a> series. Visit our <a href="" target="_blank">Virtual Production</a> hub for more podcasts, videos, articles, and insights.<br />  EchoEnterpriseFilm And TelevisionPrevisStiller StudiosVirtual ProductionVirtual SetsBroadcastBen LumsdenWed, 08 May 2019 17:30:00 GMT Brief Battles is a cheeky party game brought to life with Unreal Engine 4. Developer Juicy Cupcake plants their tongues firmly in their cheeks for Brief Battles, a light-hearted and family-friendly experience aimed at all ages.<a href="" target="_blank">Brief Battles</a>, the first commercial project from the two-man team at <a href="" target="_blank">Juicy Cupcake</a>, is the culmination of two men’s journey from their day job at a liquor store to the world of indie game development. This side-scrolling brawler is all about playing with friends while having your in-game personas wield the various powers that charged-up underpants bring to battle.
The game’s ridiculousness is only matched by its fun. <br /> <br /> Originally built as a concept in Unity before making the transition to Unreal Engine 4, Brief Battles sports an absolutely massive lineup of 50 arenas at launch and utilizes a few of Unreal’s best tools along the way to speed up iteration. Focusing on the joy of playing locally with friends, Brief Battles combines fast-paced action with gut-busting humor while keeping things PG so the game can be fun for all ages.<br /> <br /> We chatted with one half of Juicy Cupcake, Creative Lead Andrew Freeth, to find out why they chose the party game genre for their first foray into development. He also discusses where his love for couch co-op games comes from, elaborates on how the studio effectively utilized Unreal Engine 4, and explains what inspired him to weaponize underpants. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong>What inspired you to go into indie game development?</strong><br /> <br /> Playing classic PC titles like Jazz Jackrabbit and Commander Keen as a kid, I got it into my head that one day I wanted to make games like that. Tim (the other half of Juicy Cupcake) felt the same way when he first got his hands on the original Prince of Persia and other classics like Wolfenstein 3D. John Romero and John Carmack were his heroes. <br /> <br /> We were both new to the industry when starting out. Throughout school, I spent weekends messing around in Blender 3D and making mini-games in Flash and eventually Torque, with the goal of one day putting together an indie game team. I love to play games, but the appeal to make them has always been greater. <br /> <br /> As a kid, Tim was into modding Doom, Quake, and Elder Scrolls, and made games from scratch with C++ in his spare time.
After high school, he studied 3D modeling and animation and was excited to finally put those skills to use commercially upon joining Juicy Cupcake.<br /> <br /> We eventually met each other working at our day job in a liquor store. Not long after that, Juicy Cupcake was founded and we started to catch up outside of work to teach ourselves how to make games. <br /> <br /> <strong>Do either of you have previous experience working with Unreal Engine? </strong><br /> <br /> Aside from Tim messing around in the original Unreal Engine 1 editor, our first experience with UE was getting into online tutorials and messing around in UDK years ago, when we were still fresh into game development. I was blown away by the possibilities and toolset available, though it was a little overwhelming to adopt at the time. <br /> <br /> Fast forward to when Unreal Engine 4 was released: we’d been working on a 3D adventure platformer proof of concept in Unity. The core mechanics were a lot of fun, but we had blind ambition and lofty goals to turn the project into a 3D platformer in a seamless world without loading screens. The tools to achieve this were already in Unreal Engine, so we spent a week or two bingeing online tutorials, then commenced rebuilding our project in Unreal. <br /> <br /> Even using the early versions of UE4, the transition was surprisingly easy. Having <a href="" target="_blank">Blueprints</a>, node-based <a href="" target="_blank">material</a> editing, great <a href="" target="_blank">landscape tools</a>, fast <a href="" target="_blank">lightmapping</a>, and <a href="" target="_blank">GUI tools</a> built into the engine made adapting to the new toolset a seamless experience. <br /> <br /> After rebuilding our aforementioned proof of concept in Unreal, we evolved it into the cheeky butt-em-up party game that Brief Battles is today.
<br /> <img alt="Dusty-Desert-Co-op.png" height="auto" src="" width="auto" /><br /> <strong>For people hearing about Brief Battles for the first time, could you provide a high-level overview of the game’s premise? </strong><br /> <br /> Sure thing! Brief Battles is a fast-paced underwear-fueled party game where players use their underpants to fight friends, family, or foe! <br /> <br /> Local multiplayer is at the core of the experience, allowing you to grab up to three friends to battle in underpants-themed game modes. “Hold the Gold - Bare Buns” is probably my favorite at the moment. Everyone starts out pantless, fighting for a hefty pair of golden undies. It’s sure to evoke some laughs and shouting at your friends.<br /> <br /> There’s a heap of solo and co-op content that you can play to improve your underwear mastery and unlock “Underwearrior” crossover mash-ups from the likes of Yooka-Laylee and CommanderVideo to name a few!<br /> <br /> You can battle it out with buns of steel, toxic tighty-whities, flaming hot pants, underpants of protection, icy undies, and leopard-print undies in 50 unique and sometimes treacherous arenas. <br /> <br /> Brief Battles is easy to pick up, but rewarding to master. Trophy hunters can push their skills to the limit to reach the hardest challenge goals, while accessibility options open the game up to everyone.<br /> <br /> We recently released Brief Battles on PS4, Xbox One, and PC!<br /> <br /> <strong>So, what was the inspiration behind Brief Battles (aside from the fantastic play on words)?</strong><br /> <br /> We’re both huge fans of couch co-op and party games! Brief Battles was created on a foundation of love for local multiplayer, and the times we’ve shared with friends and family playing games in that space. <br /> <br /> Brief Battles started out as a simple arena brawler based on a fast-paced platformer mechanic that allowed roof and wall climbing and some pretty wild agility. 
We had four of the same pink blob-like jiggly characters running around in tighty-whities, eating melon and spitting melon pips at each other with power-ups. <br /> <br /> While imagining what this local multiplayer prototype could become if we developed it in full, we gradually embraced the power of the butt, reworking everything to be based around super-powered underpants and the posterior of players. <br /> <br /> Without imagery, reading about a game where players battle it out in their undies can lead to some interesting ideas about what our title looks like. In reality, it’s a cheeky, light-hearted, and family-friendly underwear-laden experience aimed at all ages. <br /> <br /> <strong>Brief Battles contains quite a few arenas to play in, so how did Unreal Engine 4 help streamline the process of creating so many unique spaces?</strong><br /> <br /> Yeah, we’re at 50 arenas for launch and we’re still working on more ideas behind the scenes! <br /> <br /> Every arena we work on has to be greyboxed, playtested, and tweaked heavily before we approve it for an art pass. We’ve found <a href="" target="_blank">Binary Space Partitioning</a> (BSP) brushes to be ideal for this. Every arena was initially built with BSPs so that we could tweak everything easily. Once an arena was just right, we’d convert the BSPs to meshes and keep them for collision. <br /> <br /> When it comes to populating arenas with assets, the default editor tools have been great. The flexibility of viewports and speed of the interface have been especially useful. For new regions, we’ve made use of <a href="" target="_blank">material instances</a> for quick changes without compile times getting in the way. <br /> <br /> <strong>Does the studio have a favorite UE4 tool?</strong><br /> <br /> Blueprints, hands down! The speed, flexibility, and complete integration of Blueprints is insane. Brief Battles runs on 90 percent Blueprints.
I can’t imagine living without the ability to make quick visual changes and debug on the fly. <br /> <br /> It’s a dream tool that really makes tweaking the functionality of our game accessible to me as a designer. <br /> <img alt="Precarious-Peaks.png" height="auto" src="" width="auto" /><br /> <strong>Brief Battles isn&#39;t just a couch co-op arena battler; it also offers extra challenges and a single-player mode. Was it difficult to incorporate so many different mechanics into the game? Did Unreal Engine help make this development process easier in any way?</strong><br /> <br /> We have a lot of data to handle that helps drive our many solo and co-op levels. We originally stored this data in a structure in the game instance, which became cumbersome. To solve this, we migrated all this info into Data Tables, making data management a breeze. <br /> <br /> Using <a href="" target="_blank">Blutilities</a> (Blueprint Utilities), we were able to automate time-consuming and repetitive tasks when creating and editing levels.<br /> <br /> <strong>Now that you&#39;ve released Brief Battles, what have you learned that you think will be of great use to the studio as you look forward to developing your next game?</strong><br /> <br /> We’ve squeezed in a lot of experience during the development of Brief Battles. Along with it being our first full commercial title, we’ve learned about marketing, localization, development, and console porting for all major platforms coupled with ratings and legal requirements, crowdfunding, conventions, and more! <br /> <br /> The combination of all this experience has been phenomenal, and we’ll use it all when looking at new and exciting projects, though we’re always going to push ourselves to learn more with whatever we do next.
<br /> <img alt="Crystal-Caverns.png" height="auto" src="" width="auto" /><br /> <strong>Looking back on all that you&#39;ve learned, what would you say to developers looking to jump into Unreal Engine for the first time? Any advice for first-time developers that you wish you had when you started?</strong><br /> <br /> If you’re looking to make the jump to Unreal and it’s the right time for your team, or if you’re just getting into game development, don’t be afraid to take the plunge. The Unreal Engine community and staff are incredibly supportive, and there are so many great resources online to help you get started. <br /> <br /> As for advice that I wish I’d had when I started? I wish I’d followed <a href="" target="_blank">@HighlySpammable</a> on Twitter sooner! They are always sharing neat tricks and hotkeys that would have made development easier. Also, do yourself a favor and work through as many tips-and-tricks tutorials as possible before diving into development. There are so many hidden hotkeys and tools in Unreal that can speed things up. <br /> <br /> <strong>Where are all the places people can keep up with Brief Battles and Juicy Cupcake?</strong><br /> <br /> Embrace the power of the butt at <a href="" target="_blank"></a> to learn more about Brief Battles and our team!
It’s now available on PS4, Xbox One, and PC.<br /> <br /> You can follow us for more cheeky updates on Twitter <a href="" target="_blank">@thejuicycupcake</a> or on <a href="" target="_blank">Facebook</a>/<a href="" target="_blank">Instagram</a>.<br />  Brief Battles | Juicy Cupcake | Games | Shawn Petraschuk | Tue, 07 May 2019 14:30:00 GMT Microsoft Build: Epic Games Recreates the Apollo 11 Mission for HoloLens 2 with Unreal Engine 4. Note: The Apollo 11 demo was intended to be presented onstage at Microsoft Build. The live showcase has been postponed due to unforeseen onsite technical issues. Although we were unable to show the Apollo 11 experience onstage today, we&#39;re excited to help others understand the potential of using HoloLens 2 to learn and share stories in entirely new ways.<br /> <br /> Creative communities across entertainment, visualization, design, manufacturing, and education eagerly anticipate Unreal Engine 4 native support for HoloLens 2, which Epic has confirmed will be released by the end of May. To kick off Microsoft Build, the Unreal Engine team unveiled a remarkable interactive visualization of the Apollo 11 lunar landing, which celebrates its 50th anniversary this year.<br /> <br /> ILM Chief Creative Officer John Knoll and Andrew Chaikin, space historian and author of <em>Man on the Moon</em>, present the multi-user HoloLens 2 demonstration, which recreates the historic 1969 event in meticulous detail. The demo presents a vision for the future of computing in which manipulating high-quality 3D content using a headset is as accessible as checking email on a smartphone.
<div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <div style="text-align: center;"><em>ILM Chief Creative Officer John Knoll and Andrew Chaikin, author of Man on the Moon and space historian, during a May 5th rehearsal</em></div> <br /> The lifelike experience offers a bird’s-eye view of many aspects of the Apollo 11 mission, including the launch itself, an accurate model of the Saturn V, a detailed reenactment of the lunar landing, and a look at Neil Armstrong’s first steps on the moon, reconstructed from data and footage of the mission. Highlights that would be impossible to convey with this level of detail in any other medium include the three stages of the Saturn V, the form-follows-function design of the Eagle lander, and the lunar module’s suspenseful descent to the moon’s surface.<br /> <br /> “When we combine the power of HoloLens with the power of Azure, our partners can deliver transformative solutions. From automotive to manufacturing, from architecture to healthcare, our customers need highly precise and detailed representation of their 3D content,” said Alex Kipman, technical fellow in Microsoft’s Cloud and AI group. “Epic just showed us how to directly stream high-polygon content, with no decimation, to HoloLens. Unreal Engine enables HoloLens 2 to display holograms of infinite detail, far in excess of what is possible with edge compute and rendering alone.”<br /> <img alt="UE_HoloLens2_LunarLanding_Demo_Pic1.jpg" height="auto" src="" width="auto" /><br /> The demo’s visuals stream wirelessly using <a href="">Unreal Engine 4.22</a> on networked PCs to two HoloLens 2 devices using <a href="">Azure Spatial Anchors</a> to create a shared experience between Knoll and Chaikin, offering a glimpse into the potential of photorealistic, social mixed reality experiences.
The two presenters collaborated in the environment, interacting with the same holograms in the same space – an exchange that looked and felt simple and seamless, but was in fact highly complex.<br /> <br /> The demo also takes advantage of HoloLens 2’s instinctual interactions, whereby users can naturally move their heads and hands to touch and manipulate holograms in front of them; for example, bringing hands together and pushing them apart to see the unique components of the Saturn V rocket as detached, individual units.<br /> <img alt="UE_HoloLens2_LunarLanding_Demo_Pic2.jpg" height="auto" src="" width="auto" /><br /> Finally, Unreal Engine 4’s support for <a href="">Holographic Remoting</a> brings high-end PC graphics to HoloLens devices. The Apollo 11 demo features a staggering 15 million polygons in a physically based rendering environment with fully dynamic lighting and shadows, multi-layered materials, and volumetric effects. Aerospace | AR | Enterprise | Features | HoloLens | Microsoft | MR | News | Visualization | Training And Simulation | Dana Cowley | Mon, 06 May 2019 20:00:00 GMT Tigertron’s Jupiter & Mars is leading the way for “Games that Inspire Change.” Jupiter & Mars tackles a world where the polar ice caps have melted in this stunning underwater adventure. Beneath their ever-shifting surfaces, our oceans contain some of the world’s greatest creatures and mysteries, and they also serve as the location for Tigertron’s upcoming PlayStation 4 and PSVR title, <a href="" target="_blank">Jupiter & Mars</a>. In the game, humans are but a distant memory, the polar ice caps have melted, and the world as we know it has been swallowed whole by the ocean.
Despite our absence, the human species leaves behind a legacy, one that threatens the oceans, and it’s up to dolphins Jupiter and Mars to quell the threat.<br /> <br /> Working to provide entertainment with a splash of education, Tigertron strives to create “Games That Inspire Change,” according to their company motto. In doing so, the studio pulls its inspiration from the world around us in an effort to craft experiences that keep players engaged but also provoke them to pause and reflect on current environmental scenarios.<br /> <br /> Created with Unreal Engine 4, Jupiter & Mars is a neon-soaked underwater utopia focusing on unique gameplay elements such as echolocation to bring its environments to life. No stranger to the glow of bright lights contrasted by darkness, Creative Director James Mielke draws on his past experience as a scuba diver and as a developer of games such as Child of Eden and Lumines to create something bold and wondrous for an entirely new audience. We spoke with James alongside artists and programmers from the Jupiter & Mars development team to discover what they hoped to accomplish with the game’s message and discuss how Unreal Engine helped them achieve their goals. <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong>Tigertron was founded in 2015 with a motto to create “Games That Inspire Change.” Tell us how the studio came together and why this message is so important to you as a developer.</strong><br /> <br /> <strong>James Mielke:</strong> When I was around 14-15 years old, I did two things that influenced my outlook on life — I joined the Boy Scouts (later than most kids, surely) and I was certified as a scuba diver. These things put me in touch with our natural world, both above and beneath the ocean’s surface.
My interest in marine biology, biodiversity, and all things related almost led me to apply to schools like Cornell University. Instead, I followed the safer path, which was design and illustration at the School of Visual Arts. <br /> <br /> Over the next 30 years or so, I split time between doing design work and my eventual role in the editorial field of gaming publications and game development, where I got to put my collective experience to work making games. But after I had two children, I started to get really anxious about the state of the world we lived in. I recalled my later editorial days, when Earth Day would pop up; I spent a lot of time arguing with people who simply couldn’t be bothered to make a symbolic effort to reduce their electricity use for an hour. I’m really bothered by the willful ignorance and denial some people openly flaunt when the science is right there in front of our faces — our planet cannot sustain our current lifestyles. While the planet may survive, its creatures (including us) cannot unless we make some serious and meaningful adjustments. <br /> <br /> So, after leaving behind a half decade of game development in Japan, I worked for a couple of development studios in New York City. Neither situation was the perfect match for me, so I decided to get off the sidelines and start my own thing while I still could. At the encouragement of my wife, I brought in my good friend Sam Kennedy, who I’d worked with for over a decade, and formed Tigertron, whose name is a hybrid of nature and technology. <br /> <br /> Instead of leaving the game industry and becoming a full-time environmentalist — which is something I seriously considered — I figured I’d get more mileage out of combining my experience making games with an environmental ethos, and try to get the huge global gaming audience interested in their futures.
<br /> <img alt="Jupiter_and_Mars_4.png" height="auto" src="" width="auto" /><br /> <strong>Many people would assume that being an environmentally-focused studio means your products fall into the &#39;edutainment&#39; category, but with Jupiter & Mars, your goal is to make people think and leave an impression on them long after they put down the game. Can you talk a bit about how Jupiter & Mars came together as an idea and what you hope to achieve with its release?</strong><br /> <br /> <strong>James: </strong>Yeah. We’re not trying to make “edutainment.” That term is fine for things that you might find in schools and whatnot, but it doesn’t accurately describe what we’re trying to make. It’s difficult to pave new roads in an industry that is largely formulaic, but being an independent developer lets us try different things. The best part about being small is that we can work on a game like Jupiter & Mars. We’re a low-risk, high-potential game maker, a mindset bolted into my head from my time at Q Entertainment in Tokyo and the similar-sounding but very different Q-Games in Kyoto. I like to make smaller, more nimble games that are interesting, but that don’t overstay their welcome. <br /> <br /> I first started dreaming about Jupiter & Mars in Tokyo, after seeing the documentary The Cove, which details the annual dolphin slaughter in Taiji, Japan. Watching that film is a gut check, as there’s no reason to lure in and kill creatures as intelligent and socially diverse as dolphins, but The Cove is also inspiring. It’s encouraging to see these activists and filmmakers use their medium to do something unique and compelling. <br /> <br /> Likewise, in partnering with <a href="" target="_blank">SeaLegacy</a> to help spread their message to an all-new audience, we’re inspired by what they do and how they do it.
Instead of proselytizing and trying to holler at people about how bad things are getting, SeaLegacy simply shows you a beautiful photograph, or more accurately a haunting photograph, and lets that spark conversation. <br /> <br /> With Jupiter & Mars, we’re hoping to achieve something similar. You’ll see familiar environments like London, Greece, and New York City, but in the very unusual circumstance of being almost completely underwater. If this makes a player wonder whether this could actually happen and inspires them to Wikipedia some stuff, then we’ve helped move the needle. There’s nothing more powerful than an informed, educated, and motivated person. If I can take my concern and form a company and create a video game about the environment, think about what 90 million PlayStation owners could achieve. <br /> <br /> Another group that inspires me is the <a href="" target="_blank">Oceanic Preservation Society</a>, who made The Cove and also the film “Racing Extinction.” Racing Extinction is environmental multimedia in action, made in collaboration with like-minded, tech-savvy, environmentally concerned activists. I’m partly in awe and partly in tears when I watch the OPS team project visual images and statistics of dying animal species onto places like the Empire State Building, the United Nations, and the Vatican. Truly awesome stuff. That’s what Tigertron hopes to do. <br /> <img alt="Jupiter-and-Mars-Screen-6.jpg" height="auto" src="" width="auto" /><br /> <strong>Coming soon to PlayStation 4 and PSVR, can you tell us what Jupiter & Mars is about for readers hearing about the game for the first time?</strong><br /> <br /> <strong>James:</strong> While humans have disappeared from planet Earth, their legacy still haunts the planet, leaving behind still-functioning machinery that pollutes the oceans in more ways than one.
One devastating effect on the oceans is noise pollution; it’s harmful to sea life and has been shown to be one factor that drives whales up onto beaches around the world. An ancient race of whales, known as The Elders, enlists the infinitely more nimble Jupiter and Mars to shut down acoustic harassment devices (AHDs). AHDs were designed by scientists to keep certain sea creatures, like orcas, away from their typical feeding grounds so scientists could conduct their work without danger. Even in instances like this, where there’s no malice intended, technology can still be harmful. <br /> <br /> There are other environmentally-inspired challenges found in the game, like freeing crabs, manta rays, and others from man-made hazards, but the core of the game is disabling the AHDs so that sea life can return to the reefs and the oceans can thrive again. <br /> <br /> It also explores the relationship between Jupiter and Mars. Our game is a very narrative-driven experience. We want people to remember these characters and their predicament for a long time to come. My five-year-old son was in tears over some of the game’s events the other day, and while he is a little young, I do hope this game resonates with people the way my favorite games have stuck with me over the years. <br /> <img alt="Jupiter-and-Mars-Screen-9.jpg" height="auto" src="" width="auto" /><br /> <strong>James, your resume includes producing games like Lumines and Child of Eden. At first glance, it&#39;s not hard to see that influence in Jupiter & Mars. How did that past experience serve as a catalyst for creating the stunning visuals present in the game?</strong><br /> <br /> <strong>James:</strong> When I was preparing to move into game development, I was lucky enough to have a couple of offers to choose from.
One was from Valhalla Games (ex-Team Ninja guys) and the other was from Q Entertainment, whose alumni have made games like Sega Rally, Rez, Space Channel 5, and others. It was really hard to turn Valhalla’s Tomonobu Itagaki down. As much as I loved Ninja Gaiden, I was more interested in making games like Rez. <br /> <br /> Spending those years at Q Entertainment, under the direct tutelage of Tetsuya Mizuguchi, was instrumental in shaping my interests, but to be honest, the aesthetic of those games was always in my DNA. Electronic music, vector graphics, and day-glo colors are things you’re always going to find in stuff that I work on. Mizuguchi-san calls it the “synesthesia engine,” but what that really means is simply reinforcing player feedback. For every action you make in the game, you should feel a visual, audio, and tactile response to what you do. I think this makes perfect sense. You find it at work in Pachinko parlors all over Japan, and it’s part of what makes that game, gambling elements aside, so satisfying to players. <br /> <br /> Making video games enables me to work with collaborators who inspire me, in particular musicians and artists. Combine that with the ability to tell the stories I want to tell, and my design background comes in handy for packaging things in ways that are unique and, hopefully, compelling. <br /> <img alt="Jupiter-and-Mars-Screen-8.jpg" height="auto" src="" width="auto" /><br /> <strong>Playable in a standard mode and also on PlayStation VR, how hard was it to transition the base game to a VR experience, and were there any ways Unreal Engine 4 helped streamline the process?</strong><br /> <br /> <strong>Harold Absalom (Programmer): </strong>Unreal Engine did a lot of the heavy lifting for getting things working in VR.
Stripping that back to get things rendering in non-VR was very easy.<br /> <br /> <strong>Tim Ninnis (Designer): </strong>One of the most challenging things was setting up the cameras so that they would work in both formats. In VR, we can&#39;t change the camera&#39;s pitch to frame the action, and we have to minimize unnecessary movement. We set up the cameras so that the action was framed well in non-VR and reverted their pitch to 0 whilst in VR mode, allowing the player to look up or down to where the action was happening whilst keeping the horizon line where they would expect. Panning and forcefully rotating the camera can trigger nausea, so these were used sparingly. The in-game sequencer allowed us to iterate on the camera work reasonably quickly to find solutions that felt comfortable in VR whilst framing the shots well in non-VR.<br /> <br /> <strong>Partially inspired by games like Ico and The Last Guardian, tell us about how that inspiration has worked into Jupiter & Mars and what other unique mechanics you were able to incorporate with the help of Unreal Engine.<br /> <br /> James: </strong>Ico and The Last Guardian specifically put you in control, more or less, of two characters. While you primarily control one, you have a companion throughout the majority of both of these games who is reliant on you, to varying degrees, for direction and guidance. In Ico, your partner Yorda is fragile, and you can feel her physical weakness whenever she slips from your grip or is in danger of being swept away by shadow monsters. In The Last Guardian, you’re heavily reliant on Trico to help you advance in the game, as it has the power that you do not. <br /> <br /> In Jupiter & Mars, you are the thoughtful, analytic Jupiter, but Mars is your brute-force action button. You can play the game without VR — it looks really sharp on a PS4 Pro — but it certainly takes on a new level of immersion as a virtual reality game.
Since most VR games are solitary experiences, we wanted to make sure you never felt alone, especially in the crushing darkness of the deep ocean areas, with only echolocation to light your way. So, from a utilitarian perspective, Mars is there as a companion to help you through the darkness.<br /> <br /> Unreal Engine helped us achieve a lot of cool things that were integral to our game-jam-like development pace. First, we needed to be able to create large, expansive, unusual environments with a small team. Replicating familiar cities (with some poetic license taken) in a way that a small team of artists could manage was integral to the game’s development. Unreal Engine made it possible for both 3D artists and game designers to work together and implement things quickly, allowing for iteration and quick turnaround. Since Tigertron is based in NYC, but the development team is based in Melbourne, Australia, speed was of the essence in reducing the time it took for feedback and ideas to cross the world and be implemented.<br /> <br /> The other thing, and our foremost game mechanic, is echolocation. We’re throwing a lot of calculations around as Jupiter casts out echo after echo, lighting up even the darkest areas with the press of a button. This is probably the most “Q Entertainment” type of visual you’re going to see in the game. Using echolocation is like mapping out your immediate environment in an instantaneous neon web of vector visuals that highlights the area’s geometry. Initially, it just looks really cool, but in actual practice, it is integral to the game experience. You’re not necessarily blind playing through the game, but echolocation definitely makes it easier to see key elements that you’re searching for, and in certain areas of the game, it’s essential to being able to “see” clearly. Underwater caves, night-time portions of the game, and deep-sea zones all require echolocation in order to tell you where you’re going.
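A minimal numeric sketch of the echolocation reveal described above, assuming an expanding spherical wavefront that lights up geometry near its current radius. All names and constants here are illustrative, not values from Jupiter & Mars.

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical echolocation reveal: each ping expands outward from its
// origin at WaveSpeed, and surfaces near the moving wavefront are lit,
// fading linearly across a band around the front.
float EchoHighlight(float DistanceFromPing, float TimeSincePing,
                    float WaveSpeed = 10.0f, float BandWidth = 2.0f) {
    // Current radius of the expanding spherical wavefront.
    const float Wavefront = WaveSpeed * TimeSincePing;
    // How far this surface point sits from the front.
    const float Offset = std::fabs(DistanceFromPing - Wavefront);
    // Full brightness on the front, falling to zero outside the band.
    return std::max(0.0f, 1.0f - Offset / BandWidth);
}
```

In practice this kind of function would run per pixel or per vertex in a material, but the math is the same: geometry lights up as the ping's wavefront sweeps past it.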
<br /> <br /> Items are also color-coded to indicate threat level, so it’s important to keep echolocation pinging, lest you find yourself swimming through a tunnel, only to discover too late that you’re surrounded by poisonous anemones or a jellyfish barrier. I think that without Unreal Engine, it would have been very difficult to achieve all of these things, especially in VR. We decided early on that this was pretty much the only choice for what we wanted to do with the game, and I’m pleased that Jupiter & Mars looks very unlike what people probably associate with an Unreal Engine game. I think this points to the versatility of the engine’s tech. <br /> <img alt="Jupiter-and-Mars-Screen-1.jpg" height="auto" src="" width="auto" /><br /> <strong>Over the development of Jupiter & Mars, which of Unreal Engine 4&#39;s tools have proven the most valuable and why? Do you have a favorite?</strong><br /> <br /> <strong>Harold: </strong>The most valuable would be <a href="" target="_blank">Blueprints</a>, which helped our artists directly contribute functionality to the game. My favorite is the <a href="" target="_blank">Material Editor</a>, though, which is so powerful and lets you make some truly amazing effects very easily.<br /> <br /> <strong>Steve Anderson (Programmer): </strong><a href="" target="_blank">UDN</a>, while not an engine tool, is fantastic. Any stumbling blocks we encountered had often already been dealt with by another developer. Being able to access that wealth of knowledge and, if need be, ask our own questions sped production along greatly, especially in the final stages of the game. The particle and material editors are also amazing and allow a huge amount of creative freedom with great accessibility. My favorite part, though, is probably the <a href="" target="_blank">animation Blueprints</a> editor.
I had a lot of fun there!<br /> <br /> <strong>Tim:</strong> When it came to improving the framerate, the optimization view modes, <a href="" target="_blank">profiling tools</a>, and statistics were very useful for identifying expensive objects and areas. <a href="" target="_blank">Auto-LODing</a> was very helpful for optimizing meshes that were too costly. And Blueprints allowed all team members to access and edit many of the commonly used assets.<br /> <br /> <strong>JJ Garcia (Artist): </strong>Terrain & <a href="" target="_blank">foliage</a> tools! Also, while Blueprints are great for quick development, don’t underestimate the power of C++ coding to support it.<br /> <img alt="Jupiter-and-Mars-Screen-5.jpg" height="auto" src="" width="auto" /><br /> <strong>With a wealth of experience behind you in Unreal Engine, what advice would you give to someone deciding to learn the engine for the first time?</strong><br /> <br /> <strong>Harold: </strong>Learn to balance Blueprints and C++. It&#39;s easy to get carried away implementing everything in Blueprints because it&#39;s so fast to iterate in, but sometimes C++ is the right tool for the job.<br /> <br /> <strong>Tim:</strong> Unreal Engine takes time to learn, but if you stick with it, things will eventually become second nature and you&#39;ll be able to work quickly and efficiently.<br /> <br /> There are plenty of tutorials out there and even some very affordable courses. Make the most of them.<br /> <br /> <strong>Steve:</strong> Dive into the example projects and tutorials. There is so much available to see how different things are done, which makes for a great jumping-off point for your own development.
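Tim Ninnis's dual-mode camera setup described earlier (keep the authored pitch for flat-screen framing, revert it to 0 while VR is active) can be sketched as follows. The struct and function names are hypothetical stand-ins, not the game's actual code.

```cpp
#include <cassert>

// Hypothetical dual-mode rig: designers author a pitched camera that
// frames the action on a flat screen; when VR is active the pitch
// reverts to 0 so the horizon stays level and the player looks up or
// down to the action themselves.
struct CameraPose {
    float Pitch = 0.0f; // degrees; negative looks down
    float Yaw = 0.0f;   // degrees
};

CameraPose ResolveCameraPose(const CameraPose& Authored, bool bVRMode) {
    CameraPose Out = Authored;
    if (bVRMode) {
        Out.Pitch = 0.0f; // keep the horizon where the player expects it
    }
    return Out; // yaw and framing position are kept in both modes
}
```

The design choice mirrors the interview: only pitch is neutralized in VR, because forced pitch and panning are what tend to trigger nausea, while yaw and position can still be authored.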
<br /> <br /> <strong>Where are all the places people can go to keep up with Tigertron and Jupiter & Mars?</strong><br /> <br /> <strong>James:</strong> Anyone who’s interested in keeping up with our exploits, Jupiter & Mars and otherwise, can follow and interact with us on <a href="" target="_blank">Twitter</a>, <a href="" target="_blank">Facebook</a>, <a href="" target="_blank">Instagram</a>, and <a href="" target="_blank">YouTube</a>.<br />  Jupiter & Mars | Tigertron | Games | VR | Shawn Petraschuk | Thu, 02 May 2019 20:30:00 GMT Virtual production: Stargate Studios creates final pixels on set. Real-time rendering has been opening the door to new virtual production techniques in recent years, revolutionizing VFX pipelines. But what if you could produce the actual final pixels in camera, on set? Find out how Stargate Studios is doing just that. After three decades in the business, <a href="" target="_blank">Stargate Studios</a> (<em>The Walking Dead, CSI: Crime Scene Investigation, Ray Donovan, The Orville</em>) knows a thing or two about high-tech film and television production. Offering both visual effects and digital production services, the company now has nine studios in seven countries, and a rich legacy of experience to draw on. But, as CEO and Founder Sam Nicholson, A.S.C. tells us, the company is never complacent. “We&#39;re always trying to reinvent ourselves,” says the distinguished cinematographer and VFX supervisor/producer. “We&#39;re trying to think ahead, and be relevant—five years out.” <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> So what does that mean in today’s fast-moving industry? For Stargate, it’s about bringing the visual effects that can be done in real time right onto the set and making them part of the filmmaking process.
<br /> <br /> To achieve this, they’ve created <a href="" target="_blank">ThruView</a>, a set of tools that enables them to shoot in real time while actually seeing CG elements—complete with reflections and lighting—integrated in the camera, eliminating the need for green-screen setups.<br /> <img alt="blog_body_img1.jpg" height="auto" src="" width="auto" />The company’s President, Al Lopez, explains. “At the heart of it, it’s a system that uses photographic plates,” he says. “We are shooting multiple 8K cameras. You can combine it with CG elements and it&#39;s going to be a hybrid: everything from full CG, if you want synthetic, all the way to fully photographic.”<br /> <br /> The combined output is piped into Unreal Engine at high resolution, playing back at up to 60 frames per second. “That is a lot of data that has to get through and be processed in real time,” says Lopez.<br /> <img alt="blog_body_img2.jpg" height="auto" src="" width="auto" />Nicholson elaborates. “What the Unreal Engine allows us to do is put everything in one basket and shake it up, and when it comes out, it looks real,” he says. “3D, 2D, real-time color, tracking, off-axis rendering, all these things [...] happen in real time. And it gives us enough hooks into it that with tablets, we can be adjusting color on the fly, we can be adjusting the lighting on the fly, we can be doing ray tracing on the fly. The Unreal Engine is the perfect software complement to all these converging technologies.”<br /> <br /> Clients are suitably impressed. “We&#39;re showing people something they have never seen before,” says Nicholson. “You generally don&#39;t see very many things for the first time. It&#39;s real magic when you see it.”<br /> <img alt="blog_body_img3.jpg" height="auto" src="" width="auto" />Bringing VFX on set helps the crew and cast feel more connected to the end result, as Producer Bryan Binder explains.
“Let&#39;s say you&#39;re doing a show that takes place primarily on a bus, and instead of sitting there for day after day after day staring at green screen through the windows, and the entire crew and all the cast being so disconnected from what&#39;s going on and what this thing is going to look like at the end of the day, we’re actually able to bring that experience to everybody live on set,” he says.<br /> <br /> But the benefits are also financial, with a show shot in even a single environment or just a few environments potentially generating over 500 green-screen shots. “At that point, now we start talking about the financial realities of shooting on green and the financial realities of what ThruView can offer a production,” says Binder.<br />  <br /> Nicholson is keen to point out that, while real-time on-set effects are not new, producing final-quality pixels from them is. “For many years we&#39;ve pursued real-time green-screen shooting; there&#39;s a four- or five-frame lag on there and you&#39;re basically doing a previsualization on set, but it&#39;s not a finished product,” he says. “Now what we&#39;re doing is going for finished product on set in the lens—done.”<br /> <img alt="blog_body_img4.jpg" height="auto" src="" width="auto" />Because of the company’s post-production expertise, Stargate has been able to reassure nervous clients that they can always fall back on traditional methods at no extra cost if the real-time results are not satisfactory. “So far, nothing’s come back,” says Lopez.<br /> <br /> According to Nicholson, Unreal Engine has played a pivotal role in ThruView’s success. “I don&#39;t think any of this would be possible without the Unreal Engine,” he says. “There&#39;s a universal creative language developing; the Unreal Engine is very much part of that.
It’s the converging of all these technologies: individually they&#39;re all getting faster and better, but what the Unreal Engine allows us to do is put all of them together in one place at one time in a dependable fashion—on set.”<br /> <br /> <br /> Want to find out what magic you can create in real time? Download <a href="" target="_blank">Unreal Engine</a> today.<br />  Stargate Studios | ThruView | Virtual Production | Film And Television | VFX | Enterprise | Brian Pohl | Wed, 01 May 2019 15:00:00 GMT Featured free Marketplace content - May 2019. In celebration of the Spring #ue4jam: Marketplace Takeover, we’re doubling down on free content for May. From post-process effects, interior props, animations, heroes, fires, and effects, there’s something for any project! In partnership with Unreal Engine Marketplace creators, select content will be available for free to the UE4 community each month to give artists, designers, and programmers even more resources at no additional cost.<br /> <br /> This month we’ve doubled down to provide extra content in celebration of the <a href="" target="_blank">Spring #ue4jam: Marketplace Takeover</a>! We hope you’ll make a new game and enter for a chance at prizes, but before you do, please be sure to check out the alternative rules that are unique to this jam. Oh, and don’t forget to take advantage of all the <a href="" target="_blank">free content on the Unreal Engine Marketplace</a>!<br /> <br /> Check out the new content available this month!
<h2><strong>May’s Featured Free Content:</strong></h2> <h2><a href="" target="_blank">Chameleon Post Process</a> - <a href="" target="_blank">SumFX</a></h2> <div style="text-align: center;"><img alt="1_Chameleon_770.jpg" src="" /><br /> <em>An advanced post processing Blueprint actor with 70 customizable and combinable effects</em></div> <h2><a href="" target="_blank">Death Animations - MoCap Pack</a> - <a href="" target="_blank">MoCap Online</a></h2> <div style="text-align: center;"><img alt="4_DeathAnimations_770.jpg" src="" /><br /> <em>An animation set with 16 dramatic character deaths</em></div> <h2><a href="" target="_blank">Houseplant Pack - Interior and Exterior Plants</a> - <a href="" target="_blank">IanRoach</a></h2> <div style="text-align: center;"><img alt="2_Houseplants_770.png" src="" /><br /> <em>An assortment of houseplants and customizable planters</em></div> <h2><a href="" target="_blank">HQ Residential House</a> - <a href="" target="_blank">NOTLonely</a></h2> <div style="text-align: center;"><img alt="3_ResidentialHouse_770.png" src="" /><br /> <em>A collection of over 200 ready to use objects and a lit, assembled demo home</em></div> <h2><a href="" target="_blank">Modular RPG Heroes Polyart</a> - <a href="" target="_blank">Dungeon Mason</a></h2> <div style="text-align: center;"><img alt="5_ModularRPGHeroes_770.jpg" src="" /><br /> <em>A modular RPG pack with various weapon stances and animations</em></div> <h2><a href="" target="_blank">Particle Text</a> - <a href="" target="_blank">Killer Refresh Entertainment</a></h2> <div style="text-align: center;"><img alt="6_ParticleText_770.png" src="" /><br /> <em>A particle system that outputs user-defined text</em></div> <h2><a href="" target="_blank">Platformer Starter Pack</a> - <a href="" target="_blank">Platfunner</a></h2> <div style="text-align: center;"><img alt="7_PlatformerStarterPack_HiRes_770.jpg" src="" /><br /> <em>A collection of assets to build and create 2D or 3D platformers</em></div> 
<h2><a href="" target="_blank">Retro 8Bit Sounds</a> - <a href="" target="_blank">Gamemaster Audio</a></h2> <div style="text-align: center;"><img alt="8_RetroSounds_770.jpg" src="" /><br /> <em>1001 retro game sound effects</em></div> <h2><a href="" target="_blank">Urban Material Pack</a> - <a href="" target="_blank">Plan B Studios</a></h2> <div style="text-align: center;"><img alt="9_UrbanMaterials_770.jpg" src="" /><br /> <em>A pack of 41 different urban-themed PBR Materials. </em></div> <h2><a href="" target="_blank">User Interface Kit</a> - <a href="" target="_blank">Christianos Philippos</a></h2> <div style="text-align: center;"><img alt="10_UIKit_770.png" src="" /><br /> <em>A collection of widgets, icons and UI elements </em></div> <h2><strong>New Permanently Free Content:</strong></h2> <h2><a href="" target="_blank">Free Furniture Pack</a> - <a href="" target="_blank">Next Level 3D</a></h2> <div style="text-align: center;"><img alt="11_FurniturePack_770.jpg" src="" /><br /> <em>A collection of 37 different high quality pieces of furniture</em></div> <h2><a href="" target="_blank">M5 VFX Vol2. Fire and Flames</a> - <a href="" target="_blank">JeongukChoi</a></h2> <div style="text-align: center;"><img alt="12_M5VFX_770.jpg" src="" /><br /> <em>Create various kinds of fire, such as shaking candles, bonfires, fire, flames, and explosions </em></div> <br /> Be sure to download all these wonderful assets from our Marketplace creators before the end of the month, and come back in June for another round of excellent free content!<br /> <br /> Are you a Marketplace creator interested in having your content featured for free to the community? 
Visit <a href="" target="_blank"></a>.CommunityLearningMarketplaceNewsGame JamsAmanda BottWed, 01 May 2019 14:30:00 GMTWed, 01 May 2019 14:30:00 GMTéns helps move cities with urban design visualization in UE4 you need to do something as potentially traumatic as moving not just someone’s home, but large parts of their city, effective visual communication of the plans is key. Find out how Tyr&eacute;ns uses their custom build of UE4 on this and other large-scale urban and infrastructure projects.It’s said that moving home is one of the most stressful life events you can undergo. But what if someone said that it wasn’t going to be you moving home, but your home that was going to be moving? And not just your home, but your entire city?<br /> <br /> That’s what the residents of the arctic town of G&auml;llivare-Malmberget in Swedish Lapland are currently facing, as an expanding iron-ore mine is forcing the <a href="" target="_blank">relocation of their city</a> in a project estimated to cost over one billion US dollars and take nearly two decades. The redevelopment will affect almost 2,000 homes, along with historic buildings, offices, and industrial facilities. Public buildings like the retirement home, high school, and indoor swimming pool will be torn down and replaced. <div><img alt="blog_body_img1.jpg" height="auto" src="" width="auto" /></div> <div style="text-align: center;"><em>An area of G&auml;llivare that will be relocated.</em></div> <br /> Clearly, this kind of news can be disconcerting—to say the least—to those affected. That’s why <a href="" target="_blank">Tyr&eacute;ns</a>, one of Sweden’s leading community development consultancies, was brought in to help plan the redevelopment. Part of their mandate was to make sure that the designs were clearly communicated to the public, and to solicit their feedback.  <br /> <br /> Founded in 1947, Tyr&eacute;ns focuses on creating sustainable solutions in the fields of urban development and infrastructure. 
They’re keenly aware that visual communication is of paramount importance in their industry, and for many years have created static renderings and animations in applications such as 3ds Max to facilitate that.<br />   <h3><strong>Creating a real-time visualization platform</strong></h3> By 2013, advances in real-time rendering were sufficient for the company to take the leap toward creating interactive visualizations, motivated by the desire to save time and also to quickly offer clients any view they desired. After investigating all the major platforms available at the time, they selected Unreal Engine 3, which they customized into their own platform, called <a href="" target="_blank">TyrEngine</a>.<br /> <br /> “Unreal stood out as the platform with a clear path of progression,” says Jonas Gry Fjellstr&ouml;m, Visualization Expert at Tyr&eacute;ns. Gry Fjellstr&ouml;m was brought into the company for his expertise in real-time 3D, acquired from his previous work in the video game industry.   <div><img alt="blog_body_img2.jpg" height="auto" src="" width="auto" /></div> <div style="text-align: center;"><em>Midnight sun over G&auml;llivare.</em></div> <br /> The G&auml;llivare redevelopment was the first project to make use of TyrEngine. Tyr&eacute;ns created a white-box 4D model to show how the changes would happen over several years, from the demolition of existing buildings and construction of new ones to the physical relocation of certain buildings. The visualization has been used as a platform for discussion internally in the state office.  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> By the project’s completion, the citizens will have a brand new city center, boasting a state-of-the-art education center, a sports and culture hall, and an ice and event arena, not to mention a leafy new city square disguising plenty of underground parking. 
Tyr&eacute;ns has showcased all of this in a second, fully textured incarnation of the final city model—an area of land over 72 km²—which members of the public are able to navigate and provide feedback on. This effective visual communication has resulted in constructive dialogue and broad acceptance, even positivity.  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> The G&auml;llivare redevelopment project is still ongoing, and TyrEngine has evolved along with Unreal Engine. With the advent of Blueprint visual scripting in Unreal Engine 4, TyrEngine became an even more powerful platform. “Suddenly it became very easy for a small team like ours to make interactions and functions that we could use to make our presentations even better,” says Gry Fjellstr&ouml;m.  <br /> <img alt="blog_body_img3.jpg" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>A view of the centre of G&auml;llivare scheduled for completion in 2023. </em></div> <br /> To date, the team has worked on more than 50 projects in TyrEngine. These vary from small indoor visualizations to large-scale urban and infrastructure developments, including one project that involved modeling 120 km of railroad tracks.<br />   <h3><strong>Sustainable development for large-scale infrastructure</strong></h3> One such large-scale project currently in progress is the planning of 20 km of new roads, tunnels, and buildings for Trafikverket (the Swedish Transport Administration). As with the G&auml;llivare project, the real-time model has acted both as a coordination platform for construction and planning, and as the consultation model for communication to the public. 
<br /> <img alt="blog_body_img4.jpg" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>One of the ten larger junctions of The S&ouml;dert&ouml;rn Crosslink, a 20 km road construction project.  </em></div> <br /> The amount of data required to accurately model the entire affected area presents a challenge, requiring the incorporation of orthographic aerial photos for landscapes and terrain, and geometric information for existing buildings and other details in the environment. For such a vast area, these elements can quickly accumulate into a massive scene. <br /> <img alt="blog_body_img5.jpg" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>A railroad project in the south of Sweden.</em></div> <br /> “The closer we get to reality, the more flaws we will find—so it’s important to find the ‘good enough’ detail level,” says Gry Fjellstr&ouml;m. “Since we’re not making sales material, we strive for the correct look rather than the pretty one.”<br /> <br /> On top of the real-world visible data, the team has to include information from GIS data, such as areas of historical importance, water protection areas, and nature reserves. The planners can then use this information when determining the route of the road to minimize its impact on these sensitive areas. <br /> <br /> The team also added a UI that accurately set the sun position for a particular date, time, and latitude/longitude for light and shadow studies. A similar feature was later released by Epic in Unreal Engine 4.21. <br /> <img alt="blog_body_img7.jpg" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>Sun and shade study at an overpass in a railroad project. 
</em></div> <br /> Other features Tyr&eacute;ns provided were interactive measuring tools, the ability to display metadata such as part names coming from the BIM application, toggles for displaying data with various tags, and the ability to view the model in X-Ray mode to see underground features such as tunnels.<br /> <img alt="blog_body_img6.jpg" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>Example of how to visualize objects underground. </em></div> <br /> The ability to extend Unreal Engine for their own purposes is just one of the things that makes it attractive to Tyr&eacute;ns. “We love that we have the C++ layer so we can add our own tools and algorithms into the mix,” says Gry Fjellstr&ouml;m. <br /> <br /> The planned road structure was built in Civil3D and cleaned up in 3ds Max before being integrated into the model of the existing landscape, providing an accurate interactive visualization of the proposed development. The project has been very well received by the client, the planners, and the public. <br /> <br /> “They’re always amazed over how real time is so much more understandable and flexible than static images or blueprints,” says Gry Fjellstr&ouml;m. “When they see the results, they get an understanding of the power to be able to communicate between levels of expertise and between different fields of knowledge. A real-time model can solve a lot of questions right away that would otherwise have to be reviewed at another time.”<br />   <h3><strong>Sharing the experience with Pixel Streaming</strong></h3> As well as showcasing the development to stakeholders on a kiosk-based PC, Tyr&eacute;ns took advantage of <a href="" target="_blank">Pixel Streaming</a> to communicate with remote viewers on lightweight devices like iPads and smartphones. 
<br /> <img alt="blog_body_img8.jpg" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>Pixel Streaming the S&ouml;dert&ouml;rn Crosslink.</em></div> <br /> Introduced in Unreal Engine 4.21, Pixel Streaming enables you to stream workstation-quality content to almost any web browser on any platform, without the viewer having to download or install anything—they simply access a link as they would to view a YouTube video. But unlike watching a video, the viewer can interact with that content, and even send responses back to the engine.  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> “Pixel Streaming gives us a good interface for collaboration, public communication, or just for showing our work,” says Gry Fjellstr&ouml;m. “It’s a powerful tool, especially when you’re trying to explain things to people remotely. It’s so much easier and faster than video. Viewers get 30 frames per second and they can interact with it. It really extends everything we want to achieve with TyrEngine.”<br />   <h3><strong>The road ahead for real-time technology</strong></h3> Gry Fjellstr&ouml;m has been involved with real-time technology for just about as long as real-time technology has been around. He had been working in the game industry for more than 15 years before starting at Tyr&eacute;ns in 2013, and he’s been working with Unreal technology for most of that time. “The first contact I had with Unreal was back when Unreal Tournament was released, and I found the UnrealEd that allowed me to start making level design and interactive 3D art,” he says.<br /> <br /> Since then, he’s worked in the professional game industry in various studios around the world, and his resume includes working on several large Unreal-projects such as Rainbow Six and Splinter Cell at Ubisoft Montreal. 
So how does he feel that real-time technology will be used in the future, and what excites him about the road ahead? <br /> <br /> “I believe that we will increasingly use external data to bring even more functionality to our real-time models,” he says. He goes on to explain that Tyr&eacute;ns is experimenting with digital twins of buildings inside of Unreal Engine, as part of what he calls “a living 3D management system”. <br /> <br /> Working with a local university, they’re in the process of creating a complete digital twin of an innovative dental clinic, which is designed to prototype new and evolving technologies, and features a virtual receptionist. Using Unreal Engine’s multi-user collaboration and avatars, they’re able to present the clinic to interested parties without physically being there.<br /> <img alt="blog_body_img9.jpg" height="auto" src="" width="auto" /> <div style="text-align: center;"><em>Digital twin of an innovative dental clinic in Sweden. </em></div> <br /> In another experiment, sensors underneath desks in the Tyr&eacute;ns office determine when seats are occupied, triggering motion on the chairs in a digital twin model. The idea is to gain a new level of understanding of how people move and to visualize the flow of movement over time in a building.<br /> <br /> “It’s kind of funny to watch your own office chair indicate movement when navigating in the office, but you do it in Unreal,” he laughs. <br /> <br /> <br /> Want to see how real-time technology can help you communicate better? 
Download the free <a href="" target="_blank">Unreal Studio</a> beta today and get started with Unreal Engine and a full suite of tools to support AEC workflows.<br />  EnterpriseDesignArchitectureVisualizationPixel StreamingDigital TwinTyrEngineBlueprintsTyrénsKen PimentelTue, 30 Apr 2019 12:00:00 GMTTue, 30 Apr 2019 12:00:00 GMTVirtual production in Unreal Engine 4.22: livestream recapEpic Games Senior Cinematic Designer Grayson Edge walks users through Take Recorder, explains how to use Vcam with an iPad, and shares several new virtual production tips. As you may have noticed in the <a href="" target="_blank">release notes</a>, the recent launch of Unreal Engine 4.22 delivered several virtual production improvements. To highlight some of them, Epic Games Senior Cinematic Designer Grayson Edge recently conducted a livestream where he showcases improvements to <a href="" target="_blank">Sequencer</a>, explains how Take Recorder works, and shows viewers how to use Vcam with an iPad. He demonstrates how these numerous improvements have dramatically increased efficiency to a point where what used to be a week&#39;s worth of work can now be trimmed down to a single day. He also shares his production workflow. You can check out the stream in its entirety below.   <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> Edge begins the stream by outlining what virtual production means <a href="" target="_blank">within the context of Unreal Engine</a>. Specifically, he defines it as the cross section where the physical and digital worlds meet. Elaborating further, the cinematic designer says that virtual production enables directors and creatives to work on a physical stage, but view and interact with virtual environments and characters at the same time. <br /> <br /> In the stream, Edge showcases a variety of tools that his team uses to produce cinematics. 
On the hardware front, these tools include Blackmagic Studio cameras and an iPad. When you couple these devices with actors wearing mocap suits, you&#39;ve got the foundation to start puppeteering in-game characters in real time.<br /> <br /> Spicing up the set, Edge elaborates on the physical props that his team uses, which include swords, guns, and axes that have mocap markers attached to them. A relatively new tool that the team has been leveraging is a Bluetooth device that syncs up to mocapped prop guns, enabling actors to fire the weapon for a corresponding in-engine effect. Finally, iPhones attached to a head mount enable facial mocap data to be applied on an in-game skeleton through the <a href="" target="_blank">Live Link</a> plugin, in real time.<br /> <br /> Introduced in 4.22, Take Recorder, which is an extension of Sequence Recorder, combines these elements together. While Sequence Recorder has been used to record actors&#39; performances for many projects, including Epic’s VR game <a href="" target="_blank">Robo Recall</a>, it wasn&#39;t streamlined for virtual production. On the other hand, Take Recorder, with its simpler UI, was tailor-made for it. It&#39;s also much better at managing data, which is helpful in a virtual production environment. <br /> <br /> In the video stream embedded above, Edge shows viewers how to enable and use Take Recorder. 
He provides several tips, which include: <ul style="margin-left: 40px;"> <li>Adding different actors into Take Recorder</li> <li>Adding and recording microphone audio</li> <li>Adding Live Link as a source</li> <li>Enabling recording</li> <li>Naming slates</li> <li>Reviewing the recording</li> <li>Assigning colors to different tracks to easily classify elements such as cameras, actors, and more</li> <li>Having an actor record a base performance to perform against himself/herself on subsequent takes</li> <li>Matching a virtual in-engine camera with a real physical camera to get 1:1 representations of the virtual and real world</li> </ul> <p>Edge also shows how to set up and use the <a href="" target="_blank">Virtual Camera plugin</a> with an iPad. With this tool, directors, cinematographers, and creatives can achieve numerous powerful things in real time. In the stream, he shows how users can record with the iPad. As well as sharing best practices, Edge gives practical advice on: </p> <ul style="margin-left: 40px;"> <li>Adding motion stabilization</li> <li>Zooming in and out</li> <li>Adjusting focal length</li> <li>Reducing keyframes to get smoother camera movements</li> <li>Optimizing tablet performance</li> </ul> <p>Edge wraps up the stream by showcasing several other new virtual production features. Animation blending, for instance, enables users to smoothly blend between two different animations, which was not something that was easy to achieve in the past. He also highlights the fact that users can copy and paste animation tracks; by making slight variations on these tracks and then duplicating them, users can create simple, yet believable crowds. 
Finally, Edge shows off the new object-binding track that lets users manually change materials, meshes, and more.<br /> <br /> To read about how famed director Robert Zemeckis used virtual production on the set of the movie <a href="" target="_blank">Welcome to Marwen</a>, check out our <a href="" target="_blank">Spotlight video</a>. For more news, articles, and insights, visit our <a href="" target="_blank">virtual production page</a> and sign up for <a href="" target="_blank">our newsletter</a>.<br /> <br /> If you would like to play around with these new virtual production features yourself, download <a href="" target="_blank">Unreal Engine 4.22</a> for free today.</p> CommunityEnterpriseVirtual ProductionFilm And TelevisionJimmy ThangMon, 29 Apr 2019 18:00:00 GMTMon, 29 Apr 2019 18:00:00 GMTLimbic Entertainment revamps Tropico 6 with Unreal Engine 4Using Unreal Engine for the first time in the series’ history, Limbic Entertainment game designer Marcus Cool explains how the new Tropico developer balances paying homage to what fans expect while introducing fresh mechanics. City-building franchise Tropico has been beloved since the original game released in 2001. With a fervent fan base, Limbic Entertainment, previously known for developing several Might & Magic games, had big shoes to fill taking over the development reins for <a href="" target="_blank">Tropico 6</a>. The Langen, Germany-based developer further mixed things up by building the game with Unreal Engine 4, a new engine for the series. Fortunately, the changes have paid great dividends, with review sites like <a href="" target="_blank">Shacknews</a> saying, “Tropico 6 is the best game I&#39;ve played all year.”<br /> <br /> To see how Limbic Entertainment produced the most beautiful Tropico game to date, we interviewed Game Designer Marcus Cool. He explains how the studio researched the series’ history to maintain the franchise’s roots while injecting new elements and features. 
The game designer also discusses Tropico 6’s new archipelago map designs and elaborates on how the multi-island structure expands what’s possible with the series.  <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <strong>Thanks for your time. Considering the previous Tropico games were made by different developers, what’s it been like to step into shoes as big as El Presidente&#39;s?</strong><br /> <br /> <strong>Limbic Entertainment Game Designer Marcus Cool:</strong> It&#39;s always a challenge to create a new entry in a beloved franchise, but we embraced the challenge with Tropico 6. You must stay true to the series&#39; DNA while also introducing a fresh spin to the formula. It’s a delicate balance, but we feel we walked it appropriately during Tropico 6’s development.<br /> <br /> <strong>Being a new developer to the series, can you elaborate on the research that went into the development of Tropico 6?</strong><br /> <br /> <strong>Cool: </strong>The first step was to have the whole team dive into the previous games to get a feeling for what makes Tropico, Tropico. Being aware of their feel, pace, and overall look absolutely helped as we started to prototype certain aspects like agent simulation, pathfinding, and traffic management. We played not only all of the prior Tropico games, but also a lot of other sims and city builders to see what we could bring to Tropico 6 to set it apart from the pack.<br /> <img alt="Tropico-6-Super-Resolution-2019.01.28---" height="auto" src="" width="auto" /><br /> <strong>How do you balance paying tribute to what fans expect from the series while also introducing new features and mechanics? </strong><br /> <br /> <strong>Cool: </strong>Tropico games have a certain number of features, style, content, quirks, etc. Those all shaped our understanding of what a Tropico game is at its core. 
At the outset of development, we evaluated each one and its importance to the game’s DNA. This involved a ton of research on forums, Reddit, and game reviews. When we had all of this, we identified things that should and shouldn&#39;t be in our Tropico game. We call those “design claims.” Once we had these, it was relatively simple to decide how exactly the individual new features should be implemented into the game.<br /> <br /> <strong>Tropico 6 introduces a new raid mechanic that allows El Presidente to employ pirates to steal other countries’ famous landmarks. Can you elaborate on the design of this quirky concept?</strong><br /> <br /> <strong>Cool: </strong>This is a good example of what we talked about earlier. The idea was to have a new feature that centered on El Presidente stealing world wonders from around the globe and placing them on Tropico. We initially planned it around a dedicated map interface where you would only steal blueprints, but with the tonality and humor of the series, we changed it to stealing the wonders [themselves]. If anyone could pull it off, it’s El Presidente! <br /> <br /> We scrapped the world map as it took away the focus on the nation of Tropico. We kept the era system, so the raid feature was needed to progress through the eras. This created the opportunity to reintroduce Tropican pirates for the colonial era (a nod to Tropico 2), and the Spy Academy (a nod to Tropico 5) for the Cold War era. For the other eras, we have commandos for world wars and hackers for modern times. We all love Penultimo, El Presidente&#39;s humble advisor, so he was a perfect fit to be spokesperson for the raiders. 
Once we defined the game’s DNA and formed the resulting design claims as our base, new features literally just fell into place naturally.<br /> <img alt="Tropico-6-Super-Resolution-2019.01.30---" height="auto" src="" width="auto" /><br /> <strong>Limbic Entertainment previously stated that citizens will be "fully simulated," which could affect the productivity and stability of the population. Can you elaborate on this design and explain how you implemented it? </strong><br /> <br /> <strong>Cool:</strong> Individual citizens (or "agents," as we call them internally) make autonomous decisions based on their individual needs and their perception of Tropico. In other words, there is no supervising algorithm that analyzes the game state and manages the agents based on its calculations. The core of this system was one of the first features to get implemented in a rudimentary state, where each agent would just go through a simple cycle of fulfilling their needs in dedicated buildings. The system was incrementally extended by agents deciding what to do, where to do it, and how to get there.<br /> <br /> <strong>Previous Tropico games were made using different engines. What made Unreal Engine 4 a good fit for Tropico 6?</strong><br /> <br /> <strong>Cool:</strong> Everyone on the team has had extensive experience working with Unreal from previous projects and felt confident continuing development with Unreal Engine. Unreal also has a strong feature set that is beneficial to Tropico 6 like multiplayer, materials/rendering features, and foliage system.<br /> <img alt="Tropico-6-Super-Resolution-2019.01.28---" height="auto" src="" width="auto" /><br /> <strong>With vibrant colors, realistic water, and a plethora of visual variety, Tropico 6 features the best graphics in the series by far. How did you go about nailing the game&#39;s visuals?</strong><br /> <br /> <strong>Cool: </strong>We classified the visual style we wanted to go for as "Caribbean Romanticism." 
If you take a screenshot from any location in the game, we wanted it to be worthy of a great "Visit Tropico" postcard. Our tech artists really worked their magic during development to produce high-quality assets. In addition, our level designers created a vast variety of maps that perfectly blended a natural look with gameplay destinations and points of interest with eye-catching details.<br /> <br /> <strong>In previous Tropico games, players generally had one big island to play around with, but Tropico 6 features archipelagos that contain several small islands that players can inhabit. Can you talk about the move to this design?</strong><br /> <br /> <strong>Cool: </strong>If you have an island in a realistic setting, naval transportation should be part of the game. There wasn’t any real downside to not being able to go to other islands in the previous games, but we always wanted to! Being able to travel to new islands via archipelagos completely opened up new features, content, challenges, and story possibilities.<br /> <img alt="Tropico-6-Super-Resolution-2019.01.28---" height="auto" src="" width="auto" /><br /> <strong>Did the studio leverage Blueprints in any capacity?</strong><br /> <br /> <strong>Cool: </strong>Yes, we built our mission flow based on logic driven via parameters set by level designers in <a href="" target="_blank">Blueprints</a>. 
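Stepping back to the citizen simulation Cool describes, the core idea of need-driven agents with no supervising algorithm can be sketched in a few lines. This is purely an illustrative sketch: the need names, growth rates, and buildings below are invented, and Limbic's actual system in UE4 is far richer.

```python
import random

# Hypothetical need-based agent loop. Each agent decides autonomously from
# its own needs -- there is no global algorithm managing the population.
NEED_RATES = {"hunger": 3, "rest": 2, "fun": 1}  # growth per simulation tick

class Building:
    def __init__(self, name, serves):
        self.name = name
        self.serves = serves  # the single need this building fulfills

class Agent:
    def __init__(self, name):
        self.name = name
        self.needs = dict.fromkeys(NEED_RATES, 0)

    def tick(self, buildings):
        # Needs grow over time...
        for need, rate in NEED_RATES.items():
            self.needs[need] += rate
        # ...and the agent picks its most pressing need on its own.
        urgent = max(self.needs, key=self.needs.get)
        options = [b for b in buildings if b.serves == urgent]
        if not options:
            return f"{self.name} has nowhere to satisfy {urgent}"
        choice = random.choice(options)  # where to go is also the agent's choice
        self.needs[urgent] = 0           # visiting the building fulfills the need
        return f"{self.name} -> {choice.name} ({urgent})"

city = [Building("Cantina", "hunger"), Building("Shack", "rest"), Building("Cabaret", "fun")]
tropican = Agent("Tropican 1")
log = [tropican.tick(city) for _ in range(3)]
print("\n".join(log))
```

Because each agent reads only its own state, the loop scales to thousands of citizens and can be extended incrementally (deciding what to do, where to do it, and how to get there), which matches the way Cool says the system grew from a rudimentary cycle.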
<br /> <br /> <strong>Does the studio have any favorite UE4 tools or features?</strong><br /> <br /> <strong>Cool:</strong> We like UE4’s packaging/cooking tools, along with the <a href="" target="_blank">BuildGraph</a> automation system, which gives us more flexibility, especially when deploying to multiple platforms/stores.<br /> <br /> Profiling tools, from simple commands for capturing profiling data to visualization and analysis tools, also came in handy!<br /> <br /> <strong>Is there anything you would like to let Tropico fans know about the sixth installment of the series?</strong><br /> <br /> <strong>Cool:</strong> We want to thank all Tropico fans for their trust and passion for Tropico. We hope you have a blast playing the game!<br /> <br /> <strong>Thanks again for your time. Where can players learn more about Tropico 6?</strong><br /> <br /> <strong>Cool:</strong> People can head to the <a href="" target="_blank">Kalypso website</a> to buy the game and sign up for the newsletter to get information on patches and future DLCs! Viva Tropico!<br />  Limbic EntertainmentTropico 6GamesJimmy ThangFri, 26 Apr 2019 14:30:00 GMTFri, 26 Apr 2019 14:30:00 GMTMaking autonomous vehicles safer before they hit the roadMechanical Simulation, makers of vehicle simulation programs CarSim and TruckSim, gives autonomous vehicle developers unprecedented opportunities to iterate and visualize test scenarios with an Unreal Engine plugin.<a href="" target="_blank">Mechanical Simulation</a> is an Ann Arbor, Michigan-based company that has been making software for accurate ground vehicle simulation since 1996. Their CarSim, TruckSim, and BikeSim product portfolio includes modules for different types of passenger and commercial vehicles. 
These products are focused on aggregating everything about a vehicle, its environment, and its motion to visualize or predict its behavior in a wide range of driving conditions.<br /> <br /> CarSim, TruckSim, and BikeSim use vehicle data that describes suspension behavior, powertrain properties, active controller behaviors, tire properties, and also road slope, obstacles, weather conditions, and asphalt type. At the core of the software is a simulation solver that can predict how the vehicle will react, for example whether it will tip or skid under specific conditions or whether it will brake quickly enough on a wet surface. The software also produces a visual representation of the vehicle’s motion from the solved data.<br /> <img alt="blog_body_imgVertech3.jpg" height="auto" src="" width="auto" />On the flip side, the software can also import real-world vehicle and map data and analyze for speed, response time, and other aspects of the driving experience. While this application has obvious uses for accident reconstruction and training simulators, a new use has emerged in recent years—data gathering and machine learning for autonomous vehicles. <h3><br /> <strong>Testing an autonomous vehicle</strong></h3> Self-driving cars are already available to the public in limited driving situations, yet the technology behind them still needs finesse to match safety and regulation requirements before they hit the roads in more complex driving environments. Part of that process is to record and analyze the cars’ data during test runs.<br /> <br /> Autonomous vehicles use a variety of physics-based sensors to detect the environment around them: cameras, radar, and <a href="" target="_blank">LIDAR</a>. 
The measure of a self-driving vehicle’s success is based largely on its ability to process the data from these sensors and interpret its distance to other cars, pedestrians, cyclists, and even debris left in the road, not to mention the slope, size, and condition of the road itself. The vehicle must also detect lane markings, signal lights, and traffic signs, and respond as a human driver would in all weather and lighting conditions.<br /> <br /> The first tests of such vehicles were performed on physical test tracks, but it soon became clear that it was much more efficient—and much safer—to perform such tests on a virtual car first.<br /> <img alt="blog_body_imgVertech1A.jpg" height="auto" src="" width="auto" />A virtual vehicle is fitted with all the sensors of a physical vehicle, and the visual data is fed to the sensors just as with a physical test. The difference is that engineers can easily try out variations in sensor placement, and quickly iterate on variations in obstacles, weather, time of day, and road conditions, all from the safety of a virtual environment. <h3><br /> <strong>Enter Unreal Engine</strong></h3> Whether done virtually or physically, testing of an autonomous vehicle requires hundreds or thousands of hours of driving, and such tests generate an enormous amount of data. When Mechanical Simulation saw this trend coming a few years ago, they set about upgrading their products to meet the challenge.<br /> <br /> “The first thing we wanted to do is improve our driving simulator product with real-world traffic and road models,” says Robert McGinnis, Senior Account Manager at Mechanical Simulation. 
“But then, as autonomous driving came on and people wanted to incorporate physics-based sensors, we started presenting our technology as a general-purpose vehicle simulation tool for vehicle dynamics and autonomous driving engineers.”<br /> <br /> At the same time, for the software’s visual representations, Mechanical Simulation was aware that they needed to keep pace with advances in computer graphics.<br /> <img alt="blog_body_imgRikei3.jpg" height="auto" src="" width="auto" />They found that to gain more options for visualization, many customers were starting to port the CarSim and TruckSim solvers’ results to Unreal Engine along with their own car models and environments. UE4’s readily available source code, C++ support, and Blueprint visual scripting system made it an attractive choice for processing the volume of data that driving tests generate.<br /> <br /> That’s when Mechanical Simulation decided to integrate more of their product into Unreal Engine. “It was pretty obvious that we could get the information from the road and the sensors and interface with tools like <a href="" target="_blank">MATLAB/Simulink</a>, and let people integrate their own active controllers,” says McGinnis.<br /> <br /> This gave Mechanical Simulation a clear path to upgrading their offerings using Unreal Engine, leaving them more room to focus on their core technology: the solver inside their products. “Early on, our software did not have a good way to build complex scenes for visualization,” says McGinnis. “One approach we took was to add an Unreal Marketplace plugin that allows a CarSim vehicle solver to be loaded into the Unreal Editor. 
It allows people to create scenes and scenarios using that tool all by themselves.”<br /> <br /> The <a href="" target="_blank">VehicleSim Dynamics plugin</a>, released for free on the Unreal Marketplace in 2017, gives CarSim and TruckSim users a powerful tool for generating visual representations with all the advantages Unreal Engine has to offer, such as physically based rendering (PBR) materials, realistic lighting, landscape and foliage packs, and cityscape items. <h3><br /> <strong>The anatomy of a plugin</strong></h3> The VehicleSim Dynamics plugin works by converting the solver data to Blueprints, which can then be easily queried to produce data about both the terrain and the vehicle.<br /> <img alt="blog_body_imgCarsim2.jpg" height="auto" src="" width="auto" />To make the terrain data work best with Unreal Engine, the solver arranges the terrain description into a searchable structure that it can query efficiently. To accommodate customers with less powerful machines, the plugin also separates the graphical and physical terrain representations.<br /> <br /> “By completely separating the physical and visual representations of the simulation, we are able to run the solver on a separate machine. We then establish a communication channel back to Unreal to represent the vehicle visually,” says Jeremy M. Miller, Lead Developer at Mechanical Simulation. “It’s a little complicated, but we had to do it to connect with HIL [<a href="" target="_blank">hardware-in-the-loop</a>] systems that don&#39;t have any GPU capacity.”<br /> <img alt="blog_body_imgCarsim4.jpg" height="auto" src="" width="auto" />The team at Mechanical Simulation sees the simplicity of the Unreal Engine plugin as a huge plus for their customers. “They don&#39;t want to be running $200,000 worth of software on a single machine which requires another engineer just to help the prime engineer get his job done,” says Miller. 
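The "searchable structure" for terrain described above can be pictured as a regular grid of height samples with constant-time lookups. The class below is our own minimal sketch for illustration, not the actual VehicleSim plugin API:

```cpp
#include <cassert>
#include <vector>

// A terrain description arranged into a searchable structure: a regular grid
// of height samples that a solver can query in O(1) per wheel-contact lookup.
class TerrainGrid {
public:
    TerrainGrid(int cols, int rows, double cellSize)
        : cols_(cols), rows_(rows), cellSize_(cellSize),
          heights_(cols * rows, 0.0) {}

    void setHeight(int col, int row, double h) { heights_[row * cols_ + col] = h; }

    // Constant-time lookup of the terrain height under a world-space point;
    // points outside the grid clamp to the nearest edge cell.
    double heightAt(double x, double y) const {
        int col = clampIndex(static_cast<int>(x / cellSize_), cols_);
        int row = clampIndex(static_cast<int>(y / cellSize_), rows_);
        return heights_[row * cols_ + col];
    }

private:
    static int clampIndex(int i, int n) { return i < 0 ? 0 : (i >= n ? n - 1 : i); }
    int cols_, rows_;
    double cellSize_;
    std::vector<double> heights_;
};
```

Each wheel-contact query then costs a single array read regardless of terrain size, which is what keeps per-tick solver queries cheap.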
“We preach that the simplest toolchain is the one that&#39;s going to be the most efficient.”<br /> <br /> The plugin has also proven to be useful for training, testing, and previsualization of newly designed vehicles. The team is constantly looking to improve the plugin to better serve their customers, for example, recently adding an FBX converter to bring in physical terrain models that will work with the plugin. <h3><br /> <strong>Plugin use in the field</strong></h3> Using Unreal Engine has provided some additional benefits to customers working on the more aesthetic side of vehicle design. “We have customers iterating in Unreal Engine to study topics such as headlight design, and placement of sensors on different vehicles to optimize sensor coverage at the lowest cost,” says McGinnis.<br /> <br /> One such customer is <a href="" target="_blank">VERTechs</a>, a Tokyo-based company that develops AI technologies for self-driving systems. To help test autonomous vehicles, VERTechs developed a virtual town from scratch called <a href="" target="_blank">AUTOCity</a>.<br /> <br /> “With the behavior control data from CarSim, it&#39;s possible to get an extremely photorealistic video rendered using UE4 on AUTOCity,” says Yoshiya Okoyama, CEO of VERTechs. “Depth data and segmentation images are indispensable for AI learning. Both of them are created at the same time with the technologies of UE4. Furthermore, a simulation of LIDAR can be performed at the same time by creating virtual point-cloud data for assets on AUTOCity. Those parallelized simulations have already been realized on a general-purpose computer in real time.”<br />   <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> Japanese company <a href="" target="_blank">Rikei Corporation</a> is also using the CarSim UE plugin to augment its offerings. 
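The virtual point-cloud LIDAR Okoyama describes can be illustrated with a deliberately simplified 2D sweep: cast rays over a circle of angles and keep the hit distances. In UE4 this would be done with scene line traces against full 3D geometry; everything below, names included, is our own stand-in:

```cpp
#include <cassert>
#include <cmath>
#include <optional>
#include <vector>

const double kPi = std::acos(-1.0);

// A circular obstacle in the 2D plane, centered at (cx, cy).
struct Circle { double cx, cy, r; };

// Distance from a sensor at the origin along direction angle `theta` to the
// obstacle, or no value if the ray misses (standard ray-circle intersection).
std::optional<double> rayHit(double theta, const Circle& c) {
    double dx = std::cos(theta), dy = std::sin(theta);
    double b = dx * c.cx + dy * c.cy;              // center projected onto ray
    double disc = b * b - (c.cx * c.cx + c.cy * c.cy) + c.r * c.r;
    if (disc < 0.0) return std::nullopt;           // ray misses the circle
    double t = b - std::sqrt(disc);                // nearest intersection
    if (t < 0.0) return std::nullopt;              // obstacle behind the sensor
    return t;
}

// Sweep n rays over [0, 2*pi) and collect the hit distances: the "point cloud".
std::vector<double> scan(const Circle& c, int n) {
    std::vector<double> hits;
    for (int i = 0; i < n; ++i) {
        if (auto d = rayHit(2.0 * kPi * i / n, c)) hits.push_back(*d);
    }
    return hits;
}
```

A ray fired straight at a unit-radius obstacle ten meters away reports a range of nine meters; a real scene would simply substitute engine line traces for `rayHit`.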
Rikei develops photoreal virtual spaces for a variety of fields, including vehicle simulation.<br /> <br /> “UE4 can provide us with environments that are quite close to the real ones because of the very high reproducibility of light,” says Takanori Tamura, Sales Manager at Rikei. “UE4 makes it possible to simulate the reflection of road surfaces during and after rain.” <br /> <br /> “Specific weather conditions, including the location of the sun, can be reproduced in UE4,” says Khusinov Jakhongir, Senior Engineer at Rikei. “Testing can be performed in environments that would be dangerous if they were real, such as a very steep slope and a slippery road surface. And UE4 makes it possible to continue testing 24 hours a day, 365 days a year.”<br /> <img alt="blog_body_imgRikei4.jpg" height="auto" src="" width="auto" />As the needs of the vehicle simulation industry continue to grow, the team at Mechanical Simulation is determined to keep growing with them. “The goal is a seamless experience for our customers,” says Miller. “A test engineer might want to run a vehicle maneuver test suite with a sunny day scenario, then complicate the test with the addition of rain, and then with rain at night. He shouldn’t have to think about how to represent those things visually—he just wants to know how the car will perform. Unreal Engine has the capacity to produce these visually pleasing scenarios, pleasing to both human simulation participants and simulated vehicle sensors.”<br /> <br /> Unreal Engine excels at handling complex data sets and turning them into real-time simulation applications to train both humans and machines. 
To get started in this field, <a href="" target="_blank">download Unreal Engine</a>.<br />  AutomotiveBlueprintsDesignEnterpriseManufacturingMarketplaceMechanical SimulationTraining And SimulationVehicleSim DynamicsCommunityCarSimTruckSimSébastien LozéFri, 26 Apr 2019 14:00:00 GMT MegaGrants welcomes Magic Leap creators with hardware support. Magic Leap is the first partner to enter the $100M Epic MegaGrants initiative, offering 500 Magic Leap One Creator Edition headsets for Unreal Engine development. Today at Unreal Engine Build: Detroit ’19, the latest stop in Epic Games’ series of bespoke events serving Unreal Engine enterprise customers, Magic Leap revealed that the company will provide 500 Magic Leap One Creator Edition spatial computing devices for giveaway as part of Epic MegaGrants. Announced last month, <a href="" target="_blank">Epic MegaGrants</a> is Epic Games’ $100 million initiative to support media and entertainment creators, game developers, enterprise professionals, students, educators, and tools developers doing amazing things with <a href="" target="_blank">Unreal Engine</a> or enhancing open-source capabilities for the 3D graphics community. <br /> <img alt="MagicLeap_DEVICEONTABLE_PR_DIGHIGHRES__1_.jpg" height="auto" src="" width="auto" /><br /> Developers building Unreal Engine spatial computing applications across entertainment, architecture, automotive, healthcare, and many other industries can apply now via online submission to receive Magic Leap One devices, free of charge. There is no deadline, with grants awarded on a rolling basis and hardware available on a first-come, first-served basis, based on project merit. 
The Magic Leap One Creator Edition is currently available on <a href="" target="_blank"></a>, in select AT&T stores, and at AT&T online for $2,295.<br /> <br /> “The Epic MegaGrants program allows developers to pursue new goals and raise the bar for what they can accomplish, and we’re glad to support that mission by making Magic Leap One Creator Edition available to creators working in the spatial computing arena,” said Rio Caraeff, Chief Content Officer, Magic Leap. “Putting these devices directly into the hands of promising developers, along with the financial grant from Epic, will help accelerate the industry and lead to new innovation.”<br /> <br /> “We’re thrilled that Magic Leap is offering their support to the Epic MegaGrants program with this generous giveaway of 500 Magic Leap One Creator Edition devices, which offer incredible opportunities to explore applications from digital humans to product design,” said Simon Jones, Director, Unreal Engine Enterprise, Epic Games. “The option to receive this hardware as part of an Epic MegaGrant means that more of the funds can be available to spend in other areas, so developers have more financial flexibility and freedom to create.”<br /> <br /> For more information and to apply, visit <a href="" target="_blank"></a>. 
<br /> <br /> To learn more about Epic’s Build events, watch the recap of <a href="" target="_blank">Unreal Engine Build: Munich ’18</a>.<br /> <br />  Magic LeapEpic MegaGrantsBuild: DetroitDana CowleyThu, 25 Apr 2019 20:00:00 GMT Wolves in the Walls: It's All Over comes to life at the Tribeca Film Festival. Fable Studio Co-Founder and Director Pete Billington shares how Unreal Engine is helping to bring the imaginative world of an award-winning children&#39;s book to life in VR. <a href="" target="_blank">Wolves in the Walls</a> is a connective, character-driven VR adventure that follows the story of Lucy, a young girl who is utterly convinced there are wolves alive and residing within the walls of her home. The audience steps into the world as the imaginary friend of the eight-year-old, who is devoted to proving her unlikely theory to be true. Though a narrative tale at heart, the story of Wolves in the Walls is reactive to the engagement of the viewer, but never waits for a particular choice. The entire experience unfolds like the vague memory of childhood afternoons, playing at a friend’s house, constantly adapting to the spontaneity of each moment.<br />   <div class="embed-responsive embed-responsive-16by9"><iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" src="" width="100%"></iframe></div> <br /> Building Lucy was a monumental task. Her very existence can be described as the true hybridization of game, film, and theater technique. Motion capture, animation, and procedural systems are all seamlessly woven within Unreal Engine Blueprints to make Lucy feel true to life. When you interrupt her mid-sentence, she will respond naturally, hesitate briefly, then pick up right where she left off. <br /> <br /> Lucy’s attention and interest systems are also keenly aware of the audience’s spatial location. 
For example, not only can you connect with Lucy through eye contact as you hand her objects; she can also take things from you, though never in ways that offend social conventions. Her intention detection system predicts user behavior, creating fluid interactivity that prompts the audience to act and react like they would in the presence of a good friend. <br /> <br /> <img alt="WolvesInTheWalls_Attic.jpg" height="auto" src="" width="auto" /><br /> <br /> As the story of Wolves in the Walls develops, Lucy’s world becomes increasingly fantastical. It was not only important, but essential to all of us at <a href="" target="_blank">Fable Studio</a> to capture the feel of the original, <a href="" target="_blank">award-winning</a> <a href="" target="_blank">children’s book</a> written by Neil Gaiman and Dave McKean. To accomplish this, we modified Unreal Engine’s source code to author a complex volumetric watercolor rendering system. Unreal Engine’s flexibility was one of the primary reasons for selecting it for the project. Access to source allowed the team to accomplish a completely unique real-time look. As objects recede, they blend together, melding into a world that feels like a living painting. <br /> <br /> During the climax of the experience, the audience finds themselves within Lucy’s imagination, experiencing her interpretation of the titular wolves while they manifest as intense fever dreams of childlike drawings. Because of the abstract nature of this moment, traditional workflows had to be put aside. Our team needed a way to iterate quickly and efficiently in-engine, as it was critical to make decisions within the context of VR. Enter Unreal Engine’s <a href="" target="_blank">Sequencer cinematic editor</a>. <br /> <br /> Sequencer’s editorial tools enabled the rapid prototyping of this scene. Controlling a complex layering of sound, geometry, haptic feedback, and interactive events would not have been possible without them. 
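As a loose illustration of what an intention-detection signal can look like (emphatically not Fable's actual system, whose internals are not public), one classic ingredient is scoring how directly the user's hand is moving toward an object:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

double dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
double len(const Vec3& v) { return std::sqrt(dot(v, v)); }

// Intent score in [-1, 1]: the cosine of the angle between the hand's velocity
// and the direction from the hand to the target object.
double reachIntent(const Vec3& handPos, const Vec3& handVel, const Vec3& targetPos) {
    Vec3 toTarget{targetPos.x - handPos.x, targetPos.y - handPos.y, targetPos.z - handPos.z};
    double d = len(toTarget), s = len(handVel);
    if (d < 1e-9 || s < 1e-9) return 0.0;  // stationary hand, or already at the target
    return dot(handVel, toTarget) / (d * s);
}
```

A score near 1 suggests a deliberate reach, letting a character begin responding (turning, extending a hand) before the motion completes, rather than waiting for contact.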
<br /> <br /> <img alt="WolvesInTheWalls_Campfire.jpg" height="auto" src="" width="auto" /><br /> <br /> Wolves in the Walls is like nothing else. It represents the first step toward a fully aware, character-driven narrative: a story that truly reacts to audience choice, as well as one that both recognizes and remembers you. Wolves in the Walls imagines a future where you will develop deep bonds and memories with a character over many years, in a relationship that will persist across all forms of media. We cannot share that future with you all soon enough.<br /> <br /> Fable is pleased to premiere Wolves in the Walls’ second chapter for attendees of <a href="" target="_blank">Immersive</a> at the Tribeca Film Festival this weekend, along with its well-received debut chapter, which was unveiled at the 2019 Sundance Film Festival. All three chapters of Wolves in the Walls are planned for release on the Oculus store at a future date. <div style="text-align: center;"><img alt="WolvesInTheWalls_Lucy.jpg" height="auto" src="" width="auto" /></div> Fable StudioTribeca Film FestivalWolves in the WallsGamesVRPete Billington, Co-Founder and Director of Fable StudioThu, 25 Apr 2019 17:00:00 GMT