Why Two Disney Films Rendered Scenes In Unreal Engine 4

K-2SO In Unreal Engine 4

(Image credit: Disney)

We've finally reached the point where the tech and entertainment industries are cross-pollinating with fascinating results. One such crossover is Disney making use of Epic Games' Unreal Engine 4 for two of its movies.

According to Develop-Online, Epic Games CEO Tim Sweeney explained during a GDC 2017 keynote that the droid K-2SO was rendered in Unreal Engine 4 and composited into final shots of Rogue One: A Star Wars Story. He also mentioned that a scene from Finding Dory was rendered in real-time in Unreal Engine 4 using Pixar's USD (Universal Scene Description) format. Epic and Pixar have been working together to get Unreal Engine 4 to natively read USD files for future use.
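For context, USD is a scene description format and toolset that Pixar has open-sourced for exchanging 3D scenes between applications. The snippet below is a minimal sketch using Pixar's publicly available Python bindings (the pxr module) to author a tiny USD scene file; the file and prim names here are invented purely for illustration and have nothing to do with Disney's actual assets.

```python
# Minimal sketch: authoring a tiny USD scene with Pixar's open-source
# Python bindings (the "pxr" module). File and prim names are invented
# purely for illustration.
from pxr import Usd, UsdGeom

# Create a new USD stage backed by an ASCII .usda file.
stage = Usd.Stage.CreateNew("droid_example.usda")

# Define a transform prim as the scene root, plus a mesh prim under it.
root = UsdGeom.Xform.Define(stage, "/Droid")
body = UsdGeom.Mesh.Define(stage, "/Droid/Body")

# Mark the root prim as the default so other applications know where to start.
stage.SetDefaultPrim(root.GetPrim())

# Write the scene description to disk; any USD-aware tool can open it.
stage.GetRootLayer().Save()
```

An engine that "natively reads USD" would load a stage like this directly, rather than first converting the assets into its own intermediate format.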

Sweeney attributed the feat to the power of Unreal Engine 4, saying...

Unreal Engine is the only way to achieve these final, photorealistic pixels in real time. It's the only renderer that does that.

It is true that Unreal Engine 4 has just about everything you need within its core toolset to render photorealistic graphics, assuming you have high-fidelity assets and a proper lighting setup. However, it is a stretch to say that Unreal Engine is the "only way" to achieve that kind of imagery in real-time, and it undersells the achievements of Unity Technologies.

Unity 5 can also render photorealistic visuals in real-time. It doesn't ship with all of those tools out of the box, however; you need to download additional plugins and assets, such as volumetric fog effects, real-time physically based area lights, planar reflections and certain shader effects, to achieve that photorealistic look. But it's all possible, and it runs in real-time just like Unreal Engine 4. Unity Technologies demonstrated this with the Adam tech demo a while back and later released some of the specialized tools and assets used to achieve that level of realism on the Unity blog.

Nevertheless, both engines are impressive and let you do some remarkable things. For instance, Unreal Engine 4's ability to render performance capture in real-time, with instantaneous CGI output, makes it a highly valuable asset for both games and movies. You can see what the final performance will look like on the spot, without sending it off to the animation, cinematic or art crews to clean up in post.

In fact, the same approach used to achieve such realistic results with the droid in Rogue One: A Star Wars Story is what Ninja Theory is using for its upcoming game Hellblade: Senua's Sacrifice. The studio has worked with facial capture middleware companies to pipe performance data through plugins directly into Unreal Engine 4's real-time renderer, so an actor's performance can be rendered to the screen in real-time, in-engine. That lets Ninja Theory transition seamlessly from gameplay to cinematics within the same runtime environment. How well this works in execution remains to be seen.

Still, Sweeney is right insofar as Unreal Engine 4 is such a capable tool for achieving this level of fidelity that Hollywood studios are using it to render scenes and characters for their movies. It also helps that it's one of the most powerful and cost-effective design tools on the market at the moment.

Will Usher

Staff Writer at CinemaBlend.