Hellblade was originally supposed to be one of the early games to launch with the PS4, but it was pushed back, and pushed back, and pushed back some more. The game is now scheduled to release on PC and PS4 sometime in 2017. Strange as it sounds, the delay may actually be good news for the gaming industry.
That claim probably doesn't make much sense on its own, but the reason for the delay is a significant step forward in technology that could drastically cut the cost of producing games.
In earlier developer diaries, Ninja Theory explained that a very small team was working on Hellblade: Senua's Sacrifice, focused on telling a compelling and visually captivating story. To make the performance feel raw and authentic, they needed an actress at the center of it.
However, they planned on expanding Hellblade's tale using a flexible workflow. Rather than going through the typical performance-production phase of development and then turning to gameplay mechanics, they wanted to capture the mood and emotion organically, in real-time, and see how it would look in a final render pass. That meant skipping the weeks or months normally spent shuttling motion and performance capture data from a separate production studio through a team of riggers and animators, and then through another team to clean it all up.
So what does this have to do with the gaming industry at large? Ninja Theory tweaked Unreal Engine 4 to run Enlighten's lighting middleware and Cubic Motion's facial pipeline suite through Epic's engine -- in real-time and in-engine -- so they could capture a performance and see it rendered right there on screen the way it would look in the final game. Unreal Engine 4 was capturing real motion performance data without a separate pipeline to feed it through. It all happened on site with the actor, design engineers, and sound crew, no separate production studio required.
This drastically speeds up the performance capture process: designers can see exactly what a performance will look like in the actual game and tweak it as necessary. The same technology could benefit any future performance capture project built on Unreal Engine 4. They briefly discuss it in the video below.
So while the delay is bad in that gamers have to wait longer for Hellblade to arrive on PC and PS4, it's good in another sense. If Ninja Theory can standardize this workflow and turn it into an easy-to-implement plugin, other developers could do performance capture in-house without outsourcing cinematics, and could bring more authentic facial capture to moments outside of CG sequences. That could be huge for the way gaming studios approach game design going forward. Definitely exciting times ahead.