The performance analysis for Diablo III: Ultimate Evil Edition has surfaced, and it's caused quite a ruckus for a couple of reasons. First, Diablo III initially wasn't running at 1080p on the Xbox One due to performance issues; the GPU simply couldn't handle it. That drew a lot of backlash from gamers who wondered how the Xbox One would make it five years into the eighth gen if it can't handle a previous-gen game at native 1080p and 60fps. Second, it highlights a persistent problem for the Xbox One: a weak GPU.
The inability to hit 1080p in Diablo III actually prompted Microsoft to step in and push the developers to make modifications so the game would reach that resolution. Blizzard's production director, John Hight, told Eurogamer that...
"We did find it challenging early on to get it to 1080p. That's why we made the decision to drop to 900. That's what we demoed and were showing around E3 time. And Microsoft was just like, 'This is unacceptable. You need to figure out a way to get a better resolution.' So we worked with them directly, they gave us a code update to let us get to full 1080p."
Some of you might be wondering why Microsoft would find it unacceptable, but remember that 1080p and 60fps is a selling point; it's a marketing tactic for the eighth gen that says the console is worth the price of entry. It's also common sense: would you pay $500 for a console that can't play games at 1080p and 60fps when you could pay $400 for a console that you know for a fact can?
Anyone can make excuses about price differences and performance not mattering, but as I mentioned in an article last year: if there were two GPUs and one was $100 cheaper while delivering higher performance, there wouldn't even be a discussion.
In this case, however, 1080p on the Xbox One does come at a cost: slightly lower-resolution shadow maps – as noted in the performance analysis article by Digital Foundry – as well as a scaling back of some effects. Even so, it isn't enough to prevent frame drops when things get busy with four players on screen all lobbing mad magic at mobs.
You can see the frame-rate drops in action with the video below, courtesy of Digital Foundry.
It's an interesting contrast, because the developers at MixedBag felt that frame-rate was definitely a priority over resolution. One of the developers said that 60fps was a must, no matter what.
Richard Leadbetter of Digital Foundry also questions Microsoft's decision to push up the resolution at the cost of performance, writing...
“Hight's comments at Gamescom last week, along with reports that Microsoft sent engineers to Bungie to aid in pushing Destiny to 1080p, suggest that the platform holder is keen on reducing resolution differences in multi-platform games where possible. But if frame-rate takes a hit as a consequence, is this necessarily the best course of action?”
It does raise the question of whether this will become a regular scenario for Xbox One games that can't reach 1080p. I'm immediately reminded of The Witcher 3 having trouble hitting 1080p – will CD Projekt RED tone down the graphics and risk having performance and graphics comparisons rip the Xbox One version to shreds, or will they attempt to maintain high fidelity at 1080p at the expense of frame-rate stability?
This is going to be an ever-present issue for every game released on the Xbox One, and it's not going away, because optimized dev kits can't fix hardware ceilings.