In the war of resolution and frame-rates, it's indisputably accepted that 60 frames per second is king. You can't deny it, you can't argue it, and if you're playing on a potato in a multiplayer game where you're running half the frames of your online opponents, you're likely to get your butt pounded worse than those fresh fish in the Penitentiary movies.
Still, Xbox One's developmental design director, Boyd Multerer, had tons of wisdom and insight to share about the Xbox One in his lengthy interview on Total Xbox.
As pointed out by Gear Nuke, Multerer addressed the resolution war that the Xbox One is losing worse than Ukrainians against Putin's mob squad. However, Multerer redirects the war effort to a different front: the frame-rate front.
“Part of it is the obvious one where everyone’s still getting to know this hardware and they’ll learn to optimise it. Part of it is less obvious, in that we focused a lot of our energy on framerate. And I think we have a consistently better framerate story that we can tell.”
Pardon me, but didn't Dead Rising 3 run at a resolution worse off than Ghana's economic growth, with frame-rate dips that manage to make Tanzania's GDP look impressive?
Also, if I were Boyd, I would be too ashamed to even bring up resolution and frame-rate comparisons on the Xbox One, especially when titles like Call of Duty: Ghosts and Battlefield 4 run worse on Microsoft's system, before and after the PS4 versions were patched up to their full potential.
And even ignoring resolution, did Boyd so quickly forget about Tomb Raider: Definitive Edition running at 1080p on both consoles, with the PS4 managing 60fps while the Xbox One settled for the welfare-rate of 30fps? The frame-rate history doesn't even fall in favor of the Xbox One.
Sorry to say (but really I'm not), the resolution and frame-rate of the Xbox One put it in a category all its own: the "help a console" foundation category.
Donate a few pixels to your nearest Xbox charity center; we all know they could use a few million.
Anyway, Multerer does talk about something that brings things back down to reality: memory bandwidth.
We've discussed at great length the Wii U's memory bandwidth and its potential to excel above and beyond the limitations that third-party devs haven't been exploring, and Multerer mentions the exact same thing for the Xbox One, saying...
“The GPUs are really complicated beasts this time around. In the Xbox 360 era, getting the most performance out of the GPU was all about ordering the instructions coming into your shader. It was all about hand-tweaking the order to get the maximum performance. In this era, that’s important – but it’s not nearly as important as getting all your data structures right so that you’re getting maximum bandwidth usage across all the different buffers. So it’s relatively easy to get portions of the GPU to stall. You have to have it constantly being fed.”
This actually makes a lot more sense, both structurally and realistically, compared to all that other talk about the cloud and magical unicorn stacks.
Whether or not we'll see this “Banana Surprise” apply to third-party titles seems about as likely as third-party developers spending time to make use of the Wii U's secret sauce.
Still, Microsoft doing something to bring the Xbox One up to parity with the PlayStation 4 is better than continuing to let multiplatform titles release in a state where the PS4 version looks and runs like a high-class business executive and the Xbox One version looks like some kind of third-world tatterdemalion.
Too bad the games that will make use of Microsoft's "Banana Surprise" won't be hitting the market until a few years from now.