Wii U's Memory Bandwidth, GPU More Powerful Than We Thought?
So what does the bandwidth of the Wii U's eDRAM have to do with anything? According to some die-hards out there, the Wii U should be capable of tessellation displacement since it has the bandwidth to handle those procedures with ease, thanks to the inclusion of NEC's proprietary eDRAM die in the Wii U. The importance of CPU/GPU fetch timing in relation to tessellation is highlighted in Matthias Niessner's thesis on hardware tessellation capabilities.
All of that is to say that perhaps the Wii U's design infrastructure lends itself more toward forward-thinking scaling than we all thought before, despite a lot of Nintendoom talk from some developers.
Previously we spoke with Unity Technologies' David Helgason about the Wii U's GPGPU scaling properties for things such as shader model 4.0 and Unity's ability to tap into the advanced features of DirectX 11. Helgason acknowledged that while the Wii U doesn't have 1:1 DirectX 11 capabilities, Unity would make it possible for developers to tap into equivalent DX11-exclusive features.
Interestingly enough, it wasn't but a short while ago that Teku Studios acknowledged that they would be utilizing the Unity Engine to bring DX11 equivalent features to the Wii U in their game, Candle.
When contacted about the Wii U's capabilities for direct tessellation displacement, however, Teku Studios declined to comment, citing an NDA regarding Nintendo's hardware.
DX11 racer Project CARS is also making use of multi-threaded support and high-end GPU features for the Wii U, but once again, when pressed as to whether advanced forms of tessellation displacement would be used, the team at Slightly Mad Studios declined to comment.
We did, however, get an off-the-record comment about the Wii U's access to shader model 5.0 from a developer working on an upcoming Wii U title, who – under anonymity, of course – stated that...
“The answer is no. [The Wii U can't] use [shader model 5.0] because it's Microsoft exclusive.”
He did go on to say that...
“You can do some stuff... tessellation [is possible]. You can do that on the Wii U.”
He also mentioned that direct access to these features through an API like DirectX isn't possible, but there are workarounds to get equivalent effects and results using the Wii U's GPGPU.
It's been hotly debated whether, even though the Wii U is capable of these things, its wattage cap prevents it from scaling up to, or utilizing, high-end features in big, graphically demanding games.
However, software engineer Francisco Javier Ogushi Dominguez has argued that the Wii U has a lot more room to scale than we all originally thought.
Dominguez noticed that the Wii U has a Class A external power supply. Class A denotes an energy-efficient, low-consumption unit, as noted by EDN and outlined in government energy ratings.
The Wii U's power draw has been estimated at an average of 33 watts under load, according to Eurogamer.
Dominguez stated that...
“...if the [Wii U] has a power supply with efficiency of more or equal to 85% then we are talking about 62 watts or more for the [Wii U] out of those 73 watts labeled at [the] power supply and the rest is dissipated as heat.”
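Dominguez's arithmetic checks out. As a quick sanity check – using the figures from his quote, with the 85% efficiency being his assumption rather than a published spec – the math looks like this:

```python
# Sanity check of the power-supply arithmetic quoted above.
# The 73 W figure is the label on the Wii U's external supply;
# the 85% efficiency is Dominguez's assumption for a Class A unit.
RATED_WATTS = 73
ASSUMED_EFFICIENCY = 0.85

usable_watts = RATED_WATTS * ASSUMED_EFFICIENCY   # roughly 62 W reaches the console
heat_watts = RATED_WATTS - usable_watts           # the remainder dissipates as heat

print(f"Usable power: ~{usable_watts:.0f} W")
print(f"Lost as heat: ~{heat_watts:.0f} W")
```

That puts the console's theoretical ceiling well above the 33 watts Eurogamer measured under load.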
He notes that this already accounts for the power consumed by the disc drive, USB ports and SDHC slot while running games.
The most striking of what he had to say was this gem right here...
“I am sure you have heard the rumors about the e6760 being the base of the [Wii U] just to tell you, the e6760 has a performance of 16.5 gigaflops per watt, so if the [Wii U's GPGPU] has a similar efficiency, then if at least we consider 10 watts out of those 18.33 watts (or maybe more) we would be speaking about 165 gigaflops that many game[s] waste due to being quick ports.”
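Running the numbers from that quote – where the e6760 comparison, the 10-watt GPU allocation and the 18.33-watt figure are all Dominguez's assumptions, not confirmed specs – gives:

```python
# Sanity check of the GFLOPS estimate quoted above.
# 16.5 GFLOPS/W is the e6760's claimed efficiency; the 10 W GPU
# allocation (out of an assumed 18.33 W budget) is Dominguez's guess.
GFLOPS_PER_WATT = 16.5
ASSUMED_GPU_WATTS = 10

estimated_gflops = GFLOPS_PER_WATT * ASSUMED_GPU_WATTS
print(f"Estimated GPU throughput: {estimated_gflops:.0f} GFLOPS")  # 165 GFLOPS
```

So the headline figure of 165 gigaflops follows directly from those two assumed inputs.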
Well doesn't that shed a bit of light on the situation?
This might also explain why Nintendo's first-party games look and run so well at high resolutions and stable 60 frames per second. They're tapping into the system's actual capabilities. Now the real question is: will third-party developers start tapping into the system's actual capabilities?