Wii U Hardware Investigation

eyeofcore

Member
Oct 1, 2013
50
0
0
Wii U CPU:
- Tri-core IBM 45nm PowerPC 750CL/G3/POWER7 hybrid
- L2 cache: Core 0: 512KB, Core 1: 2MB, Core 2: 512KB
- Clocked at 1.24GHz
- 4-stage pipeline; not a Xenon/Cell-style CPU-GPU hybrid
- Produced at an IBM advanced CMOS fabrication facility
- L2 cache is eDRAM embedded in the CPU (a POWER7-style memory implementation)

*The Wii's CPU core was only 20% slower than an Xbox 360 core; since the Wii U CPU is a modified/enhanced version clocked 65-70 percent higher, two Wii U cores should be on par with or exceed all three Xbox 360 cores, and if all three cores are used then the Wii U CPU is 50+ percent stronger/faster than the Xbox 360 processor and faster than the PlayStation 3 processor. http://gbatemp.net/threads/retroarch-a-new-multi-system-emulator.333126/page-7#post-4365165

*X360 Xenon: 1879.630 DMIPS × 3 = 5638.89 DMIPS @ 3.2 GHz vs Wii U Espresso: 2877.32 DMIPS × 3 = 8631.96 DMIPS @ 1.24 GHz (see the sketch below)
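
A minimal sketch of the per-MHz comparison this implies, assuming the quoted per-core DMIPS figures (forum benchmark results, not official numbers) are accurate:

```python
# Per-core DMIPS figures as quoted in this thread; treat them as assumptions.
def dmips_summary(name, dmips_per_core, cores, clock_ghz):
    total = dmips_per_core * cores
    per_mhz = dmips_per_core / (clock_ghz * 1000)
    print(f"{name}: {total:.2f} DMIPS total, {per_mhz:.3f} DMIPS/MHz per core")

dmips_summary("Xenon (X360)", 1879.630, 3, 3.2)      # ~0.587 DMIPS/MHz per core
dmips_summary("Espresso (Wii U)", 2877.32, 3, 1.24)  # ~2.320 DMIPS/MHz per core
```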

*X360 Xenon: 32-40-stage pipeline vs Xbox One/PlayStation 4 Jaguar: 16-stage pipeline vs Wii U Espresso: 4-stage pipeline

*X360 Xenon in-order 3 Execution Units vs Xbox One/PlayStation 4 Jaguar out-of-order ? execution units vs Wii U Espresso out-of-order 15 execution units

*X360 Xenon 1MB shared L2 Cache vs Xbox One/PlayStation 4 4MB shared L2 cache vs Wii U Espresso 2MB/512KB/512KB L2 cache

Wii U CPU is next generation compared to Xbox 360 and people that say otherwise should just shut up, be ashamed, have a seat, accept the facts and deal with it and here is the proof; http://systemwars.com/forums/index.php/topic/112794-no-freaking-way-40-stage-pipeline/

*Since the Xbox 360 has a 32-to-40-stage pipeline and the PlayStation 3 does as well, both suffer severe penalties in terms of usable performance, while the Wii U with its 4-stage pipeline is far more efficient in comparison, so more of its performance can actually be used; imagine how bad the Xbox 360/PlayStation 3 are in comparison to the Wii U;

http://www.youtube.com/watch?v=w9VWRB07yqc

*The Wii U CPU, codenamed Espresso, could actually be three to four times faster than the Xbox 360's Xenon, since Xenon has less cache, far fewer execution units, and a pipeline 8 to 10 times longer, which is awful for a lot of tasks; it is also in-order, versus Espresso, which is out-of-order and thus better at predicting code, better for AI, AI pathfinding, and so on...

- Dual-core ARM Cortex-A8 for background OS tasks, clocked at 1GHz, with 64KB of L1 cache per core and 1MB of SRAM as L2 cache; an evolution of the "Starlet" chip
- Single ARM9 "Starlet" core for backward compatibility, rumored to run at higher clocks

Wii U Memory:
- DDR3 2GB, 1GB for OS, 1GB for Games
- eDRAM: 32MB VRAM + 4MB gamepad + 3MB CPU L2 cache
- Clocked at 550MHz
- eDRAM acts as a "unified pool of memory" for CPU and GPU, practically eliminating latency between them

*The Wii U's DDR3 RAM bandwidth has a theoretical maximum of 51.2GB/s, since it has four 512MB chips and not one large 2GB chip, so anyone thinking its maximum bandwidth is a mere 12.8GB/s is tech illiterate (see the sketch below). The Xbox 360 had a theoretical maximum of 22.8GB/s, though a bottleneck in its poor FSB turns that into a mere 10GB/s.
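
For what it's worth, the peak-bandwidth formula both sides of this argument rely on is simple; here is a minimal sketch, where the channel layout assumed for each scenario is doing all the work (the thread disputes which layout is real):

```python
# Peak DDR3 bandwidth: GB/s = (bus width in bits / 8) * transfer rate (MT/s) / 1000.
def peak_bandwidth_gbs(bus_bits, mt_per_s):
    return bus_bits / 8 * mt_per_s / 1000

# Four x16 DDR3-1600 chips sharing a single 64-bit bus:
print(peak_bandwidth_gbs(64, 1600))      # 12.8 GB/s
# The 51.2 GB/s figure requires each chip to have its own 64-bit channel:
print(peak_bandwidth_gbs(4 * 64, 1600))  # 51.2 GB/s
```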

*The Xbox 360 has GDDR3 RAM that is bottlenecked by Xenon's FSB, so it cannot saturate its theoretical maximum of 22.8GB/s since the FSB can only handle 10GB/s, and the latency of GDDR3 is atrocious compared to the DDR3 RAM in the Wii U. Latency is very important for the CPU, since the lower the latency, the faster the transfers between CPU/GPU and RAM. The PlayStation 4 will have similar latency issues as the Xbox 360, since GDDR5 is the successor to GDDR4, which succeeded GDDR3, and all of them have higher latency than DDR3.

Wii U GPU:
- VLIW5/VLIW4 Radeon HD 5000/6000, 40nm
- DirectX 11/OpenGL 4.3/OpenCL 1.2/Shader Model 5.0
- Supports GPGPU compute/offload
- Customized, using a custom Nintendo API codenamed GX2 (the Gamecube had GX1)
- Clocked at 550MHz
- Produced on TSMC's advanced 40nm CMOS process
- Uses 36MB of eDRAM as VRAM
- 4MB of the eDRAM is allocated to the streaming feed for the gamepad
- The GPU is customized, and judging by the GPU modifications in their previous consoles, we can presume Nintendo won't waste a single mm^2 on unneeded features and tailors the chip to its own needs.

*Eyefinity has been present on AMD cards since the Radeon HD 5000 series, so it is at least a Radeon HD 5000 with TeraScale 2.

*If it were a Radeon HD 4000 series part with 320 SPUs, it would be the Radeon HD 4670 at 55nm; but since the Wii U uses Eyefinity and the GPU is 40nm, the Radeon HD 4000 theory is invalid.

*The Radeon HD 6000 series was released in Q3 2010; final Wii U silicon was finished in Q2 2012, and the Wii U shipped in Q4 2012. Looking at the gap between E3 2011 and the final silicon in Q2 2012, and given that the Radeon HD 6000 evolved from the HD 5000, which evolved from the HD 4000, I presume switching to a newer though similar GPU and architecture was not a problem; all of these GPUs were produced at TSMC's fabs.

*The Wii U development kits from E3 2011 had a Radeon HD 4850; it is rumored that a newer development kit replaced the 4850 with a modified/customized/cut-down Radeon HD 6850.

*The Radeon HD 6850 delivers 1.5Tflops at a 775MHz clock, with 1GB of GDDR5 VRAM and 130GB/s of bandwidth.

*The GPU in the Wii U is clocked about 30% lower, at 550MHz, and if it has 1/3 of the SPUs it has 0.352Tflops (see the sketch below). Its 36MB of eDRAM has 70-130/275/550GB/s of bandwidth, and 2-player co-op, as in Black Ops 2's zombie mode, uses Eyefinity(?) to stream two different in-game images/views.
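
A minimal sketch of the Gflops arithmetic used throughout this thread, assuming AMD's usual VLIW figure of 2 FLOPs (one multiply-add) per stream processor per cycle; the SP counts are this thread's guesses, not confirmed specs:

```python
# Shader throughput: GFLOPS = SPs * FLOPs-per-SP-per-cycle * clock (GHz).
def shader_gflops(stream_processors, clock_ghz, flops_per_sp_cycle=2):
    return stream_processors * flops_per_sp_cycle * clock_ghz

print(shader_gflops(960, 0.775))  # 1488.0 -> the HD 6850's ~1.5 Tflops
print(shader_gflops(320, 0.550))  # 352.0  -> the 0.352 Tflops claim
print(shader_gflops(384, 0.550))  # 422.4  -> the later ~420 Gflops claim
```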

*Since the eDRAM in the Wii U's GPU, codenamed Latte, is embedded, its theoretical maximum bandwidth is 275GB/s or even 550GB/s; http://www.ign.com/boards/threads/official-wii-u-lobby-specs-graphics-power-thread.452775697/page-203#post-481621933

*90nm 16MB eDRAM can do 130GB/s of bandwidth, so the 45nm 32MB eDRAM in the Wii U should do at least the same; plus, since the CPU's cache and the GPU both use eDRAM, latency is much, much lower and AI can be offloaded to the GPU. When embedded into the GPU it should do 275/550GB/s (see the sketch below).
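
Hedging these numbers: eDRAM bandwidth is just port width times clock, so the headline figure depends entirely on an assumed bus width. A minimal sketch; the widths below are illustrative assumptions, and Exophase argues later in this thread for the 1024-bit case:

```python
# eDRAM bandwidth: GB/s = (port width in bits / 8) * clock (MHz) / 1000.
def edram_bandwidth_gbs(port_bits, clock_mhz):
    return port_bits / 8 * clock_mhz / 1000

print(edram_bandwidth_gbs(1024, 550))  # 70.4 GB/s with a 1024-bit port at core clock
print(edram_bandwidth_gbs(4096, 550))  # 281.6 GB/s only if the port were 4096-bit
```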

Wii U Note;
- Can't use DirectX 11 because of the OS; only OpenGL 4.3/Nintendo's GX2 API, which is ahead of DX11
- Easier to program; none of the multiple bottlenecks that cause issues on the Xbox 360 and PlayStation 3
- Efficient hardware with no bandwidth bottlenecks
- Launch games and ports use 2 cores, and only a couple of ports/games use the 3rd core to a decent degree
- The Wii U CPU does much more per cycle/MHz than the Xbox 360/PlayStation 3; it is unknown if it is faster overall
- The Wii's CPU core was slower
- A minor POWER7 memory-related implementation allows shaving 10 or more percent off texture size without loss of quality (special compression or something)
- Wii U power consumption at full system load is 30-35 watts (the highest load possible in its current state)
- The Wii U's PSU/power brick is rated at 75 watts with 90% efficiency, so it can deliver 67.5 watts
- The Wii U's flash storage, RAM, USB ports, motherboard, fan, and small secondary chips consume around 5 to 10 watts in total when fully stressed
- The Wii U SoC's (CPU and GPU) estimated maximum power consumption is 20 to 25 watts
- Supports 4K displays, native 4K via HDMI 1.4b (possible 2D 4K games?)

*It may in fact be the most efficient chip in the world in performance per watt among 45/40nm chips/SoCs.

*The Wii U's power brick/PSU is rated at 75 watts with 90% efficiency, so it can deliver at most about 68 watts without seriously degrading; since the Wii U consumes 30 watts, there are about 38 watts available, though I would only dare bump consumption to 60 watts, i.e. 30 additional watts, if I could increase performance (see the sketch below).
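
A minimal sketch of that budget arithmetic; every input is this thread's own estimate (including its reading of the PSU rating), not a measured figure:

```python
psu_rated_w = 75.0
psu_efficiency = 0.90
deliverable_w = psu_rated_w * psu_efficiency  # ~67.5 W usable, per the post
system_load_w = 30.0                          # the post's claimed full-system load
headroom_w = deliverable_w - system_load_w
print(f"deliverable: {deliverable_w:.1f} W, headroom: {headroom_w:.1f} W")  # ~37.5 W
```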

*If the maximum power consumption of the Wii U is 40 watts, and by my calculation the GPU consumes a mere ~10 watts, then the Wii U's CPU could consume 15 to 20 watts and the rest of the system around 10 to 15 watts, depending on how much the CPU actually draws, which is the unknown factor. I am not counting any possible customizations of the Wii U's GPU; these are all rough estimates and we don't know the whole picture. The Wii U is a beast considering the process node it was built on, and is probably the most efficient machine on that node in the world.

In case you are wondering why some games run worse on the Wii U than on 7th-generation consoles, I have a simple explanation: the Wii U's hardware is noticeably different from the Xbox 360's or PlayStation 3's, because their processors are not pure CPUs; they handle GPU-related tasks well, unlike the Wii U's, which is primarily a CPU. Another reason is that developers don't spend the resources and time to port their games properly to the Wii U, so the ports lack proper optimization and adaptation of their engines to the Wii U's hardware. And even though some ports perform worse than on 7th-generation consoles, in most cases they run at a higher resolution and/or native 720p/HD.

Most if not all 3rd-party launch games were made on older Wii U development kits with 20 to 40% lower clocks than the final development kit Nintendo released near launch, so developers did not have much time to adapt their games to the final devkit and the games ran poorly. Games like Darksiders 2, Warriors Orochi 3 Hyper, and Batman: Arkham City used only one or two cores, while the third was not used at all, or was barely used, to aid the game's CPU-related performance. And since most ports come from the Xbox 360 and/or PlayStation 3 versions, there are bound to be incompatibilities: the Xbox 360 and PlayStation 3 processors do CPU and also GPU tasks, and the GPUs, RAM/memory, latency, and other factors all differ from the Wii U, so optimization and adaptation are needed, though cheap ports, as always, tend to be a train wreck. Don't you agree?

The Xbox 360 and PlayStation 3 will be supported for the next three years, and that is in a way a negative thing, since it can really hold back the performance of games on the Wii U if those games are mostly ports from the Xbox 360 and PlayStation 3, whose architectures are vastly different from the Wii U's. We may only see the Wii U shine in three years, when the Xbox 360 and PlayStation 3 stop being supported; until then, those two consoles will hold back development and the time spent on the Wii U.

The Wii U may not be a "leap" like the Xbox One or PlayStation 4, though it is a leap over the Xbox 360 and PlayStation 3 even on a rough view, and when you take into account all the implementations, features, and "tricks," the gap is even bigger. The Wii U has more embedded RAM than the Xbox One: the Xbox One has 32MB of eSRAM, while the Wii U has 36MB of superior eDRAM for the GPU. The eDRAM also blows away the GDDR5 in the PlayStation 4 in terms of bandwidth, if I am correct? 130/275/550GB/s on the Wii U versus 80GB/s on the PlayStation 4?

We need to take into consideration that the Wii U's CPU, Espresso, has a certain implementation from the POWER7 architecture that allows the use of eDRAM. We know the CPU in the Wii U has a total of 3MB of eDRAM as L2 cache, and it could also use the main eDRAM pool as an L3 cache while maintaining a connection with the GPU, creating HSA/hUMA-like capabilities and greatly reducing the latency of CPU-GPU communication and data transfer.

The Wii U's GPU was a Radeon HD 4000 series part in the very first alpha development kit, specifically the Radeon HD 4850, which was 55nm and not the 40nm RV740. By the time of the Wii U's first unveiling, the Radeon HD 6000 had been on the scene for nearly a year (now almost three years), so Nintendo could easily have switched to the Radeon HD 6000 series, since it is basically an evolution of the Radeon HD 5000, itself a refinement of the Radeon HD 4000 series. This is further supported by the Eyefinity-like features used with the Wii U's gamepad, as shown by a Unity demo on the Wii U and by Call of Duty: Black Ops 2 in co-op zombie mode. The Wii U can also stream up to two images to gamepads, though that hasn't been used yet; maybe it will be in the near future.

Also, people seem to forget about the power consumption of dedicated GPUs versus ones embedded on the motherboard, which naturally consume less; plus the Wii U's GPU uses eDRAM, which consumes 1 to 2 watts at most, compared to GDDR5, which consumes around 9 or more watts per 2GB. The GPU in the Wii U is embedded, so it does not need PCI-E, an additional PCB, or extra chips, and with the eDRAM embedded in the die its power cost could be nearly negated, reduced to practically zero.

Let's take, for example, the Wii U GPU's die size and the Radeon HD 6970's die size, and assume the Wii U GPU is a VLIW4-based Radeon HD 6000 series part rather than VLIW5. The Radeon HD 6970 consumes 250 watts maximum, has a 389mm^2 die with 2GB of GDDR5, and is clocked at 880MHz. Cut it down to 320 SPUs, about 80mm^2, and consumption drops to roughly 83 watts; remove the 2GB of GDDR5 and it goes down to 70-73 watts; now lower the clock from 880MHz to 550MHz, roughly 35% lower, and since a 25% clock cut can already halve power consumption, at 35% the GPU drops from 70-73 watts to roughly 14-15 watts without even being embedded, and we could easily shave off a couple more watts, most likely getting down to 10 watts (see the sketch below).
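
The "clock cut halves power" step leans on a standard rule of thumb: dynamic power scales roughly with f × V², and a lower clock usually permits a lower voltage. A minimal sketch; the voltages below are purely illustrative assumptions, not Wii U data:

```python
# Dynamic power scales roughly with frequency * voltage^2.
def dynamic_power_scale(f_new, f_old, v_new, v_old):
    return (f_new / f_old) * (v_new / v_old) ** 2

# 880 MHz at a hypothetical 1.10 V down to 550 MHz at a hypothetical 0.90 V:
print(f"{dynamic_power_scale(550, 880, 0.90, 1.10):.2f}x")  # ~0.42x the power
```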

We can't really compare the Wii U's GPU to any off-the-shelf dedicated GPU, since some features found in regular dedicated GPUs aren't needed when the GPU is embedded; with some minor modifications I can see the Wii U having 384 SPUs, which at 550MHz would be about 420 Gflops, rather than the 320 SPUs you would expect of a standard dedicated GPU with that die size. If Nintendo made drastic modifications, they could even reach 500 SPUs and get very close to 600 Gflops. One homebrew hacker counted 600 shaders, so I wonder if Nintendo is using a technology that AMD has never used, inherited from ATI (who also never used it), called "high density," which is going to be used in AMD's Steamroller cores; from the information AMD has released, "high density" can increase chip density by 30%, reduce size by 30%, and reduce power consumption.

*I won't link most of this information, though I can assure you I did a lot of research and digging on the internet, collecting bits and pieces and putting them together into one picture, until I had that feeling called "a-ha!" or "EUREKA!!"
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
Because of this I went out and purchased 5 Wii Us. Thank you so much, they are so fast I put wheels on one of them and I can do burnouts on the highway. All my Xbox friends are so jelly they can't do burnouts on the highway with their Xbox 360s. They were like, why is your Wii U so fast, and I was like: because it's a next-gen CPU!

They were like oh damn, Ben, you are so cool, let's hang out. But I couldn't because all the babes were so impressed with my Wii U that they just wanted to smash all day. Thank you so much eyeofcore.
 

eyeofcore

Member
Oct 1, 2013
50
0
0
Because of this I went out and purchased 5 Wii Us. Thank you so much, they are so fast I put wheels on one of them and I can do burnouts on the highway. All my Xbox friends are so jelly they can't do burnouts on the highway with their Xbox 360s. They were like, why is your Wii U so fast, and I was like: because it's a next-gen CPU!

They were like oh damn, Ben, you are so cool, let's hang out. But I couldn't because all the babes were so impressed with my Wii U that they just wanted to smash all day. Thank you so much eyeofcore.

Ok... Should I laugh or judge you? IDK... Sarcasm?

No need to thank me, I love researching stuff, and a lot of so-called gamers hate the Wii U just because it is Nintendo's home console; they all talk and spout obvious BS, which shows how FOS they are. I don't know much about the Wii U's GPU, since I can only speculate, though the evidence and research at least support that the Wii U CPU, codenamed Espresso, is much faster. I assume the ports used one to two cores; I know Batman: Arkham City used two cores, though all of those games were ports of the Xbox 360 versions, and some of them were ports of the PlayStation 3 versions.

It is crazy how everyone loves to take a dump on Nintendo and the Wii U. The Wii U reminds me of the Dreamcast because of EA's flip-flopping. The main reason the Wii U lacks huge third-party support is not its architecture, which is actually far simpler than the Xbox 360's and PlayStation 3's, nor the performance of the hardware, but that Nintendo denied the extreme DRM measures that 3rd-party developers wanted, so the Wii U was trashed. Then Microsoft announced DRM in the Xbox One and the customer backlash was of epic proportions, so they saw it wouldn't work; Sony had agreed to the same thing as the Xbox One, though after they saw the backlash they did a 180 to save face. That is the truth and the fact, because I won't be blindsided by 3rd parties and Sony's ninja smoke bombs; business is also politics, and we all know that in politics there are always conspiracies and deals under the table that only those with deep roots can know about, though those who do research will find out and uncover them.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
I've never really had a problem with Nintendo hardware. Not since they learned how to program their way around the SNES slowdown back in 1992. The problem is the controls. That wavy wand thing is just not precise at all. It just doesn't work. I'm not convinced it will ever work. I'm not saying Xbox and PSX are any better. The analog sticks on those consoles are just pure garbage, and always have been. The N64 analog stick remains light years beyond all these new consoles, and it's an almost 20-year-old design. For frack's sake, why? No one really seems to give a damn that the analog sticks are so crappy. But whatever, I simply will not play any of these consoles if they are not going to give me a better controller than what I had 17 years ago.
 

eyeofcore

Member
Oct 1, 2013
50
0
0
I've never really had a problem with Nintendo hardware. Not since they learned how to program their way around the SNES slowdown back in 1992. The problem is the controls. That wavy wand thing is just not precise at all. It just doesn't work. I'm not convinced it will ever work. I'm not saying Xbox and PSX are any better. The analog sticks on those consoles are just pure garbage, and always have been. The N64 analog stick remains light years beyond all these new consoles, and it's an almost 20-year-old design. For frack's sake, why? No one really seems to give a damn that the analog sticks are so crappy. But whatever, I simply will not play any of these consoles if they are not going to give me a better controller than what I had 17 years ago.

So you are crying about analog sticks and thus you don't buy those consoles? What is wrong with your head, did you fall on your head when you were born? I mean, c'mon... Because of one silly thing; if it were about the games I would understand, but this is just... *facepalm*
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
*X360 Xenon in-order 3 Execution Units vs Xbox One/PlayStation 4 Jaguar out-of-order ? execution units vs Wii U Espresso out-of-order 15 execution units

Hilarious. You can't make this stuff up.

.. okay, apparently you can.
 

GalaxyWide

Member
Sep 14, 2012
30
0
61
So you are crying about analog sticks and thus you don't buy those consoles? What is wrong with your head, did you fall on your head when you were born? I mean, c'mon... Because of one silly thing; if it were about the games I would understand, but this is just... *facepalm*

Because of the games? How many games that actually matter (e.g., AAA titles that are commonly played by the general public) are PC-only? Or maybe you meant the other way around? I cannot think of a better reason NOT to use consoles than their horrid input devices; it's why I only game on PC, where I can get real control with a mouse. Why should I buy something that is harder to use, and single-purpose to boot, when I can have a multi-purpose machine that does everything better? The only reason I can think of to buy a console is Halo (still mad they stopped making them for PC) or local multiplayer use.

Also, it's a bit dickish to go calling names and personally insulting people.
 

eyeofcore

Member
Oct 1, 2013
50
0
0
Hilarious. You can't make this stuff up.

.. okay, apparently you can.

You can contribute, how many execution units does Xenon have in total?

We know Xenon has 3 cores and 6 threads, right? PlayStation 4 has 8 cores though I don't know how many execution units it has and Wii U has 3 cores with 15 execution units in total.

Because of the games? How many games that actually matter (e.g., AAA titles that are commonly played by the general public) are PC-only? Or maybe you meant the other way around? I cannot think of a better reason NOT to use consoles than their horrid input devices; it's why I only game on PC, where I can get real control with a mouse. Why should I buy something that is harder to use, and single-purpose to boot, when I can have a multi-purpose machine that does everything better? The only reason I can think of to buy a console is Halo (still mad they stopped making them for PC) or local multiplayer use.

Also, it's a bit dickish to go calling names and personally insulting people.

Where did I call names? Where? What a failure...

Your existence showcases your failure as a gamer. So you cry about the controls being inferior even though they do the job rather well? That is really why you miss out on some exclusives, over one simple thing? You came in here dickish and then accuse me of being exactly that, while you are in fact the one acting dickish, like a prick...

Nice, already a couple of people intentionally derailing the thread; why don't you get banned?!
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
You can contribute, how many execution units does Xenon have in total?

We know Xenon has 3 cores and 6 threads, right? PlayStation 4 has 8 cores though I don't know how many execution units it has and Wii U has 3 cores with 15 execution units in total.

There is public documentation for Cell and the PowerPC 750. The PPE in Cell is close to the same as XBox 360's Xenon chip, with a few differences in the SIMD part. You can find all the information you need about Jaguar in AMD's optimization manual. I could go through and list them all for you (I've looked at them enough to know you're way, way off) but there's no real point, because number of execution units is a terrible proxy for performance. It's meaningless without knowing anything about what type of operations those units perform, how wide the front end and schedulers are, and a bunch of aspects that contribute to how well those units can be kept fed.

I really have to marvel at your logic though, you go from saying XBOne/PS4 has 4 units but Wii U has 15 because it's 5 * 3 cores.. so you're basically saying that XBOne/PS4 has half a unit per core, right?

The rest of your post is just full of misinformation or poor arguments, a couple of the most obvious examples:
- Your FLOPs comparison is totally crazy. Wii U's CPU cores can execute 2 fmadds per cycle, so 4 FLOPs/clock. XBox 360 can execute 8. So that's 3 * 4 * 1.24 = 14.88 GFLOP/s for Wii U vs 3 * 8 * 3.2 = 76.8 GFLOP/s for XBox 360 (see the sketch after this list). It's well known that Wii U is lacking in raw floating point SIMD throughput.
- The stuff about Wii being only 20% slower than PS3's PPE is based on measurements of a homebrew emulator ported to both systems. This is not even remotely representative of game performance, when you consider that they used a compiler much worse at generating code on PS3 (GCC vs IBM's compiler), were porting something written for PCs, were running an application poorly suited for PS3 (emulators can be very branchy, and SNES9x blows through cache with its software rendering), didn't use any handwritten SIMD code as PS3 games often do, and as far as I'm aware were running on PS3 Linux. He even admits a couple of posts later that the situation is different for game developers who optimized for the platform.
- The conclusion that Wii U's CPU can be 3-4 times faster because of random other numbers you listed is incredibly arbitrary
- Dual core Cortex-A8 doesn't exist for anything because the design doesn't support multicore coherency, and it's stunning to think Nintendo would use a Cortex-A8 for some little embedded peripheral processor when they're still sticking their handhelds with ARM11s. At any rate, whatever this processor is has zero bearing on game performance, there's zero evidence about anything used for "background OS tasks." There's also strong evidence that there's no 1MB of SRAM dedicated anywhere around where this CPU is believed to be. The 1MB of SRAM for mem0 is totally different.
- How could eDRAM embedded in the GPU practically eliminate latency between it and the CPU? Another comment which makes no sense. I think you meant it reduces latency of memory accesses from the GPU for buffers stored on eDRAM instead of main memory.
- Eyefinity lolz, I love this circular reasoning "since it's Radeon HD 5000 it must be Eyefinity, since it must have Eyefinity it can't be based on Radeon HD 4000." Here's a factoid - Nintendo could have this customized however they want. And streaming content to the controller is not in any way related to Eyefinity.
- "Some other eDRAM chip had this bandwidth so Wii U's should be at least the same since it's on a better process" is another terrible argument. You can more or less tell by die shot that the eDRAM isn't outputting more than 1024 bits per cycle, so unless it's effectively closed at > the 550MHz core speed (which would be a less than ideal design choice) it'll be ~70GB/s.. and for the level of graphics we've seen from anything there's no reason to believe it's using for or can benefit from a super high eDRAM bandwidth
- It doesn't matter how much AMD evolved their GPUs in the last N years, that doesn't have a bearing on what Nintendo decides to use, especially if they were only comfortable freezing their hardware decisions years ago. Based on your logic, if someone told you in 2011 3DS would be using ARM you'd say it'd be coming out with Cortex-A9, but no, it came out with ARM11 which was first in devices like 5 years prior.
- Has DX11 but can't use it because of OS is really dumb, when people say "DX" they mean feature set, not actual DirectX since of course Nintendo hardware isn't running DirectX. And if you think the OS would limit feature set like that you're crazy, the game developers would have access to the metal to get around it.
- "Minor Power7 architecture implementation involving memory allows shave off 10 or more percent of texture size without loss of quality(special compression or something)" Seriously where do you get this crap? How would Power7 have anything to do with texture compression in the first place. The mind boggles.
- Using the PSU rating to try to say that the thing has all this untapped potential because it's running well below that is beyond ridiculous...
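
A minimal sketch of the peak-FLOPS bullet above, using the per-core figures given there (4 FLOPs/cycle for Espresso, 8 for Xenon):

```python
# Peak CPU throughput: GFLOP/s = cores * FLOPs-per-core-per-cycle * clock (GHz).
def peak_gflops(cores, flops_per_core_cycle, clock_ghz):
    return cores * flops_per_core_cycle * clock_ghz

print(peak_gflops(3, 4, 1.24))  # 14.88 GFLOP/s, Wii U Espresso
print(peak_gflops(3, 8, 3.2))   # 76.8 GFLOP/s, XBox 360 Xenon
```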

Ugh, I can't go on any more, you've just loaded so many silly assumptions and bad arguments into one post, it's just too much for me >_<
 

eyeofcore

Member
Oct 1, 2013
50
0
0
I am sorry for my failure; I am not a tech-savvy person, though I just want to know the truth. I did some research, though my lack of in-depth knowledge is a big minus for me...

Can you please put together the most accurate possible picture of the Wii U's hardware, and a comparison between the Wii U and the Xbox 360? Will you? I want to know the truth. I am not a fanboy; I know I stated some things wrong, though I only wanted to point things out. For example, the eDRAM is embedded in the GPU, so latency there should be very low compared to the Xbox 360's eDRAM, which is not embedded into the GPU and/or not embedded at all?! Right?

What about the pipeline? If I am correct that the Xbox 360 and PlayStation 3 have 32- or 40-stage pipelines, then they should suffer severe penalties...

Exophase... Please, do me a favor, or I'll need to save up a freaking $3000 for a Wii U development kit just to find out its performance, and that scenario means saving for 8 months in a row or more, depending on whether I find a well-paid job. $3000 plus shipping plus 25% tax comes close to $4000. -_-"
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
Why are you so invested in this? You say you're not a fanboy but you definitely look like you're on a mission to prove to everyone that Wii U is so much better than they think. What difference does it even make? The games will be whatever the games will be.

If you want to know what other people have figured out about Wii U read a thread on a more informed forum, like this one: http://forum.beyond3d.com/showthread.php?t=60501&highlight=wii+investigation

Just don't make a post like this there, they'll eat you alive. And understand that there's a ton people have no idea about because Nintendo has revealed so little about the hardware.

Don't worry about saving up for a Wii U dev kit; Nintendo will never sell one to a non-licensed developer (and they wouldn't do it for the paltry price of $3k), and even if you had one you probably wouldn't learn much of anything from it.
 

eyeofcore

Member
Oct 1, 2013
50
0
0
Why are you so invested in this? You say you're not a fanboy but you definitely look like you're on a mission to prove to everyone that Wii U is so much better than they think. What difference does it even make? The games will be whatever the games will be.

If you want to know what other people have figured out about Wii U read a thread on a more informed forum, like this one: http://forum.beyond3d.com/showthread.php?t=60501&highlight=wii+investigation

Just don't make a post like this there, they'll eat you alive. And understand that there's a ton people have no idea about because Nintendo has revealed so little about the hardware.

Don't worry about saving up for a Wii U dev kit; Nintendo will never sell one to a non-licensed developer (and they wouldn't do it for the paltry price of $3k), and even if you had one you probably wouldn't learn much of anything from it.

I am not a fanboy; I was fascinated by the Gamecube and a bit by the Wii, and I want to know about the Wii U, since Nintendo seems to make well-balanced hardware compared to the competition...

You suggest Beyond3D? Hell no... Those guys said the Wii U GPU is 176Gflops, and that is not possible, since the Wii U's GPU is 40nm and 320 shaders fit easily when you look at the die size of the chip and the area the eDRAM takes up, and with slight modifications/customization it could have 384 shaders.

I need unbiased people... Not wannabe geeks.

The Radeon HD 5550 is one candidate, though the Radeon HD e6760 is also a candidate, because Nintendo made a deal that involved OpenGL ES 2.0 and the e6760 had it. Also, ignoring the possible evolution of the Wii U's hardware is ignorant: it was known that the 2011 Wii U dev kit used a Radeon HD 4850, but why would Nintendo stay with a part that consumes more, has an older feature set, and delivers fewer Gflops per watt than GPUs available at the time? They had a bit more than a year to improve the Wii U. They could easily have chosen the Radeon HD 5000 or 6000 series; an Eyefinity-like feature is also present in Call of Duty: Black Ops 2's local co-op zombie mode, and one of the Unity demos showed a game utilizing it.

It is not the 4000 series; it is either the 5000 or, more likely, the 6000, since Nintendo could want OpenGL ES 2.0 in their system, and if they have a custom API it could utilize a customized OpenGL ES 2.0.
 

Revolution 11

Senior member
Jun 2, 2011
952
79
91
Ok, Gamecube might have been "well-balanced" compared to its console peers, but the Wii U is so clearly outmatched by both Xbox One and PS4, let alone any serious gaming PC.

Nintendo may be competing on other aspects of a console besides performance for better or worse but let's call a spade a spade. The Wii U is quite slow.
 

eyeofcore

Member
Oct 1, 2013
50
0
0
Ok, Gamecube might have been "well-balanced" compared to its console peers, but the Wii U is so clearly outmatched by both Xbox One and PS4, let alone any serious gaming PC.

Nintendo may be competing on other aspects of a console besides performance for better or worse but let's call a spade a spade. The Wii U is quite slow.

I know that; I only want to know how powerful it is...

http://www.ign.com/boards/threads/o...ower-thread.452775697/page-203#post-481660123

The Wii U's very first development kit, at E3 2011, had a Radeon HD 4850; a later (alpha) development kit had a Radeon HD 5000 series GPU; and near the end of the Wii U's development, close to its release, the final development kit supposedly had a GPU based on the Radeon HD e6760, which supports OpenGL ES 2.0, a low-level API? Am I correct? So Nintendo could have a custom/semi-custom API based on OpenGL 4.3 and OpenGL ES 2.0!

It seems the Wii U GPU is around 352Gflops, and since it supports the DX11 feature set and OpenGL 4.3/OpenGL ES 2.0, it extends its lead even further because of the efficiency of the hardware.
 

SecurityTheatre

Senior member
Aug 14, 2011
672
0
0
I am sorry for my failure; I am not a tech-savvy person, though I just want to know the truth. I did some research, though my lack of in-depth knowledge is a big minus for me...

This....

Why would you go on a highly-technical rant using bogus information.... in a highly technical forum.....

And then apologize by saying you don't know what you're talking about...

Seriously, what's the point?

Actually...

This part was worth it.

It made me LOL. I applaud you for that.

Wii U CPU is next generation compared to Xbox 360 and people that say otherwise should just shut up, be ashamed, have a seat, accept the facts and deal with it and here is the proof; http://systemwars.com/forums/index.php/topic/112794-no-freaking-way-40-stage-pipeline/

Reading about technical features like out-of-order execution and set associativity of caches delivered in breathless, sentence-less shouting made my afternoon.

Seriously, the guys working in the systems lab down the hall came to see what was so funny.

*wipes eyes*

Thanks! :thumbsup:
 

ultimatebob

Lifer
Jul 1, 2001
25,134
2,446
126
I am sorry for my failure; I am not a tech-savvy person...

Then why are you posting something (that at least attempted to be) technical in the Highly Technical forum? Is this some sort of Internet troll training?

Stick with Console Gaming next time.
 

N-A-N-0

Member
Sep 1, 2013
26
0
0
Despite the nastiness he received, he was right on several points. The Wii U is basically a 360+, or what the N64 was to the PS1 in terms of raw horsepower plus more modern features. It certainly isn't comparable hardware-wise to the PS4/XB1, though, other than using 1/4 as much of the same kind of RAM as the XB1, bog-standard DDR3-1600 (though I believe Microsoft clocked theirs higher). That's another point: the Wii U not only has 4 times the RAM of the PS3 and Xbox 360, it's also better stuff than the old GDDR3. I don't know much about the PS3's 256MB, or half its RAM, though.

The big problem though is obviously that the big white beast that was the phat Xbox 360 went on sale 7 years before the Wii U. Way back in 2005, the Xbox 360 was extremely impressive. It's a shame that the Wii U is pretty comparable to that.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
Despite the nastiness he received, he was right on several points.

Anything he presented that went against broadly common knowledge was wrong. Sadly, he's been vandalizing Wikipedia to include these same made-up facts based on incredible leaps of logic.

It certainly isn't comparable hardware-wise to the PS4/XB1, though, other than using 1/4 as much of the same kind of RAM as the XB1, bog-standard DDR3-1600 (though I believe Microsoft clocked theirs higher). That's another point: the Wii U not only has 4 times the RAM of the PS3 and Xbox 360, it's also better stuff than the old GDDR3. I don't know much about the PS3's 256MB, or half its RAM, though.

You can't just look at the RAM technology and the clock speed, you have to look at the bus width. The DDR3 in XB1 isn't just higher clocked, it's in a 256-bit configuration. In Wii U it's only in a 64-bit configuration. Every clock cycle of a transaction XB1's RAM transfers four times as much data.
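
A minimal sketch of that point: same DDR3 generation, very different peak bandwidth once bus width is included. The XB1 DDR3-2133 transfer rate here is an assumption for illustration; the Wii U side uses the DDR3-1600 figure from earlier in the thread.

```python
# Peak DDR3 bandwidth: GB/s = (bus width in bits / 8) * transfer rate (MT/s) / 1000.
def peak_bandwidth_gbs(bus_bits, mt_per_s):
    return bus_bits / 8 * mt_per_s / 1000

print(peak_bandwidth_gbs(64, 1600))   # 12.8 GB/s  (Wii U, 64-bit configuration)
print(peak_bandwidth_gbs(256, 2133))  # ~68.3 GB/s (XB1, 256-bit, assumed DDR3-2133)
```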
 

N-A-N-0

Member
Sep 1, 2013
26
0
0
Anything he presented that went against broadly common knowledge was wrong. Sadly, he's been vandalizing Wikipedia to include these same made-up facts based on incredible leaps of logic.

So that's been him... I wonder where these mysterious Latte = AMD Radeon 5000 and 6000 series specs have been coming from. The GPU die doesn't even look too similar to anything, but if Digital Foundry says it's about a match for a 4650/4670, about what they figured back at E3 2011, that's good enough for me.


You can't just look at the RAM technology and the clock speed, you have to look at the bus width. The DDR3 in XB1 isn't just higher clocked, it's in a 256-bit configuration. In Wii U it's only in a 64-bit configuration. Every clock cycle of a transaction XB1's RAM transfers four times as much data.

True, though if we're talking 720p and sub-HD 7th gen games, the better GPU is more important.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
So you are crying about analog sticks and thus you don't buy those consoles? What is wrong with your head, did you fall on your head when you were born? I mean, c'mon... Because of one silly thing; if it were about the games I would understand, but this is just... *facepalm*

What is wrong with your head, besides being filled with troll? I refuse to spend my money on a console whose controller design is demonstrably, empirically, provably worse than a design from 20 years ago. Mario 64 is almost 20 years old, and you can walk a tightrope in that game, and you could walk at 5 different speeds while doing it. You can't do anything like that on these crappy Xbox/PlayStation controllers. You can walk, or you can run, and that's about it. And you'd be extremely lucky to switch between those two speeds without altering course. Bad design is simply bad design. Rather than feed or reward these companies for their stupidity and dumbing-down, I just moved to mouse and keyboard.
 

N-A-N-0

Member
Sep 1, 2013
26
0
0
What is wrong with your head, besides being filled with troll? I refuse to spend my money on a console whose controller design is demonstrably, empirically, provably worse than a design from 20 years ago. Mario 64 is almost 20 years old, and you can walk a tightrope in that game, and you could walk at 5 different speeds while doing it. You can't do anything like that on these crappy Xbox/PlayStation controllers. You can walk, or you can run, and that's about it. And you'd be extremely lucky to switch between those two speeds without altering course. Bad design is simply bad design. Rather than feed or reward these companies for their stupidity and dumbing-down, I just moved to mouse and keyboard.

Oh my god... You realize that's because the games have less variation in the animations and speeds mapped to the analog stick, which Mario 64 was a showcase for...? It has nothing to do with the analog sticks themselves.
 

Batmeat

Senior member
Feb 1, 2011
803
45
91
I don't understand the point of the original poster's argument here. Why are you comparing the Wii U to the Xbox 360 and the PlayStation 3, when you should be comparing the Wii to them? The Wii U, generation-wise, should be compared to the Xbox One and PlayStation 4.
 