Wii U CPU Espresso can use Latte's eDRAM...


N-A-N-0

Member
Sep 1, 2013
26
0
0
Wii U BOM is probably $50-90, so it ain't that expensive.

Impossible that it's that low... I doubt Nintendo would lie about it being sold at a loss (at least as of August), and the PS3 wasn't even profitable in 2011, so a loss isn't surprising. The Wii U, on the other hand, is more than just a souped-up 360 (which became profitable earlier).

http://www.ifixit.com/Teardown/Nintendo+Wii+U+Teardown/11796/1

$80-90 is a pretty laughable assumption given all the wireless hardware, multiple flash memory chips (512 MB for Wii mode on the bottom of the board, 8 or 32 GB on top for Wii U), 2 GB of DDR3 (admittedly pretty cheap), and a more modern, more expensive GPU than the PS3's or 360's. On top of that, the GamePad contains yet more wireless hardware and another sensor bar beside its camera (one sensor bar is already included in the package).

A proprietary "security" chip from Renesas on the MCM (the third, tiny chip).

A proprietary slot-loading Blu-ray-like drive sourced from a single manufacturer, Panasonic, that also reads the proprietary DVD-like Wii discs.

4 USB 2.0 ports and an SDHC card slot.

Metal shielding, a heat sink... etc.

Consoles are not cheap to produce at all, and your price assumption is just laughable. By that estimation the PS3 cost about $300 to make in 2006... which is obviously false...
 

eyeofcore

Member
Oct 1, 2013
50
0
0
Exophase, you are right, though it is kind of odd, since Renesas is a relative newcomer and has never been involved in GPU production, so Nintendo would have had to spend a lot of money adapting the fabrication process and/or the GPU itself. Chipworks said 40nm, not 45nm, by the way. Does Renesas even have a fab capable of 40nm? The Wii U had been on the drawing board since 2009 and was revealed in 2011, so if they made a deal in 2010 or early 2011, they should have had enough time to adapt/upgrade their fabs, run tests, and improve yields.

I don't know much, if anything, about chip fabrication and manufacturing, but I do understand that you can't switch processes easily and that it takes time to refine them.

The eDRAM is embedded in the GPU, so the TMUs/ROPs are connected to it, right? So the bandwidth should be higher...
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
I don't think anyone said the GPU was slower...

But the CPU certainly is.

The Wii U is expensive because of the GamePad, not the console itself.

You're both wrong. Do you honestly believe that the controller costs $150-200 to make? Because the console itself obviously costs under $150 to make. I don't believe for a second that they're losing money on the console outside of millions of units of unsold stock. The BOM is $180 for the whole thing, GamePad included.

Again, I don't see the point of discussing a last-gen console here. This CPU will be legacy in a year, and the only people who care about it are Nintendo fanboys.

BTW, I'm typing on a phone that doesn't like long text boxes.
 

eyeofcore

Member
Oct 1, 2013
50
0
0
You're both wrong. Do you honestly believe that the controller costs $150-200 to make? Because the console itself obviously costs under $150 to make. I don't believe for a second that they're losing money on the console outside of millions of units of unsold stock. The BOM is $180 for the whole thing, GamePad included.

Again, I don't see the point of discussing a last-gen console here. This CPU will be legacy in a year, and the only people who care about it are Nintendo fanboys.

BTW, I'm typing on a phone that doesn't like long text boxes.

I seriously doubt it is just $180, and how do you know? Do you have official documents, or where did you get the information?! Are you breaking an NDA, if that information is true at all?!
 

insertcarehere

Senior member
Jan 17, 2013
639
607
136
You're both wrong. Do you honestly believe that the controller costs $150-200 to make? Because the console itself obviously costs under $150 to make. I don't believe for a second that they're losing money on the console outside of millions of units of unsold stock. The BOM is $180 for the whole thing, GamePad included.

Again, I don't see the point of discussing a last-gen console here. This CPU will be legacy in a year, and the only people who care about it are Nintendo fanboys.

BTW, I'm typing on a phone that doesn't like long text boxes.
Chipworks estimated the BOM for the MCM with the CPU/GPU at close to $100, so the BOM of the console is probably not that low.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
$100 for the MCM would be a really awful price, but I wouldn't put it past Nintendo to get ripped off like this. Using an MCM at all surely adds overhead, and when it's about keeping a minuscule ~30mm^2 die separate it's just not worth it, unless some aspect of that CPU die would have brought down the yield of the GPU more than you'd expect it to.

MCMs are nice when you get to sell the component dies individually, or when they're at least close enough in size to have a big yield benefit, and when the base cost of the device is so high that the cost of the MCM substrate and assembly is small in comparison. Here it doesn't really make sense; I think they just did it because licensing prevented them from getting the CPU manufactured with everything else.
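To put rough numbers on that trade-off, here's a sketch using a standard Poisson defect-yield model. All figures (defect density, die sizes) are illustrative assumptions, not actual Renesas or Nintendo data:

```python
import math

# Rough sketch of the MCM-vs-monolithic yield economics described above,
# using a Poisson defect model. All numbers are illustrative assumptions.

def die_yield(area_mm2, defect_density_per_cm2=0.25):
    """Poisson yield model: probability a die of this area is defect-free."""
    return math.exp(-defect_density_per_cm2 * area_mm2 / 100.0)

def silicon_per_good_die(area_mm2):
    """Effective silicon area paid for per *good* die (area / yield)."""
    return area_mm2 / die_yield(area_mm2)

gpu_mm2, cpu_mm2 = 150.0, 30.0   # ballpark Latte and Espresso die sizes

monolithic = silicon_per_good_die(gpu_mm2 + cpu_mm2)
mcm = silicon_per_good_die(gpu_mm2) + silicon_per_good_die(cpu_mm2)

print(f"monolithic: ~{monolithic:.0f} mm^2 of silicon per good unit")  # ~282
print(f"MCM dies:   ~{mcm:.0f} mm^2 per good unit, plus substrate")    # ~251
```

On these made-up numbers the split saves only around 10% of silicon, which the substrate and assembly cost of an MCM can easily eat when one die is this small.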
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
Impossible that it's that low... I doubt Nintendo would lie about it being sold at a loss (at least as of August), and the PS3 wasn't even profitable in 2011, so a loss isn't surprising. The Wii U, on the other hand, is more than just a souped-up 360 (which became profitable earlier).

http://www.ifixit.com/Teardown/Nintendo+Wii+U+Teardown/11796/1

$80-90 is a pretty laughable assumption given all the wireless hardware, multiple flash memory chips (512 MB for Wii mode on the bottom of the board, 8 or 32 GB on top for Wii U), 2 GB of DDR3 (admittedly pretty cheap), and a more modern, more expensive GPU than the PS3's or 360's. On top of that, the GamePad contains yet more wireless hardware and another sensor bar beside its camera (one sensor bar is already included in the package).

A proprietary "security" chip from Renesas on the MCM (the third, tiny chip).

A proprietary slot-loading Blu-ray-like drive sourced from a single manufacturer, Panasonic, that also reads the proprietary DVD-like Wii discs.

4 USB 2.0 ports and an SDHC card slot.

Metal shielding, a heat sink... etc.

Consoles are not cheap to produce at all, and your price assumption is just laughable. By that estimation the PS3 cost about $300 to make in 2006... which is obviously false...

I was responding to a comment and actually meant the GamePad BOM, not the whole system...
 

insertcarehere

Senior member
Jan 17, 2013
639
607
136
$100 for the MCM would be a really awful price, but I wouldn't put it past Nintendo to get ripped off like this. Using an MCM at all surely adds overhead, and when it's about keeping a minuscule ~30mm^2 die separate it's just not worth it, unless some aspect of that CPU die would have brought down the yield of the GPU more than you'd expect it to.

MCMs are nice when you get to sell the component dies individually, or when they're at least close enough in size to have a big yield benefit, and when the base cost of the device is so high that the cost of the MCM substrate and assembly is small in comparison. Here it doesn't really make sense; I think they just did it because licensing prevented them from getting the CPU manufactured with everything else.

I agree; the idiotic decision to pursue backward compatibility with the Wii (which the public straight-up lost interest in) caused a lot of issues for the manufacture of the console.
 

eyeofcore

Member
Oct 1, 2013
50
0
0
I agree; the idiotic decision to pursue backward compatibility with the Wii (which the public straight-up lost interest in) caused a lot of issues for the manufacture of the console.

I disagree... It is not idiotic, since people expected it, and it is another reason to move from the Wii to the Wii U. If Nintendo had also implemented backward compatibility with GameCube games, you would have seen droves of Nintendo fans buying a Wii U in a heartbeat.

The Wii had its own great games, and people would have been pissed off if there were no backward compatibility, since they were used to it from the Wii. Imagine how many people were annoyed when the Xbox 360 launched with only limited backward compatibility, and especially when the PlayStation 3 had it and then cut it. Backward compatibility may seem like a useless feature, but it's not. Remember Intel's Itanium? It was a failure, and when AMD released its x86-64 implementation that kept supporting 32-bit x86, developers were thrilled.

The Wii U now supports off-TV play of Wii games, though you cannot control the games with the GamePad for now, until they implement it.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
I agree; the idiotic decision to pursue backward compatibility with the Wii (which the public straight-up lost interest in) caused a lot of issues for the manufacture of the console.

I think so too. Only I'd extend this to say that over the last decade Nintendo has been really change-averse when it comes to their internal hardware design.

The GameCube incorporated eDRAM (1T-SRAM, but more or less a form of eDRAM), and at the time it was a good design, perhaps a reasoned reaction to the N64's less-than-ideal memory layout. But then they kept almost the same chipset for the Wii. Since then Sony has abandoned eDRAM in the PS3, and now MS has in the XB1, but with the Wii U they feel like they have to use eDRAM again. They've also often stuck with NEC/Renesas and Fujitsu in the past, and I feel like they make it a priority to maintain the same manufacturing relationships as well as favor Japanese partners (another example of that would be their selection of GPU IP for the 3DS).
 

insertcarehere

Senior member
Jan 17, 2013
639
607
136
I disagree... It is not idiotic, since people expected it, and it is another reason to move from the Wii to the Wii U. If Nintendo had also implemented backward compatibility with GameCube games, you would have seen droves of Nintendo fans buying a Wii U in a heartbeat.

The Wii had its own great games, and people would have been pissed off if there were no backward compatibility, since they were used to it from the Wii. Imagine how many people were annoyed when the Xbox 360 launched with only limited backward compatibility, and especially when the PlayStation 3 had it and then cut it. Backward compatibility may seem like a useless feature, but it's not. Remember Intel's Itanium? It was a failure, and when AMD released its x86-64 implementation that kept supporting 32-bit x86, developers were thrilled.

The Wii U now supports off-TV play of Wii games, though you cannot control the games with the GamePad for now, until they implement it.

Look at the general public's reaction to the PS4 and XB1: neither has backward compatibility, and that has certainly not stopped the PS4 from making a positive impression, nor has it been one of the issues frequently brought up with the XB1.

Slightly off-topic stuff aside, the decision to support BC meant that Nintendo was forced to use lagging-edge, proprietary (expensive) processes, with less of an avenue to reduce costs due to the highly customised design. That sounds like a pretty big set of issues to me.
 

eyeofcore

Member
Oct 1, 2013
50
0
0
Look at the general public's reaction to the PS4 and XB1: neither has backward compatibility, and that has certainly not stopped the PS4 from making a positive impression, nor has it been one of the issues frequently brought up with the XB1.

Of course; people are used to not having backward compatibility, since the X360 had very limited support and the PS3 cut it early on, if I remember correctly. It's a different story with Nintendo. The Xbox 360 and PlayStation 3 have around three more years of life, since Microsoft and Sony want to recover as much money as possible after both companies lost money on their consoles. The Xbox 360 and PlayStation 3 will drag the Wii U down for three years if developers keep making cheap ports.

What are the chances of the Wii U's CPU, Espresso, having AltiVec?
https://en.wikipedia.org/wiki/PowerPC_G3#PowerPC_750VX

The PowerPC 750VX was in development but was canceled, and it had AltiVec implemented, so could Nintendo have implemented that in Espresso? Also, this cancelled variant of the chip could supposedly clock up to 1.8GHz, though it had more pipeline stages. Hmmm...
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
What are the chances of the Wii U's CPU, Espresso, having AltiVec?
https://en.wikipedia.org/wiki/PowerPC_G3#PowerPC_750VX

The PowerPC 750VX was in development but was canceled, and it had AltiVec implemented, so could Nintendo have implemented that in Espresso? Also, this cancelled variant of the chip could supposedly clock up to 1.8GHz, though it had more pipeline stages. Hmmm...

No. Just no. I don't know why you people keep entertaining these ideas. It's three Broadway cores with coherency and a different type of L2 cache added. There's no other secret sauce. If Nintendo wanted a CPU with AltiVec they would be using a different CPU, not getting IBM to revive a chip that may never have existed in the first place. The whole thing is kind of dumb to begin with, since the PowerPC 7400 was already not much more than a 750 with AltiVec added.

And you're wrong about Marcan only exploiting it in Wii mode; go read his blog more, he has full Wii U access. And even if he didn't, your idea that it'd appear as a totally different CPU in Wii mode vs. Wii U mode is not realistic. It's amazing: when he reports something like being able to access the eDRAM from the CPU, that's hot news to you, but when he reports that it's still just 750CLs, you doubt him. BTW, backwards compatibility is almost certainly a major reason why that eDRAM is CPU-addressable, because it has to play the role of the Wii/GameCube's 1T-SRAM.

If only Nintendo would reveal even a fraction of the technical information MS and Sony do, we wouldn't have to go through these flights of fancy every single time. I'm still reeling over how apparently the DS was really clocked at >200MHz, the 3DS is clocked at 1GHz, the Wii has physics accelerators, and the Wii U has POWER7 technology, or god knows what else.
 

eyeofcore

Member
Oct 1, 2013
50
0
0
@exophase

You fail; that chip was in development, test chips existed, and it was ready for mass production, though it was canned since Apple went to the PowerPC G4.

The Wii U utilizes POWER7: the freaking eDRAM as L2 cache. And since it can access the eDRAM pool on the GPU, that pool can also act as L3 cache. And since the eDRAM on the GPU is embedded, it should exceed the Xbox 360's bandwidth of 256GB/s, because it is connected to/inside the GPU, and the ROPs, TMUs, and basically everything on the GPU have very direct access with much higher bandwidth and much lower latency than on the Xbox 360.
 

insertcarehere

Senior member
Jan 17, 2013
639
607
136
@exophase

The Wii U utilizes POWER7: the freaking eDRAM as L2 cache. And since it can access the eDRAM pool on the GPU, that pool can also act as L3 cache. And since the eDRAM on the GPU is embedded, it should exceed the Xbox 360's bandwidth of 256GB/s, because it is connected to/inside the GPU, and the ROPs, TMUs, and basically everything on the GPU have very direct access with much higher bandwidth and much lower latency than on the Xbox 360.

Of course; people are used to not having backward compatibility, since the X360 had very limited support and the PS3 cut it early on, if I remember correctly. It's a different story with Nintendo. The Xbox 360 and PlayStation 3 have around three more years of life, since Microsoft and Sony want to recover as much money as possible after both companies lost money on their consoles. The Xbox 360 and PlayStation 3 will drag the Wii U down for three years if developers keep making cheap ports.

Would even a single POWER7 core with cache fit in ~28mm^2 on a 45nm process?

Frankly, the reason there are still occasional Wii U ports is the PS3/360; once games for those go away (fairly soon, no doubt), I doubt the Wii U will get much in the way of third-party games at all.
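For what it's worth, a quick back-of-the-envelope check suggests the answer to that question is no. This sketch assumes IBM's published POWER7 figures (567mm^2 at 45nm for 8 cores plus 32MB of eDRAM L3) and the reported ~28mm^2 Espresso die size:

```python
# Back-of-the-envelope check on whether a POWER7 core could fit in Espresso's
# footprint. POWER7 figures are IBM's published specs; the Espresso area is
# the reported estimate, and the "drop half the slice" step is a generous guess.

power7_die_mm2 = 567.0    # POWER7: 45nm, 8 cores + 32MB eDRAM L3
power7_cores = 8
espresso_die_mm2 = 28.0   # entire 3-core Espresso die (approx.)

per_core_slice = power7_die_mm2 / power7_cores
print(f"POWER7 die area per core slice: ~{per_core_slice:.0f} mm^2")   # ~71

# Even generously assuming half of each slice is L3/uncore that a console
# could drop, one core still exceeds the *whole* three-core Espresso die:
print(f"~{per_core_slice / 2:.0f} mm^2 per stripped core vs "
      f"{espresso_die_mm2:.0f} mm^2 for all of Espresso")              # ~35 vs 28
```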
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
I seriously doubt it is just $180, and how do you know? Do you have official documents, or where did you get the information?! Are you breaking an NDA, if that information is true at all?!

I was going based on a rumor, which makes sense considering the antique we ended up with.

But trust me, Nintendo was pulling some funny accounting when they claimed a loss.

By the way, I'm just as bitter as I seem to be.

Also, anyone who still thinks it's POWER7 is an idiot. I'm sorry but... come on!
 
Dec 30, 2004
12,554
2
76
The Wii U's CPU is not weaker; most launch games and ports use two cores, and they are Xbox 360 ports, so those games ran poorly, and the Wii U's Espresso and the Xbox 360's Xenon are very different architectures... Also, the Wii U's pipeline is something like 8 to 10 times shorter, it has much more L2 cache, and, as Marcan and Shin'en said, it can access Latte's eDRAM, so there should be latency benefits between the CPU and GPU.

Here I've got GFLOPS info about the C2D T9300, and it says 15 GFLOPS;
http://www.overclock.net/t/586994/cpu-gflop-performance-database

Also this;
http://www.cinemablend.com/games/Pr...e-Shadows-Multi-Threaded-Shadowing-59659.html

People only speculate that it is three PPC750CLs duct-taped together because Marcan said it has similarities, though that does not mean there are no modifications to increase the speed; look at the clocks, and it has a 4-stage pipeline, so it is very efficient.

After the Wii, publishers are resting on the fact that most customers think Nintendo's consoles are terribly underpowered. Even if it matches the Xbox 360, all that matters is whether it's "better than the Wii"; if so, people are happy. Throw in some poorly conceived Wii U remote menu additions and call it a day.

edit: I'm late to the game; apparently they have nice-looking Wii U titles now.
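As an aside on the 15 GFLOPS T9300 figure quoted above: peak GFLOPS is usually just cores x clock x FLOPs per cycle, and measured LINPACK-style results (which that database likely reports) land below the theoretical peak. A sketch, where the Espresso FLOPs-per-cycle value is an assumption based on the 750CL's paired-singles unit, not a confirmed spec:

```python
# Peak single-precision GFLOPS = cores * clock (GHz) * FLOPs per core per cycle.
# These are theoretical ceilings; measured results (e.g. LINPACK) come in lower.

def peak_gflops(cores, ghz, flops_per_cycle):
    return cores * ghz * flops_per_cycle

# Espresso: 3 cores @ ~1.24 GHz. The 4 FLOPs/cycle assumes the 750-style
# paired-singles unit issues one 2-wide fused multiply-add per cycle
# (an assumption, not a confirmed spec).
print(f"Espresso peak: ~{peak_gflops(3, 1.243, 4):.1f} GFLOPS")  # ~14.9

# Core 2 Duo T9300: 2 cores @ 2.5 GHz, SSE 4-wide add + 4-wide mul = 8/cycle.
print(f"T9300 peak:    ~{peak_gflops(2, 2.5, 8):.1f} GFLOPS")    # 40.0
```

Comparing a measured 15 GFLOPS figure against a theoretical Espresso peak would therefore be apples to oranges.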
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
@exophase

You fail; that chip was in development, test chips existed, and it was ready for mass production, though it was canned since Apple went to the PowerPC G4.

There were rumors. Never an actual official announcement. So uh, no, no fail.

What you say makes no sense anyway. Those VX rumors were circulating in 2003 (http://appleinsider.com/articles/03/12/11/ibm_powerpc_750vx_finalized_sources_say); the PowerPC G4 was released in 1999.

The Wii U utilizes POWER7: the freaking eDRAM as L2 cache. And since it can access the eDRAM pool on the GPU, that pool can also act as L3 cache. And since the eDRAM on the GPU is embedded, it should exceed the Xbox 360's bandwidth of 256GB/s, because it is connected to/inside the GPU, and the ROPs, TMUs, and basically everything on the GPU have very direct access with much higher bandwidth and much lower latency than on the Xbox 360.

Geez, it's like you're determined to demonstrate over and over how little you know about CPU design. Why don't you give it a rest and accept that you're way out of your league and stop trying to BS people who can see through it?

Having eDRAM on a CPU doesn't make it "utilize POWER7", whatever that's supposed to mean. IBM makes a lot of CPUs with eDRAM; that's their thing now. Ever heard of the A2? That has eDRAM, and it has nothing to do with POWER7. NOTHING. They couldn't be on more opposite ends of the uarch spectrum.

You also clearly don't know what a cache is if you think that just because the CPU can access the GPU's eDRAM, it becomes L3 cache. And if you had the faintest clue about the layout of the GPU, you'd know that just because the eDRAM is on the same chip doesn't mean the ROPs and TMUs (which are embedded inside separate clusters) have some special fat path to it. Accesses have to go through the standard texture and ROP caches as usual.
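For readers following along, here's a toy illustration of that cache-vs-addressable-memory distinction; the cache model is deliberately minimal and all sizes are arbitrary:

```python
# A cache is transparent hardware: tag checks decide what is resident,
# automatically. CPU-addressable eDRAM is just a fast region that software
# must manage explicitly. Purely illustrative toy code.

class DirectMappedCache:
    """Minimal direct-mapped cache model: the program only issues addresses."""
    def __init__(self, lines=4):
        self.lines = lines
        self.tags = [None] * lines

    def access(self, addr):
        index, tag = addr % self.lines, addr // self.lines
        hit = self.tags[index] == tag
        self.tags[index] = tag        # the "hardware" fills the line on a miss
        return "hit" if hit else "miss"

cache = DirectMappedCache()
print([cache.access(a) for a in (0, 1, 0, 4)])  # ['miss', 'miss', 'hit', 'miss']

# An addressable scratchpad has no such machinery: the programmer copies data
# in and out by hand and decides what lives there. Fast, but not a cache.
scratchpad = bytearray(32 * 1024 * 1024)        # "32MB of eDRAM"
scratchpad[0:4] = b"data"                       # explicit placement by software
```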

You also don't even understand where that 256GB/s bandwidth number for the Xbox 360 comes from. It has "magic ROPs" that are embedded inside the eDRAM arrays. That bandwidth is amplified EIGHT times internal to the eDRAM because it does read-modify-writes instead of simple reads or writes (for both depth update and blending), and it does so over 4 samples per pixel to provide 4x MSAA. We know for a fact the Wii U doesn't have magic ROPs, because we can see where the ROPs are and they aren't inside the eDRAM; furthermore, this would get in the way if you want to actually texture from the thing, something we know the Wii U is capable of, and incidentally something MS also dropped the magic ROPs for in the XB1.
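The arithmetic behind that amplification, using the commonly cited Xbox 360 numbers (a 32GB/s link between the GPU and the eDRAM daughter die):

```python
# Where the Xbox 360's 256 GB/s eDRAM figure comes from: arithmetic on the
# commonly cited specs, not a measurement.

link_gb_s = 32           # bandwidth of the GPU <-> eDRAM daughter-die link
msaa_samples = 4         # 4x MSAA: each pixel touches four samples
read_modify_write = 2    # depth test/blending reads then writes each sample

# The ROP logic sits beside the eDRAM arrays, so these extra accesses happen
# internally and never cross the 32 GB/s link: hence the 8x amplification.
internal_gb_s = link_gb_s * msaa_samples * read_modify_write
print(f"effective internal bandwidth: {internal_gb_s} GB/s")   # 256
```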

Maybe the eDRAM has slightly lower latency from being on-chip. Maybe it doesn't matter, because GPU workloads are rarely that latency-sensitive. Microsoft also put 32MB of RAM on-chip in the XB1, and they also said that they weren't counting on a benefit from latency.
 

eyeofcore

Member
Oct 1, 2013
50
0
0
Did anyone open the link about the PowerPC 750VX that I posted?

The PowerPC G4 had multiple generations; the lowest clocked around 350MHz and the highest around 1.4GHz. I think Apple had completely switched to the G4 by 2003, and thus PowerPC 750VX mass production was cancelled...

Then why did IBM mention POWER7... It is probably a misunderstanding by a PR guy, but then why would Nintendo use an older implementation of eDRAM and not the way POWER7 has eDRAM implemented, since, logically, a newer CPU should have a better implementation... If there were no benefit to the eDRAM being integrated/embedded into the GPU, then why did they bother at all?!

If you all know so much, then why don't you really contribute? Why don't you put the facts, evidence, and hints about the Wii U in one document, one page, one site?!... >_>

Also, just because I say it has something from POWER7 does not mean it is POWER7; it could be influenced by that design. Does POWER7 have higher density than the PowerPC 750CL if they were produced on the same 45nm process?!

Soccerballtux, open this link: http://www.neogaf.com/forum/showthread.php?t=688391
 

NTMBK

Lifer
Nov 14, 2011
10,269
5,134
136
Did anyone open the link about the PowerPC 750VX that I posted?

You linked to Wikipedia. Wikipedia's only source was a single rumour article from AppleInsider. IBM never officially announced the thing; we have no idea if it ever actually existed or, if it did, how close to complete it was.
 

eyeofcore

Member
Oct 1, 2013
50
0
0
You linked to Wikipedia. Wikipedia's only source was a single rumour article from AppleInsider. IBM never officially announced the thing; we have no idea if it ever actually existed or, if it did, how close to complete it was.

OK... I now actually doubt that the Wii U's CPU has AltiVec, since it would need more pipeline stages.
 