Originally posted by: jiffylube1024

    Originally posted by: OCguy
    Wow...that could be an amazing chip. :Q

    Originally posted by: SunnyD
    Amazingly HUGE and HOT and POWER HUNGRY... yeah. Oh yeah, also amazingly EXPENSIVE too.

    Originally posted by: Keysplayr
    You don't know the size, you don't know the heat dissipation, you don't know the power it will draw, you don't know the price. Thanks for crapping by.
But you can venture a guess at the node GT300 will be on: 40nm, or 32nm at the very VERY best. Meaning that, unless power leakage is off the charts, you can expect roughly a 30% reduction in power consumption per transistor on the new GT300 chip versus GT200, provided it's using 40nm (almost an absolute certainty, IMO).
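A quick back-of-envelope sketch of that guess, in Python (the scaling factors here are rough assumptions, not measured figures):

# Rough estimate of a 55nm -> 40nm shrink; ideal scaling, not real silicon data.
old_node = 55.0  # nm (GT200)
new_node = 40.0  # nm (presumed GT300)

# Ideal area per transistor scales with the square of the feature size.
area_ratio = (new_node / old_node) ** 2  # ~0.53, i.e. ~47% smaller

# Power per transistor is fuzzier: dynamic power tracks capacitance and
# voltage squared, and leakage can eat the gains. Keeping ~70% of the old
# per-transistor power is one plausible outcome -- that's the ~30% cut above.
power_ratio_guess = 0.70

print(f"area ratio: {area_ratio:.2f}, guessed power ratio: {power_ratio_guess:.2f}")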
Actually, switching to higher-clocked (1000 MHz+) GDDR5 may keep the power consumption and complexity of GT300 reasonable, since that would let them get away with a 256-bit memory bus with no performance penalty. A 384-bit to 512-bit memory bus makes for hot, power-hungry cards with many PCB layers.
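The bus-width math behind that, sketched in Python (clock speeds here are illustrative, not confirmed GT300 specs):

# GDDR5 moves 4 bits per pin per clock (quad data rate); GDDR3 moves 2.
def bandwidth_gb_s(bus_bits, clock_mhz, transfers_per_clock):
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_bits * clock_mhz * 1e6 * transfers_per_clock / 8 / 1e9

# A 256-bit GDDR5 bus at 1000 MHz matches a 512-bit GDDR3 bus at the same clock:
print(bandwidth_gb_s(256, 1000, 4))  # 128.0 GB/s
print(bandwidth_gb_s(512, 1000, 2))  # 128.0 GB/s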
Originally posted by: Idontcare
There's a 55nm -> 40nm transition involved there too, which makes most assertions regarding power consumption and die size a pointless debate until we have data.
Educated guesses are still possible, based on past die-size transitions where power leakage didn't totally ruin things (as it did with Intel's Prescott).
-------
What's interesting about GT300 is how Nvidia keeps soldiering on with monolithic-die video cards, even in an economic sinkhole. Nvidia has almost become the new ATI, coming out with one big top-end chip and then cutting it down for the high-volume markets.
It was not too long ago that Nvidia utterly dominated the low/mid range with cheap-to-manufacture cards like the 6600GT, 7900GS/GT, 9600GT and, relatively speaking, the 8800GT. Now ATI is the one releasing fantastically positioned cards like the 4850 and 4770, while Nvidia cuts down its top-end card (and cuts into its profit margins) to make attractive cards like the GTX 260 Core 216.