Originally posted by: Rusin
So basically:
If Nvidia keeps its trend, the GeForce GTX 280 could be at HD 2900 XT level in terms of real power consumption, and the GTX 260 would be very close to the 8800 GTX [don't know if this is asking for too much optimism... even from me]
Originally posted by: Aberforth
Originally posted by: Rusin
So basically:
If Nvidia keeps its trend, the GeForce GTX 280 could be at HD 2900 XT level in terms of real power consumption, and the GTX 260 would be very close to the 8800 GTX [don't know if this is asking for too much optimism... even from me]
Who really gives a damn about power? LOL... if a user were economical and responsible, he'd have settled for an 8800 GT. Why do people buy Tri- and Quad-SLI setups? If they spent that much money on their girlfriends or other meaningful pursuits, they'd rarely have Upgradematism.
Originally posted by: KhadgarTWN
I believe the TDP-to-real-3D-load ratio of RV770 is much like the pattern of RV670,
but if it is like the ratio of the R600XT, then I won't be surprised if the 4870 exceeds the GTX 260 in power consumption.
What I cannot figure out is this:
despite the 4870 possibly having a much lower perf/power ratio than the GTX 260, some still claim that the 4870 is superior in terms of the best performance/power balance.
Originally posted by: Rusin
So basically:
If Nvidia keeps its trend, the GeForce GTX 280 could be at HD 2900 XT level in terms of real power consumption, and the GTX 260 would be very close to the 8800 GTX [don't know if this is asking for too much optimism... even from me]
Originally posted by: jaredpace
Anyone plan on running TRI-SLI or SLI GTX 280s?
Originally posted by: Aberforth
Who really gives a damn about power? LOL... if a user were economical and responsible, he'd have settled for an 8800 GT. Why do people buy Tri- and Quad-SLI setups? If they spent that much money on their girlfriends or other meaningful pursuits, they'd rarely have Upgradematism.
Originally posted by: Rusin
You said in that other thread that NV fanboys should stay in this thread. Why don't you stay in that ATI 4xxx thread?
So a 157W TDP is your limit, and the GTX 260's 182W would be too much? That's a 16% difference, and it would take a miracle for the GTX 260 not to have the better performance-per-watt ratio. As for those performance ratios: the "20-30% faster than 9800 GTX" was speculation, and the difference between the 9800 GTX and 9800 GX2 was taken from this test: http://plaza.fi/muropaketti/ar...dia-geforce-9800-gtx,2 . They test game performance, not timedemo performance, and it's in my native language and I trust them. I used minimum frame rates in my calculation; with average frame rates the difference between the 9800 GTX and GX2 would have been larger. I sometimes use ComputerBase's tests if there's nothing better available.
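(Side note: that 16% figure does check out against the TDP numbers quoted above. A minimal sanity check, assuming the quoted 157 W and 182 W TDP figures are accurate:)
```python
# Relative difference between the two quoted TDP figures (157 W vs. 182 W).
low_tdp_w = 157.0
high_tdp_w = 182.0
difference_pct = (high_tdp_w - low_tdp_w) / low_tdp_w * 100
print(f"{difference_pct:.1f}%")  # prints 15.9%, i.e. roughly the 16% mentioned above
```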
Originally posted by: semisonic9
From what I've heard, the 4870 is coming in at $350, while the GTX 260 is expected around $400-450? And the GTX 280 at $500-600?
If the Nvidia chips don't drastically outperform the ATI parts, I can't see NV holding their prices at those levels for long. I'm pretty sure they'll lower their prices to maintain market share. That could force them into a loss situation since, as far as I know, they're still having trouble with yields.
~Semi
He probably means that because the die size is bigger than G8x/G9x, they'll get fewer chips per wafer. Where did you hear they were having trouble with yields?
Originally posted by: keysplayr2003
Originally posted by: semisonic9
From what I've heard, the 4870 is coming in at $350, while the GTX 260 is expected around $400-450? And the GTX 280 at $500-600?
If the Nvidia chips don't drastically outperform the ATI parts, I can't see NV holding their prices at those levels for long. I'm pretty sure they'll lower their prices to maintain market share. That could force them into a loss situation since, as far as I know, they're still having trouble with yields.
~Semi
Where did you hear they were having trouble with yields? Just curious.
Originally posted by: Extelleron
A 576 mm² chip is unheard of in the consumer space... that's Itanium-level die size, for chips that sell for thousands of dollars each. Selling a chip that big for <$400 along with the PCB, memory, cooling, etc. is not going to be ideal at all.
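(To put that die size in perspective, here is a rough back-of-the-envelope sketch, not from any of the posts above, of how many 576 mm² candidates fit on a 300 mm wafer versus a G92-class die, using the common dies-per-wafer approximation; the ~324 mm² G92 figure is an assumption for illustration.)
```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Gross die candidates per wafer (before yield), using the classic
    approximation: wafer area / die area, minus an edge-loss term."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

print(dies_per_wafer(576))  # ~94 candidates for a GT200-sized die
print(dies_per_wafer(324))  # ~181 candidates for a G92-sized (assumed ~324 mm²) die
```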
Originally posted by: BFG10K
Final specs here:
http://www.hardware-infos.com/news.php?news=2092
I guess the high thermals must be keeping down the clock speeds.
Originally posted by: Cookie Monster
Since when were ATI cards "true" DX10 cards? What does DX10.1 have to do with being "true"? You do realise that when a GPU supports a certain set of requirements under the DX10 spec, it can be classified as a DX10 card?
And erm, not to be rude or anything, but where are you getting all these "facts" from? (the rest of your post)
Originally posted by: Nemesis 1
Originally posted by: Cookie Monster
Since when were ATI cards "true" DX10 cards? What does DX10.1 have to do with being "true"? You do realise that when a GPU supports a certain set of requirements under the DX10 spec, it can be classified as a DX10 card?
And erm, not to be rude or anything, but where are you getting all these "facts" from? (the rest of your post)
If you would please look up the original DX10 spec, you will find that it was changed at the 11th hour: what became DX10.1 was left out. If you don't understand what that means for GPU development, read about it and learn. NV couldn't have a part ready for the original DX10 spec, so they cried to the game developers, who in turn put pressure on MS, and MS gave them a year. Talk about a bad smell; when it comes to pressuring the industry and slowing progress, NV is the king. There are a lot of people out there on the net saying DX10.1 brings a lot to the table; only NV and its fanboys are arguing about it. NV tells the world that GPUs rule PCs, yet it is NV that is trying to hold back progress, because ATI's tech is more advanced. It's not me making it up; it's just the facts, and some people are in denial over them. Before the R600 was released I remember the debates here about DX10 and NV not having a DX10 product. Now most of us know it's a fact, and NV still doesn't have a true DX10 card, and they won't even after they release the 200 series; what NV has is a revised DX10 card. NV can't do DX10.1, and that's why MS changed the original spec to fit NV's needs. FACT!
Originally posted by: keysplayr2003
Where did you hear they were having trouble with yields? Just curious.
For the first time, a specific reason was given for the lower-than-expected G92 yields: Michael Hara claimed it was the testing procedures, rather than manufacturing itself, that were the problem. If that's really true, it would imply the test criteria were likely too conservative.
Originally posted by: Nemesis 1
Say what??? NV can do DX10.1 with a firmware update? Pure BS. I believe NV's shaders aren't up to the task of DX10.1. Give me an NV propaganda article to read. True DX10 has unified shaders. FACT!
"Ya, we can do DX10.1, but we choose not to for business reasons." Pure, right-out-and-out BS. It's worse than BS, because it's a lie meant to hurt the competition and hold back progress. People all remember AEG and the lengths NV is willing to go to keep you in their pocket. With most of the game developers in NV's pocket, tell me they aren't trying to stall progress. The good lord gave you a good brain; use it.
Originally posted by: Rusin
...if Nvidia keeps up with their trend, the GTX 280 would be around the 8800 GTX's level in realistic power consumption.