Originally posted by: bryanW1995
I know that 55nm is more efficient than 65nm, but I'm extremely concerned about the relatively small power draw of the 4870. If it's really only 150w, and gt200 ends up at the rumored 250w, this could get ugly even for the 4870x2. Of course, it could take nvidia 6 mos. to get gt200 out in volume, too...
Originally posted by: Extelleron
Well single-chip RV770 isn't going to match GT200 in performance, and I think that 4870 X2 power consumption should be close to that of GT200 (along with performance being competitive IMO).
Originally posted by: Extelleron
I think AMD is holding back on the clocks a bit; I'm pretty sure RV770 @ 55nm can do a bit more than 850MHz on the core. R600 did 745MHz, and the only reason it wasn't 800MHz+ was that thermals were already insane. I think AMD wanted to focus on performance-per-watt as much as raw performance this generation, and RV770 might have a good amount of overclocking headroom because of that.
Originally posted by: golem
Originally posted by: Extelleron
I think AMD is holding back on the clocks a bit; I'm pretty sure RV770 @ 55nm can do a bit more than 850MHz on the core. R600 did 745MHz, and the only reason it wasn't 800MHz+ was that thermals were already insane. I think AMD wanted to focus on performance-per-watt as much as raw performance this generation, and RV770 might have a good amount of overclocking headroom because of that.
I'm not sure this is a good idea, especially for their high-end parts. This makes more sense as a strategy for lower-end or HTPC parts, not performance parts, where having the fastest card around is a big marketing point.
Originally posted by: munky
Originally posted by: bryanW1995
I know that 55nm is more efficient than 65nm, but I'm extremely concerned about the relatively small power draw of the 4870. If it's really only 150w, and gt200 ends up at the rumored 250w, this could get ugly even for the 4870x2. Of course, it could take nvidia 6 mos. to get gt200 out in volume, too...
Not necessarily. I would imagine the 4870x2 would be competitive against a gt200, and it would not have to compete against a dual-gt200 card until either:
a) Nvidia refreshes the gt200 on a smaller, more energy-efficient process, or
b) Nvidia includes an external PSU and water cooler with every gt200gx2 it ships
Originally posted by: ghost recon88
Originally posted by: Extelleron
Well single-chip RV770 isn't going to match GT200 in performance, and I think that 4870 X2 power consumption should be close to that of GT200 (along with performance being competitive IMO).
How do you know RV770 won't compete with the GT200?
Originally posted by: BenSkywalker
How do you know RV770 won't compete with the GT200?
Pretty much, based on the most optimistic estimates we have seen, it won't be able to keep up with an 8800 Ultra. I'm not saying that is the case, but the rumors we have seen have all been in the same general ballpark; if any of them are to be believed in terms of chip architecture and what has been done to it, the GT200 will likely slaughter it single vs. single. What is sad is that we feel fairly comfortable saying this without knowing anything about the GT200. Pretty much, if nV just released a slightly upclocked 9800 they could comfortably hold the single-GPU crown for a while longer yet. The GT200 may end up being a laughable upgrade over the 9800, but realistically speaking that is all nV needs if anything we have heard about the RV770 chip is correct.
The 8800 Ultra isn't 2x faster than the 3870, and it looks like the 4870 will be.
Originally posted by: BenSkywalker
The 8800 Ultra isn't 2x faster than the 3870, and it looks like the 4870 will be.
You must have seen WAY different numbers for the chip layout than I have, because from what I have seen it looks to be marginal over the 3870, and the claims we have heard put it at 1.5 times the performance of the 3870, which isn't close. Also, the 8800 Ultra is normally 50%-100% faster than the 3870. I think people have just gotten used to comparing the 3870x2 to nVidia's older parts, so they are at least somewhat close.
Originally posted by: Extelleron
Originally posted by: BenSkywalker
The 8800 Ultra isn't 2x faster than the 3870, and it looks like the 4870 will be.
You must have seen WAY different numbers for the chip layout than I have, because from what I have seen it looks to be marginal over the 3870, and the claims we have heard put it at 1.5 times the performance of the 3870, which isn't close. Also, the 8800 Ultra is normally 50%-100% faster than the 3870. I think people have just gotten used to comparing the 3870x2 to nVidia's older parts, so they are at least somewhat close.
4870 has 103% more shader performance, 120% more texture performance, and 72% more memory bandwidth than RV670.
And there are very few cases where 8800 Ultra is 2x 3870... Call of Duty 4 is a serious outlier. Look at Crysis, for example, and that's a game where R600 cards don't do so hot. Double the 3870's performance, and the hypothetical 4870 would be 28% faster than an Ultra. For an idea of 4870 X2 vs GT200, assuming 80% CF scaling, GT200 would have to be 130% faster than 8800 Ultra to compete.
Look at a game like Oblivion (2560x1600 4xAA/16xAF)... there, doubling the 3870's performance gives you 45% higher performance than the 8800 Ultra. In engines where R600 does well, like UE3 (Bioshock, UT3, RB6:V, etc), the 4870 would be significantly faster as well.
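The arithmetic in that comparison can be sketched as a quick back-of-envelope script. Note that every input here is one of the thread's rumors or assumptions, not a measured benchmark: the ~56% Ultra-over-3870 lead is implied by the "28% faster" claim, and the 2x 4870 performance and 80% CrossFire scaling are taken from the post itself.

```python
# Back-of-envelope check of the scaling claims above. All inputs are the
# thread's rumors/assumptions, not measured benchmark results.
hd3870 = 100.0            # normalize HD 3870 performance to 100
ultra = hd3870 * 1.5625   # 8800 Ultra lead implied by the "28% faster" claim

hd4870 = 2.0 * hd3870     # rumored: 4870 ~ 2x the 3870
cf_scaling = 0.80         # assumed CrossFire efficiency
hd4870x2 = hd4870 * (1 + cf_scaling)

# Uplift GT200 would need over the 8800 Ultra to match a 4870 X2
required_uplift = hd4870x2 / ultra - 1

print(f"4870 vs 8800 Ultra: {hd4870 / ultra - 1:+.0%}")   # +28%
print(f"GT200 needs:        {required_uplift:+.0%}")      # +130%
```

Under those assumptions the numbers reproduce the post's figures; the whole chain obviously stands or falls on the rumored 2x claim and the CrossFire scaling estimate.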
Originally posted by: Azn
Originally posted by: Extelleron
Originally posted by: BenSkywalker
The 8800 Ultra isn't 2x faster than the 3870, and it looks like the 4870 will be.
You must have seen WAY different numbers for the chip layout than I have, because from what I have seen it looks to be marginal over the 3870, and the claims we have heard put it at 1.5 times the performance of the 3870, which isn't close. Also, the 8800 Ultra is normally 50%-100% faster than the 3870. I think people have just gotten used to comparing the 3870x2 to nVidia's older parts, so they are at least somewhat close.
4870 has 103% more shader performance, 120% more texture performance, and 72% more memory bandwidth than RV670.
And there are very few cases where 8800 Ultra is 2x 3870... Call of Duty 4 is a serious outlier. Look at Crysis, for example, and that's a game where R600 cards don't do so hot. Double the 3870's performance, and the hypothetical 4870 would be 28% faster than an Ultra. For an idea of 4870 X2 vs GT200, assuming 80% CF scaling, GT200 would have to be 130% faster than 8800 Ultra to compete.
Look at a game like Oblivion (2560x1600 4xAA/16xAF)... there, doubling the 3870's performance gives you 45% higher performance than the 8800 Ultra. In engines where R600 does well, like UE3 (Bioshock, UT3, RB6:V, etc), the 4870 would be significantly faster as well.
It doesn't always work that way.
Did you look at any of the other benchmarks? The only two games that showed anything REMOTELY approaching a 50-100% difference between the Radeon and the 8800 Ultra were CoD4 and STALKER.
4870 has 103% more shader performance, 120% more texture performance, and 72% more memory bandwidth than RV670.
Well, that may be the case, but there's no need or point in handing nVidia the crown before their latest card has even been released.
Originally posted by: BenSkywalker
Right now, nVidia is the King, Queen, Prince and Princess.
Originally posted by: lopri
That heatsink looks like 3850's. The core is definitely different, though. Thanks for the pic, Killrose.
I am eagerly waiting for AMD's new offerings. Day by day I grow resentful towards NV's Vista drivers. (it could be the 780i issue or both, though)
Seems odd to me that you "hate them more every day" and I can't find problems.
Really? And why is that odd exactly? Perhaps your tiny sample of games simply misrepresents larger samples?
Overclocking can do the same for any vendor; what's your point? Although I don't OC for the most part, running products out of spec can cause misreported "NVIDIA driver errors" from what I've seen on NZONE.