Originally posted by: BFG10K
Yeah but how does it affect most users aside from seeing a higher number on the 'GPU thermometer'?
Higher thermals usually lead to more noise due to more cooling required.
And weren't you the guy worrying about an extra $2 a month so he could buy some coca-cola?
I don't care about power consumption per se; what I care about is how it translates to noise.
What are you talking about? *Show me the links!* I missed the "far higher" and "significant difference" you claim for thermal differences between R600 and G80.
Click.
At load the 2900 uses 50% more power than the 8800 GTS so there's no real comparison there. That kind of power consumption visibly translates to more heat and more noise.
The 8800 Ultra is probably quite close but from the tests done online it's still quieter and vastly faster than the 2900 when running AA, so if you're going to have a noisy card you may as well get that one.
It's not necessarily allowing it. It's that ATI has it written into the driver differently.
Uh, what? ATi haven't written it into the driver because there's no profile to allow those games to use AA. You can try to use Oblivion's profile but that's unusably slow, not to mention there's no guarantee the game will be glitch-free.
I think ATi deliberately doesn't implement AA in those games because it forces reviewers to test without it, which puts ATi in a better light. If AA were tested, ATi's cards would likely tank compared to nVidia's.
I also played a game called Final Fantasy XI which had numerous reports of nVidia's drivers leaking texture memory and dropping down to 10fps after a few minutes of play. This occurred with all models of 8800, from the GTS 320 to the Ultra. They all did it.
The Alt-Tab issue has now been fixed according to the driver release notes.
BFG10K - I'm confused - I thought that in DX10 mode, UT3 games couldn't do AA, period - due to a specific type of shading being used. That's what I THOUGHT anyway. If you mean DX9 mode, then you may be right, I have no clue, but last I knew, there was no AA for Bioshock, RS: Vegas and so on. Am I wrong?
You're mixing two different things, Unreal 3 and DX10; Unreal 3 games aren't necessarily DX10.
You're also mixing shading with DX10 when the issue is deferred rendering, which has nothing to do with DX10: deferred rendering can be done with DX10 but doesn't have to be.
R6: Vegas and MoH: Airborne are currently only DX9 and nVidia's driver can force AA there. Bioshock is DX9 and DX10 and while nVidia's driver can only force AA in DX9 mode, DX10 support is coming.