Originally posted by: tuteja1986
<div class="FTQUOTE"><begin quote>Originally posted by: Cookie Monster
Except that the 2900XT loses every single AA/AF benchmark.</end quote></div>
That's a lie, Cookie Monster. You're such a fanboy. ATI's AA is much higher quality than NVIDIA's AA.
ATI's 8xAA can beat up NVIDIA's 8xQ AA.
12x CFAA can beat up any AA mode NVIDIA has.
Sometimes I wonder if you're a kid, based on some of your posts.
I was simply stating the fact, from TweakTown, that the XT lost all of their benchmarks with AA/AF enabled. What's so fanboy about saying that?
Oh boy, and that last sentence really cracks me up. So almost all reputable sites are wrong when they indicate that nVIDIA still holds the IQ crown? Tell me, how is ATi's AA better than nVIDIA's in terms of quality? Are we going to zoom in 400% to find these negligible differences?
You say you own all the latest hardware, but the way you act and write (e.g. "beat up"? this ain't school) makes me believe that you're making things up, or pretending to talk about things you have absolutely no idea about.
And for one, G80 can do shader AA AND CFAA, which is what R600 is doing. It's just a waste of time and effort for them, because neither method brings much of an advantage. Shader AA costs you a bigger performance hit when you enable AA, with the resulting IQ being the same. CFAA is a more advanced form of Quincunx AA: less blurring, but there is STILL blurring going on, as most reputable hardware sites have said, and in the end it makes AF pointless because the blur throws away texture detail. The performance hit for enabling such an AA mode isn't as pretty as some may have led you to believe.
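To illustrate the blur point, here's a tiny toy sketch (mine, not ATI's actual filter; the 4x4 framebuffer, tent weights, and sample layout are all made up for illustration). It compares a plain box resolve, which averages only a pixel's own samples, with a CFAA-style wide tent that also pulls in the neighbours' samples:

    #include <cstdio>

    const int W = 4, H = 4, S = 4; // tiny framebuffer, 4 samples per pixel

    // Standard resolve: average only this pixel's own samples.
    float box_resolve(float smp[H][W][S], int x, int y) {
        float sum = 0.0f;
        for (int s = 0; s < S; ++s) sum += smp[y][x][s];
        return sum / S; // 4 fetches, no cross-pixel mixing
    }

    // CFAA-like wide tent: also weight in samples from the 8 neighbours.
    // Edges come out smoother, but in-pixel texture detail gets averaged away.
    float tent_resolve(float smp[H][W][S], int x, int y) {
        float sum = 0.0f, wsum = 0.0f;
        for (int dy = -1; dy <= 1; ++dy)
            for (int dx = -1; dx <= 1; ++dx) {
                int nx = x + dx, ny = y + dy;
                if (nx < 0 || nx >= W || ny < 0 || ny >= H) continue;
                float w = (dx == 0 && dy == 0) ? 2.0f : 1.0f; // tent weight
                for (int s = 0; s < S; ++s) {
                    sum += w * smp[ny][nx][s];
                    wsum += w;
                }
            }
        return sum / wsum; // up to 36 fetches instead of 4
    }

    int main() {
        float smp[H][W][S] = {}; // dark background...
        for (int s = 0; s < S; ++s) smp[1][1][s] = 1.0f; // ...one bright texel

        printf("box : %.2f\n", box_resolve(smp, 1, 1));  // 1.00, detail kept
        printf("tent: %.2f\n", tent_resolve(smp, 1, 1)); // 0.20, detail bled away
        return 0;
    }

The bright texel comes out of the box resolve at full brightness and out of the tent at about a fifth of it, because it got smeared across its neighbours. That's exactly the texture detail AF worked to fetch, and the tent also touched 9x the samples, which is where the extra performance hit comes from.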
End of rant. What a complete waste of time. There's a limit to what drivers can do, and they won't magically fix the AA performance drops, because the cause is hardware: R600 does not do AA resolve in its ROPs, so the resolve has to be done in the shaders.
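To put a rough number on that (my own back-of-envelope arithmetic, assuming a 1600x1200 target, 4x MSAA, and 32-bit color, not a measured figure): a resolve pass has to read 1600 x 1200 pixels x 4 samples x 4 bytes, which is about 29 MB, and write back about 7 MB, every single frame. ROPs with fixed-function resolve hardware chew through that as part of the pipeline; doing the same work on the shader core instead burns texture-fetch and ALU cycles that would otherwise be shading pixels, and no driver release changes where that silicon is.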