Originally posted by: CP5670
That ixbt table is very interesting. What exactly are the "HQ" benchmarks, though? I would really like to know exactly how much performance is lost by using the HQ modes on Nvidia cards (without spending two hours doing my own benchmarks), but the performance hits for some of those things are so big that I think the reviewers may mean something different by HQ.
Translation:
"I'd like to remind you, many games already face a cpu or a full system bottleneck, and as a result the most powerful graphics accelerators cannot show full potential even in 4AA/16AF modes. As such, we are introducing a new mode HQ (high quality) which means:
HQ: ATI Radeon X1xxx: 6AA plus Adaptive AA, plus 16AF at the High Quality setting
HQ: NVidia Geforce 6xxx/7xxx: 8AA plus TAA (Transparency AA with gamma correction), plus 16AF.
This means maximized load on the graphics accelerators; in other words, their power is used at nearly 100%. Even with the differences in AA/AF implementations (between the two brands, that is), the quality enabled is equally good.
To those who will complain that you cannot compare differing AA modes, I say the criterion remains the same: maximum image quality. ATI offers better-quality AF; Nvidia offers better-quality AA. But believe me, without zooming in and analyzing screenshots in detail, you will not be able to discern the difference in quality. As a result, we can compare different settings between the two cards, since what matters is performance at the maximum image quality each can offer the end user."
Source: see the top of the linked page, in bold.