That's because your 7800GT is running in the SM3.0 mode while the X800XL is running in SM2.0 mode
That has absolutely nothing to do with it. For one, there are no extra effects with SM 3.0, since I don't run HDR or bloom on either card.
Secondly, online benchmarks show a 6800GT absolutely demolishing an X800XL, to say nothing of the beating a 7800GT would give it.
But that's because online reviews run at the default eye-sore quality setting rather than high quality, which is why I made the initial comment.
There is really a minimal difference in performance and quality between "Quality" and "High Quality".
This is a joke, right? A 30% performance drop or more is not at all unusual. The performance drop in the Serious Sam 2 demo I use is so big that the result flips from a 6800GT slaughtering my X800XL to my X800XL beating an overclocked 7800GT.
If I have time I'll put up a more detailed analysis of high quality vs. quality. When you start investigating further, the findings are simply amazing: Painkiller, Jedi Academy, Serious Sam 2, the list goes on.
My 400 MHz, 16-pipe X800XL is faster in those games than my overclocked 20-pipe 7800GT.
Interestingly, OpenGL games tend to exhibit the biggest performance hit under high quality. I don't believe ATi's OpenGL ICD is actually as slow as common knowledge would have us believe; it's simply nVidia's default "quality" setting inflating their performance.
If you insist on running the NV on high quality, then you should run the ATI on high quality, also.
ATi's out-of-the-box settings match nVidia's high quality settings. The default nVidia quality setting has truly atrocious shimmering in many games, especially commonly benchmarked ones like Serious Sam 2 and Quake 4. In games like Chrome and Call of Duty the shimmering is so bad it's like a carpet of bees swarming on the ground.
The benchmarks done online with nVidia on the quality setting are invalid for this very reason: when you run under high quality (i.e. to get the level of IQ you get with ATi), you start seeing a whole different picture in terms of performance.
If you want a true comparison, reduce nVidia's scores by 20-30% and then take another look at the results.
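To make that arithmetic concrete, here's a rough sketch with entirely made-up frame rates (the function name, the numbers, and the 25% penalty are illustrative assumptions, not measured results):

```python
# Hypothetical numbers only -- this just illustrates how a 20-30%
# "Quality" -> "High Quality" penalty can flip a benchmark result.

def adjust_for_high_quality(fps, penalty=0.25):
    """Apply an assumed performance penalty (25% here, the middle
    of the 20-30% range) to a score run at default 'Quality'."""
    return fps * (1 - penalty)

nv_quality_fps = 60.0   # 7800GT at default "Quality" (made up)
ati_fps = 48.0          # X800XL at its defaults (made up)

nv_hq_fps = adjust_for_high_quality(nv_quality_fps)  # 45.0

print(nv_quality_fps > ati_fps)  # True: nVidia "wins" at default quality
print(nv_hq_fps > ati_fps)       # False: the win evaporates at matched IQ
```

The point isn't the specific numbers; it's that a lead smaller than the quality-setting penalty disappears once both cards render comparable image quality.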
What about filtering and other stuff? The NV setting is for everything, not just texture.
Again, ATi's default settings are comparable to nVidia's high quality setting. nVidia's quality setting is grossly inferior to ATi's.