Originally posted by: BFG10K
Sorry? What's false? That still images can't demonstrate shimmering?
False. What is false is that you are pretending that NV doesn't shimmer worse.
Originally posted by: BFG10K
Agreed, nVidia's default driver settings shimmer far more than ATi. I am not disputing this at all.
Under certain circumstances, NV shimmers much worse than ATi. That is a fact.
It sure looked like it in the previous post of yours that I quoted: "yet you don't seem have any trouble believing nVidia shimmers more than ATi with AF." See?
Originally posted by: BFG10K
Obviously you have not played with each card at 1920x1200 on a 24" WS LCD, or you wouldn't believe they are both the same.
I regularly game at 1920x1440 with 16xAA.
In any case I didn't say they were the same, that's simply a strawman argument you've concocted. I simply stated still images are useless for showing the problem like Joker was attempting to do with AA.
And you do know that the xS modes reduce texture shimmering, right?
And? That has nothing to do with it. I said a 24" WS LCD too, not just the res. Shimmering is much more noticeable on a big LCD; again, look at HardOCP's evaluation, where they talk about this, for confirmation. I thought your monitor's max res was 1600x1200...? But that doesn't matter. As I said, the shimmering difference is much worse on a large LCD than on a CRT.
I simply pointed out that shimmering is much worse on NV cards with certain hardware. Don't like it? Doesn't matter; it's the truth.
Yes, I know this. But as I said, it's useless to me. It doesn't matter how good something looks if it's a slide show or has massive slowdowns.
Originally posted by: BFG10K
I totally disagree. Maybe in the past, but not now, as the G7x series is fast enough to use it in a lot of games, even in single-card configurations. Also, if you can run 8xAA you can generally run 16xAA, as the performance hit between the two is surprisingly small.
NV's 8x is virtually useless.
It's your right to disagree. Another poster in this topic agrees with me. I tried it when I had 2x 7800GTXs; I also tried 8x and 16x SLI AA. 8x SLI AA was barely playable in a few games, and not in any new games. 16xAA was not even close to being playable.
Originally posted by: BFG10K
Vampire Bloodlines is a 2004 game and it runs well on just a regular 7800GT with 16xAA enabled. It also looks gorgeous.
Yes, it looks very nice, but getting playable frames in any sort of new game is not going to happen.
"Runs well" is subjective. I do not like slowdowns, stuttering, or anything like that, and it certainly would have slowed down too much for me. I tried 8x on several games when I had just one GTX, and it wasn't even close to playable in any of the games I was playing at the time. Again, what is playable is very subjective. I don't like pauses, stuttering, slide shows, or anything of that nature.
Originally posted by: BFG10K
Yes, it is. The image quality of 16xAA has to be seen to be believed, and a very large chunk of my gaming library is now being enjoyed at 16xAA.
It's nice for very old games,
That's good. I don't play old games that often, if at all. I play them when they come out, and that's pretty much it, aside from a few really good online FPSes with good replay value, which wouldn't run well enough, because you need all the speed you can get in those types of games.
Originally posted by: BFG10K
The same applies to 6xAA (especially adaptive) unless you run Crossfire, at which point you can bring SLI into the picture.
There is zero chance of me getting playable frames in virtually any game at 1920x1200.
Hardly. 6xAA on ATi has a much, much smaller performance hit than 8xAA on NV's cards. When I had just one XT, I ran 6x in almost everything. 8x in the same games, on the same PC, with SLI'd 7800GTXs was not playable to me. Not even close.
And I do run Crossfire: X1900s.
Originally posted by: BFG10K
The fact is, on single cards ATi has no equal to nVidia's 8xAA and 16xAA, and given how ATi fanboys always harp on about ATi's superior IQ, they conveniently like to downplay nVidia's AA superiority.
Fact? It looks like ATi's AA is better in these shots, although it's just one game. The bridge is what makes the difference; all the other edges look about the same to me. I doubt I would notice it much while playing.
I "downplay it" because it's useless to me. I've seen shots comparing SLI AA to SuperAA, and I have used both of them first hand; I can't tell any difference. Sure, SLI goes up to 16xAA, but I never saw the difference between that and SuperAA's 14x while playing. Also, with CF you can do 2x, 4x, 6x, 8x, 10x, 12x, and 14x; with NV you go from 8x to 16x with nothing in between, so you can fine-tune your maximum playable AA much more on ATi's setup. Not to mention SLI AA takes a much, much larger hit than SuperAA; in fact, 14x SuperAA is usually about as fast as NV's 8x SLI AA. So yeah, to me ATi does have the AA advantage. Not because I think it looks better (they are both pretty comparable, except for NV's SS), but because I can apply more AA while still getting playable frames. Throw that on top of better AF, and now a select few games with HDR+AA, and yeah, I think ATi has the IQ advantage.