you made the comparison of a 640MB being able to run AA when the 320MB wouldn't.
As did Anandtech, Madshrimps, and HardOCP.
They made that comparison because the settings between the cards were identical and therefore valid. The only differentiating factor in their benchmarks was performance, which is what separates the two versions.
The comparison you're pushing for is flawed because the settings are not identical. You want to call a benchmark fair where one card has 4xAA enabled and the other has 0xAA, when such settings are far from it.
I pointed it out as a flawed comparison because 1) in most current games, they run within 5% of each other with AA off, and 2) the 640MB will never run higher FPS with AA on than the 320MB will with AA off.
No one ever claimed otherwise on your point #2. But what you're trying to claim is that the G80 640 MB can't use AA with current games and get acceptable frames.
Your method is just to disable AA altogether because the G80 640 MB supposedly doesn't have enough horsepower for you.
Since you can't really tell the benefit of AA to begin with, this is a completely valid method for a gamer such as yourself. I'm sorry, but there's not much reasoning I can offer someone who thinks a 2560x1600 display without AA looks better than a 1920x1200 one with 16xAA.
Still, buying a G80 for AA in today's games is fine, but buying one and expecting it to run your target resolutions in tomorrow's games with AA is simply unrealistic.
It's more realistic than expecting a card that already can't compete when using AA to keep level performance with the one that can. I'm not saying the 640 MB will be able to use 4xAA in tomorrow's games, but it can do so just fine in many existing ones and has a better chance of doing so in the games to come than the 320 MB has.
So what then? Upgrade your GPU again?
Do you think the 320 MB will have better longevity than the 640 MB?
:roll:
It's already behind now; what makes you think its lifespan will be any better?
Think about what you're arguing. You think a good buy is an investment in an already weaker card for tomorrow's games.
I don't know of a single person who has made the transition from 1280x1024 to 1920+ who prefers a really non-aliased image to a much higher resolution...
Then I suspect you don't know very many people. And if you do, they don't know what good graphics are.
...which brings me to my initial comparison of my preferences.
Exactly. That's all you've been trying to compare: preferences. That's the problem.
You can't fault someone for wanting fewer than 60 frames with AA just because it doesn't fit your preference. The 640 MB allows for higher IQ and acceptable gameplay. Maybe not for you, since you want 60+ frames, but if that's your frame target you obviously don't care about IQ anyway: most monitors have a refresh rate of 60 Hz, so pushing 60+ would mean disabling vsync and putting up with screen tearing.
As for the rest, again, new APIs and new game engines typically do that to the landscape of GPUs.
Precisely. That is why it is ideal to have as powerful a GPU as possible, so that the impact will not be as severe.
If one wants to buy for tomorrow's games, the mid-range market isn't the area to purchase from. Unless of course they're okay with playing with poor IQ.
There's enough out there to set expectations for the next generation of games, and as history shows, it's not looking good for AA.
History shows AA levels increasing not only in degree but in playability, so I'm not sure why you're saying AA's future isn't looking good. Today's flagship video card handles 4xAA the way the previous generation handled 2xAA, and many gamers are enjoying 8x or higher in a wide variety of games.
I'd rather run 1920 with no AA than run 1280 + 25,534 AA, and I'm sure anyone else who made that jump would agree.
We're not talking about a level of AA that is astronomical. We're talking about the bare minimum AA level to achieve a noticeably different picture. The fact that you can't tell the benefit of AA doesn't mean the card that doesn't have the power for it is the better buy.
=========================================================
I would not say it even comes close to a GTX.
I didn't say it did. I said a G80 core, which can be a GTX or either of two different GTSs. In some cases, an X1950XTX can compete quite nicely against a GTS, and it's using a core over a year old. It doesn't happen every time, and the nod certainly goes to the GTS overall, but that doesn't mean it isn't impressive.
As shown in the TR review, it runs AA as well as or better than the best ATI currently has to offer.
Depending on the resolution and amount of AA.
Anandtech found the X1950XTX beating the GTS in Quake 4 at 1920x1200 with 4xAA.
But I agree, I'd go for the 8800GTS overall since I myself use a resolution smaller than that.
However, that same benchmark showed a 7950GT outpacing an 8800GTS 320 MB. Why would you bother purchasing the only G80 that a previous-generation card from the same company can beat?
Did you read the TR review where it was getting good average FPS with 4xAA? Even at resolutions that used to require two cards?
Yes, I did. But when I read three other reviews claiming otherwise, I figured the TR review didn't stress both versions enough to catch the difference.
I guess maybe you are asking too much out of a $300 card, something no one has ever required from a card at that price.
Of the reviews I've seen that consistently show a difference between the GTS 320 MB and the 640 MB, the weaker GTS's only advantage is its price. I agree that for $300 it's a good card, and if you don't have an uber-high-resolution monitor, the differences won't be as great.
I suppose you are pissed that the GTX does not do full scale holographic rooms like on Star Trek or something?
No, it can; you just can't use AA with it, so it'll look like crap. So why bother?