8800GTS 320mb reviews and conclusion


Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Skyguy
So, just to switch gears.........

Are we all in agreement that at 1280x1024 the 320mb card will more than suffice, even with AA on?

It depends on the game, and also if you plan on using TRAA. For the most part it will suffice, but in some games like COD2 and COH the 320mb version is a lot slower even at 1280x1024 resolution.
 

Skyguy

Senior member
Oct 7, 2006
202
0
0
Crap, my buddy primarily plays CoD2 and I was hoping to save some $$ on his build with the 320......
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: munky
Originally posted by: Skyguy
So, just to switch gears.........

Are we all in agreement that at 1280x1024 the 320mb card will more than suffice, even with AA on?

It depends on the game, and also if you plan on using TRAA. For the most part it will suffice, but in some games like COD2 and COH the 320mb version is a lot slower even at 1280x1024 resolution.

Looking at the TR review, the GTS was either close to or better than the X1950XTX in average frame rates with 4xAA. So unless you are saying anybody who bought the best ATI card ever released bought a piece of crap, I'm not sure what your complaint is. Even in Oblivion at 1920x1200 it passes the X1950 by a sizable margin. Sure, a $1200 SLI 8800GTX setup is better, but that's not comparing even remotely the same thing. Pointing out that a more expensive card is better (640mb) adds nothing to this discussion.

Comparing it to its nearest competition, it wins more benchmarks (even with AA) and is less expensive.

I know, I know, you can cherry-pick a few benchmarks in even fewer games. However, if you look at ALL the benchmarks from ALL the sites, it is better even with AA on.

Too bad Ratchet left Rage3D; maybe you would believe one of his reviews.
 

munisgtm

Senior member
Apr 18, 2006
371
0
0
Overall, in its price range it really rocks. So if you are looking at the budget side of things, then the 320mb is good.
 

NamelessMC

Senior member
Feb 7, 2007
466
0
0
As some have pointed out, there's obviously a flaw in the 8800GTS 320's driver:
1) X1900XT 256MB is beating the 8800GTS 320MB in some benchmarks.
2) At higher resolutions with no AA/AF, the 8800GTS 320MB closes the gap.
3) With cards like the 7950GT 512MB and the 7950GT 256MB, it's been proven memory size matters little compared to memory architecture and frame buffer allocation.

With better drivers (after a lot of new owners complain about the performance), I'd estimate the 320MB will perform 5-10% below the 640MB version, and be highly overclockable, at least enough to run as well as a stock 640MB 8800GTS.

Don't get so discouraged guys. Seriously, it's quite simple. If you game at resolutions of 1600x1200 or higher, or you live by AA/AF @ 1280x1024 or higher, the 640MB 8800GTS is the card for you.

If you game at 1280x1024 or lower with AA/AF, get the 8800GTS 320MB in confidence, because I guarantee there's a driver issue here that's not being flagged. And even if review sites don't re-evaluate the 8800GTS 320MB after the driver correction, that's a GOOD thing, because that means demand will stay low and you're likely to see 320MB 8800GTS's in the low $200 range soon, especially when the R600 and new G80 revisions hit the market.

Also, you guys shouldn't expect the 640MB 8800GTS to maintain a large price premium over the 320MB. Just look at the 7-series and X1-series cards. The only huge price differences show up at the extreme high end (X1950XTX or 7900GTX); for mid-high-end cards like the GTs or regular XTs, the price difference between the 256MB and 512MB versions was minuscule, which reflected the performance differences really... "minuscule".

I predict that price drops on the 320MB 8800GTS will be followed by price drops on the 640MB. The only cards that stay at the top of the price curve are GTX-series cards or the latest "GTs" and "Ultras" to hit the market.

Again guys, this market is extremely predictable and a lot of you are throwing early stones. Look at the 8800GTX's early benchmarks. A lot of big numbers here and there, but also a HUGE amount of driver conflicts, system crashes, and overheating. A few weeks later, the 8800GTX was the bar-none best card on the market. Give the 320MB some time to settle in and get some driver support and you're likely to see it put all the 79-series and 19-series cards to shame. (Except of course the notorious X1950XTX Crossfire, which still packs a punch at high resolutions but gets the game brought to it by a single 8800GTX that is coincidentally $150-200 cheaper than an XTX CF setup.)

Nvidia doesn't want the 320MB 8800GTS to be $100 cheaper than the 640MB, because that's how marketing works. The 320MB will at most be $45-50 cheaper than the 640MB, but that's because Nvidia always wants you to "step up" higher when you compare cards.

After all, the GTX is only $75-100 more than the GTS 640MB.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: NamelessMC

3) With cards like the 7950GT 512MB and the 7950GT 256MB, it's been proven memory size matters little compared to memory architecture and frame buffer allocation.

but the g80 is a different architecture...

on high-end ati gpu's, the extra 256mb memory made a pretty significant difference, even tho on nv's g70 there was hardly any at all...
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Wreckage
Originally posted by: munky
Originally posted by: Skyguy
So, just to switch gears.........

Are we all in agreement that at 1280x1024 the 320mb card will more than suffice, even with AA on?

It depends on the game, and also if you plan on using TRAA. For the most part it will suffice, but in some games like COD2 and COH the 320mb version is a lot slower even at 1280x1024 resolution.

Looking at the TR review, the GTS was either close to or better than the X1950XTX in average frame rates with 4xAA. So unless you are saying anybody who bought the best ATI card ever released bought a piece of crap, I'm not sure what your complaint is. Even in Oblivion at 1920x1200 it passes the X1950 by a sizable margin. Sure, a $1200 SLI 8800GTX setup is better, but that's not comparing even remotely the same thing. Pointing out that a more expensive card is better (640mb) adds nothing to this discussion.

Comparing it to its nearest competition, it wins more benchmarks (even with AA) and is less expensive.

I know, I know, you can cherry-pick a few benchmarks in even fewer games. However, if you look at ALL the benchmarks from ALL the sites, it is better even with AA on.

Too bad Ratchet left Rage3D; maybe you would believe one of his reviews.

I give more credibility to sites that know how to do a thorough review, and moreover to sites that even know about features such as TRAA and AAA. All the sites that have been using shimmerfest settings for the last 1.5 years on the 7-series don't qualify. All the sites that recommend a $300 card because of how it performs without AA don't either.

I said "For the most part it will suffice", which is obvious from the benches. It's also obvious that the 320mb gts shows a weakness in several benches, which you seem to ignore.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Comparing it to its nearest competition, it wins more benchmarks (even with AA) and is less expensive.
Its nearest competition is the 640 MB GTS.

I agree that it looks to be a better buy over an X1950XTX. Not to say that the X1950XTX is bad - it's using the same exact core that was made over a year ago and is competing quite nicely against nVidia's 3-month-old G80 core.

Still, the G80 320 MB looks to be a good budget card. It's just that if you get it you will be opting for the only G80 that doesn't allow for much AA with newer games, if any at all, and will have to make even more IQ sacrifices with games down the road. TechReport may have seen little difference between the two with AA, but Madshrimps, HardOCP, and Anandtech have all shown the 640 MB being able to use AA where the other can't.

The 320 MB looks to be a very nice card for the $300 price point. The best. And if one simply can't go ~$80 higher, what else is there?

I just don't see the point in trying to buy a G80 that can't use the bare minimum AA in a number of current games. Especially when the price difference is ~$80.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: josh6079
That's the basis for your whole argument? A crappy port that is only one game out of many and doesn't even have working AA?
It's the closest thing we have to a DX10 engine today, whether it's a crappy port or not. When you're getting 35 fps at 1920 on a GTX, the last thing you should be worried about is AA lol. Oh wait, IQ is so important, you'd rather play Rainbow 6: Slideshow.

If that's your logic, then sure, buy the 320 MB for one game. You'll save around $80 bucks and not be able to use AA when you get bored of Vegas.
It's not my logic, I already bought a 640MB two months ago. I also purchased my card with the target resolution of 1920, since jumping to 2560 was completely unrealistic for a single-card solution today and only becomes less of an option in tomorrow's games. However, for someone planning to run 1280 to 1600 on a 19-22" panel, the 320MB is a perfectly viable option. They'll be able to run any games today with AA at those resolutions and expect to run any of the next-gen games based on the Unreal 3 and other DX10 engines reasonably well, just maybe not with AA. I'll be in the same situation at 1920, same as the guy with the GTX at 2560.
There's simply no use to discussing image quality differences with you then.
Ofc not, you're still trying to get your image to "look good" on your 17" "gaming CRT". lol.

Sorry, but at 1920x1200 with 16xAA, you're going to have a hell-of-a-lot nicer picture than 2560x1600 without any AA.
I doubt you've seen either so continue to speculate. Not only will you get better frame rates @2560, but you won't have a smudged mess trying to mask the difference in pixel count.
Yeah, they do run similarly, without AA. When you look at more than just one crappy ported game, you find several major titles that allow the 640 MB to have playable frames and noticeably better IQ.
Major titles today. Major titles tomorrow, you're both in the same boat, no AA or slideshow. But again, even in some current titles, there's the option to run AA and get borderline playable frame rates, or simply turn AA off and get 60+. In those situations there's no tangible benefit of having 640MB over 320MB when they both run about the same with AA off.

I also would like to know where you're getting your information that Crysis will run at 25 fps with 4xAA on a 640 MB GTS, since the UT3 engine isn't what Crysis will be using.
There was a video floating around with a THG guy doing an interview about Crysis. The demo was run on a 2407 at 1920, and although most of the demo looked smooth, there was a portion with a helicopter where frames were noticeably choppy. Without knowing any other IQ settings or even whether the card used was ATI or NV, that has to raise concerns for anyone who has been around long enough to know what new game engines/APIs do to video cards.



You think running one card with 4xAA vs. another with 0xAA is a "valid comparison"?

Yes, a 640 MB with 4xAA will get lower frames than a 320 MB without AA. It is a sacrifice of IQ for performance that each gamer will decide if they'll make. You are one who seems inclined for lower IQ, better performance. That's fine.

Nope, you made the comparison to a 640MB being able to run AA when the 320MB wouldn't. I pointed it out to be a flawed comparison because 1) in most current games, they run within 5% of each other with AA off and 2) the 640MB will never run higher FPS with AA on than the 320MB will run with AA off. There isn't much of a decision to make when the game makes it for you and you're running at 24 fps with AA on one hand and 12 fps with AA on the other (if we put that in a pretty bargraph that's 100%! :Q). Anyone who isn't interested in looking at a really pretty slideshow is going to simply turn AA off.....at which point there is virtually no difference between the 320 and 640MB other than the fact they're both running at playable frame rates.

I however don't see the point in buying a G80 if I'm not going to use AA.
Based on your comments in this thread and others, I don't think you'd buy a G80 regardless. Still, buying a G80 for AA in today's games is fine, but buying one and expecting it to be able to run your target resolutions in tomorrow's games with AA is simply unrealistic. So what then? Upgrade your GPU again?
We know. No one is saying that you can't be content without using AA for your games. What I take issue with is when you're saying AA in general isn't as beneficial as a high resolution and that Crysis is only going to get 25 fps with 4x AA and a 640 MB GTS just because R6: Vegas is struggling - a completely different game ported from the 360 and using a completely different engine.
Well, I guess it comes down to preference heh. I don't know of a single person who has made the transition from 1280x1024 to 1920+ who prefers a really non-aliased image to a much higher resolution, which brings me to my initial comparison of my preferences. I'd rather run 1920 with no AA instead of running 1280 + 25,534 AA, and I'm sure anyone else who made that jump would agree as well.

As for the rest, again, new APIs and new game engines typically do that to the landscape of GPUs. What was plenty fast before becomes "just enough". I don't use 3DMark much as a frame of reference, but one thing it does do very well is show the different performance levels between APIs and how they stress video cards. There's enough out there to set expectations for the next generation of games, and as history shows, it's not looking good for AA.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: josh6079
Its nearest competition is the 640 MB GTS.
I was considering the X1950XTX as its competitor. I would not consider the same product with more memory as competition (just a more expensive option).

I agree that it looks to be a better buy over an X1950XTX. Not to say that the X1950XTX is bad - it's using the same exact core that was made over a year ago and is competing quite nicely against nVidia's 3-month-old G80 core.
I would not say it even comes close to a GTX.


Still, the G80 320 MB looks to be a good budget card. It's just that if you get it you will be opting for the only G80 that doesn't allow for much AA with newer games, if any at all, and will have to make even more IQ sacrifices with games down the road. TechReport may have seen little difference between the two with AA, but Madshrimps, HardOCP, and Anandtech have all shown the 640 MB being able to use AA where the other can't.
This is where the FUD comes in. You can run AA just fine, just not at uber-high resolutions that used to require SLI or Crossfire. Saying it does not run AA without qualifying that it's at a resolution very few people use is pure BS. As shown in the TR review, it runs AA as well as or better than the best ATI currently has to offer. Unless people who have been buying ATI cards have never been able to run AA?

The 320 MB looks to be a very nice card for the $300 price point. The best. And if one simply can't go ~$80 higher, what else is there?

I just don't see the point in trying to buy a G80 that can't use the bare minimum AA in a number of current games. Especially when the price difference is ~$80.

Did you read the TR review where it was getting good average FPS with 4xAA? Even at resolutions that used to require 2 cards?

I guess maybe you are asking too much out of a $300 card, something no one has ever required from a card at that price.

I suppose you are pissed that the GTX does not do full scale holographic rooms like on Star Trek or something? :laugh:



 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
you made the comparison to a 640MB being able to run AA when the 320MB wouldn't.
As did Anandtech, Madshrimps, and HardOCP.

They made that comparison because the settings between the cards were identical and therefore valid. The only differentiating factor in their benchmarks was performance, which is what separates the two versions.

The comparison you're pushing for is flawed because the settings are not identical. You're wanting to call a benchmark where one card has 4xAA enabled and the other has 0xAA fair when such settings are far from it.
I pointed it out to be a flawed comparison because 1) in most current games, they run within 5% of each other with AA off and 2) the 640MB will never run higher FPS with AA on than the 320MB will run with AA off.
No one ever claimed otherwise on your point #2. But what you're trying to claim is that the G80 640 MB can't use AA with current games and get acceptable frames.

Your method is just to disable AA altogether because the G80 640 MB supposedly doesn't have enough horsepower for you.

Since you can't really tell the benefit of AA to begin with, this is a completely valid method for a gamer such as yourself. I'm sorry, but there's not much reasoning I can offer to someone who thinks a 2560x1600 display without AA looks better than a 1920x1200 one does with 16xAA.
Still, buying a G80 for AA in today's games is fine, but buying one and expecting it to be able to run your target resolutions in tomorrow's games with AA is simply unrealistic.
It's more realistic than expecting a card that already can't compete when using AA to keep a level performance with the one that can. I'm not saying the 640 MB will be able to use 4xAA in tomorrow's games, but it can do so just fine in many existing ones and has a better chance of doing so for the ones coming than the 320 MB has.
So what then? Upgrade your GPU again?
Do you think the 320 MB will have better longevity than the 640 MB?

:roll:

It's already behind now, what makes you think its lifespan will be any better?

Think about what you're arguing. You think a good buy is an investment in an already weaker card for tomorrow's games.
I don't know of a single person who has made the transition from 1280x1024 to 1920+ who prefers a really non-aliased image to a much higher resolution...
Then I suspect you don't know very many people. And if you do, they don't know what good graphics are.
...which brings me to my initial comparison of my preferences.
Exactly. That's all you've been trying to compare: preferences. That's the problem.

You can't fault someone for wanting frames less than 60 with AA just because it doesn't fit with your preference. The 640 MB allows for higher IQ and acceptable gameplay. Maybe not for you since you want 60+ frames, but if that is your frame-target you obviously don't care about IQ since most monitors have a refresh rate of 60 hz, thus 60+ would mean you'd have to disable vsync and have ripped textures anyway.
As for the rest, again, new APIs and new game engines typically do that to the landscape of GPUs.
Precisely. That is why it is ideal to have as powerful of a GPU as possible. So that the impact will not be as severe.

If one wants to buy for tomorrow's games, the mid-range market isn't the area to purchase from. Unless of course they're okay with playing with poor IQ.
There's enough out there to set expectations for the next generation of games, and as history shows, its not looking good for AA.
History shows AA levels not only increasing in their degree but in their playability, so I'm not really sure why you're saying AA's future isn't looking good. Today the current flagship video card uses 4xAA like the previous generation used 2xAA. Many are enjoying levels of 8x or higher on a wide variety of their games.
I'd rather run 1920 with no AA instead of running 1280 + 25,534 AA, and I'm sure anyone else who made that jump would agree as well.
We're not talking about a level of AA that is astronomical. We're talking about the bare minimum AA level to achieve a noticeably different picture. The fact that you can't tell the benefit of AA doesn't mean the card that doesn't have the power for it is the better buy.


=========================================================


I would not say it even comes close to a GTX.
I didn't say it did. I said a G80 core, which can be a GTX and two different GTSs. In some cases, an X1950XTX can compete quite nicely against a GTS, and it is using a core over a year old. It doesn't happen every time, and the nod certainly does go to the GTS overall, but that doesn't mean that it isn't impressive.
As shown in the TR review, it runs AA as well as or better than the best ATI currently has to offer.
Depending on the resolution and amount of AA.

Anandtech found the X1950XTX beating the GTS in Quake 4 at 1920x1200 with 4xAA.

But I agree, I'd go for the 8800GTS overall since I myself use a resolution smaller than that.

However, that same benchmark showed a 7950GT out-pacing an 8800GTS 320 MB. Why would you bother purchasing the only G80 that a previous-generation card from the same company can beat?
Did you read the TR review where it was getting good average FPS with 4xAA? Even at resolutions that used to require 2 cards?
Yes I did. But when I read three other reviews claiming otherwise I figured the TR review didn't stress both versions enough to notice that difference.
I guess maybe you are asking too much out of a $300 card, something no one has ever required from a card at that price.
Of the reviews I've seen that consistently show a difference between the GTS 320 MB and the 640 MB, the weaker GTS's only advantage is its price. I agree that for $300 it's a good card. And if you don't have an uber-high-resolution monitor, the differences won't be as great.
I suppose you are pissed that the GTX does not do full scale holographic rooms like on Star Trek or something?
No it can, you just can't use AA with it and it will look like crap. So why bother?
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
So, all-in-all is the GTS 320 an upgrade from an X1900XT 512MB if I am stuck at 1280x1024 or a downgrade?
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: josh6079
As did Anandtech, Madshrimps, and HardOCP.

They made that comparison because the settings between the cards were identical and therefore valid. The only differentiating factor in their benchmarks was performance, which is what separates the two versions.

The comparison you're pushing for is flawed because the settings are not identical. You're wanting to call a benchmark where one card has 4xAA enabled and the other has 0xAA fair when such settings are far from it.
Nope, I'm comparing playable frame rates for both cards, and in many of the more intensive games at 1920+ resolution that means both cards are turning AA off to get reasonable frames, which is going to be the norm for future games; at that point frame rates are within 5% of each other. You're making the comparison that a 640MB can run AA where a 320MB can't, and in some games that's true, but in others it might be 50-150% better and that still doesn't make it playable.

Actually you might want to take a look at HardOCP's review again, since they tackle it in exactly that manner. They make concessions in IQ throughout to reach a "minimum playable" scenario, setting Avg. FPS around 40 as a target. That's a bit low imo since minimum frames with a 40 avg. can easily drop into the teens but it clearly shows not even the 640MB can run everything cranked up at 1920 for all games. You think that situation is going to get any better in future games? lol.

No one ever claimed otherwise on your point #2. But what you're trying to claim is that the G80 640 MB can't use AA with current games and get acceptable frames.
Actually you have on multiple occasions, implying the 640MB could run AA where the 320MB couldn't when in reality they would both need to turn AA off in order to get playable frame rates.

Your method is just to disable AA altogether because the G80 640 MB supposedly doesn't have enough horsepower for you.
At 1920, it doesn't and I knew that full well when I bought it. Sure the 640MB gives me a little more wiggle room (just as a GTX would've given me more wiggle room), but otherwise turning off AA results in nearly identical performance as the 320MB. You've acknowledged it yourself on numerous occasions, there's a trade-off in terms of IQ and performance. If one card runs 35 fps with AA on and the other runs 80 fps with AA off, you'd go with the former option for the sake of IQ?

Since you can't really tell the benefit of AA to begin with, this is a completely valid method for a gamer such as yourself. I'm sorry, but there's not much reasoning I can offer to someone who thinks a 2560x1600 display without AA looks better than a 1920x1200 one does with 16xAA.
I'll put it in simpler terms for you since you don't seem to get it. Run at 800x600 with as much AA as you like and compare it to a 1280x1024 with no AA. Which has the better image quality? There's really no comparison, more pixels = less need to mask ugly pixels with AA. Got it? Good. Now, take your 17" CRT and imagine there's 2x as many pixels in there. That's what you get by going from 1920 > 2560.
It's more realistic than expecting a card that already can't compete when using AA to keep a level performance with the one that can. I'm not saying the 640 MB will be able to use 4xAA in tomorrow's games, but it can do so just fine in many existing ones and has a better chance of doing so for the ones coming than the 320 MB has.
Again, I expect neither to be able to use AA at higher resolutions so that's a moot point, but I guess we'll see.
Do you think the 320 MB will have better longevity than the 640 MB?
It'll be in the same position as the last-gen cards in terms of 256 vs. 512MB. Neither will be fast enough to run AA so they'll perform roughly the same. And history repeats itself.

It's already behind now, what makes you think its lifespan will be any better?

Think about what you're arguing. You think a good buy is an investment in an already weaker card for tomorrow's games.
Already behind with settings that already choke the card it's chasing. Why don't you think about what you're arguing? lol. By your reasoning the only way to go is to buy 2 of the fastest cards available every time they're released. Even then it still may not be enough. But keep chasing the impossible dream of running AA 24/7/365 in every game for eternity.
Then I suspect you don't know very many people. And if you do, they don't know what good graphics are.
Actually, quite a few of my rl and online friends have made the transition at my suggestion. Instead of crushing their FPS trying to mask their ugly low resolutions, they'd rather run without AA at a higher resolution and get better frame rates to boot. It's a win-win situation really. Good graphics aren't running a lower resolution and trying to cover it up by smearing jaggies; good graphics are running full uncompressed textures at higher resolutions. But hey, I'm sure there's a bunch of TV salesmen at Best Buy that would love to sink their hooks into you. They'd love to sell you a 40" 720p HDTV that has really cool video smoothing features for the same price as the 40" 1080p HDTV right next to it....
Exactly. That's all you've been trying to compare: preferences. That's the problem.
Preferences based upon actual experience between the two. One day you'll realize that resolution > AA in terms of IQ, you'll just have to see it first-hand to believe it I guess.

You can't fault someone for wanting frames less than 60 with AA just because it doesn't fit with your preference. The 640 MB allows for higher IQ and acceptable gameplay. Maybe not for you since you want 60+ frames, but if that is your frame-target you obviously don't care about IQ since most monitors have a refresh rate of 60 hz, thus 60+ would mean you'd have to disable vsync and have ripped textures anyway.
Nope, 60 is a target exactly because of Vsync, the problem is that an Avg. FPS means there are minimum frames well below 60. With Vsync on, frames above 60 are "lost" but I can tell you for a fact drops below 60 are noticeable when gaming and drops below 30 are just unbearable.
Precisely. That is why it is ideal to have as powerful of a GPU as possible. So that the impact will not be as severe.
The GTX will have exactly the same problems. It's the most powerful GPU available today. Now what? Buy two? Still going to run into the same problems. Time to notch down those AA settings, I guess, heh.

If one wants to buy for tomorrow's games, the mid-range market isn't the area to purchase from. Unless of course they're okay with playing with poor IQ.
Right, everyone needs to buy top of the line all the time. That makes sense. Or, they can purchase a card that will run games well today and run tomorrow's games well by simply sacrificing the luxury of AA.
History shows AA levels not only increasing in their degree but in their playability, so I'm not really sure why you're saying AA's future isn't looking good. Today the current flagship video card uses 4xAA like the previous generation used 2xAA. Many are enjoying levels of 8x or higher on a wide variety of their games.
While it's true AA performance has improved with the latest generation of cards (X1900, and especially G80), the DX9 API has been out for around 3 years. Expecting the same level of performance with AA enabled once DX10 games hit the market is unrealistic. In another 3 years with G100 and R1000, maybe, but not with G80 and R600.
We're not talking about a level of AA that is astronomical. We're talking about the bare minimum AA level to achieve a noticeably different picture. The fact that you can't tell the benefit of AA doesn't mean the card that doesn't have the power for it is the better buy.
It's really simple: no level of AA is going to compensate for a lower pixel count (and therefore a lower quality image). Anyone who thinks that simply doesn't have the ability to run higher resolutions or hasn't seen the difference first-hand. It's really that simple. There are so many non-PC examples, and if you can't tell the difference, I don't know what to tell you. 480i vs. 720p vs. 1080p. Xbox or Wii vs. Xbox 360 or PS3. Seriously, even my 5-year-old nephew could tell you which has the higher IQ lol.

 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: aka1nas
So, all-in-all is the GTS 320 an upgrade from an X1900XT 512MB if I am stuck at 1280x1024 or a downgrade?

It's a lot better than an X1900XT at 1280, but I'm not sure how much of a difference you'll see in today's games. You're better off waiting a bit to see if prices drop on the 320MB parts as expected, and possibly get a driver update to improve performance at higher resolutions and with AA. Right now the price is too close to a 640MB to make it a worthwhile buy. If it dropped to $250-280 or so then it'd be the best mid-range buy for sure.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Right now the price is too close to a 640MB to make it a worthwhile buy. If it dropped to $250-280 or so then it'd be the best mid-range buy for sure.
QFT.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Actually you have on multiple occasions, implying the 640MB could run AA where the 320MB couldn't when in reality they would both need to turn AA off in order to get playable frame rates.
It depends on the game. I've only implied that the 640 MB can run AA in the titles where the 320 cannot.
At 1920, it doesn't and I knew that full well when I bought it.
You're saying your 8800 GTS 640 MB doesn't have enough horsepower to run at 1920x1200 with AA? Is R6: Vegas the only game on your playlist?

Because it can get 51 FPS in FEAR and 56 FPS in Q4 with 4xAA enabled at 1920x1200. Nowhere near your ridiculous examples of "35 fps with AA on and the other runs 80 fps with AA off".

If that isn't enough horsepower for you then you bought the wrong card.
There's really no comparison, more pixels = less need to mask ugly pixels with AA.
You're forgetting about the size of the monitor. The fact that the resolution increases means nothing if the size of the panel is larger. You're still going to need AA.
Now, take your 17" CRT...
In case you missed that thing at the bottom of my posts called a signature, I don't have a 17" CRT, so I'm beginning to think you're pulling a beggerking and getting me confused with someone else.
...and imagine there's 2x as many pixels in there. That's what you get by going from 1920 > 2560.
Once again, you're forgetting about the size of the screen. The fact that you increase the resolution to 2560x1600 means nothing if you increase the surface area as well.
Again, I expect neither to be able to use AA at higher resolutions so that's a moot point, but I guess we'll see.
It's not a moot point, because when one card is already sacrificing AA, where are you going to trim the fat from next? Would you rather have the card that is just now having to stop using its AA, or the card that already couldn't use it?
The GTX will have exactly the same problems. It's the most powerful GPU available today. Now what? Buy two? Still going to run into the same problems. Time to notch down those AA settings, I guess, heh.
So we should all get the weakest G80 now so it can perform just like the strongest one for tomorrow's games?

Honestly, if the GTX is finally having to sacrifice AA to get playable frames, where do you think the GTS 320 MB is going to be? It's not like the 320 is bottomed out and can't go any lower. Its frames will either be worse at that point or it will have to make IQ cuts elsewhere to keep up.
Right, everyone needs to buy top of the line all the time.
If they're expecting to play games not yet released with high IQ, then yes. Otherwise, no.
Or, they can purchase a card that will run games well today and run tomorrow's games well by simply sacrificing the luxury of AA.
The card you're discussing is already sacrificing AA. What will it sacrifice when the greater versions are having to disable AA?
Expecting the same level of performance with AA enabled once DX10 games hit the market is unrealistic.
I never said they would be able to use the same levels. I said the greater versions would have a better chance of doing so and more of an IQ cushion to begin with.
 

coldpower27

Golden Member
Jul 18, 2004
1,677
0
76
This is very interesting; it's been a while since I have seen such a dramatic impact from frame buffer usage on Nvidia-designed GPUs. It's no wonder the MSRP is set to 299 USD vs. the 449 USD of the 640MB version. The performance differences also seem to vary depending on what game you play: Quake 4 is actually fine as long as you're not running Ultra Quality (check out the TechReport benches), while F.E.A.R. already takes a hit even at more standard resolutions like 12x10.

From my perspective, the 8800 GTS 320 represents decent value: the price difference between the two is about 135 CDN here after you factor in GST, and since the GTS 320 is only 340 CDN, that works out to an increase of 37.35%. You also have to factor in what resolution you run. I believe this card should be acceptable if you run 1280x1024 even with AA/AF, borderline at 1680x1050 and 1600x1200, and any higher I would start recommending the 640MB GTS or the 8800 GTX.

I am not sure I would be too interested in using a first-generation DirectX 10 card to play DirectX 10 games, as the performance will only be tolerable at best, so DirectX 10 at this point for me would be more of a checkbox feature than anything else; but at least you have the ability to run them, albeit at reduced performance.
 

Fraggable

Platinum Member
Jul 20, 2005
2,799
0
0
I just ordered one along with a 20.1" 1680 x 1050 LCD. I mainly play BF2, HL2, Far Cry and Oblivion. I saw that BF2 takes a big hit with this 320MB card, but I want to be able to play Crysis when it comes out and this is just about all I could afford. I think it will be fine. I'm upgrading from an X1800XT 256MB, so I should see an increase in frames, at least in Oblivion. BF2 was already maxed out at 1440 x 900 at 99 FPS.
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
Nope, 60 is a target exactly because of Vsync, the problem is that an Avg. FPS means there are minimum frames well below 60. With Vsync on, frames above 60 are "lost" but I can tell you for a fact drops below 60 are noticeable when gaming and drops below 30 are just unbearable.

chizow - I don't want to interject myself into this debate in a major way (because I think the discussion between yourself and josh is mostly 'much ado about nothing') but the statement above really, really depends on the game and your definition of 'unbearable', which is an opinion in and of itself.

Personally, I prefer less FPS-style games. Oblivion is anything but 'unbearable' for me as long as the fps stays within the 25-30 range. The break-even point for me is about 20 fps and below 20 the game becomes difficult.

The reason I say this is that, depending upon the game, there is a huge difference in 30fps and 15fps. Oblivion is just such a game. It's the reason why the top end ATI cards were so popular, prior to G80, for those folks that wanted to play it. For you, they both might be 'slideshows' but for a lot of people they aren't.

Any of the G80 versions would be fantastic for Oblivion right now, that's not my point. My only point is that, once we start talking about 'bearable' and 'unbearable' framerates, as well as 'better' and 'worse' IQ, we're talking to a significant degree about opinion, and not objective fact. Everyone would agree that 80fps is better than 30fps--in theory. Then again, if I have to choose between 80fps for Oblivion running at 640x480 versus 30fps running at 1280x1024, I know which I'll choose, every time.

That's a personal choice, however, in balancing my subjective idea of 'best' IQ with my subjective idea of 'bearable' performance.

Cheers.
 

goingmerry

Member
Feb 6, 2007
34
0
0
hey, i have a quick question, i'm trying to get parts ordered asap to build my new comp, plan to game on a 22 inch widescreen. should i get the 8800gts 320mb, or save the money for a card upgrade later and get a 7950gt or an x1950 pro 512mb for now? thanks.
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
Truthfully, because I was bored at work and a little interested in the actual scaling of resolutions relative to panel size, here's what I've found:

[btw - these numbers aren't absolutely precise but they're darn good estimates]

17" 1280x1024 - 1,310,720 pixels / 144.5 sq.in. = 9070.73 pixels/sq.in.
19" 1280x1024 - 1,310,720 pixels / 180.5 sq.in. = 7261.61 pixels/sq.in.
20.1" 1600x1200 - 1,920,000 pixels / 202 sq.in. = 9504.95 pixels/sq.in.
20.1" 1680x1050 - 1,764,000 pixels / 202 sq.in. = 8732.64 pixels/sq.in.
22" 1680x1050 - 1,764,000 pixels / 242 sq.in. = 7289.26 pixels/sq.in.
23" - 1920x1200 - 2,304,000 pixels / 264.5 sq.in. = 8710.78 pixels/sq.in.
24" - 1920x1200 - 2,304,000 pixels / 288 sq.in. = 8000 pixels/sq.in.

If higher resolutions are to provide better image quality over AA, then the argument rests on pixel density, not just the gross resolution size. A higher pixel density is what produces a more finely-grained mosaic, which in turn requires less AA to smooth out any obvious, jagged, edges.

As you can see from the figures above, there is a significant leap between 1280x1024 and higher resolutions only on 19" monitors [edit] and only if you don't choose a 22" 1680x1050 panel. If you have a 17" LCD with 1280x1024 resolution, you actually have a pixel density higher than any of the panels with 1920x1200 resolution.

My own, 18.1" 1280x1024 panel has a density of 8001.71, which is almost precisely that of the very popular 24" 1920x1200 panels. This means that, from the eye's perspective, a 24" 1920x1200 panel requires just as much AA as an 18.1" 1280x1024 panel. That is, the pixel density of the higher resolution panel is no higher and thus allows for no better natural smoothing of edges.

I'll grant you that simply having a screen that big provides a much, much better experience for gaming. But it has nothing to do with pixel density and the need (or lack thereof) for AA.

Cheers.

[edited to add the 22" 1680x1050 panels]
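
For anyone who wants to play with these numbers, here's a minimal Python sketch of the same pixels-per-square-inch calculation (an illustration only, not from the original post; it derives panel area from the exact aspect ratio, so the results come out slightly different from the rough area estimates in the table above):

    import math

    def pixels_per_sq_inch(diagonal_in, res_w, res_h):
        # Panel area is derived from the true aspect ratio of the resolution,
        # so these differ slightly from the rough estimates quoted above.
        aspect = res_w / res_h
        height_in = diagonal_in / math.sqrt(1 + aspect ** 2)  # width = aspect * height
        width_in = aspect * height_in
        return (res_w * res_h) / (width_in * height_in)

    # compare a few of the panels from the table above
    for diag, w, h in [(17, 1280, 1024), (19, 1280, 1024), (22, 1680, 1050), (24, 1920, 1200)]:
        print('%g" %dx%d: %.0f pixels/sq.in.' % (diag, w, h, pixels_per_sq_inch(diag, w, h)))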
 

HigherGround

Golden Member
Jan 9, 2000
1,827
0
0
Originally posted by: dreddfunk
Truthfully, because I was bored at work and a little interested in the actual scaling of resolutions relative to panel size, here's what I've found:

[btw - these numbers aren't absolutely precise but they're darn good estimates]

17" 1280x1024 - 1,310,720 pixels / 144.5 sq.in. = 9070.73 pixels/sq.in.
19" 1280x1024 - 1,310,720 pixels / 180.5 sq.in. = 7261.61 pixels/sq.in.
20.1" 1600x1200 - 1,920,000 pixels / 202 sq.in. = 9504.95 pixels/sq.in.
20.1" 1680x1050 - 1,764,000 pixels / 202 sq.in. = 8732.64 pixels/sq.in.
23" - 1920x1200 - 2,304,000 pixels / 264.5 sq.in. = 8710.78 pixels/sq.in.
24" - 1920x1200 - 2,304,000 pixels / 288 sq.in. = 8000 pixels/sq.in.

If higher resolutions are to provide better image quality over AA, then the argument rests on pixel density, not just the gross resolution size. A higher pixel density is what produces a more finely-grained mosaic, which in turn requires less AA to smooth out any obvious, jagged, edges.

As you can see from the figures above, there is a significant leap between 1280x1024 and higher resolutions only on 19" monitors. If you have a 17" LCD with 1280x1024 resolution, you actually have a pixel density higher than any of the panels with 1920x1200 resolution.

My own, 18.1" 1280x1024 panel has a density of 8001.71, which is almost precisely that of the very popular 24" 1920x1200 panels. This means that, from the eye's perspective, a 24" 1920x1200 panel requires just as much AA as an 18.1" 1280x1024 panel. That is, the pixel density of the higher resolution panel is no higher and thus allows for no better natural smoothing of edges.

I'll grant you that simply having a screen that big provides a much, much better experience for gaming. But it has nothing to do with pixel density and the need (or lack thereof) for AA.

Cheers.

Pixel density doesn't quite tell the whole story. You have to consider eye-to-screen distance as well... and that increases as displays get larger, causing the effective pixel size to become smaller.
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
Higher - sure, that's true, but then the effective size of the panel also decreases. I.e., if I sit 5' away from my 24" 1920x1200 panel, it does make the pixels appear smaller, but then it's also taking up less of my actual visual field.

I think there is improvement there but it is somewhat incremental. Effectively you are getting a slightly better experience and you are able to game from your barca-lounger.

The reason I brought pixel density into the discussion was in order to suggest that higher resolutions are not, in and of themselves, magic bullets that render AA useless. If you want to sit close to that new, 24" screen so you can feel truly immersed, then you are going to notice just as many issues with jagged edges as a comparable 17" panel.

Cheers.
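
To put the viewing-distance point in concrete terms, here's a rough Python sketch of the same idea (an illustration only, with made-up example distances): what the eye actually resolves is pixels per degree of visual field, which depends on both pixel pitch and how far you sit.

    import math

    def pixels_per_degree(diagonal_in, res_w, res_h, viewing_distance_in):
        # Horizontal pixels packed into one degree of the viewer's visual field.
        aspect = res_w / res_h
        width_in = aspect * diagonal_in / math.sqrt(1 + aspect ** 2)
        pixel_pitch_in = width_in / res_w                      # physical size of one pixel
        pixel_angle = 2 * math.atan(pixel_pitch_in / (2 * viewing_distance_in))
        return 1.0 / math.degrees(pixel_angle)

    # hypothetical setups: a 24" 1920x1200 panel viewed from 24" vs. an 18.1" 1280x1024 panel viewed from 20"
    print(pixels_per_degree(24, 1920, 1200, 24))
    print(pixels_per_degree(18.1, 1280, 1024, 20))

Sit closer and the same panel needs more AA; sit farther back and it needs less, independent of raw pixel density.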
 

munisgtm

Senior member
Apr 18, 2006
371
0
0
Originally posted by: goingmerry
hey, i have a quick question, i'm trying to get parts ordered asap to build my new comp, plan to game on a 22 inch widescreen. should i get the 8800gts 320mb, or save the money for a card upgrade later and get a 7950gt or an x1950 pro 512mb for now? thanks.

play it safe, save up a bit more and buy the 640mb version.
 

PingSpike

Lifer
Feb 25, 2004
21,733
565
126
Originally posted by: dug777
Originally posted by: chizow
The higher resolution you run, the less need there is for AA. I'd rather run 1920x1200 with no AA and no AF and get 60+ frames instead of 1280x1024 with 10,055X AA and 45,485X AF with < 30fps. Fact of the matter is that anything over 1920 w/ AA is borderline unplayable on anything short of an 8800GTX in most current games, and it's only going to get worse in future games. Simply put, gaming at high resolutions with AA/AF cranked up simply doesn't mix well with newer games. And history repeats itself.

Personally I think there's more at play than simply a lack of memory for the frame buffer at high resolutions w/ AA. Something as simple as a driver optimization that more efficiently caches/clears the frame buffer on the 320MB part would probably improve performance dramatically. There's no other explanation for how the 256MB X1900 parts outperform the 320MB GTS in some of the higher resolution benchmarks with AA enabled other than poor memory allocation/management by the driver.

agreed.

I just noticed this as well. The 320MB GTS is supposed to be identical to its bigger brother aside from the extra ram. And the 640mb is stomping on the ATI parts. The only reason the 320 should fall behind those same parts is if it's ram-starved... and yet it actually has more ram than some of the parts it's being beaten by. There's something weird going on there, and the most logical guess is that the drivers are treating the card like it has 640mb of ram or something.
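
As a rough illustration of why the frame buffer could be the choke point (a back-of-envelope Python sketch only; real drivers compress and manage memory very differently, so actual numbers will differ), a double-buffered multisampled render target alone eats a sizable chunk of 320MB at high resolutions, before any textures are loaded:

    def framebuffer_mb(width, height, msaa_samples, color_bytes=4, depth_bytes=4, buffers=2):
        # Double-buffered color targets plus a multisampled depth buffer.
        # Ignores compression, driver overhead and textures, so treat it as
        # a crude floor, not a prediction of real usage.
        color = width * height * color_bytes * msaa_samples * buffers
        depth = width * height * depth_bytes * msaa_samples
        return (color + depth) / (1024.0 * 1024.0)

    for (w, h) in [(1280, 1024), (1920, 1200), (2560, 1600)]:
        print('%dx%d with 4xAA: ~%.0f MB of render targets before any textures' % (w, h, framebuffer_mb(w, h, 4)))

Whatever the driver does with the remainder is where the 320MB and 640MB cards part ways.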
 