nVidia GT200 Series Review Thread


golem

Senior member
Oct 6, 2000
838
3
76
Originally posted by: ShadowOfMyself


That's what we are speculating, but I very much doubt that... Since it was said by AMD, we need to lower the % a bit, so let's assume 70%

GTX 280 = 100fps
HD 4870 = 70 fps
HD 4870X2 = 120 - 140 fps (depending on how the game scales)

It still comes out with a healthy margin for a much lower price

I thought sli/xfire scaling was generally 0 to 80%

So it should be
GTX 280 = 100 fps
HD 4870 = 70 fps
HD 4870X2 = 70 - 126 fps (depending on how the game scales)

(70 x 1.8 equals 126 unless I'm using the wrong numbers for dual gpu scaling)
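
For reference, here's a quick Python sketch of that back-of-the-envelope math (the 70 fps single-card figure and the 0-80% scaling range are just the assumptions from the posts above, not measured numbers):

    def dual_gpu_fps(single_fps, scaling):
        # The second GPU adds anywhere from 0% to `scaling` of the first card's rate.
        return single_fps * (1 + scaling)

    single = 70.0  # assumed HD 4870 frame rate from the speculation above
    for scaling in (0.0, 0.7, 0.8):
        print("%.0f%% scaling -> %.0f fps" % (scaling * 100, dual_gpu_fps(single, scaling)))
    # 0% -> 70 fps, 70% -> 119 fps, 80% -> 126 fps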
 

sourthings

Member
Jan 6, 2008
153
0
0
Originally posted by: chizow
I went ahead and compiled a comparison of different benchmarks between the 8800 GTX, GTX 280, GTX 260 and 9800 GX2 at 1920x1200 using the 8800 GTX as reference as that is the card and resolution I game at.

1920x1200 Review Compilation

As you can see, the difference between a single 8800GTX and a GTX 280 (Column L vs P = Q) is significant and often the difference between playable and non-playable. This can be largely subjective, but I can tell you for sure the games I own will without a doubt play better, jumping from a 30-45 FPS average (sub 60) to a consistent 60+.

The difference between a GTX 280 and 9800GX2 isn't nearly as significant and even though the GX2 often beats the GTX 280, it's already at a performance level higher than necessary to achieve solid playable framerates (60+ for me). Keep in mind this difference will be even more pronounced when you go to higher resolutions as bandwidth and framebuffer become more of an issue in the GTX 280's favor.

Now, sure you can daisy-chain a whole bunch of cheaper gimpy cards to achieve similar performance, but then you introduce a bunch of other problems including but not limited to:

  • Profiles/Scaling- SLI/CF rely on driver profiles for their performance and in the case of ATI, you can't change these yourself. So if your particular game doesn't have a pre-defined profile you may see no benefit or even *worse* performance than with a single card. In the case of relying on two individually slower cards than your single card, you can see that you may actually be paying more for *worse* performance, which is unacceptable to me.
  • Micro-stuttering- There's been a pretty heated debate about the significance of this problem on this board and others, although it pops up infrequently. Basically the timing of each frame from the different GPUs in AFR can be erratic, leading to this effect. Apparently some people are very sensitive to it and some aren't. I don't know as I have never used SLI, but I certainly wouldn't be happy if I spent $400-600 for SLI/CF only to find I couldn't stand micro-stutter (see the frame-time sketch after this list).
  • Heat/Power/Space - Typically not an issue for most enthusiasts, but it can become a problem when you have 2 or even 3x the power draw and heat from high-end cards. The PSU issue can be a total wattage issue, but also a power connector issue with so many high-end parts needing 6 or even 8-pin PCI-E connections. Many cases can also have problems accommodating a single 9"+ card, much less 2+.
  • Multi-Monitor (NV only) - NV multi-GPU solutions do not support multi-monitors. I don't know if this is a superficial driver limitation to prevent desktop cards being used in professional workstations or a truly technical issue, but I'm leaning towards driver limitation as I'm assuming the Quadro GX2 would support more than 1 monitor..... Multi-Monitor support is important to me as I play full screen on my 1920 and use my 2nd monitor for various monitoring tools, surfing the web, etc.
  • Bandwidth/Frame Buffer - Not as big a deal at 1920, but one of the major reasons to upgrade to the fastest GPU is for ultra high resolutions with AA. With a GX2 or SLI/CF solution, you're still limited to the same bus width and frame buffer as the individual cards even if you have more rendering horsepower. This limitation is apparent in the higher resolutions with AA when comparing a GTX 280 with a true 512-bit bus and 1GB frame buffer to the X2/SLI solutions with a 256-bit bus and 512MB buffer.
  • Chipset specific limitations - ATI CF requires an Intel/AMD chipset and NV SLI requires an NV chipset. This unnecessarily ties your platform to your GPU between generations and in the case of SLI, to NV's flaky chipsets.
  • Overclocking ability? - NV used to have problems overclocking in SLI in Vista but I think it's been fixed. Not sure if ATI has similar problems although I know many of their parts are clock-locked via BIOS.
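
To make the micro-stutter point above concrete, here's a rough Python sketch with made-up AFR frame timestamps (not real measurements) showing how uneven frame pacing can hide behind a healthy-looking average:

    # Hypothetical timestamps (ms) of frames from two GPUs alternating in AFR:
    # frames arrive in short/long pairs even though the average rate looks fine.
    timestamps = [0, 5, 33, 38, 66, 71, 99, 104]

    deltas = [b - a for a, b in zip(timestamps, timestamps[1:])]
    avg_ms = sum(deltas) / float(len(deltas))

    print("frame-to-frame times (ms): %s" % deltas)        # [5, 28, 5, 28, 5, 28, 5]
    print("average: %.1f ms (~%.0f fps)" % (avg_ms, 1000 / avg_ms))
    # ~14.9 ms average (~67 fps) on paper, but the eye sees the 28 ms gaps (~36 fps pacing).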

Sorry but simple math and "logical" multi-GPU solutions are not going to make those problems go away. The only solution that is going to provide a significant increase in performance from where I am without multi-GPU problems is a GTX 280, which is why it's worth it (to me).


That sure is a lot of effort put into trying to shore up the quality of the 280. All it proved to me is that the older, cheaper NV card is better. These are video cards, not elegant paintings or nice furniture. In the end it is how they perform in comparison to one another that matters; likewise, arguments over how much power is consumed are silly. This card is a lame duck. Bring on the 4870X2 imo to stomp the 280 into the ground.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: sourthings

That sure is a lot of effort put into trying to shore up the quality of the 280. All it proved to me is that the older, cheaper NV card is better. These are video cards, not elegant paintings or nice furniture. In the end it is how they perform in comparison to one another that matters; likewise, arguments over how much power is consumed are silly. This card is a lame duck. Bring on the 4870X2 imo to stomp the 280 into the ground.

A lot of effort? It took me an hour on and off while watching TV to plug in those numbers (a nice benefit of not using SLI and having multi-monitors). I actually wanted to quantify the difference between a 260 and 280 but threw in the 9800GX2 numbers in there even though I was comfortable with the differences with GTX 280 just eye-balling it. For someone else considering a 260 vs 280 the difference might be worth it but the difference for me isn't worth it coming from an 8800GTX at 1920.

Saying the "older card is better" or the "4870x2 is going to stomp it into the ground" isn't going to explain away the problems with multi-GPU that I brought up. Compiling that list took very little effort as those are considerations I've weighed on and off over the last 7-8 months every time a SLI/CF/multi-GPU frankencard came along promising greater performance for less than the original price of a G80 GTX.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: Foxery
Originally posted by: keysplayr2003
We put together some benchmarks as well. There'll defo be some follow-ups when newer drivers arrive.

GT200

Got questions? Let 'er rip..

Eek, your page is wider than my 720p monitor. Nice mini-review, though. You're going to switch that Folding@Home team to AnandTech soon, right?

I do have a silly question, or at least clarification I need:
We keep hearing about CUDA PhysX. Is there any (non-NDA'ed) information about how running PhysX affects the GPU's video-rendering speed? I'm blindly assuming here that the same transistors will be split up and used for both, as opposed to the cards currently hiding a second set of secret circuits from us.

Originally posted by: taltamir
What I wanna know is... why is the 9800GTX still priced at 250-300$? it is totally obsolete.

There's still a production cost involved that both nVidia and the retailer need to recoup as best they can. And it's only a few months old, not instantly useless because of one new product.

actually 9800gtx was useless upon release unless you wanted it for tri-sli because you could get 8800gts 512 for ~ 60% of the cost with 95% of the performance. It is going to be pushed down in price by 48xx in the next few weeks, however.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: chizow
Originally posted by: sourthings

That sure is a lot of effort put into trying to shore up the quality of the 280. All it proved to me is that the older, cheaper NV card is better. These are video cards, not elegant paintings or nice furniture. In the end it is how they perform in comparison to one another that matters; likewise, arguments over how much power is consumed are silly. This card is a lame duck. Bring on the 4870X2 imo to stomp the 280 into the ground.

A lot of effort? It took me an hour on and off while watching TV to plug in those numbers (a nice benefit of not using SLI and having multi-monitors). I actually wanted to quantify the difference between a 260 and 280 but threw in the 9800GX2 numbers in there even though I was comfortable with the differences with GTX 280 just eye-balling it. For someone else considering a 260 vs 280 the difference might be worth it but the difference for me isn't worth it coming from an 8800GTX at 1920.

Saying the "older card is better" or the "4870x2 is going to stomp it into the ground" isn't going to explain away the problems with multi-GPU that I brought up. Compiling that list took very little effort as those are considerations I've weighed on and off over the last 7-8 months every time a SLI/CF/multi-GPU frankencard came along promising greater performance for less than the original price of a G80 GTX.

you are the person whom nvidia is aiming the gtx 280 at. I would highly recommend at least waiting for 4870's release, however, since the gtx 280 price probably won't go up even if 4870 sucks, but it likely will go DOWN if 4870 is even close to the rumored performance.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,996
126
In games like QW: ET and even COD4, both with AA, there's very little difference in the 16/19 resolutions. I don't think a faster CPU is really the answer; I think devs need to really start taking advantage of more cores in the way they write their games.
I agree that CPU limitations are at play but the answer is much simpler than that. The answer is for reviewers to use higher levels of AA instead of clinging to 4x like it's some kind of holy grail. nVidia's 8xQ is equivalent to AT's 8x box filter and both have been around for almost two years so there's absolutely no excuse for reviewers not to use them.

Fillrate is king and GT200 isn't so much better than G92 where it was starved for bandwidth. Although ROP helps you in high resolutions and AA it doesn't help you when there are multiple textures.
Azn, did you see this quote from Anandtech's article?

http://www.anandtech.com/video/showdoc.aspx?i=3334&p=3

For a 87.5% increase in compute, there's a mere 25% increase in texture processing power. This ratio echoes what NVIDIA has been preaching for years: that games are running more complex shaders and are not as bound by texture processing as they were in years prior. If this wasn't true then we'd see a closer to 25% increase in performance of GT200 over G80 at the same clock rather than something much greater.

Also clock for clock results here: http://www.anandtech.com/video/showdoc.aspx?i=3334&p=8

The results are far higher than the 25% you'd expect if games were limited by texture fillrate. I think at this stage it's safe to say texture fillrate is not the primary bottleneck in these cards.
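
Just to spell out where those percentages come from, a quick sketch using the commonly cited unit counts (128 SPs and 64 texture filtering units on G80, 240 SPs and 80 texture units on GT200; treat the exact counts as assumptions):

    g80_sp, g80_tex = 128.0, 64.0      # assumed G80 shader / texture unit counts
    gt200_sp, gt200_tex = 240.0, 80.0  # assumed GT200 counts

    compute_gain = gt200_sp / g80_sp - 1    # 0.875 -> the 87.5% figure in the quote
    texture_gain = gt200_tex / g80_tex - 1  # 0.25  -> the 25% figure

    print("compute: +%.1f%%, texture: +%.1f%%" % (compute_gain * 100, texture_gain * 100))
    # compute: +87.5%, texture: +25.0%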
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: BFG10K
I agree that CPU limitations are at play but the answer is much simpler than that. The answer is for reviewers to use higher levels of AA instead of clinging to 4x like it's some kind of holy grail. nVidia's 8xQ is equivalent to AT's 8x box filter and both have been around for almost two years so there's absolutely no excuse for reviewers not to use them.
I think reviewers stick to 4x AA because the differences between it and higher levels are extremely minute. Going from 0 to 2 to 4 comes with big improvements, but past that it's a sharp dropoff in perceived improvement, especially on high-DPI monitors like 20"/24"/30". I think most members in this forum would be challenged to play a game at 4x and 8x and be able to point out the differences.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: ViRGE
Originally posted by: BFG10K
I agree that CPU limitations are at play but the answer is much simpler than that. The answer is for reviewers to use higher levels of AA instead of clinging to 4x like it's some kind of holy grail. nVidia's 8xQ is equivalent to AT's 8x box filter and both have been around for almost two years so there's absolutely no excuse for reviewers not to use them.
I think reviewers stick to 4x AA because the differences between it and higher levels are extremely minute. Going from 0 to 2 to 4 comes with big improvements, but past that it's a sharp dropoff in perceived improvement, especially on high-DPI monitors like 20"/24"/30". I think most members in this forum would be challenged to play a game at 4x and 8x and be able to point out the differences.

I know a certain video mod who can most certainly tell the difference

@bfg: texture hasn't been a g80/g92 bottleneck, but for rv670 it definitely has been one.
 

ddarko

Senior member
Jun 18, 2006
264
3
81
Originally posted by: chizow
I went ahead and compiled a comparison of different benchmarks between the 8800 GTX, GTX 280, GTX 260 and 9800 GX2 at 1920x1200 using the 8800 GTX as reference as that is the card and resolution I game at.

1920x1200 Review Compilation

As you can see, the differences from a single 8800GTX and GTX 280 (Column L vs P = Q) is significant and often the difference between playable or non-playable. This can be largely subjective but I can tell you for sure the games I own will play better without a doubt jumping from a 30-45FPS average (sub 60) to 60+ always.

The difference between a GTX 280 and 9800GX2 isn't nearly as significant and even though the GX2 often beats the GTX 280, its already at a performance level higher than necessary to achieve solid playable framerates (60+ for me). Keep in mind this difference will be even more pronounced when you go to higher resolutions as bandwidth and framebuffer become more of an issue in GTX 280's favor.

Now, sure you can daisy-chain a whole bunch of cheaper gimpy cards to achieve similar performance, but then you introduce a bunch of other problems including but not limited to:

  • Profiles/Scaling- SLI/CF rely on driver profiles for their performance and in the case of ATI, you can't change these yourself. So if your particular game doesn't have a pre-defined profile you may see no benefit or even *worse* performance than with a single card. In the case of relying on two individually slower cards than your single card, you can see that you may actually be paying more for *worse* performance, which is unacceptable to me.


  • This issue of profiles and scaling is a red herring. The fact is, in most games now, the profiles and scaling works. No, it doesn't scale 100% and no, it doesn't work with 100% of games. But it works with most games and in the games it does work with, it scales well enough that the dual-GPU invariably offers better performance than a single GPU, even when it's last gen dual-GPU (9800GX2) versus latest gen single GPU (280). Therefore, the question should be turned around: if performance is so paramount to you, then why would you accept a single-GPU that offers worse performance than a dual-GPU solution in most games and provides this inferior performance at a higher price? Yes, dual-GPU may not work in 3 out of 10 games, picking a number out of the blue. But that means it DOES work in 7 out of 10 games, which means it is FASTER 70% of the time. So for the certainty of avoiding a 30% chance of slower performance, one should turn down a solution that offers superior performance 70% of the time? Um, that seems a little inverted.

    • Micro-stuttering- Pretty heated debate about the significance of this problem on this board and others although it pops up infrequently. Basically the timing of each frame from the different GPUs in AFR can be erratic, leading to this effect. Apparently some people are very sensitive to it and some aren't. I don't know as I have never used SLI, but I certainly wouldn't be happy if I spent $400-600 for SLI/CF only to find I couldn't stand micro-stutter.

    Another red herring. The very fact that a heated debate has occurred tells us that many, if not most people don't even know this issue exists. To warn people away from a solution because of a "problem" that most may not even detect and even if they can detect it, detect it only infrequently, is to completely blow this out of proportion. It's also true that some people are bothered by the low 60Hz refresh rate of LCD monitors; does that mean we should be cautioning people to stick with CRTs to avoid the headache-inducing dangers of LCD monitors? Don't misunderstand me: if you detect micro-stuttering, then of course dual-GPU solution isn't for you. But to list a remote possibility as a major caution flag for everyone to worry about is scare-mongering. To those affected by it, it's a deal killer. But the important point is, the vast majority of people aren't affected.

    • Bandwidth/Frame Buffer - Not as big a deal at 1920, but one of the major reasons to upgrade to the fastest GPU is for ultra high resolutions with AA. With a GX2 or SLI/CF solution, you're still limited to the same bus width and frame buffer as the individual cards even if you have more rendering horsepower. This limitation is apparent in the higher resolutions with AA when comparing a GTX 280 with a true 512-bit bus and 1GB frame buffer to the X2/SLI solutions with a 256-bit bus and 512MB buffer.

    Where is this theoretical problem manifest in the real world? The benchmarks on Anand seem to show that even at the highest resolution used by 30" monitors, the 9800GX2 still outperforms the 280. If we all bought a product based on its theoretical numbers and in disregard of what real world testing demonstrates, we'd all be ATI owners by now.

    • Chipset specific limitations - ATI CF requires an Intel/AMD chipset and NV SLI requires an NV chipset. This unnecessarily ties your platform to your GPU between generations and in the case of SLI, to NV's flaky chipsets.

    Um, another red herring since dual-GPU cards exist that run in a single slot. And why is "ties your platform to your GPU" supposed to be such an issue? So what if you're tied down to Nvidia or ATI? I don't exactly hear the masses complaining how their motherboard limits them to "only" AMD or Intel processors, even though every single motherboard out there forces you to "tie" yourself down to a processor line. So why should being limited be such an issue for GPUs? I have yet to read a motherboard review or forum user complain that a motherboard "only works with Intel/AMD processors and not both." Until I do, I'm going to dismiss the "problem" of being restricted to dual graphics cards. It's a non-issue for CPUs, it's a non-issue for GPUs.

    Sorry but simple math and "logical" multi-GPU solutions are not going to make those problems go away. The only solution that is going to provide a significant increase in performance from where I am without multi-GPU problems is a GTX 280, which is why its worth it (to me).

    So you've bought a solution that offers slower performance in most games but still costs more, all to avoid "problems" that are illusory or easily manageable. Very logical indeed.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: ddarko
This issue of profiles and scaling is a red herring. The fact is, in most games now, the profiles and scaling works.
Really? And exactly what are you basing this on? Reviews? Reviews of the most popular, most played, most scrutinized, most reviewed PC games out there? LMAO. Yes, of course games that are popular and tend to be reviewed will eventually get straightened out with proper profiles and tweaks, but I can guarantee you that wasn't the case when those multi-GPU cards or games were released, judging from prior reviews, and I can also guarantee you support for SLI/CF profiles will be one of the first to miss the check-list in future driver releases. You are completely at the mercy of continued profile support with SLI/CF with every new title; that to me is not a desirable prospect.

I can name at least 3 popular titles that, from user feedback, do not have properly working profiles or require workarounds. LOTRO still does not properly scale with SLI/CF. AOC had issues with SLI/CF as of final beta client. Mass Effect does not benefit at all from CF unless you rename the .exe as a workaround. These are only a few off the top of my head but of course you're going to assume SLI/CF works without problems because the top 10 PC games of 2007-08 scale properly...LMAO.

Another red herring. The very fact that a heated debate has occurred tells us that many, if not most people don't even know this issue exists. To warn people away from a solution because of a "problem" that most may not even detect and even if they can detect it, detect it only infrequently, is to completely blow this out of proportion. It's also true that some people are bothered by the low 60Hz refresh rate of LCD monitors; does that mean we should be cautioning people of the headache-inducing dangers of LCD monitors? Don't misunderstand me: if you detect micro-stuttering, then of course dual-GPU solution isn't for you. But to list it as a major bullet point caution for everyone to worry about is more akin to scare-mongering by transforming a remote possibility into a likelihood.
You lost me at the part about LCD "refresh rate" giving people headaches.

Some of the brightest minds on this forum can't reconcile or dismiss micro-stutter, but I'm sure your red herring explanation along with the promise of $400-600 spent on multi-GPU guaranteed micro-stutter-free is enough to allay people's fear. /sarcasm. I'll take 70-100% of the performance at 100-150% of the cost and call it a day, otherwise I would've just spent that money on whatever micro-stuttering multi-GPU solution months ago.

Where is this theoretical problem manifest in the real world? The benchmarks on this site seem to show even at the highest resolution used by 30" monitors, the 9800GX2 still outperforms the 280. If we all bought a product because its theoretical numbers are better than real world performance, then we should all be ATI owners.
Actually many of the 2560 resolutions with AA do show the limitations of bandwidth and frame buffer; the X2 256-bit bus parts with 512MB x 2 buffers did even when compared to the G80 Ultra.

Um, another red herring since single dual-GPU cards exist that do not need Crossfire or SLI. And why is "ties your platform to your GPU" supposed to be such an issue? First of all, it only ties yourself down if you're looking for a dual card solution, which dual GPU obviates the need for. Secondly, even if you're limited, so what? I don't exactly hear the masses complaining how their motherboard limits them to only AMD or Intel processors, even though every single motherboard out there forces you to "tie" yourself down to a processor line. So why should being "limited" be such an issue for GPUs? I have yet to read a motherboard review or forum user complain that a motherboard "only works with Intel/AMD processors." Until I do, I'm going to dismiss the "problem" of being restricted to Crossfire or SLI. It's a non-issue for CPUs, it's a non-issue for GPUs.
And a single-card multi-GPU solution negates much of the cost benefit of going multi-GPU in the first place. 3870X2 retailed at ~$500 and the 9800GX2 for $600, so in reality you don't save anything but inherit many of the problems associated with multi-GPU.

If you don't think the socket/chipset/CF/SLI issue is a big deal with enthusiasts, you obviously haven't read enough user replies on these forums. There are without a doubt people who purchase NV chipsets solely for the purpose of SLI, and there are certainly people out there who go with Intel but would prefer to use an NV graphics solution. AMD CPUs aren't really an issue with enthusiasts right now so it really boils down to CF/SLI and being tied to that decision based on superficial chipset limitations. Fact of the matter is, if you have a card or multi-card setup from one camp but want to go multi-card with the other, it will cost you 2 cards + a motherboard to change, which is a significant added expense.

So you've bought a solution that offers slower performance at a higher cost in most games. Very logical indeed.
Without the hassle and problems of SLI. Very logical indeed.
 

CP5670

Diamond Member
Jun 24, 2004
5,535
613
126
Originally posted by: chizow
Sorry but simple math and "logical" multi-GPU solutions are not going to make those problems go away. The only solution that is going to provide a significant increase in performance from where I am without multi-GPU problems is a GTX 280, which is why its worth it (to me).

Aside from the microstutter problem, the main turn off for me is the inconsistent vsync and triple buffering support on multi GPU setups. I consider these to be essential features and am ready to pay more for a slightly slower single GPU card in order to avoid wasting an hour with each of my games, messing with various driver settings and third party programs to get both working properly without any performance hit (which is what I ended up doing when I used SLI a while ago).

The 9800GX2 in particular also has low minimum framerates in a number of games, more than other multi GPU setups, according to this Xbit review. The performance of this card is deceptive in many benchmarks due to this, as hardly any sites show the minimum numbers these days.
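
A tiny illustration of that point with made-up numbers (not Xbit's data): the average can look great while the minimums are what you actually feel in-game.

    # Hypothetical per-second frame rates over a 10-second benchmark run.
    fps_samples = [75, 80, 78, 25, 72, 76, 24, 79, 77, 74]

    avg_fps = sum(fps_samples) / float(len(fps_samples))
    min_fps = min(fps_samples)

    print("average: %.0f fps, minimum: %d fps" % (avg_fps, min_fps))
    # average: 66 fps, minimum: 24 fps -- the chart shows 66, the stutter shows 24.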

As far as I am concerned though, it's a good thing that most people don't care about these issues. It means that the GTX 280 price will come down, and fast.

I think reviewers stick to 4x AA because the differences between it and higher levels are extremely minute. Going from 0 to 2 to 4 comes with big improvements, but past that it's a sharp dropoff in perceived improvement, especially on high-DPI monitors like 20"/24"/30". I think most members in this forum would be challenged to play a game at 4x and 8x and be able to point out the differences.

I can see the difference pretty easily in most games, certainly between 4x and 16x at least. It's a lot more apparent in motion than it looks in screenshots. At the same time though, I don't think any of these cards is generally pulling high enough framerates that it makes sense to increase the level of AA.
 

LittleNemoNES

Diamond Member
Oct 7, 2005
4,142
0
0
Originally posted by: bryanW1995
Originally posted by: ViRGE
Originally posted by: BFG10K
I agree that CPU limitations are at play but the answer is much simpler than that. The answer is for reviewers to use higher levels of AA instead of clinging to 4x like it's some kind of holy grail. nVidia's 8xQ is equivalent to AT's 8x box filter and both have been around for almost two years so there's absolutely no excuse for reviewers not to use them.
I think reviewers stick to 4x AA because the differences between it and higher levels are extremely minute. Going from 0 to 2 to 4 comes with big improvements, but past that it's a sharp dropoff in perceived improvement, especially on high-DPI monitors like 20"/24"/30". I think most members in this forum would be challenged to play a game at 4x and 8x and be able to point out the differences.

I know a certain video mod who can most certainly tell the difference

@bfg: texture hasn't been a g80/g92 bottleneck, but for rv670 it definitely has been one.

I love using high levels of AA whenever possible.
I use 8xSQ for BF2, 4xS AA for Bioshock, 4x AA for Mass Effect

I look forward to faster cards to crank this up higher.
BTW does anyone know if the new gen cards are going to remove the limitation for 4x4 SSAA < 1024x768?
 

CP5670

Diamond Member
Jun 24, 2004
5,535
613
126
BTW does anyone know if the new gen cards are going to remove the limitation for 4x4 SSAA < 1024x768

I doubt that will be removed any time soon, at least on the consumer video cards. It comes from the maximum texture size of 4096x4096 (which is also the maximum size of the frame buffer). That limit has been the same since at least the 6 series and there are no games that come anywhere close to hitting it with their textures.
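
A quick sketch of why 1024x768 is the ceiling, assuming 4x4 SSAA renders the scene at four times the width and four times the height before downsampling:

    MAX_DIM = 4096  # maximum texture / render-target dimension mentioned above

    def fits_4x4_ssaa(width, height, max_dim=MAX_DIM):
        # 4x4 supersampling needs an internal buffer 4x as wide and 4x as tall.
        internal_w, internal_h = width * 4, height * 4
        return (internal_w, internal_h), internal_w <= max_dim and internal_h <= max_dim

    print(fits_4x4_ssaa(1024, 768))   # ((4096, 3072), True)  - exactly at the cap
    print(fits_4x4_ssaa(1280, 1024))  # ((5120, 4096), False) - width alone blows past 4096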
 

ddarko

Senior member
Jun 18, 2006
264
3
81
Originally posted by: chizow
I can name at least 3 popular titles that, from user feedback, do not have properly working profiles or require workarounds. LOTRO still does not properly scale with SLI/CF. AOC had issues with SLI/CF as of final beta client. Mass Effect does not benefit at all from CF unless you rename the .exe as a workaround. These are only a few off the top of my head but of course you're going to assume SLI/CF works without problems because the top 10 PC games of 2007-08 scale properly...LMAO.

And I can name a couple games that are filled with bugs and need continuous patches and updates. Like, every single PC game ever released. You understand my point? Constant driver support and being at the mercy of continued driver support is a fact of life with PC games. Why you're fixated on driver support for dual GPU solutions when driver issues and patches are endemic with PC gaming, including with single GPU cards, is mystifying. You don't want to be at the mercy of patches to make something run properly on your PC? Then throw out your PC!

And to use your own argument - so what if SLI/Crossfire don't work perfectly in those 3 games? Don't they offer performance that's "good enough"? After all, that was your excuse for accepting lower performance from a 280; it was good enough. Something tells me Lord of the Rings Online isn't exactly unplayable.

You lost me at the part of LCD "refresh rate" giving people headaches.

Proving my point. This is an issue for some people with LCD monitors (go to the LCD forums to look it up). The fact that you're completely unaware of it demonstrates how small the number of people it affects is. Micro-stuttering is the same. If it affects you, then it's a deal-breaker. But you can't even say that micro-stuttering DOES affect you. Instead, you've read about it and magnified the remote possibility that it MAY affect you into a certainty that it WILL affect you in your mind. Completely irrational.

Actually many of the 2560 resolutions with AA do show the limitations of bandwidth and frame buffer; the X2 256-bit bus parts with 512MB x 2 buffers did even when compared to the G80 Ultra.

Show me the single benchmark at 2560x1600 from Anandtech that demonstrates the bandwidth limiting the 9800GX2's performance to below the 280's. Oh right. There isn't one.

And a single-card multi-GPU solution negates much of the cost benefit of going multi-GPU in the first place. 3870X2 retailed at ~$500 and the 9800GX2 for $600, so in reality you don't save anything but inherit many of the problems associated with multi-GPU.

9800GX2 for $600?! LOL. Look, you can make up numbers. For the rest, look at newegg: there's a reference-clock 9800GX2 selling right now for $470 before rebate, $430 after mail-in rebate.

http://www.newegg.com/Product/...x?Item=N82E16814127337

And you don't save anything? Except for that pesky fact that you get faster performance. it's interesting you can't acknowledge or answer this pesky obviously irritating fact: better performance most of the time for less money. But bah, who needs more for less? I'd rather get less for more! You sir, are the ideal Nvidia customer! Congratulations!
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: CP5670
Originally posted by: chizow
Sorry but simple math and "logical" multi-GPU solutions are not going to make those problems go away. The only solution that is going to provide a significant increase in performance from where I am without multi-GPU problems is a GTX 280, which is why its worth it (to me).

Aside from the microstutter problem, the main turn off for me is the inconsistent vsync and triple buffering support on multi GPU setups. I consider these to be essential features and am ready to pay more for a slightly slower single GPU card in order to avoid wasting a hour with each of my games, messing with various driver settings and third party programs to get both working properly without any performance hit (which is what I ended up doing when I used SLI a while ago).

The 9800GX2 in particular also has low minimum framerates in a number of games, more than other multi GPU setups, according to this Xbit review. The performance of this card is deceptive in many benchmarks due to this, as hardly any sites show the minimum numbers these days.

As far as I am concerned though, it's a good thing that most people don't care about these issues. It means that the GTX 280 price will come down, and fast.

Excellent points, I'll have to add them to any future SLI/CF discussions. I also consider Vsync/triple buffering essential with a larger LCD, so it would *really* annoy me if those features were broken. Another really good point about the GX2 minimums, and yeah, having to research/tweak or wait for a profile fix is a great reason not to deal with SLI/CF.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: ddarko
And I can name a couple games that are filled with bugs and need continuous patches and updates. Like, every single PC ever released. You understand my point? Constant driver support and being at the mercy of continued driver support is a fact of life with PC games. Why you're fixated with the driver support with dual GPU solutions when driver issues and patches are endemic with PC gaming, including with single GPU cards, is mystifying. You don't want to be at the mercy of patches to make something run properly on your PC? Then throw out your PC!
And none of that discounts the fact that a multi-GPU solution without proper profile support may not have *any* benefit at all over a single GPU. There are so many pitfalls in the PC business already, why bother with another, right?

And to use your own argument - so what if SLI/Crossfire don't work perfectly in those 3 games? Don't they offer performance that's "good enough"? After all, that was your excuse for accepting lower performance from a 280; it was good enough. Something tells me Lord of the Rings Online isn't exactly unplayable.
No, read again, or better yet, look at the spreadsheet I linked. I said there was a significant difference upgrading from an 8800 GTX to a GTX 280 with only a negligible difference from a GX2, as both begin to exceed 60 FPS avg. These are quantifiable and easily observable thresholds when gaming on an LCD with Vsync. Being locked at 60 FPS with Vsync vs. stepping through the 15-20-25-30-60 increments is a HUGE difference in gaming experience even if the "average" FPS is only 25-30 different.
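
Those increments come from double-buffered vsync on a 60 Hz panel: if a frame misses a refresh, it waits for the next one, so the displayed rate snaps to 60 divided by a whole number of refreshes. A rough sketch (ignoring triple buffering, which smooths this out):

    import math

    def vsync_fps(render_fps, refresh=60.0):
        # With double-buffered vsync each frame is held for a whole number of
        # refresh intervals, so the displayed rate snaps to refresh / n.
        refreshes_per_frame = math.ceil(refresh / render_fps)
        return refresh / refreshes_per_frame

    for fps in (75, 59, 45, 25, 18):
        print("renders at %d fps -> displays at %.0f fps" % (fps, vsync_fps(fps)))
    # 75 -> 60, 59 -> 30, 45 -> 30, 25 -> 20, 18 -> 15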

And yes, LOTRO is playable in raids or PvP after I turn almost everything down besides textures. AOC is borderline, Mass Effect is comfortable, but both those games and nearly every other title I play are consistently below 60 FPS or require me to significantly turn down IQ settings. A GTX 280 would remedy that for nearly every title except Crysis.

Proving my point. This is an issue for some people with LCD monitors (go to the LCD forums to look it up). The fact that you're completely unaware of it demonstrate how small the number of ppl it affect. Micro-stuttering is the same. If it affects you, then it's a deal-breaker. But you can't even say that micro-stuttering DOES affect you. Instead, you've read about it and magnified the remote possibility that it MAY affect you into a certainty that it WILL affect you in your mind. Completely irrational.
:looks around for LCD forums:

Oh you must be referring to whatever forum you normally post from. How about this, you link me whatever "LCD Refresh Rate" problem you're referring to on whatever "LCD Forums" you're referring to and we'll take it from there. In the meantime, feel free to explain away Micro Stutter to those that notice it......

Show me the single benchmark at 2560x1600 from Anandtech that demonstrate the bandwidth limiting the 9800GX2's performance to below the 280's. Oh right. There isn't one.
Oblivion 2560

"The race is pretty close between the GTX 280 and 9800 GX2 in Oblivion with AA enabled. It's worth noting that at all other resolutions, the GX2 is faster than the GTX 280 but at 2560 x 1600 the memory bandwidth requirements are simply too much for the GX2. Remember that despite having two GPUs, the GX2 doesn't have a real world doubling of memory bandwidth."

9800GX2 for $600?! LOL. Look, you can make up numbers. For the rest, look at newegg: there's a reference-clock 9800GX2 selling right now for $470 before rebate, $430 after mail-in relate.

http://www.newegg.com/Product/...x?Item=N82E16814127337

And you don't save anything? Except for that pesky fact that you get faster performance. it's interesting you can't acknowledge or answer this pesky obviously irritating fact: better performance most of the time for less money. But bah, who needs more for less? I'd rather get less for more! You sir, are the ideal Nvidia customer! Congratulations!

Rofl....are you serious? You're comparing current prices on old parts? But lets not ignore the fact that 1) 9800GX2 was $600 when it was released and 2) 8800GTS SLI cost less at $220-250 each and 3) 8800GT cost much less at $200-$220 each and 4) Both offered better or equal performance for much less, once again negating your single-card multi-GPU benefit.

But hey, if you want to compare current prices, why not compare the $150-$160 8800GTS out there that will without a doubt provide better performance for less, or the 8800GT that will come close for MUCH MUCH less.....

There's certainly a place for multi-GPU if you're willing to put up with the associated problems, but simply pointing at some benchmarks and a price tag aren't going to make the problems go away, sorry.
 

Leon

Platinum Member
Nov 14, 1999
2,215
4
81
"The race is pretty close between the GTX 280 and 9800 GX2 in Oblivion with AA enabled. It's worth noting that at all other resolutions, the GX2 is faster than the GTX 280 but at 2560 x 1600 the memory bandwidth requirements are simply too much for the GX2. Remember that despite having two GPUs, the GX2 doesn't have a real world doubling of memory bandwidth."

And that's with default settings; add the QTP3 texture pack and memory usage jumps to 800MB+ at this res.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: ddarko
Originally posted by: chizow
I can name at least 3 popular titles that, from user feedback, do not have properly working profiles or require workarounds. LOTRO still does not properly scale with SLI/CF. AOC had issues with SLI/CF as of final beta client. Mass Effect does not benefit at all from CF unless you rename the .exe as a workaround. These are only a few off the top of my head but of course you're going to assume SLI/CF works without problems because the top 10 PC games of 2007-08 scale properly...LMAO.

And I can name a couple games that are filled with bugs and need continuous patches and updates. Like, every single PC ever released. You understand my point? Constant driver support and being at the mercy of continued driver support is a fact of life with PC games. Why you're fixated with the driver support with dual GPU solutions when driver issues and patches are endemic with PC gaming, including with single GPU cards, is mystifying. You don't want to be at the mercy of patches to make something run properly on your PC? Then throw out your PC!

And to use your own argument - so what if SLI/Crossfire don't work perfectly in those 3 games? Don't they offer performance that's "good enough"? After all, that was your excuse for accepting lower performance from a 280; it was good enough. Something tells me Lord of the Rings Online isn't exactly unplayable.

You lost me at the part of LCD "refresh rate" giving people headaches.

Proving my point. This is an issue for some people with LCD monitors (go to the LCD forums to look it up). The fact that you're completely unaware of it demonstrate how small the number of ppl it affect. Micro-stuttering is the same. If it affects you, then it's a deal-breaker. But you can't even say that micro-stuttering DOES affect you. Instead, you've read about it and magnified the remote possibility that it MAY affect you into a certainty that it WILL affect you in your mind. Completely irrational.

Actually many of the 2560 resolutions with AA do show the limitations of bandwidth and frame buffer; the X2 256-bit bus parts with 512MB x 2 buffers did even when compared to the G80 Ultra.

Show me the single benchmark at 2560x1600 from Anandtech that demonstrate the bandwidth limiting the 9800GX2's performance to below the 280's. Oh right. There isn't one.

And a single-card multi-GPU solution negates much of the cost benefit of going multi-GPU in the first place. 3870X2 retailed at ~$500 and the 9800GX2 for $600, so in reality you don't save anything but inherit many of the problems associated with multi-GPU.

9800GX2 for $600?! LOL. Look, you can make up numbers. For the rest, look at newegg: there's a reference-clock 9800GX2 selling right now for $470 before rebate, $430 after mail-in relate.

http://www.newegg.com/Product/...x?Item=N82E16814127337

And you don't save anything? Except for that pesky fact that you get faster performance. it's interesting you can't acknowledge or answer this pesky obviously irritating fact: better performance most of the time for less money. But bah, who needs more for less? I'd rather get less for more! You sir, are the ideal Nvidia customer! Congratulations!

You are beating a dead horse. Not to mention you are arguing with chizow, who takes a price from about 2 months ago when talking about current prices just to win an argument. Some people actually make sense, some people don't, and the latter seem to talk out their ass instead of thinking logically.

Face it, the GTX 280 is a nice card but for the price it's crap. I wouldn't buy it even if it were $400.
 

kreacher

Member
May 15, 2007
64
0
0
For $400 it would be a pretty good card but certainly not for $650. Unless SLI scaling goes over 90% I don't consider it a viable option (plus it limits your motherboard choices).
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: CP5670
Originally posted by: chizow
Sorry but simple math and "logical" multi-GPU solutions are not going to make those problems go away. The only solution that is going to provide a significant increase in performance from where I am without multi-GPU problems is a GTX 280, which is why its worth it (to me).

Aside from the microstutter problem, the main turn off for me is the inconsistent vsync and triple buffering support on multi GPU setups. I consider these to be essential features and am ready to pay more for a slightly slower single GPU card in order to avoid wasting a hour with each of my games, messing with various driver settings and third party programs to get both working properly without any performance hit (which is what I ended up doing when I used SLI a while ago).

The 9800GX2 in particular also has low minimum framerates in a number of games, more than other multi GPU setups, according to this Xbit review. The performance of this card is deceptive in many benchmarks due to this, as hardly any sites show the minimum numbers these days.

As far as I am concerned though, it's a good thing that most people don't care about these issues. It means that the GTX 280 price will come down, and fast.

I think reviewers stick to 4x AA because the differences between it and higher levels are extremely minute. Going from 0 to 2 to 4 comes with big improvements, but past that it's a sharp dropoff in perceived improvement, especially on high-DPI monitors like 20"/24"/30". I think most members in this forum would be challenged to play a game at 4x and 8x and be able to point out the differences.

I can see the difference pretty easily in most games, certainly between 4x and 16x at least. It's a lot more apparent in motion than it looks in screenshots. At the same time though, I don't think any of these cards is generally pulling high enough framerates that it makes sense to increase the level of AA.

Personally I couldn't give a whiff about max and average FPS... the ONLY thing that matters is minimum FPS. If the min dips under 20 fps then the settings need to be lowered.

Originally posted by: bryanW1995
Originally posted by: Foxery
Originally posted by: keysplayr2003
We put together some benchmarks as well. There'll defo be some follow-ups when newer drivers arrive.

GT200

Got questions? Let 'er rip..

Eek, your page is wider than my 720p monitor. Nice mini-review, though. You're going to switch that Folding@Home team to AnandTech soon, right?

I do have a silly question, or at least clarification I need:
We keep hearing about CUDA PhysX. Is there any (non-NDA'ed) information about how running PhysX affects the GPU's video-rendering speed? I'm blindly assuming here that the same transistors will be split up and used for both, as opposed to the cards currently hiding a second set of secret circuits from us.

Originally posted by: taltamir
What I wanna know is... why is the 9800GTX still priced at 250-300$? it is totally obsolete.

There's still a production cost involved that both nVidia and the retailer need to recoup as best they can. And it's only a few months old, not instantly useless because of one new product.

actually 9800gtx was useless upon release unless you wanted it for tri-sli because you could get 8800gts 512 for ~ 60% of the cost with 95% of the performance. It is going to be pushed down in price by 48xx in the next few weeks, however.

Point taken; what I meant though was that it is worth a lot less money, relatively, now that something better has arrived. However, I still remember the 8800GTS 320 and 640 selling for $100-200 more than an 8800GTS 512 for the first couple of weeks after release. If people will PAY more for less then the companies are happy.
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: kreacher
For $400 it would be a pretty good card but certainly not for $650. Unless SLI scaling goes over 90% I don't consider it a viable option (plus it limits your motherboard choices).

I would get it for $400 too. It's a good card that actually gives playable frame rates with 8xAA at high res in new games, if you check the computerbase.de benchmarks.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Originally posted by: JACKDRUID
to Extelleron: you are such an ATI fanTard...
just have to say that... or are you getting paid by AMD

in every post, EVERY post, you make the assumption that the 4850/70 is going to be faster than the 280 without any proof whatsoever... it just makes me sick.

to Ocguy31

The G280 sucks hard on price/performance/heat/environmental friendliness. The G260 is definitely a good value, considering that it is a single card without the limitations of SLI/Crossfire.

Yeah but it still sucks. I can't even hope to play Crysis on very high on a 1920x1200 monitor with any AA or AF on it. 2 years after the 8800GTX and we get this turd? Wow. Just wow.
 