Originally posted by: chizow
I went ahead and compiled a comparison of different benchmarks for the 8800 GTX, GTX 280, GTX 260 and 9800 GX2 at 1920x1200, using the 8800 GTX as the reference since that's the card and resolution I game at.
1920x1200 Review Compilation
As you can see, the difference between a single 8800 GTX and a GTX 280 (Column L vs. P = Q) is significant and often the difference between playable and non-playable. That's largely subjective, but I can tell you for sure the games I own will play better without a doubt, jumping from a 30-45 FPS average (sub-60) to a consistent 60+.
The difference between a GTX 280 and a 9800 GX2 isn't nearly as significant, and even though the GX2 often beats the GTX 280, it's already at a performance level higher than necessary to achieve solid playable framerates (60+ for me). Keep in mind this difference will be even more pronounced at higher resolutions, where bandwidth and frame buffer become more of an issue in the GTX 280's favor.
Now, sure you can daisy-chain a whole bunch of cheaper gimpy cards to achieve similar performance, but then you introduce a bunch of other problems including but not limited to:
- Profiles/Scaling - SLI/CF rely on driver profiles for their performance, and in ATI's case you can't change these yourself. So if your particular game doesn't have a pre-defined profile, you may see no benefit or even *worse* performance than with a single card. If you're relying on two individually slower cards instead of one faster one, you may actually be paying more for *worse* performance, which is unacceptable to me.
- Micro-stuttering - There's been pretty heated debate about the significance of this problem on this board and others, although it pops up infrequently. Basically, the timing of frames from the different GPUs in AFR can be erratic, leading to this effect. Apparently some people are very sensitive to it and some aren't. I don't know, as I've never used SLI, but I certainly wouldn't be happy if I spent $400-600 on SLI/CF only to find I couldn't stand micro-stutter.
- Heat/Power/Space - Typically not an issue for most enthusiasts, but it can become a problem when you have 2 or even 3x the power draw and heat of high-end cards. The PSU issue can be a total wattage issue, but also a power connector issue, with so many high-end parts needing 6- or even 8-pin PCI-E connections. Many cases can also have problems accommodating one 9"+ card, much less two or more.
- Multi-Monitor (NV only) - NV multi-GPU solutions do not support multiple monitors. I don't know if this is a superficial driver limitation to prevent desktop cards from being used in professional workstations or a truly technical issue, but I'm leaning towards driver limitation, since I'm assuming a Quadro GX2 would support more than one monitor. Multi-monitor support is important to me, as I play full screen on my 1920 and use my 2nd monitor for various monitoring tools, surfing the web, etc.
- Bandwidth/Frame Buffer - Not as big a deal at 1920, but one of the major reasons to upgrade to the fastest GPU is ultra-high resolutions with AA. With a GX2 or SLI/CF solution, you're still limited to the same bus width and frame buffer as the individual cards, even if you have more rendering horsepower. This limitation is apparent at higher resolutions with AA when comparing a GTX 280, with a true 512-bit bus and 1GB frame buffer, to the X2/SLI solutions with a 256-bit bus and 512MB buffer.
- Chipset-specific limitations - ATI CF requires an Intel/AMD chipset and NV SLI requires an NV chipset. This unnecessarily ties your platform to your GPU across generations, and in the case of SLI, to NV's flaky chipsets.
- Overclocking ability? - NV used to have problems overclocking in SLI under Vista, but I think it's been fixed. Not sure if ATI has similar problems, although I know many of their parts are clock-locked via BIOS.
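The micro-stutter point above can be sketched with a toy frame-time calculation. The numbers and the `frame_stats` helper are made up purely for illustration: two sequences with the exact same average FPS, one steady (single GPU) and one arriving in uneven AFR-style pairs, which is why an FPS counter alone doesn't show the problem.

```python
# Illustration only: why average FPS can hide AFR micro-stutter.
# Two frame-time sequences with the same average can feel very different.

def frame_stats(frame_times_ms):
    """Return (average FPS, worst single frame time in ms)."""
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    return 1000.0 / avg_ms, max(frame_times_ms)

# Single GPU: frames delivered at a steady ~16.7 ms pace.
single = [16.7] * 60

# AFR pair: same average, but frames arrive in uneven pairs
# (short gap between the two GPUs' frames, long gap to the next pair).
afr = [8.4, 25.0] * 30

print(frame_stats(single))  # ~60 FPS average, worst frame 16.7 ms
print(frame_stats(afr))     # ~60 FPS average, worst frame 25.0 ms
```

Both report roughly "60 FPS", but the AFR sequence periodically makes you wait 25 ms for a frame, which is the erratic pacing people describe as micro-stutter.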
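The frame buffer point can be backed up with some rough napkin math. This is a simplified sketch of my own, not vendor numbers: it counts only the MSAA color and depth/stencil render targets plus one resolved back buffer, and ignores textures, shadow maps and driver overhead, so real usage is considerably higher.

```python
# Rough render-target memory estimate at a given resolution and MSAA level.
# Simplified: color + depth/stencil + resolve buffer only; textures and
# other allocations (which dominate in real games) are not counted.

def render_target_mb(width, height, msaa_samples, bytes_per_pixel=4):
    # Color and depth/stencil buffers are both multiplied by the MSAA
    # sample count; add one single-sample resolved back buffer.
    color = width * height * bytes_per_pixel * msaa_samples
    depth = width * height * 4 * msaa_samples  # 24-bit depth + 8-bit stencil
    resolve = width * height * bytes_per_pixel
    return (color + depth + resolve) / (1024 ** 2)

print(round(render_target_mb(1920, 1200, 4)))  # ~79 MB at 1920x1200 4xAA
print(round(render_target_mb(2560, 1600, 8)))  # ~266 MB at 2560x1600 8xAA
```

Even on this bare-minimum count, 2560x1600 with 8xAA eats over a quarter of a 512MB card before a single texture is loaded, while the GTX 280's 1GB leaves plenty of headroom.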
Sorry, but simple math and "logical" multi-GPU solutions are not going to make those problems go away. The only solution that's going to provide a significant increase in performance from where I am, without the multi-GPU problems, is a GTX 280, which is why it's worth it (to me).