tigersty1e
Golden Member
- Dec 13, 2004
when they find a way to finally start using the memory of both cards instead of sharing it, we have taken the next step.
I have the money for a high-end PC so I like to turn the settings up.
You should leave your mom's basement and get a job. Try it sometime.
Uh...touchy.
I left my parents' place last millennium.
Also went to war last millennium.
I have had a job since I was 13.
My last rig (self build) cost ~$5000
You couldn't make yourself look any more stupid if you tried again.
But all this doesn't alter my previous post one bit...just makes you look like a fool.
Come again...
I disagree with your analogy and also with the one above it, where it was equated to 'rot'. lol

Some very good points have been raised already:
Yes, framerate fluctuations happen on single cards too, but AFR micro-stutter is on top of this. By definition it doesn’t exist on a single card.
- Micro-stutter and input lag are inherent to AFR.
- Extra points of failure in terms of scaling and general compatibility.
- More heat and noise.
Micro-stutter is a lot like AA. It’s only when you lose it that you notice a huge difference.
edit: I agree that ms can exist, but I also find it hard to believe that some people don't experience it.

Analysis
The single-GPU 5870 actually shows MORE ustutter than the dual-GPU 5970. Roughly 44% of the frames rendered by the single GPU deviated from the local average render time by >70%, while only ~31% of the 5970's frames deviated by >70%. To address the possibility that the 5970 has fewer but larger variances while the 5870 has more, smaller ones, the ustutter threshold was raised to a 90% variance: the 5970 had ~7% occurrence versus ~11% for the 5870. This suggests the 5870 has both more and larger ustutter events than the dual-GPU 5970.
This data refutes the myth that multi-GPU systems create ustutter.
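The deviation counting described in that analysis can be sketched in Python. The frame-time samples below are made-up placeholders, and the "local average" is assumed to be a small sliding window (the original post doesn't say how it was computed):

```python
def stutter_fraction(frame_times_ms, threshold=0.70, window=5):
    """Fraction of frames whose render time deviates from the
    local (sliding-window) average by more than `threshold`."""
    n = len(frame_times_ms)
    flagged = 0
    for i, t in enumerate(frame_times_ms):
        # Centered window around frame i, clamped to the list bounds.
        lo = max(0, i - window // 2)
        hi = min(n, i + window // 2 + 1)
        local_avg = sum(frame_times_ms[lo:hi]) / (hi - lo)
        if abs(t - local_avg) / local_avg > threshold:
            flagged += 1
    return flagged / n

# Made-up frame times: mostly ~16 ms with two long spikes.
sample = [16, 17, 16, 60, 16, 15, 16, 16, 55, 16]
print(f"{stutter_fraction(sample):.0%} of frames exceed the 70% threshold")
# → 20% of frames exceed the 70% threshold
```

Raising `threshold` to 0.90 reproduces the second comparison in the post: fewer frames get flagged, so only the larger spikes count.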
http://forums.nvidia.com/index.php?showtopic=158727

IDENTIFYING MICROSTUTTERING
"Microstuttering" is a term used to describe a general inconsistency regarding the time between two frames being displayed, and could be best described as a rapid, noticeable shift between a high frame rate and a low one. In an ideal situation each frame is output at equal or very similar intervals, producing smooth and even gameplay, but this isn't always the case even with a single video processor. Microstuttering is a compound issue caused by an improper frame output synchronization between multiple GPUs and a lack of compensation mechanism to assist in normalizing frame delays. This phenomenon becomes progressively easier to see at low frame rates since there is greater time between frame output, but can reduce the subjective frame rate at higher numbers. As previously mentioned, this problem is found on any multi-GPU scaling system drawing frames that require multiple rendering passes. The use of vertical synchronization (and, if possible, triple buffering) can reduce the prevalence of this to some degree.
Astrallite & Lonbjerg, cut the personal attacks please.
Super Moderator BFG10K.
Single GPU systems use less power and have, generally, fewer problems. They also have more predictable performance across games.
I run 3x 30" on 2x 580's and in most games I can get 60+ FPS with some graphics-setting tweaking. You'd be surprised at how much faster your games can run when turning down settings that a lot of the time don't even hurt image quality. The only problem with 12.3 megapixels is that I run out of VRAM pretty quickly; in some games I can't run any AA, and in others only 2x AA. But at that super high resolution you don't need much AA to begin with.
What does my eye spot?:
http://hardforum.com/showthread.php?p=1036629532#post1036629532
So turning down settings even happens on multi-GPU...now ain't that ironic.
Well of course. I don't think anyone claimed that having a multi-GPU setup meant that every game, even on three 30" monitors could be played at full eye candy.
Let's pretend that a single GTX580 can drive three 30" monitors (maybe it can? I think with Nvidia you need multi-GPU to use Surround). Would the gaming experience be better on a single GTX580 or on two GTX580's? Or, on the same budget, would three 30" monitors run better on a single GTX580 or on two 6950's?
So turning down settings even happens on multi-GPU...now ain't that ironic.
With the stagnant state of PC gaming and the overabundance of power in the latest GPUs, multi-GPU these days is for very high resolutions and maximum IQ settings with healthy doses of AA.
That, or matching halo single-GPU performance on a budget with setups like 6850 CF or 460 SLI.
There is some incredible performance per dollar to be had these days with multi-GPU. 6950 CF is a truckload of performance for $500, and setups like 460 SLI also offer impressive performance at $350.
Depends on whether your flavour is console ports (like most people in here)...or you like PC games.
I've owned 3 Crossfire setups since its inception and scaling was never an issue for me. If you stuck to high-profile games (like I did) there was no issue. While using multi-GPU I found I got used to the microstuttering and after a while didn't notice it. It was only when I swapped my multi-GPU setup for a powerful single-GPU solution that I saw the difference was night and day; since then, I will never go back to multi-GPU while it still uses AFR. There's no other way to say it: in retrospect it was a subpar gaming experience on multi.
IMO Fastest single GPU > Multi-GPU