CrossFire and SLI frame rates do not reflect reality because of a lack of synchronization!


DeathSniper

Member
Oct 19, 2004
96
0
66
You would never use SLI/Xfire for competitive gaming. Everything is set to its lowest for two reasons: to maximize fps, and to reduce distractions/make things easier to see. For example, in CSS a lower mat_picmip level will "wash out" textures, making player models much easier to see from farther away. The only exception would be when a certain feature provides a particular benefit (such as Shadows on High in CSS - you cannot see shadows through walls on anything except High; e.g. the cs_office door).

Competitive players usually have two different settings - one for competitive play and one for relaxation/enjoyment, although how often that's really used is subjective (since you need to freaking practice night and day in order to be truly competitive *sigh*).
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
This definitely isn't the first time microstutter has been mentioned here, but it's certainly the most attention it's gotten. Very interesting points all around, but I never really saw any discussion of what's causing the inconsistent delays or whether the IHVs are aware of it. I saw a small blurb about ATI saying it was lazy scene rendering or something, but that doesn't give much insight.

So is it a software/driver issue? A hardware limitation? We've seen a few new curveballs recently in the form of NV's option to set the number of pre-rendered frames, and the increasing realization that demanding games like Crysis are actually CPU-limited past a certain FPS. With AFR, you're essentially requiring data to be processed by the CPU and memory subsystem 2x+ as fast to keep those multi-GPU setups happy. Then there's whatever cross-talk/bridging mechanism may introduce latency, potentially causing longer delays compared to a single GPU.
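For what it's worth, here's one toy way to see where alternating short/long gaps could come from under AFR (all of the numbers are invented, and this is not a claim about what the actual drivers do): if the CPU can hand off frames much faster than each GPU can finish one, completions naturally arrive in closely spaced pairs.

```python
# Toy AFR timeline (invented numbers, not a claim about real driver behaviour):
# the CPU can prepare a frame every 5 ms, each GPU needs 30 ms to render one,
# and frames alternate between two GPUs.
CPU_MS, GPU_MS, FRAMES = 5, 30, 8

gpu_free = [0, 0]                 # when each GPU finishes its current frame
done_times = []
for i in range(FRAMES):
    issued = i * CPU_MS           # CPU hands the frame off
    gpu = i % 2                   # alternate frame rendering
    start = max(issued, gpu_free[gpu])
    gpu_free[gpu] = start + GPU_MS
    done_times.append(gpu_free[gpu])

deltas = [b - a for a, b in zip(done_times, done_times[1:])]
print(deltas)                     # [5, 25, 5, 25, 5, 25, 5] -- short/long pairs
```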

What's a reasonable fix? Capping FPS, or normalizing frame delivery to smooth out frame rates on multi-GPU solutions? It's been known for some time that enabling Vsync can reduce heat and load on the GPU by limiting the frames rendered. This could possibly help multi-GPU solutions if they could normalize performance for a target framerate.

Personally I do think that erratic delay between frames would be more noticeable than consistent delays, even if the overall durations are shorter for the erratic frames. Similar to minimum FPS drops and HDD thrashing, I notice those much more than smooth, consistent but lower FPS in games.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: apoppin
Originally posted by: dug777
It's good to see Video can't resist pounding someone, even in these cuddlier and fluffier times!



EDIT: Like many others in this thread I can see 'stutter' in that video, but I guess if you're fine with that, or don't notice it, more power to you

If you want to look for it, the video (to me) clearly 'two-steps', which I assume is the two quick frame updates followed by the lag, repeat.

of course, if any brand new person shows up and is extremely aggressive - they get a bit more than they give; we have to find out "who" is trying to bring "what" into Video
--And it looks like the OP passed our little "test" .. how often do i apologize here?


Most of us are willing to suspend disbelief and enjoy the faster experience and extra AA that multi-GPU allows; for me it is mostly 'moot' as i use the "extra" GPU to give me CrossFire AA, which does not have this problem so obviously - heck, even a single GPU can "micro stutter"

so this isn't "revolutionary" by any means - it just allowed us to examine it in better detail than any other tech forum up till now and come up with some interesting theories

Anyone hear back from Derek?

i am really proud of Video - 3 years ago, this thread would have deteriorated into flames and name-calling

Heck, even P&N has mellowed - and i am back there with the 'boys'; my controversial posting is mostly confined there now
[heck you can even insult someone there [really bad] and get away with it - don't tell!]
:Q

:thumbsup:

:beer:
 

ChrisRay

Junior Member
Oct 21, 2005
20
0
0
BFG, just think about it for a second. Nvidia wants users to use their software for overclocking/tweaking. They are not going to go out of their way to assist Unwinder/Grestorm with documentation on tweaking things like their profile bits or registry values. These are "expert" tools. What you have shown me is not anything new. If Nvidia did not want people modifying the profiles, they would not have let the drivers do so. Before you throw another hissy fit about how little Nvidia helps Grestorm, why don't you tell me which other multi-GPU vendor provides anywhere near the same flexibility for modifying profiles as SLI does. In my dealings and interactions with Nvidia, nHancer has been the absolute baseline for guiding Nvidia in ways to improve control panel SLI functionality. So you cannot tell me that Nvidia does not like nHancer.

Also, you can delete a profile if you want, but you have to reboot for it to take effect in the registry. That is nothing new. What you need to do is ensure the predefined tag does not exist when you make the modification. And yes, this was done for the driver restore function. But whatever, keep your tin foil hat on. I have no interest in arguing with you; I learned that was a useless gesture years ago.

Oh, and if you like selective quoting:

Grestorm
@Spartan & ChrisRay: :D You people are the reason why I still like to work on nHancer. Thanks!

http://forums.slizone.com/inde...&view=findpost&p=11632
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Well Chris, at the end of the day my only hope is that nHancer continues to function with future nVidia hardware/drivers as losing it would be a colossal blow.

I don't like it when changes nVidia makes stand in the way of it working.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: BFG10K
Originally posted by: myocardia
So, since only a portion of the few people on Earth who have two or more video cards in one computer experience this/can notice the difference, and there are settings that will either completely eradicate it or make it not nearly as noticeable, how could this even be considered a "problem"?
I'm not sure what you're saying here exactly.

I thought it was fairly self-evident, but I can expound on it. What I'm saying is, if so few people on the planet are affected/can even notice, why should I, the public at large, or even ATI or nVidia care? If you happen to be one of the few people who are affected, you'll just need to buy a faster single card, or lower your res and/or settings if you have a huge monitor and the fastest single card isn't fast enough.
 

Lithan

Platinum Member
Aug 2, 2004
2,919
0
0
Originally posted by: bfdd
Originally posted by: apoppin
If you are talking *competition* you want to be running GTX SLI with just enough details to see what you *need*, and the resolution of choice was probably 10x7 .. or a low WS resolution - to get 300FPS and everything flowing silky smooth

if you pause to admire the gfx someone will frag your ass
:Q

Haha yeah you run the bare minimum you need to get by and you usually have WAY more card than you need to get by with. Even for the Source engine.

Higher reso with high dpi mouse = better hitboxes though, remember... assuming you don't play with your feet.

And the old "max the human eye can see" argument comes up. Though the 30fps people were wrong, it's true that there is a point where the human eye can no longer distinguish that one video is smoother than another, and it's well under 100fps... so the 300fps guys may want to look into that.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
AFAIK, above 60fps the eye can hardly perceive a difference (especially when V-Sync is on). I heard somewhere that the human "refresh rate" works pretty much like a CRT, with those interlaced scan lines moving upwards and downwards (something like that, crazy). Very high FPS can cause screen tearing, which is easy to spot; lower than 60fps can cause stuttering (when using V-Sync) or screen tearing when not using V-Sync. I cannot see a difference past 60fps, just slightly faster, a bit unreal sometimes.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Lithan
Originally posted by: bfdd
Originally posted by: apoppin
If you are talking *competition* you want to be running GTX SLI with just enough details to see what you *need*, and the resolution of choice was probably 10x7 .. or a low WS resolution - to get 300FPS and everything flowing silky smooth

if you pause to admire the gfx someone will frag your ass
:Q

Haha yeah you run the bare minimum you need to get by and you usually have WAY more card than you need to get by with. Even for the Source engine.

Higher reso with high dpi mouse = better hitboxes though, remember... assuming you don't play with your feet.

And the old "max the human eye can see" argument comes up. Though the 30fps people were wrong, it's true that there is a point where the human eye can no longer distinguish that one video is smoother than another, and it's well under 100fps... so the 300fps guys may want to look into that.

i dunno, i like 300FPS
-with everything fully maxed out and with Full filtering applied
- fortunately i do not compete .. so i get the 'details' and put up with the minor instability like micro stutter .. if i DID compete, it would be with as much detail as possible and at 10x7. 60 FPS is way too low for me even as a minimum FPS .. but i put up with 20 in Crysis
--for me it is all *compromise*

for some of us, it is like racing cars .. we can anticipate the green light before it hits and will time it exactly with 'go' - that slight advantage over the competition.

 

Pelu

Golden Member
Mar 3, 2008
1,208
0
0
hm.. i was wondering if vsync relieves this because... if you sync your frame rate with the monitor's refresh... at least the monitor throws out its refresh at the same interval.. like 60hz gives it every 16ms... so if you push vsync.. you are pushing the cards to give a frame every 16ms... and not at that crazy interval....

of course all of what I just said.. only happen in a dream or something i had some days ago... the real deal is....

vsync on.. means.. OMG!!!! this is CRAP!!! runs like CRAP!!!! so slow... like a power point slide show... OH DEARS!!!! this is so BS!!!!!....

in the end.. i push vsync off... and bite my tongue each time I see some micro stutter or any kind of lag.. like >RRRRRRRR.....
 

Datenschleuder

Junior Member
Apr 17, 2008
23
0
0
No, VSync doesn't help.
VSync is about aligning the front buffer swap with the vertical blanking interval of the RAMDAC.

The vertical blanking interval is perfectly regular; the problem is the irregular timing of the content updates to the back buffer, so VSync has nothing to do with it.
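A toy timeline makes that concrete (interval numbers invented purely for illustration): even when every buffer swap lands exactly on a 60 Hz vblank, the game time captured in each frame still advances by the uneven render intervals, so the animation judders anyway.

```python
# Toy numbers only: an AFR setup finishing frames at alternating 5 ms / 28 ms
# intervals, displayed with VSync on a 60 Hz monitor. The swaps land on
# perfectly regular vblanks, but the game time in each frame still advances
# unevenly.
VBLANK = 1000.0 / 60.0                      # ~16.7 ms between refreshes

completions, t = [], 0.0
for gap in [5, 28] * 5:                     # when each back buffer is ready
    t += gap
    completions.append(t)

def next_vblank(ready):
    return (int(ready // VBLANK) + 1) * VBLANK

prev_present, prev_content = 0.0, 0.0
for content in completions:
    # swap waits for vblank, at most one swap per refresh
    present = max(next_vblank(content), prev_present + VBLANK)
    print(f"shown at {present:6.1f} ms, game time advanced {content - prev_content:4.1f} ms")
    prev_present, prev_content = present, content
```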
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: Alx
At this point I'm ready to file the statement "I can't see microstutter!" alongside all-time greats such as "I can't see ghosting on my LCD!" and "I can't see the difference beyond 30fps!". That should be a studied phenomenon in its own right: how eyes seem to change function when looking at hard-earned possessions.

Funny that you should mention it, but I've never in my life seen ghosting on an LCD, nor have I ever been able to tell any difference with my eyes between 30 FPS & 300 FPS, and it made no difference whether it was my system or not. Oh, and before you respond with something along the lines of "you need to get your vision checked", I have had it checked, and it's 20/15.

Also, have any of you people with multiple video cards (who can actually see this microstutter, that is) ever done any experimenting with over- or underclocking your CPUs? The reason I ask is that it seems to me the people who don't experience it always seem to be CPU-bound. Now, whether or not there's any correlation, I can't say. It seems like something worth looking into, though.
 

bfdd

Lifer
Feb 3, 2007
13,312
1
0
Originally posted by: Lithan
Originally posted by: bfdd
Originally posted by: apoppin
If you are talking *competition* you want to be running GTX SLI with just enough details to see what you *need*, and the resolution of choice was probably 10x7 .. or a low WS resolution - to get 300FPS and everything flowing silky smooth

if you pause to admire the gfx someone will frag your ass
:Q

Haha yeah you run the bare minimum you need to get by and you usually have WAY more card than you need to get by with. Even for the Source engine.

Higher reso with high dpi mouse = better hitboxes though, remember... assuming you don't play with your feet.

And the old "max the human eye can see" argument comes up. Though the 30fps people were wrong, it's true that there is a point where the human eye can no longer distinguish that one video is smoother than another, and it's well under 100fps... so the 300fps guys may want to look into that.

What are you talking about, high res with high dpi mouse = better hitboxes? Hitboxes are all relative to the character model. Higher resolution = smaller model on screen = smaller hitboxes. They're not technically smaller, but they're smaller on screen. I don't think I played above 800x600 in CS, and that was on a monitor capable of 1600x1200.
 

jjzelinski

Diamond Member
Aug 23, 2004
3,750
0
0
Ok, it took a good 30 minutes to really get what the OP was saying, but while I agree with his analysis, I'd like to offer a positive spin on SLI that I don't think has been explicitly laid out. Although we might be perceiving what amounts to an entirely new "frame rate" based on the pattern of the "stutters" (wide deltas), SLI still helps prevent the more severe slowdowns that occur if one were to hit "traditionally" low frame rates. To put it another way, while the micro stutters make for a perceptually worse frame rate, that new frame rate is still more consistent, and thereby preferable to a true slowdown below 30 fps.
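A quick back-of-the-envelope sketch of that "new frame rate" idea, with made-up frame times:

```python
# Toy illustration (made-up frame times): with AFR micro-stutter the frame
# times alternate short/long. The average-based FPS looks fine, but if what
# you perceive is paced by the long gaps, the "felt" rate is noticeably lower
# (though still above a true sub-30 fps slowdown).
frame_times_ms = [8, 25, 8, 25, 8, 25, 8, 25]   # alternating deltas

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
worst_paced_fps = 1000.0 / max(frame_times_ms)

print(f"average FPS:            {avg_fps:5.1f}")        # ~60.6
print(f"FPS paced by long gaps: {worst_paced_fps:5.1f}")  # 40.0
```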

I think it's a safe bet that your average schmo could easily detect a traditional frame rate drop from 30 to 14 fps, whereas that same schmo may or may not notice the more consistent degradation of frame-to-frame smoothness.

There's the new argument for SLI
 

Grestorn

Junior Member
Apr 23, 2008
13
0
0
Originally posted by: BFG10K
Me and Grestorm communicate with each other regularly.
I've talked to him about the issue too (not with this handle though) and what you're saying wasn't the impression I got from him.

First of all... my name is actually Grestorn

This posting from nVNews that you've quoted was written when I was really frustrated.

Last time I checked (with one of the 174er drivers), it seemed that the checksum is no longer necessary for the DX10 SLI flags to work. Most likely, this checksum was just a bad idea from someone at nVidia which was corrected later.

Anyway, nHancer has been able to cope with the situation for some time now, and I'll keep that workaround in anyway, because as a side effect you can now revert each profile to its original state as delivered by nVidia.

Originally posted by: BFG10K
Originally posted by: Grestorn
I tried to use some contacts to get in touch with the driver team, but I never got any response from them (wasn't my first try). So my guess is that they don't really care much for nHancer.

Well, I guess I have to live with that. In the end, I might very well be forced to stop working on nHancer, if they keep making it more and more difficult for third party apps to work with their drivers...

Again this isn't painting the picture you're trying to paint. Furthermore, the idea to use ghost profiles was garnered from the findings in that thread, not from nVidia's "support".

While it's true that I never got any official reply from anyone at nVidia directly, they never actively did anything to block nHancer either - with the checksum being the notable exception. But I'd chalk that up as a mistake during the early DX10 driver development.

ChrisRay has gone out of his way to help me where he could. He's actually the closest to an insider contact I have ... And I appreciate that very much.

So don't hit on Chris, he's an important link between nVidia and the user community. While the communication with nVidia could still be better, it's definitely not Chris' fault that it isn't. For some reason large companies seem to be quite reserved when dealing with their customers. Well, I guess we have to live with that.

Still, I most definitely prefer nVidia's way of allowing the user to work with the driver's profiles over ATI's closed system, where the only hope to get CF and AA working with an unsupported game is to rename the .exe file.


About the "micro-stutters": This effect was actually first discovered by a user in the forum I hang out most of the time ("tombman"). We've talked a lot about the problem.

As nVidia has confirmed in the meantime, they're working on the issue. In my opinion, the driver is the best spot to fix the problem, and it should be fixable. Someone has written a small application which evens out the frame times; it worked quite well in most situations, but this tool was more a proof of concept.

Also, the more GPUs are used, the less pronounced the micro-stuttering effect is. I.e. in situations where you get micro-stutters with 2-way SLI, there's a very good chance that the same scene works fine with Quad SLI under Vista. And everyone has to keep in mind that micro stuttering is not happening all the time, but only in very specific situations.
 

jjzelinski

Diamond Member
Aug 23, 2004
3,750
0
0
Good read Grestorn, thanks for taking the time to share your thoughts with us

Btw, makes sense about evening the intervals.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Again, I don't think people really understand the concept that 60 frames per second is quite meaningless as one single frame can take more than its fair share of time. 59 frames rendered in 500ms, and 1 frame taking the other 500ms still results in 60 frames per second, but will look absolutely horrible.

I wish people would understand that the whole "100 frames per second = smooth" idea is a fallacy. 30 frames per second CAN be smoother than 100 frames per second. It all depends on the consistency of the frames rendered.

Therefore, anything measured in seconds is going to have a large margin of error in the results and it doesn't matter if you display the minimum frame rates or not, the same problem occurs.

Results need to be reported as per-frame times in milliseconds if we are to get a true picture of performance, IMO.
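That 59-plus-1 example is easy to check directly; a quick sketch using the numbers from the paragraph above:

```python
# 59 frames spread over the first 500 ms plus a single 500 ms hitch still
# reports "60 FPS", which is why per-frame times tell you more than averages.
frame_times_ms = [500.0 / 59] * 59 + [500.0]

total_s = sum(frame_times_ms) / 1000.0
print(f"reported FPS: {len(frame_times_ms) / total_s:.0f}")   # 60
print(f"longest single frame: {max(frame_times_ms):.0f} ms")  # 500
print(f"median frame time: {sorted(frame_times_ms)[len(frame_times_ms) // 2]:.1f} ms")  # ~8.5
```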
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Firstly, I'd like to welcome Grestorn to the forums. nHancer is made in his own spare time but it absolutely puts nVidia's abominable control panel to shame. :thumbsup:

So don't hit on Chris, he's an important link between nVidia and the user community.
Yeah, don't get me wrong, I wasn't attacking him; I was frustrated with nVidia in general for making nHancer more difficult to get working.

Still, I most definitely prefer nVidia's way of allowing the user to work with the driver's profiles over ATI's closed system, where the only hope to get CF and AA working with an unsupported game is to rename the .exe file.
Yep, transparent profiles are a big advantage for nVidia. To be honest I'm not sure what ATi are playing at by not allowing them. With nVidia I can do things like force AA into games months before nVidia does it officially.

Last time I checked (with one of the 174er drivers), it seemed that the checksum is no longer necessary for the DX10 SLI flags to work.
Cool, maybe now you can restore the "remove all profiles" feature.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
I thought it was fairly self-evident, but I can expound on it. What I'm saying is, if so few people on the planet are affected/can even notice, why should I, the public at large, or even ATI or nVidia care?
Again, I'm not following your reasoning. AFR systems are not for the public at large to begin with; they're a niche product for obtaining higher performance than a single card by combining multiple high-end GPUs.

In that light people in that niche are the most likely to notice something like micro-stutter since it erodes the whole purpose of getting multiple GPUs.

If you happen to be one of the few people who are affected, you'll just need to buy a faster single card, or lower your res and/or settings if you have a huge monitor and the fastest single card isn't fast enough.
Sure, but if they fixed AFR input lag and micro-stutter I might be more inclined to consider an AFR system. If they want to garner more business it's in their best interests to address the negatives of said system.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Grestorn
Originally posted by: BFG10K
Me and Grestorm communicate with each other regularly.
I've talked to him about the issue too (not with this handle though) and what you're saying wasn't the impression I got from him.

First of all... my name is actually Grestorn

This posting from nVNews that you've quoted was written when I was really frustrated.

Last time I checked (with one of the 174er drivers), it seemed that the checksum is no longer necessary for the DX10 SLI flags to work. Most likely, this checksum was just a bad idea from someone at nVidia which was corrected later.

Anyway, nHancer has been able to cope with the situation for some time now, and I'll keep that workaround in anyway, because as a side effect you can now revert each profile to its original state as delivered by nVidia.

Originally posted by: BFG10K
Originally posted by: Grestorn
I tried to use some contacts to get in touch with the driver team, but I never got any response from them (wasn't my first try). So my guess is that they don't really care much for nHancer.

Well, I guess I have to live with that. In the end, I might very well be forced to stop working on nHancer, if they keep making it more and more difficult for third party apps to work with their drivers...

Again this isn't painting the picture you're trying to paint. Furthermore, the idea to use ghost profiles was garnered from the findings in that thread, not from nVidia's "support".

While it's true that I never got any official reply from anyone at nVidia directly, they never actively did anything to block nHancer either - with the checksum being the notable exception. But I'd chalk that up as a mistake during the early DX10 driver development.

ChrisRay has gone out of his way to help me where he could. He's actually the closest to an insider contact I have ... And I appreciate that very much.

So don't hit on Chris, he's an important link between nVidia and the user community. While the communication with nVidia could still be better, it's definitely not Chris' fault that it isn't. For some reason large companies seem to be quite reserved when dealing with their customers. Well, I guess we have to live with that.

Still, I most definitely prefer nVidia's way of allowing the user to work with the driver's profiles over ATI's closed system, where the only hope to get CF and AA working with an unsupported game is to rename the .exe file.


About the "micro-stutters": This effect was actually first discovered by a user in the forum I hang out most of the time ("tombman"). We've talked a lot about the problem.

As nVidia has confirmed in the meantime, they're working on the issue. In my opinion, the driver is the best spot to fix the problem, and it should be fixable. Someone has written a small application which evens out the frame times; it worked quite well in most situations, but this tool was more a proof of concept.

Also, the more GPUs are used, the less pronounced the micro-stuttering effect is. I.e. in situations where you get micro-stutters with 2-way SLI, there's a very good chance that the same scene works fine with Quad SLI under Vista. And everyone has to keep in mind that micro stuttering is not happening all the time, but only in very specific situations.

incredible .. you are most welcome to our forum; the work you have done with nHancer is awesome, i am in awe personally

you are "dead on" about AMD's suspport, and i hope they will get back into the game

i think "now" you just may have found some some possibly faster insider contacts to NVIDIA;the closest you can get and still be completely independent. Please keep us informed of your progress with this micro stutter issue - especially that program that "evens" it out; and i am looking forward to the next version of your own quality program. NVIDIA would be shooting themselves in the foot if they ever drop support for one of their best tools for real fans and especially SLi enthusiasts.





 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Please keep us informed of your progress with this micro stutter issue - especially that program that "evens" it out;
I surmised this was a possible solution a few pages ago but to be honest I can't see how it would work.

If the delayed frames are caused by the driver, the app can't speed them up; all it can do is slow down the faster frames and make their durations more in line with the slower frames.

Of course, doing that will significantly reduce the average framerate, and if there's any buffering involved it'll increase input lag, which is already a problem with AFR.
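Roughly, all an external tool could do from the outside is something like the sketch below (an assumption about how such a tool might work, not the actual program Grestorn mentioned): hold back frames that finished quickly until they match the recent average, which is exactly why the average framerate falls and, with buffering, latency grows.

```python
import time
from collections import deque

# Rough sketch of the "even out the frame times" idea, not the actual tool:
# delay presenting any frame that finished much faster than the recent
# average, so the deltas cluster around one value. render_frame is a
# hypothetical placeholder for whatever produces a frame.

HISTORY = 8  # how many recent frame times to average over

def paced_present(render_frame, num_frames=1000):
    recent = deque(maxlen=HISTORY)
    last = time.perf_counter()
    for _ in range(num_frames):
        render_frame()
        delta = time.perf_counter() - last
        if recent:
            target = sum(recent) / len(recent)   # rolling average frame time
            if delta < target:
                time.sleep(target - delta)       # hold back the fast frame
                delta = time.perf_counter() - last
        recent.append(delta)
        last = time.perf_counter()
```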
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BFG10K
Please keep us informed of your progress with this micro stutter issue - especially that program that "evens" it out;
I surmised this was a possible solution a few pages ago but to be honest I can't see how it would work.

If the delayed frames are caused by the driver, the app can't speed them up; all it can do is slow down the faster frames and make their durations more in line with the slower frames.

Of course, doing that will significantly reduce the average framerate, and if there's any buffering involved it'll increase input lag, which is already a problem with AFR.

I'll take anything I've experienced with AFR over single card any day BFG, here's why:

50fps average at COH with Quad is more playable than 23fps average with a 8800U

60fps average at Oblivion with Quad is more playable than 20fps average with a 8800U

98fps average at HL EP2 with Quad is more playable than 43fps average with a 8800U

139fps average at COD4 with Quad is better than 44fps average with a 8800U

41fps average at Crysis with Quad is better than 19fps average with a 8800U

People buy multi-GPU setups to have higher image quality, BFG. Single cards aren't even playable at all in 3 of the 5 benchmarks I linked to above.

You can talk about barely perceptible fluctuations caused by load balancing all you like - but the fact of the matter is multi-card delivers a far better gaming experience.

What can anyone even say to this? That it's better to count the frames at 20fps than look at fluid animation at 60fps?


Edit: From the reviews I've seen of CrossfireX, this is the way of ATi hardware as well.
 