Why does a 4 fps drop seem so bad?

FearoftheNight

Diamond Member
Feb 19, 2003
5,101
0
71
Hey guys, I play StarCraft 2 on my Mac here, and due to the various driver issues and stuff I cap my fps at 30. However, in battles my fps drops to 24 or 25, which doesn't seem like a big difference, but it just feels like the system isn't responsive. Is it in my head, or is it really supposed to feel that choppy from just losing a few fps like that?
 

twinrider1

Diamond Member
Sep 28, 2003
4,096
64
91
Just off the top of my head, I think 26fps is the standard level used in film: the level at which the eye sees fluid motion and not individual frames. If I'm remembering that correctly, you wouldn't notice 34 dropping to 30, but you're right on the border there. Again, it's just off the top of my head, and it's late, but 26fps sounds very familiar.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Just off the top of my head, I think 26fps is the standard level used in film: the level at which the eye sees fluid motion and not individual frames. If I'm remembering that correctly, you wouldn't notice 34 dropping to 30, but you're right on the border there. Again, it's just off the top of my head, and it's late, but 26fps sounds very familiar.

you are thinking of 24fps but a game is NOT a movie.
 

akugami

Diamond Member
Feb 14, 2005
5,837
2,101
136
It's not the 4fps drop so much as the threshold you're dropping under. You can drop from 45fps to 41fps, for example, and barely notice it. Ideally you want to remain above roughly 30fps at all times; once you drop below that, you'll notice animations becoming less smooth and jerky at times. It also depends on the game. In twitch and heavy-action games you'll notice it more, while in something like solitaire you won't even notice a drop to 15fps.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
It depends on a lot of things. And you're thinking of 24fps, I think. But you shouldn't compare television to video games. Films and TV productions use blurring to bridge the gaps between frames to good effect, and to save bandwidth (more frames=more data over cable wires). So most people wouldn't notice a difference between 24 FPS and the amount of frames per second it takes to approximate fluid movement as it appears to the naked eye, unless they were shown side-by-side comparisons.

Most video games do not use blurring like this, and many of those that do, do so rather poorly. On top of that, a lot of the time you're moving the camera around very quickly, much more quickly than TV and film camera work does. This only exacerbates the problem, and it's why most people agree that more FPS is better, even with the diminishing returns that come with frame rates past 40 or so.

There's also a correlation between FPS and game responsiveness in some games whose engines are tied to frame rate count.

But I'm no expert. This is just what I've gathered.
 

Carcass

Member
Jun 14, 2010
30
0
0
I'd agree with the notion that sub-30fps isn't really acceptable for most gaming, and you're seeing first-hand why. However, isn't there something to be said for the fact that a 6fps drop from 30fps is a 20% decrease? You'd probably notice a drop from 60fps to 48fps too.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Hey guys, I play StarCraft 2 on my Mac here, and due to the various driver issues and stuff I cap my fps at 30. However, in battles my fps drops to 24 or 25, which doesn't seem like a big difference, but it just feels like the system isn't responsive. Is it in my head, or is it really supposed to feel that choppy from just losing a few fps like that?

I'm not sure if SC2 has vsync or not, but if it does, then make sure it's off. If you leave it on then anything between 15 and 29.9999999999 fps gets set to 15fps.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I'm not sure if SC2 has vsync or not, but if it does, then make sure it's off. If you leave it on then anything between 15 and 29.9999999999 fps gets set to 15fps.

QFT

Vsync will do that in increments:
120 -> 60 -> 30 -> 15

*pats his CRTs and loves being able to live without vsync*
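For the curious, this stepping can be sketched in a few lines. This is a simplified model that assumes plain double-buffered vsync on a 60 Hz display: a frame that misses a refresh has to wait for the next one, so the effective rate snaps down to refresh/1, refresh/2, refresh/3, and so on (60, 30, 20, 15). The posts above describe a coarser 60 -> 30 -> 15 stepping; whether the intermediate 20 fps step appears depends on the implementation.

```python
import math

def vsync_effective_fps(render_fps, refresh_hz=60):
    """Effective frame rate under double-buffered vsync (simplified model).

    Each finished frame is held until the next refresh, so a frame that
    takes slightly longer than one refresh interval occupies two, etc.
    """
    intervals = math.ceil(refresh_hz / render_fps)  # refreshes per frame
    return refresh_hz / intervals

print(vsync_effective_fps(60))  # 60.0
print(vsync_effective_fps(59))  # 30.0 -- just missing 60 Hz halves the rate
print(vsync_effective_fps(25))  # 20.0
```

The key point survives any variant of the model: rendering just barely under the refresh rate costs you a whole step, not a few fps.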
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Hey guys, I play StarCraft 2 on my Mac here, and due to the various driver issues and stuff I cap my fps at 30. However, in battles my fps drops to 24 or 25, which doesn't seem like a big difference, but it just feels like the system isn't responsive. Is it in my head, or is it really supposed to feel that choppy from just losing a few fps like that?

2 reasons. First, the human eye doesn't function like a monitor. Some say people can't see above 60 FPS, but that is wrong. If you show two slides, one black and one white, alternately at 60 FPS, the eye sees solid gray. At 55 FPS you may still see flashing white and black, while at 58 FPS you will see gray.

Second, FPS is an average, not an exact measure; it is only steady if the GPU is the only bottleneck. Also, keep in mind that if the time between frames is not even, you will be able to detect it. At a constant 30 FPS, the GPU is the bottleneck but the interval between frames is the same, which also means your GPU is capable of displaying the scene at 30 FPS. Whenever it drops, there is some other bottleneck. These sudden slowdowns probably last a very short time, so they won't hurt the FPS number much, but they act like mini-hangs. They are detectable.

Trust your eyes, not the number.
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
Ya, 30FPS is no good. It won't be smooth and lifelike. Once it drops below 30fps, which is the minimum you should get, 24fps or 25fps is very jerky.

If you were getting 40fps and dropped 5fps, you wouldn't notice that much. Also, shame on you for using a Mac, jk jk jk.
 

Idontcare

Elite Member
Oct 10, 1999
21,118
59
91
Second, FPS is an average, not an exact measure; it is only steady if the GPU is the only bottleneck. Also, keep in mind that if the time between frames is not even, you will be able to detect it. At a constant 30 FPS, the GPU is the bottleneck but the interval between frames is the same, which also means your GPU is capable of displaying the scene at 30 FPS. Whenever it drops, there is some other bottleneck. These sudden slowdowns probably last a very short time, so they won't hurt the FPS number much, but they act like mini-hangs. They are detectable.

It is also an average value that doesn't really convey the standard deviation of the population.

30fps means on average a new frame is being displayed every 33ms.

But you can get the same fps value (30) by having 27 of those frames rendered in 25ms each while having 3 frames per second (10% of the frames) taking 108ms each.

The one with 3 frames that take 108ms will seem quite "stuttery" to the end user even though the other 90% of the frames are rendered even faster in comparison to the setup that yields a consistent 33ms frame time for all 30 frames every second.
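The arithmetic above can be checked with a quick sketch, using hypothetical frame-time lists built from the numbers in the post:

```python
# Two frame-time distributions that both average ~30 fps over one second:
# a steady ~33 ms per frame vs. 27 fast frames plus 3 very slow ones.
steady  = [33.3] * 30                  # ms per frame, evenly paced
stutter = [25.0] * 27 + [108.0] * 3    # same frame count, uneven pacing

for name, frame_times in (("steady", steady), ("stutter", stutter)):
    seconds = sum(frame_times) / 1000
    avg_fps = len(frame_times) / seconds
    worst = max(frame_times)
    print(f"{name}: {avg_fps:.1f} fps average, worst frame {worst:.0f} ms")
```

Both lists report the same average fps; only the spread of individual frame times reveals which one stutters.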
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
you are thinking of 24fps but a game is NOT a movie.

It's amazing to me how often that analogy gets brought up.

The movie/TV/video game frame rate comparison has been debated and debunked for years and years (movies are not games; they have motion blur to help ease the transition from frame to frame, and 24fps was chosen to save on the cost of film, not because we can't tell any higher), and yet it keeps popping up.

The perpetuation of ignorance is
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
Just off the top of my head, I think 26fps is the standard level used in film: the level at which the eye sees fluid motion and not individual frames. If I'm remembering that correctly, you wouldn't notice 34 dropping to 30, but you're right on the border there. Again, it's just off the top of my head, and it's late, but 26fps sounds very familiar.

I hate people constantly propagating this stuff. The human eye can tell the difference between frame rates into the hundreds. The human eye is analogue, not digital; there is no hard cap on what you can perceive. There is a point at which you will not notice a difference, but it is well beyond 30.

Another issue is that frame rate and input response are interconnected in many games, which makes it much easier to feel the difference between 25 and 30 fps.

I have a copy of a program written in '96 to show this, but it no longer functions properly. You could set the left and right sides of a 3D object to different frame rates. I could tell a difference between 120 and 150, but beyond that I could not. It comes down to rendering times and how evenly spaced the displayed frames are. Between 25 and 30 there would be a glaring difference because of the incredible choppiness.
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
It is also an average value that doesn't really convey the standard deviation of the population.

30fps means on average a new frame is being displayed every 33ms.

But you can get the same fps value (30) by having 27 of those frames rendered in 25ms each while having 3 frames per second (10% of the frames) taking 108ms each.

The one with 3 frames that take 108ms will seem quite "stuttery" to the end user even though the other 90% of the frames are rendered even faster in comparison to the setup that yields a consistent 33ms frame time for all 30 frames every second.

EXACTLY. This is why more frames rendered by the GPU = better. You have a higher chance of displaying frames that were rendered at equal (or at least close-to-equal) intervals, therefore giving a feeling of fluidity and responsiveness.

It may also be worth noting that in some games certain things are only possible with a high fps because of the way calculations occur client-side, i.e. jumping into certain spots may be impossible in CoD4 or MW with low fps.
 

sandorski

No Lifer
Oct 10, 1999
70,130
5,658
126
24 fps in Movies is rather interesting. Next time you watch a Movie, really look at the transition between frames. It's quite noticeable when looking for it. This becomes even more noticeable during Gaming because of your interaction with what's going on on the screen. It's even more problematic when your FPS goes from a higher value, where gaps are less noticeable, down to a lower value where they become more noticeable.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,452
10,120
126
24 fps in Movies is rather interesting. Next time you watch a Movie, really look at the transition between frames. It's quite noticeable when looking for it. This becomes even more noticeable during Gaming because of your interaction with what's going on on the screen. It's even more problematic when your FPS goes from a higher value, where gaps are less noticeable, down to a lower value where they become more noticeable.

You do realize that movies shown in theaters, at least the ones still using analog film projectors, have a "double-shutter" feature that flashes each frame twice, so the picture on screen effectively appears at 48 FPS.
 

sandorski

No Lifer
Oct 10, 1999
70,130
5,658
126
You do realize that movies shown in theaters, at least the ones still using analog film projectors, have a "double-shutter" feature that flashes each frame twice, so the picture on screen effectively appears at 48 FPS.

I did not know that; I watch them on TV. It's very noticeable.
 
Dec 30, 2004
12,554
2
76
It depends on a lot of things. And you're thinking of 24fps, I think. But you shouldn't compare television to video games. Films and TV productions use blurring to bridge the gaps between frames to good effect, and to save bandwidth (more frames=more data over cable wires). So most people wouldn't notice a difference between 24 FPS and the amount of frames per second it takes to approximate fluid movement as it appears to the naked eye, unless they were shown side-by-side comparisons.

Most video games do not use blurring like this, and many of those that do, do so rather poorly. In addition to this, a lot of the time you're moving the camera around very quickly, much more quickly than TV and film camera work does. This only exacerbates the problem-- and why most people agree that more FPS is better, even with the diminishing returns that come with frame rates past 40 or so.

There's also a correlation between FPS and game responsiveness in some games whose engines are tied to frame rate count.

But I'm no expert. This is just what I've gathered.

I was going to post this myself, thanks. Also, welcome.
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
I hate people constantly propagating this stuff. The human eye can tell the difference between frame rates into the hundreds. The human eye is analogue, not digital; there is no hard cap on what you can perceive. There is a point at which you will not notice a difference, but it is well beyond 30.

Another issue is that frame rate and input response are interconnected in many games, which makes it much easier to feel the difference between 25 and 30 fps.

I have a copy of a program written in '96 to show this, but it no longer functions properly. You could set the left and right sides of a 3D object to different frame rates. I could tell a difference between 120 and 150, but beyond that I could not. It comes down to rendering times and how evenly spaced the displayed frames are. Between 25 and 30 there would be a glaring difference because of the incredible choppiness.
This is completely true. When these human eye/FPS threads come up, I like to say the human eye operates in Fhotons Per Second. It's completely variable with the change in the scene. For example, imagine you are out in space next to a beautiful black neutron star that suddenly shoots out a gamma ray burst for 1/50000th of a second. If you had time to process the image before you died, you definitely would see it. However, imagine staring at a movie of incredibly slow-moving fog rendered at 10 fps. It would be very hard, if not impossible, to tell the difference between frames.

The same holds true for the type of game you are playing and your tolerance for framerate. I can't really tell a difference between 100-160 FPS in Starcraft II (although there is noticeable delay when under 160), yet in a competitive FPS like Quake, I can pick out individual frames above 160 even though anything over 100 does feel smooth.

Everyone has different tolerances regarding FPS/input delay. It seems you start to feel uncomfortable delay with the Starcraft engine around 25 FPS.
 

Idontcare

Elite Member
Oct 10, 1999
21,118
59
91
This is completely true. When these human eye/FPS threads come up, I like to say the human eye operates in Fhotons Per Second. It's completely variable with the change in the scene. For example, imagine you are out in space next to a beautiful black neutron star that suddenly shoots out a gamma ray burst for 1/50000th of a second. If you had time to process the image before you died, you definitely would see it. However, imagine staring at a movie of incredibly slow-moving fog rendered at 10 fps. It would be very hard, if not impossible, to tell the difference between frames.

The same holds true for the type of game you are playing and your tolerance for framerate. I can't really tell a difference between 100-160 FPS in Starcraft II (although there is noticeable delay when under 160), yet in a competitive FPS like Quake, I can pick out individual frames above 160 even though anything over 100 does feel smooth.

Everyone has different tolerances regarding FPS/input delay. It seems you start to feel uncomfortable delay with the Starcraft engine around 25 FPS.

Interesting way to put it there Ben90...what you are getting at is that ocular resolution in terms of a chronometer is intrinsically tied to our spatial processing capability in addition to our color palette resolution.

If something moves 10 pixels horizontally on a screen in the time-span that is less than or equal to one "frame" then spatially our brains will detect the "jump" even if it occurred over a very narrow timeslice.

But take that same 10pixel movement and stretch it out over the span of 5 minutes and we'd barely notice it occurring whether it is delivered in 150 fps or 2 fps.

I bet there are more than a few PhD dissertations out there collecting dust in a library on the science behind it all. Sadly, what little we laymen come to understand is born from a mishmash of abused marketing terms and analogies combined with personal experience.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
They did something like your neutron star thing before. A discussion much like this one: http://www.tomshardware.com/forum/1875-3-frames-games-flat-panels Note, however, that it is completely irrelevant to gaming. If you are in a pitch-black room, the slightest bit of light will set your neurons firing, hence the 1/220th-of-a-second test results.

Motion blur can compensate somewhat for lower fps but only if done well. E.g., Crysis looks better than its actual fps because of its motion blur.

For games that don't have good motion blur, higher fps is needed.

The point raised about microstutter is valid. If most of the frames are clustered at one end of the one-second interval, then there may be perceptible stutter, even if technically speaking the fps was 30 or 60 or even 120. To take an extreme example, say that in between 0.000 and 0.001 seconds, 119 frames were rendered on a 120Hz monitor. Then, at 0.5 seconds, the 120th frame was rendered. This is technically 120 frames per second, but it will look far uglier than that.
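That extreme case can be sketched with hypothetical timestamps:

```python
# Timestamps (in seconds) for the extreme example above: 119 frames inside
# the first millisecond, then one more frame at 0.5 s.
timestamps = [i * 0.001 / 119 for i in range(119)] + [0.5]

fps_counter = len(timestamps)                 # frames in the 1 s window
gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
print(fps_counter, max(gaps))                 # 120 frames, but a ~0.499 s gap
```

The per-second counter happily reads 120, yet the half-second gap between the last two frames is what the player actually sees.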

See this fun page for a comparison between 15, 30, and 60 fps: http://www.boallen.com/fps-compare.html

This is completely true. When these human eye/FPS threads come up, I like to say the human eye operates in Fhotons Per Second. It's completely variable with the change in the scene. For example, imagine you are out in space next to a beautiful black neutron star that suddenly shoots out a gamma ray burst for 1/50000th of a second. If you had time to process the image before you died, you definitely would see it. However, imagine staring at a movie of incredibly slow-moving fog rendered at 10 fps. It would be very hard, if not impossible, to tell the difference between frames.

The same holds true for the type of game you are playing and your tolerance for framerate. I can't really tell a difference between 100-160 FPS in Starcraft II (although there is noticeable delay when under 160), yet in a competitive FPS like Quake, I can pick out individual frames above 160 even though anything over 100 does feel smooth.

Everyone has different tolerances regarding FPS/input delay. It seems you start to feel uncomfortable delay with the Starcraft engine around 25 FPS.
 

Campy

Senior member
Jun 25, 2010
785
171
116
I hate people constantly propagating this stuff. The human eye can tell the difference between frame rates into the hundreds. The human eye is analogue, not digital; there is no hard cap on what you can perceive. There is a point at which you will not notice a difference, but it is well beyond 30.

Another issue is that frame rate and input response are interconnected in many games, which makes it much easier to feel the difference between 25 and 30 fps.

I have a copy of a program written in '96 to show this, but it no longer functions properly. You could set the left and right sides of a 3D object to different frame rates. I could tell a difference between 120 and 150, but beyond that I could not. It comes down to rendering times and how evenly spaced the displayed frames are. Between 25 and 30 there would be a glaring difference because of the incredible choppiness.

This is so true. I can easily tell the difference between gaming at 85/100/120/150Hz when playing CS on my CRT. I read somewhere that what we can perceive is somewhere up around 250-300Hz. In games a lot of the problem is exacerbated by quick/sharp movements though, and I think we need a whole lot of fps before everything's as smooth as real life even when turning quickly.
Personally I can't wait for manufacturers to start focusing more on higher Hz on their screens; it will be a great day for gaming. Oh, and I hope they start making movies at 100fps too, I think that would rock for the fast-moving sequences :biggrin:
 