You were incorrect, you don’t understand pre-rendering, and you don’t understand how the buffering chain works. Your arguments prove that. Now stop with the rhetoric please, it’s tiresome.
First, that paragraph is nothing but a personal attack.
Did you read the links I posted? Do you understand where the 30 FPS is coming from and why it’s happening?
I have read your post and your links, and I know what you were trying to say. However, you don't seem to understand the difference between theory and fact. In theory, double buffering can hurt FPS by a lot. In practice, its effect is small.
Are you for real? This whole thing started when you said:
To which I responded:
Exactly. I said vsync doesn't really hurt FPS, and you said only vsync without triple buffering will. What you probably don't realize is that a) game code can force the driver to enable triple buffering when vsync is on, and b) the triple buffering option in a game profile only works for OpenGL games. That means the user actually has no control over whether triple buffering is used (unless it is XP).
http://forums.nvidia.com/index.php?showtopic=173860
The article from Anandtech clearly states that the game developer, not the video card user, is responsible for using triple buffering.
I’ve bolded where I mentioned triple buffering. If you can't understand that the second sentence is related to the first then that's your problem. For you to claim I never mentioned triple buffering is an outright lie on your part. Please retract your statement immediately.
Whether vsync uses triple buffering depends on the game, not the user, and whether double buffering hurts FPS really depends on the time required by the last few stages of rendering.
If the time required by the last few stages exceeds the refresh interval, then it hurts FPS. That is, on a 60Hz monitor, if the time required by the last few stages > 16.67ms, then FPS will be capped at 60/X, where X is an integer. The reason for this is the reason you have given, but is the time required by the last few stages the dominant factor in the FPS shown?
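To put numbers on that 60/X behaviour, here is a minimal sketch (my own illustration, not code from any game or driver) of how double-buffered vsync quantizes the frame rate on a 60Hz monitor. The render times in the loop are made-up examples, and the "round up to the next refresh" model is the simplifying assumption behind the 60/X formula.

Code:
# Minimal sketch: double-buffered vsync on a 60 Hz display.
# Assumption: once a frame misses a refresh, the GPU has to wait for the
# next one, so the effective frame time is the render time rounded up to a
# whole number of refresh intervals, giving FPS = 60 / X for an integer X.
import math

REFRESH_HZ = 60.0
REFRESH_MS = 1000.0 / REFRESH_HZ        # ~16.67 ms per refresh

def fps_with_vsync(render_ms):
    intervals = math.ceil(render_ms / REFRESH_MS)   # the integer X
    return REFRESH_HZ / intervals

for render_ms in (10.0, 16.0, 17.0, 25.0, 34.0):
    print(f"{render_ms:5.1f} ms -> {fps_with_vsync(render_ms):5.1f} FPS")
# 10 and 16 ms stay at 60 FPS; 17 and 25 ms drop to 30; 34 ms drops to 20.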
I never said vsync doesn't hurt FPS; I said its impact is so small that it can be ignored. You said I am incorrect. On paper it can hurt, and I have given an example of it, but since today's video cards can easily finish the last few stages in under 10ms at 1920x1080, be it double or triple buffering, it won't make a big difference. To prove this, simply run the game and test.
Furthermore your statement is wrong anyway since a cap will reduce the framerate anytime it would have been exceeded. That’s the dictionary definition of a cap.
That is why I said "more like". It isn't a cap to begin with, it is synchronization. In your log there is a 61, and 61 > 60. It isn't a cap, but it can be understood as one.
FPS is a numeric approximation of how many frames are rendered per second. While the name seems obvious, do you actually know the algorithm used to derive that number? In short, it is an approximation. The formula typically looks something like this:
time = time × X + (render time of last frame) × Y, where X + Y = 1
FPS = 1000 / time, with time in ms.
Another way of implementing this is to keep the time needed for the N most recent frames and divide the total by N.
However, without seeing the formula used, people believe there is a counter inside the video card that actually counts the number of frames over a fixed interval, and therefore believe that if the FPS shows 1000, the video card actually rendered 1000 frames in that second, when it really means the average frame time over the last N frames was about 1ms.
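Here is a minimal sketch of the two estimation methods described above; this is my own illustration, not Fraps' or any game's actual source, and the class names, the weights X = 0.9 / Y = 0.1, and the window size N = 60 are assumptions made for the example.

Code:
# Two common ways an FPS counter can be implemented.
from collections import deque

class EmaFpsCounter:
    """Exponential moving average: time = time*X + last_frame_ms*Y, X + Y = 1."""
    def __init__(self, x=0.9):
        self.x = x              # weight for the running average
        self.y = 1.0 - x        # weight for the newest frame
        self.avg_ms = None

    def update(self, frame_ms):
        if self.avg_ms is None:
            self.avg_ms = frame_ms
        else:
            self.avg_ms = self.avg_ms * self.x + frame_ms * self.y
        return 1000.0 / self.avg_ms     # FPS derived from the averaged frame time

class WindowFpsCounter:
    """Sliding window: total time of the N most recent frames divided by N."""
    def __init__(self, n=60):
        self.frames = deque(maxlen=n)

    def update(self, frame_ms):
        self.frames.append(frame_ms)
        return 1000.0 * len(self.frames) / sum(self.frames)

# Either counter reports ~1000 "FPS" when the last frames averaged 1 ms each,
# no matter how many of those frames the monitor actually displayed.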
As you can see, FPS is not the same as the number of frames displayed. With or without vsync, the screen cannot display more frames than its refresh rate allows. FPS can be misleading if you don't understand how it is derived.
When you see FPS = 60, what it really means is that the average time between two ready states (frame presentations) is 16.67ms. Just because it shows 16.67ms doesn't mean the video card actually took 16.67ms on average to render those frames; it is the average time between the video card being in two consecutive ready states, which with vsync includes any time spent waiting for the refresh.
In other words, it is not a cap.
Then you made another statement, "no, vsync doesn't really affect frame rate", which is wrong, with or without triple buffering.
You’re playing semantic games and using strawman arguments.
My friend, I linked videos with FPS counters. Strawman? Semantic games? What exactly does it mean when the FPS remains (mostly) unchanged with or without vsync? In my words, it means vsync doesn't really affect frame rate. You said that is incorrect.
I already posted four links proving my statements. Did you read them or are you just going to continue to troll?
Did I not post links showing otherwise? I have no problem with you proving your point. I have a problem with how you get personal. This would be a healthy debate if you could control yourself.
Here’s a personal FPS log from CoJ. You keep telling me that you know the topic so you should have no trouble understanding what it means and why it shows I’m right:
Code:
VSYNC
On Off
61 97
60 90
57 101
60 96
58 83
60 83
60 90
60 115
60 109
60 121
60 115
58 95
60 78
60 70
49 59
31 51
30 52
30 53
29 50
31 50
29 50
30 48
30 48
30 48
30 47
30 47
29 39
30 39
30 43
47 61
30 45
29 36
30 39
30 40
30 42
30 41
30 41
30 37
30 39
30 45
51 70
31 46
30 47
30 43
48 66
33 59
38 59
38 53
33 61
43 54
30 44
30 49
29 47
31 42
29 38
30 37
30 38
30 37
30 36
30 38
The debate between you and Toyota was entertaining. While Toyota had this finding:
okay since you claimed Call of Juarez goes right from 60fps to 30fps I decided to check that out. that means if I am not averaging 60fps then I will be at 30fps according to what you are saying. I first tested the game with vsync off and then with it on.
vsync off
Frames, Time (ms), Min, Max, Avg
3108, 64677, 34, 71, 48.054
vsync on
Frames, Time (ms), Min, Max, Avg
2595, 54913, 29, 61, 47.257
well I sure as heck cannot maintain close to 60fps with vsync off yet I still average the same with it on or off. so again give me another game and I will show you that triple buffering claim does not pan out in reality.
And you ended up digging through log files to prove your point. In a log that consists of over 3,000 frames, you picked the section that supports what you said and discarded the rest, in a game that appears to be using double buffering.
Again, when an entry shows 30 in the Fraps log, it means that over the last N frames, the average time between two ready states was 33.33ms. You can assume that the time the video card needed to render was between 16.67ms and 33.33ms, which is a valid assumption. You can also use those entries to show that there are times when the video card is forced to wait before the buffer is ready. Neither I nor Toyota ever challenged that. However, the statement you made about vsync, especially with double buffering, is really confusing.
You said in both threads that FPS will be capped at 30 with vsync, whereas without it FPS can be 30-59. On paper it can, but that doesn't mean it is always the case. As long as the video card is fast enough to finish its task within 1/refresh rate, vsync doesn't hurt FPS. Since each frame may require a different time to render, it is really hard to say in advance whether vsync actually hurts FPS.
While Toyota and I both presented the overall performance to you and showed how small the impact is, you dug into the times where it does happen. That is fine, but stating that others don't know what they are talking about? Is that really necessary?
I see that with vsync the average FPS is 47.257, and without it the average FPS is 48.054. Even a kid can see that isn't half. There is an impact, but it is at most 1 - 47.257/48.054 ≈ 1.66%, and that is with vsync off being free to go beyond the refresh rate. Is this really a big deal? The difference is less than 1 FPS.
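For completeness, here is a quick sketch reproducing that arithmetic directly from the Fraps summaries Toyota posted above (the frame counts and benchmark times are his numbers; nothing is mine except the division).

Code:
# Average FPS and relative difference from the Fraps summaries above.
frames_off, time_off_ms = 3108, 64677    # vsync off
frames_on,  time_on_ms  = 2595, 54913    # vsync on

avg_off = frames_off / (time_off_ms / 1000.0)   # ~48.054 FPS
avg_on  = frames_on  / (time_on_ms  / 1000.0)   # ~47.257 FPS
loss    = 1.0 - avg_on / avg_off                # ~0.0166, about 1.66%

print(round(avg_off, 3), round(avg_on, 3), f"{loss:.2%}")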
"If you dig the log, you will find the differences." Yes, digging logs will find differences, but that is not what most people care about. Most people care about what they see from the monitor.