Someone explain this to me because I don't get it. Why is Batman: AC even dipping under 60FPS? Is this at like 1600p? I get around 80-100FPS @ 1080p.
Console port with poorly written code.
These charts rather confuse the issue and don't accurately explain it.
If your monitor is a 60Hz monitor then it displays 60 images per second, aka 60FPS.
Every 1 s / 60 = 16.(6) ms it will read the frame buffer and replace the currently displayed image with the one in the buffer. Reading the frame buffer is not an instantaneous process; it takes several milliseconds.
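To make the arithmetic concrete, here's a trivial sketch (C++; the refresh rates are just illustrative picks) that prints the per-refresh time budget:

```cpp
#include <cstdio>

int main() {
    // Per-refresh time budget: 1000 ms divided by the refresh rate.
    const double rates[] = {60.0, 120.0, 144.0};
    for (double hz : rates) {
        std::printf("%6.1f Hz -> %.2f ms per refresh\n", hz, 1000.0 / hz);
    }
    return 0;
}
```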
If you have vsync on, the GPU will match its presentation of frames to the actual refresh rate of the monitor. If it fails to render a frame in time for the refresh (16.7 ms), it leaves the previous frame on screen and waits until the next refresh (33.3 ms) to swap buffers (it uses 2 or 3 buffers and swaps their labels rather than copying from one buffer to another).
Thus the instantaneous "drop" in FPS from 60 (16.7 ms per frame) to 30 (33.3 ms per frame), which causes one of the many forms of micro-stutter. Opponents of vsync claim it affects input rate; it doesn't, with the exception of very badly coded games, since input is an independent pathway. What it will give you is image lag: the enemy will appear on your monitor up to 16.7 ms = 1/60th of a second later than his actual arrival in game. That can in turn delay your reaction, since if you saw it later you hit the button later.
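A rough way to see why a missed deadline quantizes you straight to 30 FPS: with vsync on, a finished frame can only be shown at the next refresh boundary. A minimal sketch (the render times and the strict double-buffering model are my own, purely illustrative):

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const double refresh = 1000.0 / 60.0;                    // 16.67 ms between scanouts
    const double render_times[] = {15.0, 18.0, 15.0, 18.0};  // ms per frame (made up)

    double gpu_free = 0.0;   // when the GPU can start the next frame
    double prev_show = 0.0;  // when the previous frame appeared on screen
    for (double rt : render_times) {
        double done = gpu_free + rt;
        // vsync: the swap happens at the first refresh boundary after the frame is done
        double shown = std::ceil(done / refresh) * refresh;
        std::printf("rendered in %.1f ms, shown at %.1f ms (%.1f ms after previous)\n",
                    rt, shown, shown - prev_show);
        prev_show = shown;
        gpu_free = shown;    // double buffering: GPU waits for the swap before starting the next frame
    }
    return 0;
}
```

Frames that take 15 ms show up 16.7 ms apart; frames that take 18 ms show up 33.3 ms apart, even though they only missed the deadline by a little.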
Your overall FPS will still measure as something higher than 30, for example 50, because the counter shows the average over the last second, not the per-frame rate. Without vsync you would have seen it average out to something even higher, say 55, but that extra would be entirely torn frames.
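That's also why an FPS counter can read something like 45-50 while individual frames are alternating between 16.7 ms and 33.3 ms. Most counters just count frames presented over the last second; a sketch with made-up frame times (one miss out of every three frames, my own choice for illustration):

```cpp
#include <cstdio>
#include <vector>

int main() {
    // Frame times (ms): every third frame misses the refresh and takes 33.3 ms, as with vsync on.
    std::vector<double> frame_ms;
    double t = 0.0;
    while (t < 1000.0) {                                    // fill roughly one second
        double ft = (frame_ms.size() % 3 == 0) ? 33.3 : 16.7;
        frame_ms.push_back(ft);
        t += ft;
    }
    double total = 0.0;
    for (double ft : frame_ms) total += ft;
    // Averaged counter: frames shown over the last second.
    std::printf("average: %.0f FPS\n", 1000.0 * frame_ms.size() / total);
    // The instantaneous per-frame rate swings much wider.
    std::printf("per-frame: %.0f FPS or %.0f FPS\n", 1000.0 / 16.7, 1000.0 / 33.3);
    return 0;
}
```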
If vsync is off and you are generating 150 fps then you are making a new image every 6.(6) ms.
0 ms - render starts.
6.7 ms - first frame done.
13.3 ms - second frame done; the first frame is discarded and was a waste to even render.
16.7 ms - the monitor starts reading the frame buffer (from the top).
20 ms - third frame finished; tearing will occur because the monitor is now reading the third frame instead of the second.
21 ms - monitor finishes the read. You get an image stitched together from frames 2 and 3.
The figure of 21 ms is hypothetical, as it varies with resolution and the speed of your connection (DVI, DisplayPort, HDMI, and the various versions of those standards).
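The same timeline can be written out programmatically. This is a toy model using the numbers above (6.67 ms per render, scanout starting at 16.7 ms and taking a hypothetical 4.3 ms, screen coarsened to 10 bands); it reports which rendered frame each band of the scanout ends up sampling and where the tear lands:

```cpp
#include <cstdio>

int main() {
    const double render_ms = 1000.0 / 150.0;  // 6.67 ms per frame at 150 FPS, vsync off
    const double scan_start = 16.7;           // monitor begins reading the buffer (top of screen)
    const double scan_ms    = 4.3;            // hypothetical time to read the whole buffer

    const int rows = 10;                      // coarse screen: 10 horizontal bands
    int last = -1;
    for (int row = 0; row < rows; ++row) {
        // Time at which this band of the screen is read out.
        double t = scan_start + scan_ms * row / rows;
        // With vsync off, the buffer holds the most recently completed frame
        // (frame N finishes at N * render_ms).
        int frame = static_cast<int>(t / render_ms);
        if (frame != last) {                  // print only where the source frame changes
            std::printf("row %2d read at %.1f ms: frame %d%s\n",
                        row, t, frame, last >= 0 ? "  <-- tear line" : "");
            last = frame;
        }
    }
    return 0;
}
```

The top bands come from frame 2, and the bands read after the 20 ms mark come from frame 3: that boundary is the tear.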
Adaptive vsync is meant to appease those upset by the skipped frame. Like regular vsync, it will not allow FPS over 60, but if the monitor has already started updating an image when a frame comes in late, it will not hold that frame for the next cycle but use it now (as if you had no vsync).
It is still vsync; it just has a different policy on what to do when you miss the deadline. Rather than waiting until the next refresh, it turns the frame in immediately, mixing it in with the previous frame. This actually guarantees that you will see tearing every time your FPS drops below 60. But you reduce your image lag for the lower half of the screen, since the upper half is the old image and the lower half is the new image, displayed earlier than it otherwise would have been (the two half-images are separated at the tear). You do not actually gain a full 1/60th of a second, though, since by definition this only came up because the frame was late to render. You gain somewhere between 16.(6) ms and 16.(6) − X ms, where X is the time it takes to transmit the image.
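The policy difference boils down to a one-line decision when a frame misses the deadline. A sketch of the two behaviors (my own simplification, not NVIDIA's actual driver logic; the 20 ms late frame is a made-up example):

```cpp
#include <cmath>
#include <cstdio>

// When is a frame that finished rendering at done_ms actually shown on screen?

// Regular vsync: always wait for the next refresh boundary.
double vsync_present(double done_ms, double refresh_ms) {
    return std::ceil(done_ms / refresh_ms) * refresh_ms;
}

// Adaptive vsync: if the frame made the deadline, behave like vsync;
// if it is late, present it immediately (tearing into the current scanout).
double adaptive_present(double done_ms, double deadline_ms) {
    return done_ms <= deadline_ms ? deadline_ms : done_ms;
}

int main() {
    const double refresh = 1000.0 / 60.0;   // 16.67 ms
    const double late_frame = 20.0;         // finished ~3.3 ms after the 16.7 ms deadline
    std::printf("regular vsync shows it at  %.1f ms\n", vsync_present(late_frame, refresh));
    std::printf("adaptive vsync shows it at %.1f ms (tear)\n",
                adaptive_present(late_frame, refresh));
    return 0;
}
```

Regular vsync holds the late frame until 33.3 ms; adaptive shows it at 20 ms and accepts the tear.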
While I personally will use regular vsync, adaptive is useful for those who find regular vsync unacceptable: they can now use it instead of leaving vsync off entirely. (This will also greatly improve their thermals, power consumption, and GPU lifespan, since the GPU is doing less work.)
A further benefit of vsync, adaptive or not, is that if a frame is done early the GPU idles, and that means it can start on the next frame sooner. It works even better with NVIDIA's new GPU boost, where the extra idle time gives the card time to cool down and reduces overall power consumption.