[pcper] Interview: AMD's Richard Huddy


antihelten

Golden Member
Feb 2, 2012
1,764
274
126
The proof is the Blur Busters detailed latency review:
http://www.blurbusters.com/gsync/preview2/

From that we know that gsync adds no additional latency compared to vsync off. The absolute numbers make it impossible for there to be an entire frame of latency hidden in there: the CS:GO figures just don't allow 8 ms to hide in 22 ms of total latency when you account for mouse sampling, CPU time and GPU rendering time. There just isn't space in those numbers for the monitor to also be adding 8 ms (a frame) on top of the pixel switch time.

I think you messed up your numbers for CS:GO. First of all, it's only 7 ms you need to hide, since the monitor is 144 Hz (6.94 ms per frame); secondly, the average latency with vsync off is 24 ms versus 39.5 ms for G-SYNC, an average difference of 15.6 ms. So not only is there plenty of room to hide a 7 ms frame, you can even hide two.

At 144 FPS max the difference is 11.5 ms, so there is still plenty of room for a 7 ms frame.

At 120 FPS max, well, who knows; Blur Busters didn't test 120 FPS max with vsync off.
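To make the frame-time arithmetic concrete, here's a quick sketch in Python (the latency figures are the Blur Busters CS:GO averages quoted above, not new measurements):

```python
# Frame-time arithmetic for the CS:GO numbers quoted in this thread.
refresh_hz = 144
frame_time_ms = 1000 / refresh_hz            # ~6.94 ms per refresh at 144 Hz

avg_vsync_off_ms = 24.0                      # average latency, vsync off
avg_gsync_ms = 39.5                          # average latency, G-SYNC

difference_ms = avg_gsync_ms - avg_vsync_off_ms
frames_of_room = difference_ms / frame_time_ms

print(f"One frame at {refresh_hz} Hz: {frame_time_ms:.2f} ms")
print(f"Average difference: {difference_ms:.1f} ms (~{frames_of_room:.1f} frames)")
```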

We have real solid numbers showing the real end to end latency, hard scientific proof.

Whilst the Blur Busters article is a nice little review and certainly informative, I'd dare say that if you consider that hard scientific proof, you have pretty low standards.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
When you are at the max refresh rate of the screen, G-Sync acts like V-Sync, because it delays the frame while waiting for the screen to be ready to refresh.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
There is an even better way to look at this to prove it more convincingly. If you think a little harder about the test Blur Busters did, you realise we can use the minimum latency they saw. Why, you ask? Because it was sufficient time for the mouse button to be clicked, the LED to light and the impact to be seen on the screen; it is a full and complete latency picture. The variance is all down to timing: there is an 8 ms difference between frames (at 120 Hz, which is the right rate to use, as 144 Hz is being impacted by vsync waiting), and if the mouse is clicked at the worst time it can add up to 8 ms of additional time. Thus we see variance. There is other variance there as well, likely due to the game's CPU usage being somewhat uneven, but in essence the minimum is as valid a number as the others, and we don't care about the average; the average is hiding additional latency that comes from the timing.

So now the number that has to hide 7 ms is 17 ms. The GPU is taking 8 ms because we are at 120 Hz (1000/120 = 8.3), and tests of this panel elsewhere tell us the pixel switch time is 5 ms, for a grand total so far of 13 ms. That leaves only 4 ms for the CPU and mouse sampling, and most importantly there is meant to be an additional 7 ms of screen latency in there on top of that. But there isn't.
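To lay that budget out in one place, here's a rough sketch (all figures are the ones quoted in this thread; the 5 ms pixel switch time is a reported spec, not something measured here):

```python
# Latency-budget sketch for the 120 Hz case argued above.
minimum_latency_ms = 17.0          # lowest end-to-end latency Blur Busters saw
frame_interval_ms = 1000 / 120     # ~8.3 ms per refresh at 120 Hz
pixel_switch_ms = 5.0              # panel response time reported elsewhere

accounted_ms = frame_interval_ms + pixel_switch_ms
remaining_ms = minimum_latency_ms - accounted_ms

# ~3.7 ms left for CPU time and mouse sampling -- no room for an extra
# 7 ms frame of buffering on top of that.
print(f"Accounted: {accounted_ms:.1f} ms, remaining: {remaining_ms:.1f} ms")
```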

It's a sufficient test to determine the impact of monitor latency. While it doesn't directly measure that (TFT Central will, but just bear in mind that the LightBoost 2 monitors have sub-frame latency as well, and G-Sync is basically LightBoost 3), we have enough information to know it's not possible. Huddy is the one asserting that there is additional frame buffering (something monitors haven't done for quite some time), so it's for him to prove it; I am confident we have the evidence to the contrary in the public domain.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
There is an even better way to look at this to prove it more convincingly. If you think a little harder about the test Blur Busters did, you realise we can use the minimum latency they saw. Why, you ask? Because it was sufficient time for the mouse button to be clicked, the LED to light and the impact to be seen on the screen; it is a full and complete latency picture. The variance is all down to timing: there is an 8 ms difference between frames (at 120 Hz, which is the right rate to use, as 144 Hz is being impacted by vsync waiting), and if the mouse is clicked at the worst time it can add up to 8 ms of additional time. Thus we see variance. There is other variance there as well, likely due to the game's CPU usage being somewhat uneven, but in essence the minimum is as valid a number as the others, and we don't care about the average; the average is hiding additional latency that comes from the timing.

This is an okay idea; it does however have one weakness, namely the fact that it assumes that the measured minimums for vsync and G-SYNC are at the global minimum, which isn't necessarily the case. But even then it should still get us closer to finding the actual latency.

So now the number that has to hide 7 ms is 17 ms. The GPU is taking 8 ms because we are at 120 Hz (1000/120 = 8.3), and tests of this panel elsewhere tell us the pixel switch time is 5 ms, for a grand total so far of 13 ms. That leaves only 4 ms for the CPU and mouse sampling, and most importantly there is meant to be an additional 7 ms of screen latency in there on top of that. But there isn't.

That's not how an fps cap works. The GPU doesn't suddenly become slower and take longer to render a frame; instead the GPU (or more precisely the CPU, before it passes the next frame to the GPU) is made to wait before starting on a new frame until enough time has passed to stay under the fps cap.
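As a minimal illustration of that (a sketch only, not CS:GO's actual implementation):

```python
import time

# Minimal frame-limiter sketch: the per-frame work doesn't get slower under a
# cap, the loop simply idles out the rest of the frame budget before starting
# the next frame.
def run_capped(render_frame, fps_cap=120, num_frames=1000):
    budget_s = 1.0 / fps_cap
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()                           # CPU submit + GPU render for this frame
        elapsed = time.perf_counter() - start
        if elapsed < budget_s:
            time.sleep(budget_s - elapsed)       # wait until the cap allows the next frame
```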

Since we know the Titan they used can run at least 300 fps (and probably significantly higher), it can render a frame in an average of 3.33 ms at most. However, since we're looking at the minimum latency, chances are the render time is actually below average as well, so most likely there's only 2-3 ms of GPU render time.

The time spent by the CPU on a frame adds to the total frame time, and since we know the average frame time when uncapped is at most 3.33 ms, the CPU most likely doesn't spend more than 1 ms on average per frame (unless it can start on the next frame before the GPU has finished, also known as a render-ahead queue, but I don't know if CS:GO uses that).

So that gives 2-3 ms for the GPU, 1 ms for the CPU and 5 ms for the panel, leaving up to 9 ms for latency from any other source, and thus still enough room for a single frame's worth (7 ms) of latency.
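Putting those estimates together (a rough sketch; every number here is an estimate argued above, not a measurement):

```python
# Revised latency-budget sketch using the estimates above.
minimum_latency_ms = 17.0
gpu_render_ms = 2.0        # Titan at 300+ fps; likely 2-3 ms per frame
cpu_frame_ms = 1.0         # estimated CPU time per frame
pixel_switch_ms = 5.0      # reported panel response time

remaining_ms = minimum_latency_ms - (gpu_render_ms + cpu_frame_ms + pixel_switch_ms)
# Up to ~9 ms left over -- enough room for a 7 ms frame of latency.
print(f"Left for everything else: {remaining_ms:.0f} ms")
```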
 

Mand

Senior member
Jan 13, 2014
664
0
0
This is an okay idea; it does however have one weakness, namely the fact that it assumes that the measured minimums for vsync and G-SYNC are at the global minimum, which isn't necessarily the case. But even then it should still get us closer to finding the actual latency.

No, it doesn't assume that. What it is doing is comparing G-Sync to vsync off, which is as low as you can get.
 

Mand

Senior member
Jan 13, 2014
664
0
0
So that gives 2-3 ms for the GPU, 1 ms for the CPU and 5 ms for the panel, leaving up to 9 ms for latency from any other source, and thus still enough room for a single frame's worth (7 ms) of latency.

Your math is off. You can't use the listed GTG panel refresh times, as those don't really show up in this sort of scenario. Not only is that number for a very specific transition that doesn't apply to the transition being made in the demo, you're also not counting the time it takes to scan out the frame, which is 8 ms. That eats up all of your margin, especially when you consider things like mouse latency.
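Re-running the same budget with that substitution (a sketch of the point being made here; the GTG figure is dropped and the ~8 ms scanout time is used instead):

```python
# Same budget as above, but with the GTG figure swapped for scanout time.
minimum_latency_ms = 17.0
gpu_render_ms = 2.0        # lower end of the GPU estimate above
cpu_frame_ms = 1.0
scanout_ms = 1000 / 120    # ~8.3 ms to scan the frame out at 120 Hz

remaining_ms = minimum_latency_ms - (gpu_render_ms + cpu_frame_ms + scanout_ms)
# ~5.7 ms left, before mouse latency -- less than the 7 ms an extra frame needs.
print(f"Left over: {remaining_ms:.1f} ms")
```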

There just isn't any way to argue it: the data does not show enough time to hide a one-frame latency delay. Yes, it's noisy; yes, it has high variance. But actually run the statistics on the data with a hypothesis test for an unaccounted-for 8 ms delay, and you will find that there isn't one.
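As a concrete example of what such a test could look like (a sketch only, using SciPy's Welch t-test; the raw per-click latency samples aren't reproduced in this thread, so the arrays are left as function arguments):

```python
import numpy as np
from scipy import stats

# Sketch of the hypothesis test described above (not the actual Blur Busters
# dataset). Shift the vsync-off samples by one frame at 120 Hz; if G-Sync
# really buffered an extra frame, the shifted vsync-off latencies and the
# G-Sync latencies should share the same mean.
def test_hidden_frame(vsync_off_ms, gsync_ms, refresh_hz=120):
    extra_frame_ms = 1000 / refresh_hz
    shifted = np.asarray(vsync_off_ms) + extra_frame_ms
    t_stat, p_value = stats.ttest_ind(np.asarray(gsync_ms), shifted, equal_var=False)
    return t_stat, p_value   # small p -> reject the "hidden 8 ms frame" hypothesis
```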
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
Few job titles are likely to draw as much envy as Gaming Scientist, a role recently filled at AMD by Richard Huddy.

He's served stints at 3D Labs, Nvidia and ATI, but outsiders may not know this is actually Huddy's second round at the company. He first joined after AMD bought ATI in 2006 but left to "dabble" in online poker in 2011. It was a short-lived sojourn, thanks in no small part to the US' refusal to legalize it.

AMD wasn't able to open the doors to Huddy then, so it was on to Intel for a few years. However, as of June 1, 2014, Huddy is back in AMD's arms.
http://www.techradar.com/us/news/ga...ia-gameworks-talks-amd-s-new-attitude-1255538
 