[Tom's Hardware] CPU bottlenecking/frame latency benchmarks

Page 2 - AnandTech Forums
Aug 11, 2008
10,451
642
126
The whole thing just seems "off" somehow. According to this, an A4 is better than either the i5 2500K or the FX 8350. Even the old Phenoms are better than either of the newer processors. An interesting metric, but it just doesn't add up. I also can't remember where, but I saw similar data previously, and Intel was ahead, especially the 3570K, quite a bit better than the 2500K.

Edit: I cited the wrong graph here. It was FC3 that showed the low latency with the A4.
But the point still stands. Does anyone really believe an Athlon X3 is a superior gaming processor to an i5 or an 8350?
 

TuxDave

Lifer
Oct 8, 2002
10,572
3
71
Edit: I cited the wrong graph here. It was FC3 that showed the low latency with the A4.
But the point still stands. Does anyone really believe an Athlon X3 is a superior gaming processor to an i5 or an 8350?

It's just a measure of consistency. This is far from accurate, since I don't have the standard deviation of time per frame, but you can interpret that benchmark something like this:

i5-3550: 59.9 FPS (average) = 16.7 ms ± 1.9 ms per frame (average)
A10-5800K: 47.7 FPS (average) = 21.0 ms ± 0.7 ms per frame (average)
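TuxDave's back-of-envelope conversion (average FPS to average frame time) is just the reciprocal. A quick sketch; note the ± figures above are his estimates and cannot be derived from the averages alone:

```python
def fps_to_frame_ms(fps):
    """Convert an average frame rate (FPS) into an average frame time (ms)."""
    return 1000.0 / fps

# The two averages quoted above:
print(round(fps_to_frame_ms(59.9), 1))  # i5-3550:   16.7 ms per frame
print(round(fps_to_frame_ms(47.7), 1))  # A10-5800K: 21.0 ms per frame
```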
 

inf64

Diamond Member
Mar 11, 2011
3,765
4,223
136
Thanks for the benchmark AtenRa. Basically that one little dip to 31fps is what made the minimum look worse on the 8350. If you look at the performance of both CPUs across the timeline, it's pretty much even, with one slightly ahead in some areas and the other in others. That's why the averages are pretty much the same.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Thanks for the benchmark AtenRa. Basically that one little dip to 31fps is what made the minimum look worse on the 8350. If you look at the performance of both CPUs across the timeline, it's pretty much even, with one slightly ahead in some areas and the other in others. That's why the averages are pretty much the same.

Did you expect anything else when you move the bottleneck away from the CPU to the GPU?
 

inf64

Diamond Member
Mar 11, 2011
3,765
4,223
136
Well, I thought FC3 was a CPU-heavy game, one of the rare ones of that kind (since it's an FPS). And the 7950 is no slouch of a card, either.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
Did you expect anything else when you move the bottleneck away from the CPU to the GPU?

It clearly shows that the FX8350 will not bottleneck an OCed HD7950 at 1GHz. So no matter how many fps you get at lower settings/resolution, when it comes to actual gameplay settings both systems will be equal (in FC3 at those settings).
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
It clearly shows that the FX8350 will not bottleneck an OCed HD7950 at 1GHz. So no matter how many fps you get at lower settings/resolution, when it comes to actual gameplay settings both systems will be equal (in FC3 at those settings).

Yes, at those settings
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
The whole thing just seems "off" somehow. According to this, an A4 is better than either the i5 2500K or the FX 8350. Even the old Phenoms are better than either of the newer processors. An interesting metric, but it just doesn't add up. I also can't remember where, but I saw similar data previously, and Intel was ahead, especially the 3570K, quite a bit better than the 2500K.

18xMSAA is really unusual, but on that graph the A4 is the worst. Can't you read that lower is better?
 

Rvenger

Elite Member | Super Moderator | Video Cards
Apr 6, 2004
6,283
5
81
At what settings would you play the game with the HD7950 or your GTX680 ??


I want to play at 640x480 with my i7 so I can get 500fps. I want to hear my FPS!




If the video card ain't squealing like a pig, it ain't doin' its job.
 

SithSolo1

Diamond Member
Mar 19, 2001
7,740
11
81
I know it's been said time and time again, but the i3-3220/i3-3225 and FX-6300 make wonderful budget-build cornerstones. Online, CPU+mobo combos will only run you $190-220.
Those living near an MC can have them with a decent mobo for as little as $150 out the door.

Each has its pros and cons but neither is a bad buy at the price point.
 
Aug 11, 2008
10,451
642
126
18xMSAA is really unusual, but on that graph the A4 is the worst. Can't you read that lower is better?

I said later that I linked to the wrong graph. I should have linked to the graph for Far Cry 3.
The times for the A4 are 1.3, 1.4, 4.0
The 8350's are 2.2, 3.0, 5.9
The 2500K's are 2.5, 3.4, 6.9

So in that game, the latency is somehow lower for an A4 than for two much more powerful modern chips. And yes, I can read. Obviously lower is better; I just made a mistake in the graph I was referring to.
 
Aug 11, 2008
10,451
642
126
Link: http://www.tomshardware.com/reviews/gaming-processor-frame-rate-performance,3427.html

Interesting article over at Tom's on CPU bottlenecking, testing Intel dual and quad cores, and AMD 3, 4, 6, and 8 cores.

Tom's has adopted a frame latency benchmarking technique similar to one pioneered by TechReport (Edit: but not the same - it's testing consistency in frame time for consecutive frames - thanks to ThePeasant for noting that). And the big surprise - AMD CPUs are beating Intel pretty badly in regard to frame consistency, even while losing in frames per second, as shown below:



The conclusion is that the i3-2120 is faster than AMD's best chip, the FX8350, but the 8350 beats it in all the frame latency testing. So which is the real winner? Note that the frame latency testing can't be summarized in one nice graph, so you'll have to look at the article to see it game by game, but here's an example:






Looks like this could lead to some discussion...proceed, ladies and gentlemen.

The article also said that the latencies were very low, "almost irrelevant", until you get down to the level of the dual-core Intels. And in most other games, the latencies were even lower, not to mention that there was no consistent pattern from game to game of which chips had the lowest latency.
 

ThePeasant

Member
May 20, 2011
36
0
0
I actually think this methodology is superior to TR's. If you want the average frame latency, then that's just the inverse of the average frame rate, so nothing lost there. Showing the stats for the deltas between successive frames does more to communicate information about "jerkiness" than TR's methodology, which may still end up giving a bad score to a very smooth experience.

For example, many low-latency frames followed by a steady increase to higher latency frames, and then many high latency frames will get a bad score from TR, even though the experience will be smooth. TH would give this scenario a good score, because the frame latency would change gradually, which is the very definition of smoothness.

Unless I've misinterpreted what "the average time difference between consecutive frames" means, I think their method serves as a poor proxy for the variability in frame times and I'll try and show you why:

Consider a series of frames with the first frame taking 13 ms to render and each subsequent frame taking 1 ms more than the one before it to render. In one second about 34 frames would be rendered and the last frame in that second would take 46 ms to render. Now imagine this pattern repeats every second. Using tom's method, the limit of the average difference between consecutive frames as the number of frames tends to infinity would be ~1.94 ms with a 75th and 95th percentile of 1 ms. But with an average frame latency of 29.5 ms and latencies moving between a minimum and maximum of 13 ms (~77 fps) and 46 ms (~22 fps) respectively each second, we can see that Tom's method would not pick up these large and regular dispersions among the frame latencies.
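A quick sketch (my own code, not Tom's) of the sawtooth pattern above, showing how small the consecutive-frame deltas look despite the large swing in actual frame times:

```python
# Frame times ramp from 13 ms to 46 ms and then snap back, repeating
# every ~second (34 frames, ~1003 ms per cycle).
cycle = list(range(13, 47))
n = len(cycle)  # 34 frames

# Tom's statistic: average difference between consecutive frames
# (wrapping 46 -> 13 to model the repeating pattern).
deltas = [abs(cycle[(i + 1) % n] - cycle[i]) for i in range(n)]
avg_delta = sum(deltas) / n      # ~1.94 ms: looks extremely "consistent"

# Plain frame-time statistics tell a very different story.
avg_frame = sum(cycle) / n       # 29.5 ms average (~34 fps)
worst = max(cycle)               # a 46 ms (~22 fps) frame every second

print(round(avg_delta, 2), avg_frame, worst)  # 1.94 29.5 46
```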

Replace this pattern with any pattern having relatively 'slow' transitions (a gradual transition over a relatively large number of frames) between 'good' frame times and 'bad' ones (even within a second), and this method gets totally bypassed.

In the real world, one might be tempted to say that games rarely exhibit 'slow' transitions, and that there is some correlation between the consistency of frames and the 'slowness' (which is what is really being measured). That may very well be a valid assumption most of the time, but without any kind of background basis or empirical evidence to suggest and quantify it, their methodology seems somewhat arbitrary and ill-suited to me.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Ryan Smith are you listening? If even Tom's is ahead of your testing methodology, I don't know if Anandtech can be considered a top-tier GPU review site anymore.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
The correct way to do this kind of testing is to view clips with KNOWN latencies between frames, or periodic latencies between frames...

Say like 6ms, 6ms, 6ms, X ms, 6ms, 6ms, 6ms

where X is 10, 15, 20, etc...

Each person will have some threshold below which they can't perceive a difference. That threshold will be a little different for everyone, but there's going to be a certain level of performance at which nobody can perceive any difference. Survey 10-15 people or whatever with a blind test where there are 5 or so rates, each sampled 3 or 4 times in random order. Set your threshold at the minimum level at which any of those people can repeatedly identify the difference.

My guess is that threshold will be WELL over the numbers the Tom's article is seeing at 95th percentile in most of these games. I would think it would be something like 98th+ percentile, but the actual percentile would vary by CPU, because that threshold of perception is going to be fairly consistent across games, and anything that is rendered below that threshold should be considered "perfectly" fast. As in fast enough that you cannot tell if it gets faster.

The real benchmark you want is the % of perfectly rendered frames.
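Concillian's "% of perfectly rendered frames" metric could be computed like this (a sketch; the 20 ms default threshold is a placeholder for whatever value the blind test he describes would settle on):

```python
def percent_perfect(frame_times_ms, threshold_ms=20.0):
    """Share of frames rendered at or below the perception threshold."""
    good = sum(1 for t in frame_times_ms if t <= threshold_ms)
    return 100.0 * good / len(frame_times_ms)

# One 35 ms spike among otherwise smooth frames:
sample = [16, 17, 16, 35, 16, 18, 16, 17]
print(percent_perfect(sample))  # 87.5
```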
 
Last edited:
Aug 11, 2008
10,451
642
126
The correct way to do this kind of testing is to view clips with KNOWN latencies between frames, or periodic latencies between frames...

Say like 6ms, 6ms, 6ms, X ms, 6ms, 6ms, 6ms

where X is 10, 15, 20, etc...

Each person will have some threshold below which they can't perceive a difference. That threshold will be a little different for everyone, but there's going to be a certain level of performance at which nobody can perceive any difference. Survey 10-15 people or whatever with a blind test where there are 5 or so rates sampled 3 or 4 times in random order. set your threshold at the minimum level that any of those people can repeatedly identify the difference.

My guess is that threshold will be WELL over the numbers the Tom's article is seeing at 95th percentile in most of these games. I would think it would be something like 98th+ percentile, but the actual percentile would vary by CPU, because that threshold of perception is going to be fairly consistent across games, and anything that is rendered below that threshold should be considered "perfectly" fast. As in fast enough that you cannot tell if it gets faster.

The real benchmark you want is the % of perfectly rendered frames.

Good point. I also wonder how the refresh rate of the monitor would affect this. If you had a 120 Hz monitor vs a 60 Hz one, would that affect the level at which you could perceive a difference?
 

Rezist

Senior member
Jun 20, 2009
726
0
71
It looks like we're going to need high-speed cameras on all our CPU/GPU reviews going forward, and I'm all for it.
 

ThePeasant

Member
May 20, 2011
36
0
0
The correct way to do this kind of testing is to view clips with KNOWN latencies between frames, or periodic latencies between frames...

Say like 6ms, 6ms, 6ms, X ms, 6ms, 6ms, 6ms

where X is 10, 15, 20, etc...

That still makes no sense; using the differences between successive frames does not give you a meaningful and consistent picture of variability. You could satisfy the first few of the above differences with these absolute frame times, for example:

10 ms/100 fps, 16 ms/62.5 fps, 22 ms/~45 fps, 28 ms/~36 fps, ...

And you can see that in the span of 76 ms you have gone from 100 to 36 fps (frame times have more than doubled). This highlights my whole point: this is pretty much a measure of the rate of change of frame time with respect to the number of frames, not of consistency or dispersion in delivery times. Depending on the data, they may correlate, but it is fundamentally measuring something related to, but different from, variability.
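The counter-example above in code form (my own sketch): every consecutive delta is exactly 6 ms, yet the frame times themselves more than double in well under a tenth of a second:

```python
# Constant 6 ms *differences* between successive frames...
times = [10]
for _ in range(3):
    times.append(times[-1] + 6)

print(times)       # [10, 16, 22, 28] -> 100, 62.5, ~45, ~36 fps
print(sum(times))  # 76 ms elapsed while fps fell from 100 to ~36
```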
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
That still makes no sense; using the differences between successive frames does not give you a meaningful and consistent picture of variability.

That's good that it doesn't give a picture of variability, because that's not what I'd be trying to determine.

I'm assuming in that case that 6ms is "smooth" to everyone. I think it's likely to be, as that translates to over 150 FPS. The "odd frame" every 5th or 10th frame or whatever would be trying to test people's perception of single high latency frames. You wouldn't have X of different values in the same file, you'd use the same X for a looped animation and ask them if it's smooth or not. Then you'd have 5 or so different X values and repeat those files 3 or 4 times (without telling the subject) so you'd have 20 tests and some way of estimating confidence through repetition of the different levels of latency.

This kind of thing is EXACTLY what happens in a game where ~99% of the frames are smooth, but you see a few high-latency frames, and that becomes noticeable. It also simulates the microstutter seen in some SLI/Crossfire setups. It tests what level of this microstutter or of high-latency frames the person is sensitive to. I'm pretty sure there will be some latency threshold below which a subject considers all frames smooth. Pinning down that level of latency for a dozen or two people, then using the minimum value, should give you a decent threshold level to use for a review.

I do not believe that variability is an issue. My hypothesis is that variation below the threshold value is meaningless, and should be completely ignored. My hypothesis is that all that matters is the threshold value. Any frames above the threshold latency are perceived as not smooth, and any frames below that threshold latency are perceived as smooth. Realistically, there is probably some 'variation' in the threshold value for any one subject from things like color contrast between moving and motionless objects, but you can't account for everything. A light on dark or dark on light motion sample should be worst case.
 
Last edited:

Idontcare

Elite Member
Oct 10, 1999
21,118
59
91
I do not believe that variability is an issue. My hypothesis is that variation below the threshold value is meaningless, and should be completely ignored. My hypothesis is that all that matters is the threshold value. Any frames above the threshold latency are perceived as not smooth, and any frames below that threshold latency are perceived as smooth. Realistically, there is probably some 'variation' in the threshold value for any one subject from things like color contrast between moving and motionless objects, but you can't account for everything. A light on dark or dark on light motion sample should be worst case.

This is true, and we know it to be true because of the LCD screens we are (mostly) all staring at. The switching time between various shades of colors within any given pixel either happens fast enough that we don't detect it (ever) or it happens slow enough that we see it and don't like it (ghosting).

I don't know how fast my LCD switches from being a yellow pixel to being a red pixel, surely it is slower than switching from a white pixel to a black pixel, but I do know it happens fast enough for me to not be impacted by however fast or slow it makes the transition.

And that is all we are after in gaming. Throw 3000 frames a second at my eyeballs for all I care; I won't care how fast or slow the gaps are between any of those frames, provided the longest gap between frames doesn't exceed my eyes' threshold of detectability.

Find that number (a distribution across the population of humans, to be sure), and statistically analyze frame rate latency and variability from a physiological basis, the same way people analyze SSDs vs HDDs and 50Hz LCDs vs 120Hz LCDs.
 

bononos

Diamond Member
Aug 21, 2011
3,894
162
106
I do not believe that variability is an issue. My hypothesis is that variation below the threshold value is meaningless, and should be completely ignored. My hypothesis is that all that matters is the threshold value. Any frames above the threshold latency are perceived as not smooth, and any frames below that threshold latency are perceived as smooth. Realistically, there is probably some 'variation' in the threshold value for any one subject from things like color contrast between moving and motionless objects, but you can't account for everything. A light on dark or dark on light motion sample should be worst case.
Agreed. The THW report is making a similar mistake to simple FPS benchmarks: it's simply averaging the frame latencies and not keeping score of the possible blips in time where glitches occur, the way TR does here.
 

ThePeasant

Member
May 20, 2011
36
0
0
That's good that it doesn't give a picture of variability, because that's not what I'd be trying to determine.

I'm assuming in that case that 6ms is "smooth" to everyone. I think it's likely to be, as that translates to over 150 FPS. The "odd frame" every 5th or 10th frame or whatever would be trying to test people's perception of single high latency frames. You wouldn't have X of different values in the same file, you'd use the same X for a looped animation and ask them if it's smooth or not. Then you'd have 5 or so different X values and repeat those files 3 or 4 times (without telling the subject) so you'd have 20 tests and some way of estimating confidence through repetition of the different levels of latency.

The thing is, you are now using 6 ms in a way that is different from what you stated. 6 ms would be the actual frame time, equivalent to ~167 fps, but a 6 ms difference between successive frames, which is what you implied when you said
..with KNOWN latencies between frames, or periodic latencies between frames..
would mean that the next frame takes 6 ms more or 6 ms less than the one before it (which is consistent with how I interpret Tom's method). That tells you nothing about the actual latencies of the frames, or anything direct about their consistency.

I do not believe that variability is an issue. My hypothesis is that variation below the threshold value is meaningless, and should be completely ignored. My hypothesis is that all that matters is the threshold value. Any frames above the threshold latency are perceived as not smooth, and any frames below that threshold latency are perceived as smooth. Realistically, there is probably some 'variation' in the threshold value for any one subject from things like color contrast between moving and motionless objects, but you can't account for everything. A light on dark or dark on light motion sample should be worst case.

The point of all of these 'new' techniques is exactly to measure variability, and it is an issue, especially since not every game on every setup can render at high enough minimum frame rates. And of course it is relevant, because they are attempting to quantify the differences in performance of the cards even if we are unable to perceive the differences. Micro-stuttering is characterized by the uneven delivery of frames and can happen even with very high average fps.

I agree there are limits to our perception, and there should be a threshold below which no human could discern fluctuations in frame times; this is what TR's 99th percentile and "time spent beyond x ms" attempt to address. The 99th percentile tells you the worst-case times for 99% of your frames, and perhaps you find that once 99% of your frames are at or below 20 ms, you can't really tell whether they are consistent or not. That, together with time spent over 20 ms, could help tell you how disruptive those 1% of frames are. Based on how I interpret Tom's method, you cannot draw the same conclusions from the average, 75th, and 95th percentile. Actually, I'm not sure what kind of relevant conclusions could be drawn from those statistics.
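For comparison, the two TR-style statistics mentioned here are easy to sketch. This is my own illustration, not TR's actual code; the nearest-rank percentile convention and the 20 ms cutoff are assumptions:

```python
def percentile(values, p):
    """Nearest-rank percentile (one common convention)."""
    s = sorted(values)
    k = max(0, int(round(p / 100.0 * len(s))) - 1)
    return s[k]

def time_beyond(frame_times_ms, cutoff_ms):
    """Total milliseconds accumulated past the cutoff ('time spent beyond x')."""
    return sum(t - cutoff_ms for t in frame_times_ms if t > cutoff_ms)

frames = [15] * 99 + [40]      # 99 smooth frames plus one 40 ms spike
print(percentile(frames, 99))  # 15: 99% of frames are 15 ms or faster
print(time_beyond(frames, 20)) # 20: the lone spike spent 20 ms over the cutoff
```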
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
Yes, at those settings

Exactly, that is the problem. Who would play at those settings, with a 30fps minimum and a 44fps average? That is way too low for a first-person shooter, at least for me. And I think I'm not alone in this opinion. One has to ask whether these settings make sense for the ambitious gamer. In my opinion, they do not. The measure by which CPUs and GPUs should be judged is not usage but fps. If the 7950@1GHz cannot provide sufficient fps, turn the settings down; it really is that simple sometimes. Creating an artificial GPU bottleneck at unplayable or barely playable settings is nonsense.

I want to play at 640x480 with my i7 so I can get 500fps. I want to hear my FPS!

If the video card ain't squealing like a pig, it ain't doin' its job.

How about at least 50fps sustained? Better 60fps, at least most of the time.

And no, your second statement is completely wrong. It's ultimately about fps, not GPU usage (or CPU usage for that matter).
 
Last edited: