There's been a thread or two over at the VC&G subforum about pclab's collection of gaming benchmarks comparing graphics cards over 3-4 years... But I decided to look at that article from another perspective: the relevance of CPU overclocking and HyperThreading in current games, using current graphics cards.
Typically, this question is answered with average framerates in average situations, or even in GPU-heavy prerendered benchmarks. The typical conclusion I see is something along the lines of "overclocking is just a hobby, you don't really need it" and "hyperthreading is basically not useful unless you're buying it for productivity, you'll be fine with an i5". I regularly draw conclusions like this myself, but how true are they really? Since gamers these days are pretty demanding when it comes to motion fluidity and visual immersion, expecting a smooth 60 fps or even well above that on a 144Hz monitor, I figured performance in worst case scenarios could be used to approximate the quality of the subjective gaming experience. If your framerate dips below subjectively acceptable thresholds too often with a midrange CPU, it won't really matter that your average framerate is three frames away from that of someone running the same GPU with a top end overclocked CPU.
A section in the article (pages 11 and 12) compared AMD's and NVIDIA's CPU overheads. The comparison was set up roughly like this (Google translation):
pclab.pl said:
All the graphics card tests were done on a platform with a very fast processor, a Core i7-6700K overclocked to 4.7 GHz. This is undoubtedly a good idea when testing the fastest GPUs, but few people combine that kind of CPU with, say, a middle-class card. [...]
For this reason, we performed additional tests with the Radeon R9 390 and GeForce GTX 970 (both cards factory overclocked), using weaker processors: a Core i5-4690K @ 4.5 GHz (four cores, four threads), a Core i5-4570 (four cores, four threads, 3.2 GHz + Turbo) and AMD's FX-9590 (four modules, eight threads, 4.7 GHz + Turbo). [...]
[W]e performed additional tests with the Radeon R9 Fury X and GeForce GTX 980 Ti (the first manually tweaked, the second a reference model), using the weaker processors: a Core i5-6600K @ 4.5 GHz and a Core i5-4570 (3.2 GHz + Turbo) [...]
We chose four games we consider the most important, and in them picked both places with fast-paced action and many opponents, where the processor matters a great deal, and places where the player's interaction with the virtual world is limited to exploring a pre-built scene.
Basically, what they did was deliberately pick the worst case situations where you would expect the most work for the CPU.
Since this is the CPUs subforum, let's not make it AMD vs NVIDIA. Let's just compare the overclocked Intel CPUs to the stock i5-4570. I collected all the average FPS data into a single spreadsheet for a nice overview (you can check the minimum FPS numbers in the article; they're not far below the averages):
Avg FPS @ 1080p
FPS numbers alone aren't that interesting, so let's convert the whole thing to:
Fractions of i5-4570 performance
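For clarity, this is all the conversion amounts to; a minimal Python sketch with made-up FPS numbers (the real per-game averages are in the article and the spreadsheet above):

```python
# Normalizing average FPS against the stock i5-4570 baseline.
# The numbers here are hypothetical placeholders, not the article's data.
avg_fps = {
    "i5-4570 stock":      62.0,  # baseline
    "i5-6600K @ 4.5 GHz": 71.0,
    "i7-6700K @ 4.7 GHz": 78.0,
}

baseline = avg_fps["i5-4570 stock"]
for cpu, fps in avg_fps.items():
    # e.g. 1.15 means 15% faster than the stock i5-4570
    print(f"{cpu}: {fps / baseline:.2f}")
```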
In terms of raw CPU performance, an i5 @ 4.5 GHz is roughly 30-35% faster than the stock i5-4570 (a quick sanity check on that figure below), and the potential benefit from HT can be up to 30-40% in heavily threaded apps. How much of this is actually visible in demanding gaming situations?
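The sanity check on that clock-speed headroom; the i5-4570's ~3.4 GHz all-core Turbo is my assumption here, not a figure from the article:

```python
# Theoretical clock advantage of an i5 @ 4.5 GHz over a stock i5-4570.
# 3.4 GHz all-core Turbo for the i5-4570 is an assumption on my part.
print(f"{4.5 / 3.4 - 1:.0%}")  # ~32%, in line with the 30-35% estimate
```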
With the GTX 970 the differences are small and not subjectively significant. With the R9 390 and faster GPUs, you start to actually see the effects of an overclocked i5 and i7. But on average, the benefit is far less than you'd expect from a pure CPU bottleneck.
If we focus on the worst cases of these worst case scenarios (which I've defined as any scenario where the i7-6700K OC gives more than 5% better performance than the i5-4570 on a GTX 970), the differences become more visible, with performance scaling basically linearly with i5-6600K clock speed when using a Fury X or 980 Ti:
Fractions of i5-4570 performance - worst cases
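In case that selection rule isn't clear, here's the filter I applied, sketched in Python with hypothetical per-scenario numbers:

```python
# "Worst of the worst" selection: keep only scenarios where the i7-6700K OC
# beats the stock i5-4570 by more than 5% on a GTX 970.
# The FPS values below are hypothetical placeholders, not the article's data.
scenarios = {
    # scenario: (i5-4570 avg FPS, i7-6700K OC avg FPS), both on a GTX 970
    "Game A, big firefight": (55.0, 61.0),
    "Game B, quiet exploration": (70.0, 72.0),
}

worst_cases = [
    name for name, (i5_fps, i7_fps) in scenarios.items()
    if i7_fps / i5_fps > 1.05  # more than 5% ahead of the stock i5
]
print(worst_cases)  # ['Game A, big firefight']
```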
Finally, let's compare the i7 to the i5:
i7 OC vs i5 OC - worst case scenarios only
The benefit of the i7 over the i5 is 8-21% in worst case scenarios, depending on the GPU. Some of that benefit is explained by the 200 MHz higher overclock, and some by the generational advantage of Skylake over Haswell, which leaves HyperThreading more or less irrelevant even with a Fury X.
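To put very rough numbers on that decomposition: the clock figure comes straight from the test setup, but the Skylake-over-Haswell IPC gain is my own ballpark assumption, not something the article measured:

```python
# Rough decomposition of the i7-6700K @ 4.7 GHz advantage over an i5 @ 4.5 GHz.
clock_gain = 4.7 / 4.5 - 1  # ~4.4% from the 200 MHz higher overclock

# Against the Haswell i5-4690K, a Skylake IPC advantage also applies;
# the 5-10% range is my own assumption, not a number from the article.
ipc_low, ipc_high = 0.05, 0.10
explained_low = (1 + clock_gain) * (1 + ipc_low) - 1    # ~9.7%
explained_high = (1 + clock_gain) * (1 + ipc_high) - 1  # ~14.9%

print(f"clock alone: {clock_gain:.1%}")
print(f"clock + IPC: {explained_low:.1%} to {explained_high:.1%}")
# Against the observed 8-21% spread, that leaves relatively little
# that has to come from HyperThreading itself.
```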
I'll leave the conclusions to be drawn by you guys ... so, discuss.