Are 4C/4T quads obsolete for gaming?


tamz_msc

Diamond Member
Jan 5, 2017
So just because the delta in the future might not be as large as the delta now in 720p tests, I should buy the slower CPU in games for the same money?! Do you really think that's logical?! Since when is lower performance better?
It doesn't matter whether you buy the slower or faster CPU. I've absolutely no interest in going into that discussion. What I'm actually interested in is the validity of using 720p gaming benchmarks to justify the claim that a slower CPU might hold back a fast GPU in the future. I'm not arguing that a slower CPU wouldn't hold it back, because it will.

It's a bit like clocks - suppose you and I each have a clock, but yours is 30 min ahead of mine. It doesn't matter that it is consistently 30 min ahead of mine, because we can agree on the actual time if we have a third reliable reference - an atomic clock. But suppose that in 10 years your clock is only 10 min ahead, and checking that other reference shows that while my clock has stayed consistent, yours has continuously slowed down over the 10 years. In that case, one would have no option but to conclude that your clock is showing the wrong time.

My argument on 720p benchmarks is along a similar line.
 

BSim500

Golden Member
Jun 5, 2013
"The 'future GPU being handicapped by slower CPU theory because 720p gaming says so' advocates miss the point how poorly 720p results are reflected in the real way how people play games."

"Prove that 720p CPU gaming benchmarks are indicative of how well a CPU can keep up with a faster GPU in the future. Your entire argument is based on the usefulness of 720p gaming benchmarks."

Whether you plan to run at 720p or not is irrelevant. The 720p charts still show the CPU headroom as an important metric by itself when overclocking / using different settings:-
720P (no bottleneck) - https://www.techpowerup.com/reviews/Intel/Core_i5_8400/11.html
1080P (GPU bottlenecked) - https://www.techpowerup.com/reviews/Intel/Core_i5_8400/12.html

Picking several games (not just GTA V) which highlight this effect (and ignoring stuff like 200fps engine caps in Doom). NB: i3-8100 = i5-7500. Same chip, same cores, same 4C load 3.6GHz clocks, same architecture (and same price as R3 1300X):-

BF1:-
R3 1300X = 132.4fps (1080p) -> 138.7fps (720p) = 4.7% headroom
i3-8100 = 147.1fps (1080p) -> 207.2fps (720p) = 40.8% headroom
R5-1600X = 142.5fps (1080p) -> 184.1fps (720p) = 29% headroom
i5-8400 = 147.7fps (1080p) -> 231.0fps (720p) = 56.4% headroom

Deus Ex: Mankind Divided:-
R3 1300X = 79.3fps (1080p) -> 86.2fps (720p) = 8.7% headroom
i3-8100 = 81.8fps (1080p) -> 119.9fps (720p) = 46.5% headroom
R5-1600X = 81.4fps (1080p) -> 101.3fps (720p) = 24.4% headroom
i5-8400 = 81.7fps (1080p) -> 126fps (720p) = 54.2% headroom

Dishonored 2:-
R3 1300X = 73.5fps (1080p) -> 76.7fps (720p) = 4.3% headroom
i3-8100 = 91.4fps (1080p) -> 94.7fps (720p) = 3.6% headroom
R5-1600X = 87.2fps (1080p) -> 89.2fps (720p) = 2.3% headroom
i5-8400 = 98.8fps (1080p) -> 103.4fps (720p) = 4.6% headroom

Far Cry Primal:-
R3 1300X = 97.0fps (1080p) -> 99.1fps (720p) = 2.1% headroom
i3-8100 = 107.6fps (1080p) -> 121.3fps (720p) = 12.7% headroom
R5-1600X = 99.2fps (1080p) -> 100.4fps (720p) = 1% headroom
i5-8400 = 111.4fps (1080p) -> 138.3fps (720p) = 24.1% headroom

Rise of the Tomb Raider:-
R3 1300X = 109.0fps (1080p) -> 108.6fps (720p) = 0% headroom
i3-8100 = 126.5 fps (1080p) -> 198.4fps (720p) = 56.8% headroom
R5-1600X = 114.4fps (1080p) -> 116.2fps (720p) = 1.5% headroom
i5-8400 = 126.3fps (1080p) -> 207.2fps (720p) = 64.1% headroom

^ This headroom can be used for either: 1. Buying a new GPU without needing to swap CPU's, 2. OCing your current GPU (which benchies don't show), or 3. Lowering some shader / post-processing settings that often disproportionately impact the GPU more than the CPU (which benchies also don't show) - eg, using SMAA vs 4x MSAA or disabling blurs can easily gain +20-30% IF the CPU has the headroom to feed additional frames over and above what Ultra benchmarks show. Eg, for Far Cry Primal, the i3-8100 (i5-7500) with its 12.7% headroom (1080p vs 720p) has enough spare horsepower to gain up to another +12% "free" fps if you OC'd the GPU +12% or tweaked the settings. CPU's with 1-2% headroom, however, may simply see a +1% gain and then quickly bottleneck (ie, the GPU % usage figure starts to drop whilst the fps stays the same because the CPU is already maxed out - which is exactly what's being measured when the fps doesn't go up much at the lower resolution).
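For anyone who wants to sanity-check those percentages, here's a minimal Python sketch of the arithmetic (my own, not TPU's), using the BF1 numbers quoted above:

```python
# Headroom = how much faster the CPU can feed frames once the GPU
# bottleneck is removed (720p) relative to the normal 1080p result.
def headroom(fps_1080p, fps_720p):
    """Extra CPU-limited frame rate, as a percentage over the 1080p figure."""
    return (fps_720p / fps_1080p - 1.0) * 100.0

# BF1 averages from the TechPowerUp i5-8400 review linked above.
bf1 = {
    "R3 1300X": (132.4, 138.7),
    "i3-8100":  (147.1, 207.2),
    "R5 1600X": (142.5, 184.1),
    "i5-8400":  (147.7, 231.0),
}

for cpu, (at_1080p, at_720p) in bf1.items():
    print(f"{cpu}: {headroom(at_1080p, at_720p):.1f}% headroom")
# Prints ~4.8%, ~40.9%, ~29.2% and ~56.4% - the figures above, give or take rounding.
```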

This isn't an AMD vs Intel thing, but rather highlighting why it's important to measure both headroom (720p) and normal (1080p) performance, not pick and choose when to use them according to whether it's "your team" or not. People already know this full well, because as soon as Ryzen reviews appeared and R3-1300X vs i3-7100 or R5-1500X vs i5-7500 YouTube vids went up, the comments sections filled with "OMG, look at Ryzen's lower CPU usage!" remarks (which measure exactly the same thing you're claiming "doesn't matter"!). One (1080p) tells you the basics of how a GPU will load up a CPU at a standard resolution & preset. The other (720p) tells you how much extra untapped horsepower the CPU has to feed additional frames if you tweak or OC. You commented on "real gamers", and yet as mentioned previously, many of us in the real world turn off a lot of cr*p individual "Pure Ultra" settings like Chromatic Aberration, stupidly overdone DoF / blurs, etc, that are computed mostly on the GPU's shaders, whilst others may be using High vs Ultra or their own custom mixture. The end result is lower GPU load than the 1080p Ultra charts show, and disproportionately higher fps - but only for CPU's that have that headroom. And the only way of measuring that in advance is to test at lower resolutions.

I don't know why you think this is new. Testing CPU's by not GPU bottlenecking them has been the industry standard for like 25 years of benchmarking going back to when 1024x768 4:3 CRT resolutions were the norm and people included 640x480 / 800x600 benchmarks to see by how much the CPU was being held back by the GPU, even though they had no intention of gaming like that. Whether you "agree" with it or not, it's extremely useful data to have for overclockers, people who "play" with the settings, people who play on High / Medium, etc.
 
Last edited:

tamz_msc

Diamond Member
Jan 5, 2017
Unless 720p benchmarks have actually been demonstrated to lend reasonable credence to this premise of yours, they will remain inconclusive.

Even when 1024x768 was the norm back in the CRT days in the early 2000s, people still gamed at 800x600, just like when the first 1080p displays were hitting the market 10-12 years back, people still gamed at lower resolutions like 1680x1050 or 1440x900. Hence including those resolutions made sense in the context of those days. Nowadays, when $100-120 GPUs coupled with an $80-100 CPU can give you a decent 1080p 60fps experience at medium-high detail settings, it takes some incredibly convoluted logic to insist upon focusing on 720p benchmarks.
 

PhonakV30

Senior member
Oct 26, 2009
BF1:-
R3 1300X = 132.4fps (1080p) -> 138.7fps (720p) = 4.7% headroom
i3-8100 = 147.1fps (1080p) -> 207.2fps (720p) = 40.8% headroom
R5-1600X = 142.5fps (1080p) -> 184.1fps (720p) = 29% headroom
i5-8400 = 147.7fps (1080p) -> 231.0fps (720p) = 56.4% headroom

Deus Ex: Mankind Divided:-
R3 1300X = 79.3fps (1080p) -> 86.2fps (720p) = 8.7% headroom
i3-8100 = 81.8fps (1080p) -> 119.9fps (720p) = 46.5% headroom
R5-1600X = 81.4fps (1080p) -> 101.3fps (720p) = 24.4% headroom
i5-8400 = 81.7fps (1080p) -> 126fps (720p) = 54.2% headroom

Dishonored 2:-
R3 1300X = 73.5fps (1080p) -> 76.7fps (720p) = 4.3% headroom
i3-8100 = 91.4fps (1080p) -> 94.7fps (720p) = 3.6% headroom
R5-1600X = 87.2fps (1080p) -> 89.2fps (720p) = 2.3% headroom
i5-8400 = 98.8fps (1080p) -> 103.4fps (720p) = 4.6% headroom

Far Cry Primal:-
R3 1300X = 97.0fps (1080p) -> 99.1fps (720p) = 2.1% headroom
i3-8100 = 107.6fps (1080p) -> 121.3fps (720p) = 12.7% headroom
R5-1600X = 99.2fps (1080p) -> 100.4fps (720p) = 1% headroom
i5-8400 = 111.4fps (1080p) -> 138.3fps (720p) = 24.1% headroom

Rise of the Tomb Raider:-
R3 1300X = 109.0fps (1080p) -> 108.6fps (720p) = 0% headroom
i3-8100 = 126.5 fps (1080p) -> 198.4fps (720p) = 56.8% headroom
R5-1600X = 114.4fps (1080p) -> 116.2fps (720p) = 1.5% headroom
i5-8400 = 126.3fps (1080p) -> 207.2fps (720p) = 64.1% headroom

Huge difference in RoTR / Far Cry - the Ryzen chips look like they gain nothing. Draw call bottleneck? 64% vs 1%. While in BF1 / Deus Ex there is a clear benefit.
 

BSim500

Golden Member
Jun 5, 2013
Unless 720p benchmarks have actually been demonstrated to lend reasonable credence to this premise of yours, they will remain inconclusive.
They have. That's why people have been using them for years. There are plenty of examples from previous generations too, eg, +41% overhead for the i5-4690K vs only 11% for the FX-8350 in Deus Ex: Human Revolution (720p vs 1080p). As time went on and new GPU's came out, games became more and more CPU bottlenecked on the FX, until by the time we reached OC'd Maxwell / Pascal on newer games it was losing to the i3's half the time. Assuming you read my post, as I mentioned it's not just about "future GPU's" but includes stuff today like overclocking a GPU, lowering shader settings, etc. I'm honestly not sure what you're struggling to understand about removing one component from being a bottleneck being a positive when trying to benchmark a different component.
 
Reactions: OTG

BSim500

Golden Member
Jun 5, 2013
To get back on topic, just because AMD and Intel have new 6-cores out doesn't mean games instantly become super-threaded or 4C's obsolete. That's typically a longer-term trend, not an overnight thing. Looking at the Steam HW Survey, "19/20" is the visible pattern, ie, 19/20 own 2-4 core CPU's, 19/20 own GTX 1060's or less (970 / 960 / 1050 / 750, etc), 19/20 use 1080p/60. Over time those numbers will go up, but that's going to take literally years. Looking at the new Coffee Lake vs Ryzen 6C vs older 4C i5-7600K benchmarks, it's pretty clear that for the average 19 out of 20 gamers with a 1080p/60fps target, a 4C/4T i5-7500 / 7600 / i3-8100 is enough for 60fps capped, and aside from the usual few cr*ppy ports, the biggest 6C gain is for 120Hz gamers. What's really likely to change over the next year for budget AAA gamers is the 2C/4T chips. The previously recommended G4560 has already been replaced by the R3 1300X, which in turn will probably be replaced again by the i3-8100 when the budget H310/B360 mboards come out.

Meanwhile, AAA games continue to have 80-90% of their sales come from consoles, not PC's. The golden age of PC exclusives having zero cross-platform constraints is long over, and the main advantage of PC hardware vs console is higher resolutions and 120-144Hz monitors, not actually making "smarter" PC-optimized games. Even if AMD / Intel released a 1024C/2048T CPU tomorrow, we still wouldn't see half the things people are fantasizing about like ultra-real unique AI, lifelike personalities, non-reused motion-cap animations, etc, for each individual of thousands of NPC's in AC: Unity-sized crowds, because 1. It would grind the consoles' 8x Jaguar cores (barely i3-4170 class horsepower) to a halt (and without the consoles the game wouldn't be financially viable to make), and 2. Anything that triples or quadruples the game development time / budget from 2-4 years to 6-10 years in the making, or means hiring hundreds of extra devs, just isn't going to happen. The ultimate bottleneck for actually making better games (beyond just churning out more fps for higher-res / 144Hz monitors) is still basically console CPU's and publisher-allocated time / budget constraints.
 

IRobot23

Senior member
Jul 3, 2017
@BSim500

stop showing TPU benchmarks.

Okay, here is a different benchmark.


This graph shows that the i5, FX, R3, even the i3 (Pentium) at stock are completely capable. But when you watch actual gameplay, people report frame drops (stutters).

The death of quad cores/threads will come when you need more than 4 threads to even start the game. Can you do 144Hz on a quad core? Yes you can, if you OC it well. Even then you will have problems with dips to 80fps or lower.

But why would anyone buy an i3-8350K and push it to 5GHz (delid, water cooler, etc.) when you can have an i5 8400 or R5 1600 at 65W TDP, which will simply destroy it in new games? Yes, you need fast RAM.
 
Last edited:

tamz_msc

Diamond Member
Jan 5, 2017
They have. That's why people have been using them for years. There are plenty of examples from previous generations too, eg, +41% overhead for the i5-4690K vs only 11% for the FX-8350 in Deus Ex: Human Revolution (720p vs 1080p). As time went on and new GPU's came out, games became more and more CPU bottlenecked on the FX, until by the time we reached OC'd Maxwell / Pascal on newer games it was losing to the i3's half the time. Assuming you read my post, as I mentioned it's not just about "future GPU's" but includes stuff today like overclocking a GPU, lowering shader settings, etc. I'm honestly not sure what you're struggling to understand about removing one component from being a bottleneck being a positive when trying to benchmark a different component.
That doesn't even answer my question, because my question wasn't about bottlenecks. Those results are for 720p and 1080p on the same GPU. What I'm asking is this: test a comparatively weaker GPU at 720p against a more powerful GPU at 1080p. Under these circumstances, what is the difference in performance between two CPUs at 720p (call this difference X) and at 1080p (call this difference Y)? My question is simple - is X = Y, or if X != Y, then what is the actual difference between X and Y? If X = 30%, then what is Y? Again, I'm not asking for an explanation of any difference in the results due to one CPU being faster than the other. I'm asking for hard data.

Unless a proper test like the one I described in my previous post is carried out, comparing 720p and 1080p results using the same GPU is completely meaningless for gauging future potential performance.
 

BSim500

Golden Member
Jun 5, 2013
That doesn't even answer my question, because my question wasn't about bottlenecks. Those results are for 720p and 1080p on the same GPU. What I'm asking is this: test a comparatively weaker GPU at 720p against a more powerful GPU at 1080p.

720p vs 1080p IS all about removing the GPU bottleneck. That's the sole reason for doing it! If your question about 720p benchmarks isn't about GPU bottlenecking, then you're asking the wrong question. It's a CPU test; this is a CPU thread in a CPU forum. You're in the wrong forum if you want to start benchmarking different GPU's against each other while doing nothing but changing the resolution. If a GTX 1050 Ti got 60fps at 720p and a GTX 1060 got 60fps at 1080p, that's completely and utterly irrelevant to testing different CPU's on the same GPU.

stop showing TPU benchmarks.
You already posted that once, and my answer is simply that I'll show what's relevant to the same spammed "But I don't game at 720p and don't understand why they exist, so they must be wrong" argument repeated over and over in multiple threads, even after like 20 other people have now replied with exactly the same answer. In the context of this "4C for future gaming" thread, those benchmarks (sites other than TPU were posted too) showing, eg, an i5-7600K getting large gains at 720p over 1080p on commonly owned GPU's absolutely highlight in which games there's a lot of future "extension": if avg fps dips below 60fps in next year's games due to a GPU bottleneck which is subsequently removed via a GPU upgrade, the CPU still has enough overhead to feed 60fps and therefore continues to be "good enough for gaming" (the whole point of this topic). Just because you don't understand what people are benchmarking / looking for in different benchmarks doesn't make the benchmarks "wrong".
 
Last edited:
Reactions: Yuriman

whm1974

Diamond Member
Jul 24, 2016
I think what is going to happen is that all of the budget gamers who build systems using 2C/4T CPUs will move on to the Coffee Lake (and newer) i3's or the Ryzen 3 1200. So quad-core CPUs will be very far from obsolete for mainstream gaming for a long time to come.
 

IRobot23

Senior member
Jul 3, 2017
You already posted that once, and my answer is simply that I'll show what's relevant to the same spammed "But I don't game at 720p and don't understand why they exist, so they must be wrong" argument repeated over and over in multiple threads, even after like 20 other people have now replied with exactly the same answer.

Nope, I think TPU is not credible. That's why. My own opinion.
As I said, Ryzen is capable of pushing 144 FPS even in demanding titles, even at 3.9GHz.

Ryzen at 5GHz, even with fewer cores, would be faster. We will probably see 7nm Ryzen at 5GHz+.
 
Last edited:

tamz_msc

Diamond Member
Jan 5, 2017
720p vs 1080p IS all about removing the GPU bottleneck. That's the sole reason for doing it! If your question about 720p benchmarks isn't about GPU bottlenecking, then you're asking the wrong question. It's a CPU test, this is a CPU thread in a CPU forum. You're in the wrong forum if you want to start benchmarking different GPU's against each other doing nothing but changing resolution. If a GTX 1050Ti got 60fps at 720p, and a GTX 1060 got 60fps at 1080p, that's completely and utterly irrelevant to CPU testing of different CPU's on the same GPU's.
No matter how many times you post 720p benchmarks, you cannot say that they indicate how dependent a CPU will be on future GPU performance when more powerful GPUs become available.

720p benchmarks are plain rubbish unless you actually do a test that directly demonstrates the claim that CPU 1, which gives lower FPS than CPU 2 (on a percentage basis) using a certain GPU at 720p, will also give the same lower FPS on a percentage basis with a much more powerful GPU at more realistic settings like 1080p.

After all, isn't this your primary claim? Put your money where your mouth is, do this test, and see what the results are. Till then, no matter where you source 720p data from and how much you cling to it, you haven't proved anything.

One cannot make things any clearer than this.
 

BSim500

Golden Member
Jun 5, 2013
720p benchmarks are plain rubbish unless you actually do a test that directly demonstrates the claim that CPU 1, which gives lower FPS than CPU 2 (on a percentage basis) using a certain GPU at 720p, will also give the same lower FPS on a percentage basis with a much more powerful GPU at more realistic settings like 1080p.
I already have, and you refuse to look at them simply because you do not want to. If two CPU's are GPU bottlenecked, but upgrading to the next tier of GPU (+50% faster) gives one CPU a +40% boost and the other a +10% boost (both due to CPU bottlenecks of differing degrees), then you'll generally see the +40% one get higher fps at 720p. It doesn't have to be exactly +40.0000%, with anything below +39.9999% or above +40.0001% being "invalid" - that's the general effect. This isn't even up for debate beyond trolling.
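To make the logic explicit, here's a minimal Python sketch of that reasoning (my own simplification, not anyone's benchmark data), using the rough model that delivered fps ≈ min(CPU-limited fps, GPU-limited fps), with the 720p result standing in for the CPU ceiling:

```python
# Crude model: a game delivers roughly min(CPU ceiling, GPU ceiling).
# The 720p result approximates the CPU ceiling; the GPU ceiling moves
# when you buy a faster card. All numbers below are illustrative.
def delivered_fps(cpu_ceiling, gpu_ceiling):
    return min(cpu_ceiling, gpu_ceiling)

cpu_a = 140.0          # lots of 720p headroom
cpu_b = 110.0          # little 720p headroom
gpu_today = 100.0      # both CPUs look identical today: GPU bottlenecked
gpu_next_tier = 150.0  # +50% faster card

for name, ceiling in [("CPU A", cpu_a), ("CPU B", cpu_b)]:
    now = delivered_fps(ceiling, gpu_today)
    later = delivered_fps(ceiling, gpu_next_tier)
    print(f"{name}: {now:.0f} -> {later:.0f} fps (+{(later / now - 1) * 100:.0f}%)")
# CPU A: 100 -> 140 fps (+40%)
# CPU B: 100 -> 110 fps (+10%)
```

It's obviously a simplification (frametime spikes, RAM, engine caps, etc all muddy it), but that's the logic behind treating the 720p number as the CPU's ceiling.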

Deep down I get the impression of the following: hundreds of "Look at Ryzen's low in-game CPU usage, that's a lot of overhead for a future GPU! Can you lower the resolution to 720p so we can get a rough idea of how next year's cards do?" comments in various forums were "good" between the Ryzen and Coffee Lake launches, when they measured the R3-1300X (4/4) vs i3-7100 (2/4) or R5-1500X (4/8) / R5-1600X (6/12) vs i5-7500 (4/4) (which favors AMD), yet are now magically "bad" post-Coffee Lake launch, when they measure the overhead of 4/4 vs 4/4 or 6/12 vs 6/6 (which favors Intel). The reason is that one of them comes in a blue box and the other in a red box, but people don't want to be honest and simply say they like the color of the box, instead insisting on this silly tap-dance routine of "negative arguing" - trying to talk around everything that counts (how it performs now and how it will perform with no GPU bottleneck 1-2 years down the line) without making it look too obvious they had zero inclination of making a serious Ryzen vs Coffee Lake purchase decision anyway...
 

Loser Gamer

Member
May 5, 2014
How can we go from "games don't use more than one core" to "4 cores are obsolete"? I think people want 4 cores to be useless because in their heads 8 and 16 cores mean something more. But in reality a 2-core CPU can actually still game.
 

tamz_msc

Diamond Member
Jan 5, 2017
I already have, and you refuse to look at them simply because you do not want to. If two CPU's are GPU bottlenecked, but upgrading to the next tier of GPU (+50% faster) gives one CPU a +40% boost and the other a +10% boost (both due to CPU bottlenecks of differing degrees), then you'll generally see the +40% one get higher fps at 720p. It doesn't have to be exactly +40.0000%, with anything below +39.9999% or above +40.0001% being "invalid" - that's the general effect. This isn't even up for debate beyond trolling.
How can it be a reliable indicator of performance in the future when the way games utilize resources may change over time? Suppose I had both the 8700K and the 1800X in 2013. How could 720p performance with a GTX 780 Ti in games of 2013 tell me about the performance of those two CPUs at 1080p with a GTX 1080 in today's games? How can you predict Battlefield 1 performance in 2016 using Battlefield 3 720p benchmarks from 2013? After all, isn't this the main argument for 720p benchmarks?
 

IRobot23

Senior member
Jul 3, 2017
How can we go from "games don't use more than one core" to "4 cores are obsolete"? I think people want 4 cores to be useless because in their heads 8 and 16 cores mean something more. But in reality a 2-core CPU can actually still game.

Well, we know that you need 4 threads. Some games will run great, some will have a little trouble. Is it worth buying a new PC with 4 cores for gaming? It depends on budget: with a GTX 1050/Ti or RX 560, yes. With a GTX 1060 you might want to grab an R5 1400/1600 or i5 8400...


https://www.youtube.com/watch?v=ysAdXfhqGtU&t=854s
 

BSim500

Golden Member
Jun 5, 2013
How can it be a reliable indicator of performance in the future when the way games utilize resources may change over time? Suppose I had both the 8700K and the 1800X in 2013. How could 720p performance with a GTX 780 Ti in games of 2013
Again, you've got it completely backwards. The context of this thread is older CPU's lasting longer on newer GPU's, not the other way around. You still seem incapable of grasping that trying to simulate how current CPU's perform when maxed out by future GPU's requires a test that actually tries to max out the CPU before the GPU in the same way (ie, lowering the resolution). Again, this isn't something new and has been the industry standard in benchmarking for 20+ years...
 

tamz_msc

Diamond Member
Jan 5, 2017
Again, you've got it completely backwards. The context of this thread is older CPU's lasting longer on newer GPU's, not the other way around. You still seem incapable of grasping that trying to simulate how current CPU's perform when maxed out by future GPU's requires a test that actually tries to max out the CPU before the GPU in the same way (ie, lowering the resolution). Again, this isn't something new and has been the industry standard in benchmarking for 20+ years...
How will you test older CPUs coping with newer GPUs if not by testing on newer games as well? If all I do is play older games, then why should I upgrade my GPU in the first place? How else would potentially untapped CPU headroom, which according to you is shown by 720p testing, manifest itself in reality?
 

eddman

Senior member
Dec 28, 2010
How will you test older CPUs coping with newer GPUs if not by testing on newer games as well? If all I do is play older games, then why should I upgrade my GPU in the first place? How else would potentially untapped CPU headroom, which according to you is shown by 720p testing, manifest itself in reality?

Again, it won't show the exact improvement delta of the future, but it does indicate that if a CPU is faster in low-res tests, then there is a very good chance it would also be faster with a faster GPU at normal resolutions. Sure, some future games might be so GPU heavy that the faster CPU won't have an effect at all, but there would also be games that aren't, not to mention all the currently known games.

The delta might be similar for some games and nonexistent in others, but when I see that a CPU's average FPS is 19% higher at 720p across 8 games, and its 99th percentile is 14% higher, I can assume that it would hold an advantage with a faster future GPU too, especially in this case where they have the same number of cores and similar clock speeds, unless SMT suddenly becomes uber effective in games.

https://www.computerbase.de/2017-10...st/5/#diagramm-performancerating-fps-1280-720
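For what it's worth, here's a rough Python sketch of how per-game deltas get rolled up into an overall rating like that - a geometric mean of per-game fps ratios is one common approach. I'm using the 720p i5-8400 vs R5-1600X numbers from the TPU data posted earlier in the thread, not the ComputerBase suite:

```python
# One common way to turn per-game results into a single "performance
# rating": the geometric mean of the per-game fps ratios. Using the
# 720p i5-8400 vs R5-1600X figures from the TPU data posted earlier.
from math import prod

fps_720p = {                 # (i5-8400, R5-1600X)
    "BF1":            (231.0, 184.1),
    "Deus Ex: MD":    (126.0, 101.3),
    "Dishonored 2":   (103.4, 89.2),
    "Far Cry Primal": (138.3, 100.4),
    "RotTR":          (207.2, 116.2),
}

ratios = [a / b for a, b in fps_720p.values()]
geo_mean = prod(ratios) ** (1 / len(ratios))
print(f"i5-8400 is ~{(geo_mean - 1) * 100:.0f}% ahead at 720p in this 5-game set")
# ~35% ahead across these five games (a different, smaller suite than
# the ComputerBase rating, hence the bigger number).
```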

EDIT: Even at 1080p, which is GPU limited in some of those games, the 8400 is 14% and 11% faster.

The claim never was that you would get an exact delta number (although there is still a chance of seeing it in the future in some games), just that low res tests would better show a CPU's capabilities by removing the GPU limit.

Actually, forget about the delta and future stuff for a second. If I'm making a purchase now and see that CPU A is faster than CPU B in games (both in the same price category), which one would I choose? I can't go for B and hope that in the future all games will be GPU limited, or that my CPU will become faster. I'm making a purchase based on the data I'm seeing right now, and the current data is telling me CPU A is faster.
 
Last edited:
Reactions: BSim500

BSim500

Golden Member
Jun 5, 2013
How will you test older CPUs coping with newer GPUs if not by testing on newer games as well? If all I do is play older games, then why should I upgrade my GPU in the first place? How else would potentially untapped CPU headroom, which according to you is shown by 720p testing, manifest itself in reality?
Your arguments are now changing from post to post. If all you want is to play older games, then there's no reason for you to upgrade either, and the answer to this thread is "No, for many people quad cores are not going to be obsolete for a lot of games". Newer games will continue to be the usual mix of the well optimised (eg, Doom), bad ports that take 6+ months of patching just to be playable even on new CPU's, those that run like complete turds for the visuals with laughably broken AI (eg, Civ 6), and stuff that's in between. They'll also continue to be designed for consoles, which themselves have limits on how much stuff you can cram in. Performance isn't going to keep halving every year just for simple 1080p/60, or you'd end up with 10-15fps on consoles. I'd love to give you a time machine so you could test 2019 games today, but since they don't exist, the next best thing is to simply measure how much CPU overhead you do have once the GPU bottleneck is removed, to get a rough idea.

But if at this point you still don't get why raising the resolution hits the GPU harder relative to the CPU (the GPU now has far more work per frame - 1080p has 2.25x the pixels of 720p - whilst the CPU still feeds the same number of frames or fewer), why lowering it hits the CPU harder relative to the GPU (the GPU now has less than half the work per frame and the CPU is forced to work harder to deliver more frames), and why that's a fairly accurate predictor of how GPU-heavy games will scale (ie, most AAA's), despite over a dozen different people explaining it to you in several different threads, all I can do at this point is shrug and say "Have a great Sunday afternoon". Take a break from trying to understand it, and it'll eventually come to you, often when you're just messing about with the settings.
 

Loser Gamer

Member
May 5, 2014
145
7
46
What if a man didn't listen to the sales people and stayed with a 4-core? Would that make him a non-money maker? The people pushing 6 cores (we had that with AMD 7 years ago) and hyperthreading - do you not think they are in it for the money?
 

whm1974

Diamond Member
Jul 24, 2016
What if a man didn't listen to the sales people and stayed with a 4-core? Would that make him a non-money maker? The people pushing 6 cores (we had that with AMD 7 years ago) and hyperthreading - do you not think they are in it for the money?
Of course they are.
 

krumme

Diamond Member
Oct 9, 2009
https://www.computerbase.de/2017-10...#diagramm-battlefield-1-multiplayer-1920-1080

8400 is 23% faster in BF1 MP. 1600X boosts to 3.7 GHz, so add 5.5% to the score for 3.9 GHz. 8400 would still be about 16-17% faster.


Two things: 1) That's SP, IINM. 2) At 1440p the game is GPU limited. For a true CPU comparison you should look at 1080p or even 720p.
Naa, the 8400 is approx 10 percentage points better in the all-important 99th percentile frametimes in BF1 MP vs the 1600X. Avg is useless as a metric.
The 1600X is performing more like the 7800X here. I would guess the 99.9th percentile frametimes are more or less the same for all those 6C processors.
But that also shows what stellar value the 8400 is. IMO much better than the 7700K!
 