The Ryzen 1600 is 30% faster than the 1600X, yep, that's logical!
Or the R3 1200 vs the R3 1300X.
Around 35-40%.
It doesn't matter whether you buy the slower or the faster CPU; I've absolutely no interest in going into that discussion. What I'm actually interested in is the validity of 720p gaming benchmarks as justification for how a slower CPU might hold back a fast GPU in the future. I'm not arguing that a slower CPU wouldn't hold it back, because it will. So just because the delta in the future might not be as large as the delta in today's 720p tests, I should buy the slower CPU for gaming at the same price?! Do you really think that's logical?! Since when is lower performance better?
"The 'future GPU being handicapped by slower CPU theory because 720p gaming says so' advocates miss the point how poorly 720p results are reflected in the real way how people play games."
"Prove that 720p CPU gaming benchmarks are indicative of how well a CPU can keep up with a faster GPU in the future. Your entire argument is based on the usefulness of 720p gaming benchmarks."
Unless 720p benchmarks have actually been demonstrated to give reasonable credence to this premise of yours, they will remain inconclusive. *snip*
BF1:-
R3 1300X = 132.4fps (1080p) -> 138.7fps (720p) = 4.7% headroom
i3-8100 = 147.1fps (1080p) -> 207.2fps (720p) = 40.8% headroom
R5 1600X = 142.5fps (1080p) -> 184.1fps (720p) = 29% headroom
i5-8400 = 147.7fps (1080p) -> 231.0fps (720p) = 56.4% headroom
Deus Ex: Mankind Divided:-
R3 1300X = 79.3fps (1080p) -> 86.2fps (720p) = 8.7% headroom
i3-8100 = 81.8fps (1080p) -> 119.9fps (720p) = 46.5% headroom
R5 1600X = 81.4fps (1080p) -> 101.3fps (720p) = 24.4% headroom
i5-8400 = 81.7fps (1080p) -> 126fps (720p) = 54.2% headroom
Dishonored 2:-
R3 1300X = 73.5fps (1080p) -> 76.7fps (720p) = 4.3% headroom
i3-8100 = 91.4fps (1080p) -> 94.7fps (720p) = 3.6% headroom
R5 1600X = 87.2fps (1080p) -> 89.2fps (720p) = 2.3% headroom
i5-8400 = 98.8fps (1080p) -> 103.4fps (720p) = 4.6% headroom
Far Cry Primal:-
R3 1300X = 97.0fps (1080p) -> 99.1fps (720p) = 2.1% headroom
i3-8100 = 107.6fps (1080p) -> 121.3fps (720p) = 12.7% headroom
R5 1600X = 99.2fps (1080p) -> 100.4fps (720p) = 1% headroom
i5-8400 = 111.4fps (1080p) -> 138.3fps (720p) = 24.1% headroom
Rise of the Tomb Raider:-
R3 1300X = 109.0fps (1080p) -> 108.6fps (720p) = 0% headroom
i3-8100 = 126.5fps (1080p) -> 198.4fps (720p) = 56.8% headroom
R5 1600X = 114.4fps (1080p) -> 116.2fps (720p) = 1.5% headroom
i5-8400 = 126.3fps (1080p) -> 207.2fps (720p) = 64.1% headroom
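For reference, the "headroom" figures above are simply the percentage fps gain going from 1080p to 720p on the same CPU and GPU. A minimal sketch of that arithmetic, using the Battlefield 1 pairs quoted above (rounding may differ slightly from the figures in the post):

```python
# Headroom = extra fps that appears once the resolution drop removes the GPU bottleneck.
# The (1080p, 720p) fps pairs below are the BF1 figures quoted above.
results_bf1 = {
    "R3 1300X": (132.4, 138.7),
    "i3-8100":  (147.1, 207.2),
    "R5 1600X": (142.5, 184.1),
    "i5-8400":  (147.7, 231.0),
}

for cpu, (fps_1080p, fps_720p) in results_bf1.items():
    headroom = (fps_720p / fps_1080p - 1) * 100
    print(f"{cpu}: {headroom:.1f}% headroom")
```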
They have. That's why people have been using them for years. There are plenty of examples from previous generations too (eg, +41% overhead for the i5-4690K vs only 11% for the FX-8350 in Deus Ex: Human Revolution, 720p vs 1080p). As time went on and new GPUs came out, games became more and more CPU bottlenecked on the FX, until by the time we reached OC'd Maxwell / Pascal on newer games it was losing to the i3s half the time. Assuming you read my post, as I mentioned it's not just about "future GPUs" but includes stuff you can do today like overclocking a GPU, lowering shader settings, etc. I'm honestly not sure what you're struggling to understand about removing one component as a bottleneck being a positive when trying to benchmark a different component.
That doesn't even answer my question, because my question wasn't about bottlenecks. Those results are for 720p and 1080p on the same GPU. What I'm asking is this: test a comparatively weaker GPU at 720p against a more powerful GPU at 1080p. Under these circumstances, what is the difference in performance between two CPUs at 720p (call this difference X) and at 1080p (call this difference Y)? My question is simple: is X = Y, and if X != Y, what is the actual difference between X and Y? If X = 30%, then what is Y? Again, I'm not asking for an explanation of any difference in the results due to one CPU being faster than the other. I'm asking for hard data.
"Stop showing TPU benchmarks."
You already posted that once, and my answer is simply that I'll show what's relevant, despite the same spammed "But I don't game at 720p and don't understand why those tests exist, so they must be wrong" over and over in multiple threads, even after something like 20 other people have now replied with exactly the same answer, over and over. In the context of this "4C for future gaming" thread, those benchmarks (sites other than TPU were posted too) showing, eg, an i5-7600K getting large gains at 720p over 1080p on commonly owned GPUs absolutely highlight in which games there is a lot of future "extension": if avg fps dips below 60fps in next year's games due to a GPU bottleneck, and that bottleneck is later removed via a GPU upgrade, the CPU still has enough overhead to feed 60fps and therefore continues to be "good enough for gaming" (the whole point of this topic). Just because you don't understand what people are benchmarking / looking for in different benchmarks doesn't make the benchmarks "wrong".
"No matter how many times you post 720p benchmarks, you cannot say that they are an indication of how dependent a CPU will be on future GPU performance once more powerful GPUs are available."
720p vs 1080p IS all about removing the GPU bottleneck. That's the sole reason for doing it! If your question about 720p benchmarks isn't about GPU bottlenecking, then you're asking the wrong question. It's a CPU test; this is a CPU thread in a CPU forum. You're in the wrong forum if you want to start benchmarking different GPUs against each other while doing nothing but changing the resolution. If a GTX 1050 Ti got 60fps at 720p and a GTX 1060 got 60fps at 1080p, that's completely and utterly irrelevant to CPU testing, which compares different CPUs on the same GPU.
"720p benchmarks are plain rubbish unless you actually do a test that directly demonstrates the claim that CPU 1, which gives lower FPS than CPU 2 (on a percentage basis) with a certain GPU at 720p, will also give the same lower FPS on a percentage basis with a much more powerful GPU at more realistic settings like 1080p."
I already have, and you refuse to look at them simply because you do not want to. If two CPUs are GPU bottlenecked, but upgrading to the next-tier, +50% faster GPU gives one CPU a +40% boost and the other a +10% boost (both due to CPU bottlenecks of differing degrees), then you'll generally see the +40% one get higher fps at 720p. It doesn't have to be exactly +40.0000%, with <39.9999% or >40.0001% somehow "invalid"; that's the general effect. This isn't even up for debate beyond trolling.
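Purely as an illustration of that effect, here's a minimal sketch in Python, assuming the common simplification that observed fps is roughly min(CPU-limited fps, GPU-limited fps). "CPU A", "CPU B" and every number below are invented for illustration, not measurements:

```python
# Toy bottleneck model: observed fps is roughly capped by whichever of the
# CPU or GPU runs out of headroom first. All numbers are invented.
def observed_fps(cpu_cap_fps, gpu_cap_fps):
    return min(cpu_cap_fps, gpu_cap_fps)

cpus = {"CPU A": 140, "CPU B": 110}   # hypothetical CPU-limited fps
gpu_1080p = 100        # GPU-limited fps of today's card at 1080p
gpu_720p = 200         # same card at 720p (GPU bottleneck largely removed)
gpu_next_1080p = 150   # a +50% faster future card at 1080p

for name, cpu_cap in cpus.items():
    print(f"{name}: 1080p today {observed_fps(cpu_cap, gpu_1080p)} fps, "
          f"720p today {observed_fps(cpu_cap, gpu_720p)} fps, "
          f"1080p on +50% GPU {observed_fps(cpu_cap, gpu_next_1080p)} fps")
```

In this toy model both CPUs deliver the same 100fps at 1080p on today's card, but the 720p run (140 vs 110) already reveals which one has the headroom to gain +40% rather than +10% from the +50% faster card, which is exactly what the resolution drop is meant to expose.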
How can it be a reliable indicator of future performance when the way games utilize resources may change over time? Suppose I had both the 8700K and the 1800X in 2013. How could 720p performance with a GTX 780 Ti in the games of 2013 tell me anything about the performance of those two CPUs at 1080p with a GTX 1080 in today's games? How could you predict Battlefield 1 performance in 2016 using Battlefield 3 720p benchmarks from 2013? After all, isn't this the main argument for 720p benchmarks?
How can we go from "games don't use more than one core" to "4 cores are obsolete"? I think people want 4 cores to be useless because in their heads 8 and 16 cores mean something more. But in reality a 2-core CPU can actually still game.
Again, you've got it completely backwards. The context of this thread is about older CPUs lasting longer with newer GPUs, not the other way around. You still seem unable to grasp that trying to simulate how current CPUs will perform when maxed out by future GPUs requires a test that actually tries to max out the CPU before the GPU in the same way (ie, by lowering the resolution). Again, this isn't something new; it has been industry standard in benchmarking for 20+ years...
How will you test older CPUs coping with newer GPUs if not by testing on newer games as well? If all I do is play older games, then why should I upgrade my GPU in the first place? How else would the potentially untapped CPU headroom, which according to you is shown by 720p testing, manifest itself in reality?
Your arguments are now changing from post to post. If all you want is to play older games, then there's no reason for you to upgrade either, and the answer to this thread is "No, for many people quad cores are not going to be obsolete for a lot of games". Newer games will continue to be the usual mix: well-optimised titles (eg, Doom), bad ports that take 6+ months of patching just to be playable even on new CPUs, games that run like complete turds for their visuals with laughably broken AI (eg, Civ 6), and stuff in between. They'll also continue to be designed around consoles, which themselves have limits on how much you can cram in; performance isn't going to keep halving every year just for simple 1080p/60, or you'd end up with 10-15fps on consoles. I'd love to give you a time machine so you could test 2019 games today, but since they don't exist, the next best thing is simply to measure how much CPU overhead you have once the GPU bottleneck is removed, to get a rough idea.
"What if a man didn't listen to the sales people and stayed with a 4-core? Would that make him a non-money-maker? The people pushing 6 cores (we had that with AMD 7 years ago) and hyperthreading, do you not think they are in it for the money?"
Of course they are.
Nah, the 8400 is approx. 10 percentage points better in the all-important 99th-percentile frametimes in BF1 multiplayer vs the 1600X. Avg is useless as a metric. https://www.computerbase.de/2017-10...#diagramm-battlefield-1-multiplayer-1920-1080
The 8400 is 23% faster in BF1 MP. The 1600X boosts to 3.7 GHz, so add ~5.5% to its score for 3.9 GHz; the 8400 would still be about 16-17% faster.
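A quick sketch of that arithmetic, assuming performance scales linearly with clock speed (real-world scaling is usually a bit below linear); the figures are the ones from the post above:

```python
# Figures from the post above; assumes performance scales linearly with clock speed.
stock_lead = 1.23            # 8400 vs 1600X at stock, BF1 MP
clock_scaling = 3.9 / 3.7    # ~1.054, i.e. roughly +5.5% going from 3.7 GHz to 3.9 GHz
remaining_lead = stock_lead / clock_scaling - 1
print(f"Estimated remaining lead for the 8400: {remaining_lead * 100:.1f}%")  # ~16.7%
```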
Two things: 1) That's SP, if I'm not mistaken. 2) At 1440p the game is GPU limited. For a true CPU comparison you should look at 1080p or even 720p.