8700K vs 2700X with 2080Ti [computerbase]


Brunnis

Senior member
Nov 15, 2004
506
71
91
If low-resolution testing were the determinant of CPU capability, why not go the whole hog and use the lowest possible resolution?
Test a really old game that only came with a terrible resolution setting.
Would tell me nothing about the price of fish in Soweto, but at least I'd know that an 8700k could get 554785211588444 fps, whilst a 2600x could get 488524862288553 fps. Meanwhile, my 60/144Hz monitor that displays at 1080p/4K resolution still wouldn't care what CPU was being used, only what GPU it was being fed from.
Low-res competitive eSports gamers are the only folk that notice any difference in top-end CPU performance
Yup, better to just flip a coin on which CPU to buy for your gaming rig. Makes much more sense than running a test.
 
Aug 11, 2008
10,451
642
126
It makes sense that with a more powerful GPU the difference would be magnified. The 1080Ti results are not consistent with the previous data, but using save games vs built-in benchmarks most likely involves a heavier CPU load, so I would not dismiss them out of hand.

Edit: For those using the "CPU doesn't matter at 4K" argument: just get the 8400. It's cheaper than either.
 
Reactions: Zucker2k

PotatoWithEarsOnSide

Senior member
Feb 23, 2017
664
701
106
Yup, I'm gonna test the 8700k at Space Invaders, coz that'll tell me how fantastic the CPU is.
Low-res testing is solely used for dishonest marketing. "8700k, up to 40%* faster than a 2600x" yawn.
In the overwhelming majority of use-cases, the 8700k will simply not be 40% faster than a 2600x, but some sheeple will see the headline figure and just run with it.
Testing should be done for your own specific use-cases, not edge cases that only apply to 1% of the target market.
I genuinely find it hard to believe that anyone buying a 2080Ti won't also be rocking a 4K monitor, yet the test has them gaming only at 1080p.
 

DrMrLordX

Lifer
Apr 27, 2000
21,805
11,161
136
Yup, better to just flip a coin on which CPU to buy for your gaming rig. Makes much more sense than running a test.

I mean, I don't really know what you're getting at here. Testing a CPU at some resolution I will not use does not help me. I will still want to know what differences there will be at my chosen resolution. Maybe there will be none at all (as in a 4K test, at least with some GPUs). Or if I'm a 1080p esports gamer, maybe I'll see a big boost from picking the Intel CPU.

For my part, my target is 2K @ 144Hz, using an AMD GPU since I chose FreeSync. AMD does not sell a single-GPU solution that can reach those speeds (at good settings), and may not for a while, but at least I know which company I'll be buying from. And let's be honest, I know which CPU company I'll be buying from too, since I am a partisan when it comes to my own money. Not that I'll spend a lot of time here trying to make people think the exact same way that I do.

But for other people with the same target but no inborn loyalty to any GPU or CPU vendor, if their target is (for example) 2K @ 144Hz, what are they to do with some 1080p results? Assume they'll see the same at 2K? Let's say I have an RTX 2080 or whatever, and I am choosing between the 2700X and 8700K for the underlying CPU. At some resolution higher than 1080p, am I going to see a 30% difference in fps between the two? No, probably not. I think the 8700K will show the better results. Also, if it's absolutely necessary to hit 144Hz with some kind of settings, the 8700K is the one more likely to get there. Not sure if the 9900K will make it any better; more L3 cache will probably help. I do think the 2700X will put in a good showing, though. Certainly better than what we saw from these somewhat-dubious 1080p tests.

Claiming that "what I want is not a CPU test" is basically nonsense. Even at 4K, the CPU will make some difference. As we get more and more powerful GPUs, we'll see more and more differentiation at this resolution.

The relevance of 1080p benchmarks is slipping into the past alongside 720p. I cannot recommend a gaming CPU purchase to anyone based solely on 1080p results unless that is the specific resolution they plan to use every day. The results will be lopsided.
 

Brunnis

Senior member
Nov 15, 2004
506
71
91
I mean, I don't really know what you're getting at here. Testing a CPU at some resolution I will not use does not help me.
If processor X runs the game logic of a certain game 30% faster than processor Y at 1080p, processor X will run the game logic around 30% faster at 4K as well (or has that changed? Honest question...). You can even out the differences by adding a GPU bottleneck, but it just means processor X will spend more time idle than processor Y. Until a game comes along that needs the extra CPU performance. Therefore, processor X will have better longevity and benchmarking only at higher resolutions may mask that fact. That's the point I'm getting at.

I think it's interesting to see both lower-res and higher-res results when I want to pick out a CPU. I get to see both which CPU has the most headroom (and therefore longevity) and what to expect from the complete build in current games at the resolution I play at. For example, I might see that processor X and processor Y perform almost the same in most games at 4K, but processor X is 30% faster at 1080p and only costs an additional $30. That tells me that over time, it's probably worth adding $30 to get processor X, since I'll be able to use it longer (e.g. keeping it after my next GPU upgrade).
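Here's a toy sketch of that logic in Python (the numbers are made up and real frame pacing is messier, but the min() captures the gist):

```python
# Simplified model: the displayed frame rate is capped by whichever
# component (CPU or GPU) is slower. All numbers below are invented.
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

cpu_x, cpu_y = 130.0, 100.0  # processor X runs game logic 30% faster

for res, gpu_fps in [("1080p", 200.0), ("4K", 60.0)]:
    fx = effective_fps(cpu_x, gpu_fps)
    fy = effective_fps(cpu_y, gpu_fps)
    print(f"{res}: X = {fx:.0f} fps, Y = {fy:.0f} fps, gap = {fx / fy - 1:.0%}")

# 1080p: X = 130 fps, Y = 100 fps, gap = 30%  (CPU-bound: gap visible)
# 4K:    X = 60 fps,  Y = 60 fps,  gap = 0%   (GPU-bound: gap masked,
#        but X still has 70 fps of headroom for a future GPU upgrade)
```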

I think that's the best I can explain it, so I think that'll be the last on the subject from me. Sorry for sort of hijacking the thread.
 
Last edited:

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
If processor X runs the game logic of a certain game 30% faster than processor Y at 1080p, processor X will run the game logic around 30% faster at 4K as well (or has that changed? Honest question...). You can even out the differences by adding a GPU bottleneck, but it just means processor X will spend more time idle than processor Y. Until a game comes along that needs the extra CPU performance. Therefore, processor X will have better longevity and benchmarking only at higher resolutions may mask that fact. That's the point I'm getting at.

Oh please stop, common sense is not needed in this forum. If 1440P results show Ryzen is equal in FPS, well they are equal, right? Heck, even in those "equal" situations there are already cracks in fan logic when one looks at frame time percentiles. But they conveniently ignore things like game load times, turn times, overall smoothness of the game, stability of frame rate, etc. Cause equal FPS in some GPU-limited scenario = as-good.
 

gdansk

Platinum Member
Feb 8, 2011
2,492
3,386
136
These results are dubious.

ComputerBase is showing almost no performance difference between the 1080 Ti and 2080 Ti on the 8700K. As an example, for Wolfenstein II (1920x1080) ComputerBase shows a 20 FPS difference, while TechPowerUp shows a 140 FPS difference. Something is off; both were run on an 8700K.
 
Last edited:

Gideon

Golden Member
Nov 27, 2007
1,714
3,936
136
Oh please stop, common sense is not needed in this forum. If 1440P results show Ryzen is equal in FPS, well they are equal, right? Heck, even in those "equal" situations there are already cracks in fan logic when one looks at frame time percentiles. But they conveniently ignore things like game load times, turn times, overall smoothness of the game, stability of frame rate, etc. Cause equal FPS in some GPU-limited scenario = as-good.
That's a valid point. FPS is a poor metric for game "smoothness" but people usually fail to grasp frame-times, or just dismiss them as unimportant.

Take these Tech Report Deus Ex benchmarks as an example. All the CPUs deliver good FPS: the 1800X and 1600X should be as good as any other CPU (at the very least with a 60Hz monitor), right?

Well actually no, because they spend > 10 frames of the test run under 60 FPS (which is 33.3ms per frame):

Of which some are also clearly visible as spikes on this graph, which will be very noticeable, as our eyes pick up sudden changes in movement a lot better than smooth transitions:

So at the very least the 1600X will stutter noticeably, despite putting out decent FPS. Despite the minimal difference in framerate, the Ryzen 2XXX series feels way smoother.
Overall, even the 8700K with fast RAM dips below 60 FPS for 4 frames. But it's still the smoothest, and it does show.
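If you want to reproduce that kind of analysis yourself, here's a rough sketch (with invented frame times, not Tech Report's actual data):

```python
# "Time spent beyond threshold" metric, in the spirit of Tech Report's
# frame-time analysis. Frame times below are hypothetical.
frame_times_ms = [12.0] * 20 + [35.4, 38.9, 34.0]  # mostly smooth, 3 spikes

threshold_ms = 16.7  # 1000/60: frames slower than this miss 60 FPS
time_beyond = sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
print(f"average: {avg_fps:.0f} FPS")                           # ~66, looks fine
print(f"time beyond {threshold_ms} ms: {time_beyond:.1f} ms")  # the stutter
```

The average hides the three spikes entirely; the time-beyond-threshold number doesn't.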
 

Malogeek

Golden Member
Mar 5, 2017
1,390
778
136
yaktribe.org
From what I can tell from the basic translation, default memory timings were used. We all know how much of a difference that can make in these tests. In fact, the author links to their own article on Ryzen memory timings yet doesn't elaborate on what was used for these tests? All it seems to say is that 16GB of DDR4 RAM was used, and nothing else?

Also no mention of whether multi-core boost was on/off for the Intel.
 
Reactions: ryan20fun

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Two main points:

1) There are serious inconsistencies in computerbase.de results. I wouldn't use them as definitive of anything.
2) Testing on more powerful GPUs should reveal more CPU differences, because games will be less GPU limited.

IIRC HW Unboxed was talking about doing some new CPU scaling tests with RTX cards. They tend to do a much better job of covering all the bases. I will look forward to their results.
 

moinmoin

Diamond Member
Jun 1, 2017
4,994
7,765
136
From what I can tell from the basic translation, default memory timings were used. We all know how much of a difference that can make in these tests. In fact, the author links to their own article on Ryzen memory timings yet doesn't elaborate on what was used for these tests? All it seems to say is that 16GB of DDR4 RAM was used, and nothing else?
The article is a little silly indeed. After all the memory optimization over the last couple of months, it resets the clock and just uses memory clocks "as set by the manufacturer" (which could be whatever the BIOS automatically set). So one could say the article is about out-of-the-box performance, and the test shows Intel is significantly faster out of the box. After all the insight into how much faster Ryzen can be with optimized RAM timings, this article is like a "no sh*t, Sherlock" troll.
 

legcramp

Golden Member
May 31, 2005
1,671
113
116
So if you want high FPS with a high refresh rate, go 8700K pretty much... very useful for people looking at this. Some people in this thread need to understand how CPU testing works though lol
 

Campy

Senior member
Jun 25, 2010
785
171
116
Well actually no, because they spend > 10 frames of the test run under 60 FPS (which is 33.3ms per frame):

60 fps is 16.7ms/frame. No CPU in that test spends more than 150ms above 16.7ms frame times, so very few frames overall. It shows what we already know, that Intel is faster, but I don't think it's a meaningful difference at 60fps. If you're gaming at 100Hz and above, the difference will be much more noticeable.
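Quick reference for the conversion, since these get mixed up a lot:

```python
# Frame time and frame rate are reciprocals of each other.
def ms_per_frame(fps: float) -> float:
    return 1000.0 / fps

print(f"{ms_per_frame(60):.1f} ms")   # 16.7 ms per frame at 60 FPS
print(f"{ms_per_frame(30):.1f} ms")   # 33.3 ms per frame at 30 FPS
print(f"{ms_per_frame(144):.1f} ms")  # 6.9 ms: why 144Hz is less forgiving
```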
 

TheGiant

Senior member
Jun 12, 2017
748
353
106
Low-res competitive eSports gamers are the only folk that notice any difference in top-end CPU performance
QFT
Overall, these are the results I expected... the difference will IMO be even higher with the i9-9900K.
It seems many are underestimating that micro-stuttering is mostly caused by CPU, not GPU, performance.
If the GPU is the bottleneck, my experience is that in general the game isn't performing well and the average FPS is low too.
The Ryzen/8700K difference is not as big as Sandy Bridge vs Bulldozer (which reminds me of P4 vs Athlon 64 in AMD's favor: despite the benchmarks, no one was gaming on a P4...), but it is big enough for me as an occasional gamer (and former Quake 3 progamer) to buy the Intel CPU currently (the price difference within a CPU+RAM+board+GPU upgrade is pretty much negligible).

CB should explain the save games, but maybe someone is finally running a CPU test on CPU-intensive game scenes, not the generic average-Joe-runs-a-benchmark test.
 

DrMrLordX

Lifer
Apr 27, 2000
21,805
11,161
136
If processor X runs the game logic of a certain game 30% faster than processor Y at 1080p, processor X will run the game logic around 30% faster at 4K as well

And? At a higher res, even with a sufficiently-powerful GPU to remove GPU bottlenecking (or to mitigate it), game logic becomes a smaller part of what the CPU has to manage while running the game. Things like draw calls predominate. Never mind that the percentage differences in the linked benchmark can't be replicated anywhere else, at any resolution, even 720P . . .

Oh please stop, common sense is not needed in this forum. If 1440P results show Ryzen is equal in FPS,

1440P results won't show the 2700X to be equal. Even 4K would not with a GPU like the RTX 2080Ti. But it'll show something realistic, rather than something fabricated or manipulated by a bad benchmark. Also, who is asking to ignore frame times and minimums? Hell, minimums are what I absolutely want to see! That is my #1 most important metric from a benchmark. Averages mean nothing to me.

Two main points:

1) There are serious inconsistencies in computerbase.de results. I wouldn't use them as definitive of anything.
2) Testing on more powerful GPUs should reveal more CPU differences, because games will be less GPU limited.

IIRC HW Unboxed was talking about doing some new CPU scaling tests with RTX cards. They tend to do a much better job of covering all the bases. I will look forward to their results.

Thank you. Of course it would have been harder to show such extreme results at a higher resolution, so of course they went with the lowest "credible" resolution.

Some people in this thread need to understand how CPU testing works though lol

I was not aware that the benchmark's author was posting here . . . maybe he should not run Ryzen systems with DDR4-2133? Just a thought.
 

ZGR

Platinum Member
Oct 26, 2012
2,054
661
136
So if you want high FPS with a high refresh rate, go 8700K pretty much... very useful for people looking at this. Some people in this thread need to understand how CPU testing works though lol

Having used both Intel and AMD at 1080p 144Hz+ in my favorite FPS, I can't really notice the difference.
 

Abwx

Lifer
Apr 2, 2011
11,167
3,862
136
QFT
Overall, these are the results I expected... the difference will IMO be even higher with the i9-9900K.

Methinks it will be lower than in those "tests", and that in a few days some people will be taken for a ride with their money...
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
The article is a little silly indeed. After all the memory optimization over the last couple of months, it resets the clock and just uses memory clocks "as set by the manufacturer" (which could be whatever the BIOS automatically set). So one could say the article is about out-of-the-box performance, and the test shows Intel is significantly faster out of the box. After all the insight into how much faster Ryzen can be with optimized RAM timings, this article is like a "no sh*t, Sherlock" troll.

There is merit to both forms of testing. 'Stock' memory settings usually means XMP, and that is what the vast majority of people will run out of the box. Not many people are going to go into the BIOS and tinker with memory timings to eke out an extra 5% in frames.

I always knew that next-gen GPUs like the 2080Ti would make the gap between the 8700K and Ryzen chips grow; it's simple logic, especially at 1080P.

One can make the argument that it's not a realistic resolution for that class of GPU, but as others have mentioned already in this thread, it's still more useful data than running a 4K test that is basically testing a GPU bottleneck, not the CPU.

I'm not against 4K testing per se; I just already know the results will be basically a flat line from a Ryzen 3 / Core i3 all the way up to the 9900K.

I think 1440P testing represents a good 'compromise' between 1080P and 4K; in fact, many top-grade gaming monitors are at this resolution.

I think you'll see similar separation between CPUs at 1440P w/2080Ti as you would at 1080P w/1080Ti. The only truly GPU-bottlenecked resolution now is 4K.
 
Reactions: ryan20fun

scannall

Golden Member
Jan 1, 2012
1,948
1,640
136
There is merit to both forms of testing. 'Stock' memory settings usually means XMP, and that is what the vast majority of people will run out of the box. Not many people are going to go into the BIOS and tinker with memory timings to eke out an extra 5% in frames.

I always knew that next-gen GPUs like the 2080Ti would make the gap between the 8700K and Ryzen chips grow; it's simple logic, especially at 1080P.

The RAM speeds aren't in there at all. What is troublesome is seeing what has been shown many times as a 12-14 percent deficit suddenly become 30 percent, with limited information on the test beds. It seems more like clickbait than anything to me. Intel currently enjoys a 7-ish percent advantage in IPC, a small disadvantage in SMT, plus a large clockspeed advantage. To sum it up, this 30% is an outlier. By quite a bit, with too little information to dig into it.
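A back-of-envelope sanity check (the clock figures are my rough assumptions for typical all-core boosts, not measurements):

```python
# Expected CPU-bound gaming gap from IPC and clock speed alone.
ipc_advantage = 1.07   # ~7% Intel IPC edge, per the estimate above
clk_8700k_ghz = 4.3    # assumed typical 8700K all-core boost
clk_2700x_ghz = 4.0    # assumed typical 2700X all-core boost

expected = ipc_advantage * clk_8700k_ghz / clk_2700x_ghz
print(f"expected gap: ~{expected - 1:.0%}")  # ~15%, in line with the usual
# 12-14% results; a consistent 30% would be well outside that range.
```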
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
Yeah, I agree we need more data, but 1080P with a 2080Ti is basically like testing at 720P with previous-gen GPUs. You're going to run into CPU bottlenecks a lot more often, so differences between CPUs that normally wouldn't be evident in a more GPU-limited environment will now be more apparent.

I'm reserving judgement until more sites do comprehensive testing on this. I don't expect to see a 30% delta between an 8700K and 2700X except in certain outliers, but I wouldn't be surprised by a 15-20% gap at 1080P, and perhaps 10% at 1440P.
 
Reactions: ryan20fun
Aug 11, 2008
10,451
642
126
The old AMD marketing department would love you!
Yep, I still remember AMD's marketing hype that Bulldozer was as fast as the $1,000 Intel Extreme Edition. Of course, the test they used was GPU-limited, and the mainstream $350 Intel CPU was far better for gaming than the Extreme Edition.
 

Innokentij

Senior member
Jan 14, 2014
237
7
81
Nothing new here, to be honest. Ryzen's design is for slow parallel workloads; it won't ever challenge Intel in DX11 games. Just look at the 4K 2080Ti Far Cry 5 bench from Linus: both the 1080Ti and 2080Ti get a 10 fps higher 97th percentile on the 8700K vs the 2700X. Any gamer should stay away from this inferior CPU until DX12 is standard.
 