The 8700K


eek2121

Diamond Member
Aug 2, 2005
3,100
4,398
136
Did you switch the TR modes between Game/Creator and UMA vs. NUMA memory, or turn off SMT?

TR is not the ideal pure gaming platform, but it has lots of room for tweaking, and the 1920X reviews did highlight which settings work best for gaming.

I suspect he did not try anything. He's probably running dual-channel DDR4-2133 or some nonsense and running in Distributed mode. Some BIOS tuning and decent RAM puts the Threadripper 1920X/1950X ahead of the 1800X for gaming.

EDIT: Oh, and I see a lot of people suggesting turning off SMT. I've never seen any benefit to doing this at all. The most I've ever had to do (and very rarely) is disable cores, and I've only had to do that on Ubisoft titles. I suspect that Ubisoft is like Valve in that they can't count...
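A lighter-weight alternative to disabling cores in the BIOS is to pin the misbehaving game to a subset of logical CPUs from software. A minimal sketch using Python's psutil; the executable name is a placeholder, and the CPU list would need adjusting for your topology:

```python
# Pin a running game to the first 8 logical CPUs instead of
# disabling cores in the BIOS. Requires psutil (pip install psutil).
import psutil

GAME_EXE = "game.exe"  # placeholder: whichever title misbehaves

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        # Restrict scheduling to logical CPUs 0-7 (e.g. one TR die).
        proc.cpu_affinity(list(range(8)))
        print(f"Pinned PID {proc.pid} to CPUs 0-7")
```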
 
Last edited:

Dasa2

Senior member
Nov 22, 2014
245
29
91
It seems Far Cry 5 does have a performance problem, with Threadripper & the 7900X being ~65% slower than the 8600K/8700K.
It seems it's very sensitive to cache/memory latency.
AMD Ryzen Threadripper 2950X Review

https://techreport.com/review/33977/amd-ryzen-threadripper-2990wx-cpu-reviewed/12


But then this review doesn't look so bad, with Threadripper at least competitive with the 2600 instead of being 20 FPS behind, and only ~25% slower than the 8700K, which makes you wonder what's different.

https://www.guru3d.com/articles_pages/amd_ryzen_threadripper_2950x_review,28.html
 
Last edited:

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
I am sorry, but this is the most BS reason to make a CPU choice, and it actually ignores everything that pushes new GPU sales. People don't generally get new video cards just to get additional framerates in previous games. People get a video card to best fit their use case. When new games come out and they aren't getting what they want out of their previous card, they upgrade. No one is going to be sitting at a 4K screen playing a game comfortably, decide "well gosh darn it, I am going to get a new card," find out their current game runs faster, but not as fast as it could if they got another CPU, and feel sad. People get new video cards because a new game is taxing their current card or because they are "prepping" for a new game. A CPU, outside of being patched for something that then kills its performance in an area that matters for a game, will never be worse than it is now in that game. The next game is going to be even more GPU-bottlenecked at whatever resolution you are using than the current one.

That said, I am not saying don't get a CPU that excels at gaming if it's your primary or only use, or if you are on a budget and need to be able to skimp (like getting a 2500K over a 2600K). But people need to be careful about skimping on everything else in chasing the ever-elusive FPS that, outside of truly competitive circumstances or 140Hz setups, is never actually realized; getting something closer to future trends and more professional use might serve them better in the long run.

This selective quote completely and utterly misses the point. You edited out the section where I already stated the 2700X is already the better solution for multicore, heavily multitasked workloads.

GPU doesn't have much to do with what I'm talking about, other than current GPUs being a bottleneck at higher resolutions for game testing, making the 2700/8700 comparison look closer than it probably will be in the future.

It's not a mystery; we've seen it before. A nicely OC'd 8350 and a 2600K back in the 2012 era with a GTX 680 would look basically identical at 1440p in most games of the time, like Battlefield 3, because of the GPU limitation. Fast forward to today, and a 4.5GHz 8350 with an 1170 is pretty CPU-limited, to the point where the minimums are a real distraction. A 2600K @ 4.5-ish is still passable, due to higher IPC.

Your post is just confusing: selectively quoting to ignore something I specifically stated differently, yet calling it BS? It's just rude and bizarre.

The 8700K is better for gaming. It just is. And as new GPUs come, this will still be the case, just as the 2700X is overall superior for all-core loads such as encoding or extreme multitasking. Time won't change that; the Ryzen will maintain that lead over the usable lifetime of the system. This is not a mystery. If gaming is your priority, the 8700K etc. are A+; if it's less important, go Ryzen. Both are decent enough not to be disappointing in any particular area anyway.

Next time you quote someone, do yourself a favor and don't conveniently edit out everything but a fraction, combined with an insulting tone. It just makes you look unhinged and rude.
 

TheELF

Diamond Member
Dec 22, 2012
4,026
753
126
But then this review doesn't look so bad, with Threadripper at least competitive with the 2600 instead of being 20 FPS behind, and only ~25% slower than the 8700K, which makes you wonder what's different.
TechPowerUp at least claims this (their tests are real games, not 3D benchmarks).
So I would guess it's the difference between running the built-in benchmark (which is just a 3D benchmark) and actually playing the game.

https://www.techpowerup.com/reviews/Performance_Analysis/Far_Cry_5/5.html
On their GPU review of this game they state
We tested the game using actual gameplay, with the latest drivers for AMD and NVIDIA (which were released just a few hours ago). While a benchmark is available, it seems quite short and doesn't accurately represent in-game performance (the FPS results from the benchmark are roughly 10% higher than what you'll see in typical gameplay).
 

gipper53

Member
Apr 4, 2013
76
11
71
I just built a new 8700K system a couple of weeks ago, upgrading from Ivy Bridge. I was going to wait for 9th gen, but once it was confirmed that Z370 is compatible, I said f-it and pulled the trigger. Microcenter deals made it too tempting to hold out any longer.

I'm thrilled with the 8700K and a mild 4.8GHz OC. That said, I'm still very interested in what the 9900K will bring. I'm not a gamer, but my Photoshop workload can use extra cores for certain tasks (RAW exporting, mainly). If the gains are enough, I'll consider swapping the 8700K for the 8-core. Not sure at the rumored $450 price, though...
 

Dasa2

Senior member
Nov 22, 2014
245
29
91
I'm not a gamer, but my Photoshop workload can use extra cores for certain tasks (RAW exporting, mainly). If the gains are enough, I'll consider swapping the 8700K for the 8-core.

My results using this Photoshop CC benchmark:
https://www.pugetsystems.com/labs/articles/Puget-Systems-Adobe-Photoshop-CC-Benchmark-1132/

2133 to 3866C16, percentage increase:
Overall Score: 9.5%
General Score: 14.5%
Filter Score: 12.2%
Photomerge Score: 4.3%
GPU Score: 19%



Some older tests:
Lightroom also sees over a 20% improvement from higher RAM speeds when exporting, but only ~10% when rendering previews.

Time taken to export 150 24MP .NEF files to JPEG:
2133C15: 3:20
3200C14: 2:50
3866C16: 2:39

Time taken to build 150 24MP 1:1 previews:
2133C15: 6:03
3200C14: 5:33
3866C16: 5:23
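Converting those export times to seconds makes the RAM-speed scaling easy to verify; a quick sketch using only the numbers above:

```python
# Convert the mm:ss export times above to seconds and compute
# the speedup from faster RAM, relative to the 2133C15 baseline.
def secs(t):
    m, s = t.split(":")
    return int(m) * 60 + int(s)

export = {"2133C15": "3:20", "3200C14": "2:50", "3866C16": "2:39"}
base = secs(export["2133C15"])
for kit, t in export.items():
    gain = (base - secs(t)) / base * 100
    print(f"{kit}: {t} ({gain:.1f}% faster than baseline)")
# 3866C16 works out to ~20.5% faster, matching the "over 20%" figure.
```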
 
Reactions: ZGR

Topweasel

Diamond Member
Oct 19, 2000
5,437
1,658
136
This selective quote completely and utterly misses the point. You edited out the section where I already stated the 2700X is already the better solution for multicore, heavily multitasked workloads.

GPU doesn't have much to do with what I'm talking about, other than current GPUs being a bottleneck at higher resolutions for game testing, making the 2700/8700 comparison look closer than it probably will be in the future.

It's not a mystery; we've seen it before. A nicely OC'd 8350 and a 2600K back in the 2012 era with a GTX 680 would look basically identical at 1440p in most games of the time, like Battlefield 3, because of the GPU limitation. Fast forward to today, and a 4.5GHz 8350 with an 1170 is pretty CPU-limited, to the point where the minimums are a real distraction. A 2600K @ 4.5-ish is still passable, due to higher IPC.

Your post is just confusing: selectively quoting to ignore something I specifically stated differently, yet calling it BS? It's just rude and bizarre.

The 8700K is better for gaming. It just is. And as new GPUs come, this will still be the case, just as the 2700X is overall superior for all-core loads such as encoding or extreme multitasking. Time won't change that; the Ryzen will maintain that lead over the usable lifetime of the system. This is not a mystery. If gaming is your priority, the 8700K etc. are A+; if it's less important, go Ryzen. Both are decent enough not to be disappointing in any particular area anyway.

Next time you quote someone, do yourself a favor and don't conveniently edit out everything but a fraction, combined with an insulting tone. It just makes you look unhinged and rude.
I captured a statement used over and over and over again that doesn't hold water. Even your example is handled in the worst way. That is quite possibly the worst example of differences between CPUs to start with, and still the BD gained performance over the 2600K. So how do you make it seem worse? Well, talk about overclocks, giving a defined example for one and leaving the other as "nice".

My point, and the only thing I cared to take from your original post, was the one that I found BS. It's still BS. Most people, as a generality, upgrade video cards to go from one GPU bottleneck to another. As it pertains to this topic, any future games are more likely to use the increased resources of the 8700 or 9900 or Ryzen 5 or 7, not to keep relying on ST performance as the sole indicator of outright performance. That makes your statement, and the dozens like it over the past year, cling to outdated performance markers.
 

TheELF

Diamond Member
Dec 22, 2012
4,026
753
126
any future games are more likely to use the increased resources of the 8700 or 9900 or Ryzen 5 or 7, not to keep relying on ST performance as the sole indicator of outright performance. That makes your statement, and the dozens like it over the past year, cling to outdated performance markers.
*This never happened and never will; it all depends on the consoles, because all games are coded for consoles. And since they have already shown us how lazy they are, you can be sure that even if the new consoles have better CPUs, games won't run any better, because the devs will keep coding games the way they do now.

*ever since the consoles took over

Also, we have tons of games that already scale to 8 cores and more, but still nothing has changed.
 

urvile

Golden Member
Aug 3, 2017
1,575
474
96
For those who put the 8700K on air at 5GHz: please post your HandBrake scores. Here are mine for this video, ripping to 2580x1080 @ 60 fps. I am genuinely curious what sort of temps those on air are getting. I am currently running 2 x 140mm fans on my radiator off the 4-pin CPU header on my motherboard. The pump runs off a pump header. If you are getting temps lower than mine, I would like to know.

I have some new RAM (3200MHz) incoming, so when I fit that I might adjust what the fans are plugged into.
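For anyone wanting to reproduce a comparable encode from the command line, something along these lines should work; a minimal sketch assuming HandBrakeCLI is installed and on PATH, with placeholder filenames:

```python
# Shell out to HandBrakeCLI for a 2580x1080 @ 60 fps x264 encode,
# roughly matching the test above. Filenames are placeholders.
import subprocess

cmd = [
    "HandBrakeCLI",
    "-i", "input.mkv",              # source video
    "-o", "output.mp4",             # destination
    "-e", "x264",                   # software x264: loads all cores
    "-q", "20",                     # constant-quality RF 20
    "--width", "2580", "--height", "1080",
    "-r", "60", "--cfr",            # constant 60 fps output
]
subprocess.run(cmd, check=True)
```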


 

Topweasel

Diamond Member
Oct 19, 2000
5,437
1,658
136
*This never happened and never will; it all depends on the consoles, because all games are coded for consoles. And since they have already shown us how lazy they are, you can be sure that even if the new consoles have better CPUs, games won't run any better, because the devs will keep coding games the way they do now.

*ever since the consoles took over

Also, we have tons of games that already scale to 8 cores and more, but still nothing has changed.

Are you sure? BF MP? That RTS that scales down max units based on the CPUs playing the game? There are more, but that's the thing: for the most part we are dealing with engines made for DX11 and hardware that was available years ago. Things are slow to change, but they are changing. How you can't see that, I don't understand.
 

TheELF

Diamond Member
Dec 22, 2012
4,026
753
126
Are you sure? BF MP? That RTS that scales down max units based on the CPUs playing the game?
BF MP in general has no problems at all; BF MP on certain maps has definite issues, just like Fallout 4 has definite issues on certain "maps" (the Corvega factory or the big settlements).
That RTS is not made for consoles, though, is it? Thanks for helping my point.
 

TheELF

Diamond Member
Dec 22, 2012
4,026
753
126
Things are slow to change, but they are changing. How you can't see that, I don't understand.
I do see them change; that's what my whole post was all about. Devs used to have to port games over to PC, and while those ports still had lots of problems, the efficiency was still better (you would get more FPS per core). Now it's complete rubbish, and some games run like dirt even on the best CPUs available and on plenty of cores.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
I captured a statement used over and over and over again that doesn't hold water. Even your example is handled in the worst way. That is quite possibly the worst example of differences between CPUs to start with, and still the BD gained performance over the 2600K. So how do you make it seem worse? Well, talk about overclocks, giving a defined example for one and leaving the other as "nice".

My point, and the only thing I cared to take from your original post, was the one that I found BS. It's still BS. Most people, as a generality, upgrade video cards to go from one GPU bottleneck to another.

https://youtu.be/QrNKeLhDhsQ

The 8350 hasn't really gained anything on the 2600K. I mean, when they were new, the GPUs were strongly bottlenecking both at 1080p/ultra. Gamers tend to keep a CPU for at least a couple of GPU spins due to the glacial progress in that area, but now that there is more headroom to test with, we can see sizable gaps at 1080p, moderate gaps at 1440p, and although testing wasn't done at 4K, I'd guess the gaps there would be pretty small.

The 8700K starting out faster than current Ryzens in gaming is not likely to change, and the gaps seem most pronounced when the GPU used doesn't bottleneck the test at 1080p. At 1440p/4K, the gap is usually impossible to see because the GPU has no more headroom. If GPUs with more headroom are used, the same % gap we already see at 1080p will emerge (e.g., an 8700K user in 2021 with a 7nm+ 16GB Navi derivative or the GTX *80 of the time).

This is much less of an issue if the user is more of a mid-range buyer. Someone who used a 660 Ti in 2012 and a 960 4GB in 2015, and is only now looking at maybe a 1060, would have seen few situations where an 8350 wouldn't have been about equal to a 2600K at reasonable settings, such as 1080p/medium-high, in common games.

A higher-end user who went from a GTX 680 to an R9 290 to a GTX 1080, on the other hand, would see a difference even at 1440p, but especially at 1080p, once they got to the newer cards.

This isn't controversial. A gamer today with a new rig using a 1060 6GB would probably see basically zero difference between a 2700X and an 8700K for gaming, because the 1060 is too slow to show the gap. The Ryzen being better dollar for dollar would easily make more sense if they are planning on, say, keeping the CPU/mobo for 3-4 years, with maybe only a single mid-range GPU upgrade, if any, during that time.
 
Reactions: Zucker2k

urvile

Golden Member
Aug 3, 2017
1,575
474
96
I suspect he did not try anything. He's probably running dual-channel DDR4-2133 or some nonsense and running in Distributed mode. Some BIOS tuning and decent RAM puts the Threadripper 1920X/1950X ahead of the 1800X for gaming.

EDIT: Oh, and I see a lot of people suggesting turning off SMT. I've never seen any benefit to doing this at all. The most I've ever had to do (and very rarely) is disable cores, and I've only had to do that on Ubisoft titles. I suspect that Ubisoft is like Valve in that they can't count...

I actually turned it off and back on again. Then I set up a shrine to Lisa Su and prayed to it every day. Still nothing. So I built an 8700K-based gaming monster, and now the TR runs as a Hyper-V and Plex server, which it does very well.
 
Reactions: ehume and Zucker2k

urvile

Golden Member
Aug 3, 2017
1,575
474
96
Simply because I now have a monster gaming rig... and I'm culling the herd.

 

gipper53

Member
Apr 4, 2013
76
11
71
My results using this Photoshop CC benchmark:
https://www.pugetsystems.com/labs/articles/Puget-Systems-Adobe-Photoshop-CC-Benchmark-1132/

2133 to 3866C16, percentage increase:
Overall Score: 9.5%
General Score: 14.5%
Filter Score: 12.2%
Photomerge Score: 4.3%
GPU Score: 19%

Some older tests:
Lightroom also sees over a 20% improvement from higher RAM speeds when exporting, but only ~10% when rendering previews.

Time taken to export 150 24MP .NEF files to JPEG:
2133C15: 3:20
3200C14: 2:50
3866C16: 2:39

Time taken to build 150 24MP 1:1 previews:
2133C15: 6:03
3200C14: 5:33
3866C16: 5:23


Yeah, I've been using that benchmark for a little while. My previous 3770 was scoring in the 580 range for the overall score. I felt it was finally time to move on.

Results with the new system at 4.8GHz and 32GB of 3200C16 memory:

Photoshop CC 2018 (Ver. 19.1.5) scores:
Overall Score: 1031
General Score: 97.6
Filter Score: 105.9
Photomerge Score: 108.5
GPU Score: 95.1

Best times (seconds):
RAW File Open: 2.1
Resize to 500MB: 2.42
Rotate: 1.01
Magic Wand Select: 6.05
Mask Refinement: 3.92
Paint Bucket: 2.16
Gradient: 0.55
Content Aware Fill: 10.21
PSD File Save: 3.92
PSD File Open: 2.58
Camera Raw Filter: 5.28
Lens Correction: 14.79
Reduce Noise: 18.53
Smart Sharpen: 21.29
Field Blur: 14.1
Tilt-Shift Blur: 12.64
Iris Blur: 14.39
Adaptive Wide Angle: 15.38
Liquify: 7.18
Photomerge 22MP Images: 74.87
Photomerge 45MP Images: 95.81
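That puts the new build well ahead of the old Ivy Bridge score quoted above; a trivial check of the overall-score jump, using only the two numbers from these posts:

```python
# Overall-score improvement from the old 3770 (~580, per the post
# above) to the new 8700K system's 1031.
old_score, new_score = 580, 1031
print(f"{(new_score - old_score) / old_score * 100:.0f}% higher")  # ~78%
```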
 