AMD Ryzen (Summit Ridge) Benchmarks Thread (use new thread)

Status
Not open for further replies.

DeletedMember377562

Who told you anything about "real-life scenario", dammit? We talk about testing CPU performance. CPU performance does not improve with resolution, so it is natural to test it at a resolution that keeps the GPU load low. Some go all the way, like [H] did. Others do not, and keep it at 1080p and high quality settings. But hey, it helps to establish the upper bound on overall framerates you can achieve at any resolution higher than 1080p.
Those tests are there to show how CPUs perform in games. The very fact that you test CPUs in games is the starting point of them making that case, as you don't test "CPU performance" in applications that are heavily GPU-dependent (which games are). And they are there to show a real-life scenario. Meaning a real-life representation. If they weren't, then the resolution tested at would be much, much lower than 1080p. Richard himself has made a point of this in his reviews (Skylake, Skylake vs. Haswell-E, and now their two Kaby Lake tests), and also discusses it in this one.
They are never supposed to represent real-life experience when they are CPU tests; they are supposed to represent CPU performance. Now, gaming tests are a whole other ordeal.
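The upper-bound argument being fought over here can be sketched as a toy model: the observed frame rate is roughly the minimum of the CPU-limited rate and the GPU-limited rate at a given resolution. A minimal Python sketch, with purely illustrative (not measured) numbers:

```python
def observed_fps(cpu_fps_cap: float, gpu_fps_at_res: float) -> float:
    """The slower component sets the frame rate."""
    return min(cpu_fps_cap, gpu_fps_at_res)

# Illustrative numbers only: a CPU that can feed 134 frames per second,
# and a GPU that renders 180 fps at 1080p but only 104 fps at 1440p.
cpu_cap = 134
gpu_fps = {"1080p": 180, "1440p": 104}

# At 1080p the CPU is the bottleneck, so the 1080p result (134 fps)
# is the ceiling this CPU can reach at any higher resolution.
print(observed_fps(cpu_cap, gpu_fps["1080p"]))  # 134 (CPU-limited)
print(observed_fps(cpu_cap, gpu_fps["1440p"]))  # 104 (GPU-limited)
```

This is why a low-resolution CPU test still says something about higher resolutions: it pins down the CPU-side ceiling, even when the GPU becomes the limit later.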

These are gaming tests we are talking about.
The irony of this statement is that I am willing to bet BF1 is CPU-limited by an i5 at 1440p too! Lemme check real quick... Yes, I probably would be CPU bottlenecked at 1440p by a locked Haswell i5 on a GTX 1080.

Lol, check? How did you check? And where is the irony, when even I myself said that an i5 probably would perform worse than an i7 in BF1 MP at 1440p (but not by much). Or did you also miss that part?
Learn what's called "context". Then apply it.

Context, as in the context of DF helping create this image among many of its viewers that they HAVE to buy a 7700K, otherwise they'll be bottlenecked? That's the image the average viewer is getting. Hell, it's the image that even enthusiasts on forums use as "evidence" that games are more multithreaded. It's even the basis Richard himself uses in that test when claiming that "i5 is not enough for gaming". It's funny that you talk about context, without having even seen the DF video yourself. Maybe you should teach Richard about context too, huh?
Because they are YouTubers with all that entails. Otherwise, how do you explain them using SLI 1080s in 2016/2017?

It's completely rational to assume that someone with GTX 1080s in SLI would run in a 4K system.
 
Last edited by a moderator:

DeletedMember377562

Huh.. 15x more people use two 1080p monitors for gaming compared to 1440p users. And you don't think they'll be using a GTX 1080?
15x more people? You are not very good with numbers, are you? 2.7% of users have multi-monitors with a combined 3840x1080p resolution, vs. 1.8% with only 1440p displays. How do you make that into 15x more?
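For what it's worth, the arithmetic on the two quoted survey shares is trivial to check:

```python
# Shares quoted above from the Steam hardware survey:
share_3840x1080_multi = 2.7  # % of users with 3840x1080 multi-monitor setups
share_1440p = 1.8            # % of users with 2560x1440 displays

ratio = share_3840x1080_multi / share_1440p
print(round(ratio, 2))  # 1.5, i.e. 1.5x, not 15x
```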

Also, ultrawides and multi-monitor setups are becoming a lot more common now, and if anything I think we're likely to see more users on 2560x1080 than 2560x1440 in the coming years; that's my opinion.

2560x1080 is an outdated resolution. Well, not outdated exactly, but something that came with monitors a few years back. Newer ultrawide monitors are 3440x1440, and GPUs are able to push this res more and more as we go along. Furthermore, even if we were to assume 2560x1080, it still supports my argument, which is that when people play at higher resolutions, the kind of CPU you have becomes less important.
 
Last edited by a moderator:

ecogen

Golden Member
Dec 24, 2016
1,217
1,288
136
15x more people? You are not very good with numbers, are you? 2.7% of users have multi-monitors with a combined 3840x1080p resolution, vs. 1.8% with only 1440p displays. How do you make that into 15x more?

It's pretty obvious that it was meant to be 1.5x and he forgot a dot, chill dude.
 

lolfail9001

Golden Member
Sep 9, 2016
1,056
353
96
The very fact that you test CPUs in games is the starting point of them making that case, as you don't test "CPU performance" in applications that are heavily GPU-dependent (which games are).
You very much do, because CPU sets the upper bound on GPU performance.
And they are there to show a real-life scenario.
Replace the equals sign with a less-than sign and here is your "real-life" scenario.
Meaning a real-life representation.
These tests are done in real life, man.
If they weren't, then the resolution tested at would be much, much lower than 1080p.
Guess what, [H]ardOCP does that.
These are gaming tests we are talking about.
Do they compare GPUs as well? If they only compare CPU/memory, they are CPU tests in games, not gaming tests.
Lol, check? How did you check?
Made a trivial deduction based on 3 numbers from 3 configurations. Here they are: GTX 1080 @ 1080p + 6700K = 134 fps; GTX 1080 @ 1080p + 4670K = 90 fps; GTX 1080 @ 1440p + 6700K = 104 fps.
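The deduction in that post can be made explicit. Assuming frame rate is set by the slower of CPU and GPU, the three quoted figures are enough to show the i5 would still bottleneck a GTX 1080 at 1440p (a sketch using the post's numbers, not new measurements):

```python
# Figures quoted in the post:
fps_1080p_6700k = 134  # GTX 1080 + 6700K @ 1080p: CPU-limited ceiling of the 6700K
fps_1080p_4670k = 90   # GTX 1080 + 4670K @ 1080p: CPU-limited ceiling of the i5
fps_1440p_6700k = 104  # GTX 1080 + 6700K @ 1440p: ~GPU-limited rate at 1440p

# The i5's CPU ceiling (90 fps) sits below the GPU-limited 1440p rate
# (104 fps), so at 1440p the i5, not the GPU, would still set the frame rate.
i5_bottlenecked_at_1440p = fps_1080p_4670k < fps_1440p_6700k
print(i5_bottlenecked_at_1440p)  # True
```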
Context, as in the context of DF helping create this image among many of its viewers that they HAVE to buy a 7700K, otherwise they'll be bottlenecked?
It is a valid image in fact, because in a few years the Titan XM's performance should trickle down to the ~$200 GPU class, landing in combination with 1080p monitors by nature of its cost. And in the overwhelming majority of games, it will be CPU bottlenecked unless you have something akin to a 6700K. But hey, having some sense of perspective is too hard, I know.
Hell, it's the image that even enthusiasts on forums use as "evidence" that games are more multithreaded.
That evidence is done by checking Windows Task Manager, don't be dense.
It's funny that you talk about context, without having even seen the DF video yourself.
I have seen enough of them to conclude that when they compare CPUs, they compare CPUs, not gaming on them, and establish one thing: the 6700K/7700K is straight superior to the rest of the pack.
It's completely rational to assume that someone with GTX 1080s in SLI would run in a 4K system.
And why not a 1440p@144Hz system instead? Or ultrawide 1440p@100Hz? See, when we talk about people with a few thousand bucks in their PCs, you cannot assume how much they dump into monitors.
 

Crumpet

Senior member
Jan 15, 2017
745
539
96
15x more people? You are not very good with numbers, are you? 2.7% of users have multi-monitors with a combined 3840x1080p resolution, vs. 1.8% with only 1440p displays. How do you make that into 15x more?



2560x1080 is an outdated resolution. Not outdated, but more something that came with monitors few years back. Newer ultrawide monitors have 3440x1440, and GPUs are able to push this res more and more as we go along. Furthermore, even if we were to assume 2560x1080, it still goes under my argument, as it's all based on how when people play in higher resolutions, the kind of CPU you have becomes less important.


2560 x 1440: 1.84% (+0.01%)
3840 x 1080: 30.91% (+0.41%)

Do the maths.

Oh and 2560x1080 is outdated?

Acer Predator Z35, 35" 2560x1080p curved 144hz gaming monitor.
Release date: December 2015

Acer BX340ck 2560x1080p
Release date: December 2015

That's only really one year ago.

And with one third of the Steam users gaming at 1080p, who's to say they won't step up to an ultrawide, considering how much popularity they are gaining. Especially as prices come down.

The thing is, the average user isn't using brand new hardware, so why will they be using brand new monitors and gaming at higher resolutions?
 

DeletedMember377562

You very much do, because CPU sets the upper bound on GPU performance.
.

And higher resolutions make this irrelevant, which is what GTX 1080 users play at. They don't play at 1080p.
These tests are done in real life, man.

At 1080p, with an OCed Titan XP. Which is unrealistic for a user of that card.
Guess what, [H]ardOCP does that.

Guess what? I mentioned that other sites do that, and criticized them for it. Yet another part of my earlier posts you did not even read?
You very much do, because CPU sets the upper bound on GPU performance.
It is a valid image in fact, because in a few years the Titan XM's performance should trickle down to the ~$200 GPU class, landing in combination with 1080p monitors by nature of its cost. And in the overwhelming majority of games, it will be CPU bottlenecked unless you have something akin to a 6700K. But hey, having some sense of perspective is too hard, I know.

We've been through this before. Let's suppose you are right. But by the time this happens, the 1440p users of today on high-end systems will have moved over to 4K, and the 1080p users of today will have moved over to 1440p. So no, it is not a valid image, either of today or of the future.

But hey, having some sense of perspective or understanding of what you are saying is too hard, I know...
I have seen enough of them to conclude that when they compare CPUs, they compare CPUs, not gaming on them, and establish one thing: the 6700K/7700K is straight superior to the rest of the pack.
.

In other words, you haven't seen the video or read the test we are discussing, in which Richard (the narrator) very clearly uses the results as an argument for i5 being bottlenecked in modern games (the whole test is about how these CPUs perform in games). But that's a bullshit statement, as the GPU he tests with would almost always be used in 1440p and 4K by its users, not 1080p.
And why not a 1440p@144Hz system instead? Or ultrawide 1440p@100Hz? See, when we talk about people with a few thousand bucks in their PCs, you cannot assume how much they dump into monitors.

Again you completely ignore everything I have ever said. Not to mention you completely lack understanding of what you yourself are talking about. Even a GTX 1070 will achieve 95 FPS on average at 1440p. A GTX 1080 will achieve 110-120 FPS. An OCed Titan XP easily does 140 FPS at 1440p. So they have every incentive to go to higher resolutions when adding a second card to their system (which, in BF1's case, will give an increase of about 80% in performance).

If we suppose ultrawide, it still supports my point, as modern ultrawide displays have a much higher resolution than 2560x1440, and are therefore more GPU-bound again.
 
Last edited by a moderator:

zinfamous

No Lifer
Jul 12, 2006
110,806
29,557
146
I'm making bold statements? Coming from the guy claiming that GTX 1080 users are using 1080p displays.



GTX 1080 literally is this "1% bubble". It's an enthusiast GPU.



Ok, fine. Let's look at some proper surveys. Let's look at Steam, a client that has hundreds of millions of users from all respective segments.

According to Steam Survey, GTX 1080 owners make up 0.99% of the users:

http://store.steampowered.com/hwsurvey/videocard/

According to Steam, 1440p is used by 1.84% of users: http://store.steampowered.com/hwsurvey

So that means more people have 1440p monitors than own GTX 1080s; the rest I assume are people using GTX 1070s and probably a few 980 Tis as well.

So if we are to go by any relatable surveys out there, you are wrong. GTX 1080 users are in general using 1440p displays.




You make it sound like there's an effort to be had to look at benchmarks. Give me a break.


So you're taking two separate values, looking at their independent usage numbers, and making an assumption about their relatedness. You can't even qualify (in a significant manner) those two values as correlative.

You seem to have very little experience working with actual data. If you do this for a living, you should ask your employer to pay for your tuition, because you need to go back to school.

I don't know what else to tell you man. You are simply defining an argument with your pre-determined answer, and seeking out whatever data supports your answer. This may not be completely dishonest, but it is wholly incompetent.

I'm pretty sure now that you created this account yesterday explicitly to troll certain threads here. Further, you aren't bringing any real ammo to the table with your charges against "proper benchmarking" which contradicts pretty much every accepted standard over the years. Again, you are accepting the answer that you want and whatever standard tool contradicts that answer, is somehow illegitimate.
 
Reactions: dfk7677

DeletedMember377562

2560 x 1440: 1.84% (+0.01%)
3840 x 1080: 30.91% (+0.41%)

Do the maths.

Lol. Are you trying to fabricate numbers all of a sudden?

The 30.91% is 30.91% of multi-monitor setups, not of monitors in general. Either you are very, very bad at reading numbers, or you are playing dumb on purpose to further your argument.
It's pretty obvious that it was meant to be 1.5x and he forgot a dot, chill dude.

Is it still "pretty obvious" to you, now that he clearly made a point of the opposite?
Oh and 2560x1080 is outdated?

Acer Predator Z35, 35" 2560x1080p curved 144hz gaming monitor.
Release date: December 2015

Acer BX340ck 2560x1080p
Release date: December 2015

Wow, you found two examples! Congrats!

Here: https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100160979 4814 601276169&IsNodeId=1&cm_sp=Tab_Gaming-Monitors_2-_-TopNav-_-27inch

Look at the number of 2560x1080 monitors that are much older than that. Now look at the number of 3440x1440 monitors, which all happen to be pretty new.

The thing is, the average user isn't using brand new hardware, so why will they be using brand new monitors and gaming at higher resolutions?

Oh, I'm sorry. Did I forget to mention that we are discussing GTX 1080 users (who amount to 1%) playing on 1440p resolution (who amount to 1.8%)? Did you not get that part?
 

DeletedMember377562

so you're taking two separate values, looking at their independent usage numbers, and making an assumption towards their relatedness. You can't even qualify (in a significant manner) those two values as correlative .
.

Actually, you can. More than your claims based on... well, based on NOTHING other than your own words. You can do it based on the fact that if it's not GTX 1080s running those monitors, or even GTX 1070s, it must be GPUs that clearly cannot handle those resolutions at any good frame rates. GTX 1080s, however, can handle that resolution perfectly, and are therefore the more logical cards for those resolutions.

But of course it makes a lot more sense to you that GPUs below the GTX 1080 are running 1440p monitors than the 1080s themselves. Because all those GTX 1080s are occupying 1080p monitors, right? "zinfamous logic", right there...

Having GTX 1080s running 1080p monitors is like having GTX 970s running 640x480 monitors. I guess in your fucked up world the overwhelming majority of GTX 970 users have 480p monitors as well...

You seem to have very little experience working with actual data. If you do this for a living, you should ask your employer to pay for your tuition, because you need to go back to school.

Yes, because the amount of data you have presented in our discussion is staggering...

so you're taking two separate values, looking at their independent usage numbers, and making an assumption towards their relatedness. You can't even qualify (in a significant manner) those two values as correlative .
This may not be completely dishonest, but it is wholly incompetent.
.

Whereas making the claim that the overwhelming majority of GTX 1080 users are on 1080p displays is wholly competent. Bwahhahahhahahha...

Again, you are accepting the answer that you want and whatever standard tool contradicts that answer, is somehow illegitimate.

What tool? You haven't brought forward any arguments.

I'm just criticizing how sites benchmark GPUs like the GTX 1080 and Titan XP only at 1080p, when these GPUs, by every feasible understanding, are used at higher resolutions. Of course, I know you don't understand that. But it's not my fault you have zero understanding of how the world works.

Insulting other members is not allowed.
Markfw
Anandtech Moderator
 
Last edited by a moderator:

lolfail9001

Golden Member
Sep 9, 2016
1,056
353
96
Actually, you know, I was about to write a long reply noting that you are not really competent at all, use imaginary sources, et cetera.

But Abwx put me back on track, it's derailed enough.
 
Reactions: dfk7677

parkerface

Member
Aug 15, 2015
49
32
91
People that own BF1 and an i5 will have understood what I mean by now.

I'm using a 4670k running at 4.6ghz and I definitely see CPU related stuttering in BF1. My rx480 can play 1080p ultra without breaking a sweat, but a 64 player game on something like Empire's Edge or Suez Canal can cause framerates to drop down to mid 40's during the chaos.

Getting a CPU with more threads is definitely next on my upgrade list.
 

DeletedMember377562

"Imaginary sources":

http://www.eurogamer.net/articles/digitalfoundry-2016-what-is-the-fastest-gaming-cpu

http://www.eurogamer.net/articles/digitalfoundry-2017-intel-kaby-lake-core-i7-7700k-review

http://www.eurogamer.net/articles/digitalfoundry-2015-intel-skylake-core-i7-6700k-review

http://www.tomshardware.com/reviews/multi-core-cpu-scaling-directx-11,4768-2.html

But let's look at the sources brought forward by you guys. Oh wait...there are NONE.

I'm using a 4670k running at 4.6ghz and I definitely see CPU related stuttering in BF1. My rx480 can play 1080p ultra without breaking a sweat, but a 64 player game on something like Empire's Edge or Suez Canal can cause framerates to drop down to mid 40's during the chaos.

Getting a CPU with more threads is definitely next on my upgrade list.

Again, in 1080p, that's completely understandable. Nobody has denied that. What we were discussing was in 1440p -- which I even admitted could push an i5 to its limits in BF1 MP, but not in 98% of other cases and games (as shown in the TH test above).

This is a Zen thread. Get back on topic.
Markfw
Anandtech Moderator
 
Last edited by a moderator:

parkerface

Member
Aug 15, 2015
49
32
91
Again, in 1080p, that's completely understandable. Nobody has denied that. What we were discussing was in 1440p -- which I even admitted could push an i5 to its limits in BF1 MP, but not in 98% of other cases and games (as shown in the TH test above).

You are not going to rope me in to your neurotic thread crapping. Move along.
 

Crumpet

Senior member
Jan 15, 2017
745
539
96
I'm using a 4670k running at 4.6ghz and I definitely see CPU related stuttering in BF1. My rx480 can play 1080p ultra without breaking a sweat, but a 64 player game on something like Empire's Edge or Suez Canal can cause framerates to drop down to mid 40's during the chaos.

Getting a CPU with more threads is definitely next on my upgrade list.

Well that at least correlates with my own experience, so there's that.

To be honest, I'm not asking for much from Ryzen; it doesn't even need to be better than my current i5. It just needs to not be a quad core with 4 threads. So for me at least, Zen is a massive success. Anything past that is just a bonus.

I've really enjoyed reading into the direction AMD has gone, though: taking steps to improve the way we compute, rather than just chasing sheer power.
 

parkerface

Member
Aug 15, 2015
49
32
91
Well that at least correlates with my own experience, so there's that.

To be honest, i'm not asking for much from Ryzen, it doesn't even need to be better than my current i5.. It just needs to not be a quad core with 4 threads. So for me at least, Zen is a massive success. Anything past there is just a bonus.

I've really enjoyed reading into the direction AMD has gone though, taken steps to improve the way we compute, rather than just sheer power.

That sums up my thoughts as well. If a 4790k wasn't so damn expensive (for an older CPU) I'd drop that in--but for the money I'd have to pay to get one I feel like I may as well sell my current setup and have nearly enough for a nice Ryzen upgrade.
 

zinfamous

No Lifer
Jul 12, 2006
110,806
29,557
146
I'm just criticizing how sites benchmark GPUs like the GTX 1080 and Titan XP only at 1080p, when these GPUs, by every feasible understanding, are used at higher resolutions. Of course, I know you don't understand that. But it's not my fault you have zero understanding of how the world works.

They aren't benchmarking the GPUs. They are benchmarking the CPUs. That is the point.

I think this is your problem--you don't understand the test?

I don't know how the world works? Granted, I don't spend the majority of my time fantasizing about hardware and the habits of a tiny niche population of uber gamers, like you seem to do, but I do live and work in the real world. I'm a geneticist by trade, so I understand something about moving large data sets--I'm talking extremely large; not something you will ever see in this kind of hardware benchmarking--as well as proper, scientifically valid testing.

I really haven't seen you put forth a convincing argument that the current standard is as flawed as you claim. But I get it--it seems that you really don't understand why we are using overpowered GPUs at "underpowered" resolutions in order to benchmark CPU performance.

You sound... young? And so full of hubris, with a general inability to understand and accept how your assertions could actually be very wrong when you refuse to understand a certain concept. It's OK; it pretty much happens to everyone as they grow up.
 
Last edited:
Reactions: HutchinsonJC

Veradun

Senior member
Jul 29, 2016
564
780
136
Well, I'm a GTX 970 user and I'm at 1440p..

My girlfriend has an R9 390 and she's at 1440p.. (She's even using an AMD CPU, the horror! Oh... wait... no, nvm, it games at 1440p just fine.)
I have an old terascale era HD6950 connected to an Asus MG279Q myself

Well , I'm waiting for Vega, but I'm currently playing games :>
 
Reactions: Crumpet

Crumpet

Senior member
Jan 15, 2017
745
539
96
Has AMD released any new RyZen benchmarks since the blender demo?

If they had we probably wouldn't be sat here arguing about Nvidia Graphics cards..

We are all in desperate need of some new info. However I cannot blame AMD for keeping this under such tight wraps. Whenever they release anything they are called out for "overhyping". Though at this point I genuinely don't believe I could be any more hyped for the release.

I genuinely loved my FX-8350, probably because it was flawed. It had character, it was the plucky underdog, it overclocked like a demon, and it never let me down.
My i5 in comparison is a great chip, it plugs away and does stuff without complaining. It's a bit like a librarian. It does what it does really well, but there's just absolutely nothing special about it.
 

DrMrLordX

Lifer
Apr 27, 2000
21,804
11,157
136
I don't have even 1080p

I'm still on 1600x1200. 4:3 4 lyfe!!!

Has AMD released any new RyZen benchmarks since the blender demo?

Nope.

Er, they haven't, have they? I haven't seen anything.

Latest rumours have been concerning what DDR4 speeds will be supported by Ryzen/Summit Ridge. Some are saying good luck with DDR4-3200, others are saying DDR4-4000... no real hard, concrete data there though.
 

Crumpet

Senior member
Jan 15, 2017
745
539
96
Well, technically Skylake only supports DDR4-2133 and Kaby Lake DDR4-2400, so it's down to the motherboard manufacturers really, isn't it?
 

zinfamous

No Lifer
Jul 12, 2006
110,806
29,557
146
It's completely rational to assume that someone with GTX 1080s in SLI would run in a 4K system.

You're right! It is completely rational to assume this.

Now, find the actual data. Let's stop assuming... but let's also stop making the mistake that a rational assumption is an accurate approximation of the commonly irrational decision-making of the common consumer.

(and yes--I agree with you that, especially with SLI, this is where you will find 1080 users--but finding the data is a different monster)
 

coercitiv

Diamond Member
Jan 24, 2014
6,393
12,826
136
Though at this point I genuinely don't believe I could be any more hyped for the release.
Trust me, you could be: if a badly taken photo of an AM4 BIOS were to show a 6c/12t chip overclocked to 4.7GHz, you would enter a completely different state of hype.

In fact, the mere mention of the words above will prompt some people to ask if it's true.
 