I was leery of Ryzen's power-consumption numbers all the way from AMD's demo to clueless reviewers using P95 stress-test numbers to report power consumption when Ryzen wasn't running that code optimally. Finally, here's a more realistic result for the actual power Ryzen consumes under full load. The 65W chip is at 113W, the 95W chip at 161W, while Intel's 140W part is at 132W. No free lunch here, folks!
http://hexus.net/tech/reviews/cpu/103270-amd-ryzen-7-1700-14nm-zen/?page=7
To emulate real-world usage scenarios, we record system-wide mains power draw when idle, when encoding video via HandBrake, and while playing Deus Ex: Mankind Divided.
The Ryzen 7 1700X's lower clock speed means that it takes a reasonable chop out of the R7 1800X's power consumption.
That same advice is applicable to the R7 1700, whose 65W TDP translates into lower multi-thread power consumption than a four-core Core i7-7700K, and you already know that it obliterates said chip in applications such as Cinebench and HandBrake.
AMD has done an impressive job in the performance-per-watt metric on Ryzen, with this design philosophy best exemplified by the R7 1700.
I know this..... but thanks for pointing it out anyway.
TDP does not equal power consumption.
Krumme, must you always see red everywhere when someone raises any question about AMD? I am very well aware of Ryzen's strengths in media encoding. I was merely pointing out that this is the first time (that I am noticing) that the reported power consumption is actually in line with what I expected it to be. A lot, if not the majority, of early reviewers took their power-consumption numbers from Prime95. The Stilt made it known in his technical thread that Prime95 was broken. The result was that while Blender and HandBrake scores were obscenely high, the power-consumption figures were ridiculously low. This was a false picture. My first comment in the technical thread was to ask about this discrepancy. Still interested in Ryzen's Prime95 numbers once everything is fixed, by the way.
Wake up, dude. You still don't get the memo, do you?
You are referring to the handbrake test.
Did you notice the performance difference here for the same test?
A 1700 non-X beats a 6900K. Read again.
A 1700X beats a 6950X. Yes, the $350 CPU beats the $1,800 10-core Intel. Read again.
Now go look at the power figures again.
http://m.hexus.net/tech/reviews/cpu/103270-amd-ryzen-7-1700-14nm-zen/?page=3
I know it wasn't your intention, but you just proved both how insanely powerful Zen is and how efficient it is at the same time. Lol.
Free lunch for all that open their eyes to the new reality.
No worry, bro, I get it. See above. I do appreciate your comprehensive response and your demonstrated maturity in responding to my post. Much appreciated!
You know that Hexus, like most reviewers, takes their power readings for the whole system at the wall?
And that none of the systems have a 0W idle?
So what you really want is the power deltas between idle and load.
Something like this:
Although that still doesn't tell the full story, as what the chip draws is only what's left after the PSU's and the VRMs' inefficiencies. Now, we don't know the VRMs, but the PSU they used is a be quiet! Dark Power Pro 11 (1,000W), which is a Platinum-rated supply, and we can do better than a guess since TPU reviewed it:
https://www.techpowerup.com/reviews/beQuiet/DarkPowerPro11_1000W/6.html
Right, I'm not going to account for that whole efficiency curve, as what the FX-9590 drew at load (295W) is a lot different from what the i3-7350K did (68W), but if we say they are all at 90% on average, then we get something like this:
Okay, for HandBrake the i7-6950X doesn't use 140W, but the 1700 and 1700X are well within their 65W and 95W TDPs. The 1800X exceeds it a bit.
Mind you, still no idea how much heat the VRMs waste; I rather suspect it's more than 7.6W, though.
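To make the arithmetic explicit, here's a rough sketch of the estimate being described. The 90% efficiency figure is the flat assumption from the post above, and the wall readings are illustrative, not Hexus's exact numbers:

```python
# Rough chip-power estimate from wall-socket readings.
# Assumes a flat 90% PSU efficiency; real efficiency varies with load.

def chip_power_estimate(load_wall_w, idle_wall_w, psu_eff=0.90):
    # The idle-to-load delta removes the rest of the system's baseline draw;
    # scaling by PSU efficiency converts wall watts into DC watts fed to the board.
    return (load_wall_w - idle_wall_w) * psu_eff

# Illustrative numbers only (not the review's exact readings):
print(round(chip_power_estimate(161, 55), 1))  # -> 95.4
```

Note this still attributes the whole delta to the CPU, which over-counts anything else (fans, GPU idle-vs-load drift) that ramps up under load.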
The 6950X is a quad-channel setup and yet consumes 30 watts less than the 1800X while being only 8fps slower. The 1700X is 4fps faster while consuming 6 watts more. So yeah, "no free lunch," my brother!
Yea, Intel might just price themselves right out of the artificial segment they themselves created.
Yeah, you're right.... it doesn't come free of charge.
Last time I looked, it's something like $1,200 or so extra in your pocket.
Depends on how much you want to overclock it. You need a pretty nice air cooler to get over 3.7GHz sustained.
What is the minimum cooler to overclock a 1700?
Unbelievable! Ahh, bro. You were talking about "clueless reviewers," then went on to link a benchmark for total power consumption that doesn't stress the FPU in the core, framing it like the 1700 was at 113W while the HEDT Intel was only at 130W. That's perhaps even more clueless or misleading than using Prime.
By all means. For this workload you can have your cake and eat it too (free lunch all the way, if you like) with the 1700. It's both faster and uses less energy than either the 7700K or the 6900K. And it costs less. I mean, if this is not the definition of a free lunch, I don't know what is.
It's 100% the opposite of what your message was. You were just 100% wrong. So what? Good news. Move on.
It seems Asus had some issue with the Hero's BIOS, now solved: https://twitter.com/BitsAndChipsEng/status/846872691982381058
VRM efficiency varies with load, as with a PSU, with a similar curve, but the absolute maximum VRM efficiency is around 86-88%...
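Chaining that onto the earlier back-of-the-envelope math gives a slightly smaller package-power estimate. The 87% VRM figure below is just the midpoint of the 86-88% range mentioned, the 90% PSU figure is the flat assumption from earlier in the thread, and the readings are illustrative:

```python
# Chain PSU and VRM conversion losses to estimate actual CPU package power.
# 0.90 (PSU) and 0.87 (VRM) are assumed efficiencies from the discussion above.

def package_power(load_wall_w, idle_wall_w, psu_eff=0.90, vrm_eff=0.87):
    dc_delta = (load_wall_w - idle_wall_w) * psu_eff   # watts left after the PSU
    return dc_delta * vrm_eff                          # watts left after the VRMs

# Illustrative wall readings, not measured figures:
print(round(package_power(161, 55), 1))  # -> 83.0
```

Since both stages lose power on the way from the wall to the socket, the chip itself sees the wall delta multiplied by both efficiencies, which is why naive wall-delta numbers overstate CPU draw.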
Nitpicking aside, the general quality of CPU reviews these days really isn't up to snuff. Most sites are only showing a few data points without proper apples-to-apples comparisons or context provided. In reality, every data point is valid as long as the context of the test methodology is understood. I won't harp on you about cherry-picked benchmarks, because that is clearly showing an entire genre of games (RTS), typically very CPU-bound, underperforming compared to the 7700K. The makers of Total War and Ashes have already come out and said they're in the process of writing optimizations for the Ryzen topology. I imagine game-engine optimization will show big yields with regard to how the threads are managed for AI simulations and physics, due to the sheer scale of battles in these games.
I would love to see some serious frame-time analysis of these games at 1080p with streaming, Discord, music playing, and various Chrome tabs open. This is what most people who game online do; every single one of my friends who games uses their PC in this way. Sitting in a competitive queue while listening to music, watching YouTube, reading articles, and being on voice comms at the same time is pretty typical behavior. This is the true CPU test that reviewers don't seem to want to deep-dive into, because empirically you can't guarantee the same load every time. What would probably be far more useful would be to have forum users run frame analysis on their own Ryzen systems for a few days in a row, take the overall average, and organize it by GPU, so we can see real numbers from the real world.
It hit me a couple of days ago. The fact that we are even splitting hairs over performance in many games and applications is incredible. AMD has mounted one of the biggest comebacks ever; I never thought I'd again see the day where I actually wanted AMD processors in my rigs.
I have an i5-7500, and I am "mildly enraged" that I can get a 6C/12T overclockable CPU for roughly the same price.
I want an AMD CPU in my rig. Me, a 6600K owner who up until now was perfectly satisfied.
They shook up the market so much that despite being satisfied with my processor, I want to upgrade. I don't find 4C/4T acceptable anymore when a 6C/12T option will be available for the same price soon. By Zen 2 I should have the funds (and the actual need) to upgrade, so that's probably when I will do so, but damn if even now Zen isn't tempting!
For the entire duration of my interest in this field, AMD was basically irrelevant. I only got interested in processors when the FX-8350 and friends were all AMD had. This is pretty much brand new for me, lol.
Spot on. As an enthusiast gamer playing at 4K, I would trade my 6700K OC'd at 4.9+GHz for a 1700.
I have to turn off YouTube when I play BF4 because it drops the frames a fair bit.
Normally when I game, I multitask also, and so do all of my friends. Multiple tabs open, skype, YT/Twitch@1080p (or skype video) on second screen...
Review sites are so behind the times; they are like your grandparents sending you snail mail and mentioning how long it takes.
All these game benchmarks in a vacuum, at 1080p, in singleplayer instead of multiplayer (if it's mainly an MP title), are as useful to a real modern gamer as synthetic ones. Then, when directly comparing a 4-core to an 8-core CPU, they make even less sense in today's multitasking world.
Current CPUs are B1.
Does anyone have an idea what a B2 stepping could bring? What stepping are we at now?
Sure, but wouldn't that require 64 BF1 accounts, though?
re. multiplayer benchmarks...
Is it really beyond the wit of man to have several remote computers send keyboard/mouse scripted actions for "players" in a multiplayer environment, as well as a keyboard/mouse script on the local computer, to allow for a repeatable multi-"player" environment for benchmarking?
It can't be much more complicated.
Games are typically not deterministic, which means that there will be subtle differences each run. Add unpredictable network latency to that, and the inputs will end up out of sync.