Discussion Speculation: Zen 4 (EPYC 4 "Genoa", Ryzen 7000, etc.)


Vattila

Senior member
Oct 22, 2004
805
1,394
136
Except for the details of the microarchitectural improvements, we now know pretty well what to expect from Zen 3.

The leaked presentation by AMD Senior Manager Martin Hilgeman shows that EPYC 3 "Milan" will, as promised and expected, reuse the current platform (SP3), and the system architecture and packaging look to be the same, with the same 9-die chiplet design and the same maximum core and thread count (no SMT-4, contrary to rumour). The biggest change revealed so far is the enlargement of the compute complex from 4 cores to 8 cores, all sharing a larger L3 cache ("32+ MB", likely to double to 64 MB, I think).

Hilgeman's slides also showed that EPYC 4 "Genoa" is in the definition phase (or was at the time of the presentation in September, at least), and will come with a new platform (SP5) and new memory support (likely DDR5).



What else do you think we will see with Zen 4? PCI-Express 5 support? Increased core-count? 4-way SMT? New packaging (interposer, 2.5D, 3D)? Integrated memory on package (HBM)?

Vote in the poll and share your thoughts!
 
Last edited:
Reactions: richardllewis_01

Hitman928

Diamond Member
Apr 15, 2012
5,593
8,770
136
I don't want to derail this thread any more than it already has been, but HWU's 7600X vs 13600K comparison has garnered a lot of criticism from lots of people, not just me. In fact, there's a large Reddit thread on r/hardware which discusses the results, and Steve from HWU actually chimes in.


A poster in the Reddit thread brought up something interesting that I never noticed about the review. In the HWU roundup, A Plague Tale Requiem has the 13600K rig using nearly 120w more than the 7600x rig, yet it loses in that game by 10% at 1440p. How is that even possible?

I have that game, and it really only becomes moderately CPU intensive when the rats are on screen or in a populated area. But if the 13600K rig is pulling that much extra power over the 7600X, it's not going to be from the CPU, as a gaming workload won't stress the CPU that much. So assuming the power consumption figures were correct, the GPU must be drawing the extra power with the 13600K... but where is the performance if that is the case? I guess all of that extra power is going into a black hole LOL!

In the Gamers Nexus 13600K review, Tech Jesus found that the 13600K drew about 44w more than the 7600x in Blender, a workload that maxes out all cores.

A Plague Tale: Requiem benchmarks are hard to find, but Computerbase.de did benchmark it at 720p with an RTX 4090, 13900K vs 7950X, and they found the 13900K to be 18% faster:



Other posters also took him to task for not including ray tracing in the benchmarks, when everyone knows that it increases CPU load significantly, especially as he tested with an RTX 4090. These are legitimate criticisms whether you want to deny them or not.

In fact, the more I think about it, the more I vastly prefer Computerbase.de's approach to testing in that they are very specific in the details about their testing methodology and actually provide the settings used for each individual game.

HWU is still a relatively good source for bulk data points, but their testing methodology leaves much to be desired and has plenty of unexplained anomalies.

Computerbase restricts their memory to officially supported speeds only, which means their 13th-gen results will naturally look better relative to Zen 4 than HWU's, which uses at least decently tuned speeds and timings. This makes Computerbase's results valid but ultimately nearly pointless for gamers.

HWUB results are far more relevant for gamers, even if you toss out the seemingly anomalous results which HWUB themselves have identified and provided averages without them included.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Computerbase restricts their memory to officially supported speeds only, which means their 13th-gen results will naturally look better relative to Zen 4 than HWU's, which uses at least decently tuned speeds and timings. This makes Computerbase's results valid but ultimately nearly pointless for gamers.

This is fair, but Computerbase.de has very accurate results and we can go back and forth about memory speed all day because Raptor Lake can use much higher memory frequencies than Zen 4 can, yet most reviewers tend to run the Raptor Lake CPUs at the same memory frequencies as Zen 4, ie DDR5 6000.

HWUB results are far more relevant for gamers, even if you toss out the seemingly anomalous results which HWUB themselves have identified and provided averages without them included.

Yes, but HWUB has too many anomalies for my tastes that just flat out don't make any sense. The latest one, which I just noticed, is the power consumption for Plague Requiem. How can the 13600K system pull that much extra power through the GPU, yet underperform compared to the 7600X system? Presumably, the GPU drawing more power would mean increased performance, but in this case the 13600K falls behind at 1080p, 1440p and even 4K!?!

Makes you wonder...... You can extend that line of thinking to some of the other games in the lineup as well.
 
Reactions: Exist50

Hitman928

Diamond Member
Apr 15, 2012
5,593
8,770
136
This is fair, but Computerbase.de has very accurate results and we can go back and forth about memory speed all day because Raptor Lake can use much higher memory frequencies than Zen 4 can, yet most reviewers tend to run the Raptor Lake CPUs at the same memory frequencies as Zen 4, ie DDR5 6000.



Yes, but HWUB has too many anomalies for my tastes that just flat out don't make any sense. The latest one, which I just noticed, is the power consumption for Plague Requiem. How can the 13600K system pull that much extra power through the GPU, yet underperform compared to the 7600X system? Presumably, the GPU drawing more power would mean increased performance, but in this case the 13600K falls behind at 1080p, 1440p and even 4K!?!

Makes you wonder...... You can extend that line of thinking to some of the other games in the lineup as well.

HWUB used higher memory speeds for Intel.

You’re assuming the higher power consumption was used by the GPU. I agree, it’s something they should look into, but it’s not crazy enough to just throw the results out on face value. Being a system level measurement, there are a lot of variables that come into play.
 
Reactions: Tlh97 and Kaluan

Kaluan

Senior member
Jan 4, 2022
503
1,074
106
HWUB used higher memory speeds for Intel.

You’re assuming the higher power consumption was used by the GPU. I agree, it’s something they should look into, but it’s not crazy enough to just throw the results out on face value. Being a system level measurement, there are a lot of variables that come into play.

There's plenty of inconsistencies with CB data as well (beyond the strange choice of RAM)... but we don't see people like him ever nitpick on those for 30 forum pages in a row, now, do we? 😅

Half of the post-launch comments on this thread are about him and his HUB bickering.

Anyway, the moment they started gibbering about reddit was the last confirmation I needed. They're not a troll, they're mental.
 
Reactions: DAPUNISHER

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
You’re assuming the higher power consumption was used by the GPU. I agree, it’s something they should look into, but it’s not crazy enough to just throw the results out on face value. Being a system level measurement, there are a lot of variables that come into play.

What else could be causing the additional power draw? Raptor Lake is very power efficient in gaming workloads, and when Computerbase.de tested the 13600K, the average power draw was 88w. The 7700x's average power draw was 72w, so I'm going to guess the average power draw for the 7600x may be in the 60w range, although they never tested the power draw for the 7600x.

In any case, the only other part that could account for an extra 100W is the GPU, as that is the only thing other than the CPU that draws a significant amount of power. And yet the performance that you would expect to come from an additional 100W of power draw isn't there.

And Plague Requiem doesn't support RT at this time, so the CPU usage ranges from small to moderate depending on what's on the screen.
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
There's plenty of inconsistencies with CB data as well (beyond the strange choice of RAM)... but we don't see people like him ever nitpick on those for 30 forum pages in a row, now, do we? 😅

Officially supported memory standard = strange choice of RAM, got it!

Anything beyond the official memory standard is technically overclocked and not stock. While I myself never ever run purely stock settings, I respect the reviewer's decision to stick to standardized settings.

Anyway, the moment they started gibbering about reddit was the last confirmation I needed. They're not a troll, they're mental.

You like to snipe from the shadows, but not one of you has attempted to explain any of these anomalies. Your take is, "just accept it." Well, maybe you're that naive and want to assume that HWUB is infallible but I'm sure as hell not.
 

Hitman928

Diamond Member
Apr 15, 2012
5,593
8,770
136
What else could be causing the additional power draw? Raptor Lake is very power efficient in gaming workloads, and when Computerbase.de tested the 13600K, the average power draw was 88w. The 7700x's average power draw was 72w, so I'm going to guess the average power draw for the 7600x may be in the 60w range, although they never tested the power draw for the 7600x.

In any case, the only other part that could account for an extra 100W is the GPU, as that is the only thing other than the CPU that draws a significant amount of power. And yet the performance that you would expect to come from an additional 100W of power draw isn't there.

Faster memory with tighter timings takes more power, both in the memory and on the CPU. Different motherboards can have significantly different power usage. You also have different efficiency curves for the CPUs, motherboards, and PSUs, which all come into play when taking a wall measurement. All of these things individually may seem small, but added up, they can be a large amount of power. How does Resizable BAR affect system power usage? How about with different levels of CPU cache? You can't just assume it's all GPU power making up the difference as there are so many options with no data to tell how the power is being used.
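To put that in perspective with some made-up numbers (nothing below is a measurement, it's just to show how small platform deltas compound in a wall reading):

# Toy numbers only, to illustrate how platform differences stack up at the wall.
def wall_power(cpu_w, gpu_w, dram_w, board_w, psu_efficiency):
    # A wall meter sees total DC draw divided by the PSU's efficiency at that load point.
    return (cpu_w + gpu_w + dram_w + board_w) / psu_efficiency

rig_a = wall_power(cpu_w=75, gpu_w=280, dram_w=8, board_w=45, psu_efficiency=0.92)
rig_b = wall_power(cpu_w=95, gpu_w=300, dram_w=12, board_w=60, psu_efficiency=0.88)
print(round(rig_b - rig_a, 1))  # ~87 W apart at the wall, though no single component differs by more than ~20 W DC

None of those individual gaps would raise an eyebrow on their own, which is exactly the point.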

It's also possible that there's some kind of driver bug with the Intel platform and Nvidia cards in some cases where it increases GPU power without increasing performance. I tend to doubt it but it is possible and would still make HWUB's results valid. Hopefully HWUB can look into it further but without any kind of obvious flaw in their setup or how they are taking their data, I don't see any reason to ignore the results on its face.
 
Reactions: Elfear

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Faster memory with tighter timings takes more power, both in the memory and on the CPU. Different motherboards can have significantly different power usage. You also have different efficiency curves for the CPUs, motherboards, and PSUs, which all come into play when taking a wall measurement. All of these things individually may seem small, but added up, they can be a large amount of power. How does Resizable BAR affect system power usage? How about with different levels of CPU cache?

You can't just assume it's all GPU power making up the difference as there are so many options with no data to tell how the power is being used.

Although you're absolutely correct that we have limited data to truly pinpoint what's happening, logic and process of elimination points to the GPU, because none of those things you mentioned can account for that much power draw. Both memory modules use the same voltage (1.10v) and probably use less than 10w as DDR5 has very low power draw. Everything else in the system was the same except for the CPU, motherboard and memory. Also, since I have that particular game, I know for a fact that it is heavily GPU bound and an RTX 4090 can consume up to 450w by itself at 4K maxed settings. I can post screenshots on demand if you want.

Even though HWUB's system power draw test is done at 2K, the Zen 4 system's power consumption is absurdly low and the Raptor Lake system is closer to what I would expect being an owner of the game and an RTX 4090.

In their 13600K review, HWUB measures the system power draw at 315W for the 13600K and 226W for the 7600X in Blender. That's a 90W difference, which is still smaller than the gap in the gaming workload, even though gaming draws far less CPU power LOL!



It's also possible that there's some kind of driver bug with the Intel platform and Nvidia cards in some cases where it increases GPU power without increasing performance. I tend to doubt it but it is possible and would still make HWUB's results valid. Hopefully HWUB can look into it further but without any kind of obvious flaw in their setup or how they are taking their data, I don't see any reason to ignore the results on its face.

I have the same platform and I've never noticed anything of the sort myself personally. Since I built my system, it has performed beyond my expectations even when it's underclocked and behaves as expected.

That said, I'm not saying we should discount HWUB's results completely. I'm only saying there are anomalous results in their round up which may point to problems with their methodology.
 
Reactions: Exist50

MarkPost

Senior member
Mar 1, 2017
239
345
136
In fact, the more I think about it, the more I vastly prefer Computerbase.de's approach to testing

errr no. A site testing ST performance with 4 benchmarks, three of them from the same tool (Cinebench), and the other (POV-Ray) hurting Ryzen performance by deactivating AVX2 when a Ryzen CPU is detected (*). Yeah, great approach to testing.
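To be clear about what "deactivating AVX2 when a Ryzen CPU is detected" means in practice, here is a simplified sketch of the pattern (hypothetical code, not POV-Ray's actual source):

# Hypothetical illustration of vendor-gated SIMD dispatch; not POV-Ray's real code.
def pick_render_kernel(vendor: str, supports_avx2: bool) -> str:
    # The fast path is only taken when the vendor string says Intel,
    # so a Ryzen CPU that fully supports AVX2 still falls back to the plain AVX kernel.
    if vendor == "GenuineIntel" and supports_avx2:
        return "AVX2/FMA3 kernel"
    return "AVX kernel"

print(pick_render_kernel("GenuineIntel", True))   # AVX2/FMA3 kernel
print(pick_render_kernel("AuthenticAMD", True))   # AVX kernel, despite AVX2 support

Removing that vendor condition and recompiling is all it takes to get the numbers below.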


(*) Fortunately POV-Ray is open source, so it's easy to enable AVX2 for Ryzen: just compile a binary with that limitation removed. And this is what happens (my 7950X, stock):

binary with just AVX for Ryzen (ST)


binary with AVX2 ON for Ryzen (ST)


binary with just AVX for Ryzen (MT)


binary with AVX2 ON for Ryzen (MT)


So when you guys see POV-Ray results (like the ComputerBase one), you have to know Ryzen performance is being hurt by avoiding the AVX2 instruction set (a 15-16% difference). It wouldn't be slower than Raptor Lake, but actually faster:



 

Hitman928

Diamond Member
Apr 15, 2012
5,593
8,770
136
Although you're absolutely correct that we have limited data to truly pinpoint what's happening, logic and process of elimination points to the GPU, because none of those things you mentioned can account for that much power draw. Both memory modules use the same voltage (1.10v) and probably use less than 10w as DDR5 has very low power draw. Everything else in the system was the same except for the CPU, motherboard and memory. Also, since I have that particular game, I know for a fact that it is heavily GPU bound and an RTX 4090 can consume up to 450w by itself at 4K maxed settings. I can post screenshots on demand if you want.

Even though HWUB's system power draw test is done at 2K, the Zen 4 system's power consumption is absurdly low and the Raptor Lake system is closer to what I would expect being an owner of the game and an RTX 4090.

In their 13600K review, HWUB measures the system power draw at 315W for the 13600K and 226W for the 7600X in Blender. That's a 90W difference, which is still smaller than the gap in the gaming workload, even though gaming draws far less CPU power LOL!





I have the same platform and I've never noticed anything of the sort myself personally. Since I built my system, it has performed beyond my expectations even when it's underclocked and behaves as expected.

That said, I'm not saying we should discount HWUB's results completely. I'm only saying there are anomalous results in their round up which may point to problems with their methodology.

You are comparing 2 very different workloads and trying to come to a conclusion with only 2 data points and loads of variables. This is not possible. For instance, GN only found a 44.4W difference in CPU power being used during blender between a 13600k and a 7600x. If we were to try and use that in a cross comparison, it would show that the Intel platform has an unexplained 44.6 W of extra power consumption in the system that isn't being used by the CPU or GPU (Edit: JT's actually shows 135W difference in power between the 2 at the system level during Cinebench which would show an even greater amount of unaccounted for power [90.6 W] for the Intel platform). That's without considering PCIe links being active or how often the GPU would need to go to memory in the gaming scenario or how often it would have to load data through a high speed PCIe bus. So with almost no data and no controls over any variables, we can't reach any kind of conclusion here. You can say it looks strange, sure, but with zero evidence to the contrary or any actual flaw you can identify in their testing setup, I see no reason to say we should toss out this result just because it doesn't look like you would expect. If they look into it and say, we made a mistake here and update the results, great, but so far they seem pretty confident in their results so I feel comfortable in treating it as valid as any other data point from any other reviewer.
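Just to spell out where those numbers come from (this mixes workloads and reviewers, which is exactly why it shouldn't be over-interpreted, but the arithmetic is simple):

# Figures already quoted in this thread; the point is only the subtraction.
hwub_blender_system_delta = 315 - 226   # HWUB wall power, 13600K rig minus 7600X rig = 89 W
gn_blender_cpu_delta = 44.4             # GN's CPU-only difference in Blender
jt_cinebench_system_delta = 135         # Jarrod's Tech system-level difference in Cinebench

print(round(hwub_blender_system_delta - gn_blender_cpu_delta, 1))  # 44.6 W not explained by the CPU alone
print(round(jt_cinebench_system_delta - gn_blender_cpu_delta, 1))  # 90.6 W unaccounted for on the Intel platform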

I'm glad you are very happy with your system, but unless you have actually replicated HWUB's system, software, and test methodology, it doesn't really give any insight into the validity of their results.
 
Last edited:

Yosar

Member
Mar 28, 2019
28
136
76
What makes those other reviewers crap?
Btw, the metareviews from 3Dcenter.org are usually supported by large-data set benchmarks from other popular (maybe you consider them "good") benchmarkers such as hardware unboxed.
For example the 5800x3d is ~15% faster than the 5800x in both hardware unboxed 41 game benchmark, as WELL as the meta review from 3Dcenter.

And the 7600X is 5% faster than the 13600K according to HU's 54-game benchmark.
And according to the 'metareview', based on 6-8 usually-the-same tests, it's the 13600K which is 5% faster. That's why this 'metareview' is crap. You take already-distorted results (too small a sample of games) and multiply them 10 times using almost exactly the same data.
Not every benchmark is worth the same, especially if they use the same tests. That's not how statistics works. The quality of the samples matters.
One HU 54-game benchmark is better and more trustworthy than ten 6-8 game benchmarks.
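To make the sampling point concrete, here is a quick toy simulation (all numbers invented, purely to illustrate why stacking overlapping small reviews is not the same as one broad one):

import random

# Toy model: every game has some "true" CPU margin; a review only sees the games it tests.
random.seed(0)
true_margin = {f"game{i}": random.gauss(0, 5) for i in range(60)}
popular_pool = list(true_margin)[:12]            # the handful of games most small reviews reuse

big_review = random.sample(list(true_margin), 54)
big_avg = sum(true_margin[g] for g in big_review) / len(big_review)

small_avgs = []
for _ in range(10):                              # ten 7-game reviews drawn from the popular pool
    picks = random.sample(popular_pool, 7)
    small_avgs.append(sum(true_margin[g] for g in picks) / len(picks))
meta_avg = sum(small_avgs) / len(small_avgs)

overall = sum(true_margin.values()) / len(true_margin)
# However many small reviews you average, meta_avg can only estimate the popular-pool mean,
# while big_avg estimates the mean over (nearly) the whole game list.
print(round(overall, 2), round(big_avg, 2), round(meta_avg, 2))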
 

Exist50

Platinum Member
Aug 18, 2016
2,452
3,101
136
You are comparing 2 very different workloads and trying to come to a conclusion with only 2 data points and loads of variables.
The comparison was obviously to point out a "worst case" scenario.
You can say it looks strange, sure, but with zero evidence to the contrary or any actual flaw you can identify in their testing setup, I see no reason to say we should toss out this result just because it doesn't look like you would expect.
There is nothing reasonable or valid about a result that is 100W different from everyone else's, and likewise with many of their other outliers. This is tech, not magic. You don't get a 100W difference from random fluctuations, slight memory differences, etc.

There is very obviously something wrong with several of their results. Good outlets would look at that data, make the obvious conclusion, and then retest after fixing the underlying issue. HWUB decided to publish with none of that due diligence. That is plenty reason to throw out their results.
One HU 54-game benchmark is better and more trustworthy than ten 6-8 game benchmarks.
If HWUB was known for reasonably representative methodology, that might make sense. But all you're doing here is amplifying the impact of obviously flawed testing.
 
Reactions: Carfax83

Geddagod

Golden Member
Dec 28, 2021
1,205
1,172
106
And the 7600X is 5% faster than the 13600K according to HU's 54-game benchmark.
And according to the 'metareview', based on 6-8 usually-the-same tests, it's the 13600K which is 5% faster. That's why this 'metareview' is crap. You take already-distorted results (too small a sample of games) and multiply them 10 times using almost exactly the same data.
Not every benchmark is worth the same, especially if they use the same tests. That's not how statistics works. The quality of the samples matters.
One HU 54-game benchmark is better and more trustworthy than ten 6-8 game benchmarks.
A couple things:
HU's Battlefield V benchmark is bugged because of an Intel E-cores glitch, which he himself acknowledged in the video, shrinking the margin to 4 percent. And if you actually WATCHED the video, he says margins that are within 5 percent are effectively a tie.
The meta review does include a repeat of many of the same games, sure, but that's because the games tested are supposed to be representative of CPU-limited scenarios and popular games.
But more importantly, I also know that 3Dcenter includes data like 99% lows, not just averages, though I'm not sure if they did that in the Raptor Lake meta review; I know they did it in previous meta reviews. They also include data from lower resolutions to accentuate the performance differences.
Additionally, HU also tests games with a methodology that many don't exactly agree with, which results in different percentages in games than basically every other reviewer. For example, I know this was already discussed above in this thread, but the Plague Requiem anomaly.
So to wrap it up:
HUB themselves admitted the difference between the two CPUs was within the margin of error in their own video.
HUB has some testing methodology flaws that result in some games having varying results, such as Plague and Battlefield 5.
3Dcenter.org includes data from lower resolutions and also data such as 99 percent lows.
Overall I would STILL say 3Dcenter is the "higher quality" source, because even IF they have a lower aggregate game pool, the fact that the impact of flawed testing methodology can be minimized by having so many reviewers means that it ends up being more accurate in the end.
 

Yosar

Member
Mar 28, 2019
28
136
76
If HWUB was known for reasonably representative methodology, that might make sense. But all you're doing here is amplifying the impact of obviously flawed testing.

And you know the methodology of those tests from the 'metareview'? And they are clearly better? You have that certainty? Like Computerbase, shown above? Definitely much better methodology: 3 basically identical benchmarks, and one that internally cripples one of the tested processors.
If anything, the 'representative' methodology of a 6-8 game test is much more suspicious, whether due to time constraints, simple lack of knowledge, or whatever other reasons.
13 of the 21 tests in the 'metareview' are 6-8 game benchmarks, and the rest are 12 games or fewer; it's a pure joke, not 'representative' game testing.
 
Reactions: Markfw

Geddagod

Golden Member
Dec 28, 2021
1,205
1,172
106
HWUB is a credible source of information on PC Hardware and technology. You are just some random dude on the internet.
I think HUB is a good source. The problem is that when people pinpoint problems with their testing methodology for some games, they just refuse to amend their results, nor do they even talk about the mistake, to my knowledge.
The results are fine. They claimed the 13600K and 7600X performed basically the same since 5% was within the margin of error, and yeah, I think that's pretty much the case, as 3Dcenter also showed them to be within 5% of each other, leaning towards the 13600K. Most of their games have results that still make sense and follow the line of other reviewers, so a couple of games tested incorrectly or that have 'anomalies' don't mess up the data too much.
But also, saying that HWUB is a credible source of information because he is not just some "random dude" is kinda inaccurate. HWUB is also just some random dude on the internet, but with hundreds of thousands more subscribers. The only problem is that popularity is a poor, poor reason to defend a person's claims... are you going to start calling Tom from MLID a legit leaker now because he also has thousands of subscribers?
 

Geddagod

Golden Member
Dec 28, 2021
1,205
1,172
106
And you know the methodology of those tests from the 'metareview'? And they are clearly better? You have that certainty? Like Computerbase, shown above? Definitely much better methodology: 3 basically identical benchmarks, and one that internally cripples one of the tested processors.
If anything, the 'representative' methodology of a 6-8 game test is much more suspicious, whether due to time constraints, simple lack of knowledge, or whatever other reasons.
13 of the 21 tests in the 'metareview' are 6-8 game benchmarks, and the rest are 12 games or fewer; it's a pure joke, not 'representative' game testing.
The meta review has something like 15 or so reviewers. The chances that ONE reviewer has made a mistake are a lot higher than the chances of a significant number of those other reviewers all making the same mistakes that would cause Raptor Lake CPUs to perform much better than their Zen 4 counterparts. And that one reviewer's issues are already clearly shown: THEY THEMSELVES admitted there was a mistake with one of the games due to an E-core bug, there's an additional anomaly which doesn't match other reviewers or even basic common sense, and THEY THEMSELVES admit that the 7600X and 13600K perform IDENTICALLY since 5% is considered within the margin of error.
 
Reactions: Carfax83

Exist50

Platinum Member
Aug 18, 2016
2,452
3,101
136
HWUB is a credible source of information on PC Hardware and technology. You are just some random dude on the internet.
So, they're credible not only despite clearly inexplicable results, but you insist that they're more credible than almost every other outlet combined? Is that seriously the argument you want to make?

Or let's put it another way. An outlet is only credible if they produce credible results. You're reversing cause and effect.
And you know the methology of those tests from 'metareview'? And they are clearly better? You have that certainty?
You realize the irony of this, right? You're defending an outlet with very clearly flawed methodology, and insisting that instead it must be everyone else that's wrong. So yes, I don't believe that they're better than almost every other outlet combined.

It's incredible the flaws some people are willing to overlook when they produce results that conform to their preexisting biases.
 
Reactions: Carfax83

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
errr no. A site testing ST performance with 4 benchmarks, three of them from the same tool (Cinebench), and the other (POV-Ray) hurting Ryzen performance by deactivating AVX2 when a Ryzen CPU is detected (*). Yeah, great approach to testing.

Wait a sec, are you implying that Computerbase.de intentionally deactivated AVX2, or that the Cinebench and POV-Ray applications themselves deactivate AVX2 when a Ryzen CPU is detected?

I've never heard anything like what you are saying. I'm not that familiar with POV-Ray, but is AVX2 even activated by default? You would think it is, as AVX2 does come in handy for rendering workloads.

And wouldn't Intel CPUs be affected by this as well? Intel CPUs support AVX2 too, you know, and actually, Raptor Lake has better AVX2 performance core for core compared to Zen 4.

So when you guys see POV-Ray results (like the ComputerBase one), you have to know Ryzen performance is being hurt by avoiding the AVX2 instruction set (a 15-16% difference). It wouldn't be slower than Raptor Lake, but actually faster:

Until we know whether AVX2 is on by default and has to be deactivated, or if it's off by default and has to be activated, I'm reserving judgement. I find it odd that AVX2 wouldn't be on by default as it is with Cinebench.

And these benches you uploaded are obviously edited:

 
Last edited:

Mopetar

Diamond Member
Jan 31, 2011
8,004
6,446
136
Oh no, not this again ☠☠☠

Apparently I missed something. Normally I'd just drop it to avoid rekindling a flame that's been smothered, but right now it'd only be like tossing a smoke bomb into a forest fire.

It looks like another post pointed out that it was a bug due to e-cores. Steve just needs to disable the e-cores to get an accurate result. Obviously it's important to report results like that though because someone running that game who doesn't know there's a bug is losing a lot of performance.

AMD really needs to hurry up and release Zen 4D if only to beat the pants off of everything else so this thread has a chance of getting back on track.
 
Reactions: Kaluan

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
You are comparing 2 very different workloads and trying to come to a conclusion with only 2 data points and loads of variables. This is not possible. For instance, GN only found a 44.4W difference in CPU power being used during blender between a 13600k and a 7600x. If we were to try and use that in a cross comparison, it would show that the Intel platform has an unexplained 44.6 W of extra power consumption in the system that isn't being used by the CPU or GPU (Edit: JT's actually shows 135W difference in power between the 2 at the system level during Cinebench which would show an even greater amount of unaccounted for power [90.6 W] for the Intel platform).

You're missing the point though. While you're correct that Jarrod's Tech found a 135W difference between the 13700K and the 7700x, the former is also 56% faster than the latter. So 56% faster, for 67% more power. This is different from the Plague Requiem discrepancy which shows the 13600K rig using 25% more power but performing 10% slower than the 7600x.

How is that even possible?
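Put as rough performance-per-watt ratios, reusing only the percentages above (no new measurements):

# Relative perf-per-watt of the Intel rig vs the AMD rig, from the quoted percentages.
cinebench_13700k_vs_7700x = 1.56 / 1.67   # 56% faster for 67% more power -> ~0.93x
plague_13600k_vs_7600x = 0.90 / 1.25      # 10% slower for 25% more power -> ~0.72x
print(round(cinebench_13700k_vs_7700x, 2), round(plague_13600k_vs_7600x, 2))

One of those looks like a normal efficiency trade-off; the other doesn't.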

That's without considering PCIe links being active or how often the GPU would need to go to memory in the gaming scenario or how often it would have to load data through a high speed PCIe bus. So with almost no data and no controls over any variables, we can't reach any kind of conclusion here. You can say it looks strange, sure, but with zero evidence to the contrary or any actual flaw you can identify in their testing setup, I see no reason to say we should toss out this result just because it doesn't look like you would expect. If they look into it and say, we made a mistake here and update the results, great, but so far they seem pretty confident in their results so I feel comfortable in treating it as valid as any other data point from any other reviewer.

As I said above, it's not just the fact that the 13600K rig is using much more power than the 7600x, it's that it's supposedly using much more power while underperforming. Bringing up stuff like PCIe links and what not is just silly. We're talking at least 100w of power difference between the two here, and that cannot be explained away without bringing up the GPU as that is the only component other than the CPU which could account for the increased power draw.

And I'm certain it's not the CPU, because as I said, the game is primarily GPU heavy and only moderately taxes the CPU in crowded areas or when the rats are on screen. In normal gaming circumstances, the game is very light on the CPU.

I'm glad you are very happy with your system, but unless you have actually replicated HWUB's system, software, and test methodology, it doesn't really give any insight into the validity of their results.

What bothers me is that HWUB doesn't even seem to care about validating their results and being transparent about their testing methodology. I've already unsubscribed from them, because the BF5 fiasco was already pushing things, but now I see that is only the tip of the iceberg.
 

MarkPost

Senior member
Mar 1, 2017
239
345
136
Wait a sec, are you implying that Computerbase.de intentionally deactivated AVX2, or that the Cinebench and POV-Ray applications themselves deactivate AVX2 when a Ryzen CPU is detected?

I've never heard anything like what you are saying. I'm not that familiar with POV-Ray, but is AVX2 even activated by default? You would think it is, as AVX2 does come in handy for rendering workloads.

And wouldn't Intel CPUs be affected by this as well? Intel CPUs support AVX2 too, you know, and actually, Raptor Lake has better AVX2 performance core for core compared to Zen 4.



Until we know whether AVX2 is on by default and has to be deactivated, or if it's off by default and has to be activated, I'm reserving judgement. I find it odd that AVX2 wouldn't be on by default as it is with Cinebench.

I thought I was pretty clear: when the binary detects a Ryzen CPU, it only uses AVX, avoiding AVX2 even though Ryzen, as we all know, supports AVX2. When the binary detects an Intel CPU with AVX2 support, it uses AVX2.

So for Ryzen to use AVX2, it's necessary to enable it in the source code (as I said, it's open source) and compile a new binary (I did that, and tested it, as can be seen above).

And these benches you uploaded are obviously edited:

Obviously. They include Ryzen performance with a proper binary supporting AVX2 not only for Intel CPUs, but for Ryzen too.
 