With the release of Alder Lake less than a week away and the "Lakes" thread having turned into a nightmare to navigate, I thought it might be a good time to start a discussion thread solely for Alder Lake.
On Newegg right now the 5900X is $524; the 12900KF is $630.
To be clear, I understand that benchmarks need to show the differences between the CPUs when the stress is actually put on the CPU. However, my problem is that when you get to games, benchmarking a high-end CPU at 1080p is often not indicative of what someone will see when they go to WQHD or 4K (i.e. more GPU bound). As someone with an i9-9900k that plays games at WQHD@160Hz, what sort of benefit will I get? Are there any sites that take a peek at higher resolutions even though they may not be as indicative of raw CPU performance?
That's...unexpected.
Well ... Nothing to be honest. Even with a 3090 the 5950X is only 0-2% faster than the 5600X.
With a typical midrange GPU, anything from a 2700X, 8086K, etc. on up will be functionally identical at 1080p/ultra and 1440p/med-high.
The 1080p benches are more for seeing what the ultimate capabilities of the architecture are once GPUs get fast enough.
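The point above can be sketched with a toy model: the delivered frame rate is roughly the minimum of what the CPU and the GPU can each sustain, so a CPU gap only shows up once the GPU's cap rises above both CPUs. All numbers here are hypothetical, for illustration only:

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frame rate is capped by whichever side takes longer per frame."""
    return min(cpu_fps, gpu_fps)

# Hypothetical GPU frame-rate ceilings per resolution.
gpu_cap = {"1080p": 240.0, "1440p": 160.0, "4K": 90.0}
cpu_a, cpu_b = 200.0, 150.0  # two CPUs' simulation-limited frame rates

for res, cap in gpu_cap.items():
    a, b = effective_fps(cpu_a, cap), effective_fps(cpu_b, cap)
    print(f"{res}: CPU A {a:.0f} fps vs CPU B {b:.0f} fps")
```

At 1080p the 33% CPU gap is fully visible (200 vs 150 fps); at 1440p it shrinks (160 vs 150); at 4K it vanishes (90 vs 90) because both CPUs sit behind the GPU limit.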
Huh, Blender performance per watt...
View attachment 52333
View attachment 52335
But hey Geekbench scores are great.
Looks like Intel is ahead, albeit with much higher power usage. Probably going to skip until better DDR5 modules are out.
Not everything ran perfectly, though. In several of our tests, the workload got scheduled onto the wrong cores. We did use Windows 11 for all our testing, which has proper support for the big.LITTLE architecture of Alder Lake and includes the AMD L3 cache fix, too. Intel allocated extra silicon real estate for "Thread Director," an AI-powered network in the CPU that's optimized to tell the OS where to place threads. However, several of our tests still showed very low performance.

While wPrime as an old synthetic benchmark might not be a big deal, I'm puzzled by the highly popular MySQL database server not getting placed onto the P cores. Maybe the logic is "but it's a server background process"? In that case, that logic is flawed. If a process is bottlenecked by around half (!) and it's the only process on the machine using a vast majority of processor resources, doesn't it deserve to go onto the high-performance cores instead? I would say so. Higher performance would not only achieve higher throughput and faster answers to user requests, it would also reduce power consumption because queries would be completed much faster.

Other reviewers I've talked to have seen similar (few) placement issues with other software, so it seems Intel and Microsoft still have work to do. On the other hand, for gaming, Thread Director works pretty much perfectly. We didn't have time to test Alder Lake on Windows 10 yet—that article is coming next week.
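When Thread Director misplaces a workload like that, one workaround is to pin the process to the P-cores manually via its affinity mask. A minimal sketch, assuming the common Alder Lake enumeration where the P-cores' logical processors come first (e.g. 0-15 on a 12900K with 8 P-cores and SMT); verify the layout on your own system before relying on it:

```python
def p_core_logical_ids(p_cores: int = 8, smt: int = 2) -> list[int]:
    """Logical CPU IDs covering the P-cores, assuming they are
    enumerated first (0 .. p_cores*smt - 1). 12900K defaults."""
    return list(range(p_cores * smt))

# Usage with the third-party psutil package (may need admin rights):
# import psutil
# psutil.Process(mysql_pid).cpu_affinity(p_core_logical_ids())

print(p_core_logical_ids())        # 12900K: CPUs 0-15
print(p_core_logical_ids(6, 2))    # 12600K: CPUs 0-11
```

`mysql_pid` and the helper name are my own placeholders; `psutil.Process.cpu_affinity` is the real psutil call for setting affinity on Windows and Linux.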
Check out the TPU link.
System power consumption != CPU power consumption. Steve at Gamers Nexus recorded 243 W of CPU power consumption for Blender.
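The distinction also matters for efficiency comparisons: performance per watt depends on energy per completed run, not instantaneous draw. A quick sketch using the 243 W Blender figure above; the render times and the 142 W competitor number are hypothetical, purely to show the arithmetic:

```python
def energy_per_run_wh(avg_watts: float, seconds: float) -> float:
    """Energy consumed by one benchmark run, in watt-hours."""
    return avg_watts * seconds / 3600.0

# 243 W is the GN Blender package-power figure; times are made up.
print(energy_per_run_wh(243.0, 600.0))  # 10-minute render -> 40.5 Wh
print(energy_per_run_wh(142.0, 660.0))  # slower but lower-power chip
```

A chip that draws more but finishes sooner can still lose on energy per render, which is why race-to-idle arguments need both numbers.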
EDIT: Apparently you CAN enable AVX-512.
AMD is perfectly fine; they can slash prices a bit on the 5900X and 5950X. 11% higher IPC is really underwhelming given how much wider and bigger Golden Cove is. Intel needs to rethink its approach as this will not work in the long run; they need a brand-new core just like AMD did with Zen.

TPU seems to be one of the most thorough reviews. HUB / ComputerBase did some DDR5 vs DDR4 testing. For most games it makes little difference. DDR5-4400 is obviously slower, but DDR4-3800 vs DDR5-6000 is pretty much a wash (although WD:L seems to like DDR5).
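For context on that DDR4-3800 vs DDR5-6000 comparison, the peak theoretical bandwidth gap is easy to compute (64 bits per channel, two channels); real-world gaming results also hinge on latency, where early DDR5 is weaker, which is part of why it ends up a wash:

```python
def dual_channel_bw_gbs(mt_per_s: int, bus_bytes: int = 8,
                        channels: int = 2) -> float:
    """Peak theoretical bandwidth in GB/s for a dual-channel setup."""
    return mt_per_s * bus_bytes * channels / 1000.0

print(dual_channel_bw_gbs(3800))  # DDR4-3800 -> 60.8 GB/s
print(dual_channel_bw_gbs(6000))  # DDR5-6000 -> 96.0 GB/s
print(dual_channel_bw_gbs(4400))  # DDR5-4400 -> 70.4 GB/s
```

So DDR5-6000 has roughly 58% more peak bandwidth than DDR4-3800, yet games barely notice—most titles are latency- and cache-sensitive rather than bandwidth-bound.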
At 4K all the games are GPU limited, although for some reason the 3080 seems to have a higher ceiling with Intel CPUs than AMD CPUs in RDR2.
Still, it looks like ADL is around 7-10% faster than Zen 3 in gaming (with the 12600K holding up better vs the 12900K than I thought). Zen 3D is likely to beat that based on the 15% average uplift AMD saw.
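Rough arithmetic on those two figures (taking the midpoint of the 7-10% range, and AMD's 15% claim, both relative to Zen 3) gives a ballpark for Zen 3D vs ADL:

```python
adl_vs_zen3 = 1.085    # midpoint of the 7-10% range above
zen3d_vs_zen3 = 1.15   # AMD's claimed average gaming uplift

zen3d_vs_adl = zen3d_vs_zen3 / adl_vs_zen3
print(f"Zen 3D vs ADL: +{(zen3d_vs_adl - 1) * 100:.1f}%")  # about +6%
```

So if both numbers hold up, Zen 3D would land roughly 5-7% ahead of ADL in gaming on average—close enough that individual titles will swing either way.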
No AnandTech review?
That should explain why TPU, with just the P-cores enabled, saw much more power consumption than made sense.

Interestingly, Hardwareluxx testing suggests that AVX-512 can be activated by disabling the E-cores:
It works after all! AVX-512 with Alder Lake-S - Page 17 - Hardwareluxx
Core i9-12900K and Core i5-12600K: Alder Lake hybrid desktop CPUs reviewed. www.hardwareluxx.de
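If you try the E-cores-off trick, one quick way to confirm that AVX-512 is actually exposed to software (on Linux) is to check the CPU flags. `/proc/cpuinfo` and the `avx512f` flag name are standard; the helper function is my own:

```python
def has_avx512f(cpuinfo_flags: str) -> bool:
    """Check a /proc/cpuinfo 'flags' line for the AVX-512
    Foundation feature bit (exact token match, not substring)."""
    return "avx512f" in cpuinfo_flags.split()

# On Linux, grab the first 'flags' line and test it:
# with open("/proc/cpuinfo") as f:
#     flags = next(l for l in f if l.startswith("flags")).split(":", 1)[1]
# print(has_avx512f(flags))

print(has_avx512f("fpu sse avx2 avx512f avx512dq"))  # True
print(has_avx512f("fpu sse avx2"))                   # False
```

Splitting on whitespace avoids false positives from longer flag names that merely contain the substring.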