With the release of Alder Lake less than a week away and the "Lakes" thread having turned into a nightmare to navigate, I thought it might be a good time to start a discussion thread solely for Alder Lake.
First, that is one benchmark. Second, you are ignoring power consumption.

I'm not ignoring the power consumption. ADL will consume more power, but it will also get the work done faster, so the extra power consumption will be for shorter periods of time. This assumes that workloads are mixed; sustained and prolonged highly parallel workloads make ADL the bad choice. But then, the people who do that sort of stuff are not in the millions.
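To put rough numbers on that race-to-idle point (every watt and second below is invented purely for illustration, not a measurement of either chip), what matters for a bursty workload is energy, i.e. power multiplied by the time spent at that power:

```python
# Toy race-to-idle comparison. All figures are made up for illustration.
# Energy over a window = active power * active time + idle power * idle time.
def energy_joules(active_w, active_s, idle_w, idle_s):
    return active_w * active_s + idle_w * idle_s

WINDOW_S = 30  # how long we watch both systems

# Hypothetical "fast but hungry" chip: 150 W for 10 s, then 5 W idle.
fast_hungry = energy_joules(150, 10, 5, WINDOW_S - 10)

# Hypothetical "slower but leaner" chip: 100 W for 18 s, then 5 W idle.
slow_lean = energy_joules(100, 18, 5, WINDOW_S - 18)

print(fast_hungry, slow_lean)  # 1600 J vs 1860 J with these made-up numbers
```

With numbers like these the higher-power chip can come out ahead on energy for short, mixed workloads; it only turns into the bad choice when the high-power phase is sustained.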
Oh, and MSRP might be higher, but my last two 5950Xs were $600 and $580. And they go for $650 everyplace.

Maybe in the US. Not the case where I live (UAE). The 5950X costs $800+ here.
Worldwide, things differ. Let's drop this and stick to Alder Lake in an Alder Lake thread. They have good uses, but they are not totally dominant right now; there are too many areas where they are behind: default power settings, for one, and, as you said, heavily multi-threaded 100%-load tasks. But it's progress. Why they chose high power just to win a few benchmarks is, to me, a mistake on Intel's part.
That's 16 ADL threads smacking down 32 Zen 3 threads in MySQL. How is that NOT impressive?
You made a good joke! If you need another:

That was not a joke, but nonetheless it was amusing to observe how desperately you resorted to obvious dishonesty, even searching for a post from literally 20 years ago and STILL not realizing what that sentence means 😂
I could go on and on, spamming this thread with the dozens of times he called Intel or its chips a joke, crap, etc. But I prefer to just quote him saying he hates Intel, in multiple threads.
Actually, he didn't have to go back 20 years. Thankfully, the person you're defending is quite consistent, so here goes. This is from February 12 this year, just 15 days ago:
And people wonder why we hate Intel most of the time! Charging extra for what's already in the chip. They care about nothing but money. I hope AMD can change that eventually through competition. In servers they are the king, but too many are unconvinced that AMD is better (stupid managers, I know; I retired from a company that had data center managers like that).
This post is only referencing servers...
Is that benchmark AVX-512 accelerated or something? Cypress Cove isn't better than Zen 3 IPC-wise.
Also, Alder Lake barely moves the needle vs Rocket Lake in that benchmark; that's not an uplift worth mentioning.
I personally find it amusing when people are not impressed by Alder Lake. I especially don't understand it when I see this from the TPU i9-12900K review:
And I've seen some people say that Alder Lake only has a 7% IPC advantage over Zen 3?!
The bench in question can't even saturate all of the 5800X's cores, so there's no real core-count advantage; regardless, the uncapped 12400 is only 10% faster than the 5800X while the capped 12400 is only 6.25% faster. Without knowing more about clocks/power draw it's impossible to draw many conclusions about IPC from this bench alone. My guess is the 12400 is sitting at around 4.2-4.4 GHz, while the 5800X... God only knows where that thing's running.
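Here's the back-of-the-envelope version of that point. The 12400 clock below is the 4.3 GHz guess from above and the 5800X clock is an assumed 4.6 GHz all-core figure; neither is a measured value from the review, so treat the output as a sketch of how much the conclusion depends on the clocks:

```python
# Rough per-clock (IPC-ish) ratio implied by a benchmark result.
# perf_ratio is score_12400 / score_5800X; both clocks are assumptions.
def per_clock_ratio(perf_ratio, clock_a_ghz, clock_b_ghz):
    return perf_ratio / (clock_a_ghz / clock_b_ghz)

# Uncapped 12400 ~10% faster in that bench; assume 4.3 GHz vs 4.6 GHz.
print(round(per_clock_ratio(1.10, 4.3, 4.6), 2))    # ~1.18
# Capped 12400 ~6.25% faster under the same clock assumptions.
print(round(per_clock_ratio(1.0625, 4.3, 4.6), 2))  # ~1.14
```

Shift the assumed clocks a few hundred MHz either way and the implied "IPC advantage" swings by a similar amount, which is why the numbers alone don't settle it.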
But when you compare it to the 5600X, which also has 6 cores and a slightly higher boost clock than the 12400, the 5600X still loses by 26% for minimums and 13% for the average.
Okay, but the 5600X also loses to the 5800X, which, at least as of 2020, was kind of abnormal in a lot of game benchmarks (in some game benches at Vermeer's release, the 5600X was the fastest of the lot). Look at the minimums; something's going on there when moving to the 5800X. And we still don't know what clock speed the 5600X is running either, so... what conclusions can we draw?
I wonder what their future strategy is, supporting AVX-512 so sparsely these past years... are they killing it entirely, or reserving it for servers only? Or maybe saving it for newer chips (Arrow Lake, etc.) so they can claim that "feature" again?
Intel is fusing off AVX-512 for real this time.
Surely it's a weird step back after making it mainstream on Rocket Lake and some mobile parts.
Either the Atom cores will get AVX-512 support eventually, or there will be some transparent feature to kick processes back to the big cores if AVX-512 instructions are hit.

Or they'll just deprecate AVX-512 outside of server chips for the foreseeable future. I think that's quite plausible.
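On the software side, code that wants AVX-512 generally has to check for it at runtime and fall back otherwise, regardless of what the scheduler does. A minimal, Linux-only sketch of that check (it just reads the flags the kernel exposes in /proc/cpuinfo; the helper name is mine and nothing here is Alder-Lake-specific):

```python
# Linux-only sketch: see whether the kernel reports the AVX-512 Foundation
# flag for this CPU by reading /proc/cpuinfo. On a system where AVX-512 is
# fused off or disabled, 'avx512f' will simply be absent from the flags.
def has_avx512f() -> bool:
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    return "avx512f" in line.split()
    except OSError:
        pass
    return False

if __name__ == "__main__":
    print("AVX-512F advertised:", has_avx512f())
```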
I doubt that, given that Zen 4 supports AVX-512.

AMD doesn't have the market share to drive ISA adoption by itself, though. I don't think it's a good thing, but Intel killing AVX-512 on client would de facto do the same for AMD.
If they are really serious, they should microcode-nuke 11th gen's AVX-512 support too.
They advertised it, so it's too late for that. But I struggle to find another justification for their efforts to disable it other than really not wanting it to be used.