With the release of Alder Lake less than a week away, and the "Lakes" thread having turned into a nightmare to navigate, I thought it might be a good time to start a discussion thread solely for Alder Lake.
What is your problem?
This will explain it all:
And people wonder why we hate Intel most of the time!
The really bad power scaling of Celeron G6900 is evidence that something other than the GC cores is gulping up gobs of power. Is it possible that they fused off the other two GC cores in the die in such a way that they are not able to function but still using power during active usage of the functional cores?
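If one wanted to test that hunch, a straight-line fit of package power against active core count would put everything that isn't the cores (uncore, ring, or fused-off-but-leaking silicon) into the intercept. A minimal Python sketch of the idea, with placeholder wattages rather than real G6900 measurements:

```python
# Sketch: estimate the non-core ("everything else") share of package power
# by fitting P = P_base + n_active * P_per_core to per-core-count load
# measurements. The wattages below are placeholders, not real G6900 data.
from statistics import linear_regression  # Python 3.10+

active_cores = [1, 2]            # a 2-core part like the G6900 only gives two points
package_watts = [28.0, 36.0]     # hypothetical measured package power under load

slope, intercept = linear_regression(active_cores, package_watts)
print(f"~{slope:.1f} W per active core, ~{intercept:.1f} W from everything else")
# A large intercept relative to the per-core slope would point at exactly
# the kind of "something other than the GC cores" draw speculated above.
```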
What is YOUR problem is more the question. I was commenting on what idiot would pay $200 more for the same CPU that would use even more power than the 12900k.
It’s not like we’ve never seen such products before. AMD’s entire FX 9000 series being the most egregious example, but to a lesser extent also chips like the Coffee Lake 8086k. And it’s very common on the GPU side. Don’t see what all the fuss is about, especially since it can at least claim to be the absolute best CPU.
The best Intel makes... Otherwise it's debatable.
I thought you were insulting me... ok, cool.
It’s going to be the best chip in a mainstream socket in both single and multithreaded performance.
In MT it won't, unless they increase all 16 cores' frequencies by more than 9%, and of course we won't discuss power draw...
Intel Core i9-12900K, i7-12700K & i5-12600K review: Benchmarks in applications / A record-setting test scope / Windows 10 vs. Windows 11 (www.computerbase.de)
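For what that ~9% figure implies, a trivial back-of-the-envelope sketch, assuming (generously) that MT throughput scales linearly with all-core clocks; the E-core clock below is an assumption, and the 5.0GHz P-core figure is the one that comes up later in the thread:

```python
# If ComputerBase has the 12900K ~9% behind the 5950X in MT, then under a
# (generous) assumption that MT throughput scales linearly with all-core
# clocks, a KS would need every core clocked >9% higher to close the gap.
# The 5.0 GHz P-core all-core figure is mentioned later in this thread;
# the 3.7 GHz E-core all-core figure is an assumption.
mt_deficit = 0.09
p_allcore, e_allcore = 5.0, 3.7  # GHz, 12900K all-core clocks (E-core value assumed)
print(f"P-cores would need ~{p_allcore * (1 + mt_deficit):.2f} GHz all-core")
print(f"E-cores would need ~{e_allcore * (1 + mt_deficit):.2f} GHz all-core")
```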
It trades blows with the 5950X, you can get different aggregates / geomeans depending on the benchmarks and apps used to come up with those rankings.
For example: THG https://cdn.mos.cms.futurecdn.net/VucudWo8bsQAWkjhMjNy2f-970-80.png.webp
and TPU: https://www.techpowerup.com/review/intel-core-i9-12900k-alder-lake-12th-gen/23.html
I think the 12900K (and by extension, the 12900KS) is competitive enough with the 5950X to basically call it a wash in MT, but it edges it in gaming and moderately threaded workloads, so overall it is technically 'the best' CPU in terms of performance even if the 5950X might edge it in some MT workloads. Call it a split decision, if I can use a boxing analogy. 2/3... it ain't a KO, but it's enough to get the win.
Now, is it worth the premium for a 12900KS? Probably not if you're after price/performance. But neither is an RTX 3090 vs a 3080 Ti, right? It's all relative. $200 means nothing to someone spending $5K on an insane gaming PC where only the 'best of the best' will do.
I totally agree, except for one thing everyone keeps forgetting: it does that at almost twice the power draw. So that's another point that makes it arguable whether the 12900K or KS is the best. It's the best at sucking power and generating heat; that's not arguable.
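On the point above about aggregates/geomeans shifting with the benchmark mix, here's a minimal Python sketch with made-up relative scores (not taken from THG or TPU) showing how the same two chips can swap places in the "overall" number purely based on suite selection:

```python
# Minimal sketch (hypothetical numbers, not from any review): how the
# "overall" MT ranking can flip depending on which benchmarks go into
# the aggregate. Scores are relative performance, 5950X = 1.00.
from math import prod

scores_12900k = {
    "cinebench_mt": 0.97,   # heavily threaded: assume 5950X ahead
    "blender":      0.95,
    "7zip":         0.98,
    "code_compile": 1.06,   # moderately threaded: assume 12900K ahead
    "photoshop":    1.10,
    "game_avg":     1.08,
}

def geomean(values):
    return prod(values) ** (1 / len(values))

heavy_only = [scores_12900k[k] for k in ("cinebench_mt", "blender", "7zip")]
mixed      = list(scores_12900k.values())

print(f"geomean, heavy-MT suite only: {geomean(heavy_only):.3f}")  # < 1.0, 5950X "wins"
print(f"geomean, mixed suite:         {geomean(mixed):.3f}")       # > 1.0, 12900K "wins"
```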
Yeah, it's definitely not an efficient CPU due to Intel pushing the voltage/frequency curve to (IMO) crazy levels to, well... 'get the win'. It's a win-at-all-costs strategy, probably dictated by the marketing team more than the engineering team.
My 3080 Ti is rated at 400 watts from the factory! EVGA GeForce RTX 3080 Ti FTW3 ULTRA GAMING. I downed the wattage to 250, and for what I do (F@H) it's almost the same performance. And again, I agree. And for those that think I hate Intel, check the build thread where I recommended a 12700K to someone for a gaming box.
Personally I would undervolt a 12900K to maintain close to stock clocks but at more sane power levels. Intel even does this on their own chips: the all-core clock on a 12900 non-K is only 100MHz lower than the 12900K (4.9GHz vs 5.0GHz), but it has a PL1 of approx. 200W instead of 241W.
Though to be fair, I doubt those building multi-thousand-dollar gaming rigs care too much about power efficiency. After all, an RTX 3090 is 350W+ and the rumoured TDP of a 3090 Ti is 450W! So an extra 150W on the CPU is hardly going to make them blink. Again, it's all relative. At the bleeding edge it's a totally different market where efficiency and value for the dollar aren't high on the list of priorities.
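For anyone wanting to try the power-limit route mentioned above without touching the BIOS, here's a minimal sketch assuming a Linux box with the intel_rapl driver loaded and the package domain exposed at intel-rapl:0; setting PL1/PL2 in firmware is still the usual way, and the paths/permissions here are assumptions about a typical setup:

```python
# Minimal sketch of the power-limit idea discussed above: cap PL1 on an
# Alder Lake chip from Linux via the powercap/intel-rapl interface.
# Assumptions: the intel_rapl driver is loaded, the package domain is
# intel-rapl:0, constraint_0 is the long-term limit (PL1), and the
# script runs as root.
from pathlib import Path

RAPL_PKG = Path("/sys/class/powercap/intel-rapl:0")

def read_watts(name: str) -> float:
    # powercap exposes limits in microwatts
    return int((RAPL_PKG / name).read_text()) / 1_000_000

def set_pl1(watts: float) -> None:
    (RAPL_PKG / "constraint_0_power_limit_uw").write_text(str(int(watts * 1_000_000)))

if __name__ == "__main__":
    print("current PL1:", read_watts("constraint_0_power_limit_uw"), "W")
    set_pl1(200)  # e.g. ~12900 non-K territory instead of the stock 241W
    print("new PL1:    ", read_watts("constraint_0_power_limit_uw"), "W")
```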
For MT it loses more often than it wins, otherwise there wouldn't be that 9% difference at ComputerBase; TPU uses lowly threaded benches in their average and that's why they display only a 3% difference.
Back to system power: TPU states 179W for the 5950X and 297W for the 12900K, and that's in Cinebench. I wouldn't call such a CPU competitive, or else the FX-9590 was more than competitive against an i7-3770K.
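Taking those figures at face value (TPU's 179W vs 297W whole-system Cinebench draw, and a 12900K MT deficit somewhere between the ~3% and ~9% readings mentioned above), a quick perf-per-watt comparison, with the caveat that whole-system power is only a rough proxy for CPU power:

```python
# Back-of-the-envelope MT efficiency from the figures quoted above.
# Assumption: relative MT performance of the 12900K vs the 5950X sits
# somewhere between the ~-3% and ~-9% deltas cited in the thread; power
# numbers are TPU's whole-system Cinebench draw.
power_5950x, power_12900k = 179, 297  # watts, whole system

for label, rel_perf in [("TPU-style (-3%)", 0.97), ("ComputerBase-style (-9%)", 0.91)]:
    perf_per_watt_ratio = (rel_perf / power_12900k) / (1.0 / power_5950x)
    print(f"{label}: 12900K delivers ~{perf_per_watt_ratio:.0%} of the 5950X's perf/W")
```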
Again, if you are like Mark and value efficiency, then there is no argument to be had: the 5950X wins that category. The 12900K and especially the 12900KS use a brute-force approach to get competitive MT performance...
I totally get your point, but surely you can acknowledge that there is a place for the 12900K/S where performance per watt isn't high on the agenda.
Possibly for games, since only the P cores will be put to work. In MT that's a different story; even the compiling stuff, where it is deemed very good (when 8 cores are enough), doesn't hold a candle. What if I have several projects to compile and I launch several instances?
Personally, I think the Alder Lake cores (P cores) are good for gaming and LIGHT multitasking, but for anything serious, the E cores make them a joke. For true multi-core use, all cores should be equal IMO.
Heck, now I am thinking... even the Pentium Gold might suffer against a potential octa-core Pentium Silver chip.
Probably just terrible sample and/or binning.
I have to agree. That's because I see the Atom brand returning. So expecting this lineup...
I don't think they'll call the 8-core chip Pentium Silver anymore. Also, the TDP range is definitely getting extended beyond 10W if that's the case.
Also, the 5950X scales pretty well with more power in MT workloads.