With the release of Alder Lake less than a week away, and the "Lakes" thread having turned into a nightmare to navigate, I thought it might be a good time to start a discussion thread solely for Alder Lake.
On paper, it looks good, and in gaming it wins. Not in all things, though. For example, in DC work on PrimeGrid, across two different projects: in one it gets creamed by even a 3950X, and in the other it beats the 5950X (well, 8 cores of a 5950X). So it really depends on the exact workload. And power is all over the place: 275 to 161 watts (at the wall, no video usage).

I actually agree with you that a cross-platform upgrade doesn't make the most sense since there are upgrade options available on the AM4 side, but in what world is a 5800X to 12700K a 'side grade'?
You get better IPC plus higher clocks with the 12700K, plus more cores/threads. The 12700K wins in every metric except power consumption, and even then it's not in 12900K territory because Intel didn't squeeze it to within an inch of its life. You should know this since you own a 12700K yourself.
The 12700K is a 5900X competitor; the 5800X is in the 12600K class.
FYI: You should seek real reviews and performance charts. There are certain people here who are going to recommend said product no matter what. You aren't going to get an unbiased opinion, period. Just look at the benchmarks, and at the people who can actually test the products and know what they are talking about, then make your decision. Ultimately, most "users" don't really have the resources or testing ability to form a valid opinion. Look to the people who actually know what they are doing and conduct real tests, and not just in one specific area. Look for overall performance.
And don't forget power efficiency. Overall performance really doesn't matter when it's almost impossible to cool, or your electric bill goes through the roof.
Overall performance would probably favor the 12700K *more* than simply gaming performance, though. Both are 8-core parts, but the 12700K also has 4 E-cores that would help in productivity. Gaming probably not so much, unless one is running background tasks which are shifted to the E-cores. I could see a rationale for selecting the 12700K over the 5800X for a build from scratch, but for an upgrade I don't see it making any sense unless there is some specific game/task that runs unusually well on Intel.
I get that is your go-to quote. But this is coming from someone who runs their PCs 24/7 at max performance for DC, which is a VERY, VERY limited segment.
First, digging on someone trying to help cure cancer will not buy you any goodwill. Even though I greatly dislike some people, after having had cancer (and I may still), I would not wish it on anyone.
Let's quote your own signature.
All Doing Rosetta or WCG + F@H- -2 x Ryzen 3900x \ Ryzen 3950x \ 5 x Ryzen 5950x \ 12700F
EPYC 7401 \ Dual EPYC 7601 (64c/128t for both) \ EPYC 7742 (64c/128t) \ 7551 EPYC 32c/64t
2 x (Threadripper 1950X) EPYC 7452 \ Xeon E5-2683v3 \ EPYC 7742 (64c/128t)
TR 2970wx \ TR 2990WX \ TR 2990WX \ EPYC 7B12 @2.6 ghz
3070, 3080TI, 3070TI, 6 x (2080TI) 3 x (2060) 2 x (1080TI) , 1070TI and 2060 super, 1060
So do you REALLY think that a single CPU used for X amount of hours a day is going to push up their energy bill? I mean, really? It's just an excuse. Funny enough, I never see you mention the power consumption of people doing DC and how they run their computers so hard, full time. (Are you exempt from your own preaching?) Is that better or worse than someone running a single CPU in their rig for enjoyment or work? Do you think you have a lower energy-use footprint than someone asking for a PC recommendation?
If we want to be REAL about it.....
There is a minimal impact on power usage, cost and such. I was running a 9900K before my 5950X, and I noticed no real difference in my power bill or the heating/cooling of the house. I mean, it's a PC sitting in my office. I, of course, have far more PCs and servers, but you always want to represent the worst case, which isn't what most users are asking about. If you want to provide unbiased, absolute recommendations, look past the whole DC and personal usage. The power/heat difference looks great on paper, but what real effect does it have in the real world?
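The bill-impact question above is simple arithmetic, so it's easy to sanity-check. A minimal sketch; the wattage delta, daily hours, and $/kWh rate below are illustrative assumptions, not figures from this thread:

```python
def annual_cost_usd(extra_watts: float, hours_per_day: float,
                    usd_per_kwh: float) -> float:
    """Yearly cost of drawing `extra_watts` more, `hours_per_day` per day."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# Hypothetical: 100 W extra under load, 4 h/day of full load, $0.13/kWh
print(round(annual_cost_usd(100, 4, 0.13), 2))  # 18.98
```

Around $19 a year for a fairly pessimistic gaming scenario, which is consistent with "no real difference in my power bill." A 24/7 DC box at the same delta would be six times that.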
Power consumption is broadly comparable under gaming workloads, which do not max out the processor.
First, I didn't dig on anyone trying to cure cancer, but who says DC is the right answer? I mean, SETI went on for how long, yet it found nothing. I've worked with a huge academic organization and a children's hospital on this very subject. Years and years ago, I even conducted a PC donation initiative on AGN/PC Abuses for members to donate PC parts so I could build PCs for children living in on-site rooms/wings (of the children's hospital, and off-site) so they had access to the internet and the ability to do things outside of the constant medical care. I provided tax-exemption forms for those individuals to write those parts off, and did this of my own free will.
Second, notice I said cooling and power. My 12700F will hit 300 watts sometimes, then quickly throttle down to 161 when AVX-512 kicks in. If I had a 12900K, I imagine at that power curve it would be much harder to cool, and that's been said by all sorts of reviewers. I think it's well known that Intel set the power curve and wattage limits so high on the 12900K just to win a few extra benchmarks. The 12700F is much better, but can still be trying at times.
Granted. But I am talking general usage. Until Intel disables AVX-512, and with other software like rendering, encoding and such, it does use more power. I even recommended the 12700K to a few gamers, before the 5800X3D appeared to have a near-term release date.
Honest question: have you, or any other reviewer, done an actual study on "general usage" with end users to see if they can tell the difference between similar products from company A and company B? Benchmarks are terrific for people like us, but if we are talking GENERAL usage, what percentage of people notice those extra 10 frames at 200 FPS, or saving a Word document milliseconds quicker, or a PowerPoint running faster? I've preached this for YEARS. Especially when people were saying "Intel is dead, AMD is dominating," I was like, "What are you people smoking? No normal person sees those insignificant differences." Where is the usability study, the "one cup has Pepsi, one has Coke, which do you prefer?"
Since when is your niche compute workload anything close to "general usage"? Gaming is quite likely the most intense use case their system will see, and in that context your claims are outright false.
Per Intel, AVX-512 is used for "scientific simulations, financial analytics, artificial intelligence (AI)/deep learning, 3D modeling and analysis, image and audio/video processing, cryptography and data compression". Plus, not sure why we're talking about it, because no current AMD CPU supports AVX-512. So how is it a positive or negative whether Intel does or doesn't, lol?
Also, Alder Lake doesn't support AVX512 out of the box...
When I disabled the 4 E-cores on my 12700F and ran PrimeGrid, several times the power jumped from 162 to 300 on my Kill A Watt. StefanR5R found (in log files) that it was in fact using AVX-512, at least for a few seconds; we do not know for how long. So, yes, out of the box mine certainly does.
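One way to see what the OS thinks the chip supports (separate from what the workload actually executes at runtime) is the CPUID feature flags Linux exposes in /proc/cpuinfo. A minimal sketch, assuming the standard `flags :` line format of that file:

```python
def has_avx512(cpuinfo_text: str) -> bool:
    """True if any avx512* feature flag appears in a /proc/cpuinfo dump."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            # the flags line is a space-separated list of feature names
            return any(flag.startswith("avx512") for flag in line.split())
    return False

# Usage on a Linux box:
# with open("/proc/cpuinfo") as f:
#     print(has_avx512(f.read()))
```

Note this only reports advertised support; on Alder Lake, whether AVX-512 is exposed at all depended on the BIOS/microcode revision and on the E-cores being disabled.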
Out of curiosity, please feel free to post screenshots. How long did the computer jump from 162 to 300 watts? It would be nice to see the data, and/or video of the jump, over an extended period of time.
1) Set your PL2 value lower, or set the tau to a short period of time. Then your CPU won't spike so high, or at least won't spike for any amount of time that matters.
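Besides the BIOS, on Linux the PL1/PL2 limits mentioned above can be inspected (and, with root, written) through the powercap sysfs tree. A sketch under the assumption that the package domain is `intel-rapl:0` and that constraints 0/1 map to the long-term/short-term limits, which is the usual layout:

```python
RAPL = "/sys/class/powercap/intel-rapl:0"

def watts_to_uw(watts: float) -> int:
    """RAPL sysfs files express power limits in microwatts."""
    return int(watts * 1_000_000)

def limit_path(constraint: int) -> str:
    # constraint 0 is typically the long-term limit (PL1),
    # constraint 1 the short-term limit (PL2); tau lives in the
    # matching constraint_*_time_window_us file.
    return f"{RAPL}/constraint_{constraint}_power_limit_uw"

# e.g. capping PL2 at 190 W (needs root; board firmware may override):
#   echo 190000000 > /sys/class/powercap/intel-rapl:0/constraint_1_power_limit_uw
print(watts_to_uw(190), limit_path(1))
```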
With the 4 E-cores it was taking 275 watts. Right now it's doing 162 with them disabled. I know the 4 E-cores should not take 100 watts; I am just telling you what I see. I watch these all the time, and they are pretty real-time.
These run on Xeons as well, in Linux. Maybe even a lot of them, I really don't know. The performance difference has got to be substantial for them to go to the trouble of writing AVX-512-optimized code.
Rosetta@home was used to create a drug that KILLED/CURED COVID-19 (in the lab only) in 5 doses. Not sure when it will come to fruition, if ever, but maybe it has steered research that helped create the vaccines.
Yes, very true. If you want a decent Z690 Alder Lake board, the cheapest I could find was $220, while on AM4 I have a $150 motherboard on my last 5950X. The memory for DDR4 is the same, but if you go DDR5 on Alder Lake, the price just skyrockets, with not that much more performance. That's why I went DDR4. And 3200 CL14 is the fastest DDR4 you can use, due to gear-down. There are cheap Alder Lake boards, but I wanted something midstream, so I went Z690.

One thing I think is often forgotten when talking value is the total cost of motherboard, memory and CPU. It's very seldom you buy a CPU without buying the rest.
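The platform-cost point is easy to make concrete. The two board prices below are from the post above; the CPU and memory prices are placeholder assumptions, just to show the shape of the comparison:

```python
def platform_total(parts: dict) -> int:
    """Total cost of a CPU + motherboard + memory bundle."""
    return sum(parts.values())

# Board prices from the post; CPU/RAM prices are hypothetical.
alder_lake_ddr4 = {"cpu_12700k": 400, "board_z690": 220, "ddr4_32gb": 120}
am4            = {"cpu_5950x": 550, "board_am4": 150, "ddr4_32gb": 120}

print(platform_total(alder_lake_ddr4))  # 740
print(platform_total(am4))              # 820
```

Swapping the DDR4 entry for a DDR5 kit is where the Alder Lake total "skyrockets", since the CPU and board lines barely move.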