Future ARM Cortex + Neoverse µArchs Discussion


Doug S

Platinum Member
Feb 8, 2020
2,536
4,171
136
I'd love to see platforms report Geekbench's peak power draw. Phoronix did once with the 7840U.

Geekbench 6.1 | AMD 7840U | TSMC N4
1T / single-core: ~21W
nT / multi-core: ~30W

Unfortunately, that is the max power consumption, because Geekbench has intervals of zero CPU load between sub-tests.


While I agree that would be nice, the problem is that there's no standard way to measure peak power draw, so GB6 couldn't include it in the app's results. Even if you measure it manually, as some outlets (Phoronix, AnandTech) have, the APIs that various platforms provide aren't measuring the same thing, so the numbers aren't necessarily useful for comparing Apple vs Qualcomm or Intel vs AMD.
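For what it's worth, on Linux you can approximate a peak figure yourself by polling the RAPL energy counters while a benchmark runs elsewhere. A minimal sketch (assumes package-domain RAPL is exposed at `/sys/class/powercap/intel-rapl:0/energy_uj`; the helper names are mine):

```python
import time

# Package-domain cumulative energy counter, in microjoules (Linux, Intel/AMD)
RAPL_PATH = "/sys/class/powercap/intel-rapl:0/energy_uj"

def peak_power_w(samples):
    """samples: list of (energy_uj, timestamp_s) pairs.
    Returns the highest average power (W) over any adjacent pair of samples."""
    peak = 0.0
    for (e0, t0), (e1, t1) in zip(samples, samples[1:]):
        if e1 >= e0 and t1 > t0:  # the counter wraps around; skip wrapped intervals
            peak = max(peak, (e1 - e0) / 1e6 / (t1 - t0))
    return peak

def collect(duration_s=10.0, interval_s=0.1):
    """Poll the RAPL counter for duration_s while the benchmark runs."""
    samples = []
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        with open(RAPL_PATH) as f:
            samples.append((int(f.read()), time.monotonic()))
        time.sleep(interval_s)
    return samples
```

At a 100 ms interval this catches short bursts between GB6 sub-tests far better than a whole-run average would, though it still isn't comparable across vendors for exactly the reasons above.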
 
Reactions: ikjadoon

FlameTail

Diamond Member
Dec 15, 2021
3,264
1,880
106


Well now we know why the rumours of a Snapdragon 8 gen 3 with X4 @ 3.7 GHz didn't pan out.

The power consumption is high at 3.3 GHz already.
 
Reactions: Tlh97 and Saylick

soresu

Diamond Member
Dec 19, 2014
3,024
2,265
136
View attachment 88942

Well now we know why the rumours of a Snapdragon 8 gen 3 with X4 @ 3.7 GHz didn't pan out.

The power consumption is high at 3.3 GHz already.
The big power increase vs the SD8G2 suggests the X3 was already straddling the optimal position on the frequency curve, while the X4 just asks too much of the µArch for minimal gain, especially on the FP side, where a laughably pitiful perf increase comes with such a large jump in power draw.

Enough to make me wonder if QC didn't set up the SD8G3 to fail on purpose, so that the Oryon-based SD8G4 will look better by comparison.

Does the same source have Dimensity 9300 power draw figures to compare to?
 
Reactions: Tlh97 and Saylick

Doug S

Platinum Member
Feb 8, 2020
2,536
4,171
136
Enough to make me wonder if QC didn't set up the SD8G3 to fail on purpose, so that the Oryon-based SD8G4 will look better by comparison.

That's a ridiculous idea. Qualcomm's high end will be the G3 for an entire year. If it was "set up to fail on purpose" they'd be handing design wins to Mediatek who is showing they are now able to compete at the high end. Why should Qualcomm want to reduce its own revenue and increase Mediatek's, just to make the jump to G4 look bigger?

By that theory you could suggest Apple has been sandbagging its SoCs the last few years, planning to make huge gains with A18. At least Apple would be able to get away with that without losing revenue, since they remain the top performer and have a mostly captive market. But it still wouldn't make any sense.

Now maybe the G3's design was finalized by the B team, while the A team was working on Oryon. That's believable and could explain why it is not able to match up to "rumors".
 

soresu

Diamond Member
Dec 19, 2014
3,024
2,265
136
they'd be handing design wins to Mediatek
QC keeps its dominance mostly through its modems rather than anything else.

Mediatek also have pretty poor software standards for their device drivers relative to QC and Samsung.

Final point is that a single generation gone bad or b0rked will not prompt a revolt among OEMs.

It didn't happen with SD810, and it won't happen here either.
 
Reactions: Tlh97 and Saylick

soresu

Diamond Member
Dec 19, 2014
3,024
2,265
136
By that theory you could suggest Apple has been sandbagging its SoCs the last few years, planning to make huge gains with A18
Some would say that's exactly what Apple did with their Intel hardware updates prior to the M1 Macs, which is where I got the idea in the first place.
 

Doug S

Platinum Member
Feb 8, 2020
2,536
4,171
136
QC keeps its dominance mostly through its modems rather than anything else.

Time has run out on that advantage, since Qualcomm's proprietary 3G CDMA is no longer a factor. Even if Qualcomm's modems are better at pulling in signals on the fringes, that's not going to influence people's purchase decisions. People didn't avoid buying iPhones when they were using Intel's modems (and while I don't know, I'm gonna assume Mediatek's are a lot closer to Qualcomm's in quality than Intel's were), nor do they avoid buying Samsung phones using their own modem. Some people avoided Samsungs with Exynos, but it wasn't due to the modem; it was because until recently they used their crappy in-house CPU cores that were worse than ARM-designed cores.
 

FlameTail

Diamond Member
Dec 15, 2021
3,264
1,880
106
Time has run out on that advantage, since Qualcomm's proprietary 3G CDMA is no longer a factor. Even if Qualcomm's modems are better at pulling in signals on the fringes, that's not going to influence people's purchase decisions. People didn't avoid buying iPhones when they were using Intel's modems (and while I don't know, I'm gonna assume Mediatek's are a lot closer to Qualcomm's in quality than Intel's were), nor do they avoid buying Samsung phones using their own modem. Some people avoided Samsungs with Exynos, but it wasn't due to the modem; it was because until recently they used their crappy in-house CPU cores that were worse than ARM-designed cores.
Samsung's integrated modems in their Exynos processors seem to be fine.

But what's not fine is their discrete modems. If you go to r/GooglePixel, for instance, you can find a gazillion posts about signal/modem issues with recent Pixel phones, all of which used discrete Samsung modems.
 

ikjadoon

Member
Sep 4, 2006
154
260
146
As these results show, Arm cores still use ~25% more power on a similar process node to lose to an A16 or A15 by 8-14%.
The X4 is a generic Firestorm-class core: it matches an M1/A15 roughly on GB5 and comes close on SPEC, albeit with 26% more power draw (in the SPEC example, and probably similar for GB5 with the 8 Gen 3).
View attachment 88942

Well now we know why the rumours of a Snapdragon 8 gen 3 with X4 @ 3.7 GHz didn't pan out.

The power consumption is high at 3.3 GHz already.

What a significant power increase, especially for SPECfp2017: perf / W dropped by 15%. Massive degradation.

Has Geekerwan published a methodology of its power testing?

Translating their legend via Google, "主板功耗 (W)" = "Mainboard power consumption (W)", which if true is decidedly not just CPU power. Is there a chance something else (DRAM, VRM, medium / little cores on background tasks) has eaten up power? I don't believe these are idle-normalized numbers. Geekerwan, unfortunately, does not probe the issue.
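If one wanted to idle-normalize mainboard figures like these, the usual rough approach is just subtracting an idle baseline taken on the same device. A sketch (the function name and example numbers are mine, not Geekerwan's):

```python
def idle_normalized_w(load_mainboard_w, idle_mainboard_w):
    """Rough CPU-attributable power: subtract the idle mainboard draw
    (display, DRAM refresh, VRM losses) from the under-load reading.
    Assumes the non-CPU components draw about the same at idle and under
    load; background tasks on the medium/little cores would still be
    wrongly counted against the big core."""
    return load_mainboard_w - idle_mainboard_w

# e.g. 14.0 W mainboard under load, 2.0 W at idle -> ~12.0 W attributable
```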

Not discounting their results outright: Arm might very well have shipped a very poor floating point design in the X4.

I don't know how AnandTech got its numbers, but losing 1W+ somewhere (either the CPU or the mainboard) for +6% perf looks like a 'failure' in mobile CPU design.
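The perf/W claim is easy to sanity-check with the figures quoted in this thread (the exact power delta is my round-number assumption, in line with the ~26% quoted earlier):

```python
# ~+6% perf for roughly +25% power, per the figures discussed above
perf_gain = 1.06    # SPECfp gain, X4 vs X3 (quoted above)
power_gain = 1.25   # assumed round number near the ~26% extra draw quoted
ratio = perf_gain / power_gain
print(f"perf/W vs predecessor: {ratio:.3f}")  # prints 0.848, i.e. ~15% worse
```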

While I agree that would be nice, the problem is that there's no standard way to measure peak power draw, so GB6 couldn't include it in the app's results. Even if you measure it manually, as some outlets (Phoronix, AnandTech) have, the APIs that various platforms provide aren't measuring the same thing, so the numbers aren't necessarily useful for comparing Apple vs Qualcomm or Intel vs AMD.

Ah, apologies. I'd meant platforms as news organisations / websites, but your point stands: there's not a great way to compare between manufacturers, but even intra-brand comparisons would be helpful.

Though, right, would need agreement on one tool.
 

FlameTail

Diamond Member
Dec 15, 2021
3,264
1,880
106
Apparently the reason Mediatek hasn't made PC chips is that there are no Mali drivers for Windows.
 

FlameTail

Diamond Member
Dec 15, 2021
3,264
1,880
106
Am I the only one who really wants Nvidia to enter the PC CPU space with ARM CPUs?

With their legacy as one of the big 3 PC companies (the others being Intel and AMD), and their formidable GPU IP, Nvidia entering the PC CPU space is what will truly cement Windows on ARM.
 
Jul 27, 2020
18,275
12,000
116
Nvidia entering the PC CPU space is what will truly cement Windows on ARM.
Question is, will they enable their chips to be used in $300 to $700 laptops? That's what the average person wants to spend on a laptop, even when they really need one. If they don't need a laptop, there has to be something very special, like a killer app or game, to make them want one.
 

soresu

Diamond Member
Dec 19, 2014
3,024
2,265
136
Apparently the reason Mediatek hasn't made PC chips is that there are no Mali drivers for Windows.
Samsung having gone with AMD does give them an advantage here.

That being said, as long as ARM can field a decent Mali Vulkan driver for Windows, DXVK and Zink can probably handle the rest, in the interim at least.
 

FlameTail

Diamond Member
Dec 15, 2021
3,264
1,880
106
Question is, will they enable their chips to be used in $300 to $700 laptops? That's what the average person wants to spend on a laptop, even when they really need one. If they don't need a laptop, there has to be something very special, like a killer app or game, to make them want one.
I don't think it matters.

The reason why I want Nvidia to enter the PC space is that it will finally PROVE that WindowsOnARM is a viable alternative to WinOnX86.

Qualcomm is basically a nobody in the PC industry. That's the truth. The Snapdragon X processors will change that perception, but not overnight. A lot of people in the PC/hardware community don't even know the Snapdragon X Elite exists, and many of those who do, scoff at the concept of WindowsOnARM.

I believe it's Nvidia, as a legacy player of the PC industry, who can truly change this situation.
 

soresu

Diamond Member
Dec 19, 2014
3,024
2,265
136
I believe it's Nvidia, as a legacy player of the PC industry, who can truly change this situation.
I don't see why Samsung SoCs using AMD gfx would be any worse in that respect.

Nvidia counts no better than any other player here.

They haven't had any presence as a full-fledged SoC maker in any PC or mobile market since the TX1, which is well known to have had a significant security flaw. And it's not like they are overflowing with Tegra orders either, given that the current biggest player in the smart automotive market designs its own chips.

AMD on the other hand could just put ARM Ltd cores into their IP and be off to the races.

Of course the reality is a bit more complex than this, but certainly not so much that it would present AMD much trouble with their combined engineering experience and resources.
 
Reactions: Tlh97

Tup3x

Golden Member
Dec 31, 2016
1,016
1,009
136
I don't see why Samsung SoCs using AMD gfx would be any worse in that respect.

Nvidia counts no better than any other player here.

They haven't had any presence as a full-fledged SoC maker in any PC or mobile market since the TX1, which is well known to have had a significant security flaw. And it's not like they are overflowing with Tegra orders either, given that the current biggest player in the smart automotive market designs its own chips.

AMD on the other hand could just put ARM Ltd cores into their IP and be off to the races.

Of course the reality is a bit more complex than this, but certainly not so much that it would present AMD much trouble with their combined engineering experience and resources.
I don't follow your logic. What you posted hardly makes sense, especially relative to the previous paragraph. Guess what? NVIDIA could do the same.

NVIDIA has been making ARM SoCs with GeForce for ages now. AMD hasn't. One could argue that the Exynos 2200 would count, but based on that, the future doesn't look that great: the GPU part was definitely the weakest link.
 
Reactions: Nothingness

FlameTail

Diamond Member
Dec 15, 2021
3,264
1,880
106
I don't know if Samsung has the license to use RDNA in PC SoCs.

That is what is said about Mediatek and Nvidia: Mediatek licensed Nvidia's GPU IP, but apparently only for automotive chips, and possibly mobile chips. Come to think of it, if Mediatek makes PC chips in direct competition, why should Nvidia help them?
 

soresu

Diamond Member
Dec 19, 2014
3,024
2,265
136
the GPU part was definitely the weakest link
Rembrandt's power efficiency with the same RDNA2 generation shows that the fault is clearly in Samsung's court there - the implementation of it on their own process node didn't work out so well.

I guess it will depend on how well AMD works with them in the future as to whether their implementations improve. Perhaps this is part of why AMD is diversifying fab sources: to better familiarise itself with Samsung's nodes for future semi-custom work.
NVIDIA has been making ARM SoCs with GeForce for ages now
As I said: automotive sales aside, they have had zero presence in any OS market share since the TX1.

It's not like they are exactly killing it in the automotive department either, considering many, if not most, major automakers seem to be looking to go with their own homegrown SoC solutions for autonomous driving and ADAS.
 
Reactions: Tlh97