Discussion Apple Silicon SoC thread


Eug

Lifer
Mar 11, 2000
23,871
1,438
126
M1
5 nm
Unified memory architecture - LPDDR4X
16 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 12 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache
(Apple claims the 4 high-efficiency cores alone perform like a dual-core Intel MacBook Air)

8-core iGPU (but there is a 7-core variant, likely with one inactive core)
128 execution units
Up to 24,576 concurrent threads
2.6 Teraflops
82 Gigatexels/s
41 gigapixels/s

16-core neural engine
Secure Enclave
USB 4

Products:
$999 ($899 edu) 13" MacBook Air (fanless) - 18 hour video playback battery life
$699 Mac mini (with fan)
$1299 ($1199 edu) 13" MacBook Pro (with fan) - 20 hour video playback battery life

Memory options 8 GB and 16 GB. No 32 GB option (unless you go Intel).
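A quick arithmetic sanity check on the GPU numbers above, assuming the widely reported (not Apple-official) ~1.278 GHz M1 GPU clock:

```python
# M1 GPU peak FP32 throughput, derived from the spec-sheet figures.
# Assumption: ~1.278 GHz GPU clock (widely reported, not confirmed by Apple).
cores = 8            # 8-core iGPU
eus_per_core = 16    # 128 execution units total
alus_per_eu = 8      # each EU is assumed 8 FP32 ALUs wide
clock_ghz = 1.278

alus = cores * eus_per_core * alus_per_eu    # 1024 FP32 ALUs
tflops = alus * 2 * clock_ghz / 1000         # 2 FLOPs per FMA per cycle

print(f"{alus} ALUs -> {tflops:.2f} TFLOPS")  # 1024 ALUs -> 2.62 TFLOPS
```

That lands right on the 2.6 TFLOPS Apple quotes, which suggests the 128-EU / 1024-ALU breakdown is the right reading of the spec sheet.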

It should be noted that the M1 chip in these three Macs is the same (aside from GPU core count). Basically, Apple is taking the same approach with these chips as it does with iPhones and iPads: just one SKU (excluding the X variants) that is the same across all iDevices, aside from occasional slight clock-speed differences.

EDIT:



M1 Pro 8-core CPU (6+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 16-core GPU
M1 Max 10-core CPU (8+2), 24-core GPU
M1 Max 10-core CPU (8+2), 32-core GPU

M1 Pro and M1 Max discussion here:


M1 Ultra discussion here:


M2 discussion here:


Second Generation 5 nm
Unified memory architecture - LPDDR5, up to 24 GB and 100 GB/s
20 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 16 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache

10-core iGPU (but there is an 8-core variant)
3.6 Teraflops

16-core neural engine
Secure Enclave
USB 4

Hardware acceleration for 8K H.264, HEVC (H.265), and ProRes

M3 Family discussion here:


M4 Family discussion here:

 

Doug S

Platinum Member
Feb 8, 2020
2,836
4,820
136
My guess is they wanted to get to 8533 MT/s RAM on the M3 Pro/Max, but something went wrong. The M2 Pro has a 256-bit 6400 MT/s memory bus, so 204.8 GB/s theoretical memory bandwidth. The M3 Pro got a smaller 192-bit bus; to keep the same peak memory speed, they'd need RAM running at least 8533 MT/s.

Have you seen any benchmarks (of actual applications, not synthetic stuff like STREAM type benchmarks) showing a regression from M2 Pro to M3 Pro that can be pinned on the reduced memory bandwidth? Not sure what could go "wrong" switching to LPDDR5X, it isn't like that was a brand new technology when M3 came out and they couldn't get enough parts (even if that was an issue they could have used LPDDR5 in the base M3 and LPDDR5X only for Pro/Max)

I think they made that change for other reasons, and because it didn't matter too much to actual performance. Because of the relatively fixed memory configurations (only two LPDDR module sizes) it might have had to do with the switch from 8Gb to 12Gb DRAM chips and product segmentation (i.e. memory size, pricing, competitive position vs the base M3 and the M3 Max)
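For what it's worth, the bandwidth arithmetic in the quoted post checks out; theoretical peak is just bus width times transfer rate:

```python
# Theoretical peak LPDDR bandwidth: (bus width in bytes) * (transfers per second).
def bandwidth_gbs(bus_bits: int, mts: int) -> float:
    """Peak bandwidth in GB/s for a given bus width (bits) and data rate (MT/s)."""
    return bus_bits / 8 * mts / 1000

print(bandwidth_gbs(256, 6400))  # M2 Pro: 204.8 GB/s
print(bandwidth_gbs(192, 6400))  # M3 Pro as shipped: 153.6 GB/s
print(bandwidth_gbs(192, 8533))  # 192-bit at 8533 MT/s: ~204.8 GB/s again
```

So a 192-bit bus at 8533 MT/s would indeed have matched the M2 Pro's 256-bit 6400 MT/s setup almost exactly.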
 

SpudLobby

Senior member
May 18, 2022
991
682
106
Lmao. The per-unit cost for Apple to do their own modem, provided it's as good as you seem to imagine is currently the case, is going to be much less than Qualcomm's pricing for a modem. That's the whole reason they bought Intel's division, dude. They want to build something that's almost as good, or not noticeably different, but lower cost. Qualcomm's modems are unbelievably good from the perspectives of peak performance, idle and dynamic power, compatibility and standards, and reception.

They didn’t buy Intel’s division because QC modems weren’t cutting it.
I honestly can’t believe I had to explain this.
 

SpudLobby

Senior member
May 18, 2022
991
682
106
I expect they will use LPDDR5X; it reduces power and increases performance. But Apple needs WAY more volume than Android OEMs, who put it in their flagship devices, plus there is some degree of "our competition is doing it so we must do it too for when buyers compare specs" that Apple doesn't have to worry about.

When the availability and price work for them, they'll switch. When they do, no doubt people will whine "when is Apple going to support LPDDR5T!"
Yeah, nailed it. They've always been slightly behind on RAM adoption, whereas Android flagships are usually ahead and spread across various memory types (albeit generally newer ones) in a given year. Apple tends to wait for a bit more maturity, both for volume and probably cost reasons. LPDDR5-6400 standard for the whole flagship lineup is fine; it's not the end of the world.
 

GC2:CS

Member
Jul 6, 2018
32
19
81
Have you seen any benchmarks (of actual applications, not synthetic stuff like STREAM type benchmarks) showing a regression from M2 Pro to M3 Pro that can be pinned on the reduced memory bandwidth? Not sure what could go "wrong" switching to LPDDR5X, it isn't like that was a brand new technology when M3 came out and they couldn't get enough parts (even if that was an issue they could have used LPDDR5 in the base M3 and LPDDR5X only for Pro/Max)

I think they made that change for other reasons, and because it didn't matter too much to actual performance. Because of the relatively fixed memory configurations (only two LPDDR module sizes) it might have had to do with the switch from 8Gb to 12Gb DRAM chips and product segmentation (i.e. memory size, pricing, competitive position vs the base M3 and the M3 Max)
Back in the day Apple was super aggressive about scaling memory bandwidth: a 128-bit bus in 2012, a 51.2 GB/s SoC in 2015. They adopted new memory generations the day they were available.
Then they got stuck on LPDDR4X from the A11 through the A15, only moving to LPDDR5 on the M1 Pro and Max, then the M2, then the A16.

One theory is that they have leveled up their latency-hiding and cache game so much that it is simply no longer advantageous to scale the bandwidth. Also, performance growth has leveled off, so the chips can cope with less bandwidth, helped by things like the lossless and lossy memory compression added in the A12 and A15.

That the negatives outweigh the positives?

Anybody invested in this who can give an educated opinion?
 

eek2121

Diamond Member
Aug 2, 2005
3,108
4,410
136
The iPad was introduced in 2010, long before tethering was an option for most people because cellular companies discouraged or even actively prevented tethering by their customers.

Another reason you don't see it built into laptops is that you can get a little USB cellular dongle and have cellular capability in any laptop you want, rather than selecting from the small number that have it as a BTO option.
They are built into many laptops, just not MacBooks.

If you use a laptop for an 8 hour workday, you are killing your phone’s battery if you are tethering the entire time.
Lmao. The cost for Apple to do their own per unit, provided it’s good as you seem to imagine is the case currently, is going to be much less than Qualcomm’s pricing for a modem. That’s the whole reason they bought Intel’s division dude. They want to build something that’s almost as good or not noticeably different but lower cost. Qualcomm’s modems are unbelievably good from peak performance, idle + dynamic power, compatibility and standards perspectives and reception.

They didn’t buy Intel’s division because QC modems weren’t cutting it.
I guess it went over your head so I will try again: Qualcomm owns patents on many different aspects of cellular connectivity. In the past, they have abused their patents to force everyone to use their modems. It doesn’t matter how much each unit costs.

Hypothetical Example:

Apple modem: $1.00/unit. Patent licensing: $100/unit
QC modem: $25/unit

If you were Apple, which would you choose?
 

Doug S

Platinum Member
Feb 8, 2020
2,836
4,820
136
I guess it went over your head so I will try again: Qualcomm owns patents on many different aspects of cellular connectivity. In the past, they have abused their patents to force everyone to use their modems. It doesn’t matter how much each unit costs.

Hypothetical Example:

Apple modem: $1.00/unit. Patent licensing: $100/unit
QC modem: $25/unit

If you were Apple, which would you choose?

Apple owns a fair number of cellular patents from their Intel purchase, as well as previous purchases. If they really wanted to prevent Qualcomm from troubling them again in this manner (after they had their own modem, whether internally developed or acquired elsewhere) they just need a critical mass of LTE/5G patents. If they have enough, they can force Qualcomm into a patent cross-licensing deal. The reason those exist is that one side can play Qualcomm's patent licensing games, but if both sides have a ton of patents it is like mutually assured destruction. Basically, Qualcomm agrees to a cross-license deal because the implied threat is "OK, if you want to play patent games we have plenty of patents too, let's play and see who gets hurt the worst!"

This is why I suggested what Apple should do is make a deal with Mediatek. Buy their modem design (so basically the two companies fork their development efforts at that point, each further developing from the modem at the time of the deal) and get rights to their IP so Qualcomm can no longer pressure them. Maybe spin that IP off into a jointly owned patent holding organization that licenses all those patents to both Mediatek and Apple forever, with licensing split - so Apple can leverage those patents against Qualcomm and force them into a cross licensing deal which would also benefit Mediatek (actually for all I know they already have a cross licensing deal with Qualcomm)
 

SpudLobby

Senior member
May 18, 2022
991
682
106
They are built into many laptops, just not macbooks.

If you use a laptop for an 8 hour workday, you are killing your phone’s battery if you are tethering the entire time.

I guess it went over your head so I will try again: Qualcomm owns patents on many different aspects of cellular connectivity. In the past, they have abused their patents to force everyone to use their modems. It doesn’t matter how much each unit costs.

Hypothetical Example:

Apple modem: $1.00/unit. Patent licensing: $100/unit
QC modem: $25/unit

If you were Apple, which would you choose?
Are you under the impression Apple bought Intel’s modem unit for fun? Or that they aren’t explicitly aware of Qualcomm’s IP charges? No, the economics still won’t favor buying a Qualcomm modem if the Apple modem is even decent. They dropped all litigation a few years ago and have a cross-licensing deal that covers exactly this, as Doug explains and as I believe is understood by most on this forum. Does it cover all? Maybe not, but last I checked they’ve dropped litigation, it’s over and again they bought Intel’s unit. Even Qualcomm were telling investors Apple were going to switch by 2025, what shifted was Apple’s internal progress on the hardware.
 

Doug S

Platinum Member
Feb 8, 2020
2,836
4,820
136
Are you under the impression Apple bought Intel’s modem unit for fun? Or that they aren’t explicitly aware of Qualcomm’s IP charges? No, the economics still won’t favor buying a Qualcomm modem if the Apple modem is even decent. They dropped all litigation a few years ago and have a cross-licensing deal that covers exactly this, as Doug explains and as I believe is understood by most on this forum. Does it cover all? Maybe not, but last I checked they’ve dropped litigation, it’s over and again they bought Intel’s unit. Even Qualcomm were telling investors Apple were going to switch by 2025, what shifted was Apple’s internal progress on the hardware.

Apple does not have a cross licensing deal with Qualcomm, they have a licensing deal. That gives them certainty of price, but that price is a lot higher than it would be if they had a cross licensing deal - which typically means neither side pays the other anything for patent licensing. Now that Qualcomm is designing their own CPUs they may have a greater incentive to pursue a wide cross licensing deal with Apple - something covering cellular as well as CPU, where Apple would have the stronger patent portfolio. Otherwise Apple might have some pressure points of their own to play against Qualcomm once they are no longer buying Qualcomm modems.

The deal Apple made with Qualcomm included a two year option, which Apple exercised last year. All the estimates (including Qualcomm's) for when Apple would stop using Qualcomm modems assumed they would not need that two year option. I've seen rumors indicating that the baseband software effort is in complete disarray, depending on how bad that is two years may not be enough.
 

SpudLobby

Senior member
May 18, 2022
991
682
106
Apple does not have a cross licensing deal with Qualcomm, they have a licensing deal. That gives them certainty of price, but that price is a lot higher than it would be if they had a cross licensing deal - which typically means neither side pays the other anything for patent licensing. Now that Qualcomm is designing their own CPUs they may have a greater incentive to pursue a wide cross licensing deal with Apple - something covering cellular as well as CPU, where Apple would have the stronger patent portfolio. Otherwise Apple might have some pressure points of their own to play against Qualcomm once they are no longer buying Qualcomm modems.

The deal Apple made with Qualcomm included a two year option, which Apple exercised last year. All the estimates (including Qualcomm's) for when Apple would stop using Qualcomm modems assumed they would not need that two year option. I've seen rumors indicating that the baseband software effort is in complete disarray, depending on how bad that is two years may not be enough.
I stand corrected. Still, they simply wouldn't have bought Intel's unit and continued development if the cost of rolling their own, even with patents, were as high as Eek believes; that makes no sense in light of the evidence we have.

Now, I’ve seen the same rumors re: the modem, but that’s peripheral to Eek’s claim (and I’ve said this: their current problem remains that their own hardware just isn’t good enough yet). So yes, they extended their contract for Qualcomm’s modems. Again, I know.
 

SpudLobby

Senior member
May 18, 2022
991
682
106
And yeah Doug, if Apple is smart they’ll try to wield their patent portfolio elsewhere to negotiate a cross-licensing deal that’s more favorable. I’m not sure it looks likely in the near term though, they seem to have quite a bit on their plate already and the modem unit itself has to develop something worthwhile — though as discussed, it doesn’t have to be as good as QC’s stuff. Just good enough.
 

Mopetar

Diamond Member
Jan 31, 2011
8,141
6,838
136
All of the mobile patents that are part of any of the standards have to be licensed on FRAND terms. Apple can't use any of them to force a cross-licensing deal.

Apple has enough volume that it would make sense for them to try to develop their own modem, but their efforts haven't panned out the way they had hoped.

Anyone who really needs to tether all day but doesn't want to use their phone could just get one of the stand-alone devices built for that purpose.
 

mikegg

Golden Member
Jan 30, 2010
1,847
471
136
First leak of Apple A18 Pro!

ST performance looks solid. >15% improvement. MT performance might lag behind next-gen Android SoCs (said to exceed 10,000).
Who is this person? Seems to be just a guess.

I'd be surprised if it's 3,500. That's nearly a 17% increase. I'm going to guess 10%. Unless Apple really went conservative with its N3B design and will put everything into the N3E design.
 

mikegg

Golden Member
Jan 30, 2010
1,847
471
136
Judging by the Apple Vision Pro reviews, future Mx chips definitely need more GPU grunt and display engines.
Why display engines?

You mean to allow more than one external monitor when connected to a Mac? That's actually a hardware limitation on the Mac side.

I think the reason they can only do one external monitor via Vision Pro is due to the wireless technology between the Mac and VP not being good enough for more.
 

Doug S

Platinum Member
Feb 8, 2020
2,836
4,820
136
All of the mobile patents that are part of any of the standards have to be licensed on FRAND terms. Apple can't use any of them to force a cross-licensing deal.

Apple has enough volume that it would make sense for them to try to develop their own modem, but their efforts haven't panned out they way they had hoped.

Anyone who really needs to tether all day but doesn't want to use their phone could just get one of the stand-alone devices built for that purpose.


FRAND doesn't mean FRAND when it comes to Qualcomm.

They have interpreted (and courts have unfortunately upheld) that it does not violate FRAND to charge for standards essential patents as a percentage of the price of the device it goes into. So they will charge 4x more if you use their patent in a device that costs 4x as much.

And yes, you most definitely can use standards-essential patents to force cross-licensing deals. That's how they typically happen, in fact. Apple holds some of the standards-essential patents for LTE and 5G, but only a few percent of them, far less than Qualcomm, so they can't force cross licensing. But with Qualcomm designing their own high-end CPU cores, and following the path Apple first trod, things may get interesting. But "interesting" only if Apple has its own modem (whether internally developed or acquired). If Apple needs to come crawling to Qualcomm asking for a new deal to buy their modems, they can't simultaneously put pressure on them patent-wise. In fact, I'm willing to bet almost anything that Qualcomm's contract allows them to end the deal if Apple sues them.
 
Reactions: SpudLobby

KaiCor

Junior Member
Feb 8, 2024
6
4
36
FRAND doesn't mean FRAND when it comes to Qualcomm.

They have interpreted (and courts have unfortunately upheld) that it does not violate FRAND to charge for standards essential patents as a percentage of the price of the device it goes into. So they will charge 4x more if you use their patent in a device that costs 4x as much.

And yes you most definitely can use standards essential patents to force cross licensing deals. That's how they typically happen in fact. Apple holds some of the standards essential patents for LTE and 5G, but only a few percent of them, far less than Qualcomm, so they can't force cross licensing. But with Qualcomm designing their own high end CPU cores, and following the path Apple first trod, things may get interesting. But "interesting" only if Apple has its own modem (whether internally developed or acquired) If Apple needs to come crawling to Qualcomm asking for a new deal for buying their modems, they can't simultaneously put pressure on them patent wise. In fact I'm willing to bet almost anything that Qualcomm's contract allows them to end the deal if Apple sues them.


I have a hard time envisioning a situation where Apple sues Qualcomm. Most likely, they will decide to resolve it in a different way.
 

repoman27

Senior member
Dec 17, 2018
381
536
136
Hmmm... not seeing any UltraFusion D2D on this shot of the M3 Max. Unless that image is cropped, it looks like there won't be an M3 Ultra based on that die. I don't recall ever seeing a Palma 2C codename anywhere either.


Source: @techanalye1 on X:
Also surprised nobody seems to be talking about the R1 / APL1W08, which certainly looks like a 2.5/3D tiled package design.



Source: iFixit: https://www.ifixit.com/Guide/Apple+Vision+Pro+Chip+ID/169813
 

FlameTail

Diamond Member
Dec 15, 2021
4,095
2,465
106
Does Apple bin the caches in their SoCs?

If the 30-core GPU version of the M3 Max still has the same size SLC as the 40-core, it's not that bad, is it? Despite the cut-down memory bus.
 

Doug S

Platinum Member
Feb 8, 2020
2,836
4,820
136
Interestingly, I saw a rumor yesterday claiming that the A18 would have "a lot more" NPU cores. Whether that has to do with helping out other units like the GPU as above, or with Apple's patents for running an on-device LLM that relies on NAND instead of DRAM (since DRAM on a phone or typical PC is way too small to hold a general LLM), who knows. I think they use dedicated ISP hardware for photography, but they might be able to cut that down in size and use the NPU for better area efficiency.

It is a very tiny portion of the overall die, so it has been suggested Apple could go 2-4x larger with little size penalty on the overall SoC if they had a use for that extra NPU power. Maybe that will become one of the A18 vs A18 Pro differentiators, since both will be made on N3E this time around.
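The "LLM weights in NAND instead of DRAM" idea mentioned above boils down to not loading the whole model into memory, but faulting in only the weight pages a given inference step actually touches. A minimal sketch of that idea using a memory map (the file name, layout, and dimensions here are hypothetical, not Apple's actual scheme):

```python
# Sketch: read one layer's weights straight from flash via a memory map,
# so only the pages actually accessed get pulled into DRAM.
# File layout is hypothetical: contiguous fp16 layers of equal size.
import numpy as np

def load_layer(path: str, layer: int, rows: int, cols: int) -> np.ndarray:
    """Map one layer's weight matrix from storage without reading the whole file."""
    weights = np.memmap(path, dtype=np.float16, mode="r",
                        offset=layer * rows * cols * 2,  # 2 bytes per fp16 element
                        shape=(rows, cols))
    # Pages are faulted in from storage lazily, only as elements are touched.
    return weights
```

Note this is read-only (`mode="r"`), which is why, as discussed below, inference this way is mostly reads and shouldn't meaningfully wear the NAND.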
 

gdansk

Diamond Member
Feb 8, 2011
3,109
4,825
136
I dread the idea of using NAND flash as new-age swap in a device with serialized, non-replaceable storage. Do that in a server where you can hot-swap storage as it fails but want to train a model that won't fit in your GPU's limited memory space.

Plus the paper was with a 24GB Nvidia GPU anyway? So hopefully not related to iPhones.
 
Mar 11, 2004
23,320
5,756
146
I dread the idea of using NAND flash as new age swap in a device with serialized, non-replaceable storage. Do that in a server where you can hot swap storage as it fails but want to train a model that won't fit in your GPUs limited memory space.

Plus the paper was with a 24GB Nvidia GPU anyway? So hopefully not related to iPhones.

I don't think they're rewriting large swaths of the data that often, so it shouldn't really wear out the NAND; it just needs a substantial amount of data to inform its decisions, so it's basically reading from it. Some data gets adjusted, I'm sure (and at the enterprise level they probably write huge chunks, but they're also pulling from TB+ worth of data, so the amount being written is likely much smaller), but I wouldn't think it'd be near-constant writing or anything.

Apple will for sure need to address their small NAND capacities if that's their plan, though. My guess is they'll add some that won't be available to the end user, but I could see them touting that they upped storage or something. Then again, I wish they'd just add SSD slots so the end user could treat the installed amount as system/app storage and a separate SSD as data storage (which would also make it easier to carry over between systems). That's probably moot with TB5/USB 4 Gen 2 enabling very fast external drives, although it would be nice if they offered that support on, say, the Pro models of the iPhone (instead of the merely OK USB 3 Gen 2, and slower still on non-Pro iPhones), so that you could back up from your phone easily. I have seen one pro photographer say the iPhone 15 Pro lets them write to an external drive (not sure if it was an SSD or more like a thumb drive); they use the iPhone as a supplemental camera (they shoot weddings and the like, setting the iPhone up to record from certain spots while they move around shooting with their pro camera). They acted like Android phones don't offer the same speeds (not sure; or if they do, maybe the argument is the cameras aren't as good).
 

gdansk

Diamond Member
Feb 8, 2011
3,109
4,825
136
I don't think they're rewriting large swaths of the data that often so it shouldn't really wear out the NAND, it just needs a substantial amount of data to inform its decision, so its basically reading from it.
How do you read data from disk without it being written? And these models aren't small even when compressed and used only as needed as described in the paper. It really seems like a poor fit unless you have a more reliable form of storage for consumer devices.
 

Doug S

Platinum Member
Feb 8, 2020
2,836
4,820
136
I dread the idea of using NAND flash as new age swap in a device with serialized, non-replaceable storage. Do that in a server where you can hot swap storage as it fails but want to train a model that won't fit in your GPUs limited memory space.

Plus the paper was with a 24GB Nvidia GPU anyway? So hopefully not related to iPhones.

Apple's patents in that area did not involve using NAND flash as swap. That's so obvious that even in a world where far too many "obvious" patents are granted that it would never be granted.
 
Reactions: lightmanek