Discussion Apple Silicon SoC thread


Eug

Lifer
Mar 11, 2000
23,953
1,567
126
M1
5 nm
Unified memory architecture - LPDDR4X
16 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 12 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache
(Apple claims the 4 high-efficiency cores alone perform like a dual-core Intel MacBook Air)

8-core iGPU (but there is a 7-core variant, likely with one inactive core)
128 execution units
Up to 24576 concurrent threads
2.6 Teraflops (rough check below the spec list)
82 Gigatexels/s
41 gigapixels/s

16-core neural engine
Secure Enclave
USB 4
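
The 2.6-teraflop figure roughly checks out under the commonly assumed configuration - 1,024 FP32 ALUs (128 EUs × 8 ALUs each) at a ~1.28 GHz GPU clock, counting a fused multiply-add as 2 operations. Those ALU and clock numbers are community estimates, not Apple-published specs:

1,024 ALUs × 2 FLOPs per clock × ~1.28 GHz ≈ 2.6 TFLOPS FP32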

Products:
$999 ($899 edu) 13" MacBook Air (fanless) - 18 hour video playback battery life
$699 Mac mini (with fan)
$1299 ($1199 edu) 13" MacBook Pro (with fan) - 20 hour video playback battery life

Memory options: 8 GB and 16 GB. No 32 GB option (unless you go Intel).

It should be noted that the M1 chip in these three Macs is the same (aside from GPU core count). Basically, Apple is taking the same approach with these chips as it does with the iPhones and iPads: just one SKU (excluding the X variants), which is the same across all iDevices (aside from occasional slight clock-speed differences).

EDIT:



M1 Pro 8-core CPU (6+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 16-core GPU
M1 Max 10-core CPU (8+2), 24-core GPU
M1 Max 10-core CPU (8+2), 32-core GPU

M1 Pro and M1 Max discussion here:


M1 Ultra discussion here:


M2 discussion here:


M2
Second-generation 5 nm
Unified memory architecture - LPDDR5, up to 24 GB and 100 GB/s (bandwidth math below the spec list)
20 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 16 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache

10-core iGPU (but there is an 8-core variant)
3.6 Teraflops

16-core neural engine
Secure Enclave
USB 4

Hardware acceleration for 8K H.264, HEVC, and ProRes
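
The round "100 GB/s" figure lines up with the widely reported memory configuration - a 128-bit LPDDR5-6400 interface (reported, not taken from an Apple spec sheet):

128 bits ÷ 8 bits/byte × 6,400 MT/s = 102.4 GB/s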

M3 Family discussion here:


M4 Family discussion here:

 
Last edited:

The Hardcard

Senior member
Oct 19, 2021
300
386
106
By the time Apple could get into the AI market en masse, the gold rush will be over. Few people seriously believe that the market for AI picks and shovels that Nvidia is selling is going to keep growing without end. The LLM strategy for achieving "intelligence" is already showing its limits with the problems OpenAI is having with ChatGPT 5, and even entrants trying to be extra careful to limit their AIs to minimize issues are getting egg on their faces (see Siri's hallucinated "summary" of a BBC story that claimed Mangione had shot himself).

Once demand stops growing so supply can catch up, AI computation will become a commodity, with players in that market unable to extract the massive margins Nvidia is currently enjoying. It makes perfect sense for Apple to produce its own capability internally, since they already have most of the pieces and can tailor it exactly to their needs. Selling to third parties is a totally different animal, and if Apple tries it will just be a lot of wasted money and effort. They can afford to waste an awful lot of money without doing any real harm to the overall business, but that effort would be better redirected at their forte, which is the premium mass consumer market.
I don’t know about problems with ChatGPT 5, but OpenAI’s o1 and o3 models are huge breakthroughs in AI reasoning capabilities. What LLMs are capable of has just increased possibly by orders of magnitude. Reinforcement learning, chain of thought - RL CoT LLMs will be the story of 2025.

For everyone anxiously waiting for the AI bubble to burst, your wait just got extended, maybe indefinitely. It will soon be time to accept that not only is AI not going away, but soon CPUs and GPUs are going to become second class silicon.

Just going off of new rumors and old rumors, all with tiny pieces of information, this seems to be a shift by Apple. At first, I thought Baltra was going to follow Hidra, but now I think it will come instead of Hidra.

Apple is almost certainly not selling servers to third parties. They are building a cloud of small batch inference servers, like Google’s TPU cloud which also consists of hardware not sold to others. No coincidence. The guy leading Private Cloud Compute is the same guy who headed design of Google’s TPU cloud infrastructure.
 
Last edited:
Jul 27, 2020
22,298
15,555
146
What LLMs are capable of has just increased possibly by orders of magnitude. Reinforcement learning, chain of thought - RL CoT LLMs will be the story of 2025.
That's not really a good thing because we all know how good humans are at doing damage with a toy. Any toy. It will be abused far more than put to good use to make us more "civilized".
 

The Hardcard

Senior member
Oct 19, 2021
300
386
106
That's not really a good thing because we all know how good humans are at doing damage with a toy. Any toy. It will be abused far more than put to good use to make us more "civilized".

I don’t know why people think AI in general or OpenAI in particular is stalled on AI limitations. This just happened:

Breakthrough

It is a good thing. It will give good actors comparable power to bad actors.
 

johnsonwax

Member
Jun 27, 2024
118
195
76
I don’t know about problems with ChatGPT 5, but OpenAI’s o1 and o3 models are huge breakthroughs in AI reasoning capabilities. What LLMs are capable of has just increased possibly by orders of magnitude. Reinforcement learning, chain of thought - RL CoT LLMs will be the story of 2025.

For everyone anxiously waiting for the AI bubble to burst, your wait just got extended, maybe indefinitely. It will soon be time to accept that not only is AI not going away, but soon CPUs and GPUs are going to become second class silicon.

Just going off of new rumors and old rumors, all with tiny pieces of information, this seems to be a shift by Apple. At first, I thought Baltra was going to follow Hidra, but now I think it will come instead of Hidra.

Apple is almost certainly not selling servers to third parties. They are building a cloud of small batch inference servers, like Google’s TPU cloud which also consists of hardware not sold to others. No coincidence. The guy leading Private Cloud Compute is the same guy who headed design of Google’s TPU cloud infrastructure.
You're overindexing on tech and not on markets. o3 may be a tech advancement, but it's also $20 per query due to the execution/training costs. Sure, that'll drop, but if you can't articulate a business model for these that looks realistic, then it's a bubble. The housing bubble wasn't because homes weren't sufficiently advanced; it was because supply and demand had gotten f'd due to market manipulation by investors.

The operative questions here are:
1) Who is going to own the information/reasoning infrastructure that your business is reliant on?
2) Where is the data going to come from for the ongoing utility of that infrastructure?
3) If the business doesn't own it, how much rent seeking can take place on it?
4) Will businesses be allowed to own it?

Note, the market demand for PhDs is a lot more limited than you might realize, and there are a lot of disciplines where the marginal value of a PhD is pretty close to zero. The places where it has value are where rents can be extracted through IP ownership (go ask an engineering PhD adjunct professor what they earn). Who will own the IP from an AI development? The owner of the infrastructure or the renter of that infrastructure? The answer to that will determine both whether that infrastructure even bothers to be rented, and how broad the winners of the AI space will be.

Appreciate that once we hit AGI, the owner of that IP will effectively own everything as all subsequent IP will be fairly quickly consumed by that IP owner. What does that economy even look like, and how much does increasingly effective AI approach that state before AGI? And this is why the IP on which these systems are trained will become an issue - because enforcing that IP is really the only brake against that outcome.

The technology is the easy part.
 

poke01

Diamond Member
Mar 8, 2022
3,036
4,010
106
Apple does not OWN CUPS.

"CUPS is licensed under the Apache License Version 2.0."

Copyright © 2007-2022 Apple Inc. CUPS 2.2 and earlier are provided under the terms of the GNU GPL2 and LGPL2 with exceptions while CUPS 2.3 and later are provided under the terms of the Apache License, Version 2.0. CUPS, the CUPS logo, and macOS are trademarks of Apple Inc. All other trademarks are the property of their respective owners. Apple Privacy Policy
Apple owns the CUPS trademarks.
 
Reactions: oak8292

okoroezenwa

Member
Dec 22, 2020
120
125
116
Eh, I doubt that.
 

Doug S

Diamond Member
Feb 8, 2020
3,005
5,167
136
Eh, I doubt that.

I also doubt it. Just because the CPU and GPU are separate chips doesn't mean they can't still use unified memory. They managed it with the Ultra, and it sounds like this new TSMC tech will let them accomplish something similar to that less expensively.

I suspect there is too much reliance on the unified memory within iOS/macOS especially where the GPU drivers/Metal are concerned for it to even be an option for Apple to ditch it.
 

The Hardcard

Senior member
Oct 19, 2021
300
386
106
This guy is one of many who don’t understand what unified memory is. Unified memory is the CPU and GPU memory management units being coherent. It is neither a description of nor related to physical proximity.
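
For anyone wondering what that coherence looks like from the software side, here is a minimal Metal sketch (assumptions: macOS on Apple Silicon; the kernel name and sizes are made up for illustration). One .storageModeShared allocation is written by the CPU and then modified by a GPU kernel, with no staging buffer or blit in either direction:

[CODE]
import Metal

// One allocation, visible to both the CPU and the GPU - no copy to "VRAM".
let device = MTLCreateSystemDefaultDevice()!
let count = 1024
let buffer = device.makeBuffer(length: count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// CPU writes directly into the buffer's memory...
let ptr = buffer.contents().bindMemory(to: Float.self, capacity: count)
for i in 0..<count { ptr[i] = Float(i) }

// ...and a GPU kernel operates on the very same pages (illustrative kernel).
let source = """
#include <metal_stdlib>
using namespace metal;
kernel void doubleIt(device float *data [[buffer(0)]],
                     uint id [[thread_position_in_grid]]) {
    data[id] *= 2.0f;
}
"""
let library = try! device.makeLibrary(source: source, options: nil)
let pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "doubleIt")!)

let queue = device.makeCommandQueue()!
let cmd = queue.makeCommandBuffer()!
let enc = cmd.makeComputeCommandEncoder()!
enc.setComputePipelineState(pipeline)
enc.setBuffer(buffer, offset: 0, index: 0)
enc.dispatchThreads(MTLSize(width: count, height: 1, depth: 1),
                    threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
enc.endEncoding()
cmd.commit()
cmd.waitUntilCompleted()

// The CPU sees the GPU's result through the same pointer - no readback copy.
print(ptr[0], ptr[1], ptr[2])   // 0.0 2.0 4.0
[/CODE]

On a discrete-GPU design that round trip needs explicit staging and copies; the coherence of the memory management units is what removes them, regardless of where the DRAM physically sits.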
 

johnsonwax

Member
Jun 27, 2024
118
195
76
I also doubt it. Just because the CPU and GPU are separate chips doesn't mean they can't still use unified memory. They managed it with the Ultra, and it sounds like this new TSMC tech will let them accomplish something similar to that less expensively.

I suspect there is too much reliance on the unified memory within iOS/macOS especially where the GPU drivers/Metal are concerned for it to even be an option for Apple to ditch it.
Yeah, they aren't dropping it. Apple committed to it when they saw the roadmap would eventually allow for memory to be off package, and simply dealt with the limitations until that came about. Apple has a long horizon and they're counting on these architectural advantages paying off over time.
 

Eug

Lifer
Mar 11, 2000
23,953
1,567
126
Eh, I doubt that.
This guy is one of many who don’t understand what unified memory is. Unified memory is the CPU and GPU memory management units being coherent. It is neither a description of nor related to physical proximity.
Ming-Chi Kuo did NOT say Apple would drop unified memory.

That was added by Notebookcheck but does not reflect the original posts by MCK.


 

Meteor Late

Senior member
Dec 15, 2023
266
291
96
I'm not well versed on this, but couldn't Apple sell you an M5 Pro with 12+4 CPU cores but the weakest iGPU and a 128-bit bus, thanks to this new advanced packaging?
 

Doug S

Diamond Member
Feb 8, 2020
3,005
5,167
136
I'm not well versed on this, but couldn't Apple sell you an M5 Pro with 12+4 CPU cores but the weakest iGPU and a 128-bit bus, thanks to this new advanced packaging?

They could, but they won't. Apple has always wanted to limit SKUs to make choice easier for consumers and inventory management easier for itself - remember, Tim Cook was Steve Jobs' personal choice to take over Apple's operations when he returned as CEO. What you describe has a market, but not enough of a market that Apple is likely to offer it in a product.

Where it could make sense is on the higher end - it's easier to build the fabled "Extreme" if you don't have to link four complete chips (and endure the wasted silicon for stuff you don't need four times as much of), and it may make things easier for Apple to build internal chips. They may not think there's a market for a CPU that has no GPU, but that's exactly what they want for cloud servers. I think the AI cloud/training stuff will be fully custom, but they could use this tech to link a bunch of smaller chips together as one and not pay the yield penalties Nvidia does for its reticle-sized dies.
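
To put rough numbers on the yield point, here's a toy sketch using a plain Poisson defect model; the defect density and die areas are made-up round numbers for illustration, not TSMC or Nvidia figures:

[CODE]
import Foundation

// Toy Poisson yield model: fraction of defect-free dies = exp(-defectDensity * dieArea).
// All numbers below are assumptions for illustration, not real fab data.
let defectDensity = 0.1        // defects per cm^2 (assumed)
let monolithicArea = 8.0       // cm^2, one near-reticle-sized die (assumed)
let chipletArea = 2.0          // cm^2, one of four linked chiplets (assumed)

func goodDieFraction(_ area: Double) -> Double {
    exp(-defectDensity * area)
}

let monolithicYield = goodDieFraction(monolithicArea)  // ~45% of the big dies are clean
let chipletYield = goodDieFraction(chipletArea)        // ~82% of the small dies are clean

print(String(format: "monolithic: %.0f%%  chiplet: %.0f%%",
             monolithicYield * 100, chipletYield * 100))
[/CODE]

Same total silicon area, but with chiplets a defect scraps only the small die it lands on, and the known-good chiplets can still be combined into a full product.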
 

fkoehler

Senior member
Feb 29, 2008
214
175
116
"By the time Apple could get into the AI market en masse, the gold rush will be over."
I basically agree with the entire picks and shovels gold rush metaphor.

Assuming that works out, as it likely will, getting Apple into the server business seems a natural possibility to keep the mounds of cash flowing in.
It's worth a small bet considering all the effort they're putting into their own AI/cloud.

It's doubtful phones are going to maintain previous rates of sales no matter what, and the slowdown seems to be happening already. Computer sales are nowhere near enough to satisfy that revenue need.

I can't remember where I read it recently, but there was an excellent piece on how much money has been blown so far on AI hype, and how no one yet knows how they are ever going to get ROI outside of some fantasy next big thing.
Currently there are a few big players like MS keeping the big AI names funded on the hope that something will come of it and they're not Gelsingered.
AI seems to be just another hype train that people desperately want to believe can happen if they just keep throwing money and resources at it.

These weekly/monthly 'incredible advances' mostly serve to keep the investors hooked and convinced not to move towards the exits.
 

joshua95

Junior Member
Jan 6, 2025
3
0
6
Happy new year all. Long time lurker since 2008...

I'm toying with the idea of getting a MacBook Air to replace my gaming PC (changing habits, and all).

That said, I'm conscious that the current version is the M3. How much of a step up is the M4 in ways that would improve the Air compared to the M3? I'm happy to wait if the M4 will benefit the Air (it seems to be more power efficient?).
 

johnsonwax

Member
Jun 27, 2024
118
195
76
"By the time Apple could get into the AI market en masse, the gold rush will be over."
I basically agree with the entire picks and shovels gold rush metaphor.

Assuming that works out, as it likely will, getting Apple into the server business seems a natural possibility to keep the mounds of cash flowing in.
It's worth a small bet considering all the effort they're putting into their own AI/cloud.
Apple almost always wins the gold rush, even though they're always tagged as being late to it.

The reason for this is that in most cases, the gold rush is not a rush of new dollars, rather a shifting of them within a market. It's a low-end or high-end disruption, not a new market. Right now I think there's really only one clear new market for AI tools and that's in expert systems where it's substituting or augmenting domain expertise. Almost everywhere else it's merely value-add. Consumers aren't going to pay for AI based writing tools - they'll simply come to expect them as part of their word processor, with at most a small uptick in cost. They aren't going to pay for object removal in their photo app, that'll just be an expected feature. Within these markets there can be massive movements of money from the incumbent that doesn't have these features to the competitor that does, so there's money to be made that way, but there's unlikely to be an addition of money, or a shift of money from out of tech into tech at the consumer level. I don't see a value proposition for consumers, but I do see a threat that if AI becomes a strong feature of Android that iOS could lose share, or the reverse. But I don't see an increase in dollars flowing into smartphones as a result.

Microsoft sees potential in enterprise productivity tools - after all, that's where their money originally came from - eliminating millions of clerks and secretaries and having their payroll turned into Exchange and O365 licenses, and there's a possibility that can continue here, but so far there doesn't seem to be much value in general knowledge systems - they are too unreliable. Apple has no real market here, so all of its AI work is purely defensive, to ensure they are adding value against competitors. There isn't much opportunity to monetize outside of a handful of markets around products like FCP and Logic, and even there I think the opportunities are very small. The only potential for Apple is if consumers see Android/Windows cloud based AI as being too insecure and migrate based on that, and I think that has limited opportunity.

I can see a case where Nvidia certainly comes out ahead based on selling silicon to the operators of those expert systems. I can see a bunch of intermediate players doing well designing those systems on behalf of clients, or selling specialized tools for animators, etc., that are priced in the six figures, which will represent a gold rush for them. But I don't yet see a business model for the public around ChatGPT that makes any sense. That just seems to eventually become an on-device value-add or a disruption to Google's ad revenue stream, which is at most just a moving of existing dollars.
 

poke01

Diamond Member
Mar 8, 2022
3,036
4,010
106
Happy new year all. Long time lurker since 2008...

I'm toying with the idea of getting a MacBook Air to replace my gaming PC (changing habits, and all).

That said, I'm conscious that the current version is the M3. How much of a step up is the M4 in ways that would improve the Air compared to the M3? I'm happy to wait if the M4 will benefit the Air (it seems to be more power efficient?).
I would wait for the M4.
 
Reactions: jdubs03

Meteor Late

Senior member
Dec 15, 2023
266
291
96
Yeah, the M4 introduces two extra E cores that weren't there before, and it's the biggest uplift since the M1 in both ST and MT. There is a very high chance it's the best upgrade in a long time: all base models upgraded to 16 GB, etc.
The only other big upgrade, when it happens, will not be from the chip but in the display, when OLED comes to the Air.
 

joshua95

Junior Member
Jan 6, 2025
3
0
6
Out of interest, is anyone feeling any limitations from 16 GB RAM in any of their M-series devices? The Apple tax on upgrading, combined with non-user-replaceable hardware, makes this a lot more restrictive than my desktop.

You'll ask what I would be doing:
  • Light gaming, such as Civ 5/6
  • Productivity apps such as Excel, SQL tools, and light Python
 