Discussion Apple Silicon SoC thread


Eug

Lifer
Mar 11, 2000
23,775
1,349
126
M1
5 nm
Unified memory architecture - LPDDR4X
16 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 12 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache
(Apple claims the 4 high-efficiency cores alone perform like a dual-core Intel MacBook Air)

8-core iGPU (but there is a 7-core variant, likely with one inactive core)
128 execution units
Up to 24576 concurrent threads
2.6 Teraflops
82 gigatexels/s
41 gigapixels/s

16-core neural engine
Secure Enclave
USB 4

Products:
$999 ($899 edu) 13" MacBook Air (fanless) - 18 hour video playback battery life
$699 Mac mini (with fan)
$1299 ($1199 edu) 13" MacBook Pro (with fan) - 20 hour video playback battery life

Memory options 8 GB and 16 GB. No 32 GB option (unless you go Intel).

It should be noted that the M1 chip in these three Macs is the same (aside from GPU core count). Basically, Apple is taking the same approach with these chips as it does with the iPhones and iPads: just one SKU (excluding the X variants), which is the same across all iDevices (aside from maybe slight clock speed differences occasionally).

EDIT:



M1 Pro 8-core CPU (6+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 16-core GPU
M1 Max 10-core CPU (8+2), 24-core GPU
M1 Max 10-core CPU (8+2), 32-core GPU

M1 Pro and M1 Max discussion here:


M1 Ultra discussion here:


M2 discussion here:


Second Generation 5 nm
Unified memory architecture - LPDDR5, up to 24 GB and 100 GB/s
20 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 16 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache

10-core iGPU (but there is an 8-core variant)
3.6 Teraflops

16-core neural engine
Secure Enclave
USB 4

Hardware acceleration for 8K h.264, HEVC (h.265), and ProRes

M3 Family discussion here:


M4 Family discussion here:

 

The Hardcard

Member
Oct 19, 2021
182
273
106
I disagree, the PC will still be here. You don’t stop making trucks just because cars are cheaper and more efficient.

The analogy is inaccurate, as it assumes that there are capacities and capabilities that the PC can provide over and above post-PC devices. Everything that the general population would want to do with PCs will be possible, and preferable, with post-PC devices. They will have all of the compute power and memory and storage capacity to do all the tasks and give all the results that people buy computers for.

Not that there won't be enthusiasts who continue to use them because they're into the technology, but PCs will not be trucks. They will be horses. Horses used to be the backbone of transportation, but now only people who enjoy being around horses ride them. PC enthusiasts will have the same relationship. I will probably be one who keeps one or more PCs around.

These are silly arguments. An iPad Pro can't be the death of the PC because it IS a PC! You might think it makes a pretty crappy PC, and I wouldn't argue with that, but being a crappy PC doesn't make it not a PC. And no matter how good it eventually becomes at being a PC, that won't mean it killed PCs - just that it became the preferred form factor, similar to how laptops have become the preferred form factor over desktops.

Personally I don't view using a tablet as being as convenient as a PC. Holding the device sucks versus having it sit on a surface. Typing on a touch screen sucks versus a good keyboard. Touching a touch screen sucks versus a mouse when you're doing it repeatedly. I mean, try doing heavy duty spreadsheet work on a tablet you're holding in your hands, I bet you'll throw it against the wall before an hour is up!

Now some might say "yes but you don't have to hold the tablet, you can set it on a surface with a stand so it is like a laptop." They'll point out you don't have to use the touchscreen to type, you can get a keyboard - either built into the stand like the iPad Pro's, or a separate bluetooth keyboard. Ditto for a mouse. But if you are equipping your iPad Pro with all the trappings of a PC, how can you argue you are not using a PC?

The only way the "death of the PC" happens is if there is a different input paradigm. Something that doesn't use a "keyboard", whether a separate physical device or a virtual one on the display. Maybe the display is smart glasses/contacts, and you talk to it or "think to it" (i.e. where Apple probably sees Vision Pro at some future date). If we see that arrive in actual products that are affordable etc., and it is generally accepted to be a more convenient and efficient modality for the sort of stuff we've been doing on a PC, then yes, the PC will be dead. But using a tablet or a smartphone as your computer does not mean the PC is dead. They are the same modality of input, just with different tradeoffs - the phone is smaller and always with you, but less powerful and not something you'd want to write a term paper with. The tablet is kind of in between (in my mind combining the worst attributes of smartphones and PCs, but judging by Apple's sales figures obviously not everyone agrees with me).

An iPad Pro provides the same old "look at display, type on keyboard, point at stuff" technology we've been using since the first GUIs appeared. Unless you define a PC the way the Supreme Court once defined pornography - "I know it when I see it" - it is impossible to argue that an iPad Pro with a keyboard case and optional bluetooth mouse is not a PC.

iPadOS 18: yet another year of Apple putting a powerful, fully capable processor in a device whose operating system frustrates the people who want to use it as a PC, because it can't perform like the other PCs they have access to.

It is on purpose. Apple knows there are people who want full PC capability, and they are more than capable of providing it. They could have put a smooth, fully functional traditional PC environment on the iPad years ago, even before they split iPadOS off from iOS.

The reason is that while the iPad can do the "look at display, type on keyboard, point at stuff" that is the foundation of the PC paradigm, that is not its future, which is why the iPad will always be an inferior experience used that way. A maximal "PC experience" is not why the M4 is in the new iPad, nor any M chip for that matter.

Another key aspect of what a PC is, is how people get the results they want from it. In addition to "look at display, type on keyboard, point at stuff," there is the software on the PC. Currently you have to find software whose functions let you manipulate your data to get the results you want.

MS Office, Photoshop, Blender, Ableton, DaVinci Resolve, QuickBooks, Visual Studio, etc. - these are all examples of the PC paradigm. Developers define a set of tasks that will let their users get useful results, then code those functions in a systematic way. Then the user has to learn how to get to, and use, those functions to achieve the specific results they want with their own data. Most of those users do not want to learn the software; it is simply necessary for them to get the results they want.

While technically all computers owned by persons are personal computers, I am talking about the death of the Personal Computer: established by IBM and Microsoft in 1981, given graphical user interfaces by Apple and Microsoft in the mid-1980s, and, with many refinements, essentially what we are still using.

This is what is going to die for the general market, and the change will affect what hardware people use to get their results. The post-PC era, the shoots of which are just barely piercing up above the soil in 2024/2025, is the machine learning era.

Instead of providing functions that users have to learn to get their results, deep learning algorithms will be given the data that allows them to acquire the capability of providing the results the user wants, specific to that user's datasets.

From years ago to any given moment now, Apple could put on the iPad a most fantastic way of organizing, keeping track of, and finding your files. They could, in fact, have also made improvements to the filesystem management on Macs. The reason they haven’t is because they’re not interested in iPads replacing PCs by being as good or better than PCs at what those PCs do.

For nearly every user, files and filesystems are the best current means to achieve their goals, which is to get the results from their data that they want. The "What is a PC?" plan for iPads is just that the user gets results from their data upon request, with no thought of or relationship to a filesystem, if there even is one. Already, large language models can extract data that you want from a single giant multi-GB blob of model weights.

Machine learning researchers are now working hard to figure out how to extract all types of data from these blobs of model weights: text, images, and video (including particular subjects in the image and video data), plus numbers and patterns in those numbers of any type - financial, scientific, engineering, and more.

There is still a lot of work needed for it to be effective, accurate, and comprehensive. But once users are able to retrieve and interact with their data upon request, very few will care how smooth their device's filesystem is, or whether it even has one.

That is what I mean by the death of the PC: the system of the user organizing and tracking data via a set of files, then using applications that have sets of functions to be learned by the user, so that they can pull data from files and interact with it through those application functions.

And it absolutely will happen.
 

dr1337

Senior member
May 25, 2020
400
661
136
The “Death of PCs” is inevitable.
Nah, it will never happen. Just like how PCs haven't replaced television, and TV didn't replace radio. We live in a world where, generally, once a form factor becomes mainstream it stays around basically forever. New vinyl records are pressed here in 2024; people still make cassettes, CDs, etc.

Frankly, the only distinction between a mobile device and a "PC" is really nothing more than the ability to program that device, from the device itself.

Consider the iPad Pro with the M4: it's a mobile device with one of the fastest chips ever made in it, yet if you want to write code and use it as a development environment, you are SOL. Web environments do exist, yes, but even something simple like programming a kid's Arduino is not possible to do from an iPad. So either Apple makes devices like the iPad and iPhone into PCs that can run any code I want, or what's really gonna happen is that the PC will never die and mobile devices are always going to be their own thing.

Someone that only checks email could replace their PC with an iPad, but those individuals probably didn't need a personal computer in the first place.
 

The Hardcard

Member
Oct 19, 2021
182
273
106
To add to that long comment: Nvidia and Apple are the only major tech companies who have fully grasped this, for several years now, and have been working toward it. Nvidia has already separated itself from the pack in the datacenter, and Apple is going to open up a big gap in the personal device space going into 2026.

Nvidia built hardware seven years ago that its competitors are only catching on to now and haven't fully developed yet. More importantly, at the same time they began the new software paradigm, which requires a huge amount of fundamental work that is still going to take others years to recreate.

Apple’s gap will be defined in hardware by a fundamental change that everyone else is going to have to make: unified memory. It’s soon going to be painfully obvious that unified memory is Apple Silicon’s most important feature.
 

Doug S

Platinum Member
Feb 8, 2020
2,661
4,496
136
That is what I mean by the death of the PC: the system of the user organizing and tracking data via a set of files, then using applications that have sets of functions to be learned by the user, so that they can pull data from files and interact with it through those application functions.

OK so it sounds like instead of defining a PC as "I can run software I wrote" you're defining it as "organizes information via files". Still an "I know it when I see it" definition. A Windows PC would still be a PC if Microsoft replaced WinFS with a relational database.
 
Jul 27, 2020
19,223
13,185
146
A Windows PC would still be a PC if Microsoft replaced WinFS with a relational database.
You mean NTFS. WinFS was a database type of filesystem that Microsoft tried to implement in Windows Longhorn (the Vista beta), but it was too slow for its time because it predated the ubiquity of SSDs by quite a few years.

Anyway, I agree that PC is defined by its "openness" more than anything else. You can tweak it, tinker with it, do what your heart desires with it. Even with locked down office PCs from Dell/HP etc., there's usually a way to install any supported OS on them. And once you have root on a PC, you can even destroy it (by overclocking the bejeezus out of it). But most people just content themselves by running the CPU pretty close to its limits with some extra cooling. That's the essence of what a PC is. To do with as you please. Otherwise, it simply doesn't sell all that well. I know zero people among my hundreds of acquaintances with a Surface device. Even the Steam Deck is open because Valve understands what users want.
 

johnsonwax

Member
Jun 27, 2024
70
136
66
So what you're telling me is exactly what I figured people would say, and why I wrote the last paragraph. You personally believe an iPad's not a PC because in your mind that requires running whatever code you want to write (which technically it can; you can run whatever code you want on YOUR iPad or iPhone without Apple stopping you, the key is you can only run it on YOUR iPad or iPhone, and there are a couple of gotchas in that you need a Mac to transfer what you've written to your device and pay the $99 for a developer license for code signing).
No, I can't script the OS using Python, even with a dev license. Among other things, there's limited POSIX support and a dev license doesn't enable you to address that.

So there is no 'in my mind' aspect to this - iPad BY DESIGN cannot do a range of things that the Mac is designed to do. This is not an oversight by Apple, this is deliberate for security reasons. And that's a decision I agree very strongly with. I'm not saying that the iPad is a bad PC, I'm saying that arbitrary code execution is a characteristic that many users legitimately require in the device they call a 'PC'. I could not have done my data science work on an iPad, even with a dev license.

So in the PC space nobody has ever asked users to choose the feature set of a PC in this way, so we have no language to differentiate between them. But Apple is asking users to do that, and the classification between Mac and iPad is routinely made along the wrong lines - around access to a mouse, or touch, or what silicon they run on. But those are all negotiable by Apple. They can change all of that stuff. The thing they can't change is the arbitrary code execution - that's the whole security argument for iOS, set on day one. That's inviolable. And that becomes, in the end, the point of decision. If you don't need to write and execute code, an iPad is potentially functionally equivalent to a Mac and can be viewed as a substitute. If you do need to write and execute code, it can never be.

If you understand this, you can have reasonable arguments about the potential reach of the iPad without digging yourself into a hole. This is not an 'I know it when I see it' kind of thing. This is a very clear expression of functionality, one that the iPad, again BY DESIGN, cannot ever meet. My argument is that with this clear delineation of functionality, we should have a delineation of terminology. I similarly argue that the Mac and Windows PCs, and things like Raspberry Pis, are 'general computing' devices because they can bootstrap themselves. You can effectively throw all the software they contain in a hole and rewrite it from scratch to do whatever you want them to do. And that property makes them all of a sort - they should all be able to be labeled similarly. That's impossible to do on an iPad, iPhone, Playstation, and so on. I would argue that since the general computing category consists of effectively permissionless platforms, the two groups should be evaluated differently in terms of things like platform lock-in (see the EU requiring Apple to blow a hole in their security stack by allowing 3rd party JITs), because the contract with the user is very different for platforms where the user does not have unlimited access to the hardware. To Apple's credit, they have expanded the iPad VERY close to the Mac in terms of utility, but it cannot ever cross that barrier, because that barrier is a fundamental principle of computer science. It's not a 'feels like' thing, it's 'can I change the permission of this page of memory from writable to executable' or not. And on those 'general compute' platforms that I list, you can - easily - and on the iPad you cannot - like, at all.
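
To make that concrete, here is a minimal sketch of the page-permission test described above (Swift over the Darwin mmap/mprotect calls; exact behavior depends on OS version, entitlements, and hardened-runtime flags, so treat it as an illustration rather than a definitive probe):

```swift
import Darwin

// Illustrative W^X probe: map an anonymous writable page, then try to flip it
// to executable. On a plain macOS process the mprotect typically succeeds; on
// iOS/iPadOS it is denied because apps lack the dynamic code-signing
// entitlement. Details vary by OS version, entitlements, and hardened runtime.
let pageSize = sysconf(_SC_PAGESIZE)
let page = mmap(nil, pageSize, PROT_READ | PROT_WRITE,
                MAP_ANON | MAP_PRIVATE, -1, 0)
precondition(page != MAP_FAILED, "mmap failed")

// Fill the page while it is writable (a real JIT would emit machine code here).
memset(page, 0, pageSize)

// The decisive step: writable -> executable.
if mprotect(page, pageSize, PROT_READ | PROT_EXEC) == 0 {
    print("page flipped to executable: this process can run code it just wrote")
} else {
    print("mprotect denied (errno \(errno)): no arbitrary code execution here")
}
munmap(page, pageSize)
```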

My wife may not understand the difference between those two things. But anyone with an understanding of computer science can make definitive statements about what that security feature will limit you from doing and it segregates these into very clear categories. There's a reason why an M2 Mac can run iPad apps but an M2 iPad cannot run Mac apps. The latter isn't possible.

So is an iPad a PC? My wife would say yes, and a software developer will say no, because it's impossible to do their job on one. I don't think it benefits the argument about whether the iPad is a PC or not to simply pretend that security feature doesn't exist and has no impact on the utility of the device. Now, could we define PCs as devices that cannot execute code arbitrarily? Sure. But I don't know anyone who does. Could we call those 'general computing' platforms 'Pro PCs' or 'development PCs'? Sure. But they need to be delineated in some way, because the iPad doesn't fail to execute arbitrary code because nobody has gotten around to it; it fails to do so because it is designed not to - and the community of people who write the occasional VBA script, or execute something on the command line, isn't that small. Given that the PC market originated with users who were required to write their own code (I'm of that era), it seems a bit unreasonable to just write them out of the market completely. So I'm not arguing that the iPad isn't a PC, or a suitable PC substitute for the vast majority of users. I think it is. But I think stopping there and not recognizing where it isn't makes that argument weaker. If you cannot acknowledge what it cannot, by design, do, it weakens your argument.
 

Eug

Lifer
Mar 11, 2000
23,775
1,349
126
Whether or not an iPad is a PC is just a discussion of semantics. FWIW, some market data analysts have included tablets in their PC sales data summaries in the past, even though these days most separate them out. IOW, it really just depends on how you want to define the term PC. Hell, in the past, some people even refused to include Macs in the term "PC" and Apple even made a whole set of commercials referencing that, including one of my favourite commercials of all time.


So, I think there's an important distinction here that it benefits people to focus on. The sole difference between a Mac and an iPad, fundamentally, is that the former is open by design and can run any code you put on it, and the latter is closed by design and will not run arbitrary code. In a lot of cases that's not a big deal - running Office 365 on a Mac and on an iPad can be functionally equivalent experiences for the majority of users.
Funny you should mention that. Earlier I mentioned my issue trying to run PowerPoint 2016 via Rosetta 2 on my M1 Mac mini.


I instead decided to try to give that 2.5 hour PowerPoint presentation (about 90 slides and 90 MB) from my iPad Pro M4 over Zoom, since I have Office 365 installed on that. The way I usually present is to pipe the presentation to an external screen (e.g. a projector on-site or a secondary screen shared by Zoom), while showing my notes and next slide on my local screen (not visible to the audience).

This simply did not work on iPadOS 17. For Zoom, I couldn't share the external screen to the Zoom audience; it would only allow me to share the primary screen. Furthermore, with PowerPoint I could not put a trackpad cursor on the external screen either, so I couldn't point things out to the audience. The pointer only worked on the primary screen. The only way to make this work for the audience is to use the simple presentation mode, where there are no presenter notes or next slide available.

So, instead, to make all this work I pulled out my old 2017 MacBook Core m3, and presented from Office on native Intel.

So, while a tablet may still be considered a PC by many, especially in my case where I have a Magic Keyboard with trackpad, iPadOS can still be a major PITA even when appropriate and so-called mature software is available.
 

The Hardcard

Member
Oct 19, 2021
182
273
106
OK so it sounds like instead of defining a PC as "I can run software I wrote" you're defining it as "organizes information via files". Still an "I know it when I see it" definition. A Windows PC would still be a PC if Microsoft replaced WinFS with a relational database.
You left out the key aspect of the definition: the filesystem is a secondary concept.

Generative models can operate on data via deep learning neural networks themselves, as opposed to coders defining software functions and procedures for which the user then has to learn how to set parameters and apply them to their particular data and desires.

Would a Windows PC still be a PC if you didn't have compiled applications or lightweight apps?

I say compute devices built around neural networks are post-PC, and that after several more breakthroughs along the lines of transformers, neural network models will be able to perform every task people buy PCs for. I don't see any common general-market PC application that neural networks won't be able to do inside of 15 years.

They will be preferred because users won't need to learn how to use them. In fact, they will often be able to give people the results they want even when they aren't themselves clear on what they want.

I used the filesystem as an example of Apple looking past the PC concept. It is one of the most infamous pain points of using an iPad as a PC. Another example is the multitasking and windowing system that elicits complaints from everyone using it as a traditional PC.

Apple hears these complaints and chooses not to satisfy them. Is it because they can't? Is it because they enjoy taking people's money and then being mean to them? Or are they purposely looking in a new direction and not worried about perfecting the past? Some other reason?
 

johnsonwax

Member
Jun 27, 2024
70
136
66
The only reason iPad doesn't have an OSX mode is the Google money.
This is 100% wrong. This was set up years before the Google money even existed. It was set up when Jobs said he didn't want 3rd party apps on the iPhone and the iPhone team worked out a set of security limitations that would protect the device, which are the defining characteristics of iOS/iPadOS. Everything derives from that.
 

johnsonwax

Member
Jun 27, 2024
70
136
66
He came pretty close: https://worthdoingbadly.com/macappsios/



I love people who are too honest for their own good
And I don't see any reason why Apple wouldn't be willing to take the iPad hardware and put MacOS on it and call it a Mac Tablet. They'd have to address the touch interface on MacOS, I would think, and that's not a trivial thing.

But I think that sort of raises the question of why not just be content with an MBA, given that both require KB+trackpad and the touch screen doesn't do anything anyway. There was an 11" Air once upon a time that maybe should make a comeback if people just want a smaller form factor. But apart from that, I've never gotten an adequate explanation of why the iPad Pro form factor is better than the Air without a lot of handwaving about a Mac touch interface magically appearing.

But fantasies of a MacOS mode on an iPadOS device are never happening, nor is getting a prompt on iPad, etc.
 

The Hardcard

Member
Oct 19, 2021
182
273
106
He came pretty close: https://worthdoingbadly.com/macappsios/



I love people who are too honest for their own good
The ultimate honesty will be facing why iPads will neither get macOS nor have the desired features satisfactorily added to iPadOS. I don't rule out more tweaking of iPadOS as a traditional PC, but it's time to get really honest: Apple is in pursuit of a new direction, and iPads will never be a nice, pain-free desktop UI clone, by intent.
 

johnsonwax

Member
Jun 27, 2024
70
136
66
Whether or not an iPad is a PC is just a discussion of semantics. FWIW, some market data analysts have included tablets in their PC sales data summaries in the past, even though these days most separate them out. IOW, it really just depends on how you want to define the term PC. Hell, in the past, some people even refused to include Macs in the term "PC" and Apple even made a whole set of commercials referencing that, including one of my favourite commercials of all time.



Funny you should mention that. Earlier I mentioned my issue trying to run PowerPoint 2016 via Rosetta 2 on my M1 Mac mini.


I instead decided to try to give that 2.5 hour PowerPoint presentation (about 90 slides and 90 MB) from my iPad Pro M4 over Zoom, since I have Office 365 installed on that. The way I usually present is to pipe the presentation to an external screen (e.g. a projector on-site or a secondary screen shared by Zoom), while showing my notes and next slide on my local screen (not visible to the audience).

This simply did not work on iPadOS 17. For Zoom, I couldn't share the external screen to the Zoom audience; it would only allow me to share the primary screen. Furthermore, with PowerPoint I could not put a trackpad cursor on the external screen either, so I couldn't point things out to the audience. The pointer only worked on the primary screen. The only way to make this work for the audience is to use the simple presentation mode, where there are no presenter notes or next slide available.

So, instead, to make all this work I pulled out my old 2017 MacBook Core m3, and presented from Office on native Intel.

So, while a tablet may still be considered a PC by many, especially in my case where I have a Magic Keyboard with trackpad, iPadOS can still be a major PITA even when appropriate and so-called mature software is available.
I would argue that all of those are pretty easily solvable problems for the developer. They aren't inherent problems with the platform. My experience was always that Zoom on the Mac was a garbage [learning to hold my language] app but was bulletproof on iPad - not because one OS was better than the other, but because having a garbage iOS Zoom app would kill Zoom as a platform, while it could live with a garbage MacOS app, so they did a terrible job porting it.

But multiscreen touch apps are a lot harder to develop than multiscreen kb+mouse ones, simply due to the nature of the UX. I'm not surprised that the iPad offerings are limited in those ways.

I was a solid MacBook Pro + iPad Pro user at work, and they had VERY different roles. Things like Zoom were infinitely better on the iPad than the Mac. Because it's a platform with a rear camera, there were a lot of tools for snapping a photo of a printed table of data and having it very quickly converted to a spreadsheet, really accurately. I got rid of my scanner entirely in favor of the iPad rear camera. iOS/iPadOS apps, in my experience, get a lot more development attention and tend to be more reliable and faster. But muscle memory around KB+mouse, even for web-based tools like Google Sheets, was better on the Mac, despite a lot of trying to make it work on iPad. There are some very subtle differences between the platforms that add up to big differences in experience. I'm not convinced those are particularly easy to flatten - certainly not as easy as the people demanding hybrid devices seem to think.
 

johnsonwax

Member
Jun 27, 2024
70
136
66
The ultimate honesty will be facing why iPads will neither get macOS nor have the desired features satisfactorily added to iPadOS. I don't rule out more tweaking of iPadOS as a traditional PC, but it's time to get really honest: Apple is in pursuit of a new direction, and iPads will never be a nice, pain-free desktop UI clone, by intent.
I think that's been pretty obvious for a long time. And the fact that there are more iPad users than Mac users reinforces Apple's view here.
 

name99

Senior member
Sep 11, 2010
481
361
136
Apple’s hardware is not limited by NPU capacity. The unified memory design allows the GPU to provide compute capacity that exceeds all current NPUs. So far for Apple, the NPU is just a quick and low power way to do certain AI tasks, and is particularly beneficial for things like image processing from stacking a burst of camera shots into a more appealing photo.

However, for the generative AI features that companies are trying to get consumers interested in, Apple's GPU currently crushes its own NPU, and all other NPUs, in capability.

Apple was not caught off guard by the AI push. What causes Apple's challenges in engaging with AI features is its desire to provide features in a deterministic way. A big reason for a lot of the frustration about the Apple way - it often frustrates me as well - is their desire to control how consumers experience new features.
I agree with your general thrust. The one technical point I would disagree on is that I think Apple *has* been somewhat surprised by the speed at which Transformers/LLMs took off.

Remember the design cycle for hardware is 3..4 years if you're running fast and are able to build on the past. That's been fine for CNNs and vision, and the ANE has done a fine job of adding functionality to make vision ever more functional at ever lower power. But the ANE, designed for CNNs and vision, is not a great match for transformers and the math of LLMs. It's not awful, but much of what needs to be done is a hack, and much of the functionality present in the ANE can't really be exploited.

Presumably Apple HAS been considering optimal hardware since Transformers hit the bigtime. But it's not clear how this will play out.
- make such changes to ANE? Possible, but then you lose some of what makes ANE so lightweight for vision.
- keep pushing the GPU? That's the easy solution, and may actually be the best "business" solution, given the costs of designing new hardware from scratch, and the risk that Transformers may be overthrown next month by a rather different type of architecture. Downside is you pay a power cost on the GPU bcs of flexibility it provides that you don't require. Not as bad as the CPU, but there's a reason ANE exists...
- design new hardware? In my ANE volume 7 I mention a recent patent for what looks like a 128-wide vector DSP, by some of the team who designed the original ANE. Could this be for Transformer work? Flexible enough to have at least some protection against changes away from Transformers, but more specialized than a GPU? Maybe...
Or maybe it's unrelated to ANE and is in fact for Apple wireless (primarily cellular) work?

Maybe this will be more clear with the A18?
 

name99

Senior member
Sep 11, 2010
481
361
136
Nah it will never happen. Just like how PCs haven't replaced television, and TV didn't replace radio. We live in a world where, generally, once a form factor becomes mainstream it stays around basically forever. New vinyl records are pressed here in 2024, people still make cassettes, CDs, ect.

Frankly the only distinction from a mobile device and a "PC" really is nothing more than the ability to program that device, from the device itself.

Consider the iPad Pro with the M4, its a mobile device with one of the fastest chips ever made in it, yet if you want to write code and use it as a development environment, you are SOL. Web environments do exist yes but even something simple like programming a kids arduino is not possible to do from an iPad. So either Apple makes devices like the iPad and iPhone into PCs that can run any code I want, or what's really gonna happen is that the PC will never die and mobile devices are always going to be their own thing.

Someone that only checks email could replace their PC with an iPad, but those individuals probably didn't need a personal computer in the first place.
While I agree with your larger point (persistence of technology), the specific details of what you suggest might change.
For example suppose that lightweight hypervisors make it practical to launch untrusted/JIT'ed code in a virtual machine, so that there's no longer a security reason to limit such code from iOS... (And this may be a direction Apple finds itself forced down faster than it would like if they want to continue to sell in the EU...)

Essentially this is a bet that the attack surface of a hypervisor can be made small enough that it's feasible to believe hypervisor isolation is good enough. (Along with some other details like assuming OS read-only/execute pages can be shared across hypervisors, to prevent memory use explosion.)
Is this a realistic bet? I honestly don't know.
But it strikes me as the sort of unexpected attack from left field that Apple is very good at -- EU tries to create this tower of nonsense to supposedly punish them, and Apple fights back by not only giving the EU everything they claim they want, but also by doing so in such a technically sophisticated package that it becomes EVEN HARDER for any other company to compete!
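
The building blocks for that bet already exist on the Mac side. A rough sketch of the idea using Virtualization.framework (macOS-only today; the kernel path and sizing below are placeholders, and error handling is elided):

```swift
import Virtualization

// Sketch of hypervisor isolation: run untrusted/JIT'ed code inside a
// lightweight guest VM instead of the host process, so the host exposes
// only the narrow virtio device surface as its attack surface.
let config = VZVirtualMachineConfiguration()
config.cpuCount = 2
config.memorySize = 1 << 30 // 1 GiB

let boot = VZLinuxBootLoader(kernelURL: URL(fileURLWithPath: "/path/to/guest-kernel"))
boot.commandLine = "console=hvc0"
config.bootLoader = boot

try config.validate()
let vm = VZVirtualMachine(configuration: config) // confined to one dispatch queue
vm.start { result in
    // Whatever the guest runs now sits behind the hypervisor boundary.
    print("guest started:", result)
}
```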
 

The Hardcard

Member
Oct 19, 2021
182
273
106
I agree with your general thrust. The one technical point I would disagree on is that I think Apple *has* been somewhat surprised by the speed at which Transformers/LLMs took off.

Remember the design cycle for hardware is 3..4 years if you're running fast and are able to build on the past. That's been fine for CNNs and vision, and the ANE has done a fine job of adding functionality to make vision ever more functional at ever lower power. But the ANE, designed for CNNs and vision, is not a great match for transformers and the math of LLMs. It's not awful, but much of what needs to be done is a hack, and much of the functionality present in the ANE can't really be exploited.

Presumably Apple HAS been considering optimal hardware since Transformers hit the bigtime. But it's not clear how this will play out.
- make such changes to ANE? Possible, but then you lose some of what makes ANE so lightweight for vision.
- keep pushing the GPU? That's the easy solution, and may actually be the best "business" solution, given the costs of designing new hardware from scratch, and the risk that Transformers may be overthrown next month by a rather different type of architecture. Downside is you pay a power cost on the GPU bcs of flexibility it provides that you don't require. Not as bad as the CPU, but there's a reason ANE exists...
- design new hardware? In my ANE volume 7 I mention a recent patent for what looks like a 128-wide vector DSP, by some of the team who designed the original ANE. Could this be for Transformer work? Flexible enough to have at least some protection against changes away from Transformers, but more specialized than a GPU? Maybe...
Or maybe it's unrelated to ANE and is in fact for Apple wireless (primarily cellular) work?

Maybe this will be more clear with the A18?
I still have concerns about where the ANE is located on Apple Silicon dies. It appears to sit behind the CPU L2 cache, and I wonder how much chip bandwidth it can use. CPU clusters of the A and base M chips appear to be able to use the full SoC bandwidth, but in the Pro and larger M chips, CPU clusters don't seem to be able to use the full memory bandwidth.

It seems to me that optimizing the chips for maximum machine learning potential involves boosting matrix math compute with access to the full package memory bandwidth. Can that be done with the NPU?
 

jpiniero

Lifer
Oct 1, 2010
15,043
5,609
136
This is 100% wrong. This was set up years before the Google money even existed. It was set up when Jobs said he didn't want 3rd party apps on the iPhone and the iPhone team worked out a set of security limitations that would protect the device, which are the defining characteristics of iOS/iPadOS. Everything derives from that.

I'm talking about today and not when the iPad was first developed.
 

johnsonwax

Member
Jun 27, 2024
70
136
66
I'm talking about today and not when the iPad was first developed.
The iOS security layer that prohibits what you describe dates to when the iPhone was first developed. It's foundational to the difference between iOS and MacOS. It predates the Google deal by several years. Apple is not about to redesign two of their platforms, with 10x the user share of MacOS, to appease a handful of MacOS users.
 

Doug S

Platinum Member
Feb 8, 2020
2,661
4,496
136
I agree with your general thrust. The one technical point I would disagree on is that I think Apple *has* been somewhat surprised by the speed at which Transformers/LLMs took off.

Remember the design cycle for hardware is 3..4 years if you're running fast and are able to build on the past. That's been fine for CNNs and vision, and the ANE has done a fine job of adding functionality to make vision ever more functional at ever lower power. But the ANE, designed for CNNs and vision, is not a great match for transformers and the math of LLMs. It's not awful, but much of what needs to be done is a hack, and much of the functionality present in the ANE can't really be exploited.

Presumably Apple HAS been considering optimal hardware since Transformers hit the bigtime. But it's not clear how this will play out.
- make such changes to ANE? Possible, but then you lose some of what makes ANE so lightweight for vision.
- keep pushing the GPU? That's the easy solution, and may actually be the best "business" solution, given the costs of designing new hardware from scratch, and the risk that Transformers may be overthrown next month by a rather different type of architecture. Downside is you pay a power cost on the GPU bcs of flexibility it provides that you don't require. Not as bad as the CPU, but there's a reason ANE exists...
- design new hardware? In my ANE volume 7 I mention a recent patent for what looks like a 128-wide vector DSP, by some of the team who designed the original ANE. Could this be for Transformer work? Flexible enough to have at least some protection against changes away from Transformers, but more specialized than a GPU? Maybe...
Or maybe it's unrelated to ANE and is in fact for Apple wireless (primarily cellular) work?

Maybe this will be more clear with the A18?

I still maintain that since there's a ton of overlap between what GPUs, NPUs, and now AMX/SME do, building something that can fill all those roles makes a lot of sense. Now obviously there's a role for smaller-batch lower-latency results and larger-batch higher-throughput results, so you can't get rid of the SME unit, but theoretically you could supplement it for really big tasks. At any rate, something flexible that could be partitioned, so that when you're pushing the GPU with a CAD program or game and your AI demand is low, it is mostly doing GPU tasks and only a small number of its cores are doing AI - and the reverse is true when you're just sitting at the GUI running whatever task can use all the NPU power it can get.

Obviously there isn't perfect overlap between the implementations of the various functions, but there's enough that I think someone will eventually solve this. Apple is best positioned to do so since it controls the whole platform and doesn't have to support third-party GPUs, but Nvidia is inarguably closer to that goal as far as hardware design. The software piece is probably the harder nut to crack though.
 

Eug

Lifer
Mar 11, 2000
23,775
1,349
126
But I think that sort of raises the question of why not just be content with an MBA, given that both require KB+trackpad and the touch screen doesn't do anything anyway. There was an 11" Air once upon a time that maybe should make a comeback if people just want a smaller form factor. But apart from that, I've never gotten an adequate explanation of why the iPad Pro form factor is better than the Air without a lot of handwaving about a Mac touch interface magically appearing.
I really disliked the 11" MacBook Air, mainly (but not only) because of its horrible screen, which was sub-par even for its era. Its only real advantage was that it was cheap. If it makes a comeback, I hope it's nothing like that original Air, except for the price.

Actually, I've been saying for years that they could easily make, say, an 11.8" replacement for the 12" Retina MacBook with (binned) Apple Silicon, an improved keyboard, an improved trackpad, and two USB4 / Thunderbolt 4 ports, but they don't want to, because full-on laptops at that size just don't seem to sell well at the price points Apple prefers. Instead, they released an 11.1" iPad Pro with (binned) Apple Silicon, improved keyboard, improved trackpad, and 1 Thunderbolt 3 port plus 1 USB-C charging port (via the Magic Keyboard). With the iPad Pro 11", Apple basically solved all the form factor complaints I had with the 12" MacBook, but with the tradeoff of introducing two new issues: reduced key spacing on the iPad Pro keyboard, and increased weight.

Anyhoo, on another note... My workplace and my wife's workplace are both down today. As you might have guessed, both are Windows based. I guess I have some free time today.
 

name99

Senior member
Sep 11, 2010
481
361
136
I still have concerns about where the ANE is located on Apple Silicon dies. It appears to sit behind the CPU L2 cache, and I wonder how much chip bandwidth it can use. CPU clusters of the A and base M chips appear to be able to use the full SoC bandwidth, but in the Pro and larger M chips, CPU clusters don't seem to be able to use the full memory bandwidth.

It seems to me that optimizing the chips for maximum machine learning potential involves boosting matrix math compute with access to the full package memory bandwidth. Can that be done with the NPU?
You are speaking word salad.

The whole point of matrix multiply is that it's ridiculously NON-memory-bandwidth-intensive. It's a running joke in HPC that if you're boasting about your great matrix multiply performance, what you're actually saying is that your memory sucks.
There *are* neural ops that are bandwidth intensive but matrix multiply is not one of them. The concern with quantizing (and otherwise shrinking) large LLMs is primarily to reduce memory footprint; reduced execution bandwidth is just a nice side benefit that reduces power a little.
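
A back-of-envelope way to see this: an N×N matrix multiply does about 2N³ FLOPs while touching only about 3N² elements, so arithmetic intensity grows linearly with N (the sketch below assumes FP16 and each element moved once):

```swift
// Arithmetic intensity of an N x N matmul: ~2N^3 FLOPs over ~3N^2 elements
// moved once each at 2 bytes (FP16). Intensity grows with N, which is why
// large matmuls are compute-bound rather than bandwidth-bound.
for n in [256.0, 1024.0, 4096.0] {
    let flops = 2.0 * n * n * n
    let bytes = 3.0 * n * n * 2.0
    print("N=\(Int(n)): \(Int(flops / bytes)) FLOPs per byte")
}
```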

Second, I have no idea why you imagine the ANE is (sharing?) the CPU L2. Why would it? And what does this even mean in the context of, say, an M Pro with two P clusters?

The ANE has its own local storage (manually managed, so not a "cache" but plays the same sort of role as an L2 cache) along with DMA into that storage.
I don't know that anyone has tested the bandwidth into the ANE, but I expect it's "appropriate" given that Apple seems to get this correct for every other unit on the SoC.

The fact that the entire internet is telling you that the bandwidth of an NPU is what matters (when they aren't telling you that the TOPs of an NPU is what matters...) doesn't make it true.
I expect there are also papers by other people, but there are DEFINITELY papers by Apple that discuss this issue, comparing the performance [quality and speed] of multiple networks relevant to Apple's interests to both required bandwidth and required TOPs, and the conclusion is that there's very little correlation right now...
 