Discussion Apple Silicon SoC thread


Eug

Lifer
Mar 11, 2000
23,992
1,610
126
M1
5 nm
Unified memory architecture - LPDDR4X
16 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 12 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache
(Apple claims the 4 high-efficiency cores alone perform like a dual-core Intel MacBook Air)

8-core iGPU (but there is a 7-core variant, likely with one inactive core)
128 execution units
Up to 24,576 concurrent threads
2.6 Teraflops
82 Gigatexels/s
41 gigapixels/s

16-core neural engine
Secure Enclave
USB 4

Products:
$999 ($899 edu) 13" MacBook Air (fanless) - 18 hour video playback battery life
$699 Mac mini (with fan)
$1299 ($1199 edu) 13" MacBook Pro (with fan) - 20 hour video playback battery life

Memory options 8 GB and 16 GB. No 32 GB option (unless you go Intel).

It should be noted that the M1 chip in these three Macs is the same (aside from GPU core count). Basically, Apple is taking the same approach with these chips as they do with the iPhones and iPads: just one SKU (excluding the X variants), which is the same across all iDevices (aside from occasional slight clock speed differences).
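As a sanity check on the 2.6 TFLOPS GPU figure in the spec list above, the number follows directly from ALU count and clock. The per-core ALU count and ~1.28 GHz clock below are assumptions (widely reported for M1, but not stated in this post):

```python
# Back-of-envelope check of the 2.6 TFLOPS claim for the 8-core M1 GPU.
# Assumed (not from the spec list above): 128 ALUs per GPU core and a
# ~1.278 GHz clock, both widely reported for M1 but unconfirmed by Apple.
cores = 8
alus_per_core = 128            # assumed; 8 cores x 128 = 1024 ALUs total
clock_hz = 1.278e9             # assumed GPU clock
flops = cores * alus_per_core * 2 * clock_hz   # 2 FLOPs/cycle per ALU (FMA)
print(f"{flops / 1e12:.2f} TFLOPS")            # -> 2.62 TFLOPS
```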

EDIT:



M1 Pro 8-core CPU (6+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 16-core GPU
M1 Max 10-core CPU (8+2), 24-core GPU
M1 Max 10-core CPU (8+2), 32-core GPU

M1 Pro and M1 Max discussion here:


M1 Ultra discussion here:


M2 discussion here:


Second Generation 5 nm
Unified memory architecture - LPDDR5, up to 24 GB and 100 GB/s
20 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 16 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache

10-core iGPU (but there is an 8-core variant)
3.6 Teraflops

16-core neural engine
Secure Enclave
USB 4

Hardware acceleration for 8K h.264, h.265 (HEVC), and ProRes

M3 Family discussion here:


M4 Family discussion here:

 
Last edited:

johnsonwax

Member
Jun 27, 2024
130
219
76
This makes me wonder if the money in that market isn't in only two places: tight integration in existing software stacks (which would make the existing SW product more valuable) and efficient NPU designs (needed to run and update the models on the go). These two extremities would be the only ones to survive the current craze.
That's my read. That doesn't mean new software stacks won't spring up based on the enabling capabilities of AI - products it enables that traditional software couldn't provide. But the market for generative features will almost certainly be captured by the tools that producers have historically used. If you want to generate art, that'll probably happen in the tools you have historically used to make art - Photoshop, Illustrator, etc. - rather than in an entirely new product.

If AI were an IP moat (and the people throwing so much money into it believe it's the ultimate moat - that someone like OpenAI is going to unlock the universal algorithm that brings us Fully Automated Luxury Gay Space Communism and captures all surplus dollars), then you could argue that moat could be leveraged into capturing these value stacks. But there is no IP moat - most of the people working on these tools are in academia around open-source stacks, and those aren't far off the closed-source systems - certainly not so far that Adobe would be better off licensing that IP versus developing their own. And Apple's argument is that AI will have the most utility when it can be trained on your personal data, which raises the question of how you safely and securely do that in a market where shipping your data off-device is pretty universally seen as unsafe.

In the enterprise space there will be a product argument, and you will see some need for Nvidia's big products, but so far nobody has shown any real revenue growth off of that approach. It's early, but the billions of Nvidia hardware sales have to turn into 10s of billions of new revenue for that to be sustainable, and well, we're still waiting to see that. Apple doesn't need to achieve that - they can justify their investments as sustaining efforts to retain their customer base. After all, the customers are the ones paying for the hardware.
 

johnsonwax

Member
Jun 27, 2024
130
219
76
Gotta disagree. Apple's been caught off guard by Microsoft's AI push and higher AI TOPS requirement. They "might" rectify that with M4 Macbooks but if they use the exact same SoC as the iPad Pro, they will be limited to 38 TOPS while players in the x86 space will have anywhere from 50 to 55 TOPS (HP Omnibook Ultra). Developers may be more excited to experiment on x86 laptops than Apple ones. Apple is faced with a conundrum right now. Do they follow suit or do they pave their own way? The deals with OpenAI seem to suggest the latter as Apple clearly didn't anticipate Microsoft's strong AI push so they will try to give their users more AI TOPS through the OpenAI cloud without getting into the bind of being stuck with lower rated NPUs in their SoCs. It also suggests that they never gave much thought to genAI features before. I'm not saying that x86's higher local AI performance will revolutionize computing but it could certainly bring forth new possibilities. If I were a developer, I wouldn't touch Apple's overpriced hardware to experiment with local AI possibilities, especially since Apple does not show any interest in aggressively increasing their marketshare. Apple could still move developers in their direction through clever marketing and promises of reaching more users through their store and iPhones. It will be interesting watching this space on how things unfold.
Not sure how Apple, the company that has shipped more NPU compute than anyone in the world by a wide margin and has had it a standard feature in their mobile devices for 6 years now is the party caught off guard. Microsoft has published these NPU requirements and has yet to show what that NPU will be used for apart from Visual Studio.

Apple's deal with OpenAI is a hedge. It's odd to say Apple didn't give much thought to genAI features before, when Apple has been shipping genAI features for what, about 5 years now? I think you've bought in too hard to the 'Apple is behind' tech press bullshit, when what we have is a disagreement about what the market will want. Apple believes that the market will want narrow, contextual AI including generative AI which works in the camera app, in apps like GarageBand, etc., while the tech press believe that the market will want fully open-ended contextless generative AI like midjourney. The OpenAI deal is a hedge against that. Clearly there are users that want it, so Apple provides a hook for those users to use it, but Apple is not investing in that being a core market. I mean, OpenAI is the one paying Apple, after all.

I've never heard a developer complain that the iPhone's market share was insufficient. That's a new one.
 

johnsonwax

Member
Jun 27, 2024
130
219
76
Will the current Mx series be strong enough for all those AI features? I know Apple said they will support Apple Intelligence, but the question is whether they will be limited - for example, maybe they won't support all features, but only partially. What do those 38 TOPS in the M4 mean compared to 18 in the M3?
So, this depends on the models that Apple builds and deploys. The big AI players have demonstrated that you can always make the model bigger, and therefore there is an infinite demand for compute. But they have also demonstrated that there are diminishing returns on that compute. These players are all chasing some notion of AGI which is not Apple's goal. If you look at what they showed off, the generative AI features are very narrow and limited. Most of the focus is on boosting Siri, with some modest summary/rewrite text AI, some generative fill-style AI for photos, similar features for audio in GarageBand, and so on. There is no provision for a prompt to write an email for you. The generative image AI is also very narrow - constrained to a handful of pre-defined styles, and without an open-ended prompt.

That suggests to me that Apple isn't trying to do 10 different things in one model, but is deploying a bunch of different narrowly tailored models, each of which will have more modest compute needs. And Apple can size these models to the ranges of compute that they know they have available - which is one of the benefits of that vertical control. Adobe can't control what GPU you have and so can't size their features beforehand, but Apple can.

My guess is that everything is going to be sized to about 40 TOPS, which is what Microsoft is also signaling. (A17 Pro is 35 TOPS, so in that ballpark.) Their models will be sufficiently performant on that hardware for most things. If you're trying to rewrite a huge document, maybe that goes to PCC. If you are on an 18 TOPS M3, that will go to PCC more often, but my guess is that if everyone has 35-40 TOPS in their hand or on their lap, very few things will go to PCC. The AI features in Xcode might need more, but it's not unreasonable for Apple to assume that developers who really rely on those features will seek out an M4 Pro or higher. Any AI features added to Logic and FCP would similarly expect better hardware. It'll be interesting to see how Apple chooses to scale the NPU on the M4 Pro/Max/etc., or whether they'll rely on the GPU for the additional compute.

You can get some insight into the plan here if you read the PCC white paper. Apple lays out the specific models they are planning on deploying. What's interesting is that PCC isn't just running the on-device models faster; it's running much bigger models. So this isn't just a case of performance scaling - the offload to PCC will to some degree involve the device saying 'this model isn't capable of doing that' and handing you off to a more capable model. It's not clear if those PCC models will be available on Macs running the same SoC, or if macOS will use the same ~40 TOPS models across the board.

One of the mistakes I see people making here is the thought that Apple will deploy models that benefit from continuous scaling, and that's just not how this works. You're going to get models scaled to ~40 TOPS. Having more than that will make them a bit snappier, but not more capable. This is why I think PCC is a stopgap - Apple isn't going to roll out models that its consumer hardware can't handle. They're going to roll out more 40 TOPS models to do a wider range of things while users replace their hardware with more capable stuff, reducing the load on PCC.

I don't see Apple letting the software models get ahead of the silicon they're shipping. They've never done that before. They didn't roll out a shitty Touch ID until they got the silicon in line; they held the feature until the silicon shipped and tied it to the silicon. That's just how they roll. They're being forced to break with that due to the mismatch between AI interest and the timetable for designing silicon, but I think they're going to work to get back to their old pattern.
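The on-device vs. PCC routing described above can be sketched as a toy heuristic. This is purely hypothetical - the function name, thresholds, and TOPS figures below are illustrative assumptions, not Apple's actual scheduling logic:

```python
# Hypothetical sketch of the on-device vs. Private Cloud Compute routing
# discussed above. Thresholds and figures are illustrative assumptions,
# not Apple's real scheduler.
def route_request(required_tops: float, device_tops: float,
                  local_model_capable: bool) -> str:
    """Decide where a hypothetical AI request runs."""
    if not local_model_capable:
        # PCC runs much bigger models, not just faster copies of local ones
        return "PCC (bigger model needed)"
    if required_tops <= device_tops:
        return "on-device"
    # Capable locally, but too slow - offload for performance
    return "PCC (performance offload)"

print(route_request(35, 38, True))    # M4-class (~38 TOPS) -> on-device
print(route_request(35, 18, True))    # M3-class (~18 TOPS) -> PCC (performance offload)
print(route_request(35, 38, False))   # beyond the local model -> PCC (bigger model needed)
```

The interesting case is the third one: per the white paper's framing, the offload is sometimes about capability, not speed.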
 

johnsonwax

Member
Jun 27, 2024
130
219
76
I was referring to their laptops. Apple is not interested in competing there, as many of your kind have tried to hammer that into my head, in this very thread.
Misunderstood your last sentence. Sorry about that.

But I think the PC software market state is a bit more complicated than you are making it to be. The main market that Apple can't reach is enterprise, and that's not going to change. Apple is a consumer facing company, and they aren't going to tip up an enterprise support contract structure to appease CIOs who don't actually give a shit if their PCs are any good, just as long as they are cheap. On the consumer side, the entire consumer PC software space is dead as dicks apart from gaming, and there are real structural issues around gaming that are difficult for Apple to address. They are doing a few things to try and fix that, but there are some things they realistically just can't do.

To developers that means that there's very little viable money to be made in consumer software outside of games. The opportunity is in web-based services, and mobile apps. Marketshare doesn't change anything to help developers unless it's enterprise share.
 
Reactions: dr1337

Doug S

Diamond Member
Feb 8, 2020
3,088
5,327
136
they will be limited to 38 TOPS while players in the x86 space will have anywhere from 50 to 55 TOPS

So what? Apple has NPUs in every ARM Mac and every iPhone made since 2017. Intel and AMD are the Johnny-come-latelies here, not Apple. Whatever the marketers behind the "AI PC" push might think in their wet dreams, the PC market is not going to vastly accelerate its replacement cycle to get "AI". Thus it is going to remain a niche - a small and slowly growing percentage of the PC userbase - as PCs are slowly replaced. Even by 2030 I'd guess half of all PCs won't have an NPU, because probably a quarter to a third won't have been upgraded, and there are still an awful lot of non-AI PCs being sold today that will continue to be sold for the next couple of years, thanks to all the old Zen 4, RPL, and so forth that predates the PC industry's sudden embrace of something Apple has been doing for seven years longer.
 
Reactions: Tlh97 and Viknet

The Hardcard

Senior member
Oct 19, 2021
314
397
106
I was referring to their laptops. Apple is not interested in competing there, as many of your kind have tried to hammer that into my head, in this very thread.
They are interested in competing in laptops, just differently than the way you want. They are absolutely serving the buyers of their laptops. I want and can get 128 GBs of RAM accessible to GPU accelerated compute. Impossible on any other laptop and extremely complex to even achieve on desktop outside of Macs.
 
Last edited:
Reactions: Viknet

The Hardcard

Senior member
Oct 19, 2021
314
397
106
Gotta disagree. Apple's been caught off guard by Microsoft's AI push and higher AI TOPS requirement. They "might" rectify that with M4 Macbooks but if they use the exact same SoC as the iPad Pro, they will be limited to 38 TOPS while players in the x86 space will have anywhere from 50 to 55 TOPS (HP Omnibook Ultra). Developers may be more excited to experiment on x86 laptops than Apple ones. Apple is faced with a conundrum right now. Do they follow suit or do they pave their own way? The deals with OpenAI seem to suggest the latter as Apple clearly didn't anticipate Microsoft's strong AI push so they will try to give their users more AI TOPS through the OpenAI cloud without getting into the bind of being stuck with lower rated NPUs in their SoCs. It also suggests that they never gave much thought to genAI features before. I'm not saying that x86's higher local AI performance will revolutionize computing but it could certainly bring forth new possibilities. If I were a developer, I wouldn't touch Apple's overpriced hardware to experiment with local AI possibilities, especially since Apple does not show any interest in aggressively increasing their marketshare. Apple could still move developers in their direction through clever marketing and promises of reaching more users through their store and iPhones. It will be interesting watching this space on how things unfold.
Apple’s hardware is not limited by NPU capacity. The unified memory design allows the GPU to provide compute capacity that exceeds all current NPUs. So far for Apple, the NPU is just a quick and low power way to do certain AI tasks, and is particularly beneficial for things like image processing from stacking a burst of camera shots into a more appealing photo.

However, for the generative AI features that companies are trying to get consumers interested in, Apple's GPU currently crushes its own NPU - and all other NPUs - in capability.

Apple was not caught off guard by the AI push. What causes Apple's challenges in engaging with AI features is their desire to provide features in a deterministic way. A big reason for a lot of the frustration about the Apple way - it often frustrates me as well - is their desire to control how consumers experience new features.

It is why they don't just let all their hardware try to run every feature they introduce. Often, if they doubt the reliability of a device or a processor to provide the experience the way they see fit, they simply refuse to let it run, whereas a lot of other players in the industry will provide such features to everybody, no matter how bumpy or buggy the experience is for consumers whose hardware doesn't quite meet the mark.

The AI push poses unique challenges to Apple because, currently, AI developers have figured out innovative ways of getting neural networks to learn patterns from human language even though they don't understand how the networks learn. This is a conundrum for the whole industry: every major AI developer has teams of people putting up artificial safety guardrails to minimize socially unacceptable and politically incorrect interactions with these models.

This is particularly challenging for Apple because, even with the guardrails, AI interactions are not deterministic. They can't be sure how their AI is going to respond to a prompt or request from a consumer. That is what is slowing Apple down, not any missing of the boat.

The reason Siri fell so far behind other AI assistants is precisely that Apple wants to know what is going to happen when a consumer uses Siri. Part of the purpose of giving Siri access to device features through App Intents is to smooth the experience, but it is also done this way so that Apple can have confidence in how Siri will respond to requests.

Apple's caution and attempt to curate the experience is turning this into a classic tortoise-and-hare race. The big fantastic feature Microsoft intended to drive Copilot PC sales is now delayed, and that has damaged needed trust for all their AI feature introductions. Apple's slower decision to develop Private Cloud Compute is probably going to put Apple ahead of Microsoft when it comes to consumers and AI features.
 

johnsonwax

Member
Jun 27, 2024
130
219
76
So what? Apple has NPUs in every ARM Mac and every iPhone made since 2017. Intel and AMD are the Johnny-come-latelies here, not Apple. Whatever the marketers behind the "AI PC" push might think in their wet dreams, the PC market is not going to vastly accelerate its replacement cycle to get "AI". Thus it is going to remain a niche - a small and slowly growing percentage of the PC userbase - as PCs are slowly replaced. Even by 2030 I'd guess half of all PCs won't have an NPU, because probably a quarter to a third won't have been upgraded, and there are still an awful lot of non-AI PCs being sold today that will continue to be sold for the next couple of years, thanks to all the old Zen 4, RPL, and so forth that predates the PC industry's sudden embrace of something Apple has been doing for seven years longer.
Yeah, Apple has shipped ~150M devices with a reliable 35 TOPS. They're likely to ship another 300M this year. And we still really don't know what Copilot will use that NPU for apart from VS code assist. Given that most of those PCs will be enterprise machines, I'm not sure how many will even permit the feature to be enabled. We had a pretty strong 'no LLM use' policy where I worked, apart from narrow applications - no shoving institutional data into OpenAI. And we did AI research.
 

johnsonwax

Member
Jun 27, 2024
130
219
76
They are interested in competing in laptops, just differently than the way you want. They are absolutely serving the buyers of their laptops. I want and can get 128 GBs of RAM accessible to GPU accelerated compute. Impossible on any other laptop and extremely complex to even achieve on desktop outside of Macs.
It's also pretty clear that Apple envisions much of the consumer PC market shifting to iPad (witnessed by its continual elevation). There are some real problems they need to address in iPadOS for that to realistically happen, but that's their vision.
 

DavidC1

Golden Member
Dec 29, 2023
1,442
2,342
96
It's also pretty clear that Apple envisions much of the consumer PC market shifting to iPad (witnessed by its continual elevation). There are some real problems they need to address in iPadOS for that to realistically happen, but that's their vision.
We've been talking about the "Death of PCs" for years, and nothing has changed substantially. All it has resulted in is more people having some form of computing device than ever before. It's just marketing fluff to make us feel excited.
 
Reactions: dr1337

jdubs03

Golden Member
Oct 1, 2013
1,226
870
136
I don’t think MacOS is going anywhere anytime soon. It still fits its niche very well. It would be cool if they made a hybrid, but they have previously said they won’t, and it would cannibalize sales for their iPads or MacBooks. Plus not everyone wants a touchscreen laptop.
 

johnsonwax

Member
Jun 27, 2024
130
219
76
Apple's caution and attempt to curate the experience is turning this into a classic tortoise-and-hare race. The big fantastic feature Microsoft intended to drive Copilot PC sales is now delayed, and that has damaged needed trust.
Apple's biggest strength is their ability to figure out how consumers will interact with a given bit of technology. This is why they are often late to a market but usually win that market. OpenAI is first to the market, but they don't know how to turn it into a product that can generate the revenue needed to sustain the product - especially after everyone has realized that their datasets are the thing of value for AI training, and the free data scraping is drying up. Microsoft recognizes the opportunity, hence the investment, but themselves haven't really figured out the go-to-market either - though they are signaling it's a value-add on Windows/Office rather than a fee service.

Apple always solves the go-to-market problem first and foremost. That's what has defined the company since Jobs' return. It's why I'm concerned about AVP: they didn't really do that there, and that's out of character for them. And I'm sure the Apple Car was killed because they couldn't solve that problem - the market opportunity in cars is at the low end, US regulation doesn't allow for it, and Apple may not have been able to ship their vision at the price where the opportunity was.
 

johnsonwax

Member
Jun 27, 2024
130
219
76
We've been talking about the "Death of PCs" for years, and nothing changed substantially. All it has resulted is now more people having some form of a computing device than ever before. It's just marketing fluff to make us feel excited.
I would argue that a lot has changed. As I noted above the consumer software space for PCs is -redacted- dead, apart from games. All of that got usurped by mobile. That means that PCs are pretty seriously underserving most consumers (as compared to 15 years ago), but consumers aren't fully ready to give them up, for a variety of good reasons. The category hasn't died, but the importance of them is much weaker than it used to be. Apple's problem is that the iPad doesn't fully address the variety of good reasons why consumers hold onto PCs.

But regardless of whether you think there's an opportunity there or not, you can't deny that Apple has that vision.

Profanity in the tech forums is not allowed.

Daveybrat
AT Moderator
 
Last edited by a moderator:

Doug S

Diamond Member
Feb 8, 2020
3,088
5,327
136
It's also pretty clear that Apple envisions much of the consumer PC market shifting to iPad (witnessed by its continual elevation). There are some real problems they need to address in iPadOS for that to realistically happen, but that's their vision.

I don't buy that at all. iPad is a complementary, different way of doing things. There is absolutely nothing Apple is doing that suggests what you say. I know that's a popular meme, but it has no basis in reality.
 
Reactions: name99 and scannall

Mopetar

Diamond Member
Jan 31, 2011
8,306
7,319
136
What does Microsoft even do with their AI that Apple is now so far behind that you feel it's imperative they catch up to this magic 50 TOPS number being thrown around? Does the 38 TOPS that Apple supposedly has mean there's something their products are incapable of, as opposed to taking 30% longer to crunch through some workload?
 

Eug

Lifer
Mar 11, 2000
23,992
1,610
126
They are interested in competing in laptops, just differently than the way you want. They are absolutely serving the buyers of their laptops. I want and can get 128 GBs of RAM accessible to GPU accelerated compute. Impossible on any other laptop and extremely complex to even achieve on desktop outside of Macs.
I'm curious, what do you do that uses that much RAM with the GPU? How small is that market though?
 
Reactions: Mopetar

The Hardcard

Senior member
Oct 19, 2021
314
397
106
We've been talking about the "Death of PCs" for years, and nothing changed substantially. All it has resulted is now more people having some form of a computing device than ever before. It's just marketing fluff to make us feel excited.
The "Death of PCs" is inevitable. That it may take more than 20 years after the claim rather than less than 10 doesn't make the statement itself any less true.

It comes down to why people use PCs, just like any technology: they want the results that PCs give them, not the PCs themselves. The members of these AnandTech forums and other such discussion groups are part of a small minority that is fascinated with the technology in and of itself. The majority of the market just wants the results the technology can give them and could not care less about the technology itself.

They are less interested in the tradeoffs. People put up with compute devices that occupy space and require physical effort and comfort compromises to carry around because they want the results, not because they want to do any of those things. It's the same with software. Office applications, photo and video editing, coding, financial and accounting, drawing, or music software - the list goes on and on. Very few people learn them for the joy of being engaged in the intricacies of how they work. They just want the results that becoming proficient in them provides.

The AI craze is exactly about the coming change in people's relationship with compute devices: algorithms that learn enough about the patterns of human functioning that they can simply provide you with the results you want, without you having to be involved in the complexities of how those results are achieved. Currently those results are very rudimentary and inconsistent in accuracy and usefulness.

But the writing is on the wall. Algorithms are coming that will handle tasks many orders of magnitude more complex while providing above-human levels of accuracy. The excitement around AI isn't about the little things current algorithms can do. It is the realization that eventually AI algorithms will be able to provide people with the results they get from any professional software just by asking for what they want.

It's also pretty clear that Apple envisions much of the consumer PC market shifting to iPad (witnessed by its continual elevation). There are some real problems they need to address in iPadOS for that to realistically happen, but that's their vision.

But regardless of whether you think there's an opportunity there or not, you can't deny that Apple has that vision.

I will argue that Apple's progression in general, and with the iPad in particular, is not toward a new way to use computers, but instead a new way to get the results that people now rely on computers to give them. That's why I think people are bewildered by Apple's year-after-year refusal to "fix iPadOS."

I think the iPad is a bridge device. It will help people cross over from the Windows/macOS/Linux desktop way of getting what they want from their processor to a way that doesn't involve learning how to use applications, but rather algorithms that learn how to give you the results you want.
 

Mopetar

Diamond Member
Jan 31, 2011
8,306
7,319
136
The rumors of the PC's death are as exaggerated now as they have been for the past two decades.

It's certainly true that there are more alternatives now than ever before, and plenty of people who don't need a PC, but there will always be users who need the computer equivalent of heavy machinery.
 

The Hardcard

Senior member
Oct 19, 2021
314
397
106
I'm curious, what do you do that uses that much RAM with the GPU? How small is that market though?
Unfortunately, I don't have the time to do it now; however, I intend to make time soon - if I'm fortunate, in time to use the next set of Max or Ultra chips.

Neural networks, regardless of algorithm, have so far functioned better with more parameters. For large language models, for instance, Llama 3 is one of the top current open-source LLMs. Meta has released the 8 billion and 70 billion parameter models and will soon release a 400 billion parameter model, with each size increase being more capable and accurate.

Currently for AI models, full precision is typically 16 bits, or 2 bytes, per parameter, plus a chunk of extra RAM to run them. So for Llama 3, that works out to about 18 GB of RAM for the 8B, 150 GB for the 70B, and 850 GB for the 400B model.

An important aspect of neural networks, however, is that high precision is rarely necessary; it has been mathematically established that you lose very little accuracy even going down to 2-bit parameters. Practically, for current techniques, 4 bits is generally the lowest you want to go. 4-bit quantization can give you more than 95% of the accuracy of full precision, which makes it very useful.

However, a 4-bit quantized version of Llama 3 400B would still need about 250 GB of RAM to run, and GPU acceleration is needed for all that RAM to reach even slightly practical speed. This is true for all generative AI models, not just language models. To be on the cutting edge of capability you're going to need hundreds of billions of parameters.

The Ultras coming in summer or fall 2025 will be the most economical way to GPU-accelerate these models. Nvidia cards would be significantly faster than any Ultra chip, but getting together 250 GB of Nvidia VRAM would need far more money, space, and power than one $6,000 Mac Studio with 256 GB of RAM.
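The RAM estimates in the post above reduce to parameter count times bytes per parameter, plus some runtime overhead. A minimal sketch of that weights-only arithmetic, using the fp16 and 4-bit sizes for the Llama 3 models discussed (the overhead factor varies by runtime and is omitted here):

```python
# Weights-only memory estimate for an LLM: parameter count x bytes per
# parameter. Real usage adds roughly 10-25% for activations and KV cache,
# which is why the figures cited above run somewhat higher.
def weights_ram_gb(params_billion: float, bits_per_param: int) -> float:
    return params_billion * 1e9 * (bits_per_param / 8) / 1e9

print(weights_ram_gb(8, 16))    # 16.0 GB  -> ~18 GB with overhead (Llama 3 8B, fp16)
print(weights_ram_gb(70, 16))   # 140.0 GB -> ~150 GB with overhead
print(weights_ram_gb(400, 4))   # 200.0 GB -> in the ballpark of the ~250 GB cited
```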
 
Reactions: Eug

poke01

Diamond Member
Mar 8, 2022
3,387
4,627
106
The “Death of PCs” inevitable. The possibility that it may take more than 20 years after the claim, rather than less than 10 years after the claim doesn’t make the statement itself any less true.

It is because of why people use PCs just like any technology. They want the results that PC give them, not the PCs themselves. The members of these Anandtech forums and other like discussion groups are part of a small minority that is fascinated with the technology in and of itself. The majority of the market, though just wants the results that the technology can give them and they could not care less about the technology itself.

They are less interested in the tradeoffs. People put up with compute devices thatvoccupy space, require physical effort and comfort compromises to carry around to get the results not because they want to do any of those things. It’s the same way with the software. Office applications, photo and video, editing, coding, financial and accounting, drawing, or music software - the list goes on and on. Very few people take the time to learn them for the joy of being engaged in the intricacies of how they work. They just want the results that becoming proficient in them provide.

The AI craze is exactly about the coming change in people’s relationships with compute devices: algorithms that learn enough about the patterns of human functioning that they can simply provide you with the results you want, without you having to be involved with the complexities of how those results are achieved. Currently those results are very rudimentary and inconsistent in accuracy and usefulness.

But the writing is on the wall. Algorithms will be coming that handle tasks many orders of magnitude more complex while providing above-human levels of accuracy. The excitement around AI isn’t about the little things that current algorithms can do. It is instead the realization that eventually AI algorithms will be able to provide people with the results they get from any professional software just by asking for what they want.

I will argue that Apple’s progression in general, and with the iPad in particular, is not toward a new way to use computers, but instead toward a new way to get the results that people now rely on computers to give them. That’s why I think people are bewildered by Apple’s year-after-year refusal to “fix iPadOS.”

I think the iPad is a bridge device. It will help people cross over from the Windows/macOS/Linux desktop way of getting what they want from their processor to a way that doesn’t involve learning how to use applications, but instead involves algorithms that learn how to give you the results you want.
I disagree; the PC will still be here. You don’t stop making trucks just because cars are cheaper and more efficient.
 

Doug S

Diamond Member
Feb 8, 2020
3,088
5,327
136
These are silly arguments. An iPad Pro can't be the death of the PC because it IS a PC! You might think it makes a pretty crappy PC, and I wouldn't argue with that, but being a crappy PC doesn't make it not a PC. And no matter how good it eventually becomes at being a PC, that won't mean it killed PCs - just that it became the preferred form factor, similar to how laptops have become the preferred form factor over desktops.

Personally I don't view using a tablet as being as convenient as a PC. Holding the device sucks versus having it sit on a surface. Typing on a touch screen sucks versus a good keyboard. Touching a touch screen sucks versus a mouse when you're doing it repeatedly. I mean, try doing heavy duty spreadsheet work on a tablet you're holding in your hands, I bet you'll throw it against the wall before an hour is up!

Now some might say "yes but you don't have to hold the tablet, you can set it on a surface with a stand so it is like a laptop." They'll point out you don't have to use the touchscreen to type, you can get a keyboard - either built into the stand like the iPad Pro's, or a separate bluetooth keyboard. Ditto for a mouse. But if you are equipping your iPad Pro with all the trappings of a PC, how can you argue you are not using a PC?

The only way the "death of the PC" happens is if there is a different input paradigm. Something that doesn't use a "keyboard", whether a separate physical device or a virtual one on the display. Maybe the display is smart glasses/contacts, and you talk to it or "think to it" (i.e. where Apple probably sees Vision Pro at some future date). If that arrives in actual products that are affordable etc., and it is generally accepted to be a more convenient and efficient modality for the sort of stuff we've been doing on a PC, then yes, the PC will be dead. But using a tablet or a smartphone as your computer does not mean the PC is dead. They are the same modality of input, just with different tradeoffs - the phone is smaller and always with you but less powerful and not something you'd want to write a term paper with. The tablet is kind of in between (in my mind combining the worst attributes of smartphones and PCs, but judging by Apple's sales figures obviously not everyone agrees with me).

An iPad Pro provides the same old "look at display, type on keyboard, point at stuff" technology we've been using since the first GUIs appeared. Unless you define a PC the way the Supreme Court once defined pornography - "I know it when I see it" - it is impossible to argue that an iPad Pro with a keyboard case and optional bluetooth mouse is not a PC.
 

johnsonwax

Member
Jun 27, 2024
130
219
76
These are silly arguments. An iPad Pro can't be the death of the PC because it IS a PC! You might think it makes a pretty crappy PC, and I wouldn't argue that, but being a crappy PC doesn't make it not a PC.
So, I think there's an important distinction here that benefits people to focus on. The sole difference between a Mac and an iPad, fundamentally, is that the former is open by design, and can run any code you put on it, and the latter is closed by design and will not run arbitrary code. In a lot of cases that's not a big deal - running Office 365 on a Mac and on an iPad can be functionally equivalent experiences for the majority of users. But if you need to write a python script to process a file before loading it into Excel, that's not going to be available on the iPad - not without some fundamental changes to how iPad works (which the EU is trying to punch holes through, against Apple's wishes).

For a lot of us, what constitutes a PC - arbitrary code execution, because we're the kinds of people who write code - the iPad can never sit in the role of a 'PC'. There's nothing stopping Apple from taking the iPad Pro, shoving MacOS on it, and calling it a Mac tablet, either leaving the touchscreen inoperable for MacOS and only enabling it if you run an iPadOS app, or adding touch support to MacOS. But most users never execute arbitrary code, and for them the Mac represents a pretty hard tradeoff in terms of IT support and security that probably isn't worth it. My wife is a Mac user because she's been a Mac user for 30 years, but there's nothing there she really couldn't do just as well on iPad after re-establishing some new muscle memory. If I wasn't here to provide support, she'd have faced some real challenges that she could have avoided being iPad first. But me being a code-writer, being primary iPad is a non-starter. Just not possible.

'PC' has always identified a platform that could work for all users. It might have been overkill for most, but it didn't exclude to the point that if you didn't like Windows, you could build your own linux and run that. You could always go back to writing your own bootloader and building up from there. iPad doesn't fit that vision of a PC. iPad can't replace the Mac. The Mac has to exist to sustain all of Apple's platforms. The question is who is best served by each, and that's not a question that exists in the Wintel world. But I think to Apple the fundamental problems around securing and supporting a Mac are unsolvable. The category of users that require W^X violations be easy to enable cannot be resolved with the security needs of users that never need to do that, so the best thing to do is segregate these users to different platforms. Ultimately, that's the question that exists in the Mac/iOS world.
 
Reactions: Viknet and name99

Doug S

Diamond Member
Feb 8, 2020
3,088
5,327
136
So, I think there's an important distinction here that benefits people to focus on. The sole difference between a Mac and an iPad, fundamentally, is that the former is open by design, and can run any code you put on it, and the latter is closed by design and will not run arbitrary code. In a lot of cases that's not a big deal - running Office 365 on a Mac and on an iPad can be functionally equivalent experiences for the majority of users. But if you need to write a python script to process a file before loading it into Excel, that's not going to be available on the iPad - not without some fundamental changes to how iPad works (which the EU is trying to punch holes through, against Apple's wishes).

For a lot of us, what constitutes a PC - arbitrary code execution, because we're the kinds of people who write code - the iPad can never sit in the role of a 'PC'. There's nothing stopping Apple from taking the iPad Pro, shoving MacOS on it, and calling it a Mac tablet, either leaving the touchscreen inoperable for MacOS and only enabling it if you run an iPadOS app, or adding touch support to MacOS. But most users never execute arbitrary code, and for them the Mac represents a pretty hard tradeoff in terms of IT support and security that probably isn't worth it. My wife is a Mac user because she's been a Mac user for 30 years, but there's nothing there she really couldn't do just as well on iPad after re-establishing some new muscle memory. If I wasn't here to provide support, she'd have faced some real challenges that she could have avoided being iPad first. But me being a code-writer, being primary iPad is a non-starter. Just not possible.

'PC' has always identified a platform that could work for all users. It might have been overkill for most, but it didn't exclude to the point that if you didn't like Windows, you could build your own linux and run that. You could always go back to writing your own bootloader and building up from there. iPad doesn't fit that vision of a PC. iPad can't replace the Mac. The Mac has to exist to sustain all of Apple's platforms. The question is who is best served by each, and that's not a question that exists in the Wintel world. But I think to Apple the fundamental problems around securing and supporting a Mac are unsolvable. The category of users that require W^X violations be easy to enable cannot be resolved with the security needs of users that never need to do that, so the best thing to do is segregate these users to different platforms. Ultimately, that's the question that exists in the Mac/iOS world.

So what you're telling me is exactly what I figured people would say, and why I wrote the last paragraph. You personally believe an iPad's not a PC because in your mind that requires running whatever code you want to write. (Technically it can: you can run whatever code you want on your iPad or iPhone without Apple stopping you. The key is you can only run it on YOUR iPad or iPhone, and there are a couple of gotchas in that you need a Mac to transfer what you've written to your device and must pay the $99 for a developer license for code signing.)

But that's so much of an "I know it when I see it" argument I have to think you stopped reading my post before you got to the end, or somehow have such tunnel vision you believe that your views are shared by the world at large so what I wrote didn't apply. Something like 99% of PC owners will never write their own code, so that is irrelevant to them as defining what a "PC" is. To the world at large, an iPad is a PC even by your standards, because they care about writing their own code on a PC exactly as much as they care about changing their own oil in their car or replacing their own electrical panel in their house. Would you consider a car that had a sealed engine bay not a car because you can't change your own oil? Because most drivers in the US wouldn't even KNOW if their hood was sealed because they've never tried to open it.
 
Reactions: scannall