Solved! ARM Apple High-End CPU - Intel replacement

Page 56 - AnandTech Forums

Richie Rich

Senior member
Jul 28, 2019
470
229
76
There is a first rumor about Intel replacement in Apple products:
  • ARM based high-end CPU
  • 8 cores, no SMT
  • IPC +30% over Cortex A77
  • desktop performance (Core i7/Ryzen R7) with much lower power consumption
  • introduction with new gen MacBook Air in mid 2020 (considering also MacBook PRO and iMac)
  • massive AI accelerator

Source Coreteks:
 
Reactions: vspalanki
Solution
What an understatement. And it looks like it doesn't want to die. Yet.


Yes, A13 is competitive against Intel chips but the emulation tax is about 2x. So given that A13 ~= Intel, for emulated x86 programs you'd get half the speed of an equivalent x86 machine. This is one of the reasons they haven't yet switched.
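The arithmetic behind that claim can be made explicit. A rough sketch, using only the post's own estimates (A13 roughly equal to Intel at native code, a ~2x translation tax):

```python
# If A13 native ~= Intel native, a ~2x emulation tax means translated
# x86 code runs at ~0.5x the speed of the equivalent x86 machine.
# Both figures are the post's estimates, not measured numbers.
a13_vs_intel_native = 1.0   # "A13 ~= Intel"
emulation_tax = 2.0         # claimed overhead for translated x86 code
effective = a13_vs_intel_native / emulation_tax
print(effective)            # 0.5: half the speed of the x86 box
```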

Another reason is that it would prevent the use of Windows on their machines, something some say is very important.

The level of ignorance in this thread would be shocking if it weren't depressing.
Let's state some basics:

(a) History. Apple has never let backward compatibility limit what they do. They are not Intel, they are not Windows. They don't sell perpetual compatibility as a feature. Christ, the big...

peanutbridges

Junior Member
Jul 5, 2020
3
5
36
Also, I don't expect Apple to understand at all, since they're a lost cause when they don't spend a dime on their software department. If Microsoft can be accused of a non-committed attitude towards Windows on ARM, the same can be said of Apple, since they're putting in a similar amount of effort, which amounts to practically nothing. Apple mostly expects Macs to subsist off the scraps from iOS instead of Windows/Intel, but that's going to end up about as compelling as an iPad is ...

I wonder if Mac users feel happier about having an inferior version of iOS than about being an inferior alternative to Windows/Intel?

Apple's problem is stubbornly adhering to their annual cadence of releasing a new version of software, when the previous version was mostly just patched up with duct tape that summer (as in WWDC-announcement summer). Catalina is widely regarded as a pretty bad release, and now it's ARM prepping time, so pack it up and move forward. For iOS they probably have a solid audience of testers to sort through; for Macs, I doubt developers even get a chance to catch all the bugs before the next release triggers a new wave of conflicts. It's all fine and dandy for a simple app but a pain in the butt for complicated programs.

I think Apple have spread themselves too thin, don't have quite the talent they think they have, and work on deadlines where no one wants to be fired or blamed, so everything ships in whatever state it's in, regardless of function or feasibility, until it's cancelled or postponed. I don't know why they feel they need to release so often when Macs are generally neglected hardware-wise. I really think 18 or even 24 months, with a service pack in between for crucial bug/security fixes, would be so much better for the desktop platform. Microsoft seems to be going the same route with seasonal updates too.
 

soresu

Platinum Member
Dec 19, 2014
2,956
2,175
136
You are SO short-sighted with your comments. Do you truly believe we should encourage or accept that Intel/AMD and Nvidia/AMD should be the only games in town?
As much as I hate to say it, nVidia is already the de facto leader/monopoly in many areas that require pro GPU for computation - that ship had sailed and even Intel is unlikely to unseat them from it without some pretty incredible academic and industry support behind them, likely in concert with AMD if it is to have any success.
I applaud Intel for trying to disrupt the GPU market and Apple for the CPU market.
Apple is not disrupting the market in reality, only their own supply integration and the sanity of their third party software partners.

If they were willing to sell Axx SoC's beyond use in Apple products (like Samsung and occasional Exynos sales) then that could indeed be market disruption, but alas not.

Even the disruption to Intel's revenue is not so big, only 4% apparently - which could even go some way to alleviating their apparent chip shortages, albeit a pyrrhic solution to that particular problem.
You can see MS trying their best to be a player in the ARM space
I'm not sure which show you have been watching, from my perspective they are barely trying at all.

W10 ARM came out some time ago now and did not support x64 app emulation or any non DX gfx API more recent than OpenGL 1.0 - this might as well be a declaration of their lack of interest, to the point where I do wonder why they even bothered to order a custom Qualcomm chip at all with SG1.

To be clear, x64 support is still missing, post OGL 1 support is still missing, and Vulkan support is still missing. Qualcomm has at least a Vulkan driver on Android, so the latter is even more puzzling.

There is actually support for x64 Windows apps in the Android/Linux Hangover-Wine layer (albeit limited by QEMU's horrible binary translation), which makes the lack of that support in the real MS WARM completely ridiculous.

I have a theory that MS were essentially bribed (for lack of a better word) by Qualcomm to make their minimal WARM effort, which explains the custom Surface SG1 chip and 850/8cx chips.

The problem is that in the world of bribes and contra revenue, no one beats Intel - so I believe that WARM will go nowhere as long as Intel can maintain such a stream of cash to MS for its lackluster efforts.

Similarly the Chromebook market has been strangely devoid of new ARM SoC models despite the A72 based ones performing very well vs Intel equivalents when originally introduced.
 

DrMrLordX

Lifer
Apr 27, 2000
21,797
11,144
136
If they were willing to sell Axx SoC's beyond use in Apple products (like Samsung and occasional Exynos sales) then that could indeed be market disruption, but alas not.

That's more due to the type of company Apple is than anything else. Doesn't say much about how good (or bad) their cores are, but the reality is that we won't see their present or future A-series cores going head to head with other technology since Apple wants to keep it locked up inside their walled garden.

If you buy Apple, you get their hardware. If you don't buy Apple, you don't get their hardware. Pretty simple. If anyone (myself included) had fantasies about them licensing their cores elsewhere to shake up the computer scene . . . they can drop those fantasies in the dustbin. Not gonna happen. Not anytime soon, anyway.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
If Apple has to worry about selling to others, they'll lose the advantage they have.

Many things merchant CPU vendors have to worry about are not a problem for Apple. The CPU and OS vendors need to work very closely to get TTM as low as possible, and to get things working. If you have everything under one roof you don't have to deal with many of the conflicts that come from having multiple different leaders.

This is why acquisitions happen. But acquisitions often end up failing, because the two leaders and management teams don't agree with each other.

The spirit and energy (and what works for them) are very different between two different companies. It's like they're run by people with different goals and aspirations... oh wait, it's exactly that!
 
Last edited:
Reactions: lobz

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
Don't be so sure you understand what is going on here; clearly you don't.
https://www.parallels.com/blogs/apple-silicon-wwdc/
Note in particular "...makes it simple for businesses and individuals to use the applications and files from *any operating system* they need on their Macs"

Now how are they doing that? I have my hypotheses, but I'm sick of typing them up repeatedly, and having them ignored by people still living in the computing stone age. So I'll just leave you with that statement from Parallels.

They showed a virtualization demo using an ARM build of Debian; nothing to see here. I maintain that Rosetta 2 cannot be used for x86-64 virtualization, and even Apple themselves state this limitation in their developer documentation, so it's impossible for them to support x86 Windows VMs or x86 Linux VMs; on top of it all, no Boot Camp either ...

Rosetta 2's emulation model is exactly that of a static recompiler, which is probably why the technical demos they showed had somewhat acceptable performance, since they do a static analysis pass over x86 binaries ahead of runtime to generate the translated code.

Rosetta 2 not being able to handle x86 virtualization has massive implications: it might not work with executable binaries that use indirect branching. A dynamic analysis pass would be needed to handle that case, but it would come at a fairly severe performance penalty ...
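The indirect-branch problem can be illustrated with a toy model (this is a generic binary-translation sketch, not Apple's actual implementation): a direct branch encodes its target in the binary, so an ahead-of-time pass can translate it, while an indirect branch's target only exists in a register at runtime, forcing a runtime lookup or fallback.

```python
# Toy "guest" program: list of (opcode, operand) pairs at addresses 0..3.
guest = [
    ("set", 7),        # r0 = 7
    ("jmp", 3),        # direct branch: target address 3 is visible statically
    ("set", 99),       # dead code, skipped by the jump
    ("ijmp", "r0"),    # indirect branch: target is whatever r0 holds at runtime
]

def static_targets(program):
    """Ahead-of-time pass: collect every branch target knowable statically."""
    targets = set()
    for op, arg in program:
        if op == "jmp":          # direct: operand IS the target address
            targets.add(arg)
        elif op == "ijmp":       # indirect: nothing to collect ahead of time;
            pass                 # the translator must emit a runtime lookup
    return targets

print(static_targets(guest))     # only the direct target is discoverable
```

This is why a purely static translator needs either a runtime dispatch table or a dynamic-translation fallback for code that computes jump targets.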
 

Richie Rich

Senior member
Jul 28, 2019
470
229
76
And when Apple moves to ARM, here comes the Chinese ARM HiSilicon Kunpeng 920 into the desktop as well:
  • ATX board
  • 8c/8t Kunpeng 920 CPU @ 2.6 GHz (ARMv8, 64-bit, in-house design, not a licensed Cortex core)
  • 4x DDR4, 64 GB max
  • PCIe 3.0: 1x16 lanes, 1x4 lanes, 1x1 lane
  • 6x SATA, USB 3.0
  • GBit & optical LAN

Performance in Blender BMW render: 11 min 45s
This suggests about 50% of Zen 2's IPC, somewhere near Cortex-A72/A73; not really great.

Interestingly, they mentioned Huawei has no access to ARMv9.
But Kunpeng 930 was planned with SVE vector support.

 

Eug

Lifer
Mar 11, 2000
23,752
1,284
126
The more I think about it, the more I think that the thermals on Arm will get even more priority than raw performance.

This will bring back the fanless MacBook and/or introduce a fanless MacBook Air.
This will reduce the "TDP" of the MacBook Pro.
This will reduce the "TDP" of the iMac.
I think the Mac mini will retain the fan (and I assume the A12Z dev kit has a fan although I'm not sure), but this will reduce the "TDP" of the Mac mini.

I think the only machines where the thermals will not be as much of a concern are the iMac Pro (if they release it again) and the Mac Pro.

Cuz right now, the MacBook Air is too loud, the high end MacBook Pro is too loud, and the high end iMac is too loud. Ironically the iMac Pro under similar usage as an iMac is usually not too loud according to reports, but that's because it has a completely different cooling system than the regular iMac.

It seems to me the current iMac should be running 65 Watt TDP Intel chips as they are in the lower end models and not the 95 Watt chips they have in the higher end models. The 95 Watt chips should really be paired with higher end cooling solutions like in the iMac Pro, but I don't see Apple doing that due to cost.
 

teejee

Senior member
Jul 4, 2013
361
199
116
The more I think about it, the more I think that the thermals on Arm will get even more priority than raw performance.

This will bring back the fanless MacBook and/or introduce a fanless MacBook Air.
This will reduce the "TDP" of the MacBook Pro.
This will reduce the "TDP" of the iMac.
I think the Mac mini will retain the fan (and I assume the A12Z dev kit has a fan although I'm not sure), but this will reduce the "TDP" of the Mac mini.

I think the only machines where the thermals will not be as much of a concern are the iMac Pro (if they release it again) and the Mac Pro.

Cuz right now, the MacBook Air is too loud, the high end MacBook Pro is too loud, and the high end iMac is too loud. Ironically the iMac Pro under similar usage as an iMac is usually not too loud according to reports, but that's because it has a completely different cooling system than the regular iMac.

It seems to me the current iMac should be running 65 Watt TDP Intel chips as they are in the lower end models and not the 95 Watt chips they have in the higher end models. The 95 Watt chips should really be paired with higher end cooling solutions like in the iMac Pro, but I don't see Apple doing that due to cost.

I agree 100% about a fanless MacBook. This is a major premium attribute in a laptop, and very easy to accomplish with the same SoC as the coming iPad Pro ("A14X").
I haven't thought about getting a MacBook before, but now I'm thinking about it.
 

DrMrLordX

Lifer
Apr 27, 2000
21,797
11,144
136
Performance in Blender BMW render: 11 min 45s
This suggests about 50% of Zen 2's IPC, somewhere near Cortex-A72/A73; not really great.

Why are we comparing it to Zen 2? It isn't even in the same market. A board like that is mostly made to be cheap and to serve as a dev platform for current/future Huawei server hardware. It's more on the level of Zhaoxin's hardware.

My 3900x @ 4.35 GHz scores 1 minute 47s on the BMW scene. If I downclocked it to 2.6 GHz and restricted it to 8 cores, it would score ~4 minutes 29 seconds. That's not 50% . . . not even close.
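That estimate follows from first-order scaling: divide the measured time by the clock ratio and the core-count ratio. A quick sketch of the arithmetic (this linear model ignores memory bandwidth and boost behavior, so it is only a rough bound):

```python
# Scale the measured 3900X BMW-scene time (1 min 47 s at 4.35 GHz on 12
# cores) down to 2.6 GHz and 8 cores, assuming linear scaling in both.
measured_s = 1 * 60 + 47
scaled_s = measured_s * (4.35 / 2.6) * (12 / 8)
kunpeng_s = 11 * 60 + 45                  # the 11 min 45 s reported above
print(round(scaled_s))                    # ~269 s, i.e. about 4 min 29 s
print(round(kunpeng_s / scaled_s, 1))     # Kunpeng is ~2.6x slower than that
```

So even after normalizing for clocks and core count, the Kunpeng result trails Zen 2 by well more than 2x, which is the poster's point.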
 
Reactions: Tlh97 and Markfw

RasCas99

Member
May 18, 2020
34
85
51
As much as I hate to say it, nVidia is already the de facto leader/monopoly in many areas that require pro GPU for computation - that ship had sailed and even Intel is unlikely to unseat them from it without some pretty incredible academic and industry support behind them, likely in concert with AMD if it is to have any success.

Apple is not disrupting the market in reality, only their own supply integration and the sanity of their third party software partners.

If they were willing to sell Axx SoC's beyond use in Apple products (like Samsung and occasional Exynos sales) then that could indeed be market disruption, but alas not.

Even the disruption to Intel's revenue is not so big, only 4% apparently - which could even go some way to alleviating their apparent chip shortages, albeit a pyrrhic solution to that particular problem.

I'm not sure which show you have been watching, from my perspective they are barely trying at all.

W10 ARM came out some time ago now and did not support x64 app emulation or any non DX gfx API more recent than OpenGL 1.0 - this might as well be a declaration of their lack of interest, to the point where I do wonder why they even bothered to order a custom Qualcomm chip at all with SG1.

To be clear, x64 support is still missing, post OGL 1 support is still missing, and Vulkan support is still missing. Qualcomm has at least a Vulkan driver on Android, so the latter is even more puzzling.

There is actually support for x64 Windows apps in the Android/Linux Hangover-Wine layer (albeit limited by QEMU's horrible binary translation), which makes the lack of that support in the real MS WARM completely ridiculous.

I have a theory that MS were essentially bribed (for lack of a better word) by Qualcomm to make their minimal WARM effort, which explains the custom Surface SG1 chip and 850/8cx chips.

The problem is that in the world of bribes and contra revenue, no one beats Intel - so I believe that WARM will go nowhere as long as Intel can maintain such a stream of cash to MS for its lackluster efforts.

Similarly the Chromebook market has been strangely devoid of new ARM SoC models despite the A72 based ones performing very well vs Intel equivalents when originally introduced.

"Apple is not disrupting the market in reality, only their own supply integration and the sanity of their third party software partners."

Again, this is your view of things, which is fine. But the iPhone came into a mature market of a billion devices sold a year with entrenched leaders, and we know how that ended.
For Apple to be disruptive, they will need to build excellent machines that are best in class in every category and then some. If they manage that and increase their market share, it's a big win for them; if all they do is keep doing more of the same with an ARM CPU instead of an Intel one, then I'd say you're right and they failed to disrupt anything.

"I have a theory that MS were essentially bribed (for lack of a better word) by Qualcomm to make their minimal WARM effort, which explains the custom Surface SG1 chip and 850/8cx chips."
You think MS was bribed by QC to build their product? That is NOT how the biggest company in the world operates (MS is, along with Apple/Amazon). How much do you think they got for this effort? That is not how MS allocates resources to projects, especially on the OS side, not to mention their brand status. As I said repeatedly, they will not be left behind in this transition (now that it is evidently starting on both the consumer and server side). If anything, THEY "bribed" QC to start working on some CPUs for them. They will need to be there with a working OS when ARM gains traction, and they will want to be there if Apple is successful and Arm starts licensing new competitive CPUs (which is the plan). Don't forget QC no longer works on custom CPU cores, so without Arm providing new designs it's hard for QC to deliver.
They are hedging their bets. If Apple makes it big (probably already, to be honest), you'll see much more development going into this from the major players: MS, QC, Arm, and my outside personal bet, Google.

All in all, I think the thread went to the SW/business side much more than intended for the CPU section of the forum. I'll be eager to see the results once the internet takes the new CPUs through the gauntlet and we see what comes out on the other side. Competition is good; I think we'll start to see AMD and Intel give us some more bang for our buck and reduce their margins, be it lower prices or more die area for the same price.
 

name99

Senior member
Sep 11, 2010
445
333
136
Apple's problem is stubbornly adhering to their annual cadence of releasing a new version of software, when the previous version was mostly just patched up with duct tape that summer (as in WWDC-announcement summer). Catalina is widely regarded as a pretty bad release, and now it's ARM prepping time, so pack it up and move forward. For iOS they probably have a solid audience of testers to sort through; for Macs, I doubt developers even get a chance to catch all the bugs before the next release triggers a new wave of conflicts. It's all fine and dandy for a simple app but a pain in the butt for complicated programs.

I think Apple have spread themselves too thin, don't have quite the talent they think they have, and work on deadlines where no one wants to be fired or blamed, so everything ships in whatever state it's in, regardless of function or feasibility, until it's cancelled or postponed. I don't know why they feel they need to release so often when Macs are generally neglected hardware-wise. I really think 18 or even 24 months, with a service pack in between for crucial bug/security fixes, would be so much better for the desktop platform. Microsoft seems to be going the same route with seasonal updates too.

I don't think that ("annual cadence of software") plays quite the role you think it does.
The tying to a specific timetable (September iPhones) is the real problem, IMHO. Allowing the release to shift three months or so, to really fix the last few serious issues, should be part of the release mentality.

The annual cadence, including releasing even though things feel unformed, is "necessary" if you want to move forward. It turns change into a constant low-level pain for everyone (devs, users, Apple) rather than turning it into something so large and terrifying that no-one wants to move forward in the slightest (mostly the MS and Linux experiences).

Someone who does not follow Apple closely (ie most people) has no idea how much Apple has been changing the OS guts of Darwin (including macOS) for the past four years or so. Obviously there's been a huge amount of security stuff, but look at the consequences of that
-- huge changes to permissions on macOS, changes to volume layout, volume sealing, notarization, moving drivers into user space, splitting more and more monolithic processes (things like in-process video decode) into separate processes, creating a hypervisor, ...

macOS is diverging more and more from traditional UNIX to something that has both a very different security architecture, with all that implies, and a very different structure that's more appropriate for a many-core world. All of this is not easy! But it's the sort of thing that needs to be done if you want to keep improving, if you have a vision of a better future.

Now there are plenty of people who don't have such a vision; they believe that Windows or Linux as it exists today is basically close to perfect, enough so that all that's required is some tinkering at the margin, and that the disruptions caused by more extensive changes are not worth the hassle. (cf the US as the one country in the world that's not willing to go through a few years of pain for the eventual benefit of being metric like the rest of the world.)

They're welcome to believe that; they're the users of those systems. But Apple (and Apple's ecosystem) do not see the world that way; and comparing Apple (why this constant OS churn, why keep making big changes?) to other OS's where the changes are smaller misses the point that they are doing different things.

As for why it's more disruptive on Mac than on iOS, I don't think that's incompetence or even less concern and less testing. The Mac problem is fundamentally harder. Apple wants to get the Mac to essentially the same point as iOS devices with respect to RAS, but (in spite of the never-ending claims of the more paranoid elements, ie 80% or so..., of the internet) they don't want to give up Mac capabilities to get there.
So iOS/iPadOS can simply have a very limited set of hardware connectivity, with no concern for backward compatible hardware or software, and few expectations. While macOS has to graft RAS onto a system that has expected (and continues to expect) that they can plug in a huge variety of printers, hard drives, HID devices, screens, etc; along with a variety of third party drivers, and that will all work (and continue to work going forward to ARM). This means large disruptive changes like creating all the code on the Apple side to allow drivers to run in user space, then the changes on the developer side to move their drivers over.

Compare say Linux. As far as I can tell Linux has been talking about user space drivers for years and years. And they have a few toy drivers (things like case LEDs) that demonstrate the concept. But the work of doing this seriously, moving many, performance critical drivers, into user space as far as I know remains undone, still just talk.

The same is true for security. Apple wants iOS levels of security for the Mac. But the Mac expectation has been that random third party code can do what it likes to the entire disk, not just its sandbox. Hence the phenomenon of ransomware...
Ideally what you'd like is that third party code can do what it likes to some small part of the file system that it's been given access to AND can modify the rest of user files WITH THE USER's EXPLICIT PERMISSION, but not otherwise. But that's a hard place to get to technically. Not just changes to the OS, but changes to developers ("be prepared that you ONLY write in your directory, and you use these APIs otherwise") and to users ("be prepared that it's OK when an app asks for this type of file permission in this context, but it's not OK in this other context"). Not easy. Messy.
But the end goal is that you'll still be able to run an app like, I don't know, maybe Xcode, that has good reasons to look at files anywhere in your home directory, but if any malware gets onto your Mac it will NOT be able to encrypt every file in your home directory and demand a ransom.
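The access model being described can be sketched in a few lines (all names here are hypothetical illustrations, not Apple's actual API): an app writes freely inside its own container, and touches anything else only through paths the user has explicitly granted.

```python
# Minimal sketch of container-plus-grants file access. Paths are treated
# purely lexically (PurePosixPath), so no real filesystem is consulted.
from pathlib import PurePosixPath

class SandboxPolicy:
    def __init__(self, container, user_grants=()):
        self.container = PurePosixPath(container)
        self.grants = [PurePosixPath(p) for p in user_grants]

    def _inside(self, path, root):
        # True if path equals root or lies anywhere under it.
        return path == root or root in path.parents

    def may_write(self, path):
        p = PurePosixPath(path)
        if self._inside(p, self.container):
            return True          # the app's own sandbox: always allowed
        # anything else needs an explicit user grant covering that path
        return any(self._inside(p, g) for g in self.grants)

policy = SandboxPolicy("/Users/me/Library/Containers/com.example.app",
                       user_grants=["/Users/me/Projects/demo"])
print(policy.may_write("/Users/me/Library/Containers/com.example.app/db.sqlite"))  # True
print(policy.may_write("/Users/me/Projects/demo/notes.txt"))                       # True: user granted
print(policy.may_write("/Users/me/Documents/taxes.pdf"))                           # False
```

Under this model, malware that tricks its way onto the machine can still trash its own container, but it cannot touch the rest of the home directory without a grant the user never gave it, which is exactly the ransomware case above.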

Every time you complain about this Apple churn, think of that as the motivating example -- what's allowing ransomware to encrypt your entire home directory, and how would you stop it?
Saying "well don't allow ransomware onto my machine" is a childish answer. Duh, of course that's a different prong of security, also important, also being worked on; but we want defense in depth.
Are standard UNIX permissions going to protect you? Clearly not. How about Microsoft's richer set of permissions and extended attributes? I know a lot less about them, but I don't think so.
 
Reactions: defferoo

name99

Senior member
Sep 11, 2010
445
333
136
They showed a virtualization demo using an ARM build of Debian; nothing to see here. I maintain that Rosetta 2 cannot be used for x86-64 virtualization, and even Apple themselves state this limitation in their developer documentation, so it's impossible for them to support x86 Windows VMs or x86 Linux VMs; on top of it all, no Boot Camp either ...

Rosetta 2's emulation model is exactly that of a static recompiler, which is probably why the technical demos they showed had somewhat acceptable performance, since they do a static analysis pass over x86 binaries ahead of runtime to generate the translated code.

Rosetta 2 not being able to handle x86 virtualization has massive implications: it might not work with executable binaries that use indirect branching. A dynamic analysis pass would be needed to handle that case, but it would come at a fairly severe performance penalty ...

Uh, duh! Which part of
"
Now how are they doing that? I have my hypotheses, but I'm sick of typing them up repeatedly, and having them ignored by people still living in the computing stone age. So I'll just leave you with that statement from Parallels.
"
and its implications did you not understand?

If you can't read between the lines, that's on you, not on me. But perhaps start by assuming that I know what I'm talking about here and that you can learn something by parsing what I said very carefully...
 

name99

Senior member
Sep 11, 2010
445
333
136
The more I think about it, the more I think that the thermals on Arm will get even more priority than raw performance.

This will bring back the fanless MacBook and/or introduce a fanless MacBook Air.
This will reduce the "TDP" of the MacBook Pro.
This will reduce the "TDP" of the iMac.
I think the Mac mini will retain the fan (and I assume the A12Z dev kit has a fan although I'm not sure), but this will reduce the "TDP" of the Mac mini.

I think the only machines where the thermals will not be as much of a concern are the iMac Pro (if they release it again) and the Mac Pro.

Cuz right now, the MacBook Air is too loud, the high end MacBook Pro is too loud, and the high end iMac is too loud. Ironically the iMac Pro under similar usage as an iMac is usually not too loud according to reports, but that's because it has a completely different cooling system than the regular iMac.

It seems to me the current iMac should be running 65 Watt TDP Intel chips as they are in the lower end models and not the 95 Watt chips they have in the higher end models. The 95 Watt chips should really be paired with higher end cooling solutions like in the iMac Pro, but I don't see Apple doing that due to cost.

(1) aTV 4K has a fan...

(2) Part of the concern with Mac devices (certainly the mini, even the laptops) is that you want to be able to deliver reasonable amounts of power to some number of USB ports (otherwise you get bitter complaints from people who want to charge 4 different devices from their Mac mini/MacBook Pro at the same time). This doesn't absolutely necessitate a fan, but it does necessitate a somewhat larger volume and larger components everywhere than is required as you move down through aTV to iPad to iPhone.

So I'm not so much contradicting you as suggesting that the cutbacks (in size, in cooling) cannot be as extreme as some people are suggesting, not unless you're willing to cut back on USB power delivery. And while Apple may do that for the MacBook Vacuum (or whatever they want to call the thinnest, one-USB-port model they'll doubtless ship again at some point), there's a substantial user base for whom lots of ports is part of the point.

And once you have specced all that cooling (active or passive) for power delivery, you might as well juice the SoC enough to be able to utilize it under conditions when no power delivery is happening...

I agree that the current laptops are (way) too loud (iMac I don't know); and that that will be fixed. But I would not go so far as to suggest no fan, rather that a quieter fan could be used along with better communication between fan, SoC, and OS. Has anyone ever complained about the fan in the aTV?
 
Last edited:

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
If you can't read between the lines, that's on you, not on me. But perhaps start by assuming that I know what I'm talking about here and that you can learn something by parsing what I said very carefully...

Then how about you start demonstrating this knowledge without being so condescending? Feeling a little defensive, are we, after being corrected?
 

Doug S

Platinum Member
Feb 8, 2020
2,483
4,041
136
Apple also didn't design most of their critical hardware components for the original iPhone either. Their CPU/memory were supplied by Samsung and their graphics processor came from Imagination Technologies ...


Well of course not. Do you think it would have made sense for them to wait years to release it while they designed their own CPU, GPU, etc., and take the risk that someone beats them to it (though since Android was developing a BlackBerry clone until Apple showed the iPhone, at which point they did a 180, maybe Apple wouldn't have had to worry too much about this), as well as the risk that all that investment is for nothing if the product doesn't succeed?

It takes years to ramp up to where you are able to do these things, even if you buy a company that's ready to do it - Apple bought PA Semi in 2008 but it took four years before they shipped the first iPhones with a custom ARM core. If Apple decided today "we're going to start designing our own cameras so we can get some value add there" it would be years before we saw the first Apple designed camera in an iPhone.

You build these vertical chains over long periods of time in a product as complex as a smartphone, you can't make everything needed on day one, or afford to wait until you can.


As for having no software at the start, that's not true, because there was once a time when iOS provided lots of value to customers. But as Apple has started to obsess over hardware, their software has degraded in quality over the years. Now iOS is in a constantly broken state, and what's more, after every update tons of programs from the App Store keep breaking!


Maybe this is news to you, but as software grows bigger and more complex, the number of bugs in it increases. Look at the terrible quality of each Windows 10 update; Microsoft can only dream about reaching an iOS level of quality on release. Even Linux is not immune, despite the "many eyes" of open source. Each new kernel version gets released, and then there are multiple point releases to fix bugs - the LTS kernels are still fixing bugs years later. Linux distributions add their own cycle of testing and fixing, which can be pretty lengthy for enterprise products like RHEL, and they are still fixing bugs years later.

Please point to somewhere, ANYWHERE, that is doing complex software on the scale of a modern iOS, Windows, or Linux that produces solid, stable, bug free code. I'll wait.
 

Doug S

Platinum Member
Feb 8, 2020
2,483
4,041
136
They showed a virtualization demo using an ARM build of Debian, nothing to see here. I maintain the case that Rosetta 2 cannot be used for x86-64 virtualization and even Apple themselves state this limitation in their developer documentation so it's impossible for them to support x86 Windows VMs or x86 Linux VMs but on top of it all no Boot Camp as well ...


Rosetta 2 is supported for user mode code only, so of course it can't run an x86 hypervisor. But why should it need to? If you want to run x86 Windows code you will run an ARM64 hypervisor running Windows/ARM - which supports running x86 Windows applications.

I don't know why you think there's this huge army of Mac owners who buy them primarily to run Windows - and why you believe that not only is this the case, but that you are the only one who knows it, with even Apple somehow in the dark.
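As an aside on the "user mode code only" point: Apple's developer documentation on the Rosetta translation environment describes a `sysctl.proc_translated` flag that a running process can query to learn whether it is being translated - which only makes sense for user-mode code. A minimal sketch (the key exists only on Apple Silicon macOS; everywhere else the query simply fails and we report "unknown"):

```python
import subprocess

def rosetta_status():
    """Return 1 if this process runs under Rosetta 2 translation,
    0 if running natively, or None where the sysctl key doesn't
    exist (Intel Macs, Linux, etc.)."""
    try:
        out = subprocess.run(
            ["sysctl", "-n", "sysctl.proc_translated"],
            capture_output=True, text=True, check=True,
        )
        return int(out.stdout.strip())
    except (OSError, subprocess.CalledProcessError, ValueError):
        return None

print(rosetta_status())
```

On an Apple Silicon Mac this prints 1 under Rosetta and 0 natively; on any other platform it prints None rather than erroring out.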
 

soresu

Platinum Member
Dec 19, 2014
2,956
2,175
136
Again, this is your view of things, which is fine - but the iPhone came into a mature market of a billion devices sold a year with entrenched leaders, and we know how that ended.
iPhone as a single product does occupy the top sales position for smartphones yes - but when you account for the platform it runs on Android overtakes iOS device sales significantly.

Like by about 3x as much.

Not so bad considering that Android devices run woefully inferior CPU cores, eh?

Almost like the market hit a saturation point where performance was simply good enough and price started to become the problem for further market expansion?

(Hint hint, there is a reason that Apple have a cheaper iPhone model)
You think MS was bribed by QC to build their product? That is NOT how the biggest company in the world operates
It is when their single largest hardware partner (ie Intel) is in direct opposition to the very concept of WARM.

I've been using Windows since 3.1 and DOS before that; trust me, they operate in whatever way makes them money as fast as they can make it - if they think they can play Qualcomm and Intel off each other to get the best bribe/contra-revenue deal, then they will.

The very fact that they have finally 'embraced' Linux, and to an extent open source, implies that they are fighting to remain relevant in a world dominated by mobile OSes they found themselves unable to compete in. In the absence of assured dominance, they will play the market and profit margins any way they can to keep shareholders happy and content.
If anything, THEY "bribed" QC to start working on some CPUs for them
Considering the lackluster effort of WARM's implementation this seems somewhat unlikely.

Which seems more likely: that they bribed Qualcomm and then barely bothered to make an effort on WARM, which only discourages people from buying into the platform?

OR - that Qualcomm bribed them to open up a new market to sell their SoCs, since they are seemingly getting nowhere fast with ARM-based Chromebooks using their SD SoCs, despite previously demonstrated parity of A72 SoCs with contemporary Intel chips in CB performance?

Qualcomm has more to gain from WARM being a success - but MS has more to gain by using the mere existence of a barebones WARM as a huge bargaining chip to haggle a contra-revenue deal out of Intel, something we have already witnessed Intel doing in the failed x86 smartphone market, and very likely in the Chromebook market too, considering ARM seems to have died a horrible death there.
and my outside personal bet - Google.
That's not a bet, it's guaranteed with their 'Whitechapel' chip designed at Samsung, which more than likely uses RDNA IP as Samsung's future Exynos does.
if all they did was keep doing more of the same with Arm CPU instead of an Intel, then i would say you are right and they failed to disrupt anything.
I feel like I must drive a point home, because your mind seems addled with heroic idealism where Apple is concerned.

They are not trying to disrupt or do anything amazing; they are just trying to keep people interested in their products, and to increase their profits through vertical integration of home-grown/semi-home-grown hardware IP versus contracted chips. This also gives them a greater degree of control over their own product stack, which is already pretty tight in mobile - they are just bringing their pro, desktop, and laptop divisions into parity with mobile.

These are all things a company would do to increase profits and diminish reliance on outside factors, such as contractors' implementation foibles (ie Intel's 10nm).
 
Last edited:
Reactions: Tlh97

Eug

Lifer
Mar 11, 2000
23,752
1,284
126
(1) aTV 4K has a fan...

(2) Part of the concern with Mac devices (certainly the mini, even laptops) is that you want to be able to deliver reasonable amounts of power to some number of USB ports (otherwise you get bitter complaints from people who want to charge 4 different devices from their Mac mini/MacBook Pro at the same time). This doesn't absolutely necessitate a fan, but it does necessitate a certain degree of larger volume and larger components everywhere than is required as you move down through aTV to iPad to iPhone.

So I'm not so much contradicting you as suggesting that the cutbacks (in size, in cooling) cannot be as extreme as some people are suggesting, not unless you're willing to cut back on USB power delivery. And while Apple may do that for the MacBook Vacuum (or whatever they want to call the thinnest, one-USB-port model they'll doubtless ship again at some point), there's a substantial user base for whom lotsa ports is part of the point.

And once you have specced all that cooling (active or passive) for power delivery, you might as well juice the SoC enough to be able to utilize it under conditions when no power delivery is happening...

I agree that the current laptops are (way) too loud (the iMac I don't know about), and that this will be fixed. But I would not go so far as to suggest no fan; rather, a quieter fan could be used along with better communication between fan, SoC, and OS. Has anyone ever complained about the fan in the aTV?
I agree with you actually.

I wasn't suggesting fanless MacBook Pros. I was suggesting fanless MacBooks and MacBook Airs, but quieter MacBook Pros and iMacs. Performance-wise, the fanless machines could theoretically just run A14, or if that is not palatable from a marketing perspective, perhaps it could be some sort of A14C chip that is more than A14 but less than A14X. The MacBook Pros would run A14X and more powerful chips, but not at the power levels used by chips such as Intel's Core i9-9980HK.

With regard to the iMacs, besides taking my word for it, people can check online reviews that have compared the fan noise behaviour of the iMac vs the iMac Pro. With 4K editing and export in Final Cut, the i7-and-up 95-watt iMacs will often ramp up to full vacuum-cleaner mode in short order, but that doesn't happen with the iMac Pro, despite the iMac Pro having a 140 W TDP CPU. It doesn't happen with the 65-watt i5 iMacs either - and it should be noted that the 65-watt iMacs have the exact same cooling system as the 95-watt ones.

To test this out in a more objective manner, a few of us ran a bunch of simple tests transcoding the same video in Handbrake using the same software video encode settings, on different iMacs of that era (2017). The Core i7-7700K (91 W) would transcode the video in 10 minutes, but would ramp the fan to maximum at the 30 second mark. The Core i5-7600 (non-K, 65 W) would take 12.5 minutes for the same job but would be silent to quiet, gradually increasing fan noise until about the 9.5 minute mark, and then would be full blast after that. This makes all the difference in the world IMO. Consumers can export that short kid's birthday video on the i5 silently in 2.5 minutes, while those with the i7 would have the fan screaming blue murder for 1.5 minutes out of the 2 minutes it would take to do the same job.
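As a back-of-the-envelope check on those numbers (every figure below is taken from the test described above, not a new measurement), the i7 finishes ~20% faster but spends nearly the whole job at full fan, while the i5 only hits full blast in the final quarter of a long job and never ramps at all on a short one:

```python
# Figures from the 2017 iMac Handbrake comparison above (minutes).
I7_JOB, I7_FAN_MAX_AT = 10.0, 0.5    # i7-7700K: fan at max after ~30 s
I5_JOB, I5_FAN_MAX_AT = 12.5, 9.5    # i5-7600: full blast only near the end

def loud_fraction(job_minutes, fan_max_at):
    """Fraction of a job spent at maximum fan speed."""
    return max(0.0, job_minutes - fan_max_at) / job_minutes

print(f"i7 loud fraction: {loud_fraction(I7_JOB, I7_FAN_MAX_AT):.0%}")  # 95%
print(f"i5 loud fraction: {loud_fraction(I5_JOB, I5_FAN_MAX_AT):.0%}")  # 24%

# The short 'birthday video' case: ~2 min on the i7, ~2.5 min on the i5.
print(f"i7 short job, loud minutes: {max(0.0, 2.0 - I7_FAN_MAX_AT)}")   # 1.5
print(f"i5 short job, loud minutes: {max(0.0, 2.5 - I5_FAN_MAX_AT)}")   # 0.0
```

Which matches the conclusion: the i5 does the short export in silence, while the i7's fan screams for 1.5 of its 2 minutes.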
 
Last edited:
Reactions: IntelCeleron

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
Well of course not - do you think it would make sense for them to wait years to release it while they designed their own CPU, GPU, etc., and risk someone beating them to it (though since Android was a BlackBerry clone in development until Apple showed the iPhone, at which point Google did a 180, maybe Apple wouldn't have had to worry too much about that), as well as risk that all that investment would be for nothing if the product didn't succeed?

It takes years to ramp up to where you are able to do these things, even if you buy a company that's ready to do it - Apple bought PA Semi in 2008 but it took four years before they shipped the first iPhones with a custom ARM core. If Apple decided today "we're going to start designing our own cameras so we can get some value add there" it would be years before we saw the first Apple designed camera in an iPhone.

You build these vertical chains over long periods of time in a product as complex as a smartphone, you can't make everything needed on day one, or afford to wait until you can.

For a software company it wouldn't, but that just highlights why Apple changed to being a hardware company ...

Maybe this is news to you, but as software grows bigger and more complex, the number of bugs in it increases. Look at the terrible quality of each Windows 10 update; Microsoft can only dream of reaching an iOS level of quality on release. Even Linux is not immune, despite the "many eyes" of open source. Each new kernel version gets released, and then there are multiple point releases to fix bugs - the LTS kernels are still fixing bugs years later. Linux distributions add their own cycle of testing and fixing on top, which can be pretty lengthy for enterprise products like RHEL, and they too are still fixing bugs years later.

Please point to somewhere, ANYWHERE, that is doing complex software on the scale of a modern iOS, Windows, or Linux that produces solid, stable, bug free code. I'll wait.

Windows 10 might have issues, but at least it doesn't keep breaking apps the way iOS or macOS do ...

Windows 10 would be a gold standard in software quality compared to what Apple has been pumping out lately ...
 

Doug S

Platinum Member
Feb 8, 2020
2,483
4,041
136
For a software company it wouldn't, but that just highlights why Apple changed to being a hardware company ...



Windows 10 might have issues, but at least it doesn't keep breaking apps the way iOS or macOS do ...

Windows 10 would be a gold standard in software quality compared to what Apple has been pumping out lately ...


You live in an alternate reality if you think Windows 10 is higher quality than iOS. iOS quality is not great, and it has gone downhill from back when Steve Jobs was around, but iOS is also far more complex now. Windows 10, on the other hand, has been an utter disaster, especially the last couple of releases. For a time, in the Windows 7 days, it seemed like Microsoft had things figured out, but ever since they started messing with the UI it has become more and more fragile. It is almost as if they are nostalgic for Windows ME.

Apple has always been a hardware company. When do you think they were a 'software company'? During that period in the 90s when they were licensing Mac OS to third parties? Everyone agrees those were the worst days Apple has ever seen, if things had got any worse for them they wouldn't have survived to make it to 2000, let alone become what they are today.
 

Doug S

Platinum Member
Feb 8, 2020
2,483
4,041
136
iPhone as a single product does occupy the top sales position for smartphones yes - but when you account for the platform it runs on Android overtakes iOS device sales significantly.

Like by about 3x as much.


And this matters why? Apple may have a much lower market share than Android, but they grow their installed base by around 100 million phones a year, meaning their share of the installed base keeps growing even though their market share has stayed pretty stagnant for the past five years - iPhones have a longer useful life because they get updated for ~5 years or more.

Which is why developers continue to make more money from iOS apps than Android apps despite Android's commanding lead in market share.
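The installed-base point can be made with a toy steady-state model (the sales and lifetime numbers below are illustrative assumptions, not Apple's actual figures): if a vendor ships S devices a year and each stays in use for L years, the installed base converges to roughly S × L, so a longer useful life inflates the base even at flat annual market share.

```python
def installed_base(annual_sales, lifetime_years, years=30):
    """Simulate an installed base where every device is retired
    after `lifetime_years`; returns the base after `years` years."""
    cohorts = []
    for _ in range(years):
        cohorts.append(annual_sales)        # this year's shipments
        cohorts = cohorts[-lifetime_years:] # drop cohorts past end of life
    return sum(cohorts)

# Illustrative: 200M units/yr kept in use 5 years vs 3 years.
print(installed_base(200, 5))  # -> 1000 (million devices)
print(installed_base(200, 3))  # -> 600 (million devices)
```

Same yearly sales, two-thirds larger installed base - which is the mechanism behind "share of the installed base keeps growing despite stagnant market share."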
 

defferoo

Member
Sep 28, 2015
52
51
91
For a software company it wouldn't, but that just highlights why Apple changed to being a hardware company ...
Since you obviously don't know anything about Apple, I have some news for you: they have been a hardware company since 1976. They may not have designed their own chips, but they're a hardware company through and through. I don't have insight into the inner workings of Apple, but my guess is that the majority of their engineers actually write software rather than design hardware. This mythical change you talk about doesn't exist; stop trying to make it a thing.

Windows 10 might have issues but at least it doesn't keep breaking so many apps unlike iOS or macOS does ...

Windows 10 would be a gold standard in software quality compared to what Apple has been pumping out lately ...
What bugs are you referring to exactly? I use iOS and macOS for 10+ hours a day and haven’t noticed anything obvious. You spew all this BS and don’t back it up with concrete examples. I can do the same thing.

macOS would be a gold standard in software quality compared to what Microsoft has been pumping out lately ...

what now?
 
Reactions: scannall

peanutbridges

Junior Member
Jul 5, 2020
3
5
36
To test this out in a more objective manner, a few of us ran a bunch of simple tests transcoding the same video in Handbrake using the same software video encode settings, on different iMacs of that era (2017). The Core i7-7700K (91 W) would transcode the video in 10 minutes, but would ramp the fan to maximum at the 30 second mark. The Core i5-7600 (non-K, 65 W) would take 12.5 minutes for the same job but would be silent to quiet, gradually increasing fan noise until about the 9.5 minute mark, and then would be full blast after that. This makes all the difference in the world IMO. Consumers can export that short kid's birthday video on the i5 silently in 2.5 minutes, while those with the i7 would have the fan screaming blue murder for 1.5 minutes out of the 2 minutes it would take to do the same job.

The thing I see is that a lot of users tend to "max out" their Macs as if it were their big-time splurge purchase, much like a new car. Me, I wouldn't mind an i5 over, say, an i7, but the price premium Macs command and the relative inability to upgrade later on mean you have to consider the next step up. If I found a used or refurbished machine with a good overall setup at what I considered a good price, sure; with brand-new retail stock, I'm a lot more wary.

I suppose with Apple Silicon they'll curb this issue, because I doubt they'll have a bunch of core variants - probably just the same SoC for iPads/MacBooks/13-inch Pro/small iMacs and a more performant one for bigger iMacs/15-inch Pros. Lack of options makes it a lot easier to decide, but soldered-on components pose the same issue of requiring more upfront cost that may or may not be needed. Maybe if they just went with 16GB of RAM across the board and called it done, this would only matter to 32GB users instead of fence-sitters.
 

Doug S

Platinum Member
Feb 8, 2020
2,483
4,041
136
The thing I see is that a lot of users tend to "max out" their Macs as if it were their big-time splurge purchase, much like a new car. Me, I wouldn't mind an i5 over, say, an i7, but the price premium Macs command and the relative inability to upgrade later on mean you have to consider the next step up. If I found a used or refurbished machine with a good overall setup at what I considered a good price, sure; with brand-new retail stock, I'm a lot more wary.

I suppose with Apple Silicon they'll curb this issue, because I doubt they'll have a bunch of core variants - probably just the same SoC for iPads/MacBooks/13-inch Pro/small iMacs and a more performant one for bigger iMacs/15-inch Pros. Lack of options makes it a lot easier to decide, but soldered-on components pose the same issue of requiring more upfront cost that may or may not be needed. Maybe if they just went with 16GB of RAM across the board and called it done, this would only matter to 32GB users instead of fence-sitters.


I still expect they will sell different performance levels. Even if they offer (for example) an 8-core MacBook Pro, they might also offer it in 6- and 4-core variants (which would probably all have 8 cores on the die; they'd just disable some on the lower-end models). They will offer different RAM configs too; there won't be one size fits all. But not being able to upgrade later would be annoying to me - I usually give my laptops a "mid-life kicker" after a couple of years, when the bigger RAM modules become more reasonably priced.

Measured as a percentage of overall PC sales hardly anyone upgrades CPUs after purchase so the lack of upgradeability isn't really an issue. The Anandtech forum readership isn't average consumers. Hardly any laptops allow CPU upgrades but no one seems to have an issue with that.
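The "disable some cores" approach above is standard salvage binning, and a toy yield model shows why it pays (the 5% per-core defect rate is an assumed illustrative number, not a real foundry figure): dies with one or two bad cores are common enough that selling them as 6-core parts recovers a big slice of silicon that would otherwise be scrapped.

```python
from math import comb

def core_yield(cores, good, p_defect):
    """Probability a die has exactly `good` working cores out of
    `cores`, assuming independent per-core defects (binomial model)."""
    bad = cores - good
    return comb(cores, good) * (1 - p_defect) ** good * p_defect ** bad

P = 0.05  # assumed: 5% chance any single core is defective

full    = core_yield(8, 8, P)                        # sellable as 8-core
salvage = core_yield(8, 7, P) + core_yield(8, 6, P)  # sellable as 6-core

print(f"perfect 8-core dies:   {full:.1%}")
print(f"salvageable as 6-core: {salvage:.1%}")
```

Under these assumptions only about two-thirds of dies are perfect, while roughly another third can ship as 6-core parts, so binning turns most "bad" dies into sellable product.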
 