Discussion Apple Silicon SoC thread


Eug

Lifer
Mar 11, 2000
23,751
1,283
126
M1
5 nm
Unified memory architecture - LPDDR4X
16 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 12 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache
(Apple claims the 4 high-efficiency cores alone perform like a dual-core Intel MacBook Air)

8-core iGPU (but there is a 7-core variant, likely with one inactive core)
128 execution units
Up to 24576 concurrent threads
2.6 Teraflops
82 Gigatexels/s
41 Gigapixels/s

16-core neural engine
Secure Enclave
USB 4

Products:
$999 ($899 edu) 13" MacBook Air (fanless) - 18 hour video playback battery life
$699 Mac mini (with fan)
$1299 ($1199 edu) 13" MacBook Pro (with fan) - 20 hour video playback battery life

Memory options 8 GB and 16 GB. No 32 GB option (unless you go Intel).

It should be noted that the M1 chip in these three Macs is the same (aside from GPU core count). Basically, Apple is taking the same approach with these chips as it does with the iPhones and iPads: just one SKU (excluding the X variants), which is the same across all iDevices (aside from maybe slight clock speed differences occasionally).

EDIT:



M1 Pro 8-core CPU (6+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 16-core GPU
M1 Max 10-core CPU (8+2), 24-core GPU
M1 Max 10-core CPU (8+2), 32-core GPU

M1 Pro and M1 Max discussion here:


M1 Ultra discussion here:


M2 discussion here:


Second Generation 5 nm
Unified memory architecture - LPDDR5, up to 24 GB and 100 GB/s
20 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 16 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache

10-core iGPU (but there is an 8-core variant)
3.6 Teraflops

16-core neural engine
Secure Enclave
USB 4

Hardware acceleration for 8K H.264, HEVC (H.265), and ProRes

M3 Family discussion here:


M4 Family discussion here:

 
Last edited:

FlameTail

Diamond Member
Dec 15, 2021
3,148
1,796
106
On that note, next year for the A18/M4, substantial IPC gains are necessary.

If the A18/M4 uses the N3E node (which it most likely will), they are gonna run into a wall, because the performance gain from N3B to N3E is only 2-5%. They can't get significant performance uplifts anymore just by increasing the frequency.

 

GC2:CS

Member
Jul 6, 2018
27
19
81
A standard 60 Hz display has a frame time of 16.67 ms, so its G2G response time should be no longer than that.

Take, let's say, the Gigabyte G24F, a 165 Hz panel considered one of the best budget gaming monitors: its typical response time is 4.1 ms, which is actually good enough to deliver 240 Hz, even though the panel itself only refreshes at 165 Hz.

So, what do you think the G2G response time of the Super Hyper Retina XDR Mobile god of displays is, given that it has to deliver 120 Hz?

21 ms on the 14-inch, 22 ms on the 16-inch MacBook Pro. That's a good enough response time for a 45 Hz display, but this display has a 120 Hz refresh rate.

It's absolutely atrocious. It's not good enough even for a 60 Hz refresh rate, which results in absolutely terrible smearing everywhere, even at the OS level.


From 3:20.
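A quick way to see the mismatch is to convert those G2G figures into the highest refresh rate a pixel transition can actually keep up with (frame time >= response time). A minimal sketch in Python, using the numbers quoted above:

```python
def max_clean_refresh_hz(g2g_ms):
    """Highest refresh rate at which a pixel finishes its transition
    within a single frame time (frame_time >= G2G response time)."""
    return 1000.0 / g2g_ms

for name, g2g in [("60 Hz frame time", 16.67),
                  ("Gigabyte G24F", 4.1),
                  ("14-inch MacBook Pro", 21.0),
                  ("16-inch MacBook Pro", 22.0)]:
    print(f"{name}: {g2g} ms G2G -> keeps up with ~{max_clean_refresh_hz(g2g):.0f} Hz")
# 4.1 ms keeps up with ~244 Hz, while 21-22 ms only keeps up with ~45-48 Hz,
# far short of the panel's 120 Hz refresh rate -- hence frames overlap and smear.
```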

Apple LCDs have always had quite long response times. For example, older iPads had 160 ms. But this might be partly due to the way the display transitions between states, or to the measurement methods.

With ProMotion they went much lower on response times.

If you want good gaming, OLED displays are coming everywhere in a year or two.

I do have very sensitive eyes and never thought that the ghosting was terrible. In fact I prefer the LCD iPad ProMotion to the iPhone OLED one. It looks smoother to me.

On that note, next year for the A18/M4, substantial IPC gains are necessary.

If the A18/M4 uses the N3E node (which it most likely will), they are gonna run into a wall, because the performance gain from N3B to N3E is only 2-5%. They can't get significant performance uplifts anymore just by increasing the frequency.

View attachment 88514
Why couldn't they increase the clock?
I think we already know what is coming up:

M1 3.23 GHz -> A15 3.24 GHz
M2 3.47 GHz -> A16 3.46 GHz
M2 Max 3.69 GHz -> A17 3.68 GHz
M3 4.05 GHz -> A18 4.05 GHz

This suggests Apple tests its next-gen clock settings in past-gen M chips, which have a higher power budget. I would also look for a higher-clocked M3 variant. If there is one, say at 4.3 GHz, it means two things: the A19 will run at 4.3 GHz, and the M5 will be based on the A19, i.e. we will see a generation skip like the one from M2 to M3.

So now the bets are on whether the A18 goes to 4 GHz or not. I would put my “4Ghz flame flame speed” sticker on such a phone, like in the old days.
 
Sep 18, 2023
26
13
41
Why couldn't they increase the clock?
I think we already know what is coming up:

M1 3.23 GHz -> A15 3.24 GHz
M2 3.47 GHz -> A16 3.46 GHz
M2 Max 3.69 GHz -> A17 3.68 GHz
M3 4.05 GHz -> A18 4.05 GHz

So now the bets are on whether the A18 goes to 4 GHz or not. I would put my “4Ghz flame flame speed” sticker on such a phone, like in the old days.

It is concerning that the newer process nodes don't seem to be scaling efficiently. Relying on faster clocks leads to higher temperatures and higher power draw. The M2 fans were noisier than M1, and the M3 is also getting quite warm. If this trend continues, will it result in chips with excessive power consumption and temperatures, similar to Intel's? In addition to process node improvements, we need to focus on improving IPC (instructions per cycle) to make this strategy sustainable.
 
Reactions: Tlh97 and FlameTail
Mar 11, 2004
23,173
5,639
146
Hmm, he didn't make any subjective comments about it being blurry or smearing. Have you used one in person?

I'm wondering if it's not a function of some animation smoothing or other behavior. Maybe dynamic refresh rate handling (what's the range for the Macs? I think iPhones can do 1-120 Hz?).

At least phones are being used outside under bright sunlight. Even there we are going overboard. I had no issues using a Galaxy Note 4 back in 2014 in bright sunlight. I am not certain we need 3000 nits like what we are seeing today. At this point these races are only for marketing reasons. Samsung goes for the megapixel race, other companies go for the most cameras or some other metric to market. But from a usability standpoint there is hardly anything dramatic.

Laptops these days, even the cheapo Windows/Chromebook ones, are much better than what we used to get 10 years ago. At this point we get only incremental value spending thousands of dollars more. I can understand that users of specific applications/gamers might want certain features that make them OK with paying more. Otherwise it's all a status symbol.

Yeah, it's very dependent on usage. For a laptop I think 500 nits should be the target. Trying to offer 2000 nits and stuff like that is just getting silly/stupid and is turning into yet another marketing spec race.

Hmm... This is probably futile, but I'll provide some contextual information anyway.

Typically content creation oriented professional monitors do not have super high refresh rates.

For example, this is the BenQ SW321C 31.5" 4K monitor that goes for ~US$2000. It's 60 Hz.


Creative Bloq gives it a 4.5 out of 5 stars in their review, commenting that its $2000 price is comparatively low for this category of monitor.


The thing is those are aimed at like photo editing, not video. They can be used for video but the focus there would be the color reproduction not say motion handling. Plus I'm not sure that the difference in motion handling between different technologies is even considered at all whatsoever in any video production and/or mastering right now. I think it used to somewhat when CRTs were the dominant display tech (and even then I think it was more say how the scanlines affected motion), but I'm not aware of any that consider it at this point. OLEDs do better motion handling even at lower refresh rates than LCD. And projectors use a variety (DLP, LCD, then light sources). And these VR/AR headsets use a variety, and motion handling is even more important there. But then very little content is produced targeting such yet.

There is some improvement in perf/watt but not much. Keep in mind the M2 Max is on N5P, so once again the perf/watt of N3B looks roughly equivalent to that of N4P.

I'm curious how sustained performance compares. Will the M3 chips use enough power that we might end up seeing more throttling than before? At some point Apple is going to have to deal with that and develop better cooling. I feel like that makes more sense for segmentation and should be a metric that is easier to compare (maybe we need a standard measurement that's like horsepower, where it's measuring work performed over time; I'm pretty sure there already is one that would be applicable for the energy usage, like perf/W, but I'm not sure it could comprehensively show the mix of heat output or thermal limit as well; I'm probably overthinking it though). And they should simplify their lineup while adding a separate system for, say, storage: have 256GB of base integrated fast NAND for the OS, but then have a separate slot for adding more. Each tier up doubles the standard NAND (so Pro starts at 512GB, Max 1TB, Ultra 2TB; RAM can probably be done similarly).

Higher nits equal better highlights for a more impactful HDR experience. Check rtings reviews of OLEDs. They rarely go above 600 nits for full screen brightness.

Yes, it's about highlights (although frankly I personally don't want these blown-out HDR highlights; if I wanted super realistic brightness I wouldn't be watching TV, and HDR blowouts are gonna be so obnoxious when it's your whole vision, like in AR/VR). That's a function of power though, and you would not want to be pumping out 500+ nits whole-screen on an OLED, both because it'll be more likely to cause issues with the pixels and because it would be using so much power and putting out a ton of heat (that most TVs are not set up to handle, increasing potential failures).

I think it's a fool's errand to live on someone else's promises or fantasies.

Best to wait for testable silicon to arrive and proceed from that point.

I am not too doubtful of Qualcomm's claims, but we'll see how it does in actual devices. Apple has shown it's possible to offer such performance, even sustained, in platforms that don't have much in the way of cooling design. With modern PC designs (not even taking into account future ones like those piezo-electric “fans” and the like), they should be able to manage the load these chips offer without too much hassle. But the thing to keep in mind is this is more about elevating the base (or in the PC space more like the mid-range, since the base is super cheap stuff that is either years old, like how AMD still has chips with Zen 2 cores, or much more limited, with 2 cores, a weak GPU and little if any co-processors). Which is great and will improve things for most people.

I'm not "defending this". This screen would be better with a lower response time.
I'm responding to the comment that this screen is "atrocious" and that Apple should never charge this amount of money because of this screen. That's different.

If you had just said that this screen has a poor response time, I wouldn't have replied. But your comment implied that it was a shitty screen overall and that users were dumb to use it. While the reality is that this screen is best in class in several other metrics and that motion blur is not a problem for most people.
I personally have zero issue with blur while scrolling. I don't stare at an image during scrolling, I don't read text during scrolling, and even if I did, the blur is minimal and doesn't impact readability.
If motion blur were such an issue for everyone, the so-called "soap opera effect" wouldn't be a problem, every filmmaker would be using high shutter speeds (like in the first scene of Saving Private Ryan) or they would all be shooting at 48 fps, and full-screen motion blur would not be a setting in video games.
The truth is, motion blur is not an issue for most people.

Again, not saying that this screen wouldn't be better with a lower response time, just that most people don't care and value other metrics like color fidelity, contrast, resolution and brightness.

Yeah different people prioritize different things. Reminds me of someone saying some display in a Lenovo laptop was total garbage because of PWM or something, including them linking a review that they acted like supported their claim, except the review basically said the display was exceptional, and just remarked that it had a fairly low PWM strobing or something which some people are sensitive to but that while it was somewhat low, it still shouldn't affect too many people much. And the person was going on a tirade acting like the display was literally unusable like it would turn you blind or something.

This is the first time I've seen a complaint about the response time on Apple's stuff. Some of that is probably due to Apple talking about gaming more, so maybe more people are looking to game, or people used to gaming-focused performance are trying Apple stuff and noticing the discrepancy? But I have a hunch there's some animation smoothing or other processing going on that is impacting things (although perhaps that's already been ruled out?).

I have seen several reviews that commented on watching video on Apple's stuff, and they didn't seem to remark on that aspect, though I know people have different thresholds, and many tech reviewers are lacking in hardcore expertise on some of that stuff. And it's less pronounced than in gaming.

It's easy to forget that Apple isn't a CPU manufacturer/vendor and doesn't really care about spec wars.

The Apple Silicon team produces silicon for the Apple product roadmap and to satisfy the needs of future Apple products.

The fact that Apple has occasionally pulled far enough ahead to spank Intel and AMD - at their reduced energy envelope - is more side-effect than goal.

When you combine the CPU speed with their video toolkit you get video editing/rendering/transcoding which makes their machines pretty much ideal for that task, especially when you consider they can do it on a center table in Starbucks or at 30,000 feet.

I think Apple simply made a chip that targeted a common denominator (I wouldn't say lowest, but more like the median, where most people could use a decent amount of performance but many don't need something excessively powerful or all the features that PC platforms offer, and many would definitely prioritize efficiency, aka battery life), so there was a lot of pretty easy low-hanging fruit. They could adapt their mobile processor design since it was strong enough (the biggest issue was simply overcoming the software situation, but they had already done something similar in the PowerPC to x86 change, and they had the benefit of the ARM development done for iOS). And Qualcomm is basically doing the same thing.

Intel and AMD could've been offering x86 designs that would've made both less interesting, but neither had even been attempting to really do that in that space (the closest was probably Intel with that weird Vega chip, or I think there was one that mixed Core with Atom), for two simple reasons: their designs were targeting a wide range of applications (desktop, enthusiast/pro use, and server/enterprise got more focus) and their method of segmenting was basically core counts, and because of the lack of competition they let natural progression handle things for people that, say, wanted better efficiency and/or GPU performance.

I'd been saying for years that I wanted the baseline processors to be better balanced. Basically designing a chip that was balanced across the board. Which compared to the APU designs AMD and Intel were offering would've meant simplifying things with either adopting a BIG.little setup or cores tuned for the application more (seemingly what the ZenXc cores are doing), stronger GPUs with more memory bandwidth to support them, and then some fast NAND. My fix for it would've been 16GB of HBM serving as system memory (where it would offer like 256GB/s bandwidth, with NAND basically RAM channels although I think that'd be unnecessary with the PCIe spec these days, but offering several GB/s NAND bandwidth). Apple pretty much did that, and it didn't take that much work, it just took someone simply doing that.

Yeah Apple will leverage OLEDs to fix the garbage panel issue later on.

I'm surprised we haven't seen more attempts at a simple intermediary OLED layer. Just a simple on/off layer of white OLED pixels. It could even be 1/4 the resolution of the display (and it'd still offer more "zones" than any mini-LED does). They wouldn't need to be super bright, just basically activate to provide some low-level light for areas that are dark but not totally dark, and then have an edge-lit or other fairly simple LED backlight which offers the higher brightness, with the OLED layer turning off, or maybe having some gradient of on/off, to allow light blocking and better handling.

It could also potentially do something to offer better output in say outside/sunlight conditions, by offering a transparent display or something where it can pass sunlight through the back or maybe a reflective aspect (kinda like e-ink). So that if you're in a bright area, the light actually boosts the display (without needing that from the display itself) but then in dark situations you get the benefits of OLED.

I know there was a TV that did that, and there's been other prototypes that used dual OLED layers as a means of offering higher brightness and/or compensating for color (maybe even an attempt at dealing with the RGBW pixel setup).

Just an indicator that Apple has been stockpiling these M3 MacBook Pros for a while: Some customers have been receiving their new M3 MacBook Pros with an unreleased version of macOS Ventura, despite the fact that the newer macOS Sonoma came out way back in September.

Normally that would be a good thing since some workflows would be better served by the more mature Ventura, which we now know can run on these new machines, but that doesn't really apply here because that older macOS version is not actually available for download.


Yup. I said the same thing about the CPU core count and the CPU performance of M1 Pro and M2 Pro vs the corresponding Maxes. It was weird from a marketecture point of view.

Possible. It seems like Apple has been going through some strife: the Vision Pro delay, possibly an M3 delay or something weird going on there. But I'm not sure I'd read the software issue as being that, rather than some QC hiccup where someone accidentally sent the wrong OS image to manufacturing/assembly.

Yeah, the Max kind of needed this CPU boost. Shame they felt the need to bring down the Pro to uplift the Max though (can't really see how the Max still wouldn't have been notable with 12 cores vs 8 cores of a proper Pro).

I'm sure it'll depend on use, but I think the biggest issue there is the memory bandwidth. It probably won't end up being a big deal outside of a few scenarios (and those likely aren't optimized as is anyway, games in particular, such that the extra bandwidth the Pro has over the base will still offer benefits, but where having ~150GB/s vs 200GB/s won't make the difference between playable and not). It's possible that for some uses (people mentioned the iPad Pro) it might even prove beneficial, by having a bit higher efficiency or something that helps sustained performance and/or battery life enough to make it preferable. But that's because the performance is already good enough that going to very good doesn't offer as much tangible benefit as the extra sustained performance or battery life would.

Don't know if I'd call it a garbage panel - it still achieves greater brightness, better color accuracy, and doesn't color shift over time (as OLED's blue pixels start to burn out or dim).

It's really a beautiful looking panel and a good compromise until micro-LED shows up.

I'm curious how OLED has affected Apple's display replacement rates. I know that was a big concern, with burn-in and natural degradation being cited, but it doesn't seem like that ended up being a big deal on the iPhones. Some of that might be due to iPhone users being more likely to move to newer devices regularly, but even on Android I haven't seen it be a major issue, which seems to align with those concerns being a bit overhyped in reality.

It's nice, but I'm not surprised it's not suiting everyone's needs. Most users probably won't care, but frankly they also weren't likely to care about it having above 500 nits of brightness and other features to begin with.

I'm not really sure they brought down the Pro - we'll have to see benchmarks for that.

At improved M3 levels the 6+6 of the M3 might be faster than the 8+4 configuration with greater efficiency - after all the M3 Max's 12+4 configuration comes just a hair short of the M2 Ultra.

Apple's been making great strides on the e-core's performance and efficiency, and having 6 e-cores might give you better performance and might mean that most of the time you'd be running on nothing but e-cores contributing to longer battery life.

Yeah, the e-core uplift will likely make it not matter too much, if at all. The biggest issue to me looks likely to be the memory bandwidth. But it's probably not an issue in most tasks, and for the ones it will matter for (best guess is gaming, maybe some media processing), the limitation will be more about software optimization in general, such that if it had the full 200GB/s memory bandwidth it wouldn't change the outcome much.

From the benchmarks Apple was listing, the M3 Pro is at least as fast as the M2 Pro, but not hugely faster. I think that is fine. In contrast, the M3 Max gets a big boost.



That's not how it works.

For example, Sony generally masters its movies to 1000 nits. You can have a scene that is 80 nits overall, but which may have an 800-nit highlight representing less than 2% of the screen. Other companies master to 4000 nits.

Take a look at this image for example. It's a night scene with some lights. The scene is quite dark overall and definitely wouldn't blow out your eyes. Yet there are some bright spots in it. Just don't stare unblinking at the bright spots for extended periods and you'll be fine.

View attachment 88506

What does this mean with an OLED that maxes out around 700 nits? There are a couple of ways to approach this. One is like what Sony did with their OLED TVs: everything above 700 nits or whatever is just cut off, so 700 nits and 1000 nits are displayed exactly the same, with no detail. What LG did is compress and map it so that 1000 nits becomes 700 nits, and 700 nits becomes say 575 nits or whatever. This maintains bright contrast detail, but makes everything darker. Sony's solution would give you solid bright white clouds with no detail. LG's solution would give you clouds that show the detail but which are less bright.
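To make the two approaches concrete, here's a minimal Python sketch of hard clipping versus a simple roll-off; the 500-nit knee and the linear compression above it are illustrative assumptions rather than Sony's or LG's actual curves, but they roughly reproduce the 1000 -> 700 and 700 -> ~575 mapping described above:

```python
def clip_tonemap(nits, display_peak=700.0):
    """Hard clip: everything above the panel's peak collapses to the same white."""
    return min(nits, display_peak)

def rolloff_tonemap(nits, content_peak=1000.0, display_peak=700.0, knee=500.0):
    """Keep values below the knee untouched, then linearly compress
    knee..content_peak into knee..display_peak so highlight detail
    survives, just at a lower brightness."""
    if nits <= knee:
        return nits
    t = (nits - knee) / (content_peak - knee)
    return knee + t * (display_peak - knee)

for sample in (80, 500, 700, 850, 1000):
    print(f"{sample:>4} nits -> clip {clip_tonemap(sample):6.1f}, "
          f"roll-off {rolloff_tonemap(sample):6.1f}")
# 700 and 1000 nit highlights both come out at 700 under clipping (detail lost),
# while the roll-off keeps them distinct: 700 -> 580, 850 -> 640, 1000 -> 700.
```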

So, if you're a video editor, ideally you should have a screen that can extend past 1000 nits, and the higher the better, as long as it can go low too with the blacks. The holy grail would be an OLED panel that can go to 0 nit black and exceed 1000 nits bright, with great colour accuracy and no off-axis colour or brightness shift, and without burn-in.

P.S. I have high dynamic range on for my home stereo system, and my ears don't get blown out with properly mastered material. However, I am using Audyssey sound measurements on my receiver to calibrate the sound. This didn't work well when I had a projector because the projector's fan noise would be loud enough to obscure quiet dialogue, but with a fanless OLED TV, this is no longer an issue.

I definitely prefer the LG method.

Micro-LED is likely when you'll start getting that. I really hope they offer some feature like the hearing-safety setting for audio, where you can truncate those maximums. Seeing light blooms gives me a headache. But something I've been noticing is that a lot of modern content is also bad at handling dark scenes. That's in the mastering, but there's a lot of newer content where in dark scenes you need to crank the brightness up to see any details at all besides super highly contrasted things. I don't know if this is because it was mastered on brighter displays, or what the cause is, but it's irritating when most of the rest is fine but then you have to pause and turn up the brightness (or in my case switch to a different preset that I already configured because this was occurring so often). Which, sure, reference standards are nice, but no one can seem to agree on what the reference should be. I feel like it shouldn't be that hard to have a spectrum, then get equipment calibrated so that it can switch between points on it, and then when content is played it can take into account your base preference and adjust to suit that while still offering the closest match to what was intended.

I already don't use max brightness on the LG OLED I have, and it's also several years old at this point. I have it on quite low brightness settings and it is already much brighter than the LCD monitor I'm also using (and it's not a garbage LCD either).

To each their own. I personally cannot stand how modern movies are mastered. The discrepancy in volume between loud action sounds and dialogue is grating, even in properly calibrated situations (it's in fact one of the reasons I don't go see movies in the theater much; I had to cover my ears watching some of the Marvel movies because the action scenes were ridiculously loud). Much like I do not actually want realistic brightness in displays, I do not want realistic loudness for gunshots and similar (they do realize you wear hearing protection around such things, right?). I like full dynamic range being an option, but I would love if they'd have a separate mix that lowers the max loudness and better normalizes the whole spectrum, making dialogue more listenable while providing an overall more coherent mix. This is going to become even more important as they move to these spatial audio mixes. The other aspect that makes it really irritating is how they integrate music. Some of that is because of the stupid Loudness War mixing/mastering that the music industry went to, but that exacerbates the issues.

One of the major things that I do when I rip and encode movies is have the audio processed to adjust for that. I'd personally like even better control over it, and maybe there is a program for that. But now with the change to object-based audio formats, I'm hoping we can get better down/re-mixing of these.
 

poke01

Golden Member
Mar 8, 2022
1,381
1,595
106
It is concerning that the newer process nodes don't seem to be scaling efficiently. Relying on faster clocks leads to higher temperatures and higher power draw. The M2 fans were noisier than M1, and the M3 is also getting quite warm. If this trend continues, will it result in chips with excessive power consumption and temperatures, similar to Intel's? In addition to process node improvements, we need to focus on improving IPC (instructions per cycle) to make this strategy sustainable.
The M3 pulls the same amount of power as the M2, up to 21-22 watts in Cinebench MT, while being 20% faster.

But the max 108°C temperature is because N3B is more dense. The surface temps, i.e. the keyboard, stay below 40°C.

See: https://www.notebookcheck.net/Apple...del-now-comes-without-a-Pro-SoC.765661.0.html
 

Eug

Lifer
Mar 11, 2000
23,751
1,283
126
That was a long post!

OLEDs do better motion handling even at lower refresh rates than LCD.
Motion handling on traditional OLEDs is TERRIBLE. Some newer models tackle this with black frame insertion, but that can affect brightness and increase input lag.

But something I've been noticing is that a lot of modern content is also bad at handling dark scenes. That's in the mastering, but there's a lot of newer content where in dark scenes you need to crank the brightness up to see any details at all besides super highly contrasted things. I don't know if this is because it was mastered on brighter displays, or what the cause is, but it's irritating when most of the rest is fine but then you have to pause and turn up the brightness (or in my case switch to a different preset that I already configured because this was occurring so often). Which, sure, reference standards are nice, but no one can seem to agree on what the reference should be. I feel like it shouldn't be that hard to have a spectrum, then get equipment calibrated so that it can switch between points on it, and then when content is played it can take into account your base preference and adjust to suit that while still offering the closest match to what was intended.
It's not that the mastering displays are brighter, it's that they display dark details better. Many TVs including OLEDs suffer from black crush. The mastering displays do not. However, a high end mastering display might cost something like $25000 for a 30" display, so you're not going to get the same performance out of a $1500 OLED.

The reference standard would be that the monitor can display everything correctly. Mastering monitors will more-or-less do this. The problem isn't that they are mastering to non-reference standards. The problem is consumer TVs often can't display blacks properly. So, if some high end video editor correctly masters a night scene with tons of detail, it may end up looking like a sea of black with no detail because the TV just isn't good enough to show that detail. And it's even worse if it's on streaming services or cable TV, because it's been further compressed. Depending on how it was compressed, that can remove even more dark detail. A case in point was "The Long Night" in Game of Thrones. That was horrible. Basically unwatchable.
 
Last edited:

pj-

Senior member
May 5, 2015
481
249
116
Since it's such a hot topic, I did a very unscientific motion blur test with my new 14" MBP w/ M3Pro vs. my two desktop monitors.
Left is a vertically oriented low-end 60 Hz 2560x1440 Dell. Top is an Alienware QD-OLED, 175 Hz, 3440x1440. Bottom is the new MBP. Photo was taken with my phone at 1/2000 shutter speed (one photo, not stitched).

Test is the response time thing here: https://www.eizo.be/monitor-test/ (background to black, left rectangle to black, speed to 2k pixels/sec). Each screen is running a separate session of the test at its native res, monitors are driven by my PC. The tall white block scrolls horizontally across the screen left to right, everything else on the screens is static.
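As a rough sanity check on what the photo should show, the visible trail length scales with scroll speed times response time. A minimal Python sketch (the G2G figures for the OLED and the Dell are assumptions for illustration; only the MacBook Pro's ~21 ms comes from the measurements discussed earlier):

```python
def smear_px(scroll_px_per_s, response_ms):
    """Approximate trail length: how far the block travels while a pixel is
    still transitioning between its old and new value."""
    return scroll_px_per_s * response_ms / 1000.0

SCROLL = 2000  # px/s, as configured in the EIZO test above

for panel, g2g_ms in [("QD-OLED (assumed ~0.2 ms G2G)", 0.2),
                      ("budget 60 Hz LCD (assumed ~8 ms G2G)", 8.0),
                      ("MacBook Pro mini-LED (~21 ms G2G)", 21.0)]:
    print(f"{panel}: ~{smear_px(SCROLL, g2g_ms):.0f} px of smear")
# Roughly 0 px for the OLED versus ~16 px and ~42 px for the LCDs, which is
# about the difference in trail length visible in the attached photo.
```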

To me, the OLED looks great in the photo and in motion. The Dell and MBP look pretty similar and are both pretty bad, but to my eye the Dell actually looks less blurry in motion. It has a longer tail in the picture, but maybe it helps that it's more defined, idk.

I also have a 14" MBP M1Pro from 2021 and while the screen never really bothered me, it definitely doesn't feel like 120hz in general use. The screen on the M3Pro seems exactly the same. The visual artifacting from the dimming zones is also very apparent in dark content, particularly around the mouse cursor. I don't think it's especially great or terrible, but it mostly stays out of the way which I think is what most people care about. It would definitely be awful for gaming but only a crazy person would buy a macbook for that.

Playing around with some synthetic ML workloads, the M3Pro is appreciably faster in both CPU and GPU training than my M1Pro. The fans did kick up pretty aggressively with one test that was fully saturating the CPU for a few minutes, louder than I've heard on the M1Pro.
 

Attachments

  • 20231108_180804.jpg
    404.3 KB
Last edited:

Eug

Lifer
Mar 11, 2000
23,751
1,283
126
It will be interesting to see how well the M3 Ultra scales over the M3 Max (about 21750) in Geekbench 6 multi-core:

M2 Ultra at about 21600 is only about 1.4X M2 Max.
M1 Ultra at about 18700 is in the same ballpark, about 1.45X M1 Max.

If scaling for M3 Ultra is about the same, then that would be around 30000 or so.

Meanwhile, my M1 Mac mini is about 8750.
An M3 Pro Mac mini at 15750 would be a huge upgrade, with more ports too.
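For what it's worth, applying the M1/M2 Ultra-over-Max scaling factors to that M3 Max figure lands in the same ballpark; a trivial sketch:

```python
m3_max_gb6_mt = 21750               # GB6 multi-core figure quoted above

for scale in (1.40, 1.45):          # M2 Ultra/Max and M1 Ultra/Max scaling
    print(f"{scale:.2f}x -> ~{m3_max_gb6_mt * scale:,.0f}")
# Both land in the low 30000s, i.e. "around 30000 or so".
```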
 

FlameTail

Diamond Member
Dec 15, 2021
3,148
1,796
106
It will be interesting to see how well the M3 Ultra scales over the M3 Max (about 21750) in Geekbench 6 multi-core:

M2 Ultra at about 21600 is only about 1.4X M2 Max.
M1 Ultra at about 18700 is in the same ballpark, about 1.45X M1 Max.

If scaling for M3 Ultra is about the same, then that would be around 30000 or so.

Meanwhile, my M1 Mac mini is about 8750.
An M3 Pro Mac mini at 15750 would be a huge upgrade, with more ports too.
Haven't we established that GB6 MT is a terrible benchmark for CPUs with a large number of cores, as it does not scale well?
 

Nothingness

Platinum Member
Jul 3, 2013
2,732
1,362
136
Haven't we established that GB6 MT is a terrible benchmark for CPUs with a large number of cores, as it does not scale well?
Some are convinced of that. I'm not.

Some of the individual MT subtests scale well (the render test scales like Cinebench, for instance). Other subtests don't, and that matches real-life applications that don't scale well or at all. So I agree the aggregate MT score is not that interesting, but that doesn't mean GB6 MT is a "terrible benchmark". It should be used with care (as should all benchmarks anyway...).
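A minimal sketch of why the aggregate behaves that way, using Amdahl's law with made-up per-subtest parallel fractions (the 0.99/0.80/0.60/0.40 mix is purely illustrative, not GB6's actual workload composition):

```python
from statistics import geometric_mean

def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: speedup of one subtest given its parallel fraction."""
    return 1.0 / ((1 - parallel_fraction) + parallel_fraction / cores)

# Hypothetical mix: one renderer-like subtest that scales almost perfectly,
# plus subtests that are increasingly serial, like many real applications.
parallel_fractions = [0.99, 0.80, 0.60, 0.40]

for cores in (8, 16, 32):
    speedups = [amdahl_speedup(f, cores) for f in parallel_fractions]
    agg = geometric_mean(speedups)
    print(cores, "cores:", [round(s, 1) for s in speedups], "aggregate ~", round(agg, 1))
# The well-parallelised subtest keeps scaling (7.5x -> 13.9x -> 24.4x) while the
# aggregate flattens out (about 3.0x -> 3.8x -> 4.5x), which is the behaviour
# people read as "GB6 MT doesn't scale" on high-core-count chips.
```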
 
Reactions: Eug
Jul 27, 2020
17,816
11,609
106

Apple has defended its decision to continue releasing new MacBook Pro models with only 8GB of RAM. Apple's VP of Worldwide Product Marketing Bob Borchers asserts that Apple's efficiency gains make 8GB in its products equivalent to the performance of 16GB in other systems.

Feel free to jump to this douche's defense...


Only a marketing guy would speak without using his brain.
 

poke01

Golden Member
Mar 8, 2022
1,381
1,595
106



Feel free to jump to this douche's defense...


Only a marketing guy would speak without using his brain.
I'd ignore anything that comes from marketing from any company.

Here's some non-marketing stuff about the new GPU architecture:
 

manly

Lifer
Jan 25, 2000
11,318
2,344
136



Feel free to jump to this douche's defense...


Only a marketing guy would speak without using his brain.
A lot of people actually believe this: that there's something magical about Apple unified memory that makes it so efficient.

Strangely enough, I just noticed that the M3 Pro systems start off at 18GB of RAM.

Will the base models get 12GB in late 2024, or more likely 2025? 🤷‍♂️
 

Ajay

Lifer
Jan 8, 2001
16,094
8,104
136
A lot of people actually believe this: that there's something magical about Apple unified memory that makes it so efficient.

Strangely enough, I just noticed that the M3 Pro systems start off at 18GB of RAM.

Will the base models get 12GB in late 2024, or more likely 2025? 🤷‍♂️
macOS is more efficient memory-wise than Windows, as is Linux. Windows NT introduced a terrible memory management system. Cutler's team was aiming for a truly cross-platform (CPU) OS. MIPS had no ability to tag memory based on usage, so they couldn't implement an LRU or NRU algorithm for optimal memory paging. They used FIFO instead (which, aside from being literally 'dumb', also suffers from problems like Bélády's anomaly). So the NT team added the 'working set' algorithm on top of the FIFO to mitigate the various problems. It was kind of a hack, but once implemented, MS wouldn't change it for fear of breaking something important.

There are other advantages that macOS and Linux have, but I can no longer recall them offhand.
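For anyone who hasn't run into it, a minimal Python sketch of why plain FIFO replacement is considered 'dumb': it can exhibit Bélády's anomaly (more frames, more faults), while LRU cannot. The reference string is the textbook example, not anything measured on NT:

```python
from collections import OrderedDict

def fifo_faults(refs, frames):
    """Count page faults with FIFO replacement."""
    queue, resident, faults = [], set(), 0
    for page in refs:
        if page not in resident:
            faults += 1
            if len(queue) == frames:
                resident.discard(queue.pop(0))   # evict the oldest page
            queue.append(page)
            resident.add(page)
    return faults

def lru_faults(refs, frames):
    """Count page faults with LRU replacement."""
    cache, faults = OrderedDict(), 0
    for page in refs:
        if page in cache:
            cache.move_to_end(page)              # mark as most recently used
        else:
            faults += 1
            if len(cache) == frames:
                cache.popitem(last=False)        # evict the least recently used
            cache[page] = True
    return faults

# Classic reference string that triggers Belady's anomaly under FIFO.
refs = [1, 2, 3, 4, 1, 2, 5, 1, 2, 3, 4, 5]
for frames in (3, 4):
    print(f"{frames} frames: FIFO {fifo_faults(refs, frames)} faults, "
          f"LRU {lru_faults(refs, frames)} faults")
# FIFO: 9 faults with 3 frames but 10 with 4 (adding memory made it worse);
# LRU: 10 and 8 faults -- adding frames never hurts a stack algorithm.
```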
 

manly

Lifer
Jan 25, 2000
11,318
2,344
136
macOS is more efficient memory-wise than Windows, as is Linux. Windows NT introduced a terrible memory management system. Cutler's team was aiming for a truly cross-platform (CPU) OS. MIPS had no ability to tag memory based on usage, so they couldn't implement an LRU or NRU algorithm for optimal memory paging. They used FIFO instead (which, aside from being literally 'dumb', also suffers from problems like Bélády's anomaly). So the NT team added the 'working set' algorithm on top of the FIFO to mitigate the various problems. It was kind of a hack, but once implemented, MS wouldn't change it for fear of breaking something important.

There are other advantages that macOS and Linux have, but I can no longer recall them offhand.
It's been a while since I studied operating system internals, but I'm not sure this is even relevant. Cutler's NT was developed during a time when it could run on something like 32 MB of RAM? (The official requirement was half that, not saying that it ran well.) How is that ancient decision germane to today's configurations? I'll concede Windows uses more RAM, but that isn't really the debate IMO. Obviously today's OSs boot up using a lot more memory than NT 3.1 and NeXTSTEP did 30 years ago.*

Apple isn't necessarily wrong that most consumers will be "fine" running 8GB today; that isn't really the issue people have. The issue is that if you're dropping $1500 on a brand new laptop in 2023, one that has NO upgradeable components, Apple is doing a disservice to its customers. It's doing this because it can get away with it. To make matters worse, the upcharge for more RAM is incredibly steep. With on-package memory, sure the throughput is great, but you're still splitting 8GB between the OS, your apps, and VRAM. There is no silver bullet here; just because the UX degrades gracefully doesn't mean macOS apps are twice as memory efficient as on other platforms. (Mobile is somewhat different, iPhoneOS has indeed always needed less RAM than Android.)

Finally, what works well today isn't always going to be the case. For a company that claims to be environmentally friendly, obviously they think it's better business if these systems become obsolete sooner rather than later. That strategy has worked extremely well for AAPL shareholders, but also results in a lot more e-waste than necessary.

* I recently installed macOS Monterey on an old unsupported dual-core MacBook Pro. While it was acceptable, the UX was kind of rough. Unfortunately it doesn't really make sense to install a 7 year old unsupported version of macOS, so it was basically Monterey or a Linux distro. This isn't really a RAM issue, but nevertheless today's macOS is a lot fatter than that of a decade ago when macOS was my "daily driver" and I had only 3GB of RAM.

MacWorld nails it:
 
Last edited:

Doug S

Platinum Member
Feb 8, 2020
2,479
4,036
136
macOS is more efficient memory-wise than Windows, as is Linux.

That's probably true, and combined with compression of less active pages, MAYBE you could claim it is equivalent to 50% more Windows memory, like 12 GB. Claiming it is equivalent to 16 GB stretches credulity.

They'd have been better off saying that the people who buy these entry-level systems typically have modest needs that are served by 8 GB. Which is surely true, including for Windows users. Now, whether 8 GB is still OK for those people in 2030, who knows. Operating systems haven't really increased their resource usage, but browsers seem to keep using more RAM as pages get more complex (though that's more of a problem for people who keep a lot of tabs open or leave their browser running for weeks).
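As a rough back-of-the-envelope check on that "50% larger" figure, here's a minimal sketch of how memory compression changes effective capacity; the compressible share and the 2:1 ratio are illustrative assumptions, not measured macOS numbers:

```python
def effective_gb(physical_gb, compressible_share=0.4, compression_ratio=2.0):
    """Rough model: a share of resident pages lives in a compressed pool at
    some ratio, so that portion of physical RAM holds more logical data."""
    uncompressed = physical_gb * (1 - compressible_share)
    compressed_pool = physical_gb * compressible_share
    return uncompressed + compressed_pool * compression_ratio

print(round(effective_gb(8), 1))             # 11.2 -> roughly "8 GB acting like 11 GB"
print(round(effective_gb(8, 0.25, 2.0), 1))  # 10.0 with a smaller compressible share
# Even generous assumptions land nearer the "50% larger, like 12 GB" estimate
# than the claim that 8 GB behaves like 16 GB.
```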
 
Reactions: TESKATLIPOKA

Glo.

Diamond Member
Apr 25, 2015
5,761
4,666
136
That's probably true, and combined with compression of less active pages, MAYBE you could claim it is equivalent to 50% more Windows memory, like 12 GB. Claiming it is equivalent to 16 GB stretches credulity.

They'd have been better off saying that the people who buy these entry-level systems typically have modest needs that are served by 8 GB. Which is surely true, including for Windows users. Now, whether 8 GB is still OK for those people in 2030, who knows. Operating systems haven't really increased their resource usage, but browsers seem to keep using more RAM as pages get more complex (though that's more of a problem for people who keep a lot of tabs open or leave their browser running for weeks).
People who buy entry-level systems DO NOT pay $1599 for them. That's how much the M3 MacBook Pro costs.

They pay $1000 at best.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,104
136
Apple isn't necessarily wrong that most consumers will be "fine" running 8GB today; that isn't really the issue people have. The issue is that if you're dropping $1500 on a brand new laptop in 2023, one that has NO upgradeable components, Apple is doing a disservice to its customers. It's doing this because it can get away with it. To make matters worse, the upcharge for more RAM is incredibly steep. With on-package memory, sure the throughput is great, but you're still splitting 8GB between the OS, your apps, and VRAM. There is no silver bullet here; just because the UX degrades gracefully doesn't mean macOS apps are twice as memory efficient as on other platforms. (Mobile is somewhat different, iPhoneOS has indeed always needed less RAM than Android.)
I wasn't arguing in favor of Apple's stupid choice to put 8GB of DRAM on a $1600 laptop. Clearly, that's a brain-dead idea - it's not a Chromebook, even if a fair number of people will use it in a similar way. Also, yes, macOS has gotten fat. We are all using OSes that are 20-30 years old. Lots of new features and system services. Linux does better on that front, but is not immune if you want a fully featured UI and apps.

The point for Apple, I imagine, is the upsell of an extra 8GB of DRAM for $200 that probably costs them $20 to purchase and install.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,104
136
That's probably true, and combined with compression of less active pages, MAYBE you could claim it is equivalent to 50% more Windows memory, like 12 GB. Claiming it is equivalent to 16 GB stretches credulity.

They'd have been better off saying that the people who buy these entry-level systems typically have modest needs that are served by 8 GB. Which is surely true, including for Windows users. Now, whether 8 GB is still OK for those people in 2030, who knows. Operating systems haven't really increased their resource usage, but browsers seem to keep using more RAM as pages get more complex (though that's more of a problem for people who keep a lot of tabs open or leave their browser running for weeks).
Geez, I really got myself in trouble by not being more clear. Anyway, good point about OSes having finally slowed their roll when it comes to eating up ever more resources. Browsers are terrible, especially when pages won't suspend. That, and the myriad of services installed by apps and utilities, has become kind of ridiculous, at least on the Windows side.
 

manly

Lifer
Jan 25, 2000
11,318
2,344
136
I wasn't arguing in favor of Apple's stupid choice to put 8GB of DRAM on a $1600 laptop. Clearly, that's a brain-dead idea - it's not a Chromebook, even if a fair number of people will use it in a similar way. Also, yes, macOS has gotten fat. We are all using OSes that are 20-30 years old. Lots of new features and system services. Linux does better on that front, but is not immune if you want a fully featured UI and apps.

The point for Apple, I imagine, is the upsell of an extra 8GB of DRAM for $200 that probably costs them $20 to purchase and install.
Apple wins either way. Although the margins are lower, they sell tons of "base" config devices to consumers who don't know better. Not that long ago, iPhones had 2GB of RAM and these did not have great longevity. 8GB Macs simply are not a good "investment" in late 2023, no matter how fast the memory and NAND is.

The savvy consumer will have to spend a lot more for 16GB and Apple captures a ton of pure profit that way.

It's been an excellent business model for the shareholders.
 