Hmm, he didn't make any subjective comments about it being blurry or smearing. Have you used one in person?
I'm wondering if it's not a function of some animation smoothing or other behavior. Maybe dynamic refresh rate handling (what's the range for the Macs? I think iPhones can do 1-120 Hz?).
At least phones get used outside under bright sunlight. Even there we're going overboard. I had no issues using a Galaxy Note 4 back in 2014 in bright sunlight, so I'm not certain we need the 3000 nits we're seeing today. At this point these races are mostly for marketing: Samsung went for the megapixel race, other companies go for the most cameras or whatever metric they can market. From a usability standpoint there's hardly anything dramatic.
Laptops these days, even the cheapo Windows/Chromebook ones, are much better than what we used to get 10 years ago. At this point we get only incremental value from spending thousands of dollars more. I can understand that users of specific applications, or gamers, might want certain features that make it worth paying more. Otherwise it's all a status symbol.
Yeah, it's very dependent on usage. For a laptop I think 500 nits should be the target. Trying to offer 2000 nits and the like is just getting silly and is turning into yet another marketing spec race.
Hmm... This is probably futile, but I'll provide some contextual information anyway.
Typically, content-creation-oriented professional monitors do not have super high refresh rates.
For example, this is the BenQ SW321C 31.5" 4K monitor that goes for ~US$2000. It's 60 Hz.
BenQ SW321C Photographer Monitor supports IPS technology for photo editing and excellent 99% Adobe RGB color space.
www.benq.com
Creative Bloq gives it a 4.5 out of 5 stars in their review, commenting that its $2000 price is comparatively low for this category of monitor.
The BenQ SW321C is a pro-level monitor without the pro-level price.
www.creativebloq.com
The thing is, those are aimed at photo editing, not video. They can be used for video, but the focus there is color reproduction, not motion handling. Plus, I'm not sure the difference in motion handling between display technologies is even considered at all in video production and/or mastering right now. I think it was, somewhat, when CRTs were the dominant display tech (and even then it was more about how the scanlines affected motion), but I'm not aware of any workflows that consider it at this point. OLEDs handle motion better than LCDs even at lower refresh rates. Projectors use a variety of technologies (DLP, LCD, and different light sources). VR/AR headsets use a variety too, and motion handling matters even more there, but very little content is produced targeting them yet.
There is some improvement in perf/watt but not much. Keep in mind the M2 Max is on N5P, so once again the perf/watt of N3B looks roughly equivalent to N4P.
I'm curious how sustained performance compares. Will the M3 chips use enough power that we end up seeing more throttling than before? At some point Apple is going to have to deal with that and develop better cooling. I feel like sustained performance makes more sense for segmentation and should be a metric that's easier to compare (maybe we need a measurement that's like horsepower, where it's measuring work performed over time; I'm pretty sure there's already one applicable to energy usage, like perf/W, but I'm not sure it could also capture heat output or thermal limits; I'm probably overthinking it though). They should also simplify their lineup while adding a separate system for storage: have 256GB of fast integrated NAND as the base for the OS, but then a separate slot for adding more. Each tier up doubles the standard NAND (so Pro starts at 512GB, Max 1TB, Ultra 2TB; RAM can probably be don
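To make the "horsepower-like" metric idea above a bit more concrete, here's a rough sketch of combining sustained throughput and energy use in one pass; all the numbers are invented, and a real version would pull them from a benchmark loop plus a power meter:

```python
# Toy sketch of a combined metric: work per second and work per joule over a
# long run, so both throttling and power draw show up in the result.
# The sample numbers below are made up for illustration only.

samples = [
    # (interval_seconds, work_units_completed, average_watts)
    (60, 1000, 30.0),
    (60,  950, 28.0),
    (60,  820, 25.0),  # later intervals complete less work -> throttling is visible
]

total_seconds = sum(sec for sec, _, _ in samples)
total_work = sum(work for _, work, _ in samples)
total_joules = sum(sec * watts for sec, _, watts in samples)

sustained_perf = total_work / total_seconds   # work per second over the whole run
efficiency = total_work / total_joules        # work per joule, i.e. perf/W over time

print(f"sustained: {sustained_perf:.1f} units/s, efficiency: {efficiency:.2f} units/J")
```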
Higher nits equal better highlights for a more impactful HDR experience. Check RTINGS' reviews of OLEDs; they rarely go above 600 nits for full-screen brightness.
Yes, it's about highlights (although frankly I personally don't want these blown-out HDR highlights; if I wanted super realistic brightness I wouldn't be watching TV, and HDR blowouts are going to be obnoxious when it's your whole field of vision in AR/VR). That's a function of power, though, and you would not want to be pumping out 500+ nits full screen on an OLED, both because it's more likely to cause issues with the pixels and because it would draw so much power and put out a ton of heat (which most TVs are not set up to handle, increasing potential failures).
I think it's a fool's errand to live on someone else's promises or fantasies.
Best to wait for testable silicon to arrive and proceed from that point.
I'm not too doubtful of Qualcomm's claims, but we'll see how it does in actual devices. Apple has shown it's possible to offer this kind of performance, even sustained, in platforms that don't have much in the way of cooling design. With modern PC designs (not even taking into account future ones like those piezoelectric "fans" and the like), they should be able to manage the load these chips offer without too much hassle. But the thing to keep in mind is that this is more about elevating the base (or in the PC space more like the mid-range, since the base is super cheap stuff that is either years old, like how AMD still sells chips with Zen 2 cores, or much more limited: 2 cores with a weak GPU and few if any co-processors). Which is great and will improve things for most people.
I'm not "defending this". This screen would be better with a lower response time.
I'm responding to the comment that this screen is "atrocious" and that Apple should never charge this amount of money for a machine with this screen. That's different.
If you had just said that this screen had poor response time, I wouldn't have replied. But your comment implied that it was a shitty screen overall and that users were dumb to use it, while the reality is that this screen is best in class in several other metrics and that motion blur is not a problem for most people.
I personally have zero issue with blur while scrolling. I don't stare at an image during scrolling, I don't read text during scrolling and even if I did, the blur is minimal and doesn't impact readability.
If motion blur were such an issue for everyone, the so-called "soap opera effect" wouldn't be a problem, every filmmaker would be using high shutter speeds (like in the first scene of Saving Private Ryan) or shooting at 48 fps, and full-screen motion blur would not be a setting in video games.
The truth is, motion blur is not an issue for most people.
Again, not saying that this screen wouldn't be better with a lower response time, just that most people don't care and value other metrics like color fidelity, contrast, resolution and brightness.
Yeah, different people prioritize different things. Reminds me of someone saying the display in some Lenovo laptop was total garbage because of PWM or something, and linking a review they acted like supported their claim, except the review basically said the display was exceptional and just remarked that it had a fairly low PWM frequency that some people are sensitive to, while noting it still shouldn't affect most people much. And the person was going on a tirade acting like the display was literally unusable, like it would turn you blind or something.
This is the first time I've seen a complaint about the response time on Apple's stuff. Some of that is probably because Apple has been talking about gaming more, so maybe more people are looking to game, or people used to gaming-focused hardware are trying Apple stuff and noticing the discrepancy? But I have a hunch there's some animation smoothing or other processing going on that is impacting things (although perhaps that's already been ruled out?).
I have seen several reviews that commented on watching video on Apple's stuff, and they didn't seem to remark on that aspect. I know people have different thresholds, and many tech reviewers lack hardcore expertise on some of that stuff. And it's less pronounced in video than in gaming.
It's easy to forget that Apple isn't a CPU manufacturer/vendor and doesn't really care about spec wars.
The Apple Silicon team produces silicon for the Apple product roadmap and to satisfy the needs of future Apple products.
The fact that Apple has occasionally pulled far enough ahead to spank Intel and AMD - at their reduced energy envelope - is more side-effect than goal.
When you combine the CPU speed with their video toolkit you get video editing/rendering/transcoding which makes their machines pretty much ideal for that task, especially when you consider they can do it on a center table in Starbucks or at 30,000 feet.
I think Apple simply made a chip that targeted a common denominator (I wouldn't say lowest, but more like median, where most people can use a decent amount of performance but many don't need something excessively powerful or all the features PC platforms offer, and many would definitely prioritize efficiency, aka battery life), so there was a lot of pretty easy, low-hanging fruit. They could adapt their mobile processor design since it was strong enough (the biggest issue was simply overcoming the software situation, but they had already done something similar in the PowerPC-to-x86 transition, and they had the benefit of the ARM development work from iOS). And Qualcomm is basically doing the same thing.
Intel and AMD could have been offering x86 designs that would have made both less interesting, but neither had really been attempting that in this space (closest was probably Intel with that weird Vega chip, or I think there was one that mixed Core with Atom), for two simple reasons: their designs targeted a wide range of uses (desktop, enthusiast/pro, with server/enterprise getting more focus) and their method of segmenting was basically core counts; and with the lack of competition they let natural progression take care of people who wanted better efficiency and/or GPU performance.
I'd been saying for years that I wanted the baseline processors to be better balanced: basically a chip that's balanced across the board. Compared to the APU designs AMD and Intel were offering, that would've meant simplifying things, either adopting a big.LITTLE setup or cores tuned more for the application (seemingly what the ZenXc cores are doing), stronger GPUs with more memory bandwidth to support them, and then some fast NAND. My fix would've been 16GB of HBM serving as system memory (offering something like 256GB/s of bandwidth), with NAND basically hanging off RAM channels, although I think that'd be unnecessary with the PCIe spec these days, as long as it offered several GB/s of NAND bandwidth. Apple pretty much did that, and it didn't take that much work; it just took someone actually doing it.
Yeah Apple will leverage OLEDs to fix the garbage panel issue later on.
I'm surprised we haven't seen more attempts at a simple intermediary OLED layer: just a simple on/off layer of white OLED pixels. It could even be 1/4 the resolution of the display (and it'd still offer more "zones" than any mini-LED does). They wouldn't need to be super bright, just able to provide some low-level light for scenes that are dark but not totally dark, paired with an edge-lit or other fairly simple LED backlight that provides the higher brightness, with the OLED layer turning off (or maybe running some gradient of on/off) to allow light blocking and better handling.
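Quick sanity check on the zone-count claim, with assumed numbers (reading "1/4 the resolution" as half the pixels in each dimension, and using a rough ballpark for high-end mini-LED zone counts):

```python
# Back-of-the-envelope for the "more zones than any mini-LED" bit.
# The 4K panel size and the mini-LED zone count are assumed ballpark figures.
panel_w, panel_h = 3840, 2160
oled_layer_zones = (panel_w // 2) * (panel_h // 2)  # quarter-resolution white OLED layer
high_end_mini_led_zones = 2500                      # rough figure for a high-end set

print(oled_layer_zones)                             # 2,073,600 addressable "zones"
print(oled_layer_zones // high_end_mini_led_zones)  # roughly 800x more than mini-LED
```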
It could also potentially offer better output in outdoor/sunlight conditions, with a transparent display or something that can pass sunlight through from the back, or maybe a reflective aspect (kind of like e-ink). That way, if you're in a bright area, the ambient light actually boosts the display (without the display needing to generate it), but in dark situations you get the benefits of OLED.
I know there was a TV that did that, and there have been other prototypes that used dual OLED layers as a means of offering higher brightness and/or compensating for color (maybe even as an attempt at dealing with the RGBW pixel layout).
Just an indicator that Apple has been stockpiling these M3 MacBook Pros for a while: Some customers have been receiving their new M3 MacBook Pros with an unreleased version of macOS Ventura, despite the fact that the newer macOS Sonoma came out way back in September.
Normally that would be a good thing since some workflows would be better served by the more mature Ventura, which we now know can run on these new machines, but that doesn't really apply here because that older macOS version is not actually available for download.
Yup. I said the same thing about the CPU core count and the CPU performance of M1 Pro and M2 Pro vs the corresponding Maxes. It was weird from a marketecture point of view.
Possible. It seems like Apple has been going through some strife: the Vision Pro delay, a possible M3 delay or something weird going on there. But I'm not sure I'd read the software issue as that, versus some QC hiccup where someone accidentally sent the wrong OS image to manufacturing/assembly.
Yeah, the Max kind of needed this CPU boost. Shame they felt the need to bring down the Pro to uplift the Max, though (I can't really see how the Max wouldn't still have been notable with 12 cores vs the 8 cores of a proper Pro).
I'm sure it'll depend on use, but I think the biggest issue there is the memory bandwidth. It probably won't end up being a big deal outside of a few scenarios (and those, mainly games, likely aren't well optimized as it is, so the extra bandwidth the Pro has over the base will still offer benefits, but ~150GB/s vs 200GB/s won't be the difference between playable and unplayable). It's possible that for some uses (people mentioned the iPad Pro) it might even prove beneficial, with a bit more efficiency helping sustained performance and/or battery life enough to make it preferable. But that's because the performance is already good; going to very good doesn't offer as much tangible benefit as the extra sustained performance or battery life would.
Don't know if I'd call it a garbage panel - it still achieves greater brightness, better color accuracy, and doesn't color shift over time (as OLED's blue pixels start to burn out or dim).
It's really a beautiful looking panel and a good compromise until micro-LED shows up.
I'm curious how OLED has affected Apple's display replacement rates. I know that was a big concern, with burn-in and natural degradation being cited, but it doesn't seem like that ended up being a big deal on iPhones. Some of that might be due to iPhone users being more likely to move to newer devices regularly, but even on Android I haven't seen it be a major issue. Which seems to align with those issues being considered a bit overhyped in reality.
It's nice, but I'm not surprised it's not suiting everyone's needs. Most users probably won't care, but frankly they also weren't likely to care about it having more than 500 nits of brightness and other features to begin with.
I'm not really sure they brought down the Pro - we'll have to see benchmarks for that.
At improved M3 levels, the 6+6 of the M3 Pro might be faster than the M2 Pro's 8+4 configuration, with greater efficiency - after all, the M3 Max's 12+4 configuration comes just a hair short of the M2 Ultra.
Apple's been making great strides on the e-cores' performance and efficiency, and having 6 e-cores might give you better performance and might mean that most of the time you'd be running on nothing but e-cores, contributing to longer battery life.
Yeah, the e-core uplift will likely make it not matter much, if at all. The biggest issue to me looks to be the memory bandwidth. But it's probably not an issue in most tasks, and for the ones where it will matter (best guess is gaming, maybe some media processing), the limitation will be more about software optimization in general, such that having the full 200GB/s of memory bandwidth wouldn't change the outcome much.
From the benches Apple was listing the M3 Pro is at least as fast as M2 Pro, but not hugely faster. I think that is fine. In contrast, M3 Max gets a big boost.
That's not how it works.
For example, Sony generally masters its movies to 1000 nits. You can have a scene that is 80 nits overall but which has an 800-nit highlight representing less than 2% of the screen. Other companies master to 4000 nits.
Take a look at this image for example. It's a night scene with some lights. The scene is quite dark overall and definitely wouldn't blow out your eyes. Yet there are some bright spots in it. Just don't stare unblinking at the bright spots for extended periods and you'll be fine.
View attachment 88506
What does this mean for an OLED that maxes out around 700 nits? There are a couple of ways to approach it. One is what Sony did with their OLED TVs: everything above 700 nits (or wherever the panel tops out) is just cut off, so 700 nits and 1000 nits are displayed exactly the same, with no detail between them. What LG did is compress and remap the range, so that 1000 nits becomes 700 nits, and 700 nits becomes, say, 575 nits or thereabouts. That maintains bright contrast detail but makes everything darker. With Sony's approach you get solid bright white clouds with no detail; with LG's you get clouds that show the detail but are less bright.
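To make the difference between the two approaches concrete, here's a toy sketch; the knee point, curve shape, and panel numbers are made up for illustration, not either company's actual tone-mapping math:

```python
# Toy illustration of "clip" vs "compress" tone mapping for a ~700-nit panel.

def hard_clip(nits, panel_peak=700.0):
    """'Clip' style: anything mastered above the panel's peak is flattened,
    so an 800-nit and a 1000-nit highlight both display at 700 nits."""
    return min(nits, panel_peak)

def roll_off(nits, panel_peak=700.0, knee=500.0, master_peak=1000.0):
    """'Compress' style: below the knee the signal passes through unchanged;
    above it, everything up to the mastering peak is squeezed into the
    headroom between the knee and the panel peak, so highlight detail
    survives but bright content gets dimmer overall."""
    if nits <= knee:
        return nits
    frac = (min(nits, master_peak) - knee) / (master_peak - knee)
    return knee + frac * (panel_peak - knee)

for level in (400, 700, 800, 1000):
    print(level, hard_clip(level), round(roll_off(level)))
# 700, 800 and 1000 nits all clip to 700, but roll_off keeps them distinct
# (700 -> 580, 800 -> 620, 1000 -> 700 with these made-up parameters).
```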
So, if you're a video editor, ideally you should have a screen that can extend past 1000 nits, and the higher the better, as long as it can go low too with the blacks. The holy grail would be an OLED panel that can go to 0 nit black and exceed 1000 nits bright, with great colour accuracy and no off-axis colour or brightness shift, and without burn-in.
P.S. I have high dynamic range on for my home stereo system, and my ears don't get blown out with properly mastered material. However, I am using Audyssey sound measurements on my receiver to calibrate the sound. This didn't work well when I had a projector because the projector's fan noise would be loud enough to obscure quiet dialogue, but with a fanless OLED TV, this is no longer an issue.
I definitely prefer the LG method.
Micro-LED is likely when you'll start getting that. I really hope they offer something like the hearing-safety setting for audio, where you can cap those maximums; seeing light blooms gives me a headache. Something else I've been noticing is that a lot of modern content is bad at handling dark scenes. That's in the mastering, but there's a lot of newer content where you have to crank the brightness up in dark scenes to see any detail at all beyond the highest-contrast elements. I don't know if that's because it was mastered on brighter displays or what the cause is, but it's irritating when most of the content is fine and then you have to pause and turn up the brightness (or, in my case, switch to a different preset I configured because this was happening so often). Sure, reference standards are nice, but no one can seem to agree on what the reference should be. I feel like it shouldn't be that hard to define a spectrum, calibrate equipment against it so displays can switch between targets, and then have playback take your base preference into account and adjust to suit it while staying as close as possible to the intent.
I already don't use max brightness on my LG OLED, and it's also several years old at this point. I have it on quite low brightness settings and it's already much brighter than the LCD monitor I'm also using (and that's not a garbage LCD either).
To each their own. I personally cannot stand how modern movies are mastered. The discrepancy in volume between loud action sounds and dialogue is grating, even in properly calibrated setups (it's in fact one of the reasons I don't go see movies in theaters much; I had to cover my ears during some of the Marvel movies because the action scenes were ridiculously loud). Much as I don't actually want realistic brightness in displays, I don't want realistic loudness for gunshots and the like (they do realize you wear hearing protection around those, right?). I like full dynamic range being an option, but I'd love a separate mix that lowers the peak loudness and better normalizes the whole spectrum, making dialogue more listenable while providing an overall more coherent mix. This is going to become even more important as they move to spatial audio mixes. The other thing that makes it really irritating is how music is integrated; some of that is down to the stupid loudness-war mixing/mastering the music industry went through, but it exacerbates the issue.
One of the major things I do when I rip and encode movies is have the audio processed to tame that dynamic range. I'd personally like even better control over it, and maybe there's a program for that. Now that we're moving to object-based audio formats, I'm hoping we can get better down-mixing and re-mixing of these.
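For what it's worth, here's a minimal sketch of the kind of processing I mean, assuming ffmpeg is installed: it runs the audio through ffmpeg's loudnorm (EBU R128) filter while copying the video untouched. The filenames and target values are just examples to tune to taste, and other filters like acompressor or dynaudnorm can do similar things.

```python
# Minimal sketch: loudness-normalize a rip's audio with ffmpeg while leaving
# the video stream as-is. Filenames and loudnorm targets are examples only.
import subprocess

def flatten_dynamics(src: str, dst: str) -> None:
    subprocess.run([
        "ffmpeg", "-i", src,
        "-c:v", "copy",                       # leave the video stream untouched
        "-af", "loudnorm=I=-18:LRA=8:TP=-2",  # normalize loudness, squeeze the range
        "-c:a", "aac", "-b:a", "256k",        # re-encode the processed audio
        dst,
    ], check=True)

flatten_dynamics("movie_rip.mkv", "movie_rip_flat.mkv")
```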