adroc_thurston
Diamond Member
- Jul 2, 2023
- 5,403
- 7,573
- 96
> When is next SPEC version coming?

Some day.

> There is more than a decade between SPEC2006 and SPEC2017.

Well yeah, they ain't breaking what works.
> Awesome, thank you! This confirms my previous suspicions that the Omnibook X was efficiency-oriented.
>
> That being said, I have determined that I must be a very heavy battery user. I'm getting between 6 and 10 hours with this Omnibook X, which is better than anything I've ever used before, but I was really hoping to hit 15-16 hours AT LEAST. I mean, they claim up to 26 hours - idk if we will ever actually get there for anything more than video playback...

Do you run with the screen extra bright? That could be a huge battery drainer.
> The X Elite and X Plus chips, used for Windows on ARM (WOA), will reach about 2 million unit shipments in 2024, with expected year-on-year growth of at least 100–200% in 2025. The X Elite and X Plus will have modified versions in 2025, with a reduction in end-product prices. Additionally, Qualcomm plans to launch a low-cost WOA processor codenamed Canim for mainstream models (priced between $599–799) in 4Q25. This low-cost chip, manufactured on TSMC's N4 node, will retain the same AI processing power as the X Elite and X Plus (40 TOPS).

That bit is very interesting. What kind of modifications are we talking about?
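The growth figures in that forecast can be sanity-checked with a quick sketch (only the ~2 million 2024 base and the 100–200% growth range come from the quote; the rest is plain arithmetic):

```python
# Back-of-envelope projection of 2025 WOA shipments from the quoted forecast.
base_2024 = 2_000_000  # ~2M X Elite / X Plus units in 2024 (from the quote)

# "At least 100-200%" year-on-year growth brackets 2025 in this range:
low_2025 = base_2024 * (1 + 1.00)   # +100% -> 4M units
high_2025 = base_2024 * (1 + 2.00)  # +200% -> 6M units
print(f"{low_2025:,.0f} to {high_2025:,.0f} units")
```

So even the bullish end of that range leaves 2025 WOA volume at 4-6 million units, a small slice of the roughly 250-million-unit annual PC market.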
Cellular is too expensive anyway. No one wants to worry about their bill going up while browsing the internet, hence the preference for WiFi.
> An interesting analysis of the Dell leak (which happened a few months ago). I don't think it was posted here then, so now I am posting it. There's some juicy details, including stuff about the PMIC fiasco (it is real!).
>
> Dell Mega Leak Analysis
> Incredibly bullish for $QCOM. Terrifying for $INTC.
> irrationalanalysis.substack.com

Wow, Qualcomm somehow managed to lose money by being forced to subsidize OEMs buying Qualcomm's proprietary PMICs? How stupid is that? Let's hope they end that policy as quickly as possible.
PICNICS!

At least in the US, unlimited tethering is pretty much standard now. So what would be the benefit of having cellular in my laptop, even if it didn't cost anything?
> PICNICS!

New Jerry Seinfeld project: Picnics with PMICs.
> It was not in 7 months. It was developed for several years, as always.

Exactly, I'm not sure why Apple would do what they did on purpose. Typically you want products to be on sale long enough to pay back the investment to create and produce them in the first place. MTL is the same story, imho.
What clearly happened is that M3 was pushed back a lot due to process delay, almost to the point where M4 was ready. There was originally going to be a much shorter gap between the M1 and M3 architectures, and thus a larger gap between what is now M3 and M4, that's all.
That's pretty clear given how Apple went three years without a new performance core before that. There's roughly 4 years between M1 and M4, so it was clearly intended to be closer to ~2 years of spacing between new architectures originally. And both of those new uarchs (M3, M4) show only single-digit-percent IPC uplifts.
Those 18-24 month gaps are pretty much what AMD does, and the IPC, well. Apple owes the lion's share of their success after M1 to chasing clock speeds, which may be an effect of starting from a low level, but also of having the tailwind of being the richest kid on the block with the most reckless customers, so they can afford the node-upgrade goodies 1-2 years before the rest.
> Do you run with the screen extra bright? That could be a huge battery drainer.

Guilty as charged. Screen on full brightness (although in my defense, the Omnibook only has a 300-nit screen, so it isn't like I'm blasting my eyes with bright light) and lots of multitasking. I will say last night I was just doing some light browsing and I hit 21 hrs estimated battery life, and that was still with full brightness!
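The spread between the claimed and observed runtimes is easiest to read as average power draw. A minimal sketch, assuming (not confirmed in this thread) a battery capacity of roughly 59 Wh for the Omnibook X; the runtime figures come from the posts above:

```python
# Implied average system power draw for a given runtime: P = E / t.
CAPACITY_WH = 59.0  # assumed pack size; check the actual spec sheet

def implied_draw_watts(runtime_hours: float) -> float:
    """Average draw (W) needed to last `runtime_hours` on one charge."""
    return CAPACITY_WH / runtime_hours

print(round(implied_draw_watts(26), 1))  # claimed 26 h  -> ~2.3 W average
print(round(implied_draw_watts(8), 1))   # observed ~8 h -> ~7.4 W average
print(round(implied_draw_watts(21), 1))  # 21 h light-browsing estimate -> ~2.8 W
```

A sub-3 W average is dimmed-screen video-playback territory, while full brightness plus heavy multitasking landing in the 6-10 W range lines up with the observed 6-10 hour results.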
Regarding trackpads: obviously totally unfashionable and pretty much impossible in ever-thinner laptops, but IMO track points - a la IBM ThinkPad "sticks" - are the only good laptop mouse controllers.

When typing, the last thing I want is to move my hands to use the mouse to click on some minor thing. When not typing and doing mouse things, I'd rather use a proper mouse.
> At least in the US, unlimited tethering is pretty much standard now. So what would be the benefit of having cellular in my laptop, even if it didn't cost anything?

Nothing, AFAICT. The number of times I have needed network connectivity and haven't had my cell with me is ... maybe zero? I have had issues where I had no service, but I don't think a modem in the laptop itself would have helped in that case.
Crazy what we consider 'low-end' these days. I remember when the first quad-core laptops came out and I was like 😲. Of course, I also remember when I upgraded to an absolutely decadent 768 MB of RAM as well and remember thinking I'd never need any more.
> Crazy what we consider 'low-end' these days. I remember when the first quad-core laptops came out and I was like 😲. Of course, I also remember when I upgraded to an absolutely decadent 768 MB of RAM as well and remember thinking I'd never need any more.

Extremely cheap consumer devices are basically using the Ryzen 5500U or Intel Core i3-1215U currently. Both are extremely capable SoCs for the price you can buy devices with them at.
> Extremely cheap consumer devices are basically using the Ryzen 5500U or Intel Core i3-1215U currently. Both are extremely capable SoCs for the price you can buy devices with them at.

I remember the early days of Celeron. You could not do anything with it. Then I used some last-gen Celeron two years ago, and it was impressive how well it held up in many tasks.
It's indeed a wild leap over what we used to have 5 to 10 years back. In the coming years, we'll see even more low-end love with QCOM Canim, AMD Sonoma Valley and Kraken 2, and Intel PTL-U. Absolutely good stuff for a cheap price.
> Linus Tech Tips agrees that the naming scheme for Snapdragon X SKUs is not good:

He was not joking when he said it was "easy" taking their money just to make a video with the computers. Poor Qualcomm fools!
> Sidenote, in hindsight Samsung's decision to partner with AMD to put RDNA in Exynos looks like a 200 IQ move.

And it's why I fear for their self-developed GPU efforts.
> So in terms of GPU architectures, I guess we can rank them like this:
>
> Tier 1: Nvidia
> Tier 2: AMD, Intel
> Tier 3: Apple
> Tier 4: Qualcomm, ARM

Move AMD to tier 1 and Apple to tier 2 and it's OK.