Discussion Qualcomm Snapdragon Thread


ikjadoon

Member
Sep 4, 2006
145
242
126
I can't read German so I didn't read the article, just quickly glanced through it. But those numbers don't make sense at all. Are the AMD/Intel parts using CPU decoding? What codec is being used, VP9 or AV1? CPU or GPU decoding? Which browser: Chrome/FF/Edge/Safari? The non-x86 numbers align and are where they should be; hardware decoding of 4K video should really only use around 1 Wh per hour of playback with the latest processors. Those x86 numbers, however, don't look right at all. It makes it seem like the CPU is doing all the decoding at that point. Can anyone who can read the article offer more insight?

I also can't read German, haha. Luckily, the official English translation has now been posted.

But, without additional data, I'm also surprised at the Redmi & Schenker power draw. I don't have an x86 laptop left in the house these days to run a sanity check. Could something have gone awry with the iGPU → external monitor waking up too many parts of the SoC that would otherwise be asleep or, as you posit, was it forcing CPU decode somehow?

Are these laptops really burning 16-18W on YouTube videos? You've asked great questions and NBC did not provide much methodology for this new test they've introduced.

The only other questions I'd ask: is this over Wi-Fi? These don't use the same Wi-Fi chips. They also use different SSDs (and, IIRC, some browsers needlessly cache video playback, causing excessive writes).

These are more laptop comparisons than SoC comparisons, then, in an article mostly about an SoC, but perhaps they thought it was interesting to share.

I hope they keep their promise and add more laptops, however. I'd love to see a more traditional OEM's laptop shown on AMD / Intel, e.g., HP / Lenovo / Dell, as I'm less familiar with Schenker's and Redmi's designs.

I've tested with this video on a 2019 Intel 16" i7 MBP. What NBC got is weird: I get 12 watts total core power on a 14nm i7. Those x86 figures should be much lower.

View attachment 101503

Thank you for running this sanity check. I believe NBC is showing total system power draw with an external display. Would you know if you're using hw decode or sw decode here? Chrome has a way to check in DevTools once the video playback starts:

DevTools → Media → select the playing video and it'll show up on the right side. I can't remember which VP9 profiles are accelerated on Macs, as I remember Apple had that long spat.



NBC uses a flawed methodology. While understandable, measuring AC power means most laptops do not have all their power management enabled, so you often see a mismatch between the actual battery life and the reported power figure in watts.

You'd see "Idle minimum" at 7 W, yet the laptop gets 10 hours from a 70 Wh battery under Wi-Fi browsing, i.e. the same 7 W, and Wi-Fi browsing isn't "Idle minimum".

That is interesting, yes. NBC's power figures do not seem reliable sometimes. This is where a clear methodology is quite important so at least it's reproducible but they also don't have that.

HWInfo64, I believe, has the battery drain power draw and I'd think reducing it to 150 nits or even 50 nits would've been a more interesting comparison than just one power figure with an unknown methodology.

Can't even tell if that is a peak figure or how long they tested for.
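As a sanity check, the mismatch quoted above (7 W "Idle minimum" at the mains vs. 10 hours on a 70 Wh battery) is simple arithmetic; a minimal sketch of the implied on-battery drain (the helper name is mine, purely illustrative):

```python
def implied_avg_power_w(battery_wh: float, runtime_h: float) -> float:
    """Average battery drain in watts implied by a measured runtime."""
    return battery_wh / runtime_h

# A 70 Wh battery lasting 10 hours of Wi-Fi browsing implies ~7 W average
# drain from the battery, the same figure reported as "Idle minimum" at
# the mains, which is why AC-side measurements can mislead.
print(implied_avg_power_w(70, 10))  # → 7.0
```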

Those idle numbers show that there is certainly overlap between Intel & AMD designs and Qualcomm designs. The Redmi laptop (with MTL) has a lower idle than the ASUS laptop (Qualcomm 78).

Notebookcheck tables for ST and MT look great actually. So what is the catch?

The 1T performance is quite good & the power looks nice compared to Intel & AMD. To get the 84 1T perf, AMD & Intel virtually demand a gaming laptop. But, again, very very few laptops ship with the 84. At least one major laptop only ships with the 78 (HP OmniBook X).

However, the picture is a little murkier depending on the device, and may only matter depending on your expectations:
  • the ASUS Vivobook cuts 1T perf by over half on battery by default;
  • some laptops only come with the slower 78 SKU (e.g., HP OmniBook X);
  • virtually no laptops come with the 00 tier that Qualcomm used in its launch presentation;
  • zero laptops or tablets (!) are fanless;
  • compatibility is not as seamless as first claimed (covered in the reddit thread linked earlier);
  • lots of little bugs remain in shipping systems (e.g., the Just Josh livestream showed one laptop suddenly falling asleep, after which the webcam couldn't wake up properly);
  • the GPU drivers are rough and, as of this week, can't even be re-installed, as Qualcomm's installer errors out.

As @coercitiv mentioned, Qualcomm's CEO & technical marketing went overboard, showing off results that are not generally applicable: 3.2K 1T GB6 scores that won't ever ship in a Windows laptop; "fastest laptop! Period!" when Cristiano well knew his devices wouldn't ship for 7+ months; multiple scores revised down as the launch drew closer. But while I don't like this side of it, it really only affects us, the people who bother to listen to CEOs and technical marketing, lol.

Actual consumers just want to see how devices perform & how simple / easy it is to move to a WoA device. Yet even that is still up in the air, as reviewers only just got devices.
 
Last edited:

eek2121

Diamond Member
Aug 2, 2005
3,045
4,266
136
Actual consumers just want to see how devices perform & how simple / easy it is to move to a WoA device. Yet even that is still up in the air, as reviewers only just got devices.

Spot on. Most users aren’t technical. If they have issues, they will simply return it and get something else.

That is why the claims of ARM reaching 50% market share within a few years are dubious.

Look at how long it has taken AMD to get anywhere with Intel since the original Ryzen launch.

Qualcomm and Microsoft have one chance to even make a 1% dent in market share, and both are screwing it up.

The driver situation should have been sorted out before production silicon was ever allowed to launch.

Why is x86 software, even non-gaming software, still failing to launch despite an emulation layer being present? I haven’t had time to play with a recent WoA build, but I had hoped things would have improved.

They need to fix things, and fast.
 

podspi

Golden Member
Jan 11, 2011
1,982
102
106
The marketing. They said it would be faster, smarter and have better battery life. They hyped it to the moon. Everything the competition could do, the X Elite Copilot+PC could do better. In reality it's just good hardware with a software deficiency.
Honestly, yeah. Without the Recall features it is just another laptop.
How's your Elitebook so far?
I actually bought the Omnibook X, though they are more or less identical.
Here are my first impressions (after owning it for 2 days, and not really getting to use it above and beyond setup). For background, my previous laptop was an HP Spectre, and the other laptop I was considering was the 2024 Spectre w/ MTL.

- Beautiful laptop with good build quality.

- I am not 100% on board with the direction HP is going with their keyboards, but I am notoriously picky about keyboards and will probably need a few weeks to really decide. The laptop is also quite thin, which affects the keyboard.

- Thermals are excellent, the best I've ever experienced on an actively cooled laptop. The display is great for general productivity tasks, which is why I bought it. I don't think I'd be happy if I bought it to play games or watch movies, though there are no obvious deficiencies other than the relatively low maximum brightness for a laptop in 2024.

Now that I've gotten all of the things I wish someone else would've said on the internet before, on to the stuff everyone really cares about:

- Performance is excellent. This thing feels quick, even when it is unplugged (where it seems to max out at 2.4 GHz; forget turbo higher than 3.4 GHz 🤣). Webapps and Electron apps run great, as do Office and Windows in general.

- Compatibility is fine. It isn't perfect, and I am not gaming, but the only real issues I've had are virtualization (still haven't actually gotten a VM running yet) and VPN, and the VPN has an easy workaround (either the built-in functionality or OpenVPN, which has an ARM64 version).

- I can't comment on gaming. Although I do occasionally play games, it won't be on this. I did have the GFX driver crash out on the desktop though, so demerits for that.

- Battery life is mixed. I am still setting up and configuring everything, so this is difficult to gauge. It definitely sips less power than my old Spectre with a 1065G7 in it, but with an advertised battery life of up to 26 hours, I am hoping that once everything is set up and I am just web browsing, I see at least 12+ hours in mixed usage.


Overall, I would say that they did overpromise a bit, and the launch was unacceptably rushed for something that had been delayed for so long. OOBE was pretty rough; I had to power-cycle it a few times before it would boot up. It also rebooted randomly / crashed out at the desktop a few times, as noted above. Since all of the updates were downloaded, though, I haven't had any issues.

At this point, my overall satisfaction pretty much hinges on the usefulness of the NPU, along with the battery life. Unlike a lot of people on these forums, I'm quite bullish on AI and see the usefulness of having an IP block capable of performing these tasks at low power, though it is really disappointing these launched without any serious software support for the NPU.
 

SpudLobby

Senior member
May 18, 2022
961
655
106

If it’s true that 8 Gen 4 is running at around 4 GHz or more (and again, even just 4 GHz is high in this case, given where Oryon is right now), doing so under 6 W for GB and SPECint, and is architecturally mostly similar to Oryon & on N3E —

Then we know something was f*cked up with X Elite’s power delivery or physical design and fabric.

Moving to N3E will give them a bit of headroom, like +5-10% more frequency iso-power from where they are now, but that’s not enough to get them into shape with what they’re currently doing lol. Absolutely not enough.

Anyway, if they can pull off A17 Pro ST at similar power without using more (P-core, anyway) L2 [16MB] & SLC [24MB] than Apple, that’s amazing and would mean something really was sloppy about X Elite. But right now I don’t have those expectations if I had to gamble; it's just not impossible that they clean up some lowish-hanging fruit they blew up.
 

FlameTail

Diamond Member
Dec 15, 2021
3,151
1,800
106
Rumours are that the next flagship phones with the 8G4 are packing bigger batteries (5500-6000 mAh).

Why do you think that is?

...
 

Tup3x

Golden Member
Dec 31, 2016
1,008
996
136
Battery tech advances, and it doesn't really have anything to do with the next Snapdragon.
 

adroc_thurston

Diamond Member
Jul 2, 2023
3,319
4,788
96
Nvidia ARM SoC, Mediatek ARM SoC etc...
That's the same thing.
Then we know something was f*cked up with X Elite’s power delivery or physical design and fabric.
RDNA3bros...
Rumours are that next flagship phones with 8G4 are packing bigger batteries (5500-6000 mAh).
Super unrelated, most flagships have been sitting at ~5k mAh and it's time for a bump.
Chinese OEMs went to 5.5k flagships last year anyway, iirc.
 

ikjadoon

Member
Sep 4, 2006
145
242
126
I actually bought the Omnibook X, though they are more or less identical.
Here are my first impressions (after owning it for 2 days, and not really getting to use it above and beyond setup). For background, my previous laptop was an HP Spectre, and the other laptop I was considering was the 2024 Spectre w/ MTL.

Thank you for sharing all this. I'm quite glad to hear about the active cooling / thermals. 2.4 GHz is all right, especially as that should perform like a laptop released in the past 2-3 years (ASUS, looking at you).

I wish OEMs were more up front about that, but then they never were transparent with Intel / AMD designs either, so why start now with Qualcomm? Still, I'd expected Oryon designs to be closer to Apple's designs and Qualcomm's pre-Oryon designs: virtually no difference in 1T perf between unplugged and plugged.

Funnily enough, I also had a Spectre x360 with the i7-1065G7, and that "Intel 10nm" part was not very Ice Lake-like in practice, as with most Intel & AMD laptops, it seems. Unplugged, it'd start to stutter a bit switching apps, and the quiet (silent?) mode was genuinely slow even in web browsing with a few background apps (Spotify, todo list, PowerPoint, etc.). And then forget summers with bare legs.

//

Spot on. Most users aren’t technical. If they have issues, they will simply return it and get something else.

That is why the claims of ARM reaching 50% market share within a few years are dubious.

Look at how long it has taken AMD to get anywhere with Intel since the original Ryzen launch.

Qualcomm and Microsoft have one chance to even make a 1% dent in market share, and both are screwing it up.

The driver situation should have been sorted out before production silicon was ever allowed to launch.

Why is x86 software, even non-gaming software, still failing to launch despite an emulation layer being present? I haven’t had time to play with a recent WoA build, but I had hoped things would have improved.

They need to fix things, and fast.

That's a fair point. It may depend on how much people can overlook and what devices people are coming from. Most people (an assumption here) buying an Oryon laptop are probably coming from a 2-5+ year-old laptop. Will Oryon be faster and more efficient (even if throttled) than those laptops? I'd say yes, and that'd be great.

Throttled but "still faster than my old laptop" is a win. (As an aside: from Qualcomm's 1T perf / W charts, I assumed a 5-10 W burst budget would've been very workable → fanless, pls, but nada, nothing.)

But, then, if this Oryon laptop has too many bugs / much worse incompatibility, then people may reject them. "I don't care if it's hotter or slower; I need this to work today and it doesn't."

Re: compatibility: it looks a little rough now, yes. I don't know what the "long tail" will look like here: is it 3 months? 6 months? Years?
  • Some apps failing to launch even with Prism emulation: I suspect these software developers tested Prism, still hit a really ugly roadblock (e.g., hundreds of bugs, crashes, files being deleted, etc.), and disallowed emulation to prevent user complaints / reputational damage. While it is bad and people get unwelcome surprises, sadly the alternative is even worse: "My Qualcomm laptop deleted half my Google Drive!"
  • Some apps may eventually be native, but they may not have all the same features. Adobe did this a few times with their native versions: it is native, but it's not a complete port yet.
  • Some apps may go mostly native, but it'll take a longer time to weed out all the x86-dependent code (e.g., older libraries, dependency A has dependency B). This is more ancillary and shouldn't prevent usage, but it may contribute to still-not-parity.
Ideally, it's not years, but, yes: 50% seems like a large jump. Qualcomm will get the brunt of the incompatibility problems vs. later Arm vendors because, well, QC wanted to be the exclusive WoA provider & that will cut both ways.
 
Reactions: Tlh97 and Joe NYC

SpudLobby

Senior member
May 18, 2022
961
655
106
Super unrelated, most flagships have been sitting at ~5k mAh and it's time for a bump.
Chinese OEMs went to 5.5k flagships last year anyway, iirc.
Yep. “5.5K mAh, holy shiz!” from leakers losing their minds kills me. Like, have you guys seen the current Xiaomi and Huawei stuff? Lmao.
 

KompuKare

Golden Member
Jul 28, 2009
1,069
1,108
136
NBC uses a flawed methodology. While understandable, measuring AC power means most laptops do not have all their power management enabled, so you often see a mismatch between the actual battery life and the reported power figure in watts.

You'd see "Idle minimum" at 7 W, yet the laptop gets 10 hours from a 70 Wh battery under Wi-Fi browsing, i.e. the same 7 W, and Wi-Fi browsing isn't "Idle minimum".
I guess there are at least 4 ways to measure:
  1. Use the chips self-reporting
  2. Measure at the mains
  3. Take the device apart and measure the actual battery draw
  4. Fully charge the device. Measure without the mains. Then accurately measure how much it takes to charge back to 100%.
None are perfect.
#1 might be okay for multiple devices with the same vendor's chips, but probably even then you'd want same vendor and the same generation in case that changes.

#2 has the flaws you pointed out.

#3 is probably the most accurate but some devices are pretty tricky to open.

#4 could be a compromise where you don't have to open anything but does require a good estimate of how efficient the charger is and so on.

For #4, an accurate tally of the power used to get to 90% might be better, as charging usually tapers off at the end of the charging cycle - but that does require constantly watching the percentages, or cameras.
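Method #4 above reduces to a short calculation; a minimal sketch, where the 90% charger efficiency and the example figures are assumptions for illustration, not measurements:

```python
def avg_power_from_recharge_w(wall_energy_wh: float,
                              charger_efficiency: float,
                              test_duration_h: float) -> float:
    """Average on-battery power draw estimated from the wall energy
    needed to recharge, corrected for charger losses."""
    battery_energy_wh = wall_energy_wh * charger_efficiency
    return battery_energy_wh / test_duration_h

# e.g. the wall meter reads 25 Wh to recharge after a 3-hour test, and
# we assume a ~90% efficient charger: ~7.5 W average draw during the test.
print(avg_power_from_recharge_w(25, 0.9, 3))  # → 7.5
```

The charger-efficiency estimate dominates the error here, which is exactly the weakness KompuKare points out.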
 

eek2121

Diamond Member
Aug 2, 2005
3,045
4,266
136
I guess there are at least 4 ways to measure:
  1. Use the chips self-reporting
  2. Measure at the mains
  3. Take the device apart and measure the actual battery draw
  4. Fully charge the device. Measure without the mains. Then accurately measure how much it takes to charge back to 100%.
None are perfect.
#1 might be okay for multiple devices with the same vendor's chips, but probably even then you'd want same vendor and the same generation in case that changes.

#2 has the flaws you pointed out.

#3 is probably the most accurate but some devices are pretty tricky to open.

#4 could be a compromise where you don't have to open anything but does require a good estimate of how efficient the charger is and so on.

For #4, an accurate tally of the power used to get to 90% might be better, as charging usually tapers off at the end of the charging cycle - but that does require constantly watching the percentages, or cameras.

There are other ways to get accurate measurements. I have never tried to do accurate power measurements on a laptop, but possibly probes on the VRMs or something else would work. You actually have me curious now…
 
Reactions: KompuKare

KompuKare

Golden Member
Jul 28, 2009
1,069
1,108
136
There are other ways to get accurate measurements. I have never tried to do accurate power measurements in a laptop, but possibly probes to the VRMs or something else. You actually have me curious, now…
Some VRMs should be readable in software too.

Problems are: expecting a new platform to accurately report on the motherboard VRMs is not a given...

Having to rely on someone to accurately place probes requires a lot of hardware knowledge...

Measuring at the mains is the easiest. My original suggestion #4 at least should work without needing supporting software or hardware knowledge: just a good wall meter to measure the power taken to charge it back up, and accurate note-taking.
 
Reactions: Nothingness

FlameTail

Diamond Member
Dec 15, 2021
3,151
1,800
106
Lenovo Yoga Slim 7x Snapdragon review


I called it many weeks ago: the Yoga Slim 7x is one of the best laptops with the new X Elite chip.

• Great OLED screen
• Great build quality
• Good battery life
• Great value for money
 
Reactions: Tlh97 and Gideon

trivik12

Senior member
Jan 26, 2006
320
288
136
Asus is not the best laptop brand to launch a new product line with. Lenovo does it much better, especially in their premium lines. At this point, X Elite is not great for gaming, but as a productivity device it is really good, matching current x86 without breaking a sweat. I expect it to take market share in corporate refreshes, as those mostly use only Microsoft products like Office/SharePoint/Teams, etc.
 

Joe NYC

Platinum Member
Jun 26, 2021
2,324
2,929
106
I don’t really care about Zen 5 anymore, all aboard the Zen 6 hype train.

If you haven’t deduced it yet: it’s just another average generational jump. I don’t need to see third-party benchmarks. Certainly not from those YouTubers.

Skipping Zen 5 x3d at +26%
 

Hitman928

Diamond Member
Apr 15, 2012
5,593
8,770
136
A couple more reviews came up in my feed. Neither is very good, imo, especially the second one (way too much, I'll say, fluff). However, there are a few interesting things to take from them.

The first one has the Samsung laptop with the top SKU with a 4.2 GHz boost. Unfortunately, it seems to not be a very good device overall. It has an audible and (according to the reviewer) annoyingly whiny fan. Additionally, it drops clocks significantly on battery power, even on the highest performance profile, to the extent that it barely equals or falls behind the Surface Laptop (also on battery) in MT tests, even though the Surface has the 2nd-tier SKU. On the plus side, the GPU performs better, but we all know how useful that is right now. Unfortunately, he only tests GB when plugged in; everything else is done on battery, so with the Samsung dropping clocks on battery, we don't really get to learn much about the faster SKU's thermals/power. Lastly, the Surface model stays pretty quiet but can get toasty at 47°C surface temperature.

The second one I found interesting because he monitors the clock speeds while CB24 is running. The cores start at ~3.4 GHz but slowly drop to ~2.2 GHz by the end of the 10-minute stress test, so it seems that QC is using some sort of boosting algorithm and 3.4 GHz is not really the base clock but more the all-core boost. Granted, this was done on battery, so maybe that's why they dropped, but the profile was set to high performance (edit: I believe that CB24 reports the highest score during the 10-minute run, so my prior comment wasn't accurate and I've deleted it). It also wasn't thermal throttling, at least for sure once it got below 3 GHz. It could also be that HWMonitor is not reporting frequencies correctly, but it's an interesting observation and something I hope someone looks further into, as I don't think this type of behavior is seen on AMD/Intel laptops on battery power, but someone can correct me if I'm wrong.

 
Last edited:
Reactions: Nothingness

FlameTail

Diamond Member
Dec 15, 2021
3,151
1,800
106
Okay, so here's how the X Elite performs in 3DMark Steel Nomad Light.

Snapdragon X Elite* : 2000 points
Apple M2 : 2600 points
Apple M4 : 3600 points
Radeon 780M : 3000 points
Core Ultra 155H : 3600 points
RTX 4050 : 9000 points

Steel Nomad is relevant because it's new and more representative of modern games. Steel Nomad stresses the GPU's compute aspects more than something like Wild Life Extreme. Here's a set of tweets from David Huang explaining it:


As you can see, the X Elite does not perform well at Steel Nomad Light. The lacking compute performance of Adreno 7xx is largely to blame, as well as the poor drivers.
 

adroc_thurston

Diamond Member
Jul 2, 2023
3,319
4,788
96
The lacking compute performance of Adreno 7xx is largely to blame, as well as the poor drivers.
They're actually loaded on raw compute.
I assume it's the usual Adreno pitfalls of "low cache b/w" and "bad VALU/GPR ratio".
Oh and it, well, clocks low.
 

poke01

Golden Member
Mar 8, 2022
1,388
1,601
106
Okay, so here's how the X Elite performs in 3DMark Steel Nomad Light.

Snapdragon X Elite* : 2000 points
Apple M2 : 2600 points
Apple M4 : 3600 points
Radeon 780M : 3000 points
Core Ultra 155H : 3600 points
RTX 4050 : 9000 points

Steel Nomad is relevant because it's new and more representative of modern games. Steel Nomad stresses the GPU's compute aspects more than something like Wild Life Extreme. Here's a set of tweets from David Huang explaining it:


As you can see, the X Elite does not perform well at Steel Nomad Light. The lacking compute performance of Adreno 7xx is largely to blame, as well as the poor drivers.
Oh, Pat wasn’t kidding when he said he wasn’t worried about X Elite.

Lunar is going to wreck this GPU and Strix is going to steamroll it. Looks like Lunar Lake will have a better iGPU than the M4. I can’t wait for a 12”/13” Lunar notebook, coupling great battery life with performance for its class. It’s the SoC the 2015 12” MacBook should have had.
 

FlameTail

Diamond Member
Dec 15, 2021
3,151
1,800
106
          SNL    WLE     SNL/WLE
Apple M2  2600   6900    0.376
Apple M3  3600   9000    0.400
780M      3000   6000    0.500
155H      3300   6500    0.507
X Elite   2000   6300    0.317
8G3       1700   5170    0.328
4050M     9000   17000   0.529
3050M     5800   11000   0.527
SNL = 3DMark Steel Nomad Light.
WLE = 3DMark Wild Life Extreme

As you can see, Adreno has the lowest SNL/WLE ratio, while Nvidia has the highest.
 

Ghostsonplanets

Senior member
Mar 1, 2024
545
945
96
The Switch 2 will have a better GPU than this SoC….
Is this surprising? Tegra X1 already showed the difference between a desktop-class GPU arch and a mobile-oriented GPU arch. It was able to hold its own for a long time against its newer Android peers.

Switch 2 will be using the proven Ampere uArch, which is a DX12U/12.2 Desktop Class uArch designed for current and future gaming R&D such as Compute Shaders, Ray Tracing, Virtualized Geometry, etc. Not only that, but it's also a big GPU for a portable device (1536 Shaders/ALUs).

Adreno 7xx is decent enough. But, as it showed with Snapdragon X, it's not suited for desktop gaming. Let's see if Adreno 8 brings changes to align more with PC-oriented GPU design rather than mobile.
 