Discussion Intel Meteor, Arrow, Lunar & Panther Lakes Discussion Threads


Tigerick

Senior member
Apr 1, 2022

With Hot Chips 34 starting this week, Intel will unveil technical details of the upcoming Meteor Lake (MTL) and Arrow Lake (ARL), the next-generation platforms after Raptor Lake. Both MTL and ARL represent a new direction in which Intel moves to multiple chiplets combined into one SoC platform.

MTL also introduces a new compute tile based on the Intel 4 process, which uses EUV lithography, a first for Intel. Intel expects to ship the MTL mobile SoC in 2023.

ARL will come after MTL, so Intel should be shipping it in 2024; that is what Intel's roadmap is telling us. The ARL compute tile will be manufactured on the Intel 20A process, Intel's first to use GAA transistors, which it calls RibbonFET.



Comparison of Intel's upcoming U-series CPUs: Core Ultra 100U, Lunar Lake and Panther Lake

| Model | Code-Name | Date | TDP | Node | Tiles | Main Tile | CPU | LP E-Core | LLC | GPU | Xe-cores |
| Core Ultra 100U | Meteor Lake | Q4 2023 | 15 - 57 W | Intel 4 + N5 + N6 | 4 | tCPU | 2P + 8E | 2 | 12 MB | Intel Graphics | 4 |
| ? | Lunar Lake | Q4 2024 | 17 - 30 W | N3B + N6 | 2 | CPU + GPU & IMC | 4P + 4E | 0 | 12 MB | Arc | 8 |
| ? | Panther Lake | Q1 2026 ? | ? | Intel 18A + N3E | 3 | CPU + MC | 4P + 8E | 4 | ? | Arc | 12 |



Comparison of die sizes of each tile of Meteor Lake, Arrow Lake, Lunar Lake and Panther Lake

| | Meteor Lake | Arrow Lake (20A) | Arrow Lake (N3B) | Lunar Lake | Panther Lake |
| Platform | Mobile H/U only | Desktop only | Desktop & Mobile H&HX | Mobile U only | Mobile H |
| Process Node | Intel 4 | Intel 20A | TSMC N3B | TSMC N3B | Intel 18A |
| Date | Q4 2023 | Q1 2025 ? | Desktop Q4 2024, H&HX Q1 2025 | Q4 2024 | Q1 2026 ? |
| Full Die | 6P + 8E | 6P + 8E ? | 8P + 16E | 4P + 4E | 4P + 8E |
| LLC | 24 MB | 24 MB ? | 36 MB ? | 12 MB | ? |
| tCPU (mm²) | 66.48 | ? | ? | ? | ? |
| tGPU (mm²) | 44.45 | ? | ? | ? | ? |
| SoC (mm²) | 96.77 | ? | ? | ? | ? |
| IOE (mm²) | 44.45 | ? | ? | ? | ? |
| Total (mm²) | 252.15 | ? | ? | ? | ? |



Intel Core Ultra 100 - Meteor Lake



As mentioned by Tom's Hardware, TSMC will manufacture the I/O, SoC, and GPU tiles. That means Intel will manufacture only the CPU tile and the Foveros base tile. (Notably, Intel calls the I/O tile an 'I/O Expander,' hence the IOE moniker.)



 

Attachments: PantherLake.png · LNL.png

Magio

Member
May 13, 2024
This isn’t competitive at all with Strix Halo or M4 Max…
I feel like the fact it was P branded suggests it's more likely to be targeting Strix Point than Strix Halo (and M4 Pro rather than Max). Shame it's Alchemist based and not Battlemage though.

I don't think they'll even be targeting Strix Halo at all, for what it's worth. It's kind of unclear whether Strix Halo will be an interesting proposition vs laptops with Strix Point/ARL + an Nvidia GPU.
 

inquiss

Member
Oct 13, 2010
I feel like the fact it was P branded suggests it's more likely to be targeting Strix Point than Strix Halo (and M4 Pro rather than Max). Shame it's Alchemist based and not Battlemage though.

I don't think they'll even be targeting Strix Halo at all, for what it's worth. It's kind of unclear whether Strix Halo will be an interesting proposition vs laptops with Strix Point/ARL + an Nvidia GPU.
It's going to sing on anything that loves RAM. See AI/machine learning nonsense.
 

Philste

Senior member
Oct 13, 2023
This isn’t competitive at all with Strix Halo or M4 Max…
I feel like the fact it was P branded suggests it's more likely to be targeting Strix Point than Strix Halo
What did I miss? Strix with 16CU is faster than 8 Xe Core MTL, but it doesn't completely destroy it. Both Arrow Lake Halo and Strix Halo have 2.5× the ALUs (40CU Strix and 20Xe Arrow). So it would be decently competitive GPU-Wise. It surely misses a few Cores/threads CPU wise tho.
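As a back-of-the-envelope check on that 2.5× figure, here is a minimal sketch, assuming 64 FP32 lanes per RDNA 3 CU and 128 FP32 lanes per Xe-LPG Xe core (real throughput also depends on clocks and dual-issue, which this ignores):

```python
# Rough ALU-count comparison (assumed widths: 64 FP32 lanes per RDNA 3 CU,
# 128 FP32 lanes per Xe-LPG Xe core; clocks and dual-issue ignored).
CU_LANES, XE_LANES = 64, 128

strix_point = 16 * CU_LANES   # 1024 lanes
mtl_8xe     = 8 * XE_LANES    # 1024 lanes
strix_halo  = 40 * CU_LANES   # 2560 lanes
arl_halo    = 20 * XE_LANES   # 2560 lanes

print(strix_halo / strix_point, arl_halo / mtl_8xe)  # 2.5 2.5
```

On those assumptions the two halo parts land at identical raw lane counts, which is why the comparison comes down to clocks, bandwidth and drivers rather than ALUs.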
 

SiliconFly

Golden Member
Mar 10, 2023
What did I miss? Strix with 16CU is faster than 8 Xe Core MTL, but it doesn't completely destroy it. Both Arrow Lake Halo and Strix Halo have 2.5× the ALUs (40CU Strix and 20Xe Arrow). So it would be decently competitive GPU-Wise. It surely misses a few Cores/threads CPU wise tho.
Well, in theory, an ARL 6+8 CPU with a 320 EU Xe-LPG GPU plus a large L4 ADM cache would have been a solid performer.

But I kinda find these niche products a bit hideous. They're neither the best in compute, nor the best in graphics processing, nor the best in efficiency. There's nothing *halo* about these parts. They should just be called *hollow*.
 

Abwx

Lifer
Apr 2, 2011
Well, in theory, an ARL 6+8 CPU with a 320 EU Xe-LPG GPU plus a large L4 ADM cache would have been a solid performer.

But I kinda find these niche products a bit hideous. They're neither the best in compute, nor the best in graphics processing, nor the best in efficiency. There's nothing *halo* about these parts. They should just be called *hollow*.
That may apply for Intel, as they're penny-pinching on the CPU side of things, but Strix Halo should have CPU performance comparable to a 14900K, or not far off. On the other hand, Intel is just proposing some MTL 185H, which is unbalanced for such a big GPU.
 

SiliconFly

Golden Member
Mar 10, 2023
That may apply for Intel, as they're penny-pinching on the CPU side of things, but Strix Halo should have CPU performance comparable to a 14900K, or not far off. On the other hand, Intel is just proposing some MTL 185H, which is unbalanced for such a big GPU.
That's what you said about Zen 5 too. But the reality turned out to be very different. Not only have Twitter/Reddit users named it Zen 5%, there are other things many people say about the competition that no self-respecting human being would share in this forum. Even HXL posted something nasty about Zen 5 today and I had to defend them (yes, me), saying they're far better than what he says. It's that bad out there.

And yes, please enjoy your magical halo part as much as you want. But I ain't touching no hollow stuff. I'd rather go for a good cpu with a decent gpu like always.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
And yes, please enjoy your magical halo part as much as you want. But I ain't touching no hollow stuff. I'd rather go for a good cpu with a decent gpu like always.
Yeah, but you will be limited to 8 GB VRAM unless you are willing to pay >2000 euro for something stronger.
Strix Halo at least will offer a top-level CPU, but I think it will also cost >2000 euro. ARL CPU looks behind in comparison.
The IGP is not that strong in comparison. 48 CU or 24 Xe would have been more interesting, but bandwidth would have been a bigger problem.
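The bandwidth worry is easy to quantify: peak DRAM bandwidth is just bus width times transfer rate. A minimal sketch (the bus widths and speeds below are illustrative assumptions, not confirmed specs for any of these parts):

```python
def peak_bw_gbs(bus_bits: int, mtps: int) -> float:
    """Peak DRAM bandwidth in GB/s: (bus width in bytes) * (MT/s) / 1000."""
    return bus_bits / 8 * mtps / 1000

# Illustrative configurations (assumptions, not confirmed specs):
print(peak_bw_gbs(128, 7467))   # ~119.5 GB/s - a typical 128-bit LPDDR5X laptop
print(peak_bw_gbs(256, 8000))   # 256.0 GB/s  - a 256-bit LPDDR5X "halo" setup
```

Doubling the shader count without roughly doubling the bus width just starves the iGPU, which is the problem a hypothetical 48 CU / 24 Xe part would run into.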
 

SiliconFly

Golden Member
Mar 10, 2023
ARL CPU looks behind in comparison.
Don't think so. The tGPU part in ARL is poor, but not the CPU part. Every review/leak out there has put Lion Cove and Skymont well ahead of the competition's equivalents one way or another. And ARL's final LNC IPC hasn't even been revealed yet. The last leak stands at a massive 18% uplift for ARL's LNC (a few pages back). Zen 5 doesn't stand much of a chance this time around (including Halo).
 

DavidC1

Senior member
Dec 29, 2023
But I kinda find these niche products a bit hideous. They're neither the best in compute, nor the best in graphics processing, nor the best in efficiency. There's nothing *halo* about these parts. They should just be called *hollow*.
I agree about this part. I don't agree with Arrow Lake beating Zen 5 as a done deal; it hasn't until it does. Even well after release people are finding new things about Zen 5, like the relatively massive gaming improvement with the Windows update that'll make Raptor Lake a Raptor-was.

These Halo parts have always been about getting more premium systems. That's the issue. It doesn't offer better battery life compared to dGPUs, it often costs more than comparable dGPUs, so the only thing you get is the potentially thin form factor, which doesn't make much sense either as now the heat is concentrated even more.

While dGPUs have the disadvantage of needing a separate PCB, the cost difference is much reduced by the fact that they can go in any system configuration and by mass-production methodologies, while the fancy new packaging is much lower volume and harder to assemble too. The "magic" of iGPUs, which is basically zero heat and zero cost, is nullified in the halo versions because, face it, it would be stupid for them to offer it cheap.
 

branch_suggestion

Senior member
Aug 4, 2023
It doesn't offer better battery life compared to dGPUs
That is not true: no need for PCIe paging or the tax of running two different types of RAM.
It often costs more than comparable dGPUs
Yes, but the BoM is lower; being niche parts with advantages, they get charged more by OEMs.
So the only thing you get is the potentially thin form factor, which doesn't make much sense either as now the heat is concentrated even more.
A single SoC is easier to cool in a limited space than 2 separate packages.
While dGPUs have the disadvantage of needing a separate PCB, the cost difference is much reduced by the fact that they can go in any system configuration and by mass-production methodologies, while the fancy new packaging is much lower volume and harder to assemble too.
It is momentum: standard CPU+dGPU designs are very common and the tooling et al. reflects this; new designs require new investments.
But Apple has shown that there is a market for big iGPU configs, so now the death of lower-end dGPUs is finally near.
 

DavidC1

Senior member
Dec 29, 2023
That is not true: no need for PCIe paging or the tax of running two different types of RAM.

Yes, but the BoM is lower; being niche parts with advantages, they get charged more by OEMs.

A single SoC is easier to cool in a limited space than 2 separate packages.

It is momentum: standard CPU+dGPU designs are very common and the tooling et al. reflects this; new designs require new investments.
But Apple has shown that there is a market for big iGPU configs, so now the death of lower-end dGPUs is finally near.
Only true in theory. In reality it often fails. None of Intel's attempts at making such a product had success, not with their own iGPU, nor with using AMD's GPU. Perhaps that's a little more indicative of Intel's lack of execution, but they're examples of how you can't guarantee it.

The advantages, even if they exist, are mostly nullified by charging extra. What kind of volumes are they going to push with a $2000 device?

The fascination with big iGPUs is driven by the unconscious knowledge that iGPUs are very low cost and very low power, which isn't the case here, and the real product often pops the bubble.

What makes the high end dGPU make less sense from an average user's perspective is not these honking big iGPUs but the inevitable fact of Moore's Law and computing advances favoring the low power/low cost ladder and diminishing returns on top.

Going from 360p to 720p made a lot of sense. 720p to 1080p was OK too. But what about to 4K? 4K to 8K? Huge advances in graphical fidelity were made in the beginning. What about now? Even ray tracing with big compute requirements doesn't seem as huge as earlier advances.
 

branch_suggestion

Senior member
Aug 4, 2023
Only true in theory. In reality it often fails. None of Intel's attempts at making such a product had success, not with their own iGPU, nor with using AMD's GPU. Perhaps that's a little more indicative of Intel's lack of execution, but they're examples of how you can't guarantee it.
Sure, you need good enough IP and to execute the right overall package.
Kaby Lake-G was very much a leap into the unknown, competing against superior NV IP. Dismissing the concept based on early attempts with lots of room to improve is not wise.
The advantages, even if they exist, are mostly nullified by charging extra. What kind of volumes are they going to push with a $2000 device?
They can charge lower prices for the same margins with some MDF to offset OEM development costs.
But yes, it will be niche initially, the future depends on what it offers over a conventional design in the eyes of consumers.
The fascination with big iGPUs is driven by the unconscious knowledge that iGPUs are very low cost and very low power, which isn't the case here, and the real product often pops the bubble.
Apple has done it for 3 generations now; we just need an x86 SoC with an iGPU that actually does well in games to show it is a good concept.
What makes the high end dGPU make less sense from an average user's perspective is not these honking big iGPUs but the inevitable fact of Moore's Law and computing advances favoring the low power/low cost ladder and diminishing returns on top.
Luggables are a meme, yes. These devices are not luggables, but they have luggable performance.
Going from 360p to 720p made a lot of sense. 720p to 1080p was OK too. But what about to 4K? 4K to 8K? Huge advances in graphical fidelity were made in the beginning. What about now? Even ray tracing with big compute requirements doesn't seem as huge as earlier advances.
Uncompromising 4K on a laptop that isn't a brick is way off. 1440p is the current comfy spot and still has room to get more comfortable.
 

jdubs03

Senior member
Oct 1, 2013
Based on the core counts there, they must be prioritizing the GPU, because on the CPU side, it’s not even close to Strix Halo. It’ll have to significantly outperform on the graphics side for it to stand a chance in the market.
—————————————————————-
Some salt may be needed, but it is interesting nonetheless.
 

511

Senior member
Jul 12, 2024
Based on the core counts there, they must be prioritizing the GPU, because on the CPU side, it’s not even close to Strix Halo. It’ll have to significantly outperform on the graphics side for it to stand a chance in the market.
—————————————————————-
Some salt may be needed, but it is interesting nonetheless.
View attachment 106471
I think for a halo part, Strix Point-level CPU performance is fine with a beefier GPU, basically 4060M-4070M performance. The only advantage would be Windows not messing with the GPU like it does in a hybrid system.
 

Magio

Member
May 13, 2024
I still can't help but feel like this whole Royal Core narrative is overblown. I'm pretty sure a company Intel's size always has at least a few advanced core design projects in progress and the enshrining of one of them (if it even exists) as "the arch to save the company" never made much sense to me.

I'm sure if Royal Core was legit and actually that promising, they wouldn't have killed it. And if it wasn't promising as a whole but had promising aspects, they'd keep the promising aspects. And I'm sure that has happened to half a dozen advanced core designs before without a peep getting to the public.
 

Hulk

Diamond Member
Oct 9, 1999
Only true in theory. In reality it often fails. None of Intel's attempts at making such a product had success, not with their own iGPU, nor with using AMD's GPU. Perhaps that's a little more indicative of Intel's lack of execution, but they're examples of how you can't guarantee it.

The advantages, even if they exist, are mostly nullified by charging extra. What kind of volumes are they going to push with a $2000 device?

The fascination with big iGPUs is driven by the unconscious knowledge that iGPUs are very low cost and very low power, which isn't the case here, and the real product often pops the bubble.

What makes the high end dGPU make less sense from an average user's perspective is not these honking big iGPUs but the inevitable fact of Moore's Law and computing advances favoring the low power/low cost ladder and diminishing returns on top.

Going from 360p to 720p made a lot of sense. 720p to 1080p was OK too. But what about to 4K? 4K to 8K? Huge advances in graphical fidelity were made in the beginning. What about now? Even ray tracing with big compute requirements doesn't seem as huge as earlier advances.
Eventually human limitations put an end to the endless resolution-upgrade cycle. 480i to 1080p: huge increase in perceived video quality. 1080p to 4K: significant increase in quality. 4K to 8K or higher: not much increase unless you are sitting 2 ft away from an 85" display, which of course is ridiculous.
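That acuity limit is easy to put numbers on. A minimal sketch using the common ~60 pixels-per-degree rule of thumb for 20/20 vision (the display size and viewing distance are just example figures):

```python
import math

def pixels_per_degree(h_pixels: int, diag_in: float, dist_in: float,
                      aspect: float = 16 / 9) -> float:
    """Horizontal pixels per degree of visual angle, averaged over the
    display's horizontal field of view (flat 16:9 panel assumed)."""
    width_in = diag_in * aspect / math.hypot(aspect, 1)
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / dist_in))
    return h_pixels / fov_deg

# 85" display viewed from 8 ft (96"); ~60 px/deg is a common 20/20 threshold.
for name, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f"{name}: {pixels_per_degree(px, 85, 96):.0f} px/deg")
# 1080p: ~46 (visibly coarse), 4K: ~91, 8K: ~182 (both already past ~60)
```

On those assumptions, 4K already exceeds the acuity threshold at a normal living-room distance, so 8K only pays off when you sit absurdly close.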
 

jdubs03

Senior member
Oct 1, 2013
I still can't help but feel like this whole Royal Core narrative is overblown. I'm pretty sure a company Intel's size always has at least a few advanced core design projects in progress and the enshrining of one of them (if it even exists) as "the arch to save the company" never made much sense to me.

I'm sure if Royal Core was legit and actually that promising, they wouldn't have killed it. And if it wasn't promising as a whole but had promising aspects, they'd keep the promising aspects. And I'm sure that has happened to half a dozen advanced core designs before without a peep getting to the public.
Yeah, I mean, I get that point of view. On the Reddit forums, the theory (I suppose based on second-hand knowledge) is that Intel wanted to direct more of its resources to the GPU and NPU teams, which came at the expense of the Royal Core redesign. When that happened, four of the top architects on that team quit and formed their own company, AheadComputing. Part of this theory is that both the P and E core teams are integrating the concepts of Royal into their future products, probably starting with Nova Lake. So maybe they get 2-3 releases with improvements derived from Royal.

It would be managerial malpractice to just dump the project if the benefit down the road was to be very significant. But Intel's management has been making solid decisions for the past decade or so. And that's the source of the worries.
 

AcrosTinus

Member
Jun 23, 2024
I still can't help but feel like this whole Royal Core narrative is overblown. I'm pretty sure a company Intel's size always has at least a few advanced core design projects in progress and the enshrining of one of them (if it even exists) as "the arch to save the company" never made much sense to me.

I'm sure if Royal Core was legit and actually that promising, they wouldn't have killed it. And if it wasn't promising as a whole but had promising aspects, they'd keep the promising aspects. And I'm sure that has happened to half a dozen advanced core designs before without a peep getting to the public.
I couldn't have said it better!!
This is just the natural flow of R&D, but people love to attach a narrative to it; it makes it more exciting.
 

LightningZ71

Golden Member
Mar 10, 2017
Hitting 3050 level of performance with an integrated GPU itself is not a small task. Forget 4060/4070.
We get that 3050 mobile performance is no joke, and that it's still a notch above what the 890M is giving in Strix Point in most cases. What Phoenix and now Strix Point have done is essentially eliminate the need for anything below the 3050 mobile. None of Nvidia's or AMD's products below that level are clearly better than those iGPUs, nor will they be above Lunar Lake's iGPU in performance either. While there will still be some stragglers in the low-cost, trailing-edge design category, it's not something that's going to get a lot of development love going forward.

Essentially, now that we have things like DLSS/XeSS/FSR, as long as an iGPU is putting out decent performance at native 1080p, laptops with 1080p to 1440p screens are going to be adequately served by them unless someone REALLY wants top-tier fidelity. Look at the Strix Point high-TDP reviews on NBC's site. The 890M is producing playable FPS at low/mid quality at native 1080p in the majority of the games tested. It doesn't make much sense to spend the extra money on a laptop with a 3050 mobile over a Strix Point laptop with a decent TDP and decent RAM capacity when the quality of the gaming experience won't be worth the extra money spent.
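To put numbers on the upscaling point: the published FSR 2 per-axis scale factors mean the GPU renders internally well below panel resolution (DLSS and XeSS use broadly similar ratios). A minimal sketch for a 1440p panel:

```python
# Internal render resolution per upscaler mode (FSR 2 published per-axis
# scale factors; DLSS/XeSS use broadly similar ratios).
MODES = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

def render_res(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal resolution the GPU actually renders before upscaling."""
    return round(out_w / scale), round(out_h / scale)

for mode, s in MODES.items():
    print(mode, render_res(2560, 1440, s))
# Quality (1707, 960), Balanced (1506, 847), Performance (1280, 720)
```

So a 1440p laptop in Performance mode is really only asking the iGPU for 720p rendering, which is exactly why a decent native-1080p iGPU covers these screens.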
 

dullard

Elite Member
May 21, 2001
4K to 8K or higher: not much increase unless you are sitting 2 ft away from an 85" display, which of course is ridiculous.
You are thinking of video, which is valid. But I think of still images: photography, CAD/SolidWorks, science/medical image analysis, productivity software (such as large Excel sheets or many lines of code). For just about any flagship phone (or DSLR camera), an 8K monitor is insufficient to show images from the main camera without artifacts caused by downscaling. And those image artifacts are independent of the monitor's distance from the viewer.

Yes, there are human limitations, but for still images 8K is far from that limitation due to the downscaling or anti-aliasing needed. I personally get frustrated in SolidWorks trying to select an exact line in the middle of multiple anti-aliased lines. A ~16K monitor (if I had one), even 3 feet from my eyes, would really help with that task.
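A quick pixel-count comparison shows why (the sensor resolutions below are just illustrative examples, not any particular camera):

```python
# Panel vs. sensor pixel counts; anything above 1.0x must be downscaled.
panels  = {"4K": 3840 * 2160, "8K": 7680 * 4320}   # ~8.3 MP and ~33.2 MP
sensors = {"48 MP phone main camera": 48_000_000,  # illustrative examples
           "61 MP full-frame camera": 61_000_000}

for p_name, p_px in panels.items():
    for s_name, s_px in sensors.items():
        print(f"{s_name} on {p_name}: {s_px / p_px:.1f}x the panel's pixels")
# 4K: 5.8x and 7.4x; 8K: 1.4x and 1.8x - downscaling is unavoidable either way
```

Even an 8K panel still has to throw away pixels from a 48 MP image, so the downscaling artifacts don't disappear until panel resolution catches up to the sensors.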

Plus, GPUs are used for many other things than high resolution gaming graphics. Crypto, blockchain, AI, rendering, etc.

Edited: added productivity software as suggested by igor_kavinski
 