Question Zen 6 Speculation Thread


moinmoin

Diamond Member
Jun 1, 2017
5,193
8,328
136
Personally I think AMD's biggest problem is not each gen being a big enough improvement, but rather their cadence slowing down. For years we were talking about them sticking to a cadence of around 16 months on average. Now it's becoming more and more clear that it's significantly longer than that, and that the change wasn't due to some slip-ups. This increases the pressure on each single gen's improvements.

With Zen 5 the talk was that the full core was intended for 3nm and the eventual 4nm version only got a compromised adaptation. As far as I can see we never got any further info, like indications of how Zen 5c on 3nm fares compared to Zen 5 on 4nm etc.

Ideally, with one gen falling back, the next gen, still executing as planned, would more than make up for it. Zen 6 going 2nm (a "double shrink") may help.

But I feel we haven't really seen that play out with other AMD iterations, like the lackluster RDNA3 and the correction RDNA4. Is the latter really more than what RDNA3 originally promised plus some significant improvements on top?

And from the patent trail from last quarter you'd imagine AMD is more of a packaging specialist than a CPU designer. Not good. Typically the patent trail is a good indicator of active research into novel ideas, but not for the last two quarters unfortunately. At least not many applications in core design.
With RTG always having been big on packaging, packaging being a crucial part of AMD CPUs since Zen 1 Epyc/Threadripper and Zen 2 Ryzen, and Xilinx (the other major packaging specialist in the industry) having since merged with AMD, I would say yes, AMD is definitely a packaging specialist.

Though I'd like to ask you: did you keep track of who is credited in those patents, i.e. which group or department within AMD these patents can be credited to?
 
Reactions: Joe NYC

yuri69

Senior member
Jul 16, 2013
623
1,080
136
In recent videos, MLID says he expects Zen 6 in H1 2026.

A TSMC leak showed Zen 6 to be one of the early products for N2 risk production. The first of those wafers start in late 2025, which does not guarantee that Zen 6 will be on the very first ones, but it certainly makes mid-2026 feasible.
Yea, that N2 thing... AMD invested in splitting the core design between N4 and N3 just because N3 was pricey. Now with Zen 6 they don't mind and go full throttle into risk production?
Personally I think AMD's biggest problem is not each gen being a big enough improvement, but rather their cadence slowing down. For years we were talking about them sticking to a cadence of around 16 months on average. Now it's becoming more and more clear that it's significantly longer than that, and that the change wasn't due to some slip-ups. This increases the pressure on each single gen's improvements.

With Zen 5 the talk was that the full core was intended for 3nm and the eventual 4nm version only got a compromised adaptation. As far as I can see we never got any further info, like indications of how Zen 5c on 3nm fares compared to Zen 5 on 4nm etc.

Ideally, with one gen falling back, the next gen, still executing as planned, would more than make up for it. Zen 6 going 2nm (a "double shrink") may help.
Yup, the slowdown is very bad. For Zen 4 there was the "COVID did it" excuse. With Zen 5 there was the "redesign to N4 did it" excuse.

In 2023 Zen 6 was still labeled as a 10-15+% IPC, 3nm/2nm-derived design. Did they really redesign it *all* for 2nm? I mean all the 8-12c/16c/32c CCXes?
 

Joe NYC

Platinum Member
Jun 26, 2021
2,906
4,270
106
Yea, that N2 thing... AMD invested in splitting the core design between N4 and N3 just because N3 was pricey. Now with Zen 6 they don't mind and go full throttle into risk production?

N3B had problems AMD was fully aware of (high cost, yield, capacity).

It is very strange logic to assume that Zen 6 will also be mired in the same contingencies and backporting delays as Zen 5, when N2 does not have the same problems as N3B.
 

OneEng2

Senior member
Sep 19, 2022
451
669
106
What, where? In what natively compiled real-world application does x86 dominate?

I write software myself. And while I haven't written much C lately, be it Rust, Zig, Go, Java, or Node.js, every time I've compared Mac to x86 the real-world performance of M processors tends to mirror the canned benches or even be slightly better (in part probably due to more memory bandwidth being available).
... as do I.

Writing "Apps" for phones is one thing. They are simple things, and many a high school student can do it today. The VAST majority of phone apps written in these "programming languages" (I tend to believe that if it has ARC and a garbage collector, it isn't a real language. If it doesn't use objects, it DEFINITELY isn't a real language) are single threaded and very simple in architecture.

For business applications, PC and x86 DOMINATE.

For DC, x86 dominates.

For workstation, x86 dominates.

If I am incorrect here, please provide some information to enlighten me.
So in your world most of the development is done on Windows and the runtime is also MS Windows. Okay, there surely are niches like that, but hey. The vast majority of development has been centered around web technologies, which are not really friendly to Windows.
In my world, most computers sold (not phones or tablets) are x86 and most software sold is x86.
Yea, that N2 thing... AMD invested in splitting the core design between N4 and N3 just because the N3 was pricey. Now with Zen 6 they don't mind and go full throttle into a risk-production?
Doesn't seem at all likely to me. I am fairly sure the DC "D" parts will be N2. Other than that, I just don't see it.
I still have serious doubts about that. At this point I assume it's yet another thing the so-called-Leakers get wrong, again.
Bingo!
It's what it is.
N2 is cheap and good and it's the thing everyone will be using.
N2 is the most expensive process node on the planet (outside of some specialized military grade stuff). How is it "cheap"?
D CCD was always an N2 stack.
What changed is they moved everything else onto N2, besides the poverty parts.
Which will be N3p. Or N3c.
Do you have a link for this? It is very difficult to believe.

AMD has been very careful to utilize a cost-effective node for their high-volume products. Putting all their eggs in the N2 basket would be risky AND expensive.

As of today, AMD's N4P Zen 5 competes very well against Intel's Arrow Lake on N3B. If I were at AMD, I would be designing my next high-volume part on N3P, which would be more than enough to further expand AMD's lead over Intel's Arrow Lake and likely match Intel's Nova Lake (or near enough).

[begin speculation] I believe that desktop is about to become more about affordable computing than absolute performance. Laptop is about to become more about battery life than absolute performance. DC and Workstation are going to continue to be all about the most cores per socket and the bandwidth needed to feed the beasts.

DC is the fastest growing market with the highest margins. AMD is much more likely to design a new architectural platform for DC and then trickle down what fits to Desktop/Laptop/Consumer.

Only DC will be using the most advanced node since DC will be all about "all out performance per socket".[/End Speculation]
 

Doug S

Diamond Member
Feb 8, 2020
3,059
5,290
136
Apple needs to be careful. Yes, they have a golden goose that lays golden eggs, the iPhone, but these days a smartphone is pretty much a smartphone. Like their computer ecosystem, Apple has "East Berlin'ed", but even that could start to fail as people realize you can get a competing Android phone for half the price or less. It's not like it was 10 years ago when iPhones were demonstrably better than Android. All it took then was a quick stutterless fast scroll of a busy webpage to show iPhone superiority. It's all the same today. Even my kids, who have top-of-the-line iPhones, notice my Pixel 7 is "just like" their iPhone when they need to use it. Also, I think people are weary of the "must upgrade every year" thing.

What, you really think Apple has to be worried about Androids half the price that are pretty competitive? That's been the case for over a decade now. Why is that something they should start worrying about in 2025 when it wasn't hitting them in 2015 or 2020?

And no one upgrades every year except content creators. Apple would sell way more than a quarter billion phones a year if even a quarter of their userbase was doing annual upgrades!

If you want to look for risks to Apple you need to look at geopolitical/legal stuff. Trade wars getting worse could deliver a big blow to Apple's overseas sales as people angry at the United States choose to avoid the one major American smartphone brand. DOJ actions could take away the billions they get paid by Google to set a default that is easy to change. And that's not all of it. Of course every company everywhere, both US and foreign, is potentially caught up, as history has shown trade/tariff wars often go worldwide, because tariff barriers erected by country A make country B seek out new export markets, which causes country C to protect their domestic producers.

A week ago I took my stock accounts to 85% cash; I made huge trims not just in my holdings of Apple, but in every major stock position I hold other than Berkshire. It had nothing to do with the capabilities of midrange Androids, or my belief in Apple as a company. We're in for a rough ride with what I saw one person describe as "history's first shit-posting caused recession". A worldwide depression isn't out of the realm of possibility if the tariff cancer spreads around the globe like it did in the early 1930s. Even if you think I'm being hyperbolic, the stock market was, until its recent decrease, at near historic multiples, and even with the recent declines is still very elevated. So a major correction is/was overdue even if Trump can somehow be talked off the ledge before jumping and pulling us all down with him. So going to cash isn't a bad call right now, even for those who think talk of recession is TDS, because there are plenty of reasons to believe you'll have opportunities to buy back in at much lower multiples in the future.
 

adroc_thurston

Diamond Member
Jul 2, 2023
5,225
7,286
96
N2 is the most expensive process node on the planet (outside of some specialized military grade stuff). How is it "cheap"?
It's cheap since the cost bump is low single-digit % and you get 10-15% more freebie speed.
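Taking that claim at face value, the "cheap" argument is really about perf per dollar. A minimal sketch, assuming illustrative midpoints of the figures in the post (a 3% cost bump and 12% speed gain; these are not real TSMC pricing):

```python
# Illustrative perf-per-cost comparison of N2 vs. a baseline node.
# The inputs are assumed midpoints of the poster's claim, not real data.
cost_bump = 0.03    # "low single-digit %" wafer-cost increase
speed_gain = 0.12   # midpoint of "10-15% more freebie speed"

perf_per_cost = (1 + speed_gain) / (1 + cost_bump)
print(f"perf/$ change: {perf_per_cost - 1:+.1%}")  # → +8.7%
```

With those assumed numbers, performance per dollar improves by just under 9%, which is the sense in which a nominally pricier node can be called "cheap".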
AMD has been very careful to utilize a cost-effective node for their high-volume products. Putting all their eggs in the N2 basket would be risky AND expensive.
they just kept client inna ghetto, but that's not gonna last forever.
As of today, AMD's N4P Zen 5 competes very well against Intel's Arrow Lake on N3B. If I were at AMD, I would be designing my next high-volume part on N3P, which would be more than enough to further expand AMD's lead over Intel's Arrow Lake and likely match Intel's Nova Lake (or near enough).
there are more competitors than just Intel.
that CCD will go a loooooooooooong way.
I believe that desktop is about to become more about affordable computing than absolute performance.
GPU ASPs are crawling up and up. Wtf are you on.
 
Reactions: madtronik

GTracing

Senior member
Aug 6, 2021
417
960
106
(I tend to believe that if it has ARC and a garbage collector, it isn't a real language. If it doesn't use objects, it DEFINITELY isn't a real language)
That's certainly a take. Doesn't that cross off basically every popular language besides C++?

I guess Objective-C also fits those criteria.
 

Joe NYC

Platinum Member
Jun 26, 2021
2,906
4,270
106
N2 is the most expensive process node on the planet (outside of some specialized military grade stuff). How is it "cheap"?

If you divide the segments into Premium:
- Medusa Halo
- Medusa Point
- Medusa Ridge

Mainstream:
- Kraken successor

Low end:
- whatever that will be

Then you will see the beauty of AMD's strategy: dominate the premium segments using the appropriate node for them, and address the lower segments with less expensive nodes and monolithic designs.
 
Reactions: OneEng2

Nothingness

Diamond Member
Jul 3, 2013
3,239
2,293
136
Writing "Apps" for phones is one thing. They are simple things, and many a high school student can do it today. The VAST majority of phone apps written in these "programming languages" (I tend to believe that if it has ARC and a garbage collector, it isn't a real language. If it doesn't use objects, it DEFINITELY isn't a real language) are single threaded and very simple in architecture.

For business applications, PC and x86 DOMINATE.

For DC, x86 dominates.

For workstation, x86 dominates.

If I am incorrect here, please provide some information to enlighten me.
A single data point is no proof, but here you go: a lot of EDA software is now available on AArch64 Linux, and large clusters of such machines exist and are used to design successful processors.

As far as development goes, many people are happily using Mac machines every day for workstation-like jobs. I made the switch some months ago for CPU simulation and I won't look back. My x86 workstation is left almost unused.

That doesn't mean x86 is going away, it certainly isn't. But thinking Mac or Arm machines are being used only to develop phone apps is utterly wrong.
 

yuri69

Senior member
Jul 16, 2013
623
1,080
136
That doesn't mean x86 is going away, it certainly isn't. But thinking Mac or Arm machines are being used only to develop phone apps is utterly wrong.
An Apple computer can only do stuff for Apple phones. Brilliant.

You can't just open a notepad and save your text file as index.html. Nor can you install just any language runtime, such as Python or a JVM. Also, developing Rust on a Mac is, apparently, impossible.

Btw the statement "business applications - x86 dominates" is hilarious. The majority of business apps are cloud-based. The largest cloud vendor, AWS, has been deploying 50:50 ARM to x86 for the past two years. Is this the domination?
 

itsmydamnation

Diamond Member
Feb 6, 2011
3,016
3,768
136
Btw the statement "business applications - x86 dominates" is hilarious. The majority of business apps are cloud-based. The largest cloud vendor, AWS, has been deploying 50:50 ARM to x86 for the past two years. Is this the domination?
You're conflating two very separate things and trying to pass them off as the same.

And even if we accept your false notion: first, AWS has about ~35% cloud market share, and cloud is about 45% vs 55% on-prem, so that's 0.45 * 0.35 * 0.5 ≈ 8%. Sure, there are other Arm products in other clouds, but that's not massively changing the magnitudes here.
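The back-of-envelope estimate above can be checked in a couple of lines (the 45%/35%/50% splits are the post's own rough figures, not hard data):

```python
# Rough share of business-app compute that is Arm-on-AWS,
# using the post's approximate figures.
cloud_share = 0.45  # fraction of workloads in cloud (vs. on-prem)
aws_share = 0.35    # AWS share of the cloud market
arm_share = 0.50    # Arm fraction of recent AWS deployments

arm_aws_share = cloud_share * aws_share * arm_share
print(f"{arm_aws_share:.1%}")  # → 7.9%, i.e. roughly the 8% quoted
```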

Cloud is kind of a funny place right now: lots of growth, but at the same time lots of businesses pulling workloads back from the cloud, because cloud is very expensive. My hot take is that cloud is more about bad capital management, finance systems, and planning than it is about IT infrastructure; for companies that are good at those three things, cloud generally has limited value (outside of super elastic workloads).
 

yuri69

Senior member
Jul 16, 2013
623
1,080
136
You're conflating two very separate things and trying to pass them off as the same.

And even if we accept your false notion: first, AWS has about ~35% cloud market share, and cloud is about 45% vs 55% on-prem, so that's 0.45 * 0.35 * 0.5 ≈ 8%. Sure, there are other Arm products in other clouds, but that's not massively changing the magnitudes here.

Cloud is kind of a funny place right now: lots of growth, but at the same time lots of businesses pulling workloads back from the cloud, because cloud is very expensive. My hot take is that cloud is more about bad capital management, finance systems, and planning than it is about IT infrastructure; for companies that are good at those three things, cloud generally has limited value (outside of super elastic workloads).
I like those 8%.

Cloud has always been expensive. Running a workload in geographically separate places has always been expensive. Achieving those 9s of availability is also expensive. Having all those compliance checkboxes ticked is also expensive. Having a separate team to keep a clustered service up is also expensive.

If your business doesn't require those expensive solutions then you are good. If it does, then you need to do the math and decide.
 
Reactions: OneEng2

OneEng2

Senior member
Sep 19, 2022
451
669
106
That's certainly a take. Doesn't that cross off basically every popular language besides c++?

I guess objective c also fits that criterea.
Fair assessment.

I am still agitated from the FIRST time (about 15 years ago) when I was forced (by a customer) to create a Java-based runtime engine for something mission critical because (customer's quote, not mine) "You won't have to worry about memory management". Right. We ended up handling major portions of the kernel in a C++ library to get around all the problems in Java.

Still, if I take your point, there are plenty of good places to utilize these languages... perhaps even the lion's share of places, where the utility and ease of use would outweigh the efficiency and control afforded by C++.
A single data point is no proof, but here you go: a lot of EDA software is now available on AArch64 Linux, and large clusters of such machines exist and are used to design successful processors.

As far as development goes, many people are happily using Mac machines every day for workstation-like jobs. I made the switch some months ago for CPU simulation and I won't look back. My x86 workstation is left almost unused.

That doesn't mean x86 is going away, it certainly isn't. But thinking Mac or Arm machines are being used only to develop phone apps is utterly wrong.
Last time I checked the stats, x86 was ~85% of the business market. Not saying Mac doesn't exist, just that it is the minority.

The last good Phoronix review that included ARM chips showed they are still far behind x86. Sure, they appear to be gaining share, but then again, starting from ~0 the percentage growth looks rapid.

Yes, I do agree that they are popular in some REALLY big companies, since those can make a business case for an internally developed proprietary solution. The rest of the world... not so much.
 
Reactions: GTracing

MS_AT

Senior member
Jul 15, 2024
525
1,107
96
You can't just open a notepad and save your text file as index.html. Nor can you install just any language runtime, such as Python or a JVM. Also, developing Rust on a Mac is, apparently, impossible.
But thanks to Microsoft and Qualcomm, who developed Windows on Arm, I should be able to run WoA in a VM on a Mac and do all these things? I mean, I am honestly asking if this is viable?
 

poke01

Diamond Member
Mar 8, 2022
3,301
4,546
106
But thanks to Microsoft and Qualcomm that developed Windows on Arm, I should be able to run WoA in a VM on Mac and be able to do all these things? I mean I am honestly asking if this is viable?
I think that was sarcasm. You can absolutely do those things on a Mac.
 
Reactions: MS_AT