Question Zen 6 Speculation Thread


fastandfurious6

Senior member
Jun 1, 2024
480
627
96
Computing requirements will quickly fall over time. Better AI will train better AI, better algorithms, better techniques, and so on; it's a self-feeding loop. Local LLMs will be GPT-4 equivalent, while at the same time online models, i.e. GPT-5, will be far more powerful, and so on.

AI companies will still run at a deficit on surplus funding, etc., because of capitalist accelerationism.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,791
1,512
136
Compute requirements will go down, perhaps very quickly, but only when controlling for output (amount and robustness) requirements. Output requirements on the other hand are skyrocketing, and the ceiling for them probably doesn't exist.
 
Reactions: coercitiv

fastandfurious6

Senior member
Jun 1, 2024
480
627
96
They've literally only ever gone up.

No, DeepSeek has GPT-4 performance at way lower requirements,
and with quants you can run a 7B model on most machines, even with just a mobile CPU, i.e. Phoenix/Hawk Point at up to ~10 tokens/sec, with a MacBook M3+ being a lot better.
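
Back-of-envelope for that ~10 tokens/sec figure: decode on a local LLM is mostly memory-bandwidth bound, so tokens/sec is roughly memory bandwidth divided by the bytes streamed per token. A minimal sketch below; the bandwidth, quant size, and efficiency numbers are assumptions for illustration, not measurements.

```python
# Rough sketch: LLM decode is mostly memory-bandwidth bound, so
# tokens/sec ~= effective memory bandwidth / bytes streamed per token (~model size).
# Bandwidth, quant size, and efficiency below are assumptions, not measurements.

def est_tokens_per_sec(params_billion: float, bits_per_weight: float,
                       mem_bw_gb_s: float, efficiency: float = 0.5) -> float:
    """Estimate decode tokens/sec for a quantized dense model."""
    model_bytes = params_billion * 1e9 * bits_per_weight / 8  # weights read once per token
    return efficiency * mem_bw_gb_s * 1e9 / model_bytes

# 7B model, ~4.5 bits/weight quant, dual-channel LPDDR5 (~100 GB/s peak assumed)
print(round(est_tokens_per_sec(7, 4.5, 100), 1))  # ~12.7, same ballpark as ~10 tok/s
```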

Of course, GPT-5 will have way higher requirements.

Two ways this goes:
1) existing models become more efficient/less hungry while hardware becomes faster, and
2) new models will of course be hungrier.
 
Last edited:

Hulk

Diamond Member
Oct 9, 1999
5,074
3,564
136
They've literally only ever gone up
When more compute is available, developers use it, either to add features or to skip optimizing the code and just let the processor do the heavy lifting.

Now and then some outlier applications do become more computationally efficient. PureRaw is one that comes to mind, but I'm having a hard time thinking of another!

If compute "topped out" for one reason or another, then we'd see developers really get to work figuring out ways to increase both features and speed while decreasing or holding computational overhead constant. A great example of that would be the old Atari 2600 game console. The unit was so ubiquitous that even when it was far outclassed by newer hardware, developers were racking their brains to create games that were somewhat competitive with the newer hardware, just because the user base, and therefore the possible sales base, was so large. It was worth the effort to work out all sorts of clever tricks to max out the hardware.

Due to the limits of the human eye, I think we are seeing a similar effect with TVs and displays. The move from SD (interlaced, no less!) to 720p or 1080p was a massive increase in visual fidelity. While the move from 1080p to 4K is technically just as large an increase in resolution (4x), the visual results are much less pronounced for basically two reasons. First, depending on viewing distance and screen size, we are reaching the limits of human vision, especially in the real world where many people aren't even corrected to 20/20. Second, it takes really good cameras, lighting, lenses, operators, and post production to actually make each one of those 4K pixels count.

I have my doubts that 8K will ever become a mainstream consumer format. A professional format? Absolutely: working at high resolution provides greater latitude and more room for cropping while preserving quality when going to 4K in post.

I think where we are going to see the big push for higher resolution is with monitors and gaming. What happens if desktop resolution stalls at 4K? Eventually GPUs capable of gaming at 4K "slide down the stack," possibly to the point where even the lower-end cards are capable of decent 4K. Sure, developers will find ways to waste compute cycles on effects that are barely or not at all visible, at the request of GPU manufacturers, but consumers are pretty smart and will see the lack of difference with their own eyes, turn those effects off, and happily game away.

Anyway, my point is that the push for more compute is complicated, and driven by market forces as much as or more than by consumer demand.
 
Reactions: Tlh97

Win2012R2

Senior member
Dec 5, 2024
822
811
96
While the move from 1080p to 4K is technically just as large an increase in resolution (4x), the visual results are much less pronounced for basically two reasons
It's a huge improvement on monitors, as people sit much closer to them than to TVs, but even on TVs it's a very nice improvement.

Now that we have very good upscalers, 8K might finally become viable too, in a few years at least.

Yeah, 8K is a pipe dream with little benefit under current architectures. Way too hungry.
Render in 4K and upscale, done.
 

itsmydamnation

Diamond Member
Feb 6, 2011
3,023
3,785
136
Everyone's ignoring the copyright issues. At some point that's going to have to get answered, and with the USA doing USA things, that might be a very different answer jurisdiction by jurisdiction.

So far only the shovel makers are coming out ahead in this arms race. What happens to LLM spend in a Trump-induced recession? Are you even in a recession if Trump fires everyone who reports consumption?
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,060
29,167
146
Everyone's ignoring the copyright issues. At some point that's going to have to get answered, and with the USA doing USA things, that might be a very different answer jurisdiction by jurisdiction.

So far only the shovel makers are coming out ahead in this arms race. What happens to LLM spend in a Trump-induced recession? Are you even in a recession if Trump fires everyone who reports consumption?
Getting reports about this post. I agree, it is too political for the tech forums. Can't avoid talking about tariffs, but naming politicians and discussing their policies should be done in the designated forum for such topics. Any posts that cross the line moving forward will be subject to deletion.

Mod DAPUNISHER
 

Hulk

Diamond Member
Oct 9, 1999
5,074
3,564
136
It's a huge improvement on monitors, as people sit much closer to them than to TVs, but even on TVs it's a very nice improvement.

Now that we have very good upscalers, 8K might finally become viable too, in a few years at least.


Render in 4K and upscale, done.
We're getting into subjective territory with words like "huge," so I won't comment on your experience, but someone with perfect vision would need to be less than about 25" from a 32" 4K screen to even have a chance of discerning individual pixels. That's the best case: no anti-aliasing tech in effect, and someone who really has perfect vision. I have a 27" 4K display and am already swinging my head around more than I'd like, so with a 32" display I'd probably sit back farther to minimize head movement and then make things larger, kind of negating the larger monitor anyway. I *did* notice a significant increase in quality moving from a 24" 1920x1200 display to a 27" 4K. I would say "huge," actually. I don't know if I'd use the same adjective for a move to 8K, but I haven't done it and it seems you have, so you might be right.
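
For anyone who wants to check that ~25" figure, it drops out of the usual assumption that 20/20 vision resolves about 1 arcminute per pixel. A quick sketch (the 1-arcminute rule of thumb and 16:9 geometry are the only assumptions):

```python
import math

# Distance beyond which a display's pixels fall below ~1 arcminute,
# the usual rule-of-thumb limit for 20/20 vision (an assumption, not a hard law).
def max_pixel_visible_distance_in(diag_in: float, h_px: int, v_px: int,
                                  acuity_arcmin: float = 1.0) -> float:
    width_in = diag_in * h_px / math.hypot(h_px, v_px)  # panel width from the diagonal
    pixel_pitch_in = width_in / h_px                     # inches per pixel
    theta = math.radians(acuity_arcmin / 60)             # 1 arcminute in radians
    return pixel_pitch_in / math.tan(theta)

print(round(max_pixel_visible_distance_in(32, 3840, 2160), 1))  # ~25.0" for a 32" 4K panel
print(round(max_pixel_visible_distance_in(27, 3840, 2160), 1))  # ~21.1" for a 27" 4K panel
```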

So, in good faith, I will admit that your point is well taken. At typical TV viewing distances, anything beyond 4K is a tough sell; jeez, even a lot of movie theaters are still 1080p because theaters don't want to invest in new display tech, but no one seems to care, and honestly the movies look great because you are sitting so far back. Contrast is probably more of an issue than resolution anyway.

Computer displays are a different beast; you can get right up close and perhaps fully exploit 4K, so I will give you that! While the math might show "no difference," really, really high DPI seems to make the display look more like actual paper.
 
Reactions: Tlh97 and Win2012R2

Win2012R2

Senior member
Dec 5, 2024
822
811
96
At typical TV viewing distances, anything beyond 4K is a tough sell; jeez, even a lot of movie theaters are still 1080p because theaters don't want to invest in new display tech
The problem with TVs is the lack of good 4K source material - even UHD discs have a pretty poor bitrate (~100 Mbit/s); it's better than Blu-ray, but still lacking. Even console gaming on a TV typically requires upscaling with crappy upscalers, though for the console GUI 4K is much nicer in my view.

Once you try proper 4K you never look back. The main problem was very expensive GPUs on the PC side, but now cards like the 9070 XT are very impressive even at 4K.
 

Hulk

Diamond Member
Oct 9, 1999
5,074
3,564
136
The problem with TVs is the lack of good 4K source material - even UHD discs have a pretty poor bitrate (~100 Mbit/s); it's better than Blu-ray, but still lacking. Even console gaming on a TV typically requires upscaling with crappy upscalers, though for the console GUI 4K is much nicer in my view.

Once you try proper 4K you never look back. The main problem was very expensive GPUs on the PC side, but now cards like the 9070 XT are very impressive even at 4K.
I would put forth that the quality of the original uncompressed stream is as much of an issue as the compression, or more. As I wrote above, you need really solid equipment, lighting, and good camera people/directors to extract full quality from 4K recording. Compression really only becomes an issue with high motion, water, waves, and other temporally "active" scenes that are difficult to encode.

Remember the sci-fi series Farscape? It was SD but they were able to make every single pixel count and even in SD it looks really good. You could easily mistake it for 720p. It is a good example of how important the original source is.

I was pretty heavily into digital video at the dawn of HD about 20 or so years ago and did some writing. Anyway, as I was interviewing various manufacturers, the issue wasn't the capture hardware but the price of the lenses, which made it tough to put even real 1080p into the hands of consumers. Most 4K video cameras even today will struggle to capture that resolution without superior optics, although mirrorless cameras and AI are putting better glass into our hands for less and less money, especially since with mirrorless, optical distortion can be corrected digitally because you don't need to see the captured image in the eyepiece.

Regardless of where we are going, technology has pretty much (finally) caught up with human vision, and high-quality video can be created with a smartphone if the user knows what they are doing!
 

Kronos1996

Member
Dec 28, 2022
61
99
61
I wonder how many IOD and CCD variants are planned then.
SF 4nm IOD seems likely. But for most parts?
CCD? Probably two.

- Zen 6
- Zen 6C

IOD? Probably four.

- Desktop
- Server
- Medusa (Laptop)
- Medusa Halo

There are rumors of two desktop IODs: a fancy new 3nm one with the Neural Engine and maybe more CUs, plus a cheap, stripped-down 6/4nm one for low-end products. Not sure, we'll see.
 

OneEng2

Senior member
Sep 19, 2022
476
700
106
Computing requirements will quickly fall over time. Better AI will train better AI, better algorithms, better techniques, and so on; it's a self-feeding loop. Local LLMs will be GPT-4 equivalent, while at the same time online models, i.e. GPT-5, will be far more powerful, and so on.

AI companies will still run at a deficit on surplus funding, etc., because of capitalist accelerationism.
Thank you for saving me the typing.
When more compute is available, developers use it, either to add features or to skip optimizing the code and just let the processor do the heavy lifting.
Again, thank you for saving me the typing.

The AI and AI-processing ship has sailed IMO. Hardware will keep becoming more capable even as LLM efficiency skyrockets, because the demand for AI processing will exceed both the hardware advances AND the software efficiencies found.
 

Joe NYC

Platinum Member
Jun 26, 2021
2,967
4,340
106
CCD? Probably two.

- Zen 6
- Zen 6C

IOD? Probably four.

- Desktop
- Server
- Medusa (Laptop)
- Medusa Halo

There are rumors of two desktop IODs: a fancy new 3nm one with the Neural Engine and maybe more CUs, plus a cheap, stripped-down 6/4nm one for low-end products. Not sure, we'll see.

MLID was showing (in his Venice leak) an IOD die that is itself a "chiplet". It supports connections to 2 CCDs, 4 DDR5 memory channels, and ~32 PCIe Gen 6 lanes.

(I think in a later revision he said that 2 of these IODs could be merged into one bigger chiplet)

The big socket (Venice) would get 4 of these IODs (or 2 big ones); the Siena successor would have 2 (or one big one).
 

adroc_thurston

Diamond Member
Jul 2, 2023
5,455
7,633
96
MLID was showing (in his Venice leak) an IOD die that is itself a "chiplet". It supports connections to 2 CCDs, 4 DDR5 memory channels, and ~32 PCIe Gen 6 lanes.

(I think in a later revision he said that 2 of these IODs could be merged into one chiplet)

The big socket (Venice) would get 4 of these IODs (or 2 big ones); the Siena successor would have 2 (or one big one).
all of that is wrong.
 

Doug S

Diamond Member
Feb 8, 2020
3,104
5,348
136
Yeah, 8K is a pipe dream with little benefit under current architectures. Way too hungry.

2.5K/4K at 120 Hz will become the gold standard, just like 720p/1080p has been for a long time

I agree you do hit a wall, but that's only true when human processing limitations are in the loop. There's a reason audio sampling didn't go from 44 kHz in the '80s to 88 kHz and onward to the tens of megahertz it could be today. The same applies to video resolution: even though we possess the technical means to go from 4K to 8K to 16K and so forth, that won't happen, for the same reason we didn't keep stepping audio sampling up to ever higher rates.

But everywhere that humans are outside the processing loop and only consume a final product, we've only ever increased processing demands as processing resources have become cheaper. I could run the best optimizing compiler from the '90s on my watch and it would run faster than on anything you could buy in the '90s, but no one wants to run a '90s compiler even if they could get its output in 1/100th of the time. Instead, compilers keep demanding more resources as CPUs get faster and RAM gets bigger. Likewise, we throw more and more resources into compression algorithms as the years go by. I could come up with a hundred examples, but you get the idea. Because we're only consuming a final product, doubling the resources may yield only a low-single-digit improvement in the final product, but we've deemed that to be worth it.

The reason we are willing to throw hundreds of billions at building AI infrastructure is that there are enough people willing to bet money that it will be worth it in the end (or that their specific investment will win out over others that will fail). We aren't spending hundreds of billions because we need some fixed amount of processing 'x', such that when we can buy amount 'x' for half as much money, all our investment gets cut in half. We're spending hundreds of billions to get the most processing capability we can for the money being spent, and as long as people are willing to shovel money into AI, they'll keep spending, whether that buys only 2% more processing capability than it did last year or 200% more.
 

Joe NYC

Platinum Member
Jun 26, 2021
2,967
4,340
106
it wasn't right back then either.
We will see what emerges on the server side. AMD has been on the same technology since Rome in 2019, and by 2026 that will be 7 years.

Seven years was plenty of time to plan the new architecture; it will be interesting to see it unveiled.

If the original idea (of 4 IODs, each forming a unit with 2 CCDs) were implemented, it would be possible to do it using the same technology as Strix Halo; the carrier wafer would not be much bigger than the one for Strix Halo.

And then it showed some bridges connecting the IODs; those would have to use a different technology, maybe hybrid-bonded silicon bridges.

(this is the picture from the old video)
 