Question Speculation: RDNA2 + CDNA Architectures thread

Page 92

uzzi38

Platinum Member
Oct 16, 2019
All die sizes are within 5mm^2. The poster here has been right about some things in the past AFAIK, and to his credit was the first to say 505mm^2 for Navi21, which other people have since backed up. Even so, take the following with a pinch of salt.

Navi21 - 505mm^2

Navi22 - 340mm^2

Navi23 - 240mm^2

Source is the following post: https://www.ptt.cc/bbs/PC_Shopping/M.1588075782.A.C1E.html
 

Guru

Senior member
May 5, 2017
We know from the PS5 specs that the new RDNA2 GPUs can pretty much clock to at least 2.23GHz, and we know from the PS5 tech presentation that they can possibly reach even higher, but with diminishing returns!

So a full-blown 80CU desktop product, or whatever it ends up being, is likely going to add at least 100MHz to that 2.23GHz, which would put it around 2.33GHz. I think with overclocking you can probably push the GPU a good 150MHz more before it starts crashing and becoming unstable.

So I do not expect a 2500MHz core clock for any of their GPUs out of the box, but I think the RX 6800 or RX 6700 is likely to have a 2300MHz boost clock. Again, these GPUs will probably have a base clock of, say, 1.9GHz, a "game" clock of 2.1GHz, and a boost clock up to 2.3GHz.
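The clock-headroom arithmetic above can be sketched as follows; the desktop gain and overclocking margin are the poster's guesses, not confirmed figures:

```python
# Speculative desktop RDNA2 clock headroom (MHz).
ps5_boost = 2230       # known from the official PS5 specs
desktop_gain = 100     # assumed extra headroom for a desktop part
oc_margin = 150        # assumed manual-overclock margin before instability

stock_boost = ps5_boost + desktop_gain   # 2330
oc_ceiling = stock_boost + oc_margin     # 2480, still short of 2500

print(stock_boost, oc_ceiling)   # 2330 2480
```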
 

ModEl4

Member
Oct 14, 2019
The +40% frequency gain achieved with Pascal relative to Maxwell 2 was also due to the 16nm vs 28nm process. I think a reasonable limit for what AMD can achieve on the same process is +25% (what Nvidia achieved from Kepler to Maxwell 2). Then on the N7 process it would theoretically reach 2115MHz + 25% ≈ 2645MHz (the highest on-air overclock achieved with a Sapphire RX 5700 XT Special Edition, plus 25%); if it is on N7+ EUV, then add another +10% and voilà, we have ≈2910MHz🤪.

On the other hand, if true, the Xbox Series X's 1825MHz would be too low, I think, in that tower design, unless TSMC, in order to achieve the greater density that N7E enjoys, shed 7-10% performance relative to N7. Of course, if the Xbox Series X is indeed 1825MHz (which is the official spec), the GTri/s and GPix/s figures MS had in its Hot Chips slides suggest at least 1906MHz; maybe they made a mistake. Anyway, we will see!

The way I see it, the most optimistic scenario for AMD is 3090 performance at $899 competing with a $999 3090 12GB version😋, but Nvidia offers more features (unless AMD surprises us; DLSS, for example, which for me is a superb feature, AMD will probably have nothing equivalent to, maybe a PS4 Pro-style checkerboarding feature given the 128 RBEs and 256-bit bus...).
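The projection above is just compounded percentages. A minimal sketch of the arithmetic (the +25% same-process gain and the +10% EUV uplift are the post's assumptions, not confirmed figures):

```python
# Speculative frequency projection (MHz), per the post's assumptions.
base = 2115                    # highest on-air OC of a Sapphire RX 5700 XT SE

same_node = base * 1.25        # assumed +25% architectural gain on N7
with_euv = same_node * 1.10    # assumed further +10% from N7+ EUV

print(round(same_node), round(with_euv))   # 2644 2908 (the post rounds to 2645 / 2910)
```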
 

Glo.

Diamond Member
Apr 25, 2015
It's funny.

I looked at this thread's previous pages, and just three days ago we were speculating whether AMD could even touch 2.3 GHz in their designs, and at what power draw. That the PS5 was going to drain the power lines within 15 miles to maintain its clock speeds.

Today we are discussing: "Where is the ceiling?"
 

Konan

Senior member
Jul 28, 2017
It's funny.

I looked at this thread's previous pages, and just three days ago we were speculating whether AMD could even touch 2.3 GHz in their designs, and at what power draw. That the PS5 was going to drain the power lines within 15 miles to maintain its clock speeds.

Today we are discussing: "Where is the ceiling?"
Hype train is going full pelt, mate. I feel folks should temper expectations a little.
I always had the impression (and it was the majority impression over the last month) that AMD's RDNA2 dGPU cards can boost high, around 2.3GHz. Nothing about permanently sustained speeds.
 

Glo.

Diamond Member
Apr 25, 2015
Hype train is going full pelt, mate. I feel folks should temper expectations a little.
I always had the impression (and it was the majority impression over the last month) that AMD's RDNA2 dGPU cards can boost high, around 2.3GHz. Nothing about permanently sustained speeds.
The thing is, is there even a hype train, since we know the clock speeds of those GPUs, or rather a rough specification that will be within 5% of the end products?

I think the general consensus, based on those leaked clock speeds, can be summed up in one sentence:

"General disbelief".
 

Konan

Senior member
Jul 28, 2017
The thing is, is there even a hype train, since we know the clock speeds of those GPUs, or rather a rough specification that will be within 5% of the end products?

I think the general consensus, based on those leaked clock speeds, can be summed up in one sentence:

"General disbelief".

The question is whether it is going to be within 5% of the low end or the high end of those clock ranges, because between the low end and the high end in the table there's a massive variance. Like 30%...

Clock speeds aside, we still don't really know the full IPC gain, or even how things will perform with RT turned on. I have some scepticism about the BVH approach.
 

Glo.

Diamond Member
Apr 25, 2015
The question is whether it is going to be within 5% of the low end or the high end of those clock ranges, because between the low end and the high end in the table there's a massive variance. Like 30%...

Clock speeds aside, we still don't really know the full IPC gain, or even how things will perform with RT turned on. I have some scepticism about the BVH approach.
For me, there are only four things of interest:

Rasterization performance, price, power draw, and the quality, feature set, and stability of the Linux open-source drivers.

And whether I will be able to use my GPU on day one with the latest kernel and Mesa drivers (of course, I do not need to buy the GPU on day 1, and can simply wait for Mesa and the kernel to be updated...).
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
For me, there are only four things of interest:

Rasterization performance, price, power draw, and the quality, feature set, and stability of the Linux open-source drivers.

And whether I will be able to use my GPU on day one with the latest kernel and Mesa drivers (of course, I do not need to buy the GPU on day 1, and can simply wait for Mesa and the kernel to be updated...).
You forgot availability
 

Shivansps

Diamond Member
Sep 11, 2013
I did say in my reply that few users might look at the overall package. AMD needs a competitive response to DLSS and ray tracing, that's obvious. The media engine is important, but I do not agree that everyone streams. Having said that, each user looks at a certain set of parameters to make their decision, but I agree that there are folks who require a good encoder and software support for OBS and other popular streaming platforms for their streaming needs.

You know what the issue is here? Sometimes when buying AMD I feel like I'm buying a promise. When I got my RX 480 at launch, which had the encoder, coming from a GTX 750 Ti with NVENC, I felt like I was hit with a ton of bricks: it had quality issues, colour issues, I had to use third-party software for game recording, and it was not stable... It took AMD close to a year to get everything fully working, with ReLive stable and no weird issues, but hey, they marketed it as having the encoder from day 1. And even then, quality was below the 750 Ti for both streaming and recording. They had similar issues with the Navi encoder at launch; everything is working fine now.

At this point I feel like they are behind on features, and implementing them and getting them fully working may take a long time. I hope they have something more than a promise to show in the RDNA2 presentation.
 
Reactions: ozzy702

ozzy702

Golden Member
Nov 1, 2011
You know what the issue is here? Sometimes when buying AMD I feel like I'm buying a promise. When I got my RX 480 at launch, which had the encoder, coming from a GTX 750 Ti with NVENC, I felt like I was hit with a ton of bricks: it had quality issues, colour issues, I had to use third-party software for game recording, and it was not stable... It took AMD close to a year to get everything fully working, with ReLive stable and no weird issues, but hey, they marketed it as having the encoder from day 1. And even then, quality was below the 750 Ti for both streaming and recording. They had similar issues with the Navi encoder at launch; everything is working fine now.

At this point I feel like they are behind on features, and implementing them and getting them fully working may take a long time. I hope they have something more than a promise to show in the RDNA2 presentation.

Yep. The 7970 was the clear winner from a hardware perspective and aged well, but it sure did take a while to get there. AMD is typically forward-looking, but doesn't deliver polished products right out of the gate. Hopefully RDNA2 gets decent release drivers and is a fantastic product. With NVIDIA's failed execution, maybe they'll have another Ryzen moment on their hands.
 
Reactions: Tlh97 and Elfear

Panino Manino

Senior member
Jan 28, 2017
DLSS is crap. I hope to god AMD just ignores DLSS; it's a [redacted] runaway to rendering games at a lower resolution at an alleged higher one. It's pure marketing, with fine words like AI and unique Tensor cores. All I hear is a waste of real raster space. They do what they always have done, Nvidia: cut corners on IQ to win in pure FPS. It's as old as the street.

It's a shame, but you're wrong.
RTG needs to at least announce upcoming support, via an update to DirectML, for image upscaling. Because DLSS doesn't need to be perfect; Nvidia just needs to convince the public (with some help from the press) that it's good enough and that they need to use it for a magical performance improvement. Mindshare is all that's needed to keep image upscaling from becoming a "fatal" flaw of RDNA2 cards.
 

DJinPrime

Member
Sep 9, 2020
DLSS is crap. I hope to god AMD just ignores DLSS; it's a [redacted] runaway to rendering games at a lower resolution at an alleged higher one. It's pure marketing, with fine words like AI and unique Tensor cores. All I hear is a waste of real raster space. They do what they always have done, Nvidia: cut corners on IQ to win in pure FPS. It's as old as the street.
Sometimes when your view is the opposite of what just about everyone else thinks, you might want to reconsider or re-evaluate your position, because everyone else isn't dumb. Just about everyone who has used DLSS 2, or just looked at the screenshots and videos, is impressed with it. If you hate it that much, just don't use it. They're completely upfront about it being an intelligent upscaler; no one is hiding that fact. You even choose the base resolution when using it. Tensor cores are used and it requires the software to be written to use it, so it's not just a regular upscaler.
I don't know how any rational person would be against a technology that gives you two resolution steps up in performance with a minor visual-artifact trade-off, and in some cases improved clarity. Do you rage against TN/VA panels? They aren't as visually nice as IPS but offer higher refresh rates. What about having to drop from YCbCr 4:4:4 to 4:2:2 in order to get ultra-high refresh rates? It's good to have choices.
 

Tup3x

Golden Member
Dec 31, 2016
Do you rage against TN/VA panels? They aren't as visually nice as IPS but offer higher refresh rates.
They don't though. But yeah, there's no real reason to complain about DLSS. It improves speed with minimal visual quality loss and in some cases it is an improvement.
 

kurosaki

Senior member
Feb 7, 2019
They don't though. But yeah, there's no real reason to complain about DLSS. It improves speed with minimal visual quality loss and in some cases it is an improvement.


*Link to previous discussion*

*Link to previous discussion*

*Link to previous discussion*

I think the words "minimal quality loss" are extremely subjective here, as are the words "when everyone thinks it's a great feat, you have to re-evaluate".
Objectively, the IQ worsens. If some people don't care, fine, turn on DLSS. But it is still a total waste of die space. AMD has had upscaling for a long time now. Good, let those who want to run the games like that do so. It just feels like a waste of money. If people don't care whether games look like they ran on a potato, why not just buy a $50 GPU and crank the resolution down to 720p? They still don't seem to see the difference on their TN panels...
 
Last edited:

jamescox

Senior member
Nov 11, 2009
In my opinion AMD can't launch anything for more than $800-900. They must know that people will not buy an AMD gaming card for $1000. They are not Nvidia.
Almost no one buys ridiculously high-priced Nvidia cards for gaming. They are "marketing cards" as far as gaming is concerned. Nvidia seems to have crippled the 3090 for non-gaming applications, so I don't really expect them to sell many of them. The thing that convinces me that AMD has a great architecture this round is Nvidia's 8K gaming nonsense for the 3090. It makes me wonder if they are going to be competitive at 4K.

After Threadripper dominating the HEDT space for CPUs, you may have to get used to AMD being the higher-performance and higher-priced option. We don't know if they have replicated their Zen success in the GPU market yet, but Nvidia's 30-series launch doesn't seem to be going that well. It has been very interesting and seems to get more interesting every day.
 

Qwertilot

Golden Member
Nov 28, 2013
Almost no one buys ridiculously high-priced Nvidia cards for gaming. They are "marketing cards" as far as gaming is concerned. Nvidia seems to have crippled the 3090 for non-gaming applications, so I don't really expect them to sell many of them. The thing that convinces me that AMD has a great architecture this round is Nvidia's 8K gaming nonsense for the 3090. It makes me wonder if they are going to be competitive at 4K.

I know it makes very little actual sense but there's little arguing with demonstrated reality. There's a dedicated section of the market who buy these things. Don't ask me why.

The 8K angle is just a silly marketing attempt to differentiate the 3090 from the 3080. The huge 3090 die will sell very well indeed for deep-learning workstations etc., even if it has to be as a Pro card, so NV are happily covered.

After Threadripper dominating the HEDT space for CPUs, you may have to get used to AMD being the higher-performance and higher-priced option. We don't know if they have replicated their Zen success in the GPU market yet, but Nvidia's 30-series launch doesn't seem to be going that well. It has been very interesting and seems to get more interesting every day.

This is where people start getting a bit ahead of themselves. Zen resulted from a huge, multi-year investment while Intel ran into endless process trouble. Here, they've had one year's work since the 5700/XT, only mild process improvements, and they've had to fit ray tracing into the cards while they're at it; as Turing showed, that isn't free.

Versus a half-decent die shrink and a new architecture, you would, a priori, have expected them to drop back from last year's position.

Perhaps they've done well enough to hold the line, perhaps a bit better, perhaps slightly worse. Honestly that isn't critical here.

What's critical is that they do a fast, organised refresh of their entire GPU stack, ideally with competitive mobile GPUs into the bargain. That'll show that they've got, or are putting in, the resources to take it fully seriously. Frankly it's been far too long since they've been able to do this.

If they're doing that then, yes, we'll have a rather competitive market again.
 
Reactions: Cableman

Glo.

Diamond Member
Apr 25, 2015
About Navi 23 die.

If Navi 22 has 40CUs and a 192-bit bus, and is going to be used for X700 SKUs, there is no way in hell Navi 23, a 32CU/128-bit-bus GPU, will be used for anything other than X600 SKUs.

So either Navi 24 is the die that will sit below it, or... AMD will refresh Navi 10 dies as 6500 SKUs.
 

Shivansps

Diamond Member
Sep 11, 2013
It's a shame, but you're wrong.
RTG needs to at least announce upcoming support, via an update to DirectML, for image upscaling. Because DLSS doesn't need to be perfect; Nvidia just needs to convince the public (with some help from the press) that it's good enough and that they need to use it for a magical performance improvement. Mindshare is all that's needed to keep image upscaling from becoming a "fatal" flaw of RDNA2 cards.

It's not about being magical, or about Nvidia convincing the public that it needs it; it's about LOGIC. Think about it: I have a 4K monitor, and the extra resolution helps me work, but when I'm gaming... I can't play at 4K with an RX 570. So in order to avoid changing resolution or running games fullscreen, I had to settle for 1440p as my desktop resolution, and... yes, I'm playing at 1440p with an RX 570 in borderless windowed mode; for 1080p I'm forced to go fullscreen. In an ideal world all games would have separate UI and render resolutions, but this is not an ideal world.

If the RX 570 had DLSS, I would be able to use 4K desktop resolution and play at 1080p upscaled. This also helps people who want to play at very high fps on monitors with resolutions higher than 1080p.
It is still an issue because games have to support it, but it's something.

What's important here is this: since AMD is in the consoles, whatever system they implement will probably end up being the widely used option.
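The appeal is easy to quantify: shading at 1080p and presenting at 4K touches only a quarter of the display's pixels. A quick sketch using the resolutions from the post:

```python
# Pixel-count arithmetic for upscaled rendering (render low, present high).
def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)   # 8,294,400 pixels on the display
internal = pixels(1920, 1080)    # 2,073,600 pixels actually shaded

print(internal / native_4k)      # 0.25 -> a quarter of the shading work
```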
 
Last edited:
Reactions: Cableman

Thala

Golden Member
Nov 12, 2014
*Link to previous discussion*

*Link to previous discussion*

*Link to previous discussion*

I think the words "minimal quality loss" are extremely subjective here, as are the words "when everyone thinks it's a great feat, you have to re-evaluate".
Objectively, the IQ worsens. If some people don't care, fine, turn on DLSS. But it is still a total waste of die space. AMD has had upscaling for a long time now. Good, let those who want to run the games like that do so.

You are wrong in two respects. First, DLSS can improve IQ; there are lots of examples published by reviewers. Second, none of the upscaling solutions offered by AMD comes anywhere close to DLSS (2.0) as far as IQ is concerned.
I would not even consider buying an AMD solution unless they have an upscaling technology comparable to DLSS; it is just that much of a game changer in my opinion.
 

PhoBoChai

Member
Oct 10, 2017
You are wrong in two respects. First, DLSS can improve IQ; there are lots of examples published by reviewers. Second, none of the upscaling solutions offered by AMD comes anywhere close to DLSS (2.0) as far as IQ is concerned.
I would not even consider buying an AMD solution unless they have an upscaling technology comparable to DLSS; it is just that much of a game changer in my opinion.

He's wrong, with evidence presented? Look at the comparison image.

DLSS 2 has artifacts; it's not a simple case of superior IQ. It's a compromise, and it's up to individuals whether they want those compromises.

The fact that DLSS 2 relies on motion vectors to function should already tell you that it can be prone to errors on objects that lack motion vectors. You only have to look more carefully at Death Stranding to see all the artifacts it generates.

This entire "better than native" line is a terrible meme, due to native being blurred by TAA. Use TSSAA 8x as in the idTech engine and there's zero chance of DLSS 2 even looking as good as native.
 

Glo.

Diamond Member
Apr 25, 2015
5,762
4,667
136
You are wrong in two respects. First, DLSS can improve IQ; there are lots of examples published by reviewers. Second, none of the upscaling solutions offered by AMD comes anywhere close to DLSS (2.0) as far as IQ is concerned.
I would not even consider buying an AMD solution unless they have an upscaling technology comparable to DLSS; it is just that much of a game changer in my opinion.
Yeah.

Superior image quality by lowering image quality.

Guys, DLSS objectively LOWERS image quality. How can it give better image quality than native rendering?

I didn't want to chime in on the topic of DLSS, because I could not care less about it, but people claiming that by lowering image quality you achieve better image quality is a logical fallacy.

Only possible by Nvidia's marketing, I guess.
 

kurosaki

Senior member
Feb 7, 2019
Yeah.

Superior image quality by lowering image quality.

Guys, DLSS objectively LOWERS image quality. How can it give better image quality than native rendering?

I didn't want to chime in on the topic of DLSS, because I could not care less about it, but people claiming that by lowering image quality you achieve better image quality is a logical fallacy.

Only possible by Nvidia's marketing, I guess.

Bu, bu buut. But AI! And Tensor cores!!! Nvidia told us it's making the pictures look even better than the original IQ! I've seen a youtuber post such things!
 

dzoni2k2

Member
Sep 30, 2009
He's wrong, with evidence presented? Look at the comparison image.

DLSS 2 has artifacts; it's not a simple case of superior IQ. It's a compromise, and it's up to individuals whether they want those compromises.

The fact that DLSS 2 relies on motion vectors to function should already tell you that it can be prone to errors on objects that lack motion vectors. You only have to look more carefully at Death Stranding to see all the artifacts it generates.

This entire "better than native" line is a terrible meme, due to native being blurred by TAA. Use TSSAA 8x as in the idTech engine and there's zero chance of DLSS 2 even looking as good as native.

DLSS is better than native, Ampere is 4x faster in RT, and perf/W went up 1.9x. The guy in the black leather jacket said so, so it's definitely true!
 