Question Speculation: RDNA2 + CDNA Architectures thread


uzzi38

Platinum Member
Oct 16, 2019
2,702
6,404
146
All die sizes are within 5mm^2. The poster here has been right about some things in the past afaik, and to his credit was the first to say 505mm^2 for Navi21, which other people have since backed up. Even still, take the following with a pinch of salt.

Navi21 - 505mm^2

Navi22 - 340mm^2

Navi23 - 240mm^2

Source is the following post: https://www.ptt.cc/bbs/PC_Shopping/M.1588075782.A.C1E.html
 

moinmoin

Diamond Member
Jun 1, 2017
4,994
7,765
136
Is it just me, or does anyone else feel economically assaulted by the humongous rise in pricing?
I remember buying fairly high-end around the 300 USD mark. That feels like fiction today; what happened? Until now I have never paid much more than that either, and it feels kind of disgusting that the prices have more than tripled in several cases. It's the same performance per dollar every year; if you want something better, you have to pay up.
Plenty of the graphics performance increases of the last decade have come through brute force (be it significantly bigger chips and/or much higher power usage), while at the same time previously expensive "pro" cards (e.g. Titan) have been moved down the tier structure as the new normal for the consumer market without keeping the earlier pricing structure. The market essentially accepted such price increases without hesitation, so there we are today.
 

gorobei

Diamond Member
Jan 7, 2007
3,713
1,067
136
Is it just me, or does anyone else feel economically assaulted by the humongous rise in pricing?
I remember buying fairly high-end around the 300 USD mark. That feels like fiction today; what happened? Until now I have never paid much more than that either, and it feels kind of disgusting that the prices have more than tripled in several cases. It's the same performance per dollar every year; if you want something better, you have to pay up.
Uh, no. As we push ahead to ever-smaller nanometer-scale features, the prices for finished wafers are going up at an accelerating rate.
I can't remember which techtuber I heard this from, but they made the observation that the ETH mining boom back in 2016(?) demonstrated that there was pricing elasticity in GPUs. Basically AMD/NV had a product that miners were willing to pay a premium for over historical pricing, and gamers were forced to discover (and pay) the upper limits of their comfort zone. NV just ran with it, hence Turing and Ampere at $1200/$1500. AMD's lack of competitiveness allowed them to do so, but consumers could also have chosen not to pay those prices.

Jim's video on "Titan" performance just demonstrates how many people have fallen for NV marketing nonsense.

New nodes with multiple masks have driven costs up, but chiplets and EUV may get us back down in 3 or 4 years.
 
Reactions: SamMaster

kurosaki

Senior member
Feb 7, 2019
258
250
86
I can't remember which techtuber I heard this from, but they made the observation that the ETH mining boom back in 2016(?) demonstrated that there was pricing elasticity in GPUs. Basically AMD/NV had a product that miners were willing to pay a premium for over historical pricing, and gamers were forced to discover (and pay) the upper limits of their comfort zone. NV just ran with it, hence Turing and Ampere at $1200/$1500. AMD's lack of competitiveness allowed them to do so, but consumers could also have chosen not to pay those prices.

Jim's video on "Titan" performance just demonstrates how many people have fallen for NV marketing nonsense.

New nodes with multiple masks have driven costs up, but chiplets and EUV may get us back down in 3 or 4 years.
The funny thing is, there have always been new masks and nodes. How many 98-micron chips did we get from a wafer in 1998? By the looks of it, if every node shrink doubled the cost, GPUs would be in the million-USD range by now.
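That reductio can be put into numbers with a quick back-of-envelope sketch. Note the inputs here are my own illustrative assumptions (roughly ten full node transitions since 1998, and a notional $2,000 late-90s wafer price), not figures from the thread:

```python
# Reductio sketch: what a wafer would cost today if every full node
# shrink since 1998 had literally doubled the price. Both inputs are
# assumed round numbers for illustration, not real foundry data.

NODE_TRANSITIONS = 10      # roughly 250nm (1998) down to 7nm (2020)
BASE_WAFER_PRICE = 2_000   # assumed late-90s wafer price, USD

price_if_doubling = BASE_WAFER_PRICE * 2 ** NODE_TRANSITIONS
# -> $2,048,000 per wafer, which obviously never happened; real wafer
# prices rose far more slowly, so cost per transistor kept falling.
```

Ten doublings is a 1024x multiplier, so even a cheap 90s wafer lands over $2M; actual cost growth per node has clearly been far below 2x on average.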
 

gorobei

Diamond Member
Jan 7, 2007
3,713
1,067
136
The funny thing is, there have always been new masks and nodes. How many 98-micron chips did we get from a wafer in 1998? By the looks of it, if every node shrink doubled the cost, GPUs would be in the million-USD range by now.
With the double and quad patterning needed for the 14nm/16nm generation, the number of masks and passes through the machine/stage jumped compared to past nodes. Additionally, more time needed per pass means a completed wafer takes more work hours to produce, so fewer total wafers get processed in a given time. Fewer wafers per day/week/month = less money, so TSMC has to raise prices. EUV will reduce the number of those passes/masks.
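The throughput point can be shown with a toy model. The per-pass time and the 24/7 tool schedule below are invented numbers purely to show the proportionality, not real fab data:

```python
# Toy model of litho tool throughput vs. number of patterning passes.
# PASS_HOURS and the round-the-clock schedule are assumed values; the
# point is only the proportionality: 4x the passes -> 1/4 the lots.

PASS_HOURS = 1.5              # assumed tool hours per pass, per wafer lot
TOOL_HOURS_PER_WEEK = 7 * 24  # one tool running around the clock

def lots_per_week(passes: int) -> float:
    """Wafer lots a single tool can finish per week."""
    return TOOL_HOURS_PER_WEEK / (passes * PASS_HOURS)

single = lots_per_week(1)   # single patterning: 112 lots/week
quad = lots_per_week(4)     # quad patterning on a critical layer: 28
```

Same tool, same hours: quad patterning finishes a quarter as many lots, which is exactly the "fewer wafers per week = higher price per wafer" squeeze described above.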
 
Reactions: KompuKare

kurosaki

Senior member
Feb 7, 2019
258
250
86
With the double and quad patterning needed for the 14nm/16nm generation, the number of masks and passes through the machine/stage jumped compared to past nodes. Additionally, more time needed per pass means a completed wafer takes more work hours to produce, so fewer total wafers get processed in a given time. Fewer wafers per day/week/month = less money, so TSMC has to raise prices. EUV will reduce the number of those passes/masks.
But if we keep paying the prices we do, why should they lower them? Sometimes I believe the only thing that would stop the oligopoly is starting a consumers' union. Voting with your wallet works if we all agree on a price point; it won't work so well if people have stopped caring. Prices won't go down until WE make them go down, in unison.
 

PhoBoChai

Member
Oct 10, 2017
119
389
106
Lots of talk on wafer prices; yeah, they have gotten more expensive. But it's not at the point where wafer cost is a major limiter.

Do the calculations yourself: even with a doubling in wafer prices to $10K for 7N, each mid-range GPU still would not cost that much to produce. We're talking ~$70, and including binning, even less. For margins and R&D they sell it for $200 a chip; add on some GDDR6 and a PCB, and a $400 GPU is still very profitable for AIBs.
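The ~$70 figure can be sanity-checked with the standard dies-per-wafer approximation. The 340mm^2 die size, 80% yield, and $10K wafer price below are my assumptions, chosen to roughly match the numbers in the post:

```python
import math

# Back-of-envelope cost per die on a 300mm wafer, using the classic
# dies-per-wafer approximation (area term minus an edge-loss term).
# Die size, yield, and wafer price are assumed illustrative values.

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Gross candidate dies on a round wafer, accounting for edge loss."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

gross = dies_per_wafer(340)     # ~170 candidate mid-range dies
good = int(gross * 0.8)         # assume 80% of them yield
cost_per_die = 10_000 / good    # $10K wafer -> roughly $70 per chip
```

With those assumptions a wafer yields about 136 good dies at roughly $73 each, which lines up with the "~$70, less with binning" claim.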

It's all down to lack of supply. TSMC alone cannot provide for such a high-growth market with so many players wanting access to their leading node. It's the mobile market that demands the efficiency from TSMC, so let them pay more and pay first for it. They'll mostly move to 5N next year, and there will be plenty of 7N left for everyone else.

RDNA2 launching now may not be the greatest thing in terms of competing for wafers against Zen 3, but by early next year, there's going to be a surplus of 7N wafers AMD can use. Then towards the end of 2021, AMD will start ramping Zen 4 on 5N. Lots of cheap 7N wafers to play with.
 

YBS1

Golden Member
May 14, 2000
1,945
129
106
If AMD plans to price their 3080-level GPU at the same price, they'd better have RT, DLSS, and encoder/decoder up to Nvidia's levels ready for the upcoming presentation in 4 days.
Or alternatively they could just have cards that are actually available to buy; that's enough to "beat" nVidia at this point. I'm really hoping the top-end card is at least a bit faster than the 3090. The 3090 was a massive disappointment versus the 3080 at the given price points, and my 2080Ti is really struggling to achieve the framerates I desire at 5120x1440. (First world problems.)
 

Shivansps

Diamond Member
Sep 11, 2013
3,873
1,527
136
No big pockets here. Just an understanding of the business case. Mind you, if AMD's 6800XT matched the 3080 and cost $300, my wife would be buying me a nice Christmas present

I understand, and I also understand that both Nvidia and AMD increased prices in the last gen. Or do I need to remind you that the 5700XT was going to be called the "RX690" and that they changed the name to avoid the backlash?

10 years ago the HD5850 and HD5870 were the top of the line at $300/$400... Remember when the GTX280 was $650, and then AMD launched the HD5000 series and Nvidia was forced to price the GTX480 down to $500? Good times; I feel old. That doesn't happen anymore: prices are going up, and only up. The top Nvidia GPU costs $1500, the midrange starts at what used to be high-end prices, and AMD is likely to follow suit again. Good for AMD, but not so good for consumers.
 
Reactions: kurosaki

dzoni2k2

Member
Sep 30, 2009
153
198
116
I understand, and I also understand that both Nvidia and AMD increased prices in the last gen. Or do I need to remind you that the 5700XT was going to be called the "RX690" and that they changed the name to avoid the backlash?

10 years ago the HD5850 and HD5870 were the top of the line at $300/$400... Remember when the GTX280 was $650, and then AMD launched the HD5000 series and Nvidia was forced to price the GTX480 down to $500? Good times; I feel old. That doesn't happen anymore: prices are going up, and only up. The top Nvidia GPU costs $1500, the midrange starts at what used to be high-end prices, and AMD is likely to follow suit again. Good for AMD, but not so good for consumers.

People can always vote with their wallets and buy a console: a whole next-gen system for 500 USD. Beat that with a PC; a 3070 alone is that much and is maybe 15% faster.
 

PhoBoChai

Member
Oct 10, 2017
119
389
106
If AMD plans to price their 3080-level GPU at the same price, they'd better have RT, DLSS, and encoder/decoder up to Nvidia's levels ready for the upcoming presentation in 4 days.

The 5700XT was priced similarly to the 2060S and 2070S with the same 8GB of VRAM, had no RT or DLSS and a poor encoder, and it still sold well.

People think NV has this crazy mindshare, but the above should remind everyone that AMD has its share of fans too.

You all will be in for a shock: perf, perf/$, and VRAM are the major selling points, not niche stuff like DLSS that only worked well in a handful of games released across the entirety of the Turing generation.

For myself, I don't stream, so an encoder doesn't do jack for my use case. I game at 4K, so VRAM is important, as I plan to keep a flagship GPU for at least 3 years. RDNA2 has RT. So we're only talking about DLSS vs extra VRAM & efficiency.
 

undertaker101

Banned
Apr 9, 2006
301
195
116
I think more than the cost, the most egregious issue with this launch is availability. People keep saying prior launches were as bad, but I don't recall that from experience: I got a GTX 980, GTX 970 and a GTX 1080 from Amazon on launch day without too much effort. This has been the most frustrating buying experience of my life so far, and that includes negotiating with 10 dealers over the span of a few months for a 50k SUV to get the exact colors and features I wanted... and getting PowerBlock dumbbell expansions during the height of the Covid lockdowns earlier this year.
 
Last edited:

CastleBravo

Member
Dec 6, 2019
119
271
96
I understand, and I also understand that both Nvidia and AMD increased prices in the last gen. Or do I need to remind you that the 5700XT was going to be called the "RX690" and that they changed the name to avoid the backlash?

10 years ago the HD5850 and HD5870 were the top of the line at $300/$400... Remember when the GTX280 was $650, and then AMD launched the HD5000 series and Nvidia was forced to price the GTX480 down to $500? Good times; I feel old. That doesn't happen anymore: prices are going up, and only up. The top Nvidia GPU costs $1500, the midrange starts at what used to be high-end prices, and AMD is likely to follow suit again. Good for AMD, but not so good for consumers.

10 years ago it was common to have two of those $500-650 cards if you wanted to max out settings at high res. I see today's $1200-1500 cards as the replacement for that.
 

Leeea

Diamond Member
Apr 3, 2020
3,689
5,424
136
I don't like the fact that for sub 300$ GPUs we will have to wait till Q1 of next year.

Be it from Nvidia, or from AMD. The only thing that we might get is price cuts on current gen hardware. Maybe...
I am sitting in the same boat. I paid exactly $300 for my Vega 56, and it has no reasonable upgrade path unless I spend 2x+ what I paid for it. My TV is 4k, but my video card is only good for 1440p.


RDNA2 launching now may not be the greatest thing in terms of competing for wafers against Zen 3, but by early next year, there's going to be a surplus of 7N wafers AMD can use. Then towards the end of 2021, AMD will start ramping Zen 4 on 5N. Lots of cheap 7N wafers to play with.
I am hoping for price drops next year. Right now I feel priced out of the market. Ryzen 5000, 3080, 2080 Ti, and RDNA2* just seem excessively expensive right now.

I'm in fact going to skip this GPU gen and getting a Series X :>
Is that really an option?

Series X costs you about $530ish to buy right now. You will need a Microsoft Xbox account for an additional $60 per year.

The Series X is subsidized; Micro$oft will make that money back from you on software sales. You're in their store, their rules, with no other options. You're also stuck with their voice chat, their social network, and so on. Cheating is far less prevalent (although it still happens), but does that make up for the inability to mod single-player games? Community patches almost never come to console, and PC also gets a ton of indie exclusives.

The relative openness of the PC gives it a community and energy that just does not seem to exist on consoles. PC is dominant on Twitch for a reason.
 
Reactions: psolord

Shivansps

Diamond Member
Sep 11, 2013
3,873
1,527
136
People can always vote with their wallets and buy a console: a whole next-gen system for 500 USD. Beat that with a PC; a 3070 alone is that much and is maybe 15% faster.

That may actually happen a lot this gen. For the first time the consoles have genuinely good hardware, and the starting console costs as much as the PS2 did back in the day, something that was lost with the PS3/PS4 and the equivalent Xboxes.
 

Shivansps

Diamond Member
Sep 11, 2013
3,873
1,527
136
Is that really an option?

Series X costs you about $530ish to buy right now. You will need a Microsoft Xbox account for an additional $60 per year.

The Series X is subsidized; Micro$oft will make that money back from you on software sales. You're in their store, their rules, with no other options. You're also stuck with their voice chat, their social network, and so on. Cheating is far less prevalent (although it still happens), but does that make up for the inability to mod single-player games? Community patches almost never come to console, and PC also gets a ton of indie exclusives.

The relative openness of the PC gives it a community and energy that just does not seem to exist on consoles. PC is dominant on Twitch for a reason.

It's not an option for a lot of people, but Intel, AMD and Nvidia are pushing things too far. When your mainstream CPU costs $300 (6C) and your mainstream GPU costs $300 (likely similar to the S series, maybe a bit more), you're already talking about 10 years' worth of a subscription where they give you free games, and you aren't even halfway through building a PC.

For me, I could never adapt to consoles, but the fact remains this is getting way too expensive.
 
Last edited:

PhoBoChai

Member
Oct 10, 2017
119
389
106
It's not an option for a lot of people, but Intel, AMD and Nvidia are pushing things too far. When your mainstream CPU costs $300 (6C) and your mainstream GPU costs $300 (likely similar to the S series, maybe a bit more), you're already talking about 10 years' worth of a subscription where they give you free games, and you aren't even halfway through building a PC.

For me, I could never adapt to consoles, but the fact remains this is getting way too expensive.

You don't need the latest tech. Used hardware offers amazing bang for the buck; you can still game very well up to 1440p on cheap used hardware, and 1080p is much easier.
 

Mopetar

Diamond Member
Jan 31, 2011
8,004
6,445
136
The 3080 is an x080-class card, hence the name. The 2080 was just atrocious when compared to the previous generation.

The Ti card is the 3090, which is a price increase over the previous gen.

The 3080 is on the 102 die, which has historically been reserved for the Ti card. It's a bit more cut down than the typical Ti, but given how the 3090 stacks up, the 3080 doesn't lose much.

Ampere is still a good architecture. It's just that NVidia pushed the clocks too far for too little gain, and the top-end card has more resources than it can keep fed, causing performance gains to fall off hard. Unless AMD has some insanely low price for their cards, no one who has a 3080 will be disappointed.

You are surely joking right? Without seeing actual supply for the Radeons, 2020 will be remembered as the year of components no one could buy. Just like it is a terrible, horrible, no good, very bad year for everything else.

No one will really remember the shortages in five years, or it will be lumped in with the toilet paper shortages as some kind of joke.

No one talks about the 980 and 980 Ti shortages anymore, or the scalping that led to NVidia selling FE cards at higher prices with Pascal in order to counteract it. Maybe this one is worse because the 3080 is such a better value than anything Turing had to offer, at a fraction of the price, that an immediate upgrade is worth the price of admission. But in five years we'll just remember what a great value it was, especially since supplies will improve in time.
 

DDH

Member
May 30, 2015
168
168
111

Another leak in the wall.
Hmmmmm. Hmmmmmmmm. I think I have seen something similar months ago, somewhere...
This lends even more credence to the possibility of the 6800XT being a 64 CU card.

If we take the TechPowerUp review of the 3080 Trio, it's 80% faster than the 5700XT at 1440p: 3080 = 100, 5700XT = 55.

55 x 1.1 (10% IPC increase) x 1.8 (80% more CUs) x 1.21 (21% higher clocks) = 131. Theoretically 30% faster than a 3080 at 1440p? For it to only match the 3080, the scaling would have to be abysmal.

Compared to a 64 CU card:

55 x 1.1 x 1.6 x 1.21 = 117. Lose 10% to scaling and we have a 1440p number that would seem to match up with the game bench leaks from RedGamingTech.
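The arithmetic above, spelled out as a small script. The 55/100 baseline is the TechPowerUp 1440p index quoted in the post; the IPC, CU-count, clock, and scaling multipliers are the poster's speculative assumptions, not confirmed specs:

```python
# Back-of-envelope RDNA2 projection from the post above.
# Baseline: 5700XT = 55, 3080 = 100 on TechPowerUp's 1440p index.
# All multipliers are speculative assumptions, not confirmed specs.

BASELINE_5700XT = 55

def projected_index(cu_ratio: float, ipc: float = 1.10,
                    clocks: float = 1.21, scaling: float = 1.0) -> float:
    """Scale the 5700XT's index by IPC, CU count, clocks, and scaling."""
    return BASELINE_5700XT * ipc * cu_ratio * clocks * scaling

eighty_cu = projected_index(1.8)                 # ~131: ~30% past a 3080
sixty_four_cu = projected_index(1.6)             # ~117
lossy_64cu = projected_index(1.6, scaling=0.9)   # ~105: close to a 3080
```

The interesting part is how sensitive the conclusion is to the scaling term: a 64 CU part only needs to shed about 10% to scaling losses to land near the 3080's index of 100.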
 

Mopetar

Diamond Member
Jan 31, 2011
8,004
6,445
136
This lends even more credence to the possibility of the 6800XT being a 64 CU card.

If we take the TechPowerUp review of the 3080 Trio, it's 80% faster than the 5700XT at 1440p: 3080 = 100, 5700XT = 55.

55 x 1.1 (10% IPC increase) x 1.8 (80% more CUs) x 1.21 (21% higher clocks) = 131. Theoretically 30% faster than a 3080 at 1440p? For it to only match the 3080, the scaling would have to be abysmal.

Compared to a 64 CU card:

55 x 1.1 x 1.6 x 1.21 = 117. Lose 10% to scaling and we have a 1440p number that would seem to match up with the game bench leaks from RedGamingTech.

It would be odd for the 6800XT to be a 64 CU part; what's the 6800 then? It seems unlikely they have a 56 CU bin with how good yields are supposed to be on 7nm now. Also, the speculation is that the 64 CU part will have a disabled memory controller, and it's already almost unbelievable that AMD can get these results on a 256-bit bus.

I suppose it's possible we get two 64 CU parts where the only difference is the bus width, but that feels a little odd. I think the 64 CU part is the 6800 and the 6800XT will be the 72 CU part. The 6900/XT will both be full-die cards, with the difference being clock speed. We might not see the real top dog if AMD is holding out for some GDDR6X to pair with it, to make up for the bus-size limitations that have to start hitting at some point.

Honestly, if the leaks and rumors are close, AMD doesn't need to release a top card. It seems like they'll be close enough to a 3090 with what they have now that they can get away with promising something even better four months from now.
 