Question Speculation: RDNA3 + CDNA2 Architectures Thread



Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
If you are feeling poor and want entertainment, then you'll put yourself into more debt if your TV breaks, but you won't buy a new TV when you already have a working one.

That is the crux for video cards - if you're cash-strapped, you aren't going to buy a brand new video card when the one you have is fine (if you turn down a few settings); you'll only buy one if it breaks. If you are feeling rich, then you might buy a new video card because you fancy it, even though your current one still works.
 

Aapje

Golden Member
Mar 21, 2022
1,467
2,031
106
You're not understanding my point, though; my point wasn't that people in the U.S. don't spend much on luxuries. Having lived there for a couple of years, my first-hand experience is that Americans consume absurd amounts of luxuries relative to their incomes... when times are good/normal, which is in no small part due to the consumer debt culture there.

But to pretend that this debt-fueled consumer spending can continue during a recession, when people lose jobs or are worried about losing jobs, significantly strains credulity.
But it simply is untrue that luxury spending stops during recessions. Starbucks and restaurants don't suddenly have zero customers. The facts overwhelmingly show that people don't suddenly stop spending on luxuries. Even car sales are extremely high, so people are still making very big purchases: https://www.cbtnews.com/u-s-auto-sales-defy-economic-headwinds-in-first-quarter-of-2023/

You seem to have a model of reality that makes you believe it 'significantly strains credulity' that people are making big purchases like cars right now, but the facts simply show otherwise. And they don't buy the most bare-bones cars either; they still buy a lot of very luxurious cars. So your beliefs about how people react during recessions seem to be wrong.

I believe that people get used to luxury and then feel very entitled to it. What people see as 'giving up luxury' often amounts to a rather modest reduction, going from a high level of luxury to a slightly lower one. You almost never see a person with a decent income truly giving up nearly all luxury and choosing the very cheapest options.

Again, I think that people are far more critical of the value proposition during tough times, but their sense of entitlement to luxury still means they are not that hard to persuade to buy completely optional things, as long as they can rationalize the purchase to themselves and, probably more importantly, to others.
 
Last edited:

Aapje

Golden Member
Mar 21, 2022
1,467
2,031
106
That is the crux for video cards - if you're cash-strapped, you aren't going to buy a brand new video card when the one you have is fine (if you turn down a few settings); you'll only buy one if it breaks. If you are feeling rich, then you might buy a new video card because you fancy it, even though your current one still works.
Is it actually still 'fine' from their perspective, though?

If people feel entitled to play the latest games and still rock a fairly old and/or slow card, then the card may no longer work fine from their perspective. In reality, a lot of people feel entitled to at least medium settings and in many modern games, that excludes all kinds of cards that people are still rocking. Especially since many people bought rather mediocre cards during the mining boom or stuck with their already aging card.

Just before the mining boom, Jensen made a desperate plea to Pascal owners to upgrade their cards, telling them that they now actually got good value after Turing sucked. Then the prices went crazy and after that boom ended, many people chose to wait for Ada, or started to experience reduced purchasing power. So I do think that there are quite a few people who in fact don't feel that their card is fine and actually have a pretty big desire to upgrade, but only if they can justify it as a great or at least decent purchase.

But in reality, Ada is one of the least enticing generations ever, with historically low gains per generation and much increased prices per tier. It's considerably worse than Turing, which also sold very poorly.

I have a real hard time believing that when cards that provide poor value sell badly, it is somehow all down to recessions and the like, rather than people simply deciding that the upgrade is not worth it.

It actually worries me that Jensen may have these exact same delusions, because such beliefs hurt everyone if he decides to wait out the recession and keeps selling cards at poor value. It hurts consumers obviously, but it also hurts the company as they stick to bad prices that reduce their profits, just due to the false belief that as economic conditions improve, people will suddenly happily buy bad video cards.
 

jpiniero

Lifer
Oct 1, 2010
14,839
5,456
136
But in reality, Ada is one of the least enticing generations ever, with historically low gains per generation and much increased prices per tier. It's considerably worse than Turing, which also sold very poorly.

That's what happens when Moore's Law is Dead. I think you could see AMD slash dGPU R&D again if they give up on getting higher ASPs to cover the increased costs.
 

Aapje

Golden Member
Mar 21, 2022
1,467
2,031
106
That's what happens when Moore's Law is Dead. I think you could see AMD slash dGPU R&D again if they give up on getting higher ASPs to cover the increased costs.
Nvidia actually got huge gains per mm2 and seems to ask way more than the increased costs would justify, so I don't believe that the current pricing merely reflects the fact that it is now more costly to get gains. The prices seem inflated far beyond that.

I have a suspicion that Nvidia is already pricing in the cost increases of future nodes, trying to make us pay now for costs that don't actually exist for Ada.
 

Aapje

Golden Member
Mar 21, 2022
1,467
2,031
106
Probably, most of the world's definition of "fine" is a lot lower than this forum insists on.
I just took a little look on an Indian subreddit and they are demanding path tracing and raytracing from video cards. So their budgets may be lower, but the expectations are not necessarily much lower. In any case, good value matters even more in places with less wealth.

Strix Point is probably going to be a godsend for 2nd and 3rd world gamers who have both lower incomes and much higher taxes than in the US and EU.
 

KompuKare

Golden Member
Jul 28, 2009
1,075
1,120
136
Don't forget that while higher wafer costs are a fixed thing per unit (okay, with the downturn TSMC might allow some re-negotiation, and going chiplets plays into this too), the R&D costs are a fixed cost per generation, which will be baked into the price of each product of that generation.

Ergo, volume is key. There's no point in spending hundreds of millions of dollars on a new generation and then pricing it so high it barely sells. By all means don't make a loss, but take lower margins at higher volumes to keep the same or better profits.
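To make that concrete, here is a minimal sketch of the amortization arithmetic, with entirely made-up numbers (the unit build cost, R&D budget, prices and volumes below are illustrative assumptions, not actual AMD or TSMC figures):

```python
# Toy model: per-unit build cost (die, VRAM, PCB, cooler) is roughly fixed,
# while R&D is a fixed cost for the whole generation, so the R&D share of
# each card's cost shrinks as volume grows. All numbers are made up.

BUILD_COST = 250           # assumed $ to build one card
RND_BUDGET = 500_000_000   # assumed $ R&D for the whole generation

def per_unit_cost(units_sold):
    """Build cost plus the slice of generation R&D carried by each card."""
    return BUILD_COST + RND_BUDGET / units_sold

def generation_profit(price, units_sold):
    """Revenue minus per-unit build costs minus the fixed R&D."""
    return units_sold * (price - BUILD_COST) - RND_BUDGET

for price, units in [(1000, 2_000_000), (800, 3_000_000)]:
    print(f"${price} x {units/1e6:.0f}M units: "
          f"cost/card ${per_unit_cost(units):,.0f}, "
          f"profit ${generation_profit(price, units)/1e9:.2f}B")

# With these assumptions the cheaper, higher-volume plan amortizes the R&D
# over more cards ($417 vs $500 per-card cost) and ends up slightly more
# profitable ($1.15B vs $1.00B) - the "lower margins at higher volumes" case.
```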
 
Reactions: Tlh97 and maddie
Jul 27, 2020
17,957
11,708
116
That is the crux for video cards - if you're cash-strapped, you aren't going to buy a brand new video card when the one you have is fine (if you turn down a few settings); you'll only buy one if it breaks. If you are feeling rich, then you might buy a new video card because you fancy it, even though your current one still works.
Yeah. I bet tons of people who had really old cards that lasted them 4 or 5 years, immediately plopped down the cash/credit for 4090s or 7900 XTXs, causing them to sell out at launch.
 
Jul 27, 2020
17,957
11,708
116
I just took a little look on an Indian subreddit and they are demanding path tracing and raytracing from video cards. So their budgets may be lower, but the expectations are not necessarily much lower. In any case, good value matters even more in places with less wealth.
Without seeing the comments on that subreddit, it's not possible to tell how wealthy they might be. Are they demanding good raytracing perf from cheap cards or expensive cards? Could be a bunch of wealthy dudes (upper-level managers, business owners, etc.).

Just because they live in a third-world country doesn't mean they are impoverished.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,106
136
Don't forget that while higher wafer costs are a fixed thing per unit (okay, with the downturn TSMC might allow some re-negotiation, and going chiplets plays into this too), the R&D costs are a fixed cost per generation, which will be baked into the price of each product of that generation.

Ergo, volume is key. There's no point in spending hundreds of millions of dollars on a new generation and then pricing it so high it barely sells. By all means don't make a loss, but take lower margins at higher volumes to keep the same or better profits.
I don't think so. It doesn't matter if it's 5M cards @ $1000 or 10M cards @ $500; total revenue and profits will be the same [so long as the $500 card isn't selling at a loss].
 
Last edited:

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,259
136
Nvidia actually got huge gains per mm2 and seems to ask way more than the increased costs would justify, so I don't believe that the current pricing merely reflects the fact that it is now more costly to get gains. The prices seem inflated far beyond that.

By switching to a new process that costs much more per mm2, which throws the rest out the window.

This is the big change in Moore's Law.

Up until about Pascal, a new full process node jump would essentially give you double the transistors for the same price. So you would get a massive increase in the transistor budget for FREE, meaning you could essentially double up on the GPU functional units at the same price. It was a great ride while it lasted, and it lasted for decades.

But that free ride is over. Progress hasn't stopped, but now your FREE increase in transistor budget each generation may be more like 10-20% instead of 100%, and that leads to slower progress. Node shrinks are still happening, but they've hit a cost wall, which makes doubling the transistor budget for FREE a thing of the past.
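Here's a rough back-of-the-envelope version of that cost wall. The density and wafer-cost ratios are assumptions for the sake of the example, not real foundry numbers, but they show why the "free" budget increase shrinks:

```python
# Back-of-the-envelope: how many extra transistors the SAME die budget buys
# after a node jump. The ratios below are illustrative assumptions, not
# real foundry figures.

def free_transistor_gain(density_gain, wafer_cost_gain):
    """Extra transistors per dollar after a shrink.
    density_gain: x-factor in transistors per mm^2.
    wafer_cost_gain: x-factor in price per mm^2 of wafer."""
    return density_gain / wafer_cost_gain - 1.0

# Old regime: ~2x density at near-flat wafer prices -> ~90% more transistors
# for the same money, i.e. the "double up for FREE" era.
print(f"{free_transistor_gain(2.0, 1.05):.0%}")   # ~90%

# New regime: smaller density gains and much pricier wafers -> only ~15-20%
# more transistors for the same money, matching the 10-20% figure above.
print(f"{free_transistor_gain(1.7, 1.45):.0%}")   # ~17%
```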

AMD seems to be getting something like 10-20% gains between process and tech improvements this generation, which is paltry compared to previous generations, but that is just the new reality of transistor economics. How does AMD handle that? Pass on small 10% to 20% gains to consumers, or slash margins to the bone to give more gains to the consumer than the underlying technology provided? If they slash margins this time, where does that leave them next time if they only get another small technology gain?

It's a tough spot for AMD to be in, which is why some are speculating N32 won't even sell to consumers.
 
Reactions: Mopetar

insertcarehere

Senior member
Jan 17, 2013
639
607
136
I just took a little look on an Indian subreddit and they are demanding path tracing and raytracing from video cards. So their budgets may be lower, but the expectations are not necessarily much lower. In any case, good value matters even more in places with less wealth.

Strix Point is probably going to be a godsend for 2nd and 3rd world gamers who have both lower incomes and much higher taxes than in the US and EU.
To be on that Indian subreddit, the average poster (if Indian) probably has:
- Conversational knowledge of English, in a country where less than a tenth of the population speaks any English.
- The income level to afford internet in India, or a job in a country where that is a given.

That demographic would almost certainly make them well-off by Indian standards. At any rate, it should not be taken as a representative sample of what the broader 'India' market demands.
 

beginner99

Diamond Member
Jun 2, 2009
5,223
1,598
136
N33 is clearly designed to be cheap as hell and to flood the market. Of course AMD might have preferred fat margins, but I think they're in a position to adapt to market demands and move a lot of inventory for a lot less.

$250 still won't be cheap enough. If you can choose between a used 6700 XT with more RAM or a likely slightly worse 7600 (XT) with less RAM for the same price, $250 seems like too much.
 

Aapje

Golden Member
Mar 21, 2022
1,467
2,031
106
But that free ride is over. Progress hasn't stopped, but now your FREE increase in transistor budget each generation may be more like 10-20% instead of 100%, and that leads to slower progress. Node shrinks are still happening, but they've hit a cost wall, which makes doubling the transistor budget for FREE a thing of the past.

AMD seems to be getting something like 10-20% gains between process and tech improvements this generation, which is paltry compared to previous generations, but that is just the new reality of transistor economics. How does AMD handle that?
Nvidia has already found a solution by focusing more on specialized hardware and software solutions. What disappoints me most about AMD is how little effort they put into this. They are mostly just copying Nvidia, which means that they'll always be a step behind. I'd like to see them put out a feature that puts Nvidia on the back foot for once.

Lisa Su has recently actually spoken about this and promised that they can keep the pace going with alternative methods, but I'm not seeing them actually making good on that. Perhaps they thought that multi-die would do it for the 7000-series, but clearly not.
 
Reactions: Tlh97

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,259
136
Nvidia has already found a solution by focusing more on specialized hardware and software solutions. What disappoints me most about AMD is how little effort they put into this. They are mostly just copying Nvidia, which means that they'll always be a step behind. I'd like to see them put out a feature that puts Nvidia on the back foot for once.

Lisa Su has recently actually spoken about this and promised that they can keep the pace going with alternative methods, but I'm not seeing them actually making good on that. Perhaps they thought that multi-die would do it for the 7000-series, but clearly not.

Nvidia pricing is still disappointing a lot of people though, so it's not the solution many seemed to want.

It looks like AMD will integrate more Xilinx tech into their GPU cards. The XDNA AI engine in the Phoenix APU is a good start; that has me hoping the RDNA 4 cards will have it, and possibly the Xilinx video encoder tech as well.

I want to mess around with things like Stable Diffusion, and right now an RTX 3050 can turn out images faster than an RX 6950 XT.
 

moinmoin

Diamond Member
Jun 1, 2017
4,994
7,765
136
What disappoints me most about AMD is how little effort they put into this. They are mostly just copying Nvidia
Remember that Nvidia sees itself more as a software company, so it's natural that it has an advantage there. Also, that's an advantage built over decades, like with CUDA. AMD until recently didn't have the financial power to even pay for an equal workforce. So being a follower is safe and not as costly.

This likely changes completely now that Xilinx people are taking over everything software.

Lisa Su has recently actually spoken about this and promised that they can keep the pace going with alternative methods, but I'm not seeing them actually making good on that.
Except for the 7000 launch (where something obviously went awry, considering AMD didn't meet its own promises from shortly before), I'm actually quite happy with their results as a follower so far: hardware was keeping up, and in software AMD managed to counter Nvidia's proprietary solutions with adequate open-source counterparts.

Of course AMD's ambition should be to be more than a follower. Let's see what change Xilinx being in the mix makes in the next couple years.
 

Aapje

Golden Member
Mar 21, 2022
1,467
2,031
106
Nvidia pricing is still disappointing a lot of people though, so it's not the solution many seemed to want.
Yeah, but it's pretty much impossible to explain their prices based on increased costs, so they just seem to be price gouging.

For example, the 4080 has a much smaller chip than the 3080, so even if they pay twice as much to TSMC as they paid to Samsung, these prices are pretty much impossible to explain. Especially since other parts of the card (VRAM, cooling, PCB, etc.) certainly won't have doubled in cost.

And even if you abandon all common sense and rationality to believe that the 4080 is priced as it is because the chip is so much more expensive, then the pricing of the 4070 doesn't make any sense at all. After all, the 4070 chip is actually much closer in size to the 3070 chip than the 4080 chip is to the 3080 chip. So if we believe that Nvidia was forced to make the 4080 70% more expensive due to higher chip costs, then the 4070 should be way more than 70% more expensive than the 3070. Yet in reality, it is 'only' 20% more expensive.
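For reference, a quick sketch of that arithmetic using approximate launch MSRPs and commonly cited die sizes (ballpark public figures, not exact BOM data):

```python
# Approximate launch MSRPs ($) and commonly cited die sizes (mm^2);
# treat these as ballpark figures, not exact numbers.
tiers = {
    # tier: (old_price, new_price, old_die, new_die)
    "3080 -> 4080": (699, 1199, 628, 379),   # GA102 -> AD103
    "3070 -> 4070": (499,  599, 392, 295),   # GA104 -> AD104
}

for tier, (p_old, p_new, d_old, d_new) in tiers.items():
    price_up = p_new / p_old - 1   # generational price increase
    die_ratio = d_new / d_old      # new die size relative to its predecessor
    print(f"{tier}: price +{price_up:.0%}, new die ~{die_ratio:.0%} of the old one")

# Output (approx.):
#   3080 -> 4080: price +72%, new die ~60% of the old one
#   3070 -> 4070: price +20%, new die ~75% of the old one
# If per-mm^2 silicon cost alone drove the 4080's ~70% price hike, the 4070
# (whose die shrank less relative to its predecessor) should have risen at
# least as much - yet it only went up ~20%.
```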

So the entire 'Nvidia was forced into these prices due to Moore's law slowing down' narrative is just untenable. Perhaps they can't go as low as we'd like, but they certainly didn't choose these prices due to increased cost.
 
Reactions: Cableman and Tlh97

Aapje

Golden Member
Mar 21, 2022
1,467
2,031
106
Except for the 7000 launch (where something obviously went awry, considering AMD didn't meet its own promises from shortly before), I'm actually quite happy with their results as a follower so far: hardware was keeping up, and in software AMD managed to counter Nvidia's proprietary solutions with adequate open-source counterparts.
I think that we have different definitions of 'keeping up' and 'counter.'

In hindsight, it seems that AMD was only able to almost match Ampere with the RDNA 2 cards because Nvidia was a node behind. Right now, they need a total of 531 mm2 on the 7900 XTX to match a 380 mm2 4080. Of course the AMD card is less space efficient due to the multi-die, and has cheaper cost per mm2 on average. But it's not at all what I would call 'keeping up.'

And FSR 2 is a bit worse than DLSS 2, and FSR 3 is nowhere to be seen. AMD is a generation behind on raytracing performance. AMD is way behind on ROCm compared to CUDA.

It's all good enough that they can still sell their cards for a decent profit (also thanks to Nvidia price gouging because AMD lets them get away with it), but I can't imagine Jensen having any worry about AMD.
 

moinmoin

Diamond Member
Jun 1, 2017
4,994
7,765
136
I think that we have different definitions of 'keeping up' and 'counter.'
Yes, very different ones apparently.

Remember that RDNA2 was mainly in the works for the current gen of consoles. That it held up well against Nvidia at the time was a surprise, node difference or not. RDNA3 completely dropped the ball though.

And FSR 2 is a bit worse than DLSS 2 while being open source, not relying on any proprietary hardware, and not being locked to one manufacturer. I don't know about you, but for me the choice in such cases is always a very easy pick. (I use Linux though; I've been away from Windows as my main system for over two decades now and always pick open-source solutions wherever possible.)

Raytracing performance is still completely unusable on mainstream graphics cards. Nvidia obviously wants to sell the highest-end cards, so they don't mind. But for the overall gaming market, the current RT tech is a dead end, unless people seriously expect the tech of huge power-sucking cards to end up in consoles and handhelds at a time when Moore's Law scaling is dead.

CUDA is 16 years old. ROCm was started 6 years ago and is already in use in exascale supercomputers. It already does its job, but obviously it will take more time to supplant the market leader in mindshare and in the general market. But that's not an effort by AMD alone anyway; Intel and all the other companies and clients looking for alternatives to Nvidia's closed ecosystem are part of it too.

All that said, I'm actually not interested in whether AMD can sell its cards at a decent profit. If they do good work, they should be able to do just that. The current consumer-gen RDNA3 may as well not exist though.

Jensen worries about building an Apple-like premium ecosystem, and the focus is increasingly moving away from the consumer market toward the data center market. Due to Nvidia's single-minded focus on making every single tech (essentially stolen or otherwise) proprietary, I couldn't care less in any case though.
 
Reactions: Tlh97 and KompuKare