Discussion: Ada/'Lovelace'? Next-gen Nvidia gaming architecture speculation

Page 94

gdansk

Platinum Member
Feb 8, 2011
The component costs to make GPUs are low. The margins are high. They are high to recoup R&D.
You know, all those fancy features you hardly ever use.

The BOM lunatics are wrong about it being chips and components. It is entirely because Nvidia wants mass market products to have margins too. All Nvidia buyers pay for the development of those tensor cores, ray tracing support, CUDA and so on whether they use it or not.
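To make that concrete, here's a toy amortization sketch; every number below is hypothetical, purely to illustrate the mechanism, not Nvidia's actual costs:

```python
# Hypothetical illustration: once shared R&D is amortized per unit,
# every buyer carries a slice of the feature-development cost.
bom = 120               # component cost per card, USD (made up)
rnd = 2_000_000_000     # generation-wide R&D: tensor cores, RT, CUDA... (made up)
units = 20_000_000      # cards sold across the generation (made up)

per_unit = bom + rnd / units
print(per_unit)  # 220.0 -> $100 of R&D baked into every card, used or not
```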
 

jpiniero

Lifer
Oct 1, 2010
The BOM lunatics are wrong about it being chips and components. It is entirely because Nvidia wants mass market products to have margins too. All Nvidia buyers pay for the development of those tensor cores, ray tracing support, CUDA and so on whether they use it or not.

You could say they could have removed the OFA (optical flow accelerator) on AD107 to save space or add more SMs. But that would mean no frame gen.
 

Heartbreaker

Diamond Member
Apr 3, 2006
I would not be surprised if the BOM of a 4060 were less than that of the 7600. The silicon might cost a tad more due to wafer costs, but the lower TDP means NV can save on the heatsink, so I bet that's a rough wash.

As such, a $249 price is still going to be very profitable for AMD/NV on this kind of product.

No doubt NVidia will get better margins than AMD, because NVidia will charge $299, which will force AMD to charge $249 at most, and even then they will be unlikely to match NVidia sales.

But the question remains. If margins are so rich, why won't AMD lower prices to $199 and take market share?
 

Aapje

Golden Member
Mar 21, 2022
But the question remains. If margins are so rich, why won't AMD lower prices to $199 and take market share?
Because this is an oligopoly. In an oligopoly you often get bad market conditions, far from a truly competitive market where production costs drive prices.
 
Reactions: Lodix

Timorous

Golden Member
Oct 27, 2008
No doubt NVidia will get better margins than AMD, because NVidia will charge $299, which will force AMD to charge $249 at most, and even then they will be unlikely to match NVidia sales.

But the question remains. If margins are so rich, why won't AMD lower prices to $199 and take market share?

Because AMD learned after the HD 4000 series that this does not really work in the face of NV mindshare.

This can be seen in the fact that the more expensive and way slower 3050 has outsold the 6600.
 

Mopetar

Diamond Member
Jan 31, 2011
But the question remains. If margins are so rich, why won't AMD lower prices to $199 and take market share?

Normally they'd rather use wafers to make something more profitable, but with sales down everywhere no one will be wafer-limited, and Navi 33 isn't competing for 5nm wafers anyway.

On top of that, the Navi 33 die is smaller, and cheaper still if the claims of TSMC offering rebates to customers who move 7nm production to 6nm were true.

AMD doesn't have any Navi 32 cards to steal sales from either. It's also doubtful that NVidia would chase AMD by cutting prices, since they wouldn't want to devote more wafers to low-end, low-margin dies, or to cannibalize some of their own mid-range sales by leaving a massive price gap between cards.

The HD 4000 series sold well, but it didn't set AMD up for future prosperity.

Some of that's on AMD. Evergreen was a decent follow-up, but by the time they started releasing their Islands line of GPUs, NVidia had fixed a lot of their issues and was willing to build a bigger die (as much as 50% larger) if that's what it took to take the performance crown.

AMD's first foray into making a truly massive die (Fiji) didn't really pan out for them, as they had a hard time keeping all of the shaders occupied with useful work. Maxwell was also one of NVidia's best architectures, so having to go up against a 980 Ti, which offered 50% more VRAM and generally better performance at slightly lower power draw, made it hard for AMD to sway gamers at the time. Never mind that there was a Titan card above that.

If AMD had been willing to make a massive die when they were ahead or on even footing, things might have gone a lot differently for them. They might have had some of the top cards, building more mindshare and more expertise with designing bigger dies, which would have made Fiji a better product.
 

Saylick

Diamond Member
Sep 10, 2012
If AMD had been willing to make a massive die when they were ahead or on even footing, things might have gone a lot differently for them. They might have had some of the top cards, building more mindshare and more expertise with designing bigger dies, which would have made Fiji a better product.
Therein lies the rub. At the time, AMD would never make a massive die because, outside of gamers, they'd have no alternate market to sell it to. Nvidia always had an HPC and workstation market that would justify the design and fabrication of a massive die, and the higher margins of those markets more or less enable a massive-die GPU for the consumer market.
 
Reactions: TESKATLIPOKA

Ajay

Lifer
Jan 8, 2001
Therein lies the rub. At the time, AMD would never make a massive die because, outside of gamers, they'd have no alternate market to sell it to. Nvidia always had an HPC and workstation market that would justify the design and fabrication of a massive die, and the higher margins of those markets more or less enable a massive-die GPU for the consumer market.
But AMD should be able to amortize RDNA R&D across the console market as well; that's a pretty big leg up. The workstation market is more about 'approved' drivers for various CAD/CAM software, etc. (though that changed a bit with CUDA development and will change more with AI development, I imagine).
 
Reactions: TESKATLIPOKA

GodisanAtheist

Diamond Member
Nov 16, 2006
Therein lies the rub. At the time, AMD would never make a massive die because, outside of gamers, they'd have no alternate market to sell it to. Nvidia always had an HPC and workstation market that would justify the design and fabrication of a massive die, and the higher margins of those markets more or less enable a massive-die GPU for the consumer market.

- It's a vision and culture problem for DAAMIT. Nvidia went into the programmable-shader DX10 generation with a vision.

They weren't selling Graphics Processing Units, they were selling General Processing Units, and their dies were gonna be every bit as good and ubiquitous as Intel's.

It's wild to look back and see how expertly, and with what laser focus, NV executed on that vision, from CUDA on the 8800 series all the way to today.

DAAMIT, on the other hand, started the DX10 generation with a massive wet turd in the form of the 2900XT (the 2900XTX never even launched; maybe a cursed moniker, since the next XTX card they would release was the 7900XTX, which also failed to live up to the hype). Their next couple of gens were on the back foot, following exactly the wrong small-die, small-features strategy while NV entrenched CUDA and was willing to sell massive dies for razor-thin margins, because they knew CUDA was the golden ticket while AMD played the short game for children's toys in a fickle market.

AMD was and is a CPU company. GPUs were silly things that played children's games. They had these farty ideas about heterogeneous compute (who even remembers that whole spiel) and TO THIS VERY GOD DAMN DAY have not been able to make a heterogeneous compute complex (sort of what became an APU) where the graphics cores and x86 cores work together on generalized workloads. We see APUs today as cute HTPC/laptop parts, but they were supposed to be AMD's CUDA-level revolution, and they have failed to this very day in making that vision a reality.

AMD is fundamentally the wrong company to shepherd a big GPU division. They're *just not interested*. Look at all the cool stuff AMD has done in CPUs throughout the years. Even their failures are kinda neat "what if" moonshots that had vision for a market that never came to exist. Their CPUs have a vision. Their GPUs, though? Paint by numbers, lost in the desert, the other guys. It's just a product to them, run in a very MBA-esque manner. Which is fine for the vidyagames, but it's not nearly enough to take on a juggernaut like Nvidia.
 

Saylick

Diamond Member
Sep 10, 2012
- It's a vision and culture problem for DAAMIT. Nvidia went into the programmable-shader DX10 generation with a vision. [...]
Ehhh, I was too young to be paying that close attention to AMD/ATI and Nvidia during that time, but I'm not so sure Nvidia had this grand scheme in mind the entire time. If I'm not mistaken, CUDA was developed by a non-Nvidia software engineer who was tinkering on their own and had come up with a way to use the GPU to do compute. Nvidia saw the value in this person's work and hired them to lead the development of the framework. A few years later the project was given the name CUDA, and it launched in 2006 with the 8800 series. I don't think Jensen Huang had the foresight to develop CUDA, because if he did, he wouldn't have needed inspiration from this engineer. I will give him credit for seeing the opportunity and jumping on it, though, not unlike how, during the early innings of deep learning, he discovered that AI researchers were using Nvidia GPUs. Prior to this, the company's focus in GPU-accelerated computing was HPC and scientific simulation. You know, 64-bit, double-precision, traditional HPC stuff. Again, when he realized the potential, he immediately shifted the company's focus to this fledgling market, because he correctly predicted that it would eventually dwarf the markets they previously competed in. The same could possibly be said of their continual investments in the metaverse and self-driving cars, both of which are long-term plays that have the potential to become $100B+ industries.

Long story short, if there's one thing JHH is good at, it's not that he has "invented" anything his company is known for. What he is good at is diversifying the use of GPUs and continually investing to find ways to increase that diversification. He recognizes that Nvidia is so dominant in the markets they currently compete in that not diversifying would mean stagnating, because they're likely already saturating those markets. It's the Blue Ocean strategy in a nutshell: you can't gain much more market share when you already have a commanding portion of it, so you must create new markets to grow.

AMD and Intel compete in Red Oceans:
[attachment 80798]

Nvidia strives to come up with Blue Oceans. When you're first to enter an unknown market, you can dictate the rules. CUDA is what allows them to dictate the AI space.
[attachment 80799]

Apple is a good example of a company that has created and captured numerous Blue Oceans (iPod, iPhone, iPad). It's no surprise that JHH wants Nvidia to be like Apple, and many of us have made that comparison before.

Lastly, Nvidia's disinterest in providing value to customers in the consumer GPU space is because the consumer GPU space is precisely the definition of a Red Ocean, and JHH doesn't give a Redacted about Red Oceans anymore when there are Blue Oceans to capture.

Profanity is not allowed in tech forums.
admin allisolm
 
Last edited by a moderator:

jpiniero

Lifer
Oct 1, 2010

nVidia only planning on doing an FE for the 4060 Ti 8 GB.

We'll see how $299 the 4060 ends up being.
 
Reactions: igor_kavinski

Heartbreaker

Diamond Member
Apr 3, 2006
Ehhh, I was too young to be paying that close attention to AMD/ATI and Nvidia during that time, but I'm not so sure Nvidia had this grand scheme in mind the entire time. [...]

Seems like a long-winded way of saying no company wants to compete in commodity products, and instead seeks differentiated products to stand out.

No one chooses to be a Dell when they could be an Apple.
 
Reactions: Saylick

Heartbreaker

Diamond Member
Apr 3, 2006

nVidia only planning on doing an FE for the 4060 Ti 8 GB.

We'll see how $299 the 4060 ends up being.

As long as the current economic malaise continues, we will probably get MSRP cards.
 

Aapje

Golden Member
Mar 21, 2022
As long as the current economic malaise continues, we will probably get MSRP cards.
They are actually making three-fan 4060s that are probably going to cost a decent chunk more, which is completely ridiculous given that it's actually a 4050.

Funnily enough, the AIBs are also making a lot of absolutely tiny single-fan models, to really drive the point home that you are getting a tiny chip that even Lay's would be ashamed to sell.
 

Heartbreaker

Diamond Member
Apr 3, 2006
They are actually making three-fan 4060s that are probably going to cost a decent chunk more, which is completely ridiculous given that it's actually a 4050.

Funnily enough, the AIBs are also making a lot of absolutely tiny single-fan models, to really drive the point home that you are getting a tiny chip that even Lay's would be ashamed to sell.

More of the same nonsense. It's not actually a 4050.

It is actually a 4060, which will be competitive with AMD's 7600. Which is also actually a 7600 and not a 7500...

This generation gives small gains except for the top-end parts from both AMD and NVidia. Get used to it, because this is going to largely be the story going forward.
 
Reactions: DeathReborn

TESKATLIPOKA

Platinum Member
May 1, 2020
More of the same nonsense. It's not actually a 4050.

It is actually a 4060, which will be competitive with AMD's 7600. Which is also actually a 7600 and not a 7500...
Yes, it's a 4060, but the price is not good, though this is true for the whole generation.
 

Heartbreaker

Diamond Member
Apr 3, 2006
Yes, it's a 4060, but the price is not good, though this is true for the whole generation.

For the 4060, it's actually the positive standout of this generation, because it gets a 20% performance bump and a price cut.

I'm pretty happy with the 4060, and will likely get one if AIB pricing holds to MSRP.
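A quick back-of-the-envelope check of that claim, assuming launch MSRPs of $329 for the 3060 and $299 for the 4060, and taking the ~20% uplift at face value:

```python
# Gen-on-gen perf-per-dollar for the 4060, under the assumptions above.
perf_uplift = 1.20        # claimed performance bump over the 3060
price_ratio = 299 / 329   # 4060 MSRP vs 3060 MSRP

print(perf_uplift / price_ratio)  # ~1.32x better performance per dollar
```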
 

Timorous

Golden Member
Oct 27, 2008
For the 4060, it's actually the positive standout of this generation, because it gets a 20% performance bump and a price cut.

I'm pretty happy with the 4060, and will likely get one if AIB pricing holds to MSRP.

The 4090 is the only standout card this gen. 46% more performance than the 3090Ti for a 20% lower MSRP.

EDIT: For the 4060 to match that it would need to have 3070/4060Ti performance and cost around $260. For the 4070 to match that it would need to perform like the 4070Ti and cost $400. For the 4080 to match that it would need to perform like it does but cost just $560.

EDIT2: If you want to compare to the 3090, then it is 64% more performance for a 6.7% price increase, so in that case the 4060 would need to perform like a 3070Ti and cost $350, the 4070 would need to have 7900XT performance and cost $530, and the 4080 would need to have AIB 7900XTX performance and cost $750.
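A small sketch of the math behind those figures. The MSRPs are the launch prices ($1,999 for the 3090 Ti, $1,499 for the 3090, $1,599 for the 4090, $329 for the 3060); the ~1.45x figure for 3070-vs-3060 performance is my own rough assumption from typical review data:

```python
def value_gain(perf_ratio, new_price, old_price):
    """Perf-per-dollar improvement of a new card over its predecessor."""
    return perf_ratio / (new_price / old_price)

# 4090 vs 3090 Ti: ~46% faster for a 20% lower MSRP.
print(value_gain(1.46, 1599, 1999))  # ~1.83x better perf/$

# 4090 vs 3090: ~64% faster for ~6.7% more money.
print(value_gain(1.64, 1599, 1499))  # ~1.54x better perf/$

def price_to_match(old_price, perf_ratio, target_gain):
    """MSRP a new card would need, at a given perf uplift, to match target_gain."""
    return old_price * perf_ratio / target_gain

# A 4060 with ~3070-level performance (~1.45x a $329 3060) would need:
print(price_to_match(329, 1.45, 1.825))  # ~$261, i.e. "around $260"
```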
 
Last edited:
Reactions: psolord

MrTeal

Diamond Member
Dec 7, 2003
The 4090 is the only standout card this gen. 46% more performance than the 3090Ti for a 20% lower MSRP.

EDIT: For the 4060 to match that it would need to have 3070/4060Ti performance and cost around $260. For the 4070 to match that it would need to perform like the 4070Ti and cost $400. For the 4080 to match that it would need to perform like it does but cost just $560.
Really 55% more performance than the 3090 for 7% more money. Still great, but compare it to the actual flagship of the Ampere generation and not the last desperate attempt to squeeze maximum margins from miners before the bubble burst.
 

Timorous

Golden Member
Oct 27, 2008
Really 55% more performance than the 3090 for 7% more money. Still great, but compare it to the actual flagship of the Ampere generation and not the last desperate attempt to squeeze maximum margins from miners before the bubble burst.


118 / 72 = 1.639, or 64% more performance, for 6.7% more money ($1,500 vs $1,600). This is from the XTX Taichi White review, which is the latest TPU review, so it will be the most up to date.

EDIT: If the 55% figure is coming from launch reviews, then lots of places were a bit CPU-limited at 4K with the 4090, which does show how much of a beast it is.
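For completeness, the arithmetic behind those two figures:

```python
# TPU relative performance: 4090 at 118 vs 3090 at 72 (as cited above).
print(118 / 72)              # 1.6389 -> ~64% more performance
print((1600 - 1500) / 1500)  # 0.0667 -> ~6.7% more money
```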
 

Heartbreaker

Diamond Member
Apr 3, 2006
Really 55% more performance than the 3090 for 7% more money. Still great, but compare it to the actual flagship of the Ampere generation and not the last desperate attempt to squeeze maximum margins from miners before the bubble burst.

Though for me this is like arguing about how much better the new Bugatti Veyron is compared to the last one. And I did acknowledge the top end in the post just one or two up from that:

"This generation gives small gains except for top end parts from both AMD and NVidia"

For a couple of generations now, the top end gets bigger gains, but the lower in the lineup you go, the smaller the gains are, until you reach a point where there are almost no gains at all.

So for someone interested in more affordable cards, the 4060 is the standout of this generation.
 