Question Why does the overall gaming GPU market treat AMD like a pariah?


VirtualLarry

No Lifer
Aug 25, 2001
56,447
10,117
126
I guess I get the (subliminal) "The way it's meant to be played" ads from NVidia, along with the recurring FUD tropes about "AMD drivers", but I honestly don't get the sales disparity, especially at these prices.

I've owned both NVidia-powered and AMD-powered GPUs, and IMHO, AMD is (generally) just as good. Maybe 99% as good.

Edit: And I think that there's something to be said about the viability of AMD technologies, when they're in both major console brands.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,259
136
Reasons are many
- Halo effect, nvidia often has performance crown at the top end which translates to sales at the bottom
- Exclusive features, G-Sync, DLSS, RTX, NVENC, CUDA

I think those are the biggest ones. NVidia usually gets there first with flashy new features, and for many generations has had the king-of-the-hill halo card; this builds them a lot of mindshare among people who don't want to read every review detail.

I read every detail. My last two cards were one from ATI (9700 Pro) and one from NVidia (8800 GT, still in use).

At equal perf/$, I'd go NVidia for the features, which tend to have the edge. CUDA is still the leading choice for non-gaming GPU compute, NVENC has been better, NVidia RT has been better, and DLSS has been better, though G-Sync and FreeSync are now equal.

It then becomes a question of how much of a perf/$ boost AMD needs to offset the feature deficit.

For me, the answer is that I typically want more perf/$ than AMD offers in return for the loss of features. For example, if I had been in the market during Turing, I would have chosen a 2060 Super or 2070 Super over the 5700 XT.

But if I was choosing today between an RTX 3050 and an RX 6600, I'd choose the RX 6600.

But most of the time, AMD prices more like the 5700 XT than like the RX 6600.
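To make the trade-off above concrete, here's a toy perf/$ calculation. All numbers are made up for illustration; the "feature discount" is just one way of expressing how much a given buyer values the NVidia feature suite (DLSS, NVENC, CUDA, etc.):

```python
# Toy model: how much perf/$ does AMD need to offset a perceived
# feature deficit? All numbers below are illustrative, not benchmarks.

def perf_per_dollar(relative_perf, price_usd):
    return relative_perf / price_usd

# Hypothetical cards at a similar performance tier.
nvidia_value = perf_per_dollar(100, 400)   # baseline card
amd_value = perf_per_dollar(105, 380)      # slightly faster, slightly cheaper

# Suppose this buyer values NVidia's feature suite at a 10% premium;
# discount AMD's effective value accordingly.
FEATURE_DISCOUNT = 0.90
amd_effective = amd_value * FEATURE_DISCOUNT

print(f"NVidia perf/$: {nvidia_value:.4f}")
print(f"AMD raw perf/$: {amd_value:.4f}")
print(f"AMD effective perf/$ after feature discount: {amd_effective:.4f}")
# In this example, AMD's raw perf/$ lead is erased by the feature discount.
```

With these made-up numbers, AMD wins on raw perf/$ but loses once the buyer's feature premium is applied, which is the calculus described above.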
 
Last edited:

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
While I do feel a little locked into Nvidia's ecosystem due to having a G-Sync display (hardware G-Sync that does not support FreeSync), I don't necessarily have a problem switching. The issue is that I pretty much need to be wowed by AMD to make the change. It's not like I haven't used them in the past. According to my Newegg order history, I had an HD 5870 and an HD 6950. Looking at my other past purchases, it looks like I went from 8800 GTX -> HD 5870 -> HD 6950 -> GTX 680, and since the GTX 680, I've been using Nvidia. I think the problem is that AMD has always just been "good enough" and priced "well enough" that I haven't seen the reason to dump my Nvidia-specific hardware.
 

sniffin

Member
Jun 29, 2013
141
22
81
Because the pattern is that Nvidia innovate and AMD react months/years later.

Because people will pay for a product they perceive to be higher quality (see Apple).

Because nobody cares about the ethics of companies making graphics cards. Just like I don't care how ethical the manufacturer of my dishwasher is.

Consumers aren't stupid, they just don't share the same priorities as nerds on technical forums. I don't really blame them. My 5700 XT was a pain in the ass and I regretted it.
 

wanderica

Senior member
Oct 2, 2005
224
52
101
For me it's a bit like the time I tried to build an HTPC with a four-channel CableCARD tuner split three ways over my home network. It worked well enough, but I was still plagued by frequent restarts, driver updates, and the hell of trying to teach Spectrum how to use their own equipment. At the end of the day, I just wanted to turn the TV on and watch it.

It's been a similar road for me with AMD. I've always seen AMD as more of an "old school enthusiast" brand. Back in the day, we used to appreciate doing more with less, custom modding, and pushing our rigs to the bleeding edge just for the shiggles. These days, being a PC enthusiast takes many different forms, and that's true of me today too. The last AMD card I had was a 290X, from that time they were on fire sale. The deal was just too good to pass up for a struggling single man. I felt like I was plagued by little annoying issues that kept cropping up, though, and thanks to Nvidia's handling of DX11, they were the objectively better choice for titles of that era. In the end, I just wanted to turn the game on and have it work the way I wanted.

Today's AMD isn't the same as the AMD of 10 years ago, though, and I really hope they can continue to push their own boundaries consistently and continue to improve general public opinion of their GPU products. I desperately hope that AMD can achieve true parity with Nvidia this round in both Raster and RT, or at least win one by as much as they lose the other. Nvidia has been far from perfect for me over the years as well, and I really want an actual choice again at the high end. 3090 vs 6900XT was as close as it's been in years. Personally, I'm looking forward to looking at both brands this gen, and getting to make a real choice based on two great products.
 
Last edited:

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,259
136
With what I'm seeing currently, I'm going to have to respectfully disagree.

Whenever I see these kinds of "people buying brand X (different from my brand) are stupid" statements, I think there is a lot of unwarranted arrogance behind them. Over the years I have seen it leveled most at Apple buyers.

Having different priorities doesn't make people stupid, even if that priority is not spending a lot of time researching and just going with the brand they bought last time.
 

Mopetar

Diamond Member
Jan 31, 2011
8,004
6,446
136
AMD's GPUs were the most energy-efficient of the last generation, hands down. The 5000 series wasn't bad either, roughly equal to Nvidia. AMD is about to launch its third generation since the last GCN-based furnace; the notion that AMD is power hungry should have died already.

Neither AMD nor NVidia has been terribly inefficient for a long time. During the times when they were perceived as such, it was the same issue in both cases: the company had a reasonably efficient product but pushed it far past that point to squeeze out a few extra percentage points of performance.

Plenty of people found Polaris to be a great card if you applied an undervolt to it; in a lot of cases that even made it perform better than at stock voltage while running cooler. But AMD needed to match the 1060, so they juiced it. Same with Nvidia last generation: AMD had a competitive product, and NVidia didn't want to be behind on the charts. Tweak the settings and the power use dropped considerably, for maybe a 5% loss in performance.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,259
136
As for the current edition of how NVidia markets: DLSS 3 frame interpolation is what NVidia is marketing hard this release.

I consider the marketing of DLSS 3 to be very misleading, but that doesn't make the people falling for it stupid. Not many people have time to dig into the details of the pros and cons, and even big popular sites are quoted praising it. I checked the quotes and they are real, and they talk specifically about DLSS 3 frame generation.

Many people might never even be exposed to a critical look at DLSS 3 frame interpolation, and will just see DLSS 3 as this generation's killer feature (just what NVidia wants).

I'd bet on AMD having a similar feature within a year, and it will probably even work on some older-generation cards, but the marketing damage will already be done while NVidia launches with another exclusive feature...
 

HeXen

Diamond Member
Dec 13, 2009
7,832
38
91
I always use Nvidia because that's the only GPU brand I've ever had since 3Dfx got bought out. I had the Riva TNT2, GeForce 3, etc. I still have quite a few old cards lying around. But I'm familiar with it. The drivers have always been good to me, especially over the past decade or so. Even if the cost is a bit more, it's PC gaming, and high-end PC gaming has always been expensive; if an extra couple hundred bucks is going to break me, then I shouldn't be doing it. So I've just never had an incentive to use anything but Nvidia.

Though I'd love it if Nvidia would release a 3Dfx branded Voodoo card for the nostalgia. I think it would make them a lot of extra sales.
 
Reactions: Leeea

Unreal123

Senior member
Jul 27, 2016
223
71
101
What is the problem?

First is manpower.

Nvidia has nearly 22,000 employees working exclusively on the GPU segment.

AMD has 13,000-odd employees overall, and the majority of them work in the CPU department.

With each new generation, more capital and manpower are required.

There was a time when Nvidia was a much smaller company than AMD, and some people thought that when AMD bought ATI, it would be over for Nvidia.

Heck, AMD was more advanced than Nvidia from the Radeon HD 5xxx series through the R9 290X. That was the golden era of AMD GPUs, and AMD failed to capitalize on it. They left the market to AMD troll fanboys, and AMD's Roy was so toxic on Twitter that anyone who went there to complain would have thought twice. AMD had the best GPUs at the time; however, due to bad hiring in PR and customer support, it all went to dust. Then came Maxwell, with AMD's Roy trashing Nvidia all over Twitter, calling Nvidia a scum company and so on. Heck, they even wanted to protest at the Maxwell launch; that's how bad a reputation their PR had at the time.

Now coming to this generation.

One thing people are noticing is that Nvidia's Turing and Ampere are turning out to be fine wine.

Just look at the games released this year:

God of War: RTX 3080 beats RX 6900 XT

Sniper Elite 5: RTX 3080 beats RX 6800 XT

Saints Row remake: RTX 3080 beats RX 6800 XT

Spider-Man Remastered: RTX 3080 beats RX 6900 XT

Dying Light 2: RTX 3080 beats RX 6800 XT

Uncharted: Legacy of Thieves: RTX 3080 is on par with RX 6800 XT

etc.

What I mean is that RDNA has not had the impact in this console generation that AMD's hardware did with the PS4 or Xbox One.

Even in DX12 and Vulkan, APIs that AMD was the godfather of (AMD was the one that brought low-level APIs to the PC), AMD still lags in performance.

AMD being unable to decide what they want from their GPUs is the reason they are lagging. First AMD was best at mining; now Nvidia GPUs are best at both mining and compute performance.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,805
21,544
146
All these responses about just wanting to be able to use stuff trouble-free. I had to check to make certain I wasn't in the console forum. There also needs to be an internet law, similar to Godwin's, stating that when Apple is brought up in any discussion about tech companies, the person doing it automatically loses the argument, debate, or discussion. Anti-consumer companies suck.

And no, they are not all the same. Open source initiatives matter, even if that is the ONLY thing distinguishing one from another. I may be on the losing side; that's cool with me. I plan to bite down on my mouthpiece and keep fighting anyway. I prefer it to complacency, or worse, compliance.
 

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
What is the problem?

First is manpower.

Nvidia has nearly 22,000 employees working exclusively on the GPU segment.

AMD has 13,000-odd employees overall, and the majority of them work in the CPU department.

With each new generation, more capital and manpower are required.

There was a time when Nvidia was a much smaller company than AMD, and some people thought that when AMD bought ATI, it would be over for Nvidia.
So strange mentioning this as I see the opposite today. Yes, the old AMD was terrible at fusing the 2 companies. Today, I'm seeing ideas cross from both divisions. IF cache is an example of a world leading tech cross-pollination between the CPU & GPU divisions. If they continue and accelerate this process, then having a strong foot in both worlds becomes a distinct advantage. Management is the key here.
 

Unreal123

Senior member
Jul 27, 2016
223
71
101
So strange mentioning this as I see the opposite today. Yes, the old AMD was terrible at fusing the 2 companies. Today, I'm seeing ideas cross from both divisions. IF cache is an example of a world leading tech cross-pollination between the CPU & GPU divisions. If they continue and accelerate this process, then having a strong foot in both worlds becomes a distinct advantage. Management is the key here.
At that time, I mean 2005-2012, PC gaming was not even considered by developers, because the Xbox 360 and PS3 had totally different APIs.

I remember the Assassin's Creed 2 port coming months after the console release, and some third-party developers did not even consider porting their games to PC, for many reasons.

However, now it's a different story. For third-party developers, the PC is a main development platform, because PCs have become more advanced, development has become more advanced, and the learning curve has matured.

Even for PlayStation, the PC is now a second home, and the same goes for Microsoft. So now investment and manpower matter more, because of the high competition and AAA gaming lineups.
 
Reactions: Leeea

amenx

Diamond Member
Dec 17, 2004
4,005
2,275
136
I think AMD's GPU history needs to be looked at from a new vantage point: from RDNA 2 and what follows. Before that, they may not have had stellar products to excite the masses. Now they do. And with RDNA 3, things could be taken to a new level. So it may be pointless to judge the company on its pre-RDNA 2 record when that may no longer apply today.
 
Reactions: Leeea

Unreal123

Senior member
Jul 27, 2016
223
71
101
I think AMD's GPU history needs to be looked at from a new vantage point: from RDNA 2 and what follows. Before that, they may not have had stellar products to excite the masses. Now they do. And with RDNA 3, things could be taken to a new level. So it may be pointless to judge the company on its pre-RDNA 2 record when that may no longer apply today.
AMD had better GPUs than Nvidia from the HD 5870 to the R9 290X, and that is a fact. However, from Maxwell onward, Nvidia has had the better overall GPUs, and now Nvidia's GPUs are better at compute as well.

Ampere >>> RDNA 2.

However, the RTX 3070 and 3070 Ti are the worst value.
 

fleshconsumed

Diamond Member
Feb 21, 2002
6,485
2,361
136
I think those are the biggest ones. NVidia usually gets there first with flashy new features, and for many generations has had the king-of-the-hill halo card; this builds them a lot of mindshare among people who don't want to read every review detail.

I read every detail. My last two cards were one from ATI (9700 Pro) and one from NVidia (8800 GT, still in use).

At equal perf/$, I'd go NVidia for the features, which tend to have the edge. CUDA is still the leading choice for non-gaming GPU compute, NVENC has been better, NVidia RT has been better, and DLSS has been better, though G-Sync and FreeSync are now equal.

It then becomes a question of how much of a perf/$ boost AMD needs to offset the feature deficit.

For me, the answer is that I typically want more perf/$ than AMD offers in return for the loss of features. For example, if I had been in the market during Turing, I would have chosen a 2060 Super or 2070 Super over the 5700 XT.

But if I was choosing today between an RTX 3050 and an RX 6600, I'd choose the RX 6600.

But most of the time, AMD prices more like the 5700 XT than like the RX 6600.
Yes, nvidia does have more features, but very often those features are simply not useful to the majority of people using the product, or are so half-baked in their initial implementation as to be unusable. Take RTX, for example: it took three generations before it was actually playable, and it still takes a $1600 card to run RTX games at max settings; most people don't turn it on because the frame rate hit is too big. NVENC is nice, but it's a niche feature that's only useful to Plex users or streamers, a fraction of the gaming user base. G-Sync fizzled out and got replaced by FreeSync. The first implementation of DLSS was pretty poor, and DLSS 3 frame interpolation is also turning out to be a disaster.

If you look past the snazzy presentation, a lot of nvidia's benefits are either not practical (RTX) or not useful (NVENC). On the other hand, I bet the 6800/6800 XT are going to age a heck of a lot better than the 3070 Ti with its 8GB of RAM. This is where the calculus gets more difficult. Given identical perf/$ right now, I'd go with AMD, because most nVidia features are useless to me and AMD cards are going to age a lot better. This is where a nuanced approach is needed, and why I said most consumers swayed by the nvidia glamor aren't knowledgeable enough to make an informed decision.
 

Leeea

Diamond Member
Apr 3, 2020
3,695
5,428
136
Hardware Unboxed did a great video on current prices:
https://www.youtube.com/watch?v=a5gxrcHUM0k

Part of the video compares AMD retail prices to Nvidia retail prices, and part of it compares AMD used prices to Nvidia retail prices. RDNA1 parts have cratered on the used market.


It all seems rather relevant to this discussion of AMD price point vs Nvidia.

I feel it illustrates the fallacy in the idea that AMD dropping their prices would increase market share. That has effectively already happened: the RX 5700 is roughly equivalent to the RTX 3060, which goes for twice the price.

I think most people, including me, would rather pay 2x for the 3060. Or, staying in the same price bracket, take an RX 6500 XT at 50% of the performance of the RX 5700 XT.
 
Last edited:
Reactions: scineram

SteveGrabowski

Diamond Member
Oct 20, 2014
7,117
5,997
136
Hardware Unboxed did a great video on current prices:
https://www.youtube.com/watch?v=a5gxrcHUM0k

Part of the video compares AMD retail prices to Nvidia retail prices, and part of it compares AMD used prices to Nvidia retail prices.

An interesting part: RDNA1 parts have cratered on the used market; people just do not want them.


It all seems rather relevant to this discussion of AMD price point vs Nvidia.


It also illustrates the fallacy in the idea that AMD dropping their prices would increase market share. That has effectively already happened: the RX 5700 is roughly equivalent to the RTX 3060, which goes for twice the price.

I think most people, including me, would rather pay 2x for the 3060.

I absolutely refuse to buy a card that has 20,000 hours of running balls-to-the-wall mining ETH on it unless the price reflects that. It would have to be like $80 max, and in perfect working order with good temps, considering I don't think I'd even put 4,500 hours of gaming on any card over five years. I know I'd be buying the tail end of that card's life, so I'd only pay bottom-of-the-barrel prices, as great a card as I think the RX 5700 is.
 

Leeea

Diamond Member
Apr 3, 2020
3,695
5,428
136
I absolutely refuse to buy a card that has 20,000 hours of running balls-to-the-wall mining ETH on it unless the price reflects that. It would have to be like $80 max, and in perfect working order with good temps, considering I don't think I'd even put 4,500 hours of gaming on any card over five years. I know I'd be buying the tail end of that card's life, so I'd only pay bottom-of-the-barrel prices, as great a card as I think the RX 5700 is.
That is exactly my point. We know a used RX 5000 series card was burned out in the mines and is just going to be problems.

The same is true with AMD vs Nvidia. As long as people perceive AMD cards as having problems, it does not matter what the price is; they are going to buy Nvidia, even if it is 2x the price. There is no point in AMD discounting prices to gain market share.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
7,117
5,997
136
That is exactly my point. We know a used RX 5000 series card was burned out in the mines and is just going to be problems.

The same is true with AMD vs Nvidia. As long as people perceive AMD cards as having problems, it does not matter what the price is; they are going to buy Nvidia, even if it is 2x the price.

I mean I wouldn't touch a 1080 Ti over $60 either for the same reason. My brother bought a used 1080 Ti a year and a half ago and is a very light gamer: only a few hours per week, and playing easy to run things like Apex, WOW, and Diablo. And the card died on him a couple of months ago. Probably only got six months in the mines before he bought it.
 

Leeea

Diamond Member
Apr 3, 2020
3,695
5,428
136
I mean I wouldn't touch a 1080 Ti over $60 either for the same reason. My brother bought a used 1080 Ti a year and a half ago and is a very light gamer: only a few hours per week, and playing easy to run things like Apex, WOW, and Diablo. And the card died on him a couple of months ago. Probably only got six months in the mines before he bought it.
I had not thought about that.

Right now any used card is just asking for problems and a horrible experience.

In hindsight, it has been that way for years.


I wonder if @AnitaPeterson's cursed rx570 was a used card when purchased?
 
Last edited:

AnitaPeterson

Diamond Member
Apr 24, 2001
5,959
441
126
[...] If you look past the snazzy presentation, a lot of nvidia benefits are either not practical (RTX) or not useful (NVENC).

Sorry, but calling NVENC "not useful" is laughable and completely wrong.
You may not use that feature (fair enough!), but such a blanket statement is factually wrong and ridiculous to anyone who dabbles in video encoding.
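For context on what using NVENC actually looks like in practice, here is a minimal sketch that drives ffmpeg's `h264_nvenc` hardware encoder from Python. This is illustrative, not the only way to use NVENC: the file names are placeholders, and actually running the encode assumes ffmpeg is on PATH, built with NVENC support, with a supported NVidia GPU present.

```python
# Sketch: hardware-accelerated H.264 encode via NVENC through ffmpeg.
# Assumptions: ffmpeg on PATH with NVENC support, NVidia GPU present.
# "input.mp4" / "output.mp4" are placeholder file names.
import shlex
import subprocess

cmd = shlex.split(
    "ffmpeg -y -i input.mp4 -c:v h264_nvenc -preset slow -b:v 6M -c:a copy output.mp4"
)

def encode(dry_run=True):
    """Print the ffmpeg invocation; optionally run it."""
    print(" ".join(cmd))
    if not dry_run:
        subprocess.run(cmd, check=True)  # offloads the encode to the GPU

encode()  # dry run: just shows the command that would be executed
```

The point of offloading to `h264_nvenc` is that the encode runs on the GPU's dedicated encoder block, leaving the CPU (and the shader cores) mostly free, which is exactly why streamers and Plex-style transcoding setups value it.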
 

GodisanAtheist

Diamond Member
Nov 16, 2006
7,058
7,478
136
Sorry, but calling NVENC "not useful" is laughable and completely wrong.
You may not use that feature (fair enough!), but such a blanket statement is factually wrong and ridiculous to anyone who dabbles in video encoding.
Hyperbole much?

- Yep, if anything NVENC is one of the more useful features of NV's current suite for the large portion of the gaming audience (that does not include me) that streams.

I think AMD would do really, really well to slap a 3.0 behind a lot of their features and "relaunch" them alongside the 7xxx series to simply reintroduce them to the masses. You cannot discount NV marketing, they do a great job keeping their "gaming adjacent" features in the spotlight. Do an incremental upgrade on some side feature? Throw a new version number on it and talk it up to the moon!

Been part of a number of reddit discussions where people straight up don't know Freesync = Gsync, FSR 2.0 = DLSS 2.0, RTX is just NV's DX12 Ultimate branding, etc etc etc. People either don't even realize these features are "AMD" features or they assume that they're not an alternative feature but rather the "Medium setting to NV's proprietary Ultra Setting".
 