Question Why does the overall gaming GPU market treat AMD like they have AIDS?

Page 6 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

VirtualLarry

No Lifer
Aug 25, 2001
56,448
10,117
126
I guess I get the (subliminal) "The way it's meant to be played" ads from NVidia, along with the recurring FUD tropes about "AMD drivers", but I honestly don't get the sales disparity, especially for the price.

I've owned both NVidia-powered as well as AMD powered GPUs, and IMHO, AMD is (generally) just as good. Maybe 99% as good.

Edit: And I think that there's something to be said about the viability of AMD technologies, when they're in both major console brands.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,259
136
FSR 2.0 is certainly not inferior to DLSS.

I saw way more obvious artifacts in FSR in the testing I looked at.

Your comment on compute doesn't make sense. You say AMD compute is inferior, and then you compare their ability to handle certain types of math on the GPU to a programming framework.

CUDA support is still much stronger, so the framework matters.

AMF got updated just recently, and all the reviews I have seen of it say that it's on par with NVenc.

Even if that just recently changed, it's been inferior up until just recently, and application support also lagged.

AMD's deep learning performance is a bit behind nVidia's.

It's more than a bit.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,692
136
Most people doing that sort of thing use Handbrake, which does support AMD encoding and it works just fine.


However, as previously mentioned, most people do not use GPU encoders because they are inferior. Using a CPU encoder yields both better quality and a smaller output file. Typically for non-time sensitive encoding, people prioritize file size and quality over speed.

I dabbled a bit in encoding a few old DVDs using Handbrake on my Athlon. It was faster than transferring the discs to my main system. Using AMD/H.265 with the fast profile actually gave pretty good, and certainly watchable, results. I imagine if you tweak things a bit, you'd be hard pressed to notice much difference compared with CPU encoding.

It's a lot faster than pure CPU encoding on it. That little Athlon encoded at ~300FPS @ 576p.

And in these days of rising power rates in many places, CPU is not the most power efficient way to encode either.

That is certainly the case right now. For archiving there is just no substitute for CPU encoding.

Luckily, most of my content is already archived, and the rest can wait while the world returns to some sort of normality.
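For anyone wanting to try the tradeoff discussed above outside Handbrake, here is a rough sketch of equivalent ffmpeg invocations. This is illustrative only: it assumes an ffmpeg build that includes both the libx265 software encoder and AMD's hevc_amf hardware encoder, and the file names and quality settings are made up, not benchmarked recommendations.

```python
# Illustrative ffmpeg command lines for the two approaches discussed above.

# CPU encode (libx265): slow, but better quality per bit -> smaller archive files.
cpu_cmd = [
    "ffmpeg", "-i", "input.vob",
    "-c:v", "libx265", "-preset", "fast", "-crf", "22",
    "-c:a", "copy",
    "cpu_out.mkv",
]

# GPU encode (AMD's AMF HEVC encoder): much faster, but typically needs a
# higher bitrate for comparable quality, so files come out larger.
gpu_cmd = [
    "ffmpeg", "-i", "input.vob",
    "-c:v", "hevc_amf", "-quality", "speed",
    "-c:a", "copy",
    "gpu_out.mkv",
]

print(" ".join(cpu_cmd))
print(" ".join(gpu_cmd))
```

The choice in practice matches the thread: libx265 for archiving where size/quality matters, the hardware encoder when speed or power draw matters.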
 
Reactions: Leeea

Leeea

Diamond Member
Apr 3, 2020
3,697
5,431
136
They said "inferior", not missing features.

From what I have seen:
FSR is inferior to DLSS.
AMD RT performance is inferior to NV's RT.
AMD Compute is inferior to CUDA.
AMF is inferior to NVenc.
AMD Deep Learning performance is inferior.

This means for people that want to do more than Raster gaming, AMD may take a backseat in one or more of these areas they consider important.

I saw way more obvious artifacts in FSR in the testing I looked at.
The FSR 2.0 to DLSS 2.0 comparisons I have seen are a crap shoot.

CUDA support is still much stronger, so the framework matters.
True enough.

How many gamers use CUDA?

Have you or anyone else in this thread ever used CUDA for a non-mining application?


Even if that just recently changed, it's been inferior up until just recently, and application support also lagged.

It's more than a bit.
This is all but irrelevant.

Nvidia's GT and GTX lines do not even have the features you are touting as must buy features.

And there are a lot more of those GT and GTX cards floating around than RTX cards:

Sort the chart again by "Change this month", and you will see the GTX, RTX, and GT series easily outselling anything AMD, with the GTX 1060 leading Nvidia's desktop sales for September.


This strongly argues those features have nothing to do with why people buy nvidia cards.
 
Last edited:

Mopetar

Diamond Member
Jan 31, 2011
8,005
6,451
136
I don't think we need yet another discussion about why Steam surveys aren't that useful for the arguments that people try to use them for, but you'd think that the results showing an uptick in Pascal by that much would be a pretty big hint that drawing any kind of conclusions from said results may be a bit dubious.

The 1060 isn't leading NVidia's sales. They aren't making more of them and aren't earning any money from any used sales of existing cards. I don't think they were particularly popular mining cards, so all you're seeing are error bars related to the way that Steam gathers data, which is why it's not a useful survey for making broader claims about anything beyond people who use Steam and agree to participate in Steam surveys.
 

Leeea

Diamond Member
Apr 3, 2020
3,697
5,431
136
I don't think we need yet another discussion about why Steam surveys aren't that useful for the arguments that people try to use them for, but you'd think that the results showing an uptick in Pascal by that much would be a pretty big hint that drawing any kind of conclusions from said results may be a bit dubious.
My problem is it seems to be the only tool in the bin.

When all you have is a hammer, everything is a steam survey.


-------------------------------------------------

Oh! GTX 1060s had 6 GB of VRAM, making them able to mine until just recently.

I am going to speculate that Steam's hardware survey is correct: these are 1060s from the mines being sold on the used market. I feel bad for the new owners.
 
Last edited:

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,879
3,230
126
I don't think people realize most of the console ports on PC are from developers who wrote the code for an AMD GPU, especially if it's a console port from an Xbox.

But I think it's more that professional developers use Nvidia, as they have more overall support across the entire workstation line than AMD has.
You hear the words Quadro and RTX way more than FirePro and RadeonPro.

Oh, AMD probably has more popularity on Apple... I know they use RadeonPros way more than RTX.
So I guess overall most Windows users just don't use it out of disgust for Apple, as we hate anything associated with Apple. (sarcasm)
 
Reactions: Tlh97 and Leeea

Leeea

Diamond Member
Apr 3, 2020
3,697
5,431
136
you don’t think a Geek can tell a difference in response of 1ms vs 20ms
I do not doubt it is much better than your previous monitor.

But folks using scientific instruments have shown the only monitors getting anywhere near 1 ms response times are OLEDs.


So yes, I think you are unable to tell the difference, perhaps because you likely have never seen the real thing.

And yes, even LCD monitors using overdrive with all the overshoot involved still do not get 1 ms.


I linked that video to you to give you a wake-up call. The advertised 1 ms number is meaningless. You're not getting 1 ms, and it shows up as blur on fast-moving objects.


One of my favorite tests is:
makes it easy to see the g2g timing. If you truly have a 1 ms display, you should be able to set the distance between the two boxes to 0 pixels and not see any overlap.

also check out:
if you truly have a 1 ms display, the top rectangle will look identical to the bottom one even though it is moving.

also check out:

Have fun going down the rabbit hole!
 
Last edited:

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,259
136
How many gamers use CUDA?

Go read my post again. It was simply about the kind of enhanced features NVidia has. It doesn't mean most people use them. A lot of people buy stuff thinking they might want to use it for something, even if they never do. Want to play with 3D rendering? NVidia cards tend to be better supported and better performing. Same if you want to do some deep learning, etc....

You might not care about any feature leads NVidia has, but that doesn't mean no one does.



Nvidia's GT and GTX lines do not even have the features you are touting as must buy features.

And there are a lot more of those GT and GTX cards floating around than RTX cards:

Sort the chart again by "Change this month", and you will see the GTX, RTX, and GT series easily outselling anything AMD, with the GTX 1060 leading Nvidia's desktop sales for September.


This strongly argues those features have nothing to do with why people buy nvidia cards.

First, I never said they were "must buy features". Just that they were features where NVidia was leading.

Steam HW survey isn't showing sales. It shows the total installed base of those surveyed this month, so it has nothing to do with your point.

We are talking about why AMD sales lag. For marketing, features don't even have to be actually better, they can just be believed as better.

NVenc vs AMF is an example. It may now be equal, but it will take a LONG time before AMD gets over the reputation of an inferior encoder that was true for a decade+. So it just becomes part of a list of features where AMD is believed inferior.

This generation it's DLSS 3 "framerate multiplier" that is being heavily marketed. You and I might think it's useless, but many people will just see it as another feature advantage.

The point here is that, from the perspective of many reasonable buyers, NVidia has a consistent feature lead, and that is part of the reason they sell more. But it's not the only reason, and you can't nullify this advantage by pointing out that AMD is catching up on feature X...

The point of this thread is: why does AMD get outsold when it has better perf/$?

If you think features have nothing to do with it, why do you think it happens?
 
Last edited:

VirtualLarry

No Lifer
Aug 25, 2001
56,448
10,117
126
I think you have me confused with some noob here, so you don't need to give me any wakeup calls. I've been at this stuff for over 20 years, probably longer than you.

I know all the Monitor tests out there, since they’ve been around, and I can tell the difference in response, because higher ms does have greater input lag.

1ms to 20ms is very noticeable.

Yes, it has 1 ms response...

Anyhow, enough chatter off topic...
I am going to call you a noob, because you just confused response time with input lag. The two specs are not remotely the same, though they are interrelated.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,879
3,230
126
I am going to call you a noob, because you just confused response time with input lag. The two specs are not remotely the same, though they are interrelated.

Well, they can be correlated in gaming.
Refresh rate can also be called response time.
And input lag can be a direct result of having a poor refresh rate.

So you can't really say they are not remotely the same.
 
Reactions: Leeea

Leeea

Diamond Member
Apr 3, 2020
3,697
5,431
136
If you think features have nothing to do with it, why do you think it happens?
I think it is purely a mind share thing.

You said:
For marketing, features don't even have to be actually better, they can just be believed as better.
I would take that a step farther. The features do not even need to work.

Case in point: RTX3050, which appears to sell 5 cards for every rx6600.

RTX on the 3050 is all but non-functional
DLSS at 1080p is not a great experience

But people believe the marketing, and buy features that they will never use and that do not work as advertised.



My push back in these threads is against the idea AMD should try to gain market share by dropping prices. I believe AMD dropping prices will have minimal effect on its market share.

In short, it is pointless for AMD to try and juice market share. Nvidia has reached snowball status due to previous AMD failures, and now AMD should just execute on products and wait for people to forget AMD's failures. Raja moved on, things will improve now.



case in point:
@AnitaPeterson from this thread. One unhappy AMD customer convinced Nvidia is better. Nothing you, I, or AMD does is going to convince Anita an AMD card is a good idea.

or take this guy's post:
http://www.portvapes.co.uk/?id=Latest-exam-1Z0-876-Dumps&exid=thread...amd-like-they-have-aids.2607705/post-40868593
It will be years before he stops screaming nonsense.

Most people do not research purchases. They listen to the loudest voice in the room and buy accordingly.
 
Last edited:

beginner99

Diamond Member
Jun 2, 2009
5,223
1,598
136
Looks like 5.24 rtx3050s were sold for every rx6600.

Not even close.

Supply matters. I think NV had far greater supply, hence why they now have so many 3000-series cards left over. It also makes sense, as they were pretty much the only ones using that specific Samsung process.

AMD availability here in Europe up to about 5 years ago was just terrible, hit and miss. It got better in recent years, but you usually still have more options with NV, while AMD has way fewer models and 2/3rds are out of stock. I could imagine that this also adds to the "AMD is bad" mindshare.
 

Leeea

Diamond Member
Apr 3, 2020
3,697
5,431
136
Well, they can be correlated in gaming.
Refresh rate can also be called response time.
And input lag can be a direct result of having a poor refresh rate.

So you can't really say they are not remotely the same.
they are not remotely the same


Refresh rate is the timing of frames sent to the monitor.

After a monitor receives the new frame, it takes time to respond to said frame.

Older monitors typically took several ms before they even started updating the pixel to the new value. Newer monitors, specifically gaming monitors, are much better, typically taking 1 ms to start changing the pixel.

The time it takes to finish transitioning the pixel to the new value is response time.

Take some time to look at the chart in the previously linked review at 3min 31seconds:

Notice how when the pixel is 128 grey (RGB values 128, 128, 128), it takes 15.70 ms to go to 102 grey. That is the actual response time for that pixel transition.

A 165 Hz monitor refreshes every ~6 ms. It is going to take 3 refreshes for it to show the above-mentioned update.



Let's say the transition is part of someone's face that is moving. There will always be a 3-frame blur on that person's face that the above-mentioned monitor will be unable to overcome.


If you take a bit more time to examine the chart, you will realize it is even worse. On average, the pixel will only transition 9.1% during the first of the 3 frames it takes to transition. Most of the transition will occur on the 2nd and 3rd frames.
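The arithmetic in this post is easy to verify (the 15.70 ms grey-to-grey figure is the measured value quoted from the linked review; 165 Hz is the example refresh rate):

```python
import math

refresh_hz = 165
frame_ms = 1000 / refresh_hz   # time between refreshes, ~6.06 ms
g2g_ms = 15.70                 # measured 128 -> 102 grey transition time

# Number of whole refresh intervals that elapse before the pixel
# finishes changing to its new value.
refreshes = math.ceil(g2g_ms / frame_ms)

print(f"{frame_ms:.2f} ms per refresh, {refreshes} refreshes to complete")
```

So a transition measured at 15.70 ms spans 3 refresh intervals on a 165 Hz panel, which is the 3-frame blur described above.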
 
Last edited:

Leeea

Diamond Member
Apr 3, 2020
3,697
5,431
136
Supply matters. I think NV had far greater supply, hence why they now have so many 3000-series cards left over. It also makes sense, as they were pretty much the only ones using that specific Samsung process.

AMD availability here in Europe up to about 5 years ago was just terrible, hit and miss. It got better in recent years, but you usually still have more options with NV, while AMD has way fewer models and 2/3rds are out of stock. I could imagine that this also adds to the "AMD is bad" mindshare.
Definitely, people buy what they know.

If all they were able to buy was nvidia, that will be all they know.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
I don't have a lot of personal experience with dGPUs, but I do know that for many, many years Nvidia has been (and to a degree still is) better, just on the driver side. That's a reputation they've developed over decades.

That kind of thing does not go away overnight. Remember, that's the consumer's way of putting trust in a company. You aren't buying the things you buy for features so much as for the fact that they've garnered your trust over the years.

Nvidia cards have allowed forcing of visual features for as long as I've been a hardware enthusiast, back to high school. That sort of thing is very cool and creates a frictionless experience for the consumer. Or, much more recently: when I was dealing with Polaris cards you always needed some sort of restart, while there's a lot less of that with Nvidia.

Unless you spend most of your days reading GPU reviews every generation (and I admit most of us here, including me, do), you won't know that. And there's a difference between skimming and going into detail, asking people in forums such as these to discuss and learn more. To most people, that's a waste of time.

I repeat: in the big perspective of things, which is living life, we're wasting time. That's how most will view it, anyway. As much as I care about computers and technology, I don't care about things like fashion, for example. There are people who do. Or becoming more popular, or climbing the social ladder, whatever.
 
Reactions: Leeea

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
Definitely, people buy what they know.

If all they were able to buy was nvidia, that will be all they know.
How do you change that mindset and get people buying what they don't know? It's really simple marketing and happens all of the time, but it is costly.

AMD has to find a way for more gamers to believe that they are very good and at the least, competitive with NVIDIA.

How do they do that?

1) Take the crown (raster, RT, power) indisputably, for several generations. Halo effect. (Difficult technically)
2) Price it so that even reluctant gamers will buy and get "reprogrammed" over time. (Costly)


We have spoken about Intel needing to stick with it for the long haul. This also applies to AMD. They will never gain equal or leading market share with business as usual. If, as some think, they just need to offer a small discount on perf/$ relative to Nvidia, then they will stagnate.

Margin, margin, all you want, but if they want to master PC gaming they will have to spend. The thing is, they dominate consoles, and if they also had more relevance in PC, they could lead in tech features for the entire market, as Nvidia now does for the PC crowd.
 
Reactions: Tlh97 and KompuKare

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,259
136
I think it is purely a mind share thing.

Mind Share is just another layer of abstraction.

Some of that mind share also comes from features. If you don't think features are involved in mind share, what do you think is involved?

Note that you next discuss features/marketing...


I would take that a step farther. The features do not even need to work.

Case in point: RTX3050, which appears to sell 5 cards for every rx6600.

RTX on the 3050 is all but non-functional
DLSS at 1080p is not a great experience

But people believe the marketing, and buy features that they will never use and that do not work as advertised.

It's not that simple. The 3050 is a popular laptop chip, so that inflates its numbers in the Steam HW survey. Also, the big pricing shift on the RX 6600 only happened recently: originally it was $329 MSRP for the RX 6600 vs $250 MSRP for the RTX 3050, and all prices were messed up by mining. Right now on the Canadian Amazon Best Sellers list, the RX 6600 is outselling the RTX 3050, so pricing can have an effect.

My push back in these threads is against the idea AMD should try to gain market share by dropping prices. I believe AMD dropping prices will have minimal effect on its market share.

I mostly agree. With a big enough price delta for long enough, AMD would take market share, but then NVidia would respond to stop that, and they would be back at the status quo, just with less margin for both of them. This is why I doubt AMD will get into any price war. Right now we see one, but this is clearance at the end of a release cycle, and NVidia is only interested in clearing high-end cards, so they aren't dropping low-end prices at all. Many are still selling above MSRP.

Most people do not research purchases. They listen to the loudest voice in the room and buy accordingly.

Even if they do research, they may see value in those features that you don't.

If buying a new GPU above the low end, I would want deep learning HW, CUDA, DLSS, etc... You arguing that most people don't need that stuff is not going to convince me that I don't want it, or that it's unreasonable to want it. Your lack of interest in a feature does not affect my interest in it.

What you see as pointless or undifferentiated features will not be universally agreed on, even by people that have done research.
 
Reactions: Leeea

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I don't have a lot of personal experience with dGPUs, but I do know that for many, many years Nvidia has been (and to a degree still is) better, just on the driver side. That's a reputation they've developed over decades.

That kind of thing does not go away overnight. Remember, that's the consumer's way of putting trust in a company. You aren't buying the things you buy for features so much as for the fact that they've garnered your trust over the years.

Nvidia cards have allowed forcing of visual features for as long as I've been a hardware enthusiast, back to high school. That sort of thing is very cool and creates a frictionless experience for the consumer. Or, much more recently: when I was dealing with Polaris cards you always needed some sort of restart, while there's a lot less of that with Nvidia.

Unless you spend most of your days reading GPU reviews every generation (and I admit most of us here, including me, do), you won't know that. And there's a difference between skimming and going into detail, asking people in forums such as these to discuss and learn more. To most people, that's a waste of time.

I repeat: in the big perspective of things, which is living life, we're wasting time. That's how most will view it, anyway. As much as I care about computers and technology, I don't care about things like fashion, for example. There are people who do. Or becoming more popular, or climbing the social ladder, whatever.

This post is a prime example of "mind share": where people assume something is true because it once was, even if it no longer is. nVidia has outright bricked GPUs with their drivers, on more than one occasion. But nobody cares, because there was that one time when AMD drivers had worse performance, or would crash under one rare scenario. They ignore the stunts nVidia pulls, like degrading performance of older cards to make newer cards look better, or giving reviewers of new cards a different driver which has optimizations the current publicly available drivers do not.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,448
10,117
126
For those that feel like NVidia is always inherently superior in features:

How do you explain the GTX 970's "3.5GB" memory limitations, and the deceptions about it from NV (at least initially)? I've never seen similar deception from AMD about their cards' memory configs.

Isn't that a good example of the times that NVidia is "truly evil", and of why it's a good thing that we have AMD on the market (and now Intel too)?
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,259
136
This post is a prime example of "mind share": where people assume something is true because it once was, even if it no longer is. nVidia has outright bricked GPUs with their drivers, on more than one occasion. But nobody cares, because there was that one time when AMD drivers had worse performance, or would crash under one rare scenario. They ignore the stunts nVidia pulls, like degrading performance of older cards to make newer cards look better, or giving reviewers of new cards a different driver which has optimizations the current publicly available drivers do not.

While there have been driver issues on both sides, they have been worse on the AMD side for a long time. Most recently, the RX 5700 cards were plagued with black-screen crashes right after launch. That took some time to resolve and was likely a significant drag on sales. I don't remember any driver issue on NVidia's side coming close to that.

I have not heard of any significant RX 6000 driver issues, though, and do believe that AMD drivers are fine today. But one good generation is not going to instantly erase a negative history for everyone, and you really can't blame them, especially the people who had some very negative experiences.
 
Reactions: Leeea and Saylick

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
While there have been driver issues on both sides, they have been worse on the AMD side for a long time. Most recently, the RX 5700 cards were plagued with black-screen crashes right after launch. That took some time to resolve and was likely a significant drag on sales. I don't remember any driver issue on NVidia's side coming close to that.

I have not heard of any significant RX 6000 driver issues, though, and do believe that AMD drivers are fine today. But one good generation is not going to instantly erase a negative history for everyone, and you really can't blame them, especially the people who had some very negative experiences.
Bricking your card is not worse than that? Amazing.
 