Question Why does the overall gaming GPU market treat AMD like they have AIDS?


VirtualLarry

No Lifer
Aug 25, 2001
56,448
10,117
126
I guess I get the (subliminal) "The way it's meant to be played" ads from NVidia, along with the recurring FUD tropes about "AMD drivers", but I honestly don't get the sales disparity, especially given the prices.

I've owned both NVidia-powered and AMD-powered GPUs, and IMHO, AMD is (generally) just as good. Maybe 99% as good.

Edit: And I think that there's something to be said about the viability of AMD technologies, when they're in both major console brands.
 

Leeea

Diamond Member
Apr 3, 2020
3,697
5,431
136
The devil is in the details. First of all, please remember that prices and availability are different outside the US; your quoted MSRP prices are useless here. Also, FWIW, I bought the RX 570 in December 2019, a few months before the pandemic, so at least in theory all the bugs *should* have been ironed out by then.

Second, a card that's just dropped into the machine and instantly works with any game or application you throw at it should be the normal experience. Yet, gaming aside, the RX 570 even had trouble displaying 4K content (multiple HDCP issues every day, no matter how many HDMI cables I swapped).
Some thoughts:
I had an RX 580 (purchased new), and if I remember right it had an HDMI 2.0 port, not 2.0a or 2.0b.

So it could do 4K @ 60 Hz, and that was it. 4K @ 60 Hz with HDR? It could be forced, but it would break all the time.
Never had HDCP problems, but never played protected content either.

Thing is, on the card spec sheet it was clear that it was either 4K @ 60 Hz, or 4K @ 30 Hz with HDR. I wonder if you just pushed that 2017-spec HDMI port beyond its limits.
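
Rough numbers to back that up (my own back-of-the-envelope, assuming the standard 4K60 timing of a 594 MHz pixel clock including blanking, and roughly 14.4 Gbps of effective video bandwidth for HDMI 2.0; those constants are assumptions, not anything from the card's spec sheet):

```python
# Why 4K @ 60 Hz with 10-bit HDR doesn't fit through a plain HDMI 2.0 port.
PIXEL_CLOCK_4K60 = 594e6      # 4400 x 2250 total pixels x 60 Hz, incl. blanking
HDMI20_EFFECTIVE_GBPS = 14.4  # 18 Gbps raw minus 8b/10b encoding overhead

for bits_per_component in (8, 10):    # 8-bit SDR vs. 10-bit HDR, full RGB
    gbps = PIXEL_CLOCK_4K60 * bits_per_component * 3 / 1e9
    verdict = "fits" if gbps <= HDMI20_EFFECTIVE_GBPS else "does NOT fit"
    print(f"4K60 RGB {bits_per_component}-bit: {gbps:.1f} Gbps -> {verdict}")
```

So full 4K @ 60 Hz with 10-bit HDR simply does not go down an HDMI 2.0 link without dropping to 30 Hz or subsampling the chroma, which lines up with what the spec sheet said.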

I did use hardware encoding on my RX 580, but it was with Discord to share my screen with my friends. Seemed to work just fine. Very different app, though.

It was also unable to use hardware encoding in software packages like ClownBD. And that was in 2020, a good 3 years after it became available.
Using GPU encoders for movie backups is very rare; most people prefer the far superior CPU encoding. From my minimal research, it seems NVENC specifically is disliked for its poor handling of dark areas.

Yes, AMD got better hardware encoding in the next generation, but even then it trailed Nvidia spectacularly. You don't have to take my word for it. See this detailed comparison:

https://obsproject.com/forum/resour...s-2020-nvenc-vs-amf-vs-quicksync-vs-x264.998/
The problem with that is that the OBS people themselves indicate the AMD encoder is not a priority: the plugin was written years ago by a guy who stated he did not know what he was doing, and they are not going to do anything about it. They do not care.

It seems most serious OBS users are targeting CPU encoding, which kind of explains why they don't care.


If you're going to do YouTube/Twitch streaming on the cheap with an AMD card right now, it looks like ReLive is the best/only option. On the Nvidia side there is GeForce Experience, although it seems Nvidia also offers official OBS plugins.


But it seems from the research I did that the reason most people use OBS is its CPU streaming capability, making this entire topic irrelevant.


-----------------------------------
While your experience with your RX 570 was horrible, let's be blunt: you're using niche software tools in a manner that is niche even for those tools. Nearly everyone using those tools for those use cases is using CPU encoding.
 
Last edited:
Reactions: scineram

fleshconsumed

Diamond Member
Feb 21, 2002
6,485
2,362
136
Sorry, but calling NVENC "not useful" is laughable and completely wrong.
You may not be using that feature - fair enough! - but such a blanket statement is factually wrong and ridiculous to anyone who dabbles in video encoding.
I didn't call it useless. I called it useless for the majority of Nvidia buyers, because the majority of Nvidia buyers are not using it. How many Nvidia buyers actually stream on Twitch or dabble in video editing? 1%? 2%? So yes, I stand by my statement: it's useless for the majority of Nvidia buyers, just like RTX or G-Sync.
 

Leeea

Diamond Member
Apr 3, 2020
3,697
5,431
136
Sorry, but calling NVENC "not useful" is laughable and completely wrong.
You may not be using that feature - fair enough! - but such a blanket statement is factually wrong and ridiculous to anyone who dabbles in video encoding.
The problem is most people dabbling* in video encoding are not going to use NVENC even if they own an Nvidia card.

It is inferior to software encoding, and everyone knows it.


The only real use case for NVENC is people doing real-time encoding who are also poverty-stricken**. That is a pretty small group.


People serious about encoding just buy a multi-core** CPU and enjoy the superior experience that offers.


-------------------------------------------

*The casual user will never notice. They use GeForce Experience or Radeon ReLive, or maybe Discord to share with their friends. With those commonly used software choices there is no apparent difference to the casual user.

**TwitchTV recommends any 6-core CPU for 1080p.
 
Last edited:

SPBHM

Diamond Member
Sep 12, 2012
5,058
410
126
Sometimes it's just a little annoying being an AMD user. Like when I'm trying to stream on Discord, and they support hardware video encoding for Nvidia but not on my AMD card. Or I'm playing my favorite, slightly obscure MMO, and the ambient occlusion option works fine on Nvidia but has been glitched for years on AMD, and they don't bother touching it.
There are advantages to having what 70%+ of other gamers are using, sadly :/

Other than that, I've always preferred the higher perf per $ that ATI/AMD has historically offered.
 

AnitaPeterson

Diamond Member
Apr 24, 2001
5,960
445
126
Sorry, but without going into details about my video usage needs and scenarios, I can only point out that a beefier CPU is not always the easiest, cheapest or fastest path to take (and it's definitely NOT the quietest, either!)

In the end, every user has their own (generic or particular) requirements.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,818
21,573
146
Sorry, but without going into details about my video usage needs and scenarios, I can only point out that a beefier CPU is not always the easiest, cheapest or fastest path to take.

In the end, every user has their own (generic or particular) requirements.
I agree with this. I used NVENC H.265 at highest quality to transcode some Blu-rays, and they look better than the streaming versions. Hence, good enough for me. Even Return of the King only took around 30 minutes, I think? With my aging eyes, it's all good.
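
For anyone wanting to try the same kind of job, here is a minimal sketch using ffmpeg's hevc_nvenc encoder driven from Python. It assumes an ffmpeg build with NVENC support; the filenames and quality target are placeholders, not the exact settings used above.

```python
# Minimal sketch: GPU H.265 transcode via ffmpeg's hevc_nvenc encoder.
# Assumes ffmpeg is on PATH and was built with NVENC support.
import subprocess

def transcode_nvenc_hevc(src: str, dst: str, cq: int = 22) -> None:
    """Re-encode the video track on the GPU and copy the audio untouched."""
    subprocess.run(
        [
            "ffmpeg", "-i", src,
            "-c:v", "hevc_nvenc",          # NVIDIA hardware HEVC encoder
            "-preset", "slow",             # favour quality over encode speed
            "-rc", "vbr", "-cq", str(cq),  # quality-targeted variable bitrate
            "-b:v", "0",
            "-c:a", "copy",                # keep the original audio tracks
            dst,
        ],
        check=True,
    )

# Hypothetical filenames, just to show the call.
transcode_nvenc_hevc("return_of_the_king_remux.mkv", "rotk_hevc.mkv")
```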
 
Feb 4, 2009
34,703
15,951
136
Well said, and EXACTLY why we need Intel to be a 3rd player in the market. The duopoly has done what duopolies do: given us one high-end player that is very expensive and another low-end/midrange player that is "good enough". Being "good enough" is simply not that exciting.
WTF, someone downvoted that comment. Downvoter, do you happen to work for AMD or nVidia?
 
Reactions: GodisanAtheist

DasFox

Diamond Member
Sep 4, 2003
4,668
46
91
So if AMD is going to continue having an inferior feature set compared to Nvidia, how much lower should their cards be selling for? Basically, what's everyone's best guess?

I'll be honest, I have not been following the DLSS scene, but I thought ray tracing was a big deal, and possibly still a very big reason to own Nvidia, especially if the titles you play support it.

I've been locked in the Nvidia world for so long, because of how bad AMD's drivers used to be on Linux, that I've never looked back to even consider buying AMD.

So, let's say the majority here only ever bought EVGA, as I have done, and now that's gone. What are we really going to miss if we also jump off the Nvidia ship and head to AMD?

I don't know how the current features compare between the latest-gen AMD and Nvidia cards, because I've never had a reason to look until now.

Oh I’m gonna cry, bye bye EVGA!



hmm 🤔
 

DasFox

Diamond Member
Sep 4, 2003
4,668
46
91
It will be interesting to see where AMD is headed next. With EVGA out of the scene, things are going to get interesting.

hmm 🤔
 
Last edited:

Thunder 57

Platinum Member
Aug 19, 2007
2,814
4,104
136

Yesterday I was talking to someone who had screen tearing in their game, and I said I just set the game to the desired framerate through Nvidia's control panel. He uses an AMD GPU and his reaction was one of disbelief at the fact that I could just lock the framerate per application, even my browser's FPS.

People said Intel selling dedicated GPUs would be good for the consumer. No, it's not; it has been nothing but issues, with Intel releasing half-assed drivers just like AMD.

With posts like that, their username should be Miss Information! Also, I'm not sure what they're talking about with DirectX 9 and AMD. I've never had trouble with any DX9 games on any AMD card.
 

Leeea

Diamond Member
Apr 3, 2020
3,697
5,431
136
IPS Panel, 165Hz, 1ms
Wrong forum, but that 1 ms figure is a marketing lie.

Reality is going to be closer to 20 ms. They all advertise 1 ms.

The step up from the MSI Optix G273QPF, the $400 MSI Optix MAG272CQR, also advertises 1 ms. It actually has a 15 ms response time:

I could not find a decent review on the G273QPF, most likely for sad reasons.

--------------------------------

If you think about it, 1 ms was never possible. At 165 Hz, the monitor only starts a new refresh every 1000 / 165 ≈ 6 ms, and that is the theoretical best case.

--------------------------------

The reason G-Sync died is that G-Sync monitors tend to be more expensive than their FreeSync counterparts. A G-Sync module is typically a $50 part. Usually companies release the same monitor as both. For example, you could have purchased an MSI G273QF, which is the exact same panel with the exact same performance as your G-Sync monitor, for $30* less:

Which is why G-Sync is dead.

*Yours is an old model; they are clearing out the inventory and have discounted it.

On 4K monitors the G-Sync module has a fan, and people who buy expensive monitors dislike fans in their monitors.
 
Last edited:
Reactions: Ranulf

Leeea

Diamond Member
Apr 3, 2020
3,697
5,431
136
I agree with this. I used NVENC H.265 at highest quality to transcode some Blu-rays, and they look better than the streaming versions. Hence, good enough for me. Even Return of the King only took around 30 minutes, I think? With my aging eyes, it's all good.
Most people doing that sort of thing use Handbrake, which does support AMD encoding and it works just fine.


However, as previously mentioned, most people do not use GPU encoders because they are inferior. Using a CPU encoder yields both better quality and a smaller output file. Typically, for non-time-sensitive encoding, people prioritize file size and quality over speed.
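
To make that trade-off concrete, here is a rough sketch, assuming an ffmpeg build that includes both libx265 and AMD's hevc_amf encoder (filenames and settings are illustrative only): encode the same clip both ways, compare the resulting file sizes, and judge the quality by eye.

```python
# Rough CPU-vs-GPU encoder comparison: same source, two encoders,
# then print the output file sizes.
import os
import subprocess

SRC = "sample_clip.mkv"  # hypothetical short test clip

jobs = {
    "cpu_x265.mkv": ["-c:v", "libx265", "-crf", "20", "-preset", "slow"],  # CPU encode
    "amd_amf.mkv": ["-c:v", "hevc_amf", "-b:v", "8M"],                     # AMD hardware encode
}

for out_name, video_opts in jobs.items():
    subprocess.run(
        ["ffmpeg", "-y", "-i", SRC, *video_opts, "-c:a", "copy", out_name],
        check=True,
    )
    print(f"{out_name}: {os.path.getsize(out_name) / 1e6:.1f} MB")
```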
 
Last edited:

Leeea

Diamond Member
Apr 3, 2020
3,697
5,431
136
I'll be honest, I have not been following the DLSS scene, but I thought ray tracing was a big deal, and possibly still a very big reason to own Nvidia, especially if the titles you play support it.
The sad thing about ray tracing is that it usually takes an expert to tell whether it is even turned on:

AMD GPUs have hardware support for ray tracing, although they call it Microsoft DirectX 12 DXR rather than RTX. RTX is just Nvidia's version of Microsoft DirectX 12 DXR.

AMD GPUs also have motion-vector-based upscaling, but instead of calling it DLSS 2, they call it FSR 2.


If you have an Nvidia GT or GTX GPU, Nvidia prevents those cards from using DLSS.

They didn't have to, either: AMD's FSR works with old AMD GPUs like the RX 500 series, and it also works with Nvidia GT and GTX cards. Nvidia was just implementing market segmentation at Nvidia GPU owners' expense.
 
Last edited:

Leeea

Diamond Member
Apr 3, 2020
3,697
5,431
136
I've been locked in the Nvidia world for so long, because of how bad AMD's drivers used to be on Linux, that I've never looked back to even consider buying AMD.
Are you aware that most people consider AMD's drivers on Linux to be superior to Nvidia's?

I believe Linus Torvalds said it best:
"the single worst company we've ever dealt with":

Link removed.

Links showing profanity are not allowed in the tech forums regardless of being hidden with spoiler tags.

AT Mod Usandthem
 
Last edited by a moderator:

yottabit

Golden Member
Jun 5, 2008
1,375
240
116
Most people doing that sort of thing use Handbrake, which does support AMD encoding and it works just fine.


However, as previously mentioned, most people do not use GPU encoders because they are inferior. Using a CPU encoder yields both better quality and a smaller output file. Typically, for non-time-sensitive encoding, people prioritize file size and quality over speed.

I dunno, I'm another heavy Plex/Handbrake user firmly in the NVENC camp. I've spent many hours (days, realistically) coming up with my Handbrake profiles and A/B testing sample clips on my projector.

In my experience, for picture quality that looks equivalent to my eyes, NVENC output is maybe 15 to 20% larger in file size than a CPU encode. It's also many times faster than a top-end CPU, even with a midrange card, and uses less energy. I'm not so strapped for storage space that I'd consider a CPU encode for this.

This is mostly for Blu-ray source material, though, using H.265. For lower-resolution media like 480p/DVD material I do use CPU encoding, and the differences are more pronounced.

Anyways, I've got two Nvidia Shield TVs in the house as game-streaming clients… I use DLSS 2 in VR… I run CUDA-accelerated apps… Nvidia has me locked in pretty well lol

at least some of my monitors are FreeSync

I do agree the vast majority of users would be just fine, or better off in perf/dollar, with Radeons.
 

CP5670

Diamond Member
Jun 24, 2004
5,527
604
126
The problem is that Nvidia had a complete monopoly at the high end for many years. AMD only became competitive at the high end with the 6800/6900 generation. I've used AMD/ATI cards in the past and loved them, but am too tied to the Nvidia ecosystem of third party tools and feature set at this point to seriously consider an AMD card now. I would use an AMD card if I ever needed a second gaming PC and do recommend them to other people.
 
Reactions: Tlh97 and Leeea

Unreal123

Senior member
Jul 27, 2016
223
71
101
So why am I saying, again, that Turing and Ampere are fine wine?

Look at the upcoming games' requirements.

Silent Hill 2 requires an Nvidia RTX 2080, while on the AMD side it requires an RX 6800 XT. Damn.

Dead Space Remake.

Requires an RTX 2070, while on the AMD side it requires an RX 6700 XT.

Spider-Man: Miles Morales.

RTX 3070 / RX 6900 XT.


CRISIS CORE –FINAL FANTASY VII– REUNION


GTX 1060 and AMD RX 5500 XT

A Plague Tale: Requiem, damn, this is one of the hardest: an RTX 3070 is put up against an RX 6800 XT.


All the new engines in the games being ported favor Nvidia, not because Nvidia has done anything excellent, but because Nvidia's share among Steam users is beyond 76% and AMD's is less than 15%.

There are even more RTX 3080 users on Steam than RDNA 2 users overall.


That is why I said that PC gaming is altogether a different story.


For PC gaming, developers now make a totally new trailer showing what features they are offering. When a developer ports its game or code to PC, it first looks at which users to market to, and it cannot simply ignore Nvidia, because its sales rely on the 75%+ of users, not the 14%. That is why developers now do not care to optimize for AMD, while for Nvidia it is a selling point.
 

zinfamous

No Lifer
Jul 12, 2006
110,806
29,557
146
So why am I saying, again, that Turing and Ampere are fine wine?

Look at the upcoming games' requirements.

Silent Hill 2 requires an Nvidia RTX 2080, while on the AMD side it requires an RX 6800 XT. Damn.

Dead Space Remake.

Requires an RTX 2070, while on the AMD side it requires an RX 6700 XT.

.....

Uh, you shouldn't take these as gospel, especially when a game "requires" two completely disparate cards. This has happened in the past; when you see this nonsense published in game requirements, you know they have no experience with the other card and are just tossing name salad at the wall because they are so brainholed into the nVidia ecosystem.

For PC gaming, developers now make a totally new trailer showing what features they are offering. When a developer ports its game or code to PC, it first looks at which users to market to, and it cannot simply ignore Nvidia, because its sales rely on the 75%+ of users, not the 14%. That is why developers now do not care to optimize for AMD, while for Nvidia it is a selling point.

This is also a weird comment, because PC-only gaming is itself extremely limited. You talk about the market being paramount, 75% of users vs. 15%, but "PC gaming only" is itself an extreme niche. It seems you are aware that developers are making games for AMD first because console dominates, so they already have the bulk of the gaming world under their hats. They are already optimized, for the most part, for AMD.

Yeah, there is a good bit more work to optimize for better hardware going from console to PC, but I think they are mostly done with what is needed for AMD. I also suspect that the reason the marketing geniuses publish bizarre requirements for AMD is that they don't know the comparables for the PS5 and XBXS, which I think are more like a 6550 or something? They probably just assume, "oh, it must be a 6800."
 
Last edited:

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,259
136
Sorry, but without going into details about my video usage needs and scenarios, I can only point out that a beefier CPU is not always the easiest, cheapest or fastest path to take (and it's definitely NOT the quietest, either!)

In the end, every user has their own (generic or particular) requirements.

And in these days of rising power rates in many places, the CPU is not the most power-efficient way to encode either.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,259
136
So what feature does AMD not have?

They said "inferior", not missing features.

From what I have seen:
FSR is inferior to DLSS.
AMD RT performance is inferior to NV's RT.
AMD Compute is inferior to CUDA.
AMF is inferior to NVENC.
AMD Deep Learning performance is inferior.

This means that for people who want to do more than raster gaming, AMD may take a back seat in one or more of the areas they consider important.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Sometimes it's just a little annoying being an AMD user. Like when I'm trying to stream on Discord, and they support hardware video encoding for Nvidia but not on my AMD card. Or I'm playing my favorite, slightly obscure MMO, and the ambient occlusion option works fine on Nvidia but has been glitched for years on AMD, and they don't bother touching it.
There are advantages to having what 70%+ of other gamers are using, sadly :/

Other than that, I've always preferred the higher perf per $ that ATI/AMD has historically offered.

Discord only just recently enabled the ability to use hardware acceleration for streaming. Before that, it was CPU-only for everybody. And even now, pretty much everybody agrees not to use GPU encoding because it results in pretty bad video.
 
Reactions: Tlh97 and Leeea

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
They said "inferior", not missing features.

From what I have seen:
FSR is inferior to DLSS.
AMD RT performance is inferior to NV's RT.
AMD Compute is inferior to CUDA.
AMF is inferior to NVENC.
AMD Deep Learning performance is inferior.

This means that for people who want to do more than raster gaming, AMD may take a back seat in one or more of the areas they consider important.

FSR 2.0 is certainly not inferior to DLSS. In most cases they are very similar, with other cases favoring one or the other. Sometimes FSR is better, sometimes DLSS has the edge. But on the whole, they look and perform similarly enough not to be readily distinguishable.

RT is inferior on the 6000-series cards; it's AMD's first generation. We will see how the 7000-series cards do. But even on nVidia cards, RT is still rarely worth the performance hit.

Your comment on compute doesn't make sense. You say AMD compute is inferior, and then you compare the cards' ability to handle certain types of math on the GPU to a programming framework. CUDA is just that, a framework for devs to use. It has nothing to do with compute performance or what types of compute the cards can handle. And currently, in the HPC space, there has been a lot of pushback against CUDA because of the vendor lock-in.

AMF got updated just recently, and all the reviews I have seen of it say that it's on par with NVENC.

AMD's deep learning performance is a bit behind nVidia's. But I would venture to say the number of gamers who also work with AI is extremely low. Like, tenths of a percent, if it's even that.
 