Question Why does the overall gaming GPU market treat AMD like they have AIDS?


VirtualLarry

No Lifer
Aug 25, 2001
56,450
10,119
126
I guess I get the (subliminal) "The way it's meant to be played" ads from NVidia, along with the recurring FUD tropes about "AMD drivers", but I honestly don't get the sales disparity, especially for the price.

I've owned both NVidia-powered and AMD-powered GPUs, and IMHO, AMD is (generally) just as good. Maybe 99% as good.

Edit: And I think that there's something to be said about the viability of AMD technologies, when they're in both major console brands.
 

Leeea

Diamond Member
Apr 3, 2020
3,704
5,434
136
That still doesn't make it a driver issue. EVGA analyzed their cards and reported it.

Let me know what you find out about others if it's that important to you.
We are going to have to agree to disagree.

The previous article I linked pointed out it was far more widespread than that, and not just limited to EVGA cards.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,259
136
We are going to have to agree to disagree.

The previous article I linked pointed out it was far more widespread than that, and not just limited to EVGA cards.

The bulk of the cards were EVGA 3090s, and that was a found and diagnosed HW issue.

Yes, it was more widespread among other cards, but those were smaller numbers, including AMD cards:


The graphics cards he listed include the AMD Radeon RX 590, Radeon RX 6800, Radeon RX 6800 XT, and Radeon RX 6900 XT, and NVIDIA's GeForce RTX 3080 Ti and GeForce RTX 3090.


No doubt you still blame this on NVidia drivers.

 
Reactions: ozzy702 and Leeea

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,882
3,230
126
Yes.

But the issue was a bit more widespread than just EVGA RTX 3090s.

Actually it's more than solder faults.

As much as I hate JayZ, he also found EVGA cards pulled WAYYYY more power than what was allowed.

The game actually broke the BIOS on the card and forced it to pull way more power than it was set for.
I also noticed this issue when I tried to run the game at a lower resolution during wars, so I wouldn't lag so much.
The card drew so many amps that it tripped the transient-spike protection on my 1200W Seasonic, which I replaced with an EVGA 1600W.
 

Leeea

Diamond Member
Apr 3, 2020
3,704
5,434
136
The bulk of the cards were EVGA 3090s, and that was a found and diagnosed HW issue.

Yes, it was more widespread among other cards, but those were smaller numbers, including AMD cards.

No doubt you still blame this on NVidia drivers.

Yep.

At the time I thought Jay was shilling for EVGA and still do.
 
Reactions: scineram

Leeea

Diamond Member
Apr 3, 2020
3,704
5,434
136
He was shilling for EVGA, by pointing out most of the dead cards were EVGA, caused by bad EVGA HW?

Logic FTW!
Pretty much.

Jay and EVGA were always pretty close.


EVGA had lots of issues as you say. But to be fair, most of the cards in Jay's audience were EVGA, so it made sense his audience would be having lots of issues with EVGA cards.


The way I figured it, Jay went and did damage control for EVGA by claiming everyone's cards were having issues, trying to smear the mess around and spin the subject into a New World problem rather than an EVGA problem.



But then the issue was fixed for everyone by an Nvidia driver update. Curious, that: even the AMD cards were fixed by Nvidia updating their driver.

So yea, calling shenanigans on that one.


It would not be the first time Nvidia paid a bunch of shills to smear AMD on forums and Reddit:
https://gizmodo.com/nvidia-shills-on-hardware-forums-152901
The way that smear-everyone campaign ended the instant Nvidia fixed their driver was a bit too convenient for me.
 
Last edited:

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
Benchmarks with average FPS don't show the issue. The only way to see stuttering in benchmarks is to look at 99th percentile frames, or to use frame time graphs with FCAT.

And technically, it could happen at any time. Before the driver changes, memory could be put into that last 512MB at any time. This is why it took people a while to figure out what was going on. It was really sporadic. For games that use over 3.5GB, it's all the time.

The driver changes that nVidia put out after all of this prioritized the first 3.5GB, and it would only use the last 512MB if it absolutely had to.

But the issue itself was *NOT* a corner case. If it were, it's likely it never would have become as big an issue as it did.
The prioritization of the first 3.5 GB happened all the time, even before the controversy blew up. People noticed the card not using more than 3.5-3.6 GB in games like heavily modded Skyrim. To really cause noticeable stuttering you'd have to use placebo settings like 8x MSAA in games like Far Cry 4 at high resolutions. And like I said before, it really cropped up in DX 12 titles with high VRAM pressure, but then it also had an easy solution: the DX 11 fallback.
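For what it's worth, the 99th-percentile idea in the quoted post is easy to check yourself from a frame-time capture. Here is a minimal sketch, assuming a plain-text log with one frame time in milliseconds per line; the file name and format are placeholders, not any particular tool's output:

```python
import statistics

def summarize(frametimes_ms):
    """Average FPS plus the 99th-percentile frame time and its '1% low' FPS."""
    avg_fps = 1000.0 / statistics.mean(frametimes_ms)
    ordered = sorted(frametimes_ms)
    p99_ms = ordered[min(len(ordered) - 1, int(0.99 * len(ordered)))]
    return avg_fps, p99_ms, 1000.0 / p99_ms

if __name__ == "__main__":
    # frametimes.txt is a hypothetical capture: one frame time in ms per line.
    with open("frametimes.txt") as f:
        times = [float(line) for line in f if line.strip()]
    avg_fps, p99_ms, low_fps = summarize(times)
    print(f"avg: {avg_fps:.1f} fps | 99th pct frame time: {p99_ms:.1f} ms "
          f"(~{low_fps:.1f} fps)")
```

A big gap between the average figure and the 99th-percentile figure is exactly the stutter an average-FPS bar chart hides.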
 

Thunder 57

Platinum Member
Aug 19, 2007
2,814
4,108
136
I said I was sorry for going off topic, it’s BS to think GSync is dead, and you don’t think a Geek can tell the difference in response between 1ms and 20ms? It doesn’t matter if it’s old inventory, I’ve used a lot of monitors in my day, and there is absolutely nothing wrong with it for 2K Gaming.

Before spewing FUD over hardware, use it, then come back and have an intelligent conversation about the facts.

Old inventory doesn’t always mean it’s irrelevant.

If you were following the thread, someone else mentioned AMD as having an inferior feature set; I was only asking a question about this, I was not the one saying it.

How long have you been using Linux? I’ve been using Linux since 1999, and I’ve used every major distro out there and then some.

I’m not a casual Linux user, I’m an Extreme Slackware Linux Geek.

I don’t know where you get your information from, but you are not talking to a Linux noob here at all, and you are completely wrong about AMD.

Linus’ bitching about Nvidia has nothing to do with anything.

Anyhow, who cares, use what you like, is what I always say.

Linux is Linux, what one can do, they all can do, it’s just a personal preference!

Oh, and did you know that, technically speaking, Linux is the kernel, not the complete OS?

And don’t take me so seriously, I respect my fellow PC Geeks, but I just had to rough you up a bit is all, in good banter! Just try not to talk so much FUD next time. LOL 😆

All I know is I just bought a 3080, so I’ll be sitting in Nvidia’s boat for a while, gently rowing down the stream! LOL 😆

P.S. Sorry for the rant and going off topic.

I think you have me confused with some noob here, so you don’t need to give me any wakeup calls; I’ve been at this stuff for over 20 years, probably longer than you are old.

I know all the monitor tests out there, since they’ve been around, and I can tell the difference in response, because higher ms does mean greater input lag.

1ms to 20ms is very noticeable.

Yes, it has a 1ms response...

Anyhow, enough chatter off topic...

Could you be more off-putting? This whole "I know everything and then some" and "You must be a young n00b who doesn't know squat" is really... sad. You almost sound like a narcissist.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,814
4,108
136
The prioritization of the first 3.5 GB happened all the time, even before the controversy blew up. People noticed the card not using more than 3.5-3.6 GB in games like heavily modded Skyrim. To really cause noticeable stuttering you'd have to use placebo settings like 8x MSAA in games like Far Cry 4 at high resolutions. And like I said before, it really cropped up in DX 12 titles with high VRAM pressure, but then it also had an easy solution: the DX 11 fallback.

So NVIDIA releases the 1060 6GB and a cut-down 1060 3GB with fewer cores on top of the reduced VRAM. Was that OK? Misleading people into buying the 3GB version thinking it would perform as well as the 6GB, just with less memory?
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,882
3,230
126
He was shilling for EVGA, by pointing out most of the dead cards were EVGA, caused by bad EVGA HW?

Logic FTW!
Funny ending though: it was only the cards, I think limited to 28 of them in total, that came with a red lip on the stock FTW heatsink.
I had to dig through my box of "thou shalt not revisit air GPU heatsinks" to pull it out and confirm mine was not a red-lipped one.
So even after it pulled that insane amount of wattage and tripped my PSU's transient protection, I almost raged like that fat little kid in the YouTube meme, thinking I might have blown out my GPU, but it booted right back up.

However, I leave it to Amazon to explain how they coded the game so badly that it broke the BIOS on my EVGA 3090 FTW3 Ultra, making it ignore all the safeties and pull almost 140% power.

But I still love and will miss EVGA.
They were always my go-to vendor for video cards, as they had the most relaxed water-cooling policy.
Now I need to find a new brand to stick to, and STRIX is pulling me in.
But AMD STRIX vs NVIDIA STRIX is not the same, even though it carries the same STRIX brand.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
So NVIDIA releases the 1060 6GB and a cut-down 1060 3GB with fewer cores on top of the reduced VRAM. Was that OK? Misleading people into buying the 3GB version thinking it would perform as well as the 6GB, just with less memory?
AMD did the same thing with the RX 560. AMD, therefore, also misled people buying that card by not making it clear that there were cut-down versions alongside the full-fat GPU. Companies segment their products; this is nothing new.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,814
4,108
136
AMD did the same thing with the RX 560. AMD, therefore, also misled people buying that card by not making it clear that there were cut-down versions alongside the full-fat GPU. Companies segment their products; this is nothing new.

What did they lie about with the 560?
 

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
Reactions: SMU_Pony and Leeea

Thunder 57

Platinum Member
Aug 19, 2007
2,814
4,108
136

Leeea

Diamond Member
Apr 3, 2020
3,704
5,434
136
What did they lie about with the 560?
They used a cut-down version of the GPU later in the product cycle without announcing a name change. If you think NVIDIA was wrong to have launched two versions of the 1060, AMD is equally wrong to have done the same in this case.


Also:

This one was far more disgraceful in my opinion.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,450
10,119
126
Also:
This one was far more disgraceful in my opinion.
If we're going with China-only cards, what about the GTX 1060 5GB?
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,548
2,546
146
I have used both Geforce and Radeon cards over the years, and will continue to consider either, but I will touch on a matter that has annoyed me about Nvidia recently...

Now I know some of you would say that DLSS is the superior tech, or maybe Gsync is where it is at, as opposed to other adaptive sync tech. But I would actually argue that since they are somewhat proprietary, they are actually inferior in some ways. I would much rather have open standards than proprietary tech. (Another reason to avoid Apple, as mentioned earlier.)

FSR works on both GeForce and Radeon cards, and adaptive sync "FreeSync" monitors work with both Radeon and GeForce cards. I prefer to be able to buy what I want without having to worry about stupid "ecosystems", so I would never buy a G-Sync-only monitor, for instance.

As for the cards themselves, the 3090 is a bit better than the 6900XT for the most part, but tbh, I would have been perfectly happy with a 6800XT at launch, if I could have found one around MSRP. That was the card I wanted to get, if only they were available for $650 or so instead of the $2000 I paid for my 3090, when it was the only thing available in that performance range.
 

DasFox

Diamond Member
Sep 4, 2003
4,668
46
91
I am going to call you a noob, because you just confused response time with input lag. The two specs are not remotely the same. Though they are interrelated.

I wasn’t talking about Input Lag, I was referring to the monitor having a 1ms GtG response time.

Not sure why you thought I was talking about input lag.

 
Last edited:

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I wasn’t talking about Input Lag, I was referring to the monitor having a 1ms GtG response time.

Not sure why you thought I was talking about input lag.


No LCD actually gets a 1ms response time. It's a made-up stat. Even the very fastest monitors using overdrive barely get to 3ms, and in that mode the image is a washed-out mess that looks terrible.

If you like the monitor, that's fine. But don't call out people and throw around numbers that are invented by marketing teams in a tech forum.
 

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
No LCD actually gets a 1ms response time. It's a made-up stat. Even the very fastest monitors using overdrive barely get to 3ms, and in that mode the image is a washed-out mess that looks terrible.

If you like the monitor, that's fine. But don't call out people and throw around numbers that are invented by marketing teams in a tech forum.
Man, I'm waiting for those 1000 FPS GPUs & CPUs to really drive this baby. Talk about being ahead of its time.
 

DasFox

Diamond Member
Sep 4, 2003
4,668
46
91
No, we are referring to GtG.

And it is not 1 ms. Not even close.

Yes, GtG, and?

Have you tested the MSI Optix G273QPF? All tests and specs I’ve seen listed it as 1ms GtG.

Of course I understand how companies list monitors like this, but their Rapid IPS panel technology is supposedly able to deliver this.

If you don’t actually have the monitor and you haven’t tested it, then how are you so certain Rapid IPS can’t do this? 🤔

@Stuka87 I’m not calling anyone out, I was merely replying in regard to this, and I understand how companies have always marketed this, but as I mentioned, Rapid IPS is supposed to be changing this.

I see there are various ways to test this; is there a good method everyone’s in agreement on for testing? 🤔
 
Last edited:

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
Yes, GtG, and?

Have you tested the MSI Optix G273QPF? All tests and specs I’ve seen listed it as 1ms GtG.

Of course I understand how companies list monitors like this, but their Rapid IPS panel technology is supposedly able to deliver this.

If you don’t actually have the monitor and you haven’t tested it, then how are you so certain Rapid IPS can’t do this? 🤔

@Stuka87 I’m not calling anyone out, I was merely replying in regard to this, and I understand how companies have always marketed this, but as I mentioned, Rapid IPS is supposed to be changing this.

I see there are various ways to test this; is there a good method everyone’s in agreement on for testing? 🤔
Same panel, different stand. Features as listed by MSI are identical.
MSI Optix G273QF
  • Rapid IPS – Provides 1ms GTG fast response time, optimizes screen colors and brightness.
  • WQHD High Resolution – Games will look even better, displaying more details.
  • 165Hz Refresh Rate – Respond faster with smoother frames.
  • 1ms GTG Response Time – Eliminate screen tearing and choppy frame rates.
  • NVIDIA G-SYNC Compatible – Prevent screen tearing or stuttering, producing ultra-smooth lag-free gameplay.
  • Night Vision – Smart black tuner to brighten your day by bringing out the fine details in dark areas.
  • Wide Color Gamut – Game colors and details will look more realistic and refined.
  • Game Mode – Choose the best mode to experience the best visual effects.
  • Frameless Design – Enjoy the ultimate gaming experience with super narrow bezels.
  • Anti-Flicker and Less Blue Light – Game even longer and prevent eye strain and fatigue.
  • 178° Wide Viewing Angle – Colors and details will stay sharp at more angles with a 178° wide viewing angle.
MSI Optix G273QPF
  • Rapid IPS – Provides 1ms GTG fast response time, optimizes screen colors and brightness.
  • QHD High Resolution – Games will look even better, displaying more details.
  • 165Hz Refresh Rate – Respond faster with smoother frames.
  • 1ms GTG Response Time – Eliminate screen tearing and choppy frame rates.
  • NVIDIA G-SYNC Compatible – Prevent screen tearing or stuttering, producing ultra-smooth lag-free gameplay.
  • Night Vision – Smart black tuner to brighten your day by bringing out the fine details in dark areas.
  • Wide Color Gamut – Game colors and details will look more realistic and refined.
  • Game Mode – Choose the best mode to experience the best visual effects.
  • Frameless Design – Enjoy the ultimate gaming experience with super narrow bezels.
  • Anti-Flicker and Less Blue Light – Game even longer and prevent eye strain and fatigue.
  • 178° Wide Viewing Angle – Colors and details will stay sharp at more angles with a 178° wide viewing angle.
RTINGS.com MSI Optix G273QF Monitor Review: Response Time @ Max Refresh Rate (chart)
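On the "is there an agreed method" question above: reviewers generally capture the panel's luminance with a photodiode and time how long a transition between two gray levels takes. Below is a minimal sketch of that calculation, assuming you already have a luminance-vs-time trace for a single rising transition; the 10%/90% thresholds, sample rate, and synthetic trace are illustrative assumptions, and methodologies differ between sites, which is one reason measured numbers rarely match the spec-sheet 1ms:

```python
import math

def gtg_response_ms(samples, sample_rate_hz, low_frac=0.10, high_frac=0.90):
    """Time for luminance to move from 10% to 90% of a rising gray-to-gray step."""
    start, end = samples[0], samples[-1]
    lo = start + low_frac * (end - start)
    hi = start + high_frac * (end - start)
    i_lo = next(i for i, s in enumerate(samples) if s >= lo)  # crosses low threshold
    i_hi = next(i for i, s in enumerate(samples) if s >= hi)  # crosses high threshold
    return (i_hi - i_lo) / sample_rate_hz * 1000.0

if __name__ == "__main__":
    # Synthetic trace: an exponential settle with a ~2 ms time constant,
    # sampled at 100 kHz, standing in for a real photodiode capture.
    rate = 100_000
    trace = [1.0 - math.exp(-t / (0.002 * rate)) for t in range(1000)]
    print(f"{gtg_response_ms(trace, rate):.2f} ms 10-90% response")
```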

 