There is absolutely nothing wrong with his Titan X. NV and AMD have for decades offered a setting in the control panel giving gamers a choice between Performance and Quality modes; in fact, this goes back to the 1990s, and even ATI had it. Any PC gamer worth his salt who buys $500+ flagship cards only games with the CCC/NV Control Panel set to Quality (unless you are just benchmarking for e-peen). That's why sites like Computerbase, PCGamesHardware, etc., which analyzed this for years, confirmed that NV's Performance mode reduces AF/texture filtering IQ in some games. Any time I buy an NV GPU, I always use Quality, as I don't care about ~10% more performance if it comes at the cost of reduced texture/filtering IQ.
HardOCP's response, that they leave everything at default in NV's Control Panel (i.e., Performance mode) and refuse to set it to Quality to match AMD's Quality, just continues to show that the site has completely lost touch with objective GPU testing.
This doesn't change the current recommendation for most gamers who overclock (i.e., an after-market 980 Ti > Fury X), but it shouldn't even be a discussion in 2015: all GPU testing should be done with the CCC/NV Control Panel texture filtering set to Quality, no exceptions. If I wanted texture filtering optimizations at the cost of IQ, I'd be gaming exclusively on a PS4, not on a $650+ GPU.
As others have pointed out already: Gregster got his result only with 0x AF, which is not the default setting in the NV drivers. Multiple people have tested with normal/default NV settings and got much better results. The reality is that the standard settings are fine. If you want 0x AF you need to custom-tweak your settings, but Gregster hasn't shown screenshots of what he did in the NV CP. And he's the only guy reporting this.
Exactly.
People posting single-card results as proof one way or the other are not reproducing the original situation.
Gregster stated exactly what he did; to refute him, you have to reproduce his setup exactly and either confirm or reject his results.
Stating "I have a Titan X and don't get his results when I do XYZ" proves nothing, and the amazing part is that some posters don't even realize this.
Don't move the goalposts. This applies to the posters who now suddenly claim it's all about the Titan X only, and only when moving from an AMD GPU while using DDU.
That isn't how this meme was sold, and if you want any evidence of that, read the initial replies in this thread and the other threads: it was all "OMG NV shafts their users again! AMD IQ was always better, 10-bit desktop monitor OMG! I got data from 10 years back yu gaiz!".
Well, what happened is that the whole case fell apart. It's true that the exact experiment hasn't been reproduced, but notice how the claim keeps shrinking as more and more people line up to debunk the whole "NV drivers are artificially crippling IQ for benchmark wins!" meme.
It may be true that there are issues when you move from an AMD card to an NV card, and that they may be isolated to Titan X owners, but we're talking about a very, very small group of people, not an overall rule/phenomenon, which is, again, how this whole meme was presented at the outset and pushed by a lot of pro-AMD folks who have had to walk it back time and again as more and more data has come through.
So while there may be an issue for Titan X owners switching from AMD GPUs and using DDU, there isn't an overall issue at all, and the whole "scandal" has been properly dealt with.
(And to anyone who keeps shifting the goalposts and claims that it was about the Titan X only, and only using DDU, and only coming from an AMD card, I say once again: look at the early replies in the thread and from the OP, both at Anandtech and other sites. It wasn't that specific; it was "NV drivers are messing up stuff for benchmarks". That tinfoil-hat story got destroyed incredibly fast. Make this a teachable moment for yourselves before you spew out generalized conspiracy theories that get wrecked within days.)