Starting this topic so as not to derail the Forza 7 benchmark thread, where comparisons of image quality in other games seem to have become the main focus.
As many of us (old enough) will recall, Nvidia was caught cheating back in 2003 (IIRC) with its failed FX series, when a driver release silently turned down settings to gain a higher Futuremark score. It was quickly spotted and NV was rightly pilloried for it. I know of no other instance of this happening over the years. Unfortunately, it seems to have given birth to the myth, persisting to this day, that ATI/AMD has 'superior image quality', with many users tirelessly scouring the internet for any perceived difference, whether false, caused by mismatched settings, non-reproducible elsewhere, or the rare driver glitch that is usually fixed quickly.
One setting that could be cited as affecting IQ (depending on the game) is anisotropic filtering (AF). On very old GPU architectures it incurred a significant performance penalty, so it was left to the user to pick the optimal level in the driver control panel. From 2004/2005 onwards (Nvidia's 6xxx series), AF was no longer taxing for GPUs, but the driver control panel still defaults it to 'application controlled'. I have always set it to the maximum (16x) after every driver install regardless, because it did seem to improve IQ and level of detail in most cases.
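For anyone curious what 'application controlled' actually means: the game itself requests a per-texture anisotropy level through the graphics API, and the driver honours that request unless the control panel forces an override. A minimal sketch using OpenGL's EXT_texture_filter_anisotropic extension (the helper name is just for illustration, and it assumes a GL context is current and the extension is supported):

```c
#include <GL/gl.h>
#include <GL/glext.h>

/* Illustrative helper: request the hardware's maximum anisotropy
 * for one texture, the way an application would under
 * 'application controlled' AF. */
void set_max_anisotropy(GLuint texture)
{
    GLfloat max_aniso = 1.0f;

    /* Query the hardware ceiling -- 16.0 on virtually every GPU
     * since the GeForce 6xxx era. */
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);

    glBindTexture(GL_TEXTURE_2D, texture);

    /* Apply it per-texture. With the driver CP on 'application
     * controlled', this value takes effect; forcing 16x in the CP
     * overrides whatever the game asks for instead. */
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, max_aniso);
}
```

So forcing 16x in the CP mainly matters for games that never request a high level themselves, which is why the setting can change IQ in some titles and do nothing in others.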
In my view, image quality is a HUGE matter! I would rather choose a much weaker GPU with better IQ if it were proven that one brand could not match the other in that area. If both manufacturers do their work properly, everything should look equally good in all games. There are still areas open to debate, like different AA modes or settings, where each brand may follow a different philosophy. But I don't believe there is anything definitive or consistent across all games where one brand comes out decidedly superior to the other.
Thoughts?