AMD & NV image quality comparison


monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
That's what it's supposed to do: let the app decide, not override it. On AMD's default, you control it in-game, so Ultra = Ultra.



At least [H] confirms they bench with everything on default in CC and NVCP. I agree with that, because it's what I would have done as well to ensure it's "fair". Except with this bug, it's very far from fair.


Bug is a generous descriptor. Why would an allegedly world-class driver team make such a mistake, assuming this is legit of course?
 

Osjur

Member
Sep 21, 2013
92
19
81
So the old saying that AMD has better image quality still holds true to this day:

- In the past, ATI had better VGA quality because they used a better RAMDAC.
- They started using a 10-bit internal LUT from the 5xxx series onwards, which gives you better picture quality with calibrated monitors because you get less banding. Nvidia still uses an 8-bit LUT on their consumer cards.
- In 2014 they upped the internal LUT to 12-bit, meaning even less banding with calibrated monitors. I can now calibrate my 10-bit direct-drive HP ZR30w monitor (no internal LUT / scaling board) and get no banding at all.
- And now it seems they even have better texture quality at default settings.

No wonder why all my nvidia cards have looked like shit :hmm:
Did I just say that out loud :awe:
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
So the old saying that AMD has better image quality still holds true to this day:

- In the past, ATI had better VGA quality because they used a better RAMDAC.
- They started using a 10-bit internal LUT from the 5xxx series onwards, which gives you better picture quality with calibrated monitors because you get less banding. Nvidia still uses an 8-bit LUT on their consumer cards.
- In 2014 they upped the internal LUT to 12-bit, meaning even less banding with calibrated monitors. I can now calibrate my 10-bit direct-drive HP ZR30w monitor (no internal LUT / scaling board) and get no banding at all.
- And now it seems they even have better texture quality at default settings.

No wonder why all my nvidia cards have looked like shit :hmm:
Did I just say that out loud :awe:


There is a very obvious reason most reviewers don't do comparative IQ tests: they look the same. Not quite sure about your assertion that NV looks terrible.
 
Feb 19, 2009
10,457
10
76
Bug is a generous descriptor. Why would an allegedly world-class driver team make such a mistake, assuming this is legit of course?

It's a bug until more info suggests otherwise.

I'm known to have an anti-NV stance here (not unwarranted; it's the GameWorks thing that led me to boycott NV), but I will consider it a bug that needs fixing. If it's widespread and has been happening for a long time, then I will consider it cheating, something that is anti-gamer, because you pay for expensive hardware to crank settings up and you get donkey quality if you leave it on driver default.

It would also be anti-competitive, since it's cheating on IQ to get faster performance in benchmarks.
 

Osjur

Member
Sep 21, 2013
92
19
81
There is a very obvious reason most reviewers don't do comparative IQ tests: they look the same. Not quite sure about your assertion that NV looks terrible.

All my NV cards have looked comparatively worse on this monitor because they don't have enough bits in their LUT to properly calibrate it; it is a pure 10-bit display without internal dithering or FRC.

Only the Quadro in my laptop has looked the same when compared to AMD cards.
 

greatnoob

Senior member
Jan 6, 2014
968
395
136
It's a bug until more info suggests otherwise.

I'm known to have an anti-NV stance here (not unwarranted; it's the GameWorks thing that led me to boycott NV), but I will consider it a bug that needs fixing. If it's widespread and has been happening for a long time, then I will consider it cheating, something that is anti-gamer, because you pay for expensive hardware to crank settings up and you get donkey quality if you leave it on driver default.

It would also be anti-competitive, since it's cheating on IQ to get faster performance in benchmarks.

How can hard-set default settings be a bug? Programmers set default values; nothing is left to guesswork for us. Do you really think these settings are randomly chosen?
Clearly, you can see there is a difference between the two quality levels, so that confirms it is a functional setting; now all that's left in the equation are the chosen defaults. Are you telling me mobile-esque crappy trilinear AF is acceptable for hardware you've paid anywhere from $150 to $1,500 for?
 

loccothan

Senior member
Mar 8, 2013
268
2
81
loccothan.blogspot.com
Here is mine for BF4 (campaign): Ultra, no AA, 1792x1344 @ 85Hz (a few more pixels than 1080p). Also, on Radeon this is DX11.1 (in Mantle I cannot use RP). No complaints about picture detail here ;-)

 

amenx

Diamond Member
Dec 17, 2004
4,012
2,283
136
So the old saying that AMD has better image quality still holds true to this day:

- In the past, ATI had better VGA quality because they used a better RAMDAC.
- They started using a 10-bit internal LUT from the 5xxx series onwards, which gives you better picture quality with calibrated monitors because you get less banding. Nvidia still uses an 8-bit LUT on their consumer cards.
- In 2014 they upped the internal LUT to 12-bit, meaning even less banding with calibrated monitors. I can now calibrate my 10-bit direct-drive HP ZR30w monitor (no internal LUT / scaling board) and get no banding at all.
- And now it seems they even have better texture quality at default settings.

No wonder why all my nvidia cards have looked like shit :hmm:
Did I just say that out loud :awe:
Bollocks. There is no content in gaming, movies or general apps that can make use of 10-bit. Only Adobe Premiere Pro/CS5 uses 10-bit OpenGL, which AMD supports, but with Nvidia you need a Quadro card. Nvidia GeForce cards are capable of 10-bit on DirectX full-screen surfaces (since the 200 series), but again, there is nothing that can make practical use of it for either Nvidia or AMD yet. ArgyllCMS calibration software is 10-bit capable, but again there is no point if just about everything outside of it is not 10-bit (gaming, programs, general use). So banding can only be an issue if you have Adobe Premiere/CS5 to show it on 10-bit monitors.
 

amenx

Diamond Member
Dec 17, 2004
4,012
2,283
136
Re the AF and texture filtering: AF has minimal impact on performance, maybe 1-2% if that. For texture filtering, going from Quality (default) to High Quality does have an impact on Nvidia cards, and that I believe is around 10% or even more. My comment is only based on personal testing in 3DMark and observing the resulting scores with those settings enabled. Even that was a few years ago; I'm not sure of the current impact, but I will test it later on.
 

Osjur

Member
Sep 21, 2013
92
19
81
Bollocks. There is no content in gaming, movies or general apps that can make use of 10-bit. Only Adobe Premiere Pro/CS5 uses 10-bit OpenGL, which AMD supports, but with Nvidia you need a Quadro card. Nvidia GeForce cards are capable of 10-bit on DirectX full-screen surfaces (since the 200 series), but again, there is nothing that can make practical use of it for either Nvidia or AMD yet. ArgyllCMS calibration software is 10-bit capable, but again there is no point if just about everything outside of it is not 10-bit (gaming, programs, general use). So banding can only be an issue if you have Adobe Premiere/CS5 to show it on 10-bit monitors.

Yes, we all know Windows can't do more than 8-bit system-wide, so it's all fine and dandy, but how do you explain this?
Here's a small sample of the start of my monitor color profile (it's actually a text file!):

0.00000000 0.00000000 0.00000000 0.00000000
0.00392160 0.00115970 0.00012207 0.00000000
0.00784310 0.00402840 0.00308230 0.00296030
0.01176500 0.00695810 0.00608830 0.00610360
0.01568600 0.00993360 0.00915540 0.00929270
0.01960800 0.01294000 0.01228400 0.01252800
0.02352900 0.01600700 0.01545700 0.01583900
0.02745100 0.01912000 0.01867700 0.01919600
0.03137300 0.02226300 0.02197300 0.02262900
0.03529400 0.02548300 0.02531500 0.02612300

This is called a gamma ramp / curve.

Range: 0 --> 1; for the sake of simplicity I will assume in my examples that the range is 0 --> 255 (8 bits per color channel).

The graphics card loads this matrix into a look-up table (LUT), where it converts the input color signals to the corrected output signals on the fly.
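
To make that concrete, here's a quick Python sketch (my own toy illustration; the function name is made up, this is not actual driver code) of what applying the LUT means: each 8-bit input level just indexes the table, and the corrected entry is what goes out to the display.

[CODE]
# Toy illustration, not driver code: the LUT is just an indexed table.
# Each 8-bit input level looks up its corrected output level on the fly.

ramp_red = [0.00000000, 0.00115970, 0.00402840, 0.00695810]  # excerpt of the red channel above

def apply_lut(level, lut):
    # The input level indexes the table directly; the entry is the corrected signal.
    return lut[level]

print(apply_lut(2, ramp_red))  # input level 2 -> corrected output 0.0040284
[/CODE]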

Now here's where it gets confusing:

AMD's consumer cards support 10 bits per color channel. Nvidia's don't; they only support 8 bits per color channel. Now, how is this relevant when you have an 8-bit per channel monitor, you might ask?

Look up at the matrix. Notice how many digits there are after the decimal point. These numbers contain more precision than even 16 bits per color channel can represent. So 8 or 10 bits are kept, and the rest are dumped (not considered).

Here's how it looks when mapped from 0 --> 255:

0 0 0 0
1 0.295724 0.0311279 0
2 1.02724 0.785987 0.754876
3 1.77432 1.55252 1.55642
4 2.53307 2.33463 2.36964
5 3.2997 3.13242 3.19464
6 4.08179 3.94154 4.03895
7 4.8756 4.76263 4.89498
8 5.67707 5.60311 5.77039
9 6.49816 6.45533 6.66137

Here's what it looks like when only 8 bits are taken (after rounding is performed):

0 0 0 0
1 0 0 0
2 1 1 1
3 2 2 2
4 3 2 2
5 3 3 3
6 4 4 4
7 5 5 5
8 6 6 6
9 6 6 7

Duplicate numbers in sequence? Inability to display certain shades? Banding? Whoops.

Here's what it looks like when 10 bits are taken (after rounding is performed):

0 0 0 0
1 0.25 0.00 0.00
2 1.00 0.75 0.75
3 1.75 1.50 1.50
4 2.50 2.50 2.50
5 3.25 3.25 3.25
6 4.00 4.00 4.00
7 5.00 4.75 5.00
8 5.75 5.50 5.75
9 6.50 6.50 6.75
10 7.50 7.50 7.50

Fewer gaps, smoother curve, less banding. Awesome.
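
If you want to check the arithmetic yourself, here's a small Python sketch of the same rounding exercise (my own illustration; the exact rounding a driver uses may differ slightly from my tables above):

[CODE]
# Snap each ramp entry to the nearest 8-bit or 10-bit LUT step, then show
# the result on the monitor's 0-255 scale. Values are the red channel of
# the profile excerpt above.

ramp = [0.00000000, 0.00115970, 0.00402840, 0.00695810, 0.00993360,
        0.01294000, 0.01600700, 0.01912000, 0.02226300, 0.02548300]

def snap(value, bits):
    steps = (1 << bits) - 1              # 255 steps for 8-bit, 1023 for 10-bit
    return round(value * steps) / steps  # nearest representable LUT entry

for bits in (8, 10):
    out = [snap(v, bits) * 255 for v in ramp]
    print(f"{bits}-bit:", " ".join(f"{v:.2f}" for v in out))

# 8-bit collapses neighbouring entries onto the same level (0, 0, 1, 2, 3, 3, ...):
# missing shades, visible banding. 10-bit keeps distinct in-between steps.
[/CODE]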

BUT... the monitor is still 8-bit... so what AMD does is dither from the 10-bit LUT down to the 8-bit output. Banding is gone!
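
The dithering idea is easy to sketch as well. This is a simplified toy model, not AMD's actual algorithm: a fractional level sitting between two 8-bit steps can be approximated by alternating between them across pixels/frames, and your eye averages the pattern into the missing in-between shade.

[CODE]
# Simplified toy model, not AMD's actual algorithm: a 10-bit level such as
# 2.5 sits between the 8-bit levels 2 and 3, so alternate between them and
# let the average carry the extra precision.

import random

def dither_to_8bit(target, samples=100000):
    """Emit 8-bit levels whose average approximates a fractional target."""
    lo = int(target)            # lower 8-bit level, e.g. 2
    frac = target - lo          # probability of emitting the level above
    emitted = [lo + (random.random() < frac) for _ in range(samples)]
    return sum(emitted) / samples

print(dither_to_8bit(2.5))      # ~2.5 on average: the banding step disappears
[/CODE]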

EDIT: why do SPOILER tags not work? Me not understand :|
EDIT2: Sorry for the slight off-topic.
 
Feb 19, 2009
10,457
10
76
Re the AF and texture filtering: AF has minimal impact on performance, maybe 1-2% if that. For texture filtering, going from Quality (default) to High Quality does have an impact on Nvidia cards, and that I believe is around 10% or even more. My comment is only based on personal testing in 3DMark and observing the resulting scores with those settings enabled. Even that was a few years ago; I'm not sure of the current impact, but I will test it later on.

The difference on a Titan X at 1080p in BF4 is ~10%.

https://www.youtube.com/watch?v=a2IIM9fncqc
 

casiofx

Senior member
Mar 24, 2015
369
36
61

Screenshot provided by a GTX 980 user with NVCP at default settings. Looks fine on this.
 

Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
I think it should be looked into, and continuously looked into, rather than just raw FPS only. *shrug*
 

amenx

Diamond Member
Dec 17, 2004
4,012
2,283
136
Yes, we all know Windows can't do more than 8-bit system-wide, so it's all fine and dandy, but how do you explain this?

EDIT: why do SPOILER tags not work? Me not understand :|
EDIT2: Sorry for the slight off-topic.
Not sure; it may be best to ask Florian Hoech, author of DispcalGUI. He is always helpful and can give you a far more informed answer than I can.

Banding was an issue I often complained about with cheaper monitors (less than 8-bit), but it was completely gone when I switched to better-quality 8-bit panels. I also have an 8-bit + FRC display (AH-IPS), which extends the color space further ('virtual 10-bit'), and I have not seen a difference between AMD and Nvidia cards on it.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,726
1,342
136
Benchmarks should be run with both vendors set to the highest quality level in the drivers. What one vendor calls default may not be (and probably isn't) the same as what another vendor calls default. On the other hand, with everything set to the highest quality there shouldn't be any performance optimizations that noticeably reduce IQ (at least not in this day and age).

Of course, looking at IQ under a magnifying glass is something sites should still do now and then to make sure everyone is playing honest.
 

omek

Member
Nov 18, 2007
137
0
0

Screenshot provided by a GTX 980 user with NVCP at default settings. Looks fine on this.

NVIDIA forced AF


That picture only furthers the problem:
1. The photographer is closer in the picture you posted; the AF cutoff is further away, but it's still there.
2. There are multiple missing objects in the picture you posted. Immediately you notice the missing paper decals on the road, but also take a look at the yellow stop lights, the missing lamp poles, and that yield sign in the distance.

That's a good way to free up draw calls.

--
http://forums.anandtech.com/showpost.php?p=37533843&postcount=187
Object LOD is tied to FOV.
 

Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
Benchmarks should be run with both vendors set to the highest quality level in the drivers. What one vendor calls default may not be (and probably isn't) the same as what another vendor calls default. On the other hand, with everything set to the highest quality there shouldn't be any performance optimizations that noticeably reduce IQ (at least not in this day and age).

Of course, looking at IQ under a magnifying glass is something sites should still do now and then to make sure everyone is playing honest.

You don't need a magnifying glass. *coughs*
 

Udgnim

Diamond Member
Apr 16, 2008
3,664
111
106
NVIDIA forced AF


That picture only furthers the problem:
1. The photographer is closer in the picture you posted; the AF cutoff is further away, but it's still there.
2. There are multiple missing objects in the picture you posted. Take a look at the yellow stop lights, the missing lamp poles, and that yield sign.

You can also take a look at the glove textures.

The "Screenshot provided by a GTX 980 user with NVCP at default settings. Looks fine on this." glove version looks blurrier than the "NVIDIA forced AF" glove version.

Could be due to image compression, though.
 

RaulF

Senior member
Jan 18, 2008
844
1
81
This situation is kind of ridiculous.

Brent over at HardOCP is discussing the matter, and his comments leave a lot to be desired regarding their reviews.

http://hardforum.com/showthread.php?t=1867421&page=5
Starts at post #77


Some choice quotes, which I preface with his first quote:




Then the kicker for me.



What is the point of apples-to-apples comparisons if the pictures do not look exactly the same, or even as close as they can get? I want to say that this one case from OCUK could be an outlier, and I want to give Brent the benefit of the doubt and trust that his reviews are trustworthy. But this really needs more investigation.

I do agree: lots of reviews are just graphs and words, with no real comparative pictures and no diving into the impact of different settings.

Well, he is sort of right. Is there an effect on quality? Yes, if you take the time and pay really close attention you will find it. But how come no one has noticed it before? I would say because it is very minute.

I am not trying to defend the practice, FYI.
 

greatnoob

Senior member
Jan 6, 2014
968
395
136
NVIDIA forced AF

That picture only furthers the problem:
1. The photographer is closer in the picture you posted; the AF cutoff is further away, but it's still there.
2. There are multiple missing objects in the picture you posted. Immediately you notice the missing paper decals on the road, but also take a look at the yellow stop lights, the missing lamp poles, and that yield sign in the distance.

That's a good way to free up draw calls.

Nice find! :thumbsup:
 
Feb 19, 2009
10,457
10
76
Well, he is sort of right. Is there an effect on quality? Yes, if you take the time and pay really close attention you will find it. But how come no one has noticed it before? I would say because it is very minute.

I am not trying to defend the practice, FYI.

Because at the upper levels of IQ settings, the returns diminish. In most games, going from High to Ultra tanks performance but provides very little in terms of IQ gains. I've seen this in many recent titles, hence I tweak games to find the best sweet spot on IQ/perf.

All it takes is for one vendor to cheat and make "Ultra" slightly less "Ultra", and most wouldn't notice it, except for the faster performance, because a raw number is more obvious.

This used to happen a long time ago; both ATI & NV did cheating optimizations, but we thought the call-out back then meant it no longer happened.

http://www.geek.com/games/futuremark-confirms-nvidia-is-cheating-in-benchmark-553361/

Project Cars:

 

RaulF

Senior member
Jan 18, 2008
844
1
81
Because at the upper levels of IQ settings, the returns diminish. In most games, going from High to Ultra tanks performance but provides very little in terms of IQ gains. I've seen this in many recent titles, hence I tweak games to find the best sweet spot on IQ/perf.

All it takes is for one vendor to cheat and make "Ultra" slightly less "Ultra", and most wouldn't notice it, except for the faster performance, because a raw number is more obvious.

This used to happen a long time ago; both ATI & NV did cheating optimizations, but we thought the call-out back then meant it no longer happened.

http://www.geek.com/games/futuremark-confirms-nvidia-is-cheating-in-benchmark-553361/
:thumbsup:


In my experience, AMD always did look better; even plain text in Word or on a webpage looked sharper to me compared to Nvidia.

If Nvidia is using this trick to boost their numbers, then they deserve to be called out for it.

Now we need some testing to see whether setting the control panel to best quality affects performance, and by what percentage.

I currently have a 295X2 and have a Ti inbound. I will try to take some screenshots if I remember; might be a while, work has been crazy.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
NVIDIA forced AF


That picture only furthers the problem:
1. The photographer is closer in the picture you posted; the AF cutoff is further away, but it's still there.
2. There are multiple missing objects in the picture you posted. Immediately you notice the missing paper decals on the road, but also take a look at the yellow stop lights, the missing lamp poles, and that yield sign in the distance.

That's a good way to free up draw calls.

For comparison, this is default settings from the same user as your screenshot (which should take care of issue 1):



It's safe to say that there is a clear difference in AF, and also a clear difference in performance (roughly 10%). We just need someone else to actually replicate this behavior.
 

jj109

Senior member
Dec 17, 2013
391
59
91
Yay, time to test my new 980 Ti.

Highest:
http://i.imgur.com/KVWfW4I.jpg

Quality:
http://i.imgur.com/Q2P1KWp.jpg

No differences here.

Also, FOV changes the level of detail in BF4:

default FOV:
http://i.imgur.com/cNeWsz6.jpg

80 FOV:
http://i.imgur.com/lu8NkeQ.jpg

This is how far I had to walk at 80 FOV to see the stop light. I dropped the med pack at the spot where I first saw the stop lights at default FOV:
http://i.imgur.com/MweMltq.jpg

So...

1) Gregster needs to fix his computer

2) People should at least know a little about the game before spewing conspiracies all over forums. This reminds me of the hurr PhysX BS from pCARS all over again.
 
Feb 19, 2009
10,457
10
76
For comparison, this is default settings from the same user as your screenshot (which should take care of issue 1):



It's safe to say that there is a clear difference in AF, and also a clear difference in performance (roughly 10%).

Well, that's obvious.

@jj109
Default "Let App Decide" vs High Quality. The bug is on default, not forced settings. Forcing settings works (thankfully!) lol
 