Originally posted by: josh6079
Correct. Since you have a preference for accurate (less vibrant) colors over more vibrant ones, your image quality criteria are a subjective matter. They are based on your preference, as you said.
Yep, except my opinion in no way automatically makes it good image quality, as you claimed based on another's preference for oversaturated colors. Again, color accuracy is not subjective; colors are either more accurate or less accurate relative to a reference. This is further emphasized when test subjects claim to base their decisions on objective criteria, like color accuracy:
"The video on monitor B seems more saturated, but the color in monitor A looks more accurate."
Apparently that was a vote in favor of ATI (monitor A)... so again, it's either A or B. The subject is claiming they know which is which, but "accurate" can't describe both if they say they prefer ATI and its oversaturated default colors when judged against objective criteria like the Adobe (or any other) RGB color space.
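Just to spell out what I mean by objective criteria: color accuracy can actually be measured against a reference rather than eyeballed. A minimal sketch of the idea using the simple CIE76 delta-E distance; the "measured" values below are made up purely for illustration:

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

reference = (50.0, 10.0, 10.0)   # target patch from a known color space (hypothetical)
monitor_a = (50.0, 14.0, 12.0)   # oversaturated rendering (hypothetical measurement)
monitor_b = (50.5,  9.5, 10.5)   # closer to the reference (hypothetical measurement)

print(delta_e76(monitor_a, reference))  # ~4.47 -> larger error, less accurate
print(delta_e76(monitor_b, reference))  # ~0.87 -> smaller error, more accurate
```

Whichever display lands closer to the reference is the more accurate one, regardless of which looks "punchier" to a given viewer.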
Because the difference is not as noticeable to some people. In this case, the test subjects.
In other cases, enthusiasts. (5th post)
Actually it just shows how flawed this experiment was, where the test subjects weren't even able to identify the one major difference between ATI and NV IQ despite being explicitly instructed to do so. It's even more surprising they didn't pick up on this difference given some of the comments they made about detail, like:
"That one looks sharper. Monitor B looks a little fuzzy, and I think monitor A has better color quality."
As for enthusiasts, well, let's say I have reason to believe certain "enthusiasts" simply won't disclose problems they have, even if they do notice them. As a recent example, I specifically asked someone who claimed they had no issues with their current hardware/drivers/games about no fewer than five known issues in recent games, and they confirmed all of them, either acknowledging they existed or deflecting blame.
Or it's possible they think this kind of shimmering is normal, as they simply don't know any better and have never seen the difference on an Nvidia part. The difference is that the subjects in that test had a side-by-side comparison, so they have no such excuse.
As such, AF is the only remaining hallmark of discrepancy. Considering many review sites will compare AA but not AF, and some enthusiasts miss such differences, it is valid to say that the difference in AF is something that can be missed as well.
I disagree. I'd say AF vs. no AF is as noticeable as, if not more so than, AA vs. no AA, as ugly/blurry textures stick out just as much as jaggies, especially at higher resolutions, where AF becomes more important and the importance of AA diminishes.
It is a scientific finding because the method used to retrieve the data was based on the scientific method.
The fact that the test subjects can or can't distinguish something doesn't mean it's not scientific.
Actually it just shows the test was inherently flawed to begin with and returned equally flawed results, as the test subjects weren't able to identify the single biggest difference in IQ, one that is particularly noticeable in motion (compared to stills). Instead, they focused on details like color, using completely subjective criteria and personal preferences.
Again, different people have different preferences, making moves like that a subjective issue.
Originally posted by: Anand Lal Shimpi & Derek Wilson here
It can be quite frustrating to enable a high anisotropic filtering level to increase the detail of textures only to find them blurred by your AA mode.
If the end result is similar quality, then the lack of TrAA in legacy OpenGL isn't an issue. Sure it's subjective, but if 16xS produces similar or superior IQ to 8xMS + AAA in OpenGL at similar frame rates, why bring it up as a disadvantage or preference?
Also, I find it a bit ironic you'd quote AT when they were actually referring to the problems with ATI's inferior tent and box filters. In the case of Nvidia's mixed-mode AA, however, you're not just blurring the entire scene; you're also getting more color samples, meaning the end result does tend to look better, particularly in older OpenGL games with smaller, less detailed textures.
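To put rough numbers on the "more color samples" point, here's a toy breakdown. I'm assuming 16xS is the usual 4x multisampling combined with 2x2 supersampling; the exact composition may differ between driver versions, so treat this as an illustration rather than vendor documentation:

```python
# Toy sample-count comparison for AA modes (hypothetical breakdown, illustrative only).
def samples(msaa, ssaa_x, ssaa_y):
    coverage = msaa * ssaa_x * ssaa_y   # geometry/coverage samples per final pixel
    color = ssaa_x * ssaa_y             # full color/texture samples per final pixel
    return coverage, color

print(samples(msaa=4, ssaa_x=2, ssaa_y=2))  # 16xS-style mixed mode -> (16, 4)
print(samples(msaa=8, ssaa_x=1, ssaa_y=1))  # pure 8x multisampling  -> (8, 1)
```

The point is simply that a mixed mode shades and textures each final pixel several times where pure multisampling does so once, which is why older OpenGL titles tend to look cleaner with it.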
Perhaps. I can't be sure that setting the LOD bias to "0" in ATT or CCC is equivalent to "Clamping" it in nVidia's control panel. It was probably a stretch to say that it was; I just assumed.
But do you have a link that proves this?
I'm assuming that if one disables Catalyst A.I. and sets the value to "0" (where it already is), it does prevent the game from forcing a negative (or positive) LOD bias. Disabling Catalyst A.I. ultimately forces settings made in the driver to override the application (most of the time).
Just search for "Radeon texture crawling/shimmering" or "ATI LOD bias clamp" and that should bring up a few threads discussing the lack of an LOD clamp, and how the ATT option just sets the LOD bias to 0 but does not prevent a negative LOD bias if the game itself forces one.
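For anyone confused about why "LOD bias = 0" and an LOD bias clamp aren't the same thing, here's a toy model. It's not how either driver literally composes the values, just the gist of the difference:

```python
# Toy model of LOD bias handling (illustrative only, not actual driver behavior).
def effective_lod_bias(game_bias, driver_bias=0.0, clamp_negative=False):
    if clamp_negative:
        # A clamp refuses to let the application push the bias below zero,
        # which is what stops game-forced oversharpening/shimmering.
        game_bias = max(game_bias, 0.0)
    # Merely setting the driver's own bias to 0 leaves the game's negative bias intact.
    return game_bias + driver_bias

print(effective_lod_bias(-1.0))                       # -1.0: game still undersamples, textures shimmer
print(effective_lod_bias(-1.0, clamp_negative=True))  #  0.0: the negative bias is suppressed
```

So an ATT/CCC bias of 0 and Nvidia's clamp only behave the same when the game itself isn't forcing a negative bias.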
Regardless, I doubt that is the issue between the two AF implementations. I suspect the issue is in the AF algorithm itself. As BFG10K mentioned, the pattern is just not as tight as nVidia's current AF.
Certainly, ATI also underfilters more than Nvidia, which is why the LOD clamp by itself probably wouldn't completely solve the problem; that again points to inferior AF/IQ.
Again, I don't believe the LOD is the problem. It's the AF itself.
Such discussions came up back when the situation was reversed. GeForce 7 had noticeably inferior AF compared to X1K, and, LOD clamps aside, it came down to the AF.
Now nVidia has a tighter pattern - an effectively finer filter.
As covered above, part of the problem is certainly ATI underfiltering, but the lack of an LOD bias clamp also contributes, as removing the clamp on Nvidia parts results in finer textures but more shimmering.
As for Nvidia, let's finish the thought: "effectively finer filter" resulting in better IQ. Now that wasn't so hard, was it?
Turns out I was correct. Both AF implementations are angle-independent.
The sum of my existence in this thread is to illustrate that, with current hardware, the differences in image quality have been trimmed down to the point where user preference becomes the deciding criterion. As such, image quality is, for now, largely subjective.
I guess there is some confusion with regard to what reviewers are calling angle-dependent or angle-independent. My reference was to ATI parts still showing inferior filtering at 45-degree angles, particularly noticeable off-center or on angled textures. It's certainly better than the lower-quality filtering on R5XX, but it's still the same AF algorithm, except only High Quality is used now.
As for this thread, again, I don't disagree that many of the differences do come down to user preference, particularly colors if you're going to use completely subjective criteria to begin with. With AF it's still quite obvious Nvidia has the lead, both in theory and in practice. AA is somewhere in between, with both subjective and technical merits to the different modes used. I just laugh when people claim ATI still holds the IQ crown after all these years, when that clearly hasn't been the case since G80 launched over two years ago.