EVGA GeForce GTX 295+ Review


josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: chizow
Originally posted by: dadach
it has been like this for "ages"...even though nvidia mostly had the fastest card on the market, if sheer speed was not the most important thing but rather IQ was, people would buy ati, and no amount of nvidia (wannabe) focus grp members will ever change that

at least i know im not the only one...after switching from 7950GX2 -> 1950xt, and 8800gts -> 2900/3870, it seemed like i had bought a brand new monitor that showed sharper and more colorful images...no amount of screenshots and reviews will ever replace what i saw with my own eyes during gameplay...and thats what i tell people when they ask me which company has better IQ...trust me, if it was the other way around, i would be on nvidia hardware
LOL, another "believer". Unfortunately, reviewers and those familiar with the technical aspects of both cards do not agree with you, and have not for the last 2 years since G80. And their credibility will always be greater than yours, especially given some of the comments you've made in the past.

Kinda*

*Note: The article is a little dated, but (unless I'm mistaken) the AF hasn't changed since then.

Also, I will agree that when I went from my 7800 GT to a X1900XTX I noticed better colors.

That said, I'm saving for a GTX 260 in part because of the better AF and xS modes (legacy games with fullscreen supersampling ftw)

Reviewers and enthusiasts both are valid sources for considering image quality.

For instance, HardOCP - probably the best GPU benchmarking website imo - has, at various times in this review, cited better AA from ATi.

Yet, while reviewers seem to enjoy the CFAA modes and their ability to produce the best pure edge AA out there, enthusiasts (such as myself) may have a different opinion and prefer the totality of the xS modes.

In summary, which card has the best image quality is largely subjective and arguing about it without defining the boundaries for its context will get us nowhere.
 

Sliceup

Junior Member
Jan 21, 2009
10
0
0
Originally posted by: chizow
Originally posted by: Sliceup
http://www.computerbase.de/art...ia_geforce_gtx_295/20/

ATI still holds the crown at high res AA/AF?
I guess it depends which reviews you trust, especially given the fact this site wasn't able to replicate their performance gains in their 9.1 comparison done a day later. Here are a few other reviews I posted earlier in this thread that all come to the conclusion the 295 is the faster part, even in a majority of high res AA/AF settings:

There are myriad reviews done with the latest drivers from both Nvidia and ATI (181.20 WHQL and 8.12 Hot Fix) comparing the GTX 295 and 4870X2 clearly showing the GTX 295 is the faster part, even in the majority of 8xAA or 2560+AA benches. Not all of the original launch GTX 295 reviews have the latest ATI hot fix, but almost all of the GTX 285 reviews have the latest drivers + GTX 295 and 4870X2.

FiringSquad
TechReport
PCGamesHardware
AnandTech

Certainly a bit surprising how Nvidia seems to have caught up in high bandwidth/VRAM situations from the initial set of previews done with beta drivers. Really comes down to whether or not the additional $50 or so is worth it.

yea it does depend on what you trust.

all the links you posted are with old 8.xx drivers. i clearly said with 9.x drivers? did you not get that?

here is another interesting review but they are only running 4x aa/16x af and at 1920 x 1200.

http://hothardware.com/Article...-GeForce-GTX-295-Plus/

the other site was showing very similar results, but at 2560 x 1600 i believe, with 8x aa/16x af, ati took the win. i guess i will wait for more reviews cause you never know who you can trust.

i am also talking about how, with ati's new 9.x drivers, it seems to be showing improvement.

but like i am not a fanboy of either card company. i think both companies need to engineer their cards with a bit more quality, from the pcb down to the cooler. and for the record i prefer Nvidia just because i have had better experiences with them as far as driver stability and issues. but i have to say colors were a bit more vivid with ati cards. to me. but maybe i am blind. i give credit to each of these two companies where credit is due, and not just say omg i love this or i love that and brag how one is better than the other all the time; instead i point out the faults of both. i dont continually argue that one is better lol when it is not always. just because i own it doesnt mean i bash the other side continually like you. i would say you're a fanboy and i am not.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Elfear
Originally posted by: chizow

Certainly a bit surprising how Nvidia seems to have caught up in high bandwidth/VRAM situations from the initial set of previews done with beta drivers. Really comes down to whether or not the additional $50 or so is worth it.

From a quick glance at Newegg it looks like the price delta is back to ~$100. Makes it a tough sell for 2-4% more performance.
Actually it looks like they're completely out of stock, I guess people were willing to pay that $100 price premium for the fastest card available.
 

thilanliyan

Lifer
Jun 21, 2005
11,944
2,175
126
Originally posted by: chizow
Actually it looks like they're completely out of stock, I guess people were willing to pay that $100 price premium for the fastest card available.

Or they didn't have much stock to begin with?
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: josh6079
Kinda*

*Note: The article is a little dated, but (unless I'm mistaken) the AF hasn't changed since then.

Also, I will agree that when I went from my 7800 GT to a X1900XTX I noticed better colors.

That said, I'm saving for a GTX 260 in part because of the better AF and xS modes (legacy games with fullscreen supersampling ftw)

Reviewers and enthusiasts both are valid sources for considering image quality.

For instance, HardOCP - probably the best GPU benchmarking website imo - has, at various times in this review, cited better AA from ATi.

Yet, while reviewers seem to enjoy the CFAA modes and their ability to produce the best pure edge AA out there, enthusiasts (such as myself) may have a different opinion and prefer the totality of the xS modes.

In summary, which card has the best image quality is largely subjective and arguing about it without defining the boundaries for its context will get us nowhere.
That review is funny actually; I guess all the test subjects missed the greatest difference in IQ between the two solutions: the massive difference in FPS between 8800GTX SLI and 3870 CF. It sounds like many of the testers were mesmerized by ATI's liberal application of color profiles, which is similar to Nvidia's Digital Vibrance for those who prefer oversaturated colors.

But again, the difference in AF clearly favors Nvidia, a difference even more pronounced once you get away from still photos and ATI's texture shimmering/crawling starts appearing. I guess the testers missed that as well....maybe they thought the extra "effects" were added 3D animation or something lol.

HardOCP's results confirm what we already knew: ATI only caught back up in terms of AA this last generation, and only with modes beyond multi-sampling, via their Edge Detect AA. They clearly state 2x, 4x and 8xQ MSAA for Nvidia are about the same as ATI's 2x, 4x, 8x MSAA. They only make one clear distinction, with 12xCFAA being generally better than 16xCSAA. They end the comparison with Nvidia's 16xQ being similar to 24xCFAA; however, Nvidia parts also support mixed-mode AA for even better quality.

Image quality isn't subjective when being compared by people who actually know the difference, know what to look for, set criteria and stick to that criteria.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Sliceup
yea it does depend on what you trust.

all the links you posted are with old 8.xx drivers. i clearly said with 9.x drivers? did you not get that?
Uh, no, they're all 8.12 hot fix or 9.1 beta, which are the same driver revision. Did you not get that?

here is another interesting review but they are only running 4x aa/16x af and at 1920 x 1200.

http://hothardware.com/Article...-GeForce-GTX-295-Plus/

the other site was showing very similar results, but at 2560 x 1600 i believe, with 8x aa/16x af, ati took the win. i guess i will wait for more reviews cause you never know who you can trust.

i am also talking about how, with ati's new 9.x drivers, it seems to be showing improvement.
ROFL, the first line of your linked article's conclusion:

Performance Summary: Summarizing the EVGA GeForce GTX 295 Plus' performance couldn't be any easier--it was the fastest, single graphics card we have ever tested. It outpaced the reference GeForce GTX 295 across the board and edged the Radeon HD 4870 X2 in almost every test, occasionally by a significant margin.

As for the Computerbase.de review, again, interesting how they weren't able to replicate those large gains in their 9.1 round-up, isn't it? So again, you've got 1 review with questionable results against myriad reviews from well-known and respected sites saying the GTX 295 is the faster part. Only a fanboy would argue otherwise.

but like i am not a fanboy of either card company. i think both companies need to engineer their cards with a bit more quality, from the pcb down to the cooler. and for the record i prefer Nvidia just because i have had better experiences with them as far as driver stability and issues. but i have to say colors were a bit more vivid with ati cards. to me. but maybe i am blind. i give credit to each of these two companies where credit is due, and not just say omg i love this or i love that and brag how one is better than the other all the time; instead i point out the faults of both. i dont continually argue that one is better lol when it is not always. just because i own it doesnt mean i bash the other side continually like you. i would say you're a fanboy and i am not.
LMAO, you're not a fanboy, but you're going to ignore the blatant evidence to the contrary that all states the GTX 295 is the faster part and that Nvidia has had better IQ for over 2 years since G80 launched. Considering you're batting .000 in this thread like your buddy PCButcher, I'm sure you're just interested in giving each company credit "where credit is due".
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: thilan29
Originally posted by: chizow
Actually it looks like they're completely out of stock, I guess people were willing to pay that $100 price premium for the fastest card available.

Or they didn't have much stock to begin with?
Enough for anyone who actually wanted one for a month after release. I guess there is still a market for $500 video cards.
 

NoStateofMind

Diamond Member
Oct 14, 2005
9,711
6
76
Originally posted by: chizow
Considering you're batting .000 in this thread like your buddy PCButcher, I'm sure you're just interested in giving each company credit "where credit is due".

Does this mean our date is off?
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: PC Surgeon
Originally posted by: chizow
Considering you're batting .000 in this thread like your buddy PCButcher, I'm sure you're just interested in giving each company credit "where credit is due".

Does this mean our date is off?
Ya I guess so, although I doubt anything could come between you and sliceup.... :laugh:
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: chizow
That review is funny actually; I guess all the test subjects missed the greatest difference in IQ between the two solutions: the massive difference in FPS between 8800GTX SLI and 3870 CF.

They didn't miss it, because the testing wasn't done using 8800 GTX SLI.

Originally posted by: Michael Brown here
The other problem was that the Ultras were too fast for our purposes: We couldn't come close to synchronizing frame rates in our gaming tests on the ATI and Nvidia machines.
So we moved down to Nvidia's 8800 GT. It supports HDCP on both links, the frame buffers on the cards we selected are the same size (512MB) as those on the 3870s, and the ATI and Nvidia cards would run our game benchmark at approximately the same speed...

Originally posted by: chizow
It sounds like many of the testers were mesmerized by ATI's liberal application of color profiles, which is similar to Nvidia's Digital Vibrance for those who prefer oversaturated colors.

The color profiles for each card were at their default values. The only calibration involving color that took place was done to the monitors.

Originally posted by: Michael Brown here
We paired the Blackbirds with identical HP LP3065 30-inch LCD monitors. We set the brightness controls to the same values, and then calibrated the two monitors using a Pantone HueyPro calibration kit.

To change the default color values to match one or the other would be an indication that one set has already been judged and preferred, which is what the article was trying to test.

Originally posted by: chizow
But again, the difference in AF clearly favors Nvidia, a difference even more pronounced once you get away from still photos and ATI's texture shimmering/crawling starts appearing. I guess the testers missed that as well....maybe they thought the extra "effects" were added 3D animation or something lol.

The testers were not involved in judging the image quality as it was a double-blind test.

Originally posted by: chizow
HardOCP's results confirm what we already knew: ATI only caught back up in terms of AA this last generation, and only with modes beyond multi-sampling, via their Edge Detect AA. They clearly state 2x, 4x and 8xQ MSAA for Nvidia are about the same as ATI's 2x, 4x, 8x MSAA. They only make one clear distinction, with 12xCFAA being generally better than 16xCSAA. They end the comparison with Nvidia's 16xQ being similar to 24xCFAA; however, Nvidia parts also support mixed-mode AA for even better quality.

If by mixed-mode AA you mean the xS modes then no, technically nVidia doesn't support them as of this moment (officially). Even so, the performance with xS modes causes a drastic FPS difference on current games; an issue you stressed the test subjects had "missed" at the beginning of your post. Its real benefit is with legacy gaming.

Originally posted by: chizow
Image quality isn't subjective when being compared by people who actually know the difference, know what to look for, set criteria and stick to that criteria.

If people set criteria for how they want their graphics to look, it becomes a subjective issue.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: josh6079
They didn't miss it, because the testing wasn't done using 8800 GTX SLI.

here
The other problem was that the Ultras were too fast for our purposes: We couldn't come close to synchronizing frame rates in our gaming tests on the ATI and Nvidia machines.
So we moved down to Nvidia's 8800 GT. It supports HDCP on both links, the frame buffers on the cards we selected are the same size (512MB) as those on the 3870s, and the ATI and Nvidia cards would run our game benchmark at approximately the same speed...
Guess it wouldn't have hurt to actually read the article.

The color profiles for each card were at their default values. The only calibration involving color that took place was done to the monitors.
Exactly, which just emphasizes default color profiles and people's preference for oversaturated colors vs. accurate colors. A more interesting use of calibration tools would've been comparing those default color profiles to a reference RGB color space. Many prefer Nvidia's version of Digital Vibrance over the more accurate default color profile, but that's a subjective opinion that isn't conclusive in determining color accuracy, which is an objective criterion.
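
For what it's worth, that kind of check is easy to make objective. A rough Python sketch of the idea, assuming the colorimeter can report CIE Lab readings for a test patch shown on each machine (the numbers below are made up for illustration):

import math

def delta_e_76(lab1, lab2):
    # CIE76 color difference: straight Euclidean distance in Lab space.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

reference_patch = (53.2, 80.1, 67.2)   # target Lab value for the patch (illustrative)
card_a_reading = (52.8, 82.5, 66.0)    # hypothetical measurement from machine A
card_b_reading = (54.1, 91.3, 70.8)    # hypothetical measurement from machine B (more saturated)

for name, reading in (("A", card_a_reading), ("B", card_b_reading)):
    print(f"Machine {name}: dE76 = {delta_e_76(reference_patch, reading):.1f}")

# The smaller delta E is objectively closer to the reference,
# regardless of which image a viewer happens to prefer.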

To change the default color values to match one or the other would be an indication that one set has already been judged and preferred, which is what the article was trying to test.
Again, more detail is needed here on their calibration testing. Clearly there is a reference card used in the calibration, otherwise the different cards should not be producing significantly different colors.

The testers were not involved in judging the image quality as it was a double-blind test.
Huh? They directly comment on image quality, that's the point of the comparison to begin with, is it not? And how would double-blind preclude them from making distinctions from the clearly inferior AF and AA (no edge detect CFAA) on the ATI parts, even if they didn't know which vendor it was? The article explicitly states this:

We thought our test subjects might see differences in color rendering, antialiasing, and lighting. We expressly told them not to evaluate frame rate or animation quality.

If by mixed-mode AA you mean the xS modes then no, technically nVidia doesn't support them as of this moment (officially). Even so, the performance with xS modes causes a drastic FPS difference on current games; an issue you stressed the test subjects had "missed" at the beginning of your post. Its real benefit is with legacy gaming.
Yep, of course higher AA modes become less obtainable due to performance issues, which makes it all the more important to focus on the more accessible modes that offer similar visual quality, like 2x, 4x, and 8x MSAA. However, if we're comparing sugar-cube-sized pixels as some kind of relevant test, we should look at all options regardless of performance.

If people set criteria for how they want their graphics to look, it becomes a subjective issue.
So if you wanted your graphics on ATI parts to not look blurry depending on angle, or to not shimmer due to lack of LOD clamp, those are subjective issues as well? Again, there are objective tests that specifically expose the different hardware and software limitations of both vendors, which leaves little subjective doubt as to which is superior. These tests and conclusions are affirmed and verified in screenshots and actual gameplay, so again, if you prefer shimmering/crawling and blurry angle-dependent textures, then I guess it is subjective which is better.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: chizow
Exactly, which just emphasizes default color profiles and people's preference for oversaturated colors vs. accurate colors.

If it's people's preference, it counts as good image quality. Over-saturated or not.

Huh? They directly comment on image quality, that's the point of the comparison to begin with, is it not?
No. The testers did not directly comment on the quality of said features. They (the testers) only stated that they thought the subjects would have seen differences in those areas. The point of the comparison was to have unbiased judges. The people performing the test did not pick which looked better or make any comments as to why. The subjects did.

As far as what they told the subjects directly:

Originally posted by: Michael Brown here
The test administrator told each subject only that we were evaluating image quality...We expressly told them not to evaluate frame rate or animation quality...The test administrator asked each subject to express a preference for the image displayed on monitor A or monitor B or to express no preference for either. Subjects were expressly told that "no preference" was a perfectly valid opinion, but if they did choose A or B, they were asked to explain their rationale for that decision

Originally posted by: chizow
And how would double-blind preclude them from making distinctions from the clearly inferior AF and AA (no edge detect CFAA) on the ATI parts, even if they didn't know which vendor it was?

Because the people taking the test didn't even know they were comparing vendors of GPUs,

Originally posted by: Michael Brown here
the subjects were not informed that we were evaluating videocards or any other hardware.

and were chosen

Originally posted by: Michael Brown here
...because of their in-depth expertise at evaluating image quality in all three of our test criteria.

Originally posted by: chizow
So if you wanted your graphics on ATI parts to not look blurry depending on angle, or to not shimmer due to lack of LOD clamp, those are subjective issues as well?

First, there isn't a lack of LOD clamp with the 4800s.

Second, no, if you wanted those things not to happen you'd be more inclined to buy an nVidia GPU.

Likewise if you'd prefer to have more noticeable tree limbs on foliage, less blur on alpha textures, transparency AA in OpenGL titles (excluding Quake Wars), or higher edge AA samples you'd be more inclined to buy an ATi GPU.

It's those preferences that make image quality a subjective matter.

Originally posted by: chizow
These tests and conclusions are affirmed and verified in screenshots and actual gameplay, so again, if you prefer shimmering/crawling and blurry angle-dependent textures, then I guess it is subjective which is better.

If I'm not mistaken, which I probably am, ATi's cards don't use angle-dependent AF. The last generation of cards to use that was GeForce 7.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: josh6079
If it's people's preference, it counts as good image quality. Over-saturated or not.
It counts as opinion, as some people can't stand oversaturated colors and prefer accurate colors.

No. The testers did not directly comment on the quality of said features. They (the testers) only stated that they thought the subjects would have seen differences in those areas. The point of the comparison was to have unbiased judges. The people performing the test did not pick which looked better or make any comments as to why. The subjects did.
Right, which brings us back to the conclusion that uninformed subjects made subjective conclusions based on personal preference. This is far from objective or even scientific, given said subjects couldn't even distinguish basic differences in actual image quality and AA, given both were inferior on those ATI parts compared to the Nvidia parts tested. AA would've probably been hard to distinguish as the extent of AA probably would've been limited to 4xMSAA in such a simple test, but ATI's texture shimmering/angle-dependent AF has been around since at least R300.

Because the people taking the test didn't even know they were comparing vendors of GPUs,
They were clearly instructed to compare image quality, colors, AA, etc., yet failed to identify one of the most noticeable differences between Nvidia and ATI IQ.

First, there isn't a lack of LOD clamp with the 4800s.
ATI does not have a LOD clamp in their driver. You can force LOD bias to 0 in ATT, but that won't prevent a game from forcing a negative LOD bias, which again, will result in sharper textures but also shimmering due to the lack of an actual 0 LOD clamp.
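
To illustrate the distinction (a simplified Python sketch, not either vendor's actual driver logic): zeroing the driver's own bias does nothing about a bias the application adds on top, while a clamp refuses to let the application push the bias negative.

def mip_level(base_lod, driver_bias, app_bias, clamp_negative=False):
    # base_lod: LOD the hardware derives from the texture footprint (log2 scale).
    effective_app_bias = max(app_bias, 0.0) if clamp_negative else app_bias
    return max(base_lod + driver_bias + effective_app_bias, 0.0)

base = 2.0         # hardware wants mip 2 for this pixel
app_forced = -1.5  # game forces a negative bias for "sharper" (but shimmering) textures

print(mip_level(base, driver_bias=0.0, app_bias=app_forced))                       # 0.5: still sharpened
print(mip_level(base, driver_bias=0.0, app_bias=app_forced, clamp_negative=True))  # 2.0: the clamp wins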

Likewise if you'd prefer to have more noticeable tree limbs on foliage, less blur on alpha textures, transparency AA in OpenGL titles (excluding Quake Wars), or higher edge AA samples you'd be more inclined to buy an ATi GPU.

It's those preferences that make image quality a subjective matter.
Or you could just remove the LOD clamp on Nvidia parts to get equally sharp but more shimmery alpha textures, similar to what's seen on ATI parts. As for TrAA in older OpenGL titles, you can just use some of the mixed-mode AA like 16xS or 32xS and blur the entire scene while maintaining playable frame rates.

Again, there's no doubt there's subjectivity when it comes to AA modes, but when you need to zoom in to the point that single pixels are the size of sugar cubes to see any difference, you'd be better off splitting hairs.

If I'm not mistaken, which I probably am, ATi's cards don't use angle-dependent AF. The last generation of cards to use that was GeForce 7.
Heh, yeah, you'd be mistaken, which makes it more surprising you'd claim this fact is subjective, when it clearly is not.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
ATi's had angle-independent AF since the R5xx; the pattern's just not as tight as it is on nVidia's DX10 GPUs.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: chizow
It counts as opinion, as some people can't stand oversaturated colors and prefer accurate colors.

Correct. Since you have a preference towards accurate colors vs. less vibrant colors, your image quality criteria is a subjective matter. It is based on your preference, as you said.

Originally posted by: chizow
They were clearly instructed to compare image quality, colors, AA, etc., yet failed to identify one of the most noticeable differences between Nvidia and ATI IQ.

Because the difference is not as noticeable to some people. In this case, the test subjects.

In other cases, enthusiasts. (5th post)

Originally posted by: chizow
AA would've probably been hard to distinguish as the extent of AA probably would've been limited to 4xMSAA in such a simple test, but ATI's texture shimmering/angle-dependent AF has been around since at least R300.

Yes, I think we can agree that the level of AA used was probably 4x MSAA, in which case it would be very hard to distinguish differences between vendors.

As such, the AF is the only hallmark for discrepancy. Considering many review sites will compare AA but not AF and some enthusiasts miss such differences, it is valid to say that the difference in AF is something that can be missed as well.

Originally posted by: chizow
This is far from objective or even scientific, given said subjects couldn't even distinguish basic differences in actual image quality and AA, given both were inferior on those ATI parts compared to the Nvidia parts tested.

It is a scientific finding because the method in retrieving the data was based on the scientific method.

The fact that the test subjects can or can't distinguish something doesn't mean it's not scientific.

Originally posted by: chizow
As for TrAA in older OpenGL titles, you can just use some of the mixed-mode AA like 16xS or 32xS and blur the entire scene while maintaining playable frame rates.

Again, different people have different preferences, making moves like that a subjective issue.

Originally posted by: Anand Lal Shimpi & Derek Wilson here
It can be quite frustrating to enable a high anisotropic filtering level to increase the detail of textures only to find them blurred by your AA mode.

Originally posted by: chizow
ATI does not have a LOD clamp in their driver. You can force LOD bias to 0 in ATT, but that won't prevent a game from forcing a negative LOD bias, which again, will result in sharper textures but also shimmering due to the lack of an actual 0 LOD clamp.

Perhaps. I can't be sure if setting the LOD bias to "0" in ATT or CCC is the equivalent to "Clamping" it on nVidia's control panel. It was probably a stretch to say that it was, I just assumed.

But, do you have a link that proves this?

I'm assuming that if one disables Catalyst A.I. and sets the value to "0" (where it's already at) it does prevent the game from forcing a negative (or positive) LOD bias. Disabling Catalyst A.I. ultimately forces settings made in the driver to overwrite the application. (most of the time).

Regardless, I doubt that is the issue between AFs. I suspect the issue is in the algorithm of the AF itself. As BFG10K mentioned, the pattern is just not as tight as nVidia's current AF.

Originally posted by: chizow
Or you could just remove the LOD clamp on Nvidia parts to get equally sharp but more shimmery alpha textures, similar to what's seen on ATI parts.

Again, I don't believe the LOD is the problem. It's the AF itself.

Such discussions were used back when the situations were reversed. GeForce 7 had noticeably inferior AF compared to the X1k, and LOD clamps aside, it came down to the AF itself.

Now nVidia has a tighter pattern - an effectively finer filter.

Originally posted by: chizow
Heh, yeah, you'd be mistaken, which makes it more surprising you'd claim this fact is subjective, when it clearly is not.

Turns out I was correct. Both AFs are angle-independent.

The sum of my existence in this thread is to illustrate that with the current hardware, the differences in image quality are trimmed down to the point where user preference becomes the deciding criterion. As such, image quality is, for now, largely subjective.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: josh6079
Correct. Since you have a preference towards accurate colors vs. less vibrant colors, your image quality criteria is a subjective matter. It is based on your preference, as you said.
Yep, except my opinion in no shape or form automatically makes it good image quality, as you claimed based on another's preference for oversaturated colors. Again, accurate colors and inaccurate colors are not subjective, they're either accurate or less accurate. This is further emphasized when test subjects claim to base their decision on objective criteria, like color accuracy:

"The video on monitor B seems more saturated, but the color in monitor A looks more accurate."

Apparently that was a vote in favor of ATI (Monitor A).....so again, it's either A or B, the subject is claiming they know which is which, but "accurate" can't mean both if they're saying they prefer ATI and its oversaturated default colors compared to objective criteria like the Adobe (or any other) RGB color space.

Because the difference is not as noticeable to some people. In this case, the test subjects.

In other cases, enthusiasts. (5th post)
Actually it just shows how flawed this experiment was, where the test subjects weren't even able to identify the one major difference between ATI and NV IQ despite being explicitly instructed to do so. It's even more surprising they didn't pick up on this difference given some of the comments they made about detail like:

"That one looks sharper. Monitor B looks a little fuzzy, and I think monitor A has better color quality."

As for enthusiasts, well, let's say I have reason to believe certain "enthusiasts" simply won't disclose problems they have, even if they do notice them. As a recent example, I specifically asked someone who claimed they had no issues with their current hardware/drivers/games about no less than 5 known issues in recent games, and they confirmed all of them by acknowledging they existed or deflecting blame.

Or it's possible they think this kind of shimmering is normal, as they simply don't know any better and have never seen the difference on an Nvidia part. The difference is the subjects in that test had a side-by-side comparison, so they have no such excuse.

As such, the AF is the only hallmark for discrepancy. Considering many review sites will compare AA but not AF and some enthusiasts miss such differences, it is valid to say that the difference in AF is something that can be missed as well.
I disagree; I'd say AF vs. no AF is as noticeable as, if not more so than, AA vs. no AA, since ugly/blurry textures stick out just as much as jaggies, especially at higher resolutions where AF becomes more important and the importance of AA diminishes.

It is a scientific finding because the method in retrieving the data was based on the scientific method.

The fact that the test subjects can or can't distinguish something doesn't mean it's not scientific.
Actually it just shows the test was inherently flawed to begin with and returned equally flawed results, as test subjects weren't able to identify the single biggest difference in IQ, one that is particularly noticeable in live action (compared to stills). Instead they focused on details like color using completely subjective criteria and personal preferences.

Again, different people have different preferences, making moves like that a subjective issue.

Originally posted by: Anand Lal Shimpi & Derek Wilson here
It can be quite frustrating to enable a high anisotropic filtering level to increase the detail of textures only to find them blurred by your AA mode.
If the end result is similar quality, then the lack of TrAA in legacy OpenGL isn't an issue. Sure it's subjective, but if 16xS produces similar or superior IQ to 8xMS + AAA in OpenGL at similar frame rates, why bring it up as a disadvantage or preference?

Also I find it a bit ironic you'd quote AT when they were actually referring to the problems with ATI's inferior tent and box filters. In the case of Nvidia's mixed-mode AA, however, you're not just blurring the entire scene, you're also getting more color samples, meaning the end result does tend to look better, particularly in older OpenGL games with smaller and less detailed textures.
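
For a rough sense of why, assuming the commonly cited composition of the xS modes (4x MSAA inside a 2x2 supersampled grid for 16xS, 8xQ inside 2x2 for 32xS), the sample math works out like this (an illustrative Python sketch, not an official spec):

def mixed_mode(msaa_samples, ss_grid_x, ss_grid_y):
    ss_factor = ss_grid_x * ss_grid_y
    return {
        "coverage_samples": msaa_samples * ss_factor,  # edge AA quality
        "shaded_samples": ss_factor,                   # texture/alpha detail (and the fill-rate cost)
    }

print("16xS:   ", mixed_mode(4, 2, 2))  # 16 coverage samples, 4 shaded samples per pixel
print("32xS:   ", mixed_mode(8, 2, 2))  # 32 coverage samples, 4 shaded samples per pixel
print("8x MSAA:", mixed_mode(8, 1, 1))  # 8 coverage samples, 1 shaded sample per pixel

Plain MSAA only shades once per pixel, which is why alpha-tested foliage and wires need AAA/TrAA to get cleaned up, while the supersampled component of the xS modes hits those textures automatically.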

Perhaps. I can't be sure if setting the LOD bias to "0" in ATT or CCC is the equivalent to "Clamping" it on nVidia's control panel. It was probably a stretch to say that it was, I just assumed.

But, do you have a link that proves this?

I'm assuming that if one disables Catalyst A.I. and sets the value to "0" (where it's already at) it does prevent the game from forcing a negative (or positive) LOD bias. Disabling Catalyst A.I. ultimately forces settings made in the driver to overwrite the application. (most of the time).
Just search for Radeon texture crawling/shimmering or ATI LOD Bias clamp and that should bring up a few threads discussing lack of LOD clamp, and how the ATT option just sets LOD Bias to 0, but does not prevent negative LOD bias if the game itself forces it.

Regardless, I doubt that is the issue between AFs. I suspect the issue is in the algorithm of the AF itself. As BFG10K mentioned, the pattern is just not as tight as nVidia's current AF.
Certainly, ATI also underfilters more than Nvidia which is why the LOD clamp by itself probably wouldn't completely solve the problem, which again points to inferior AF/IQ.

Again, I don't believe the LOD is the problem. It's the AF itself.

Such discussions were used back when the situations were reversed. GeForce 7 had noticeably inferior AF compared to the X1k, and LOD clamps aside, it came down to the AF itself.

Now nVidia has a tighter pattern - an effectively finer filter.
As covered above, part of the problem is certainly ATI's underfiltering, but the lack of an LOD bias clamp contributes as well, as removing it on Nvidia parts will result in finer textures but more shimmering.

As for Nvidia, let's finish the thought: "effectively finer filter" resulting in better IQ. Now that wasn't so hard, was it?

Turns out I was correct. Both AFs are angle-independent.

The sum of my existence in this thread is to illustrate that with the current hardware, the differences in image quality are trimmed down to the point where user preference becomes the deciding criterion. As such, image quality is, for now, largely subjective.
I guess there is some confusion with regard to what reviewers are calling angle-dependent or independent. My reference was to ATI parts still showing inferior filtering at 45-degree angles, particularly noticeable off-center or on angled textures. It's certainly better than the lower quality filtering on R5XX, but it's still the same AF algorithm, except only High-Quality is used now.

As for this thread, again, I don't disagree that many of the differences do come down to user preference, particularly colors if you're going to use completely subjective criteria to begin with. With AF it's still quite obvious Nvidia has the lead here, both in theory and in practice. AA is somewhere in between, with both subjective and technical merits to the different modes used. I just laugh when people claim ATI still holds the IQ crown after all these years, when that clearly hasn't been the case since G80 launched over 2 years ago.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: chizow
Yep, except my opinion in no shape or form automatically makes it good image quality...

Opinions are what you're basing your whole argument on concerning image quality, thus making it subjective.

For instance, the fact that you believe nVidia has the best image quality due to their extensive AA options and superior AF is a valid opinion to have. I myself agree.

However, as I've pointed out, others may sometimes prefer AA modes that don't come with as much blur and preserve the texture's detail.

Both are valid reasons to which it is up to the individual to decide the feature set closest to their liking.

Originally posted by: chizow
Actually it just shows how flawed this experiment was, where the test subjects weren't even able to identify the one major difference between ATI and NV IQ despite being explicitly instructed to do so.

The experiment itself was sound. If anything is flawed, it is the conclusions some of the test subjects may have made. For example:

Originally posted by: Michael Brown here
Despite all our assurances that expressing "no preference" was a valid opinion, nearly everyone in our control group insisted they could see differences in image quality, despite the fact that three of them were unknowingly comparing SLI to SLI, and three others were comparing CrossFire to CrossFire.

Again, the experiment was not flawed in its execution. The control group confirmed this.

To me, the most interesting finding in this experiment was not whether or not people picked SLI or Crossfire, but the fact that their visual judgments were sometimes mistaken. The control group was probably the most interesting in pinpointing this human error that we all can be victim to from time to time.

Originally posted by: chizow
As for Nvidia, let's finish the thought: "effectively finer filter" resulting in better IQ. Now that wasn't so hard, was it?

I've never said that nVidia's current hardware didn't have better AF.

Originally posted by: chizow
...I don't disagree that many of the differences do come down to user preference, particularly colors if you're going to use completely subjective criteria to begin with. With AF it's still quite obvious Nvidia has the lead here, both in theory and in practice. AA is somewhere in between, with both subjective and technical merits to the different modes used.

That was the point I was attempting to make, since some of the major differences in image quality come down to user preference.

However, I'm now having trouble pinpointing your stance. Earlier, you claimed:

Originally posted by: chizow
Image quality isn't subjective when being compared by people who actually know the difference, know what to look for, set criteria and stick to that criteria.

and now you're saying

Originally posted by: chizow
...I don't disagree that many of the differences do come down to user preference...

In short, what was it about my statement:

Originally posted by: josh6079
In summary, which card has the best image quality is largely subjective and arguing about it without defining the boundaries for its context will get us nowhere.

or

Originally posted by: josh6079
As such, image quality is, for now, largely subjective.

that you disagreed with?
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: josh6079
Opinions are what you're basing your whole argument on concerning image quality, thus making it subjective.

For instance, the fact that you believe nVidia has the best image quality due to their extensive AA options and superior AF is a valid opinion to have. I myself agree.

However, as I've pointed out, others may sometimes prefer AA modes that don't come with as much blur and preserve the texture's detail.

Both are valid reasons to which it is up to the individual to decide the feature set closest to their liking.
Again, my arguments aren't based on opinion, as you can objectively measure and test each aspect of image quality with specific criteria. The problem is when the test itself is flawed (the Maximum PC link) or the differences are inconclusive (AA).

Color quality is only subjective when you don't set any criteria and leave that valuation to individual preference. If you set a criterion, like color accuracy, that takes the subjectivity out of the equation. Either the colors produced are accurate or they're not, similar to LCDs.

Same for AA: you can design specific tests and criteria and then compare the execution of the design. In this case, you'd compare the two versions and see if they're sampling and blending correctly based on theory and design. If both seem to be executing their design correctly and produce slight variations, then preference between the two is subjective, like comparing differences in MSAA modes. In less clear cases, like AAA and TrAA, where there is no reference or clearly correct interpretation, again, which is better would be subjective.

With AF it's also possible to remove subjectivity through the use of tests specifically designed to expose differences and accuracy, like the D3D AF tester. Again, this test clearly shows Nvidia's AF is superior, as it's more accurate and properly filters textures at all angles. Further, Nvidia does not exhibit nearly as much texture shimmering/crawling as ATI parts in actual gameplay.
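
For reference, a tester like that is basically visualizing the standard footprint math. A minimal Python sketch of the reference calculation, following the formulas in the EXT_texture_filter_anisotropic spec (the derivatives below are illustrative values, not captures from real hardware):

import math

def aniso_footprint(dudx, dvdx, dudy, dvdy, max_aniso=16):
    px = math.hypot(dudx, dvdx)    # texel footprint along screen x
    py = math.hypot(dudy, dvdy)    # texel footprint along screen y
    p_max, p_min = max(px, py), min(px, py)
    n = min(math.ceil(p_max / max(p_min, 1e-9)), max_aniso)  # probes along the anisotropy axis
    lod = math.log2(max(p_max / n, 1e-9))                    # mip level used for each probe
    return n, lod

# A steeply angled floor texture: the footprint is long in one screen direction only.
print(aniso_footprint(dudx=8.0, dvdx=0.0, dudy=0.0, dvdy=0.5))  # (16, -1.0)

An implementation that takes fewer probes than this reference at certain surface angles, or picks a coarser mip, shows up in the tester as the familiar angle-dependent "flower" pattern; that part isn't a matter of taste.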

Now, you can say it's subjective whether or not someone would prefer sharper textures vs. shimmering textures, but that clearly isn't true, as one of the main benefits of AF is to sharpen textures while reducing shimmering. That'd be like admitting ATI's AF succeeds in one aspect while failing at another.

The experiment itself was sound. If anything is flawed, it is the conclusions some of the test subjects may have made. For example:

Originally posted by: Michael Brown here
Despite all our assurances that expressing "no preference" was a valid opinion, nearly everyone in our control group insisted they could see differences in image quality, despite the fact that three of them were unknowingly comparing SLI to SLI, and three others were comparing CrossFire to CrossFire.

Again, the experiment was not flawed in its execution. The control group confirmed this.

To me, the most interesting finding in this experiment was not whether or not people picked SLI or Crossfire, but the fact that their visual judgments were sometimes mistaken. The control group was probably the most interesting in pinpointing this human error that we all can be victim to from time to time.
I'm not sure how you can claim the experiment wasn't flawed when the only conclusion they could come to about color "accuracy" would've been purely subjective. As for the control group and false positives, I'd say it proves one of my original points, that you'd get inaccurate and unreliable results from people who aren't familiar with what they should be looking for or comparing.

I've never said that nVidia's current hardware didn't have better AF.
And how did you come to this conclusion?

That was the point I was attempting to make, since some of the major differences in image quality come down to user preference.

However, I'm now having trouble pinpointing your stance. Earlier, you claimed:

Originally posted by: chizow
Image quality isn't subjective when being compared by people who actually know the difference, know what to look for, set criteria and stick to that criteria.

and now you're saying

Originally posted by: chizow
...I don't disagree that many of the differences do come down to user preference...
No, my point was that there were certain aspects of IQ where there is little to no difference between the vendors in design and execution, so obviously preferences between those aspects would be left to subjective opinion. The flaw in the study you provided as evidence was that it was based on completely subjective criteria and design to begin with. That's very different than claiming IQ is purely subjective, as you have on many occasions, when it clearly is not.

In short, what was it about my statement:

Originally posted by: josh6079
In summary, which card has the best image quality is largely subjective and arguing about it without defining the boundaries for its context will get us nowhere.

or

Originally posted by: josh6079
As such, image quality is, for now, largely subjective.

that you disagreed with?
Because that's clearly not the case with regard to AF, which is the only aspect of IQ I've focused on or claimed a significant difference in. The difference between vendors is easily tested in both theory and practice, leaving little to subjectivity. Again, I've acknowledged throughout my replies to you that the differences between AA and color reproduction were so minimal (based on objective criteria) that preference would be subjective. The only major difference was with AF, based on objective criteria, tests and gameplay evidence that show Nvidia's AF is clearly superior.

If you have 2 aspects of IQ that are so similar that preferences are subjective, but one aspect that clearly favors Nvidia, the obvious conclusion would be that Nvidia has superior IQ, would it not? Which brings us back to my comments prior to yours focusing solely on AF in reply to those who claimed ATI held the IQ crown when that clearly hasn't been the case since G80.
 

Pantalaimon

Senior member
Feb 6, 2006
341
40
91
I think you two are approaching the image quality question from two different points of view, i.e. quality and beauty. One is a purely scientific view where you can measure certain attributes and criteria to determine if the image is of good quality, whereas the other is about whether the image looks pleasing to the viewer. The first can be objectively measured; the second is subjective.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: chizow
The problem is when the test itself is flawed (the Maximum PC link)...

Again, the test was not flawed in that it claimed judging image quality was a subjective task and that claim was supported by the data.

The flaws that you tie to the test are that the subjects couldn't distinguish between vendors as correctly as you or other knowledgeable enthusiasts such as BFG10K. But that is the subjects' mistake, not the mistake of the test itself.

They took the measures they could to ensure that such errors wouldn't happen, recruiting 21 subjects with expertise in the three areas tested. The fact that those experts drew faulty conclusions does not mean they weren't tested correctly.

Originally posted by: chizow
The flaw in the study you provided as evidence was that it was based on completely subjective criteria and design to begin with.

I only provided it as evidence to show that even experts can be mistaken when they weren't involved in putting together the machine they are judging.

The study showed that.

Originally posted by: chizow
That's very different than claiming IQ is purely subjective, as you have on many occasions, when it clearly is not.

It is and it isn't, depending on which subcategory you're discussing. As you've mentioned, AA is inconclusive - relying on preference. This makes image quality in regards to AA subjective.

However, currently the differences in AF, I agree, are objective differences. At no point did I say otherwise.

Originally posted by: chizow
If you have 2 aspects of IQ that are so similar that preferences are subjective, but one aspect that clearly favors Nvidia, the obvious conclusion would be that Nvidia has superior IQ, would it not?

This would all rely on just how clear the distinction is to the judge. Some who have not researched image quality as much as you and I may not be able to tell the difference between current AFs. I constantly wonder if this isn't the case with many big name review sites, as it is uncommon to see AF comparisons, unlike AA comparisons. In such cases, even review sites can award a vendor with better image quality when, in fact, they have only compared half of the material.

Originally posted by: chizow

Originally posted by: josh6079
I've never said that nVidia's current hardware didn't have better AF.

And how did you come to this conclusion?

I asked unbiased sources and ultimately compared the difference between labeled, cropped screenshots. Those screenshots also had explanations telling me what to look for and what effects each approach brought to the gaming experience. Link

Had they just been screenshots with no guide telling me which is what or what will happen with the settings in motion (like the Maximum PC comparison) my judgment as to which was better might have been mistaken.

Even the site I learned the differences between the current AFs acknowledged this:

Originally posted by: BFG10K here
Because nVidia's image looks blurrier, many would assume it is worse, but this is incorrect.

Simply put, mistakes can happen when you're left completely on your own to decide image quality - whether you be a reviewer or an enthusiast. Had none of us ever had guides explaining which methods were more accurate or pleasing, who's to say what we would have picked?

This was, essentially, the point I was attempting to convey.

Your last post clarified some of the confusion I had regarding your position on image quality. Indeed, I too believe that there are measures that can be taken to rule out subjectivity when comparing it (e.g., AF).

Originally posted by: chizow
Which brings us back to my comments prior to yours focusing solely on AF in reply to those who claimed ATI held the IQ crown when that clearly hasn't been the case since G80.

Perhaps I should clarify. I mentioned the AF in an italicized sentence because it was, at the time of posting, the only area I was unsure had changed from the 3870s to the 4870s. (The link I was using was a little dated.) I wasn't providing the link as a direct comparison of AFs. I only intended to show that even some experts can be mistaken about image quality when they are unaware of the vendor.

Because of this, it isn't quite safe to say that:

Originally posted by: chizow
...their [reviewers] credibility will always be greater than yours...

That said, if you have suspicions that other:

Originally posted by: chizow
..."enthusiasts" simply won't disclose problems they have, even if they do notice them.

then what is and isn't subjective? Their lying or your suspicions?

In any case, I think we have a mutual understanding as to which vendor has better, overall image quality for the majority of PC gaming titles.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: josh6079
Again, the test was not flawed in that it claimed judging image quality was a subjective task and that claim was supported by the data.

The flaws that you tie to the test are that the subjects couldn't distinguish between vendors as correctly as you or other knowledgeable enthusiasts such as BFG10K. But that is the subjects' mistake, not the mistake of the test itself.

They took the measures they could to ensure that such errors wouldn't happen, recruiting 21 subjects with expertise in the three areas tested. The fact that those experts drew faulty conclusions does not mean they weren't tested correctly.
Again, I'd have to disagree about the test's design when the results could only produce a subjective result on color quality and none of the subjects noticed the significant difference in AF.

I only provided it as evidence to show that even experts can be mistaken when they weren't involved in putting together the machine they are judging.

The study showed that.
Based on the comments and results of that test, I'd say the subjects' expertise is highly questionable, not to mention actual experts have no trouble at all making these distinctions, as shown in their reviews and white papers.

It is and it isn't, depending on which subcategory you're discussing. As you've mentioned, AA is inconclusive - relying on preference. This makes image quality in regards to AA subjective.

However, currently the differences in AF, I agree, are objective differences. At no point did I say otherwise.
Image quality with regard to AA isn't purely subjective; it's only subjective when results are so similar there is very little difference, but that conclusion is based on objective criteria, not subjective preference. For example, you can't claim there's a subjective difference between 2x MSAA and 8x MSAA, because anyone comparing the results should come to the conclusion 8xMSAA is superior based on objective criteria. If not, it's safe to completely disregard their conclusion as uninformed or irrelevant, much like the Maximum PC test.

This would all rely on just how clear the distinction is to the judge. Some who have not researched image quality as much as you and I may not be able to tell the difference between current AFs. I constantly wonder if this isn't the case with many big name review sites, as it is uncommon to see AF comparisons, unlike AA comparisons. In such cases, even review sites can award a vendor with better image quality when, in fact, they have only compared half of the material.
I've already linked a few reviews that specifically comment on the difference in ATI/NV's AF and on the texture shimmering on ATI parts. Many explicitly note that while ATI's AF may look sharper in some instances, it actually results in worse IQ in motion.

There are numerous mentions of it in passing from various reviews, but if anything it hasn't gotten as much press as it should, especially given the claims from the ATI camp about ATI retaining the IQ crown when, again, that clearly hasn't been the case since G80. If anything they've had to play catch-up with AA, and they still haven't closed the gap with AF.

I asked unbiased sources and ultimately compared the difference between labeled, cropped screenshots. Those screenshots also had explanations telling me what to look for and what effects each approach brought to the gaming experience. Link
Right, you used objective criteria to determine which card was actually producing the objectively better image.

Had they just been screenshots with no guide telling me which is what or what will happen with the settings in motion (like the Maximum PC comparison) my judgment as to which was better might have been mistaken.
Perhaps in a still, but that certainly shouldn't have been the case with regard to texture shimmering, which again, was shown in a side-by-side comparison.

Even the site I learned the differences between the current AFs acknowledged this:

Originally posted by: BFG10K here
Because nVidia's image looks blurrier, many would assume it is worse, but this is incorrect.
Yep, I've already linked a side-by-side IQ review that comes to a similar conclusion, but again, the test subjects weren't looking at stills where AF would've come into play, they were looking at games that would've exposed the shimmering issue more obviously than slight differences in texture sharpness in a still.

Simply put, mistakes can happen when you're left completely on your own to decide image quality - whether you be a reviewer or an enthusiast. Had none of us ever had guides explaining which methods were more accurate or pleasing, who's to say what we would have picked?
And I don't disagree with this, which is why I stated earlier that reviewers who know what they're looking at or are trained to notice such differences will always have more credibility than your typical tech forum poster or a random, flawed survey of "experts" who failed to identify the major difference in IQ even though they were explicitly instructed to.

This was, essentially, the point I was attempting to convey.
So how is that quote of mine which you took exception to different than what you were trying to convey? Uneducated, uninformed people will make mistakes or come to incorrect conclusions, and as such, their opinions should not be weighed as heavily as the more informed, trained conclusions of reviewers. This is why we use informed reviews and technical documents or whitepapers as reference and guidance, is it not?

Perhaps I should clarify. I mentioned the AF in an italicized sentence because it was, at the time of posting, the only area I was unsure had changed from the 3870s to the 4870s. (The link I was using was a little dated.) I wasn't providing the link as a direct comparison of AFs. I only intended to show that even some experts can be mistaken about image quality when they are unaware of the vendor.

Because of this, it isn't quite safe to say that:

Originally posted by: chizow
...their [reviewers] credibility will always be greater than yours...
Again, I'm not sure how linking to a test revealing the incompetence and subjectivity of people who clearly aren't experts in image quality does anything to dispute my claim, which has been clearly backed by the conclusions of actual experts who all notice and comment on the differences in AF, particularly in motion.

That said, if you have suspicions that other:

Originally posted by: chizow
..."enthusiasts" simply won't disclose problems they have, even if they do notice them.

then what is and isn't subjective? Their lying or your suspicions?
You linked to a thread as evidence that not everyone would notice such a problem. My reply was that some people, whether they notice something or not, will not disclose whether they see the problem. This of course relies on the objective, mostly unbiased observations of reviewers who claim this flickering issue affects all ATI parts. If someone says they don't notice the problem, they must either think it's normal, not be giving full disclosure, or have a special video card that transcends all existing hardware/software limitations.

In any case, I think we have a mutual understanding as to which vendor has better overall image quality for the majority of PC gaming titles.
Right, which was my original point in reply to some of the continuous perpetuation of misinformation in this thread with regard to IQ.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Pantalaimon
I think you two are approaching the image quality question from two different points of view, i.e. quality and beauty. One is a purely scientific view where you can measure certain attributes and criteria to determine if the image is of good quality, whereas the other is about whether the image looks pleasing to the viewer. The first can be objectively measured; the second is subjective.
That's a good assessment and my point exactly.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: chizow
Again, I'd have to disagree about the test's design when the results could only produce a subjective result on color quality and none of the subjects noticed the significant difference in AF.

Even if a person did notice the difference between AFs, the design of the test would remain unchanged.

Originally posted by: chizow
Based on the comments and results of that test, I'd say the subjects' expertise is highly questionable, not to mention actual experts have no trouble at all making these distinctions, as shown in their reviews and white papers.

Fair enough. It's a bit speculative to begin attacking their credentials, but they did miss the AF differences, and that does cast some doubt.

Originally posted by: chizow
I've already linked a few reviews that have specifically commented on the difference in ATI/NV's AF and on the texture shimmering on ATI parts. Many explicitly note that while ATI's AF may look sharper in some instances, it actually results in worse IQ in motion...There are numerous mentions of it in passing from various reviews, but if anything it hasn't gotten as much press as it should...

I don't doubt it. The coverage just seems unbalanced given the plethora of performance comparisons relative to IQ comparisons, especially IQ comparisons with sections concerning AF.

Originally posted by: chizow
So how is that quote of mine which you took exception to different from what you were trying to convey?

Because even those considered to be experts can be wrong at times. If that happens, it is then up to enthusiasts (or even other reviewers) to serve as a check and balance. As such, both are valid sources, with neither weighing more than the other.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: josh6079
Even if a person did notice the difference between AFs, the design of the test would remain unchanged.
I still disagree simply because the monitors weren't calibrated to the video card's default profiles to begin with. Given the respondents' emphasis on "color accuracy" and "saturation" in their replies, I'd say this glaring flaw in the test's design overshadowed any meaningful comparison of actual differences in image quality.

Fair enough. It's a bit speculative to begin attacking their credentials, but they did miss the AF differences, and that does cast some doubt.
Again, the results speak for themselves; I'm not sure how these people can be considered IQ experts when they return results you'd expect from any random sample of the general population.

I don't doubt it. The coverage just seems unbalanced given the plethora of performance comparisons relative to IQ comparisons, especially IQ comparisons with sections concerning AF.
Well, most reviews comparing IQ do show D3D AF tester results, which have clearly favored Nvidia since G80. Some screenshots do show weaknesses in ATI's AF depending on angle, but criticisms about texture shimmering can't be illustrated in a screenshot, so those comments are typically lost in the text.

I've seen much more emphasis on it lately, though you'll find hints of it going back to R300, so it's certainly not anything new. Again, I'd say the G80 had a lot to do with this, as its superior HQ AF + LOD clamp significantly reduced the issue compared to G7x and simultaneously highlighted the difference on ATi parts.
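
For reference, here is roughly what those two settings correspond to at the application level. This is only a minimal sketch, assuming a current OpenGL context, a bound mipmapped 2D texture, and the EXT_texture_filter_anisotropic extension; the function name is purely illustrative, and the actual "HQ AF" and "LOD clamp" switches live in the driver control panel rather than in game code.

/* Illustrative only: the application-side analogue of high-quality AF
 * plus a non-negative LOD bias.  Assumes a current OpenGL context, a
 * bound mipmapped 2D texture, and EXT_texture_filter_anisotropic. */
#include <GL/gl.h>

#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
#define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
#endif

#ifndef GL_TEXTURE_LOD_BIAS
#define GL_TEXTURE_LOD_BIAS 0x8501
#endif

static void apply_conservative_texture_filtering(void)
{
    GLfloat max_aniso = 1.0f;

    /* Trilinear filtering as the baseline for smooth mip transitions. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* Request the highest anisotropy level the hardware exposes (e.g. 16x). */
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, max_aniso);

    /* Keep the per-texture LOD bias at zero: negative biases sharpen
     * textures but are a common cause of the shimmering discussed above,
     * which is what a driver-level clamp is meant to prevent. */
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_LOD_BIAS, 0.0f);
}

The point of the clamp is simply that a negative bias trades stable textures for extra (aliased) sharpness; forcing the bias back to zero removes that particular source of shimmer without touching the AF algorithm itself.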

Because even those considered to be experts can be wrong at times. If that happens, it is then up to enthusiasts (or even other reviewers) to serve as a check and balance. As such, both are valid sources, with neither weighing more than the other.
Again, I guess it comes down to who you consider experts. I haven't seen many miss this one, and certainly not as badly as the folks in that Maximum PC study, as every reputable review site has concluded that Nvidia has had superior AF since G80.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: chizow
I still disagree simply because the monitors weren't calibrated to the video card's default profiles to begin with.

How do you do that?
 