X800 & 6800 Filtering Quality: NV Wins... I think?


Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
No it's not, and I made sure you would be able to see the artifacts at any angle or distance. I showed you the same places in the COD screenshot and it held true, so now I still try to go to the same place, but I'm not as worried if I screw up. I could be playing Splinter Cell: Pandora Tomorrow right now; I don't need to be wasting my time.

All I'm saying is make fair comparisons. You can say it's the same all you want... you can satisfy yourself all you want. But if you're doing this to show other people the difference, I WANT TO SEE THE DIFFERENCE. If you just want people to take your word for it, you can just edit your post and say "I've seen both, and trust me, ATI has higher default image quality." If you're not willing to make fair comparisons, don't bother making comparisons at all.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Well, I'm not against this thread or its mission... it will be interesting to see what you find (whether or not I, or others, take it as gospel).
 

VirtualLarry

No Lifer
Aug 25, 2001
56,452
10,120
126
Originally posted by: TheSnowman
I'm not denying that Nvidia hardware uses more bits for filtering; that is a simple fact. However, people who want to know the truth obviously want to see evidence of how that fact relates to the image quality in real-world gaming situations. If it can be seen in "every 3D PC game outside of Tron 2.0" then clearly it shouldn't be hard for you to provide screenshot or video evidence of your claims. Otherwise, claims of "filtering tricks", when even the article points out that ATI's implementation matches Direct3D's reference rasterizer, will only appeal to people looking for a reason to dog on ATI.
I think that the DirectX refrast only uses six bits of Z for aniso filtering too, which is probably why ATI's implementation matches it, although NV implemented, IIRC, 8 bits, which is actually better, and probably better too for their workstation-oriented/OpenGL usage. But if ATI "matched MS's specs", then blame MS in the first place for implementing specs that lead to poor image quality. Kind of like the ATI 24-bit FP vs. NV 32-bit FP stuff. ATI still does have the edge in terms of angle-adaptive anisotropic filtering vs. NV's brilinear stuff, though, so it kind of evens out. It might be nice to be able to simply force "pure hardware real aniso" for everything and just leave it at that. Can you do that with the current crop of drivers?
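If those six-bit versus eight-bit figures refer to the fractional LOD precision used when blending adjacent mip levels during the trilinear/aniso blend (an assumption; the post doesn't spell it out), the practical difference is just how many distinct blend weights exist between two mips. A minimal Python sketch, purely illustrative and not either vendor's actual pipeline:

# Quantize the fractional part of a mip LOD to N bits, as a toy model of
# limited LOD precision in the trilinear/aniso blend. With 6 bits there are
# only 64 distinct blend weights between adjacent mip levels; with 8 bits, 256.
def quantize_lod_fraction(lod: float, bits: int) -> float:
    steps = 1 << bits                      # 64 steps for 6 bits, 256 for 8
    integer, fraction = divmod(lod, 1.0)
    return integer + round(fraction * steps) / steps

lod = 2.3137
for bits in (6, 8):
    q = quantize_lod_fraction(lod, bits)
    print(f"{bits}-bit fraction: LOD {lod} -> {q:.6f} (step 1/{1 << bits})")

Coarser steps mean the blend weight changes in bigger jumps as the view distance changes, which is one way banding at mip transitions can become noticeable.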
 

VirtualLarry

No Lifer
Aug 25, 2001
56,452
10,120
126
Originally posted by: VIAN
But I think that it is important for websites to start testing at High Quality instead of Quality, because I'm pissed about buying a card based on a bunch of benchmarks compared against other cards with higher IQ. The card I bought should have had less fps to get the same IQ, but it didn't.
I think what I hear you saying is a call for websites to start testing the actual quality/speed of the hardware, rather than the relative effectiveness of the clever hacks implemented in any one company's drivers, regardless of the "true" capabilities of the hardware. Does that seem accurate? The difficulty is that getting the hardware to do anything requires drivers.

Sounds like yet another case of this: "Pick any two - Good, Cheap, Fast," combined with the fact that high-end gamers want to have their cake (high image quality) and eat it too (high frame rates). Benchmarks and marketing material tend to show one or the other, in isolation, but in real-world usage you can't get both at once.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
A bit confusing.

I'm saying I bought hardware based on a lie. The lie that the image quality between 2 products was the same.

Benchmarks showed that the 9700 Pro was slower than the 6600GT. Let's take this benchmark in UT2004 for example. The 9700 Pro is slower than the 6600GT. But here, the reason the 6600GT beats out the 9700 Pro is that it is rendering a worse image. If the 6600GT were rendering an identical image to the 9700 Pro, the 9700 Pro would have won this benchmark. So, thinking I was gonna get a performance increase, I bought the card and got a performance decrease.

Not that I expected a big performance increase, but certainly not a decrease.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
I'd really like to see a game saved, a screenshot taken on an nVidia card, the saved game file sent to someone else with an ATI card, and the EXACT same screenshot taken.

*EDIT* It would be nice if both people had Fraps too and could provide BMPs rather than JPEGs, since there's some quality loss with JPEGs. I can do the nVidia part of that; anyone with an X800 or something wanna do the ATI part?
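For anyone who does end up with two matched lossless captures, the comparison doesn't have to be done by eye. A minimal Python sketch of the idea, assuming Pillow and NumPy are available; the filenames are hypothetical placeholders, not files from this thread:

# Compare two lossless screenshots taken from the exact same saved-game
# position, report how much they differ, and save an amplified diff image
# so subtle filtering differences become visible.
import numpy as np
from PIL import Image, ImageChops

nv = Image.open("hl2_nvidia.bmp").convert("RGB")
ati = Image.open("hl2_ati.bmp").convert("RGB")
assert nv.size == ati.size, "screenshots must share the same resolution"

diff = ImageChops.difference(nv, ati)          # per-pixel absolute difference
arr = np.asarray(diff, dtype=np.float32)

print("mean per-channel difference:", arr.mean())
print("max per-channel difference: ", arr.max())
print("pixels that differ at all:  ", int((arr.sum(axis=2) > 0).sum()))

# Multiply the difference by 8 (clamped) so faint differences stand out.
Image.fromarray(np.clip(arr * 8, 0, 255).astype(np.uint8)).save("diff_x8.png")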
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Originally posted by: VIAN
A bit confusing.

I'm saying I bought hardware based on a lie. The lie that the image quality between 2 products was the same.

I don't think you'd get either Nvidia or ATI to make that statement. I don't even know of a review site that has said such a thing.

Out of curiosity, where'd you get this idea?

In my experience the IQ can be fairly drastically different between the two lines, from color reproduction etc.
As an aside, if you own a 6600GT I wouldn't feel too cheated, even if you found worse IQ... it's simply a great card (SM3/PureVideo, etc.).

Best all-round card that could be recommended, IMO :thumbsup:
 

AnnoyedGrunt

Senior member
Jan 31, 2004
596
25
81
A couple things:

First, baseline reviews need to be performed in a manner similar to how the user will run the card. To me, this means all default driver settings. Image quality should be a major focus on these baseline tests, so people get a good idea of how the different cards render the scenes @ the default settings. Frankly, I don't want to spend hours configuring the card for each game I play, I just want to install it and start playing. It's only if I find problems that I'll start tweaking things. I think many users would operate that way.

Second, after a default baseline score is given, reviews should show the cards with whatever settings ensure similar IQ. It may also be worthwhile to show each card's performance @ the highest IQ and @ the lowest IQ, so we can get an idea of the effect of the optimizations.

Now, I used to have a 9700pro, and was avoiding the 6600GT simply because the speed improvement wasn't significant IMO. If you look at many of the reviews, the 6600GT only beats the 9700/9800pro by a few FPS in some games, even though it may beat it handily in others. If the OP bought a 6600GT based on the review linked in an earlier post, the framerate difference was only 3 FPS. I therefore think it is unreasonable to expect a significant performance improvement between such closely performing cards (at least in that one specific game).

Finally, this IQ comparison would make a far greater impact if there were ATI pics to compare with (as well as the equivalent ATI framerate). I can see a difference in the pics shown, but that is no surprise since one is Quality and one is High Quality. I'd expect to see a difference, otherwise there'd be no point in having different settings. So, the IQ difference can only have meaning when comparing with a different card, so that the IQ at a given FPS can be seen.

Anyhow, just my 2cents.

-D'oh!
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
I don't think you'd get either Nvidia or ATI to make that statement. I don't even know of a review site that has said such a thing.

Out of curiosity, where'd you get this idea?

In my experience the IQ can be fairly drastically different between the two lines, from color reproduction etc.
As an aside, if you own a 6600GT I wouldn't feel too cheated, even if you found worse IQ... it's simply a great card (SM3/PureVideo, etc.).

Best all-round card that could be recommended, IMO
Review sites are supposed to be unbiased and honest. If a product is doing something worse to get an edge, we're supposed to be informed of it. I'm not aiming this at Nvidia or ATI; even though ATI did say not to test Nvidia at Quality settings (for very good reason too), both have great image quality. This is aimed at review sites that totally missed this. None of them caught it. By luck I happened to find it, as it is hard to spot in most textures out there. But I expect the image quality to be the same in all types of textures, not just some.

I can accept variations, because those are variations, but this difference isn't a variation and it is freaking annoying to play with it.

As for SM3.0 and PureVideo: I didn't buy the card for either of those features, as I knew SM3.0 wasn't going to be widely used this year and PureVideo wasn't a reality at the time (not that I find much use for it now). In fact, the main reason I bought the card was because ATI had no AGP counterpart. I like ATI's driver maintenance.

Second, after a default baseline score is given, reviews should show the cards with whatever settings ensure similar IQ. It may also be worthwhile to show each card's performance @ the highest IQ and @ the lowest IQ, so we can get an idea of the effect of the optimizations.
That's definitely a good idea. But it seems like review sites are in too much of a rush all the time to pay that much attention to one hardware review. It would be nice, though.

If the OP bought a 6600GT based on the review linked in an earlier post, the framerate difference was only 3 FPS.
LOL, no I didn't buy it because of those 3fps.

Finally, this IQ comparison would make a far greater impact if there were ATI pics to compare with (as well as the equivalent ATI framerate).
Need an ATI card. Maybe this week I'll buy a 9600 (so fps isn't gonna matter), but the 9600 has similar IQ to the X800 series because of its adaptive trilinear. Maybe.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Surely someone with an ATI card will help this guy out??
It isn't like this is going to favor NV here... without comparing AA/AF quality (which I believe NV does better, IIRC), ATI should come out on top here.

Pick some standard games most of us have, like FC/HL2/Doom3, start on the first level without moving, and take a screenshot at 1280x1024 (a pretty good standard resolution that everyone's hardware supports).
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Here's a couple more in HL2 from me, lol.

Quality
High Quality

Notice the moire effect on the tile near the bottom left corner that's "visible" in the Quality shot, but not in High Quality. I have "visible" in quotes because, while it's not visible in the High Quality screenshot, it can be seen while moving... I'm going to make a couple of short Bink videos of it to prove it.
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
VIAN, instead of making these kinds of statements, why don't you try to solve the "problem"? Try XG 71.84, it may help, and report back.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Here's the Quality video
and High Quality

They're both around 16 MB... and they're Bink videos that I made into an exe, so you don't need to find the Bink codec to play them.

The moire effect IS less apparent in High Quality mode, but it still exists. BTW, those videos are only a few seconds long, but I left them at full resolution (1024x768).

In my opinion, the difference in quality isn't so great that I'm willing to take up to a 25% performance hit.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,452
10,120
126
Originally posted by: VIAN
Review sites are supposed to be unbiased and honest. If a product is doing something worse to get an edge, we're supposed to be informed of it.
Welcome to advanced competitive video-card marketing 201 - shaping the mass-market ("herd") purchasing mentality through selectively-biased reviews, in exchange for further considerations from the mfg. (Aka "The Way It's Meant To Be Reviewed.")
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
The difference between quality and high quality is simply the three optimization options which are clearly visible and can be disabled. Most reviewers do in fact disable them so I'm not sure where this unfair benchmarking comes into it.

Now in contrast ATi has a number of optimizations along the same lines but some of them can't be disabled (e.g. trylinear, trilinear on the first texture stage only). I've seen some reviewers turn off all of nV's optimizations but of course ATi's are still running.

Personally I never cared about either side doing it (unless it's sneaky, e.g. detecting "ut2003.exe"), but if you are going to care, keep both sides consistent.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: BFG10K
The difference between quality and high quality is simply the three optimization options which are clearly visible and can be disabled. Most reviewers do in fact disable them so I'm not sure where this unfair benchmarking comes into it.

Now in contrast ATi has a number of optimizations along the same lines but some of them can't be disabled (e.g. trylinear, trilinear on the first texture stage only). I've seen some reviewers turn off all of nV's optimizations but of course ATi's are still running.

Personally I never cared about either side doing it (unless it's sneaky, e.g. detecting "ut2003.exe"), but if you are going to care, keep both sides consistent.

There's more to it than that... I have those 3 optimizations disabled for both of those videos and all my screenshots, yet there's still a difference, although slight.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Originally posted by: VirtualLarry
Originally posted by: VIAN
A bit confusing.
I'm saying I bought hardware based on a lie.
Welcome to highly-competitive video-card marketing 101.

Or welcome to dumbass 507.
More like promises made in one's head from one's own misconceptions of the truth, influenced by marketing... but not directly attributable to it.

In no way is anything like this ATI's or NV's fault.

If very slight differences in IQ mean that much, asking people who own both brands beforehand is the thing to do.
I've always noticed slight differences in ATI compared to NV... but nothing I felt was giving a ring-ding-ding win for either company.

This goes for trylinear and whatever we are investigating on NV here.
The optimizations rarely turn the game into some ugly freakshow.


I guess to make VIAN feel better, he has a card where the optimizations CAN be disabled, more so than they can on ATI's hardware...
so in that respect one could start beating on ATI for this and that... but instead we all know the truth and just keep it in mind.

Neither trylinear (or other ATI cheats) nor this means squat.


What exactly would you do? Buy an NV card to rid your life of horrid trylinear?
Then buy an ATI card to rid yourself of the horrid "default IQ settings" bug?

Or just bash on company X's optimizations when you have an agenda against them?
Kind of a lot of needless bitching.

I hope you all realize "optimization/cheat" witch-hunts are the lifeblood of fanboys on both sides. Ain't no shindig like a floptimization partay!

I guess you could protest BOTH of them!
Then have that superb image quality you so desire from XGI, or Intel. :thumbsup:
 

VirtualLarry

No Lifer
Aug 25, 2001
56,452
10,120
126
Originally posted by: housecat
More like promises made in one's head from one's own misconceptions of the truth, influenced by marketing... but not directly attributable to it.
In no way is anything like this ATI's or NV's fault.
LOL. I'm not touching that one with a 10, no, 20ft pole. If I did, this entire forum would be filled with nothing but "ATI lies"/"NV lies" fanboy flame posts.

Are you trying to say that NV and ATI have never done anything deceptive, not in marketing, specs, launch dates, operations / features / options of drivers, questionable optimizations, etc.?

Originally posted by: housecat
The optimizations rarely turn the game into some ugly freakshow.
Well, actually, now that you mention it... there have indeed been cases in the past where that has been true. At least, until the next driver upgrade or game patch came along to make the issues less obvious, or to detect that game specifically and disable those optimizations for that game in that driver rev.

Originally posted by: housecat
Or just bash on company X's optimizations when you have an agenda against them?
Considering that this issue affects both of the main vendors here, I don't see any specific brand agenda being argued, so much as the fact that there are any "optimizations" enabled by default anyways. Personally, I think that whatever the game requests of the hardware, it should get. It shouldn't get lied to, or ignored, in order for a "driver that knows better" to attempt to wring a few more FPS out of a game, at the same time that numerous advertised IQ-related hardware features are ignored or bypassed.

Originally posted by: housecat
Kind of a lot of needless bitching.
I guess you could protest BOTH of them!
Not a bad idea, not at all.

Originally posted by: housecat
Then have that superb image quality you so desire from XGI, or Intel.
Do their solutions, actually offer superior image-quality compared to ATI or NV? I hadn't heard that.
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
Do their solutions, actually offer superior image-quality compared to ATI or NV? I hadn't heard that.

Yes, they are impressive, LOL!!!!! He was being sarcastic (I hope).
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Originally posted by: VirtualLarry
Originally posted by: housecat
More like promises made in one's head from one's own misconceptions of the truth, influenced by marketing... but not directly attributable to it.
In no way is anything like this ATI's or NV's fault.
LOL. I'm not touching that one with a 10, no, 20ft pole. If I did, this entire forum would be filled with nothing but "ATI lies"/"NV lies" fanboy flame posts.

Why not? Find me where either ATI or NV said IQ was the same between the two.
They were never the same, not even the same to the naked eye without photochopping. I can attest to that.

Originally posted by: VirtualLarry
Are you trying to say that NV and ATI have never done anything deceptive, not in marketing, specs, launch dates, operations / features / options of drivers, questionable optimizations, etc.?
No. I only said what was said.
No one was deceived in this situation. NV never once said that their IQ was the same as ATI's. If anyone is being deceived, it is by their own twisted perception of what they were buying... but I have yet to find "same IQ as ATI at half the price!" on my BFG boxes.
ATI doesn't put on their boxes or ads that trylinear is active.

In fact it's been said by AT that NV cards overall do more work to produce a correct image to the developer's standards than ATI.
Wish I had the link on me
*checks pockets*
but you can find it in a search.

Originally posted by: VirtualLarry
Originally posted by: housecat
Or just bash on company X's optimizations when you have an agenda against them?

Considering that this issue affects both of the main vendors here, I don't see any specific brand agenda being argued, so much as the fact that there are any "optimizations" enabled by default anyways. Personally, I think that whatever the game requests of the hardware, it should get. It shouldn't get lied to, or ignored, in order for a "driver that knows better" to attempt to wring a few more FPS out of a game, at the same time that numerous advertised IQ-related hardware features are ignored or bypassed.

I'm merely pointing out that optimizations are not vendor specific, and if we did this for both companies... those who CLAIM to be THAT concerned about IQ (when in reality both products produce great IQ) would be "videophiles" and couldn't stand using mere ATI/NV cards that render an inaccurate image.
If one truly agreed with you on demanding that a driver only produce what the developer intended (I think this is what you were getting at), then we would be using Intel Extreme or XGI. It's a circular argument, it's pretty anal, and it's pretty unrealistic to expect that on a cutting-edge card from two highly competitive companies.

I'd be willing to bet that a Quadro or FireGL renders things "properly" as you guys seem to desire. Seriously, take a shot with one of those and I'd be willing to bet it's perfect IQ, as the developers intended.

I'll stick with a Radeon or Geforce myself. I can handle that.

Originally posted by: VirtualLarry
Originally posted by: housecat
Then have that superb image quality you so desire from XGI, or Intel.
Do their solutions, actually offer superior image-quality compared to ATI or NV? I hadn't heard that.

I don't know. I was attempting to make a point about how stupid the optimization witch hunts get... if people even half-wholeheartedly meant half the witch hunts, and the conclusion that is usually drawn by many at the end of them (i.e. "I'm not using ATI or NV"), then we'd be long out of options. Barring developer cards... doubtful there are any optimizations on those.

It's not too hard to flash to a Quadro and find out.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Optimal image quality has changed. It seems that at default Nvidia does bilinear in some games, or at least the ones I've tested, including HL2 and Far Cry, so to change this you will have to force trilinear. This was tested without anisotropic filtering, with trilinear enabled in-game but not FORCED in the driver. Noted in the first post.
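In case the bilinear versus trilinear distinction is unclear: bilinear samples only the single nearest mip level, while trilinear blends the two adjacent mip levels by the fractional LOD, which is what removes the sharp banding at mip transitions. A toy Python sketch of that difference, purely illustrative and not any driver's actual code (sample_mip is a hypothetical stand-in for a bilinear fetch from one mip level):

import math

def sample_mip(level: int) -> float:
    # Hypothetical stand-in: pretend each mip level returns one brightness value.
    return 1.0 / (1 + level)

def bilinear(lod: float) -> float:
    # Bilinear uses only the nearest mip level, so the result jumps
    # abruptly wherever the LOD crosses a .5 boundary.
    return sample_mip(round(lod))

def trilinear(lod: float) -> float:
    # Trilinear blends the two adjacent mip levels by the LOD fraction,
    # smoothing out the banding at mip transitions.
    lo = math.floor(lod)
    frac = lod - lo
    return (1 - frac) * sample_mip(lo) + frac * sample_mip(lo + 1)

for lod in (0.0, 0.4, 0.6, 1.0, 1.5):
    print(f"LOD {lod}: bilinear={bilinear(lod):.3f}  trilinear={trilinear(lod):.3f}")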

Hey jeff, how do you host those videos, I would like to do some with fraps.

VIAN, instead of making this kind of statements why don't you try to solve the "problem". Try XG 71.84, it may help, and report back
Because that wasn't the problem. The problem is the review sites benchmarking wrong.

Welcome to advanced competitive video-card marketing 201 - shaping the mass-market ("herd") purchasing mentality through selectively-biased reviews, in exchange for further considerations from the mfg. (Aka "The Way It's Meant To Be Reviewed.")
LOL

The difference between quality and high quality is simply the three optimization options which are clearly visible and can be disabled. Most reviewers do in fact disable them so I'm not sure where this unfair benchmarking comes into it.
They could've benched it like that; I would have no problem, as long as they told me there was a loss in image quality on the NV card.

I guess to make VIAN feel better, he has a card where the optimizations CAN be disabled, more so than they can on ATI's hardware...
You can disable ATI's optimizations, but I don't see why you would want to, since they apparently provide the same image quality. The difference with ATI's optimizations is that you would need special tests to spot them. You wouldn't see it in a game.

I also have this problem with NV cards in COD where, in the darker areas of the game, there are white ghosty lines running along the walls as I move. But in the light areas it goes away. It's like some kind of fog, but really bad.




 

McArra

Diamond Member
May 21, 2003
3,295
0
0
VIAN, if you don't like the card, sell it. By the way, I don't feel ripped off, and I own a 6800GT.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
I'm gonna sell it as soon as the new mainstream cards come out next gen.
 

Avalon

Diamond Member
Jul 16, 2001
7,567
152
106
I'm with you on this one, VIAN. When I finally upgraded my 9700 to an eVGA 6800NU, I noticed it pretty easily. It was noticeable with AF enabled. I just keep my IQ setting on High Quality. My performance on this card is so good that I don't mind sacrificing some for IQ. I may have to give those XG drivers a looky, though. That sounds promising.
 