They could've benched it like that; I would have no problem as long as they told me there was a loss in image quality on the NV card.
Originally posted by: Jeff7181
Here's a couple more in HL2 from me, lol.
Quality
High Quality
Notice the moire effect on the tile near the bottom-left corner that's visible in the Quality shot but only "visible" in High Quality. I have "visible" in quotes because while it's not visible in the High Quality screenshot, it can be seen while moving... I'm going to make a couple of short Bink videos of it to prove it.
Originally posted by: Duvie
Originally posted by: Jeff7181
Here's a couple more in HL2 from me, lol.
Quality
High Quality
Notice the moire effect on the tile near the bottom-left corner that's visible in the Quality shot but only "visible" in High Quality. I have "visible" in quotes because while it's not visible in the High Quality screenshot, it can be seen while moving... I'm going to make a couple of short Bink videos of it to prove it.
I have to agree with you Jeff... I am not a big gamer, but I took the pics, zoomed in a bit tighter, then overlaid them and clicked back and forth, and my opinion was that in this freeze-frame mode Quality was better....
Now this was pretty objective, 'cause from the jpg names I couldn't tell which one was Quality or High Quality until I checked the link.....
Looking forward to watching the video......
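For anyone who wants to repeat Duvie's overlay-and-flip test without knowing which shot is which, here's a minimal Python sketch using Pillow. The filenames are placeholders for whatever two screenshots you're comparing:

```python
# Blind A/B screenshot comparison, similar to Duvie's overlay-and-flip test.
# Filenames are placeholders; any two same-size screenshots will do.
import random
from PIL import Image, ImageChops

paths = ["quality.jpg", "high_quality.jpg"]
random.shuffle(paths)  # hide which setting is which until after judging

a = Image.open(paths[0]).convert("RGB")
b = Image.open(paths[1]).convert("RGB")

# Amplify the per-pixel difference so subtle filtering changes are visible.
diff = ImageChops.difference(a, b).point(lambda v: min(255, v * 8))
diff.save("diff.png")

print("Judge diff.png first, then reveal the order:", paths)
```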
Originally posted by: Duvie
too hard to see much from the video.... I can't throw it into Adobe Premiere with it being an exe.... I would think a slower frame-by-frame playback could help, and it may just be me, but I thought Quality moved better, without blurring, looking at the tiles and the grunge on the wall....
Overall you guys waste too much time on this, as the differences are so minute.... picky bastages!!!!
Originally posted by: Jeff7181
Here's the Quality video
and High Quality
They're both around 16 MB... and they're Bink videos that I made into an exe so you don't need to find the Bink codec to play them.
The moire effect IS less apparent in High Quality mode, but it still exists. BTW... those videos are only a few seconds long, but I left them at full resolution (1024x768).
In my opinion, the difference in quality isn't so great that I'm willing to take up to a 25% performance hit.
Originally posted by: Avalon
I'm with you on this one, VIAN. When I finally upgraded my 9700 to an eVGA 6800NU, I noticed it pretty easily. It was noticeable with AF enabled. I just keep my IQ setting on High Quality. My performance on this card is so good that I don't mind sacrificing some for IQ. I may have to give those XG drivers a looky, though. That sounds promising.
Originally posted by: Jeff7181
Originally posted by: Duvie
too hard to see much from the video.... I can't throw it into Adobe Premiere with it being an exe.... I would think a slower frame-by-frame playback could help, and it may just be me, but I thought Quality moved better, without blurring, looking at the tiles and the grunge on the wall....
Overall you guys waste too much time on this, as the differences are so minute.... picky bastages!!!!
You don't see the moire effect on the tile floor? It's not as pronounced in High Quality, but it's still there.
Particularly in this area.
From what I've seen, High Quality in the HL2 scene presents a slight moire effect. If you switch to Quality without the optimizations, the moire effect is more pronounced. If you turn on the optimizations, the moire effect is very easily seen. Trilinear Optimization, Anisotropic Mip Filter Optimization, and Anisotropic Sample Optimization ARE NOT THE ONLY DIFFERENCES BETWEEN QUALITY AND HIGH QUALITY.
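To see why a regular tile texture shimmers like this, here's a toy Python demo of the underlying sampling math (not the HL2 renderer): point-sampling a pattern below its frequency produces moire bands, and averaging more samples per pixel, which is roughly what the better filtering modes buy you, suppresses them:

```python
# Toy demo of moire on a tiled floor: point-sampling a high-frequency
# checker pattern aliases; averaging several jittered samples per pixel
# (a crude stand-in for better texture filtering) suppresses the pattern.
import numpy as np
from PIL import Image

W, H, FREQ = 512, 512, 90.0  # ~90 tile cycles, packed toward the bottom

def checker(u, v):
    # 0/1 checkerboard in texture space
    return (np.floor(u * FREQ) + np.floor(v * FREQ)) % 2.0

ys, xs = np.mgrid[0:H, 0:W]
u = xs / W
v = (ys / H) ** 3  # floor recedes: distant rows pack many tiles per pixel
aliased = checker(u, v)

rng = np.random.default_rng(0)
acc = np.zeros((H, W))
for _ in range(16):  # 16 jittered samples per pixel ~ heavy filtering
    acc += checker((xs + rng.random((H, W))) / W,
                   ((ys + rng.random((H, W))) / H) ** 3)
filtered = acc / 16.0

Image.fromarray((aliased * 255).astype(np.uint8), "L").save("moire_point_sampled.png")
Image.fromarray((filtered * 255).astype(np.uint8), "L").save("moire_filtered.png")
```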
Originally posted by: Duvie
I have to agree with you Jeff... I am not a big gamer, but I took the pics, zoomed in a bit tighter, then overlaid them and clicked back and forth, and my opinion was that in this freeze-frame mode Quality was better....
Now this was pretty objective, 'cause from the jpg names I couldn't tell which one was Quality or High Quality until I checked the link.....
Looking forward to watching the video......
Jeff's pictures are taken at an angle where the artifacts are harder to notice. I couldn't spot the difference between the two pictures. I guarantee that if he stepped down closer to the ground, he would see more of the effect. Why don't you check out my pictures in the first post under PICS IN OTHER GAMES?
Originally posted by: Duvie
too hard to see much from the video.... I can't throw it into Adobe Premiere with it being an exe.... I would think a slower frame-by-frame playback could help, and it may just be me, but I thought Quality moved better, without blurring, looking at the tiles and the grunge on the wall....
Overall you guys waste too much time on this, as the differences are so minute.... picky bastages!!!!
The difference was on the tiles on the ground and nowhere else. There is a moire effect when moving, akin to texture aliasing. It's more annoying in your face and when you get lower to the ground.
Originally posted by: otispunkmeyer
Originally posted by: Jeff7181
Here's the Quality video
and High Quality
They're both around 16 MB... and they're Bink videos that I made into an exe so you don't need to find the Bink codec to play them.
The moire effect IS less apparent in High Quality mode, but it still exists. BTW... those videos are only a few seconds long, but I left them at full resolution (1024x768).
In my opinion, the difference in quality isn't so great that I'm willing to take up to a 25% performance hit.
Wow, not picking you apart Jeff, sorry if it sounds like it, but you must have keen eyes, 'coz I can't tell the difference between your pictures or your videos... I don't really know what I'm looking for, but I watched and looked for changes multiple times, and I had to really look hard to notice that the tiles on the floor looked a smidge rougher than in HQ.
For me, I simply can't tell the difference at all, so I'll take my Quality setting and my extra performance and be happy.
Originally posted by: Jeff7181
There's more to it than that... I have those 3 optimizations disabled for both of those videos and all my screenshots, yet there's still a difference, although slight.
What sort of a difference? I'd probably be checking your profiles in case the local game profiles are overriding the global ones, as they will.

Originally posted by: VIAN
They could've benched it like that; I would have no problem as long as they told me there was a loss in image quality on the NV card.
There's loss of quality on ATi cards too, but most reviewers keep them on. Again, I don't mind optimizations, but whichever you choose, keep both sides consistent.

Originally posted by: VIAN
You can disable ATI's optimizations,
Not all of them.

Originally posted by: VIAN
but I don't see why you would want to, since they apparently provide the same image quality. The difference with ATI's optimizations is that you would need special tests to detect them. You wouldn't see it in a game.
You can't see them, but somebody else might well. In such a case it then comes down to opinion, as somebody else might well not see nVidia's optimizations.

Originally posted by: VIAN
I also have this problem with NV cards in COD where in the darker areas of the game, there are white ghosty lines running up the walls as I move. But in the light areas, it goes away. It's like some kind of fog, but really bad.
How about before doing any IQ tests you ditch the beta drivers and use official drivers?
"*EDIT* It would be nice if both people had Fraps too and could provide bmp's rather than jpegs, since there's some quality loss with jpegs. I can do the nVidia part of that; anyone with an x800 or somethin wanna do the ATI part?"
How would you be able to present bmps? I can only do jpegs with my site. And you still haven't told me how to host a video.
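On the bmp-vs-jpeg point, the quality loss is easy to put a number on. A hedged Python sketch ("shot.bmp" is a placeholder for any lossless capture, e.g. from Fraps):

```python
# How much does a JPEG round-trip change a screenshot? High PSNR means the
# compression noise may be small relative to the filtering differences being
# compared; low PSNR means JPEG artifacts could mask them.
import io
import numpy as np
from PIL import Image

original = Image.open("shot.bmp").convert("RGB")  # placeholder filename

buf = io.BytesIO()
original.save(buf, format="JPEG", quality=85)  # typical web-hosting quality
roundtrip = Image.open(buf).convert("RGB")

a = np.asarray(original, dtype=np.float64)
b = np.asarray(roundtrip, dtype=np.float64)
mse = np.mean((a - b) ** 2)
psnr = 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else float("inf")
print(f"JPEG q85 round-trip PSNR: {psnr:.1f} dB")
```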
"There's loss of quality on ATi cards too, but most reviewers keep them on. Again, I don't mind optimizations, but whichever you choose, keep both sides consistent."
What optimizations are there that you can't turn off? Can you see the loss?
"How about before doing any IQ tests you ditch the beta drivers and use official drivers?"
All the things I talk about here existed in the OFFICIAL drivers as well. I have four pics in the first post taken with them. But after 5 months of not upgrading drivers, I figured, sh|t, why not. Nothing seems to have changed in the graphics.
"What sort of a difference? I'd probably be checking your profiles in case the local game profiles are overriding the global ones, as they will."
I use global only. My first post reads:
Originally posted by: VIAN
What optimizations are there that you can't turn off?
6 bit filtering, angle dependent AF, trylinear.
Originally posted by: VIAN
Can you see the loss?
Yes, but it's minor compared to the speed gain you get. Again, I'm not arguing against global optimizations like that, provided they benefit all apps and don't degrade IQ past tolerable limits.
Originally posted by: VIAN
I use global only. My first post reads:
It may also be a good idea to take the snapshots in exactly the same place, not to mention that if there are any dynamic IQ differences, such as active shaders, it's almost impossible to do a valid comparison.
Originally posted by: VIAN
6 bit filtering is not an optimization, it is standard DX9 specs.
The end result is lower IQ compared to 8 bit filtering, though.
Originally posted by: VIAN
Angle dependent AF is also available on Nv cards which also can't be turned off.
And why would one want to turn it off? Again, I'm all for global optimizations, but let's not pretend that ATi doesn't do them.
Originally posted by: VIAN
I don't know about adaptive Trilinear, but I doubt you can tell the difference.
If we could turn it off you might well be able to.
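For reference, "trylinear"/adaptive trilinear as it was commonly described in reviews of the era: full trilinear blends between mip levels across the whole fractional-LOD range, while the optimization snaps most of the range to plain bilinear and only blends in a narrow band near the mip transition. A toy Python sketch (the band width is a made-up parameter, not a known driver value):

```python
# Sketch of the "trylinear"/brilinear idea as commonly described: instead
# of blending mip levels across the whole fractional-LOD range (trilinear),
# most of the range uses the nearest mip (cheap bilinear) and only a narrow
# band around the transition still blends. BAND is illustrative only.
def trilinear_weight(lod_frac: float) -> float:
    return lod_frac  # blend weight toward the next mip level

def brilinear_weight(lod_frac: float, band: float = 0.2) -> float:
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if lod_frac <= lo:
        return 0.0  # pure bilinear on mip N
    if lod_frac >= hi:
        return 1.0  # pure bilinear on mip N+1
    return (lod_frac - lo) / (hi - lo)  # narrow blend band

for f in [0.0, 0.2, 0.4, 0.5, 0.6, 0.8, 1.0]:
    print(f"lod_frac={f:.1f}  trilinear={trilinear_weight(f):.2f}  "
          f"brilinear={brilinear_weight(f):.2f}")
```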
You can disable ATI's optimizations, but I don't see why you would want to, since they apparently provide the same image quality. The difference with ATI's optimizations is that you would need special tests to detect them. You wouldn't see it in a game.
6 bit filtering is not an optimization, it is standard DX9 specs.
Angle dependent AF is also available on Nv cards which also can't be turned off.
Otherwise, claims of "filtering tricks" when even the article points out that ATI's implementation matches Direct3D's reference rasterizer will only appeal to people looking for a reason to dog on ATI.
"The end result is lower IQ compared to 8 bit filtering, though."
That's true, but it's still not an optimization, and I don't think you can blame ATI for following minimum specs. Plus, Nvidia uses 4-bit filtering when they can. And remember, Nvidia could be using 4-bit filtering throughout all OpenGL games, because John Carmack said that the number of bits used for filtering isn't as important in OGL games, unlike DX games. So ATI's implementation isn't all cookies all the time. They should move to Nvidia's method.
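A quick sketch of what the 4/6/8-bit filtering numbers mean: the blend fraction used when mixing texels or mip levels gets quantized to 2^N levels, so fewer bits means larger worst-case error and visible banding in slow gradients. Illustrative Python only; real pipelines differ in where they quantize:

```python
# N-bit blend-weight precision: the 0..1 blend fraction is snapped to a
# grid of 2^N levels. Fewer levels -> larger worst-case error -> banding.
def quantize(frac: float, bits: int) -> float:
    steps = (1 << bits) - 1
    return round(frac * steps) / steps

for bits in (4, 6, 8):
    worst = max(abs(f / 1000 - quantize(f / 1000, bits)) for f in range(1001))
    print(f"{bits}-bit weights: {1 << bits} levels, "
          f"worst-case blend error ~{worst:.4f} (of 0..1 range)")
```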
"And why would one want to turn it off? Again, I'm all for global optimizations, but let's not pretend that ATi doesn't do them."
I think you're going a bit too extreme here. Although weighted Manhattan sucks, both companies use it, and that means both are to blame for using it, so in an argument siding with one company over the other, this topic cancels out. I think the one you should be blaming here is Nvidia, for going the way ATI did instead of displaying optimal IQ.
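On the weighted-Manhattan point, here's a toy Python model of why that kind of axis-aligned footprint measurement makes AF angle-dependent. Real hardware is more elaborate; this only shows the shape of the problem:

```python
# Toy model of angle-dependent AF. The AF degree is driven by the ratio of
# the texel footprint's major and minor axes. Estimating the footprint from
# screen-axis-aligned extents (a Manhattan/bounding-box style approximation
# rather than the true rotated ellipse) collapses the ratio for diagonal
# footprints, so surfaces near 45 degrees get far fewer AF samples.
import math

L, m = 8.0, 1.0  # true major/minor footprint axes -> ideally 8x AF

for deg in range(0, 91, 15):
    t = math.radians(deg)
    # Project both footprint axes onto the screen axes (bounding box).
    ext_x = abs(L * math.cos(t)) + abs(m * -math.sin(t))
    ext_y = abs(L * math.sin(t)) + abs(m * math.cos(t))
    est_ratio = max(ext_x, ext_y) / min(ext_x, ext_y)
    print(f"{deg:3d} deg: true ratio 8.0, axis-aligned estimate "
          f"{est_ratio:.2f} -> ~{max(1, round(est_ratio))}x AF")
```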
"If we could turn it off you might well be able to."
Probably, but the difference isn't discernible during gameplay. Like LCDs vs. CRTs, you would need to have them side by side in order to tell, but I think even with adaptive filtering it will be hard to tell side by side.
"How do I shut off the ani hack ATi is using for DooM3? Or trylinear?"
Well, if you want to shut off the application detection in Doom3, you would turn Catalyst A.I. from Advanced to Standard, the default setting that doesn't use questionable optimizations, according to ATI. Trylinear should be able to be disabled by turning off Catalyst A.I.
Originally posted by: BenSkywalker
Anisotropic is not a DX spec; using your basis that lower quality is OK because it is within DX specifications, a company need only show slight improvement over base trilinear filtering to have acceptable IQ in terms of anisotropic.
No anisotropic would suck. Well, that explains why not all games come with those types of options.
Originally posted by: BenSkywalker
It is clearly visible in any 3D game that has texture maps. I could as easily say to you that the only people who would deny it are those looking to worship ATi. I can pull up tons of quotes of people talking about the additional "detail" in ATi's parts over nV's ("detail" that is actually loss of accuracy due to lower precision). It introduces aliasing, and lots of it.
I wouldn't say lots of it, but more than Nvidia's. The idea that aliasing looks like additional detail is because it provides a bit more definition to the image when it's not moving, and makes lines in the textures clearer to see when moving - because you can see jaggies on them. This is a loss in detail, and it's a crappy one.
Originally posted by: BenSkywalker
Originally posted by: VIAN
6 bit filtering is not an optimization, it is standard DX9 specs.
Anisotropic is not a DX spec. Using your basis that lower quality is OK because it is within DX specifications, a company need only show slight improvement over base trilinear filtering to have acceptable IQ in terms of anisotropic. You are sliding down a very slippery slope when you go the route you are supporting with that statement.

I thought that the MS Reference Rasterizer used for DirectX also used 6 bits of precision for aniso. I haven't confirmed that personally, but I read that somewhere. Although it may still be "standard" (or not), it is slightly unfortunate, because the purpose of AF is to reduce aliasing of the textures applied to polygon sides, and if aliasing is still visible to the user, then... well, it's not doing a very good job.

Originally posted by: BenSkywalker
"Otherwise, claims of "filtering tricks" when even the article points out that ATI's implementation matches Direct3D's reference rasterizer will only appeal to people looking for a reason to dog on ATI."
It is clearly visible in any 3D game that has texture maps. I could as easily say to you that the only people who would deny it are those looking to worship ATi. I can pull up tons of quotes of people talking about the additional "detail" in ATi's parts over nV's ("detail" that is actually loss of accuracy due to lower precision). It introduces aliasing, and lots of it.
I would be surprised if you were able to notice it in every game; it would seem to manifest itself worst on very large-area textures, much like the poor-quality linear texture mapping used on the PSX (the original PlayStation), which was later mitigated by the use of polygon sub-division in the 3D API libs.
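A footnote on the PSX comparison: PS1-era hardware interpolated texture coordinates linearly in screen space (affine) instead of perspective-correct, and subdividing polygons shrinks the visible warp. A small Python sketch with hypothetical endpoint depths:

```python
# PS1-era affine texturing: interpolating u linearly in screen space warps
# textures on polygons seen at an angle; splitting the span into smaller
# pieces (polygon subdivision) shrinks the error. Depths are hypothetical.
def affine_u(t, u0, u1):
    return u0 + t * (u1 - u0)

def perspective_u(t, u0, u1, z0, z1):
    # Interpolate u/z and 1/z linearly, then divide (perspective-correct).
    inv_z = (1 - t) / z0 + t / z1
    u_over_z = (1 - t) * u0 / z0 + t * u1 / z1
    return u_over_z / inv_z

u0, u1, z0, z1 = 0.0, 1.0, 1.0, 5.0  # near-to-far span

def worst_error(segments):
    # Affine interpolation done piecewise, with perspective-correct values
    # at the cut points -- which is exactly what subdividing buys you.
    err = 0.0
    for i in range(101):
        t = i / 100
        k = min(int(t * segments), segments - 1)
        t0, t1 = k / segments, (k + 1) / segments
        ua = affine_u((t - t0) / (t1 - t0),
                      perspective_u(t0, u0, u1, z0, z1),
                      perspective_u(t1, u0, u1, z0, z1))
        err = max(err, abs(ua - perspective_u(t, u0, u1, z0, z1)))
    return err

for n in (1, 2, 4, 8):
    print(f"{n} segment(s): worst u error {worst_error(n):.4f}")
```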
What we need is Ti4600 Anisotropic with 6800 Filtering and X800 optimizations (other than anisotropic).