"I WANT TO BELIEVE": Nvidia's texture hardware X-file and AnandTech's silence


lsd

Golden Member
Sep 26, 2000


<< Ever notice in the graphics menu how you have to "accept" changes when you change texture or color depth, then everything blinks out and whatnot? That's what is happening; the video is restarting >>


Yes, I know; it did flash, and that's why I thought the settings were changed. Can someone confirm this for me?
 

Michael

Elite member
Nov 19, 1999
audreymi - Anand personally answered the root of your complaint, that AnandTech's reviews are biased and unfair, in a post a few months back in the General Hardware forum. Do a search; you should be able to find it. Someone was complaining that Anand was going to post an unfair (flattering) review of the Pentium 4.

I have seen Anand and his reviewers admit to errors and make corrections lots of times in the past. I do not think that the GF2 "sky" error is all that significant, and I think that it is a very good chipset. As far as I can tell, you have to make compromises in all chip designs. When comparing one chip to another (NVIDIA vs. ATi), even a layperson can find areas where one is stronger. In the end, it becomes subjective as to what is most important.

With NVIDIA buying the IP of 3dfx, they will be getting the 2D core that 3dfx designed. If their 2D really does need help, they'll have a fresh set of tools to work with very soon.

Michael
 

oldfart

Lifer
Dec 2, 1999
lsd, I ran the mpdemo1: 59.9 - 60.1 fps (1st pass - 2nd pass). The settings are 1024x768x32, everything max/high/on: marks on walls, floating scores, ejecting brass, yada yada. The "SHQ" settings, I guess. Not a super high score, but gameplay is smooth (what I consider smooth, anyway).
 

lsd

Golden Member
Sep 26, 2000


<< lsd, I ran the mpdemo1: 59.9 - 60.1 fps (1st pass - 2nd pass). The settings are 1024x768x32, everything max/high/on: marks on walls, floating scores, ejecting brass, yada yada. The "SHQ" settings, I guess. Not a super high score, but gameplay is smooth (what I consider smooth, anyway). >>


I'm getting about the same too, and that's almost a 40% drop in performance. How much of a drop is that for you?
 

WetWilly

Golden Member
Oct 13, 1999
Ben,

Wow, I get to answer before Robo? As an aside, I've got to admit that your and Robo's exchanges are usually pretty entertaining and sometimes informative along the way.

Image quality is ALL about accuracy. You may have a preference for an improperly rendered scene, but an improperly rendered scene is by definition of lower image quality (when speaking of 3D graphics, accuracy is how image quality is defined).

Boy, I've really got to disagree on this one. First, since most of us here aren't developers, I'll repeat what I said about the definitions of those two terms. The definition of "quality" we're talking about here is the "degree or grade of excellence" of the image. The definition of "accuracy" is "precision; exactness." They AREN'T the same thing - you can have excellence without precision, and you can have precision without excellence. Image quality is subjective; image accuracy isn't. Like you said, I may have a preference for an improperly rendered scene; I say it's got good quality/a higher degree of excellence than the properly rendered scene because of better color saturation, better aliasing, etc. You say it's got bad quality because it's imprecise. Who's wrong? Neither of us.

Second, let's take a hypothetical that's not too far removed from the real world. Say I've got two GeForce2 Ultra boards - a Canopus Spectra 8800 and one from NoNameOEM. NoNameOEM has found they can save some money by going REAL cheap on the video signal filter components on the card. The Canopus, on the other hand, has a Dual Filter System on the signal (believe it's jumper adjustable as well) and a SuperSignalHighway BNC board attached which replaces the HD15 VGA connector with 5 BNC connectors. But to make things even, we'll attach a VGA-to-BNC cable to the NoNameOEM card. Now to test image "quality", we'll run Sharky's reference image test at 1600x1200. So you're saying that because both boards render the reference image identically - and accurately, using Sharky's results - that their image quality is identical? I'd consider the argument if you were judging chipsets, but we're not talking about chipsets. We're talking about boards - boards that have other components affecting the image besides the GPU/accelerator that, at least in the case of the GeForce, nVidia has relatively little control over.

Better yet, let's go one step further. Let's take the NoNameOEM card, disconnect the Red signal pin from the VGA connector, and run Sharky's XOR image quality test. Guess what? The Canopus and the NoNameOEM card with no red output signal have the same image quality.
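For readers unfamiliar with the XOR test being referenced, here is a minimal sketch of the idea in modern Python (the function names and the rows-of-(r, g, b)-tuples image format are hypothetical illustrations, not Sharky's actual tooling): XOR two framebuffer captures pixel by pixel, and any nonzero output pixel marks a rendering difference.

```python
def xor_diff(img_a, img_b):
    """Per-pixel XOR of two images given as rows of (r, g, b) tuples."""
    return [
        [tuple(ca ^ cb for ca, cb in zip(pa, pb)) for pa, pb in zip(row_a, row_b)]
        for row_a, row_b in zip(img_a, img_b)
    ]

def is_identical(img_a, img_b):
    """True when the XOR image is all zeros, i.e. the renders match exactly."""
    return all(
        c == 0 for row in xor_diff(img_a, img_b) for px in row for c in px
    )
```

Note that such a comparison only sees what the chip wrote to the framebuffer; everything downstream (filters, connectors, cables) never enters the test, which is exactly the objection being raised here.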

Radeon looks real sharp and the FSAA doesn't work for sh!t; they go together. They make a compromise to have a sharper image: increased aliasing. Take a flight sim or a racing sim and compare the aliasing to a GeForce series of boards, neither of them having FSAA enabled. The Radeon suffers quite a bit more noticeable aliasing than the GeForce boards.

This doesn't make sense. The Radeon is sharper than the GeForce at default settings. So you're saying that the Radeon has a sharper image because they use some sort of "hidden" aliasing at default that's disabled when you disable FSAA - which wasn't on in the first place?

The V5 gives the best compromise in this case, being able to select your own LOD settings (nVidia and ATi should take note) so you can decide.

Agreed.

Myself, I drop the gamma way down (~0.65) and raise the contrast up a decent amount and the brightness slightly for GF boards (all of them I have ever used). This gives you the color saturation that you see with the Radeon out of the box, at the cost of slight bleeding, which the ATi also has (take any screen shot and bump up the contrast; colors start to bleed along with getting brighter).

Let me get this straight. GeForce + gamma/contrast/brightness adjustment + slight color bleeding ~= Radeon without color bleeding. But on the bright side, if I take the good Radeon non-color bleed image and bump the contrast I can make it color bleed just like the GeForce?

And why the heck do I have to work so hard to get the GeForce to look good? The Matrox didn't require all this, the Radeon didn't, and neither did the V5500. If we apply the image quality/accuracy analogy to this, Matrox, ATI and 3dfx (RIP) are all doing it wrong and nVidia's doing it right?

Image quality comes down to accuracy. Everything else can be adjusted (as long as the tools are available).

If you can take that NoNameOEM board with a VGA HD15 connector and crappy filters from my first example and adjust it with software/monitor/gamma calibration to look exactly like the Canopus Spectra BNC at 1600x1200 or 1920x1440, I'd say you're a wizard.
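The gamma/contrast/brightness tweak being debated above can be approximated per color channel. This is a rough sketch in Python; the exact transfer curves used by the period drivers are an assumption, and the default values merely echo the numbers mentioned in the posts.

```python
def adjust_channel(value, gamma=0.65, contrast=1.2, brightness=5):
    """Map one 8-bit channel value through gamma, then contrast, then brightness."""
    v = 255.0 * (value / 255.0) ** (1.0 / gamma)  # gamma < 1.0 darkens midtones here
    v = (v - 128.0) * contrast + 128.0            # stretch contrast around mid-grey
    v += brightness                               # small brightness lift
    return max(0, min(255, round(v)))             # clamp back to the 8-bit range
```

The clamp at 255 is where the "bleeding" in the discussion comes from: once contrast pushes a nearly saturated channel past the 8-bit ceiling, detail in that region is lost.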
 

oldfart

Lifer
Dec 2, 1999
lsd....drop from what? The old 1.17 demo1? I got about 80 in that one, if that's what you are asking.
 

BenSkywalker

Diamond Member
Oct 9, 1999
WetWilly-

"As an aside, I've got to admit that your and Robo's exchanges are usually pretty entertaining and sometimes informative along the way."

D@mnit Robo, we are going to have to work on that, say we go back and forth with 3dfx svcksor and nVidia ownzz joo for a while

"I say it's got good quality/higher degree of excellence than the properly rendered scene because of better color saturation, better aliasing, etc. You say it's got bad quality because it's imprecise. Who's wrong? Neither of us."

If we are talking about 3D image quality, there is a long-running definition of exactly what it pertains to: accuracy compared to the native API's software rasterizer. Everything else is secondary. If I, as a developer, want to render, say, the Elephant Man's face, and card A does this while card B renders the face of a supermodel, it in no way whatsoever means that card B has better image quality; it is in fact inferior, and provably so (even if it is nicer to look at).

Image quality as a perception is one thing; dealing with 3D images, it is very easy to declare right and wrong. After we see a perfect rendering, then we can get into the subjective matters (color, brightness, etc.); until we see a perfectly rendered image, one is definitively inferior.

"This doesn't make sense. The Radeon is sharper than the GeForce at default settings. So you're saying that the Radeon has a sharper image because they use some sort of 'hidden' aliasing at default that's disabled when you disable FSAA - which wasn't on in the first place?"

Perhaps I didn't word this well. The Radeon uses a more aggressive LOD setting than it is supposed to. This is provable by comparing it to the reference API software rasterizer. The reason it looks sharper, and why you notice details that you didn't with another board, is that it uses a more aggressive, albeit out-of-spec, LOD setting. This more aggressive LOD causes more serious texture aliasing and contributes to the Radeon's "poor" FSAA when compared to nVidia's. Ever wonder why nV comes out ahead in pretty much every FSAA comparison even though ATi and nVidia are doing the exact same thing?
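The LOD mechanism under discussion can be sketched as standard mipmap level selection plus a bias term. This is a toy illustration in Python; the rounding and clamping details are assumptions for the sketch, not ATi's actual hardware behavior.

```python
import math

def mip_level(texels_per_pixel, lod_bias=0.0, max_level=10):
    """Pick a mip level from the texel-to-pixel footprint; negative bias = sharper."""
    # Standard LOD: log2 of how many texels cover one screen pixel, plus a bias.
    lod = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
    # Clamp to the available mip chain; level 0 is the full-resolution texture.
    return max(0, min(max_level, round(lod)))
```

With a footprint of 4 texels per pixel, a zero bias picks level 2, while a -1 bias picks the sharper level 1: more visible detail, but also more texture shimmer in motion, which is the aliasing trade-off described above.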

"Let me get this straight. GeForce + gamma/contrast/brightness adjustment + slight color bleeding ~= Radeon without color bleeding. But on the bright side, if I take the good Radeon non-color bleed image and bump the contrast I can make it color bleed just like the GeForce?"

The Radeon has color bleeding at default settings. Raising the contrast makes the problem more obvious for those with less sensitive eyes or those who don't spend a great deal of time looking for flaws in 3D graphics. Adjusting the GF series of boards, you get the same "vividness" and color saturation and also the same amount of bleeding. It is a tradeoff, one that I chose to make for gaming but wouldn't think of making for anything that was supposed to be used for consumption.

"And why the heck do I have to work so hard to get the GeForce to look good? The Matrox didn't require all this, the Radeon didn't, and neither did the V5500. If we apply the image quality/accuracy analogy to this, Matrox, ATI and 3dfx (RIP) are all doing it wrong and nVidia's doing it right?"

In 3D? nVidia is the market leader by a wide margin in this area amongst consumer boards; I'm not sure what you are saying. They are good enough for SGI's workstations, and that should be rather telling. If you want the GF to look as bright as a Radeon, yes, you need to tweak it. If you are talking about 2D, I have never said that nVidia was class-leading; it just isn't anywhere near as bad as most people make it out to be (as long as you are not on a Trinitron). Up to a certain resolution, you would be hard-pressed to tell a Matrox G400 apart from a GF2; where that differentiation happens varies greatly by which monitor you have and which particular video card (CL, Asus, etc.).
 

WetWilly

Golden Member
Oct 13, 1999
Ben,

D@mnit Robo, we are going to have to work on that, say we go back and forth with 3dfx svcksor and nVidia ownzz joo for a while

I haven't heard that exchange for a while, so now I'm satisfied

The accuracy compared to the native API software rasterizer.

This is the crux of our disagreement. Your perspective of quality is based on how accurately the chipset does what the developer asks it to do. I'm not a developer and I'm simply looking for the card that provides the best - i.e. sharpest, best color saturation - display on my monitor. I also fully realize, as I said in my post above, that my definition is subjective and yours isn't. And like Robo said, no review can tell me how a card is going to look in my personal system. But can the review give a general idea, like the Riva3D review of the Radeon? Yes.

I'll go back to my hypothetical - take two GeForce cards, remove the video out connector on one of them, write a script to render the reference image, XOR it against the actual reference image, output the XOR image to removable media, view it on another computer, and guess what? Under your definition, the disabled card's image quality is identical to the normal card's, even if the disabled card is physically incapable of outputting a video signal at all. Under your premise, ALL GeForce cards have or are capable of identical image quality, and that's simply not true. Image accuracy, yes. Image quality, as defined by pretty much all consumers not developers, no.

After we see a perfect rendering, then we can get into the subjective matters (color, brightness, etc.); until we see a perfectly rendered image, one is definitively inferior

Again this is a developer's viewpoint, not a consumer's. How do you see a perfect rendering? Do you only look at XOR'd images or do you look at actual card output? Going back to Sharky's test and the significant differences between the V5500 and the reference image, how many people could tell there were that many differences without seeing the XOR'd image? How many consumers took back their video card because "the display is sharp and well saturated, but it doesn't render those tent legs in 3DMark2000 well"? The operative word is "see," and most users will much more quickly see differences in color saturation and sharpness than even moderate deviances in rendering. Unless you're talking about the S3TC skies - then EVERYBODY notices it.

Adjusting the GF series of boards you get the same "vividness" and color saturation and also the same amount of bleeding.

But I don't on my system, and that's my point about the Radeon. I don't need to adjust contrast/brightness/gamma for the desktop and then do it again for games (except of course for in-game controls). And I'm not alone on that with the Radeon. Comparing the best I've been able to tweak the GeForce using my eyes (I also tried eColor) with the best I can get out of the Radeon, the Radeon simply looks better on my system.

"And why the heck do I have to work so hard to get the GeForce to look good?"

I admit I was being a bit facetious there. But I've done the things you've mentioned here and on other threads (which was how I found eColor) with gamma/contrast, and I've spent a LOT more time trying to find the optimal display for the GeForce than with the Radeon. Not only that, even the BIOS boot screens are noticeably whiter and sharper with the Radeon. It's the first thing I noticed when I booted with the Radeon before I even got to Windows. For the heck of it I tried to get the V5500 and GeForce to match the brightness and sharpness of the Radeon's BIOS boot screen text/graphic while maintaining the blackness of the background. Couldn't do it with brightness/contrast/color temperature. Do I spend a lot of time at the boot screen? No, but anecdotally, I'd say that ATI made color saturation a priority.

Like I've said in a previous post, I still haven't ruled out the Elsa. I prefer the Radeon, but I've had an Xpert128 and ATI's driver support history isn't terribly reassuring. In fact, if their 8000 series drivers aren't released within my return period, I may end up keeping the Elsa. So, despite what you might read into my reply to RobsTV, I don't really hate nVidia or the GeForce. I simply said that of all the cards I've tried in my system, 2D-wise nVidia ranked last, and I'm real picky about 2D. Doesn't mean it's bad, but it's not as good as the others. All I mentioned about 3D is that the ATI had better color saturation, which you've acknowledged here. It's just that every nVidia card I've used hasn't been as good as the competition image quality-wise (my definition): Riva 128 vs. Rendition V2200 vs. Voodoo Graphics, TNT2 vs. G400, GeForce2 vs. Radeon. I also don't think that the GeForce's 3D sucks. I thought their 32-bit looked great until I tried the Radeon. Now I'd rate it as very good.

I've had "discussions" with RobsTV because he seems to believe that nVidia's cards are apparently as close to perfection as you can get (e.g. my experience was unlikely/impossible or a bad card, ignoring the fact that a Creative MX had the worst 2D of all I'd tried on my system); worse, he apparently hasn't used either an ATI or Matrox card. Like one thread where someone who had a Banshee and wasn't a heavy gamer wanted to use their PC as a replacement DVD player with output to a TV. I recommended a $70-100 Matrox G450 DualHead with DVDMax; RobsTV suggested a GeForce2 MX with either a scan converter or TV-out. Sorry, ain't no comparison. Come on, DVDs through a scan converter?
 

RoboTECH

Platinum Member
Jun 16, 2000
Ben:

Image quality is ALL about accuracy.

in theory, yes. In practice, not necessarily.

Here are a few examples of theoretically inferior "inaccuracy" looking better than "accuracy":

1) 22-bit post filter
2) FSAA

another good example is me. If you want to improve the image quality of my ugly-ass mug, you don't want accuracy. The more inaccurate, the better.

The V5 gives the best compromise in this case, being able to select your own LOD settings

indeed, and we can thank Reverend from the Pulpit for that one.

Yes, I know; it did flash, and that's why I thought the settings were changed. Can someone confirm this for me?


er...I just told you. When you "accept" changes, the game is issuing the command

/vid_restart

also, Team Arena's maps and texturing are much heavier. That's why the 5500 tends to drop a bit *less* than a GTS.

WetWilly:

Wow, I get to answer before Robo?

I been busy ya bastage!! Mario Lemieux's comeback is driving this Pittsburgh Penguins hockey fan nuts!! WH00000000T!!!!!!!!!!

you can have precision without excellence.

a perfectly rendered mug shot of me, for example.

<Robo flexes his big green muscles>

YEAH BABY!


lsd:

what did you say there? I only see a "period" (.)

why does that happen sometimes?

Ben:

3dfx svcksor and nVidia ownzz joo

bah! Blow me!

nVidia ownzz jizzz, 3dfx svcksor me.....

or something......

If we are talking about 3D image quality there is a long running definition of exactly what it pertains to.

Ben, get your goddamn head out of the books and look at a flippin' screen. Does it look good or not? Do the colors look vibrant, or do they look a bit washed out? Ghosting? pixellation? Banding? etc....

The theoretical stuff is all well and good, but it just doesn't always apply to reality.

WetWilly:

Image accuracy, yes. Image quality, as defined by pretty much all consumers not developers, no.


there ya go. That's my point. Ben looks at things from a bookworm perspective, which doesn't always translate out to real life.

Kinda like some officers in the military. They tell me (a Sergeant) what they want to happen, and then they (if they're dense enough) tell me the 'book way to do things'. My job is to say "yessir", then do what actually WORKS.

"By the book" doesn't always get the best real-world results.

FSAA is quite inaccurate, as Ben loves to point out.

But guess what? It looks fantastic.
 

BenSkywalker

Diamond Member
Oct 9, 1999
WetWilly-

"But can the review give a general idea, like the Riva3D review of the Radeon? Yes."

No, they can't. When dealing with a subjective viewpoint you can't accurately say which looks better. If you have a preference for one being brighter, they can mention that. I did not notice anywhere in the Riva3D review any mention of the Radeon having the worst texture aliasing of any board, even though it does. Which do you like better? If someone likes bright colors a bit better than dull ones, but absolutely hates texture aliasing, he wouldn't have any idea about that issue by reading reviews that mention only the strong points. Perhaps if you could get several people to give their perspectives on subjective image quality in a review, that might mean something; having one person's views certainly isn't going to help in this area.

"Under your premise, ALL GeForce cards have or are capable of identical image quality, and that's simply not true. Image accuracy, yes. Image quality, as defined by pretty much all consumers not developers, no."

Going the other way, what if we had a card that displayed only images of bare naked supermodels all the time, no matter what your computer was trying to do? Under your definition that card would have better image quality. Clearly, both examples are absurd.

"But I don't on my system, and that's my point about the Radeon. I don't need to adjust contrast/brightness/gamma for the desktop and need to do it again for games (except of course for in-game controls)."

I leave it set one way unless I'm working, in which case I have a quick toggle enabled. Is it easier on the Radeon? Absolutely; again, I have never argued this point. For that matter, I don't recall the last time I recommended anyone buy a GF2 for gaming (unless they have ruled out the Radeon); I've been telling people to buy ATi for some time now (outside of Win2K).

Robo and WetWilly-

"The theoretical stuff is all well and good, but it just doesn't always apply to reality."

OK, I'll go into why accurate image representation is becoming increasingly important.

Take UBB compared to FuseTalk. For UBB, a monochrome monitor with a 512KB graphics card can pretty much display things just fine; of course it will be of lower quality image-wise, but the accuracy will be there. When you reach the point of having a 100% accurate image, then the subjective becomes more important.

Compare that to FuseTalk. For FuseTalk, you would need, I would think, at least a 2MB video card and a VGA monitor. Trying to display this forum on an old 512K board would result in serious image inaccuracies.

3D graphics are still in their infancy as far as real time is concerned, and while something may show up as a very minor imperfection in Quake1, it could be much, much worse in Quake3 and intolerable in Doom3.

Having things out of spec, such as the LOD, can take away certain elements of the game. Say developers want to force you to walk close to a wall to read something, and there is a trap door on the floor when you get close enough to read it. If you have a 21" monitor and are running 1600x1200 with the LOD offset enough, you may be able to avoid the trap and read the wall. This could detract from the experience the developer intended. Inaccurate image rendering can cause gameplay issues.

Looking at some current issues the Radeon has, look to cubic environment mapping. With the latest official drivers, the board has problems displaying them properly; this results in a very noticeably inferior image when compared to GeForce boards, even though the Radeon at default still has brighter colors. Pixel popping is another point. With nV boards, pixel popping is nearly nonexistent (though not entirely). One of the big strong points of FSAA that we heard from many 3dfx loyalists was that it rectified this issue, an issue that wasn't one for nV owners (a very rare occurrence). If the 3dfx boards did not have a problem with accurate image representation to begin with, this wouldn't have been a factor.

If we need to say that board X looks better if there aren't any major image glitches (dropped textures, corrupted textures, Z errors, pixel popping, dropped polys, etc.), then I have a very hard time using subjective measures to deem board X as having better image quality. When all the boards are rendering everything perfectly, then subjectivity becomes more important; but even then, it is still a subjective view and one that you need to trust the reviewer's tastes to match yours.

"FSAA is quite inaccurate, as Ben loves to point out.

But guess what? It looks fantastic."


And it corrupts the image. For some games this is fine, sims mainly. For FPSs it makes shooting someone at distance a b!tch. If you have to have a tradeoff, can it be considered an across-the-board enhancement?

MSAA combined with high-tap anisotropic filtering is superior to current FSAA, and you can use a software rasterizer to prove it. FSAA, as it exists on current boards, is a hack; a better way is coming, and very soon.
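The supersampling-style FSAA being contrasted with MSAA here amounts to rendering at a higher resolution and averaging blocks down. A grayscale toy sketch in Python (ordered-grid 2x2 downsample, purely illustrative):

```python
def downsample_2x(hi_res):
    """Average 2x2 blocks of a grayscale image (list of rows of 0-255 ints)."""
    out = []
    for y in range(0, len(hi_res) - 1, 2):
        row = []
        for x in range(0, len(hi_res[y]) - 1, 2):
            # Each output pixel is the mean of a 2x2 block of source samples.
            block = (hi_res[y][x] + hi_res[y][x + 1] +
                     hi_res[y + 1][x] + hi_res[y + 1][x + 1])
            row.append(block // 4)
        out.append(row)
    return out
```

The averaging smooths edge staircases (a hard black/white boundary becomes an intermediate grey) but equally softens legitimate high-frequency texture detail, which is the blur being complained about in the thread.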

I know you have commented several times, Robo, that perhaps the reason why V5 owners care more about FSAA is because they have the best FSAA. Have you considered that perhaps everyone who cares greatly about it bought a V5 in the first place, and the overwhelming majority of people truly don't care that much? If you don't play sims, how much does FSAA offer you?

The 22-bit post filter:

This certainly didn't make the V3 look better than the TNT2U. Does it help the V3/V4/V5? Of course, but would you trade the 22-bit post filter on your V5 for support for Dot3, EMBM, and trilinear filtering? I would jump at the chance. The filter is designed to make 16-bit look better, which it does. The problem isn't enhancing the IQ beyond default; the problem is getting the image right in the first place.

"Ben, get your goddamn head out of the books and look at a flippin' screen. Does it look good or not? Do the colors look vibrant, or do they look a bit washed out? Ghosting? Pixellation? Banding? etc...."

I would wager that I have spent more time looking at screens of 3D images concerning IQ than anyone else in this thread, quite possibly on this board. I was observing 3D IQ before the Voodoo1; my eyes are quite used to picking out imperfections. Perhaps that is why I care so much about the actual integrity of the image.
 

audreymi

Member
Nov 5, 2000
Day 5: O.K., I'll make it a bit easier. You guys could write about the improvement in sky textures in the newly released Quake Team Arena. Just be sure to state that texture compression defaults to off. Hardware with working, correct auto compression should be encouraged to turn it on.

1) Regarding BenSkywalker's comments:
Ben, I used to teach writing and debating in high school. In a debate you have to concede sometimes and pick your battles. In forums, this is even more true. Trust me on this one: we do not need a line-by-line rebuttal of both RoboTech's and Willy's posts. Summarize, reduce, and then comment. Some good points, though.

2) Abstinence makes the silence more deafening
Some have asked me to give up on this "Mexican standoff" and have asked me why I continue. I did look for Anand's statement on review policy, in the form of a link to a dead post. Does anyone have it handy? Very strange. The texture corruption problem was, for me, another example of bias or selective silence. I read the reviews and chose a Radeon, but came across Anand's version of a budget card review with conclusions completely at odds with Sharky's. The rationalization of his decision makes for an interesting read for the critical reader. My son used to ignore the facts when he said he didn't do it. There are some statements in there (see enclosed links) which are counter to his performance graphs.
Secondly, his conclusions place a lot of weight on 3DMark2000, which itself has open questions about lining up with real-world game performance. It was also written and tested on Nvidia chips, which at the time were the only T&L engines available for debugging their test software. I think it is premature to place a lot of weight on T&L support based upon 3DMark2000.

Now the story would normally end at this point were it not for the fact that readers were invited to comment on the article back to the author, Matt Witheiler. Many readers raised good questions about the article, but the only response from Matt was that he was busy, had to run, and would answer all our questions about image quality and the data in his article that ran counter to the statements in his conclusions.

The feedback post was yanked from the AnandTech Articles forum 13 days later without another response to any of the comments. Boys, silence is not golden; it only makes you run to the bathroom more often. Please respond to your readership.

P.S. Please try out the Win2K drivers that are now within 10% of Win98/ME's performance. Could you also explain why Win2K has favoured son status amongst MS's various offerings? Would this have altered your conclusions?
 

BenSkywalker

Diamond Member
Oct 9, 1999
"Summarize, reduce and then comment."

I am summarizing.

In a debating class you are dealing, mainly, with subjective material. Computers, in terms of 3D calculations, are not subjective. They deal with 0s and 1s and nothing else. I also have been avoiding breaking down every sentence in each post which I am known to do from time to time.

I would say that one of the biggest problems with society in general is the search for the soundbite. A complex issue demands a complex explanation; isn't that what you are requesting?

I could get much longer-winded in my posts; I try to be brief and leave holes to prolong the discussion. That is intentional. If you poke through the GH or Video forums, I'm sure you will dig up a couple of 300+ post arguments that I've been involved in, which in some cases spilled over into multiple different forums, and I, and the others involved, were *not* trying to be brief in those (if you're interested, be prepared to be reading for many hours; there are some real good ones on FSAA, with the average-sized post being a bit longer than the longest in this thread).

"In a debate you have to concede sometimes and pick your battles."

On a subjective matter I will. On a factual issue I will not. This forum, while it may seem to be confrontational at times, is one mainly of education. It is not in the best interest of the body of users to leave out important elements of the discussion.

Trying to keep things as brief as possible would be ignoring it completely, and that wouldn't do anyone any good. This issue is not simplistic in nature; we are talking about hardware that executes millions (very conservatively; current GF boards peak at 50 billion or more) of operations per second, and it can be broken down to exactly what is causing it. If you don't think that to be the case, get a hex editor and I can walk you through how to fix it. The problem with that is that it is only a workaround for the condition; our discussion started on the core issue and has since expanded into broader image quality issues.
 

audreymi

Member
Nov 5, 2000

Ben, I rest my case

This afternoon, I'll edit your reply above to the "concise" version. Not to be critical, but I have to admit you are fairly transparent, and one can pretty much size you up from reading just one post. BUT!!! I still suggest you try to adopt other demeanors from time to time. Cheers.

"Luke, simplicity is the key to the Force. We all feel misunderstood in our own time."
 

Michael

Elite member
Nov 19, 1999
audreymi - If you look to the top of your screen and the bottom of your screen, you will note that your "Mexican stand-off" is making Anand wealthier. I do not understand why Anand doesn't publish the principles he espoused in his P4 post on the messageboard as the AnandTech editorial policy. If it is the policy, I see no reason not to trust the reviews here (with the knowledge that they will make mistakes at times).

BenSkywalker - The way I look at it, 3D rendering is a trick to make a 2D scene (my CRT) look like it is in 3D. It isn't really 3D. You are right that the math behind the rendering can be tested and that accuracy is measurable. I bet your eyes are better trained as, if I remember correctly, you've been working as a 3D artist for quite a while. I still think that "image" quality today is more subjective than objective because current consumer 3D cards are still making trade-offs. It comes down to bandwidth - current cards are not able to render in real time what the professional software packages using rendering farms can put out (the Pixar putdown of NVIDIA is a good example of this discussion). It also comes down to game budgets. More detail means more money on art and artists.

That said, I still would trust your assessment of 3D image quality because my eyes are nowhere near as trained as yours and I do not have the knowledge or experience to understand everything that I am seeing. I also have owned 3dfx cards for a while (but now have 2 NVIDIA cards to try out as well), so my eyes are trained to see the improvements that the V2 -> V3 -> V5 progression have given. The NVIDIA defaults still do not look right to me, but I think that it is only because I am used to something else.

I also appreciate your longer replies. Since I do not have the same background, a short reply which may be best for someone with your experience does almost nothing to help me.

Michael
 

WetWilly

Golden Member
Oct 13, 1999
1,126
0
0
Ben,

No, they can't. When dealing with a subjective viewpoint you can't accurately say which looks better.

But Ben, subjectivity and accuracy don't go hand in hand, which is why I understand, up to a point, why reviews totally avoid these issues and stick to things like FPS and image accuracy, which they can measure with some degree of consistency, if not accuracy. And that's why my suggested comment was &quot;Maybe it's us but we found that ...&quot; This is also why it's next to impossible to find any good reviews on the quality of TV-output on video cards. Avoiding subjectivity may be an easy out, but sometimes it's a cop-out and a disservice to readers.

Going the other way, what if we had a card that only displayed images of bare naked supermodels all the time, no matter what your computer was trying to do?

Well, while all of those developers out there are busy discussing their horror over this card's significantly inaccurate rendering, I'd be one REAL HAPPY guy - to heck with what I was trying to do. But this actually proves my and Robo's point about image quality (my definition) and, for that matter, real life - the paradox that sometimes it takes something imperfect to make something perfect, and sometimes (like this instance) the definition of &quot;perfect&quot; is subjective. True, FSAA may blur the screen much to developers' horror, but you know what? I like it. After going from 2x FSAA on all the time on the V5500 to the first installs on the GeForce and Radeon with no FSAA on by default, the first thing I noticed was the aliasing (well, on the Radeon it was the second thing I noticed).

Trying to display this forum on an old 512K board would result in serious image inaccuracies

I'm not sure about this one. I changed my resolution to 640x480x256. The board was still usable, but with a heck of a lot more scrolling. If you were using a VGA monochrome monitor, those had 64 shades of grey as I recall, so I'm not sure that the image would cause the board to be unusable. Not as &quot;pretty,&quot; yes. What are you defining as &quot;serious inaccuracies&quot; outside of the lack of color, and how would they impede the usage of the board? And don't forget, if there were color issues, they'd probably be fixed by changing FuseTalk's color scheme.

Having things out of spec, such as the LOD, can take away certain elements of the game. Say developers want to force you to walk close to a wall to read something, and there is a trap door on the floor when you get close enough to read it. If you have a 21&quot; monitor and are running 1600x1200 with the LOD offset enough, you may be able to avoid the trap and read the wall

I dunno ... my personal solution to this one is to save often
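The LOD-offset trick in Ben's example boils down to biasing mipmap selection. A rough sketch of the idea in Python (generic math for illustration, not any specific driver's code; the clamp range is made up):

```python
import math

def mip_level(texels_per_pixel, lod_bias=0.0, max_level=10.0):
    # A renderer picks a mipmap level from how many texels fall under one
    # screen pixel. A negative LOD bias selects a sharper (lower) mip than
    # the footprint calls for, so distant textures stay readable farther
    # away than the developer intended -- e.g. reading the wall text
    # before you're close enough to spring the trap door.
    level = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
    return min(max(level, 0.0), max_level)

# In spec, 4 texels per pixel selects mip level 2; biased by -2 it stays
# at level 0, the sharpest mip.
```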

But guess what? It looks fantastic.&quot;
And corrupts the image.

...
perhaps that is why I care so much about the actual integrity of the image.

Again, this is exactly the crux of the disagreement. Your priorities: 1) image accuracy, 2) subjective attributes such as sharpness (however defined) and color saturation. Mine, and again I'd bet most consumers', are exactly the opposite. A good example of this: someone who had a PowerVR Kyro board posted a thread titled &quot;Wow! Kyro's Z buffer accuracy rocks!&quot; going on about Z-buffer accuracy, with two screenshots (now unavailable) from a Kyro and a V5500 to illustrate his point. The general consensus was either &quot;I don't see it&quot; or &quot;I see it but I don't care.&quot; I had seen Sharky's XOR test back when it was initially published. I personally considered it an intellectual curiosity, but wouldn't even think of including it in the decision-making process. And if a developer actually wrote/published a game in which gameplay was affected or altered by differences in image accuracy even as &quot;significant&quot; as those in Sharky's 3dfx/XOR test, then I'd bet that game would be on a fast track to Daikatana City, aka the Bargain Bin.

...

On a subjective matter I will. On a factual issue I will not.

This *is* a subjective matter. We totally agree on the factual definition of &quot;image accuracy&quot;; we don't agree on the definition of &quot;image quality,&quot; because the definition itself is subjective. To developers it may be &quot;image accuracy,&quot; but to consumers it's definitely not. Heck, how many consumers would even know how to check image accuracy, let alone make it a part of a purchasing decision? Outside of Sharky's review and one test of image accuracy, how many other consumer/tech websites or publications, or any other sources for that matter, consistently run comparative accuracy tests for informational purposes?

Audreymi,

There's no problem with Ben's posts. I'll admit I'm guilty of it myself, because there are some posts, particularly outrageous ones, that beg for a line-by-line response. The only downside to long posts, I believe, is people's attention spans, because I'd say a fair number of them don't get read - ironically by the people who should probably read them. And I personally find a longer, thoughtful post preferable to the somewhat popular &quot;Oh Yeah? You suck!!!!&quot; response
 

audreymi

Member
Nov 5, 2000
66
0
0

Michael:

I do not get your comments about making Anandtech wealthier. I looked up and down and came up empty. As sites get bigger and the kids at Anandtech begin to buy cars and houses, court spouses and hire more employees, I think a standard editorial policy and a copy editor would be in everyone's best interest. Mistakes can be avoided. Review sites will eventually be just as susceptible to the whimsy of advertisers as other dot-com companies have found out.
I have not run across that Pentium 4 post that you mention; could you post a link?

Ben and Willy:

I take back my comments, after having read some of the counterarguments and having re-read your posts. I guess I was a little impatient and rolled my eyes when I looked at the length of the replies. Sometimes it does feel like I am being hit over the head with a 2x4 when you guys are making a point. I get it! I wish Anandtech would hire you two as a pair of ghost writers to write a series of articles on this. Heck, they might even let me read your copy before printing it

Gotta run, now, I hear somebody is desperately seeking me
 

audreymi

Member
Nov 5, 2000
66
0
0

Day 6: All I need is water, bread, and a PC. I am all linked out today
so I'll just say Happy New Year and I'll toast to more even handed reviews.

 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Michael-

&quot;The way I look at it, 3D rendering is a trick to make a 2D scene (my CRT) look like it is in 3D.&quot;

A good way to think of it is your video card takes a &quot;picture&quot; of a 3D scene to display on your monitor. Accuracy is about taking the right picture, objective IQ is about the quality of the picture(film speed, exposure etc).
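Ben's camera analogy maps to a couple of lines of math: the &quot;picture-taking&quot; step is a perspective projection from 3D camera space onto the 2D screen. A minimal sketch (illustrative only; the focal length and screen size are made-up parameters):

```python
def project(point, focal=1.0, width=640, height=480):
    # Perspective projection: divide by depth so farther objects land
    # closer to the screen centre, then map into pixel coordinates.
    # "Accuracy" is whether every point lands where the math says it
    # should; subjective quality is everything about how the resulting
    # picture looks.
    x, y, z = point
    sx = (focal * x / z) * (width / 2) + width / 2
    sy = (-focal * y / z) * (height / 2) + height / 2
    return sx, sy

# A point straight ahead lands at screen centre regardless of distance:
# project((0.0, 0.0, 5.0)) -> (320.0, 240.0)
```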

WetWilly

&quot;But Ben, subjectivity and accuracy don't go hand in hand.&quot;

Which do you like better, Quake3 or UT? Would it be of service in a hardware review to make that kind of choice between video cards? Perhaps we are getting to a point where that is needed, but then we need to look at the number of people offering their POV. Odds are that Anand will be handling the review of, say, the first NV20 board that is reviewed here; what if Anand's tastes match mine but not Michael's or yours? What if it is the other way around?

&quot;True, FSAA may blur the screen much to developers' horror, but you know what? I like it. After going from 2x FSAA on all the time on the V5500 to the first installs on the GeForce and Radeon with no FSAA on by default, the first thing I noticed was the aliasing (well on the Radeon it was the second thing I noticed).&quot;

FSAA as it exists now is dead. This generation of boards will be the last to use it, in no small part because of its inaccuracy. MSAA with high-tap anisotropic filtering is replacing it, and will be superior in terms of end results. For this particular aspect it is entirely subjective, as FSAA is a new feature and one that is pretty much software-transparent (a few apps do support it in some capacity). On this, I agree that the V5 does it best, but again that is subjective. You lose the most detail with the V5; most people, for most games, truly don't care enough about it to make it a purchasing decision, so how much emphasis should be placed on it? This is an example of where subjectivity can become quite complex. If a reviewer spends a great deal of time discussing it, even if they conclude that they don't care for it enough to make it worthwhile, the amount of time spent on the subject gives the implication that it is of great importance.
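For reference, the supersampling flavor of FSAA being discussed amounts to rendering at a higher resolution and averaging down. A toy sketch of the downsample step (not any vendor's actual filter; the 2x2 box filter is an assumption for illustration):

```python
def downsample_2x(img):
    # Ordered-grid supersampling: render at 2x the target resolution,
    # then box-filter each 2x2 block of samples down to one output
    # pixel. The averaging smooths jagged edges -- and it is also why
    # textures lose a little sharpness compared to no AA at all.
    return [[(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
             for x in range(0, len(img[0]), 2)]
            for y in range(0, len(img), 2)]

# A hard black/white edge that splits a 2x2 sample block comes out as an
# intermediate grey in the final pixel:
edge = [[0, 255, 255, 255],
        [0, 255, 255, 255]]
```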

&quot;What are you defining as &quot;serious inaccuracies&quot; outside of the lack of color, and how would they impede the usage of the board? And don't forget, if there were color issues, they'd probably be fixed by changing FuseTalk's color scheme.&quot;

Massive dithering. My point in using this as an example is like looking at a title in which, say, the nV boards have a slight flaw because of some Dot3 usage. In a game that uses the technique sparingly, it is a minor imperfection. Fire up Giants and everything in the game would be corrupted. That little imperfection seen in game A becomes serious, overwhelming corruption in game B.

&quot;Again, this is exactly crux of the disagreement: Your priorities: 1) image accuracy, 2) subjective attributes such as sharpness (however defined), color saturation. Mine, and again I'd bet most consumers', are exactly the opposite.&quot;

You honestly think so? Look around these boards for gripes about flickering textures in some games, or dropped textures, or Z errors. Now look for threads about color saturation. That is why I say precise image first, subjective second. The fact that those minor imperfections can become significantly amplified in future games is why I place such a high level of importance on rendering the scene properly now.

&quot;Good example of this - someone who had a PowerVR Kyro board posted a thread titled &quot;Wow! Kyro's Z buffer accuracy rocks!&quot; going forth about Z-buffer accuracy and had two screenshots (now unavailable) from a Kyro and a V5500 to illustrate his point.&quot;

Another 3D artist spotted the flaw. I fully understand that most people won't care about such a minor glitch, but when you increase the levels of geometric complexity, that little glitch can become serious image corruption.

&quot;And if a developer actually wrote/published a game in which gameplay was affected or altered by differences in image accuracy even as &quot;significant&quot; as those in Sharky's 3dfx/XOR test, then I'd bet that game would be on a fast track to Daikatana City, aka the Bargain Bin.&quot;

Why should it, though? Are developers supposed to plan on out-of-spec boards forever? Are we never to progress because some boards can't get things right? In real life we rely on our eyes more than any other sense when it comes to accomplishing most tasks; are developers supposed to ignore this and make games that are less immersive because of out-of-spec hardware? I have much higher expectations for gaming in general. I want to see the level of immersiveness grow significantly, and that requires increasingly high levels of reliance on precision in graphics, and in audio and controller technology as well, but perhaps my hopes are too high.

&quot;This is a subjective matter. We totally agree on the factual definition of &quot;image accuracy;&quot; we don't agree on the definition of &quot;image quality,&quot; because the definition itself is subjective.&quot;

It isn't, though. I did not define what 3D image quality means; it is an industry-standard term. MS has their standard for D3D, SGI and many others have theirs for OpenGL; it isn't new, and it predates 3D gaming by many years. If you are looking for a different definition, that is something else entirely, but the term has existed for many years now and has not been altered by the industry. Does this agree with the dictionary? Perhaps not, but my volume of Webster's doesn't have Duron in it either, and we all know what that means

&quot;To developers it may be &quot;image accuracy,&quot; but to consumers it's definitely not. Heck, how many consumers would even know how to check image accuracy, let alone make it a part of a purchasing decision?&quot;

An awful lot of consumers care a great deal; their tolerance for imperfections is just a bit higher. How many posts have we seen on these and other boards over the years about rendering imperfections? Gamers, and gaming in general right now, do not demand the level of precision that ProE or the like do, but if the flaws are great enough, even your average run-of-the-mill gamer will gripe, sometimes quite loudly.

&quot;Outside of Sharky's review and one test of image accuracy, how many other consumer/tech websites or publications, or any other sources for that matter, consistently run comparative accuracy tests for informational purposes?&quot;

GameBasement does. We aren't a hardware site, but when various issues come up in games we do report on them and try to figure out workarounds. As for other sources: lots of them, though mainly high-end 3D at the moment. As games continue to move forward, the level of precision required is getting increasingly higher, and I don't think this should be ignored.
 

PeAK

Member
Sep 25, 2000
183
0
0
This would be Day 11 of audreymi's post, I guess. There are a lot of neat points and links in this thread that I have bookmarked and borrowed for material over the last week. Thanks, audreymi.

I read Anand's Looking Back at 2000: The Graphics Industry and wonder how much web sites have to do with the turn of events in the year 2000. It was great to see strong products survive on their own merits despite some questionable points raised by some of the more influential sites. Best of luck to the rest of the clan at 3Dfx.

Radeon firmly attached.
 

WetWilly

Golden Member
Oct 13, 1999
1,126
0
0
Gee, I'd almost forgotten about this thread and didn't get a subscription update on it. Well, let's get on with it ...

Ben,

&quot;But Ben, subjectivity and accuracy don't go hand in hand.&quot;
...
Which do you like better, Quake3 or UT?


I like UT better, which is a subjective choice. So you're saying there's an accurate choice as to whether Quake3 or UT is better? If there were, the only &quot;accuracy&quot; involved would be how &quot;accurately&quot; they comply with subjective criteria.

Would it be of service in a hardware review to make that kind of choice between video cards? Perhaps we are getting to a point where that is needed, but then we need to look at the amount of people offering their POV. Odds are that Anand will be handling the review of say the first NV20 board that is reviewed here, what if Anand's taste match mine but not Michael's or yours? What if it is the other way around?

Yes, it would be of service. And like I've said repeatedly, a subjective judgment prefaced by &quot;Maybe it's just us, but we found that ...&quot; certainly qualifies any subsequent judgment as subjective. As for the number of people offering their POV, the more the better. If you have 30 POVs in video card reviews, chances are you'll find consensus rather than 30 totally different opinions.

&quot;What are you defining as &quot;serious inaccuracies&quot; outside of the lack of color, and how would they impede the usage of the board? And don't forget, if there were color issues, they'd probably be fixed by changing FuseTalk's color scheme.&quot;
...
Massive dithering. My point in using this as an example is like looking at a title in which, say, the nV boards have a slight flaw because of some Dot3 usage...


Ummm ... my point was in response to your comment about FuseTalk. I don't believe FuseTalk uses Dot3

Look around at these boards for gripes about flickering textures in some games, or dropped textures, or Z errors. Now look for threads about color saturation. That is why I say that a precise image first, subjective second. The fact that those minor imperfections can become significantly amplified in future games is why I place such a high level of importance on rendering the scene properly now.

Well, let's look at the S3TC sky issue. I was over at GameBasement and read the threads on Wumpus' article. There were some comments about nVidia being fully compliant with the S3TC spec, and there's no reason to doubt this since nVidia actually licensed it from S3. But the broad consensus is that the Radeon, which &quot;cheats&quot; on the sky, looks better. As for color saturation, I wouldn't expect a lot of posts, because most people don't notice until they see something better like the Radeon.
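The precision at the heart of the S3TC sky issue is easy to demonstrate: DXT1 blocks store their two endpoint colors in 16-bit RGB565, so a smooth gradient can band no matter how the texels between the endpoints are interpolated. A simplified sketch of just the endpoint quantization (not a full block codec; the sample colors are arbitrary):

```python
def to_rgb565(r, g, b):
    # Quantize 8-bit channels down to the 5/6/5 bits an S3TC/DXT1
    # endpoint color actually stores.
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def from_rgb565(c):
    # Expand back to 8 bits per channel by bit replication.
    r, g, b = (c >> 11) & 0x1F, (c >> 5) & 0x3F, c & 0x1F
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

# Sixteen nearby shades of sky blue collapse to just two after the 565
# round trip -- the kind of banding people saw in the Quake3 sky:
shades = {from_rgb565(to_rgb565(100, 150, b)) for b in range(200, 216)}
```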

Are developers supposed to plan on out of spec boards forever? Are we to never progress because some boards can't get things right?

I'm not a developer and I freely admit I don't know the answer to this, but my impression was that 3dfx's &quot;severely&quot; inaccurate rendering, as exemplified by Sharky's test, was not a major hardship on developers. And besides, at the rate things are going, soon you'll only have to deal with nVidia.

In real life, we rely on our eyes more then any other sense when it comes to accomplishing most tasks, are developers supposed to ignore this and try to make games that are less immersive because of out of spec hardware?

Again, my point. We rely on our eyes, and there's more to quality than accuracy. I'd take a vibrant, sharp, well saturated image with 98% image accuracy over a dull image muddled by bad video card signal filters that's rendered with 100% accuracy. I'm not alone either, as exemplified by many happy V5500 owners.

And, BTW, developers do ignore this, because in my opinion their eyesight abruptly ends at the end of their noses at their OWN workstations. It's also why I've taken this discussion this far, because I believe you're representing the developers' viewpoint of &quot;as long as it renders accurately on my system it's fine, and if it looks like crap on a user's system because of non-accuracy issues, that's their problem.&quot; Don't believe me? Then if the &quot;total visual experience&quot;, which DOES include more than image accuracy, is so important, why don't developers put test/adjustment tools in their games that let users adjust their gamma/contrast/brightness to be somewhere in the neighborhood of what the developer wanted the game's environment to be? Sort of like scanners' test images: here's a picture, adjust your gamma/brightness/contrast to match this image. I became painfully aware of this a couple of years ago when I was playing a game (I think it was the original Turok) and was stuck for an hour. I finally turned to a walkthrough, which said to pick up something. Went to the location, but couldn't see it. I reached for something under the monitor and accidentally moved the brightness control up. Guess what? The item showed up.

It isn't though. I did not define what 3D image quality means; it is an industry-standard term. MS has their standard for D3D, SGI and many others have theirs for OpenGL; it isn't new and predates 3D gaming by many years. If you are looking for a different definition, that is something else entirely, but the term has existed for many years now and has not been altered by the industry. Does this agree with the dictionary? Perhaps not, but my volume of Webster's doesn't have Duron in it either and we all know what that means

Of course I know what &quot;Duron&quot; means - you're talking about industrial floor coverings right? Which is exactly what my Canadian cousin-in-law who's a contractor thought (with a puzzled look) I was talking about when he overheard me mention I had had a Duron. This is again exactly my point - you're talking about a definition that has a particular meaning to a specific population - developers. Most consumers don't even know the difference between SGI and a SDK.

An awful lot of consumers care a great deal; their tolerance for imperfections is just a bit higher. How many posts have we seen on these and other boards over the years about rendering imperfections? Gamers, and gaming in general right now, do not demand the level of precision that ProE or the like do, but if the flaws are great enough, even your average run-of-the-mill gamer will gripe, sometimes quite loudly.

I'll certainly agree that consumers' level of tolerance for imperfections is high, particularly with poor 2D. But I've been talking about the level of &quot;severe inaccuracy&quot; that Sharky's XOR test showed. If you took a poll of V5500 owners and showed them the reference image (not the XOR image) and the V5500-rendered image, do you seriously think that 50% or more would immediately see the differences? Better yet, let them see the image test at low detail running at 60-100 FPS and see if they'd notice.

GameBasement does. We aren't a hardware site, but when various issues come up in games we do report on them and try to figure out workarounds

I'll agree that GameBasement is a pretty good site, and I visit there a lot. And you're responsible for that 2nd UT CD actually being taken out of many peoples' jewel case . But you're definitely in the minority, which goes back to audreymi's point for this thread (BTW, what WAS that point ... )

BTW, as evidence of my objectivity between the GeForce and Radeon, I ended up keeping the Elsa GeForce2 GTS - but that was only after I did the complete filter mod a few weeks back. The filter mod fixed the major issues I had with the GeForce, and the 2D is now sharp as a tack. The 3D also benefited from the mod - at defaults, the image is much sharper and better saturated, and, with a few gamma and contrast adjustments, is now almost indistinguishable from the Radeon. With image quality (my definition) a non-issue, the GeForce had it over the Radeon. Heck, I've even got AGP 4X, sidebanding and fast writes enabled on the GeForce on an Abit KT7, for what it's worth. I was a little disappointed with the Radeon, though, because I got the impression that it's really choked by its drivers. What pretty much tipped it for me was that conference call with ATI's president where he was asked when they're going to match nVidia, and his response was that they did it last July. Doesn't sound like they're highly motivated (or motivated enough) to me. Ultra anyone?

PeAK,

Glad to see you here - I used to visit your site regularly when I had an Xpert 128.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
WetWilly-

I had to reread some of this to get my train of thoughts back

&quot;I like UT better, which is a subjective choice. So you're saying there's an accurate choice as to whether Quake3 or UT is better? If there were, the only &quot;accuracy&quot; involved would be how &quot;accurately&quot; they comply with subjective criteria.&quot;

My point on image quality is that you need to rely either entirely on subjectivity, or on a defined standard for accuracy. I would much rather have a standard than rely on other people's subjective views. When we have boards that perfectly render all environments, subjectivity then becomes important. Would you call the Lexus LS400 a quality truck?

&quot;Yes, it would be of service. And like I've said repeatedly, a subjective judgment prefaced by &quot;Maybe it's just us, but we found that ...&quot; certainly qualifies any subsequent judgment as subjective. As for the amount of people offering their POV, the more the better. If you have 30 POVs in video card reviews, chances are you'll likely find consensus more than 30 totally different opinions.&quot;

And what should be used by the people to judge on? A torture test or a simplistic scene?

&quot;Ummm ... my point was in response to your comment about FuseTalk. I don't believe FuseTalk uses Dot3 &quot;

But it uses more than 8-bit color too

&quot;But the broad consensus is that the Radeon, which &quot;cheats&quot; on the sky looks better. As for color saturation, I wouldn't expect a lot of posts because most people don't notice until they see something better like the Radeon.&quot;

The better sky and lack of rainbowing in Quake3 turns into missing textures in UT. They used a hack to make things look good in one case and came up short in another. Which is a bigger problem, a lesser-quality sky or completely missing texture maps? Missing textures very clearly stand out without having to see another board.

&quot;but my impression was that 3dfx's &quot;severly&quot; inaccurate rendering as exemplified by Sharky's test was not a major hardship on developers. And besides, at the rate things are going, soon you'll only have to deal with nVidia.&quot;

Not now, but should it be tolerated? As for nV: GameCube supports OpenGL and runs on PPC hardware, and the ArtX acquisition could be a boost for ATi on the PC side as well.

&quot;We rely on our eyes, and there's more to quality than accuracy. I'd take a vibrant, sharp, well saturated image with 98% image accuracy over a dull image muddled by bad video card signal filters that's rendered with 100% accuracy. I'm not alone either, as exemplified by many happy V5500 owners.&quot;

And because developers know this, they are forced to plan on inaccurate images being displayed. If a developer can't count on a board being accurate, they can't make integral game portions reliant on precise image accuracy. That impacts all of PC gaming.

&quot;Then if the &quot;total visual experience&quot; which DOES include more than image accuracy is so important, then why don't developers put test/adjustment tools in their games that lets users adjust their gamma/contrast/brightness to be somewhere in the neighborhood of what the developer wanted the game's environment to be?&quot;

You want to deal with the hate mail from all the 3dfx owners? This isn't a joke. If half the Voodoo loyalists had any idea how bad their boards were, particularly for gamma at default, they would be filling developers' mailboxes with flames. I think it would be a good idea to do it myself; now, with 3dfx gone, perhaps we will see it.

&quot;I became painfully aware of this a couple of years ago when I was playing a game (think it was the original Turok) and was stuck for an hour. Finally turned to a walkthrough which said to pick up something. Went to the location, but couldn't see it. I reached for something under the monitor and accidentally moved the brightness control up. Guess what? The item showed up.&quot;

You weren't running a 3dfx board, I assume. This is an example of developers having to compensate for the flaws of particular hardware. Because of the extremely overbright settings on 3dfx boards, many developers adjusted their game settings. They do think ahead. By default, if memory serves, the D3D gamma on 3dfx cards is set to 1.3; try that on an nVidia or ATi board (seriously, take a look) and look at the screen (or even a 3dfx board on the desktop with that setting). Absolutely hideous. With 3dfx the main player in PC gaming back then, the D3D renderer was built to compensate for this. If you had been running in software mode, you would have seen its pixelated self no problem. This is actually a good example of what I have been saying, although in this case the game was designed to run on non-accurate hardware.
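The 1.3 default being discussed is easy to visualize with a standard gamma ramp (a generic sketch of gamma correction, not 3dfx's actual driver math):

```python
def apply_gamma(value, gamma):
    # Run an 8-bit channel through a display gamma ramp:
    # out = in^(1/gamma) in normalized 0..1 space. A gamma above 1.0
    # lifts mid-tones while leaving pure black and pure white fixed,
    # which is why a 1.3 default looks overbright on hardware (or a
    # desktop) that doesn't expect it -- and why dark hidden items
    # pop into view when brightness goes up.
    return round(255 * (value / 255) ** (1.0 / gamma))

# Mid-tones brighten noticeably at gamma 1.3; the endpoints do not move.
```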

&quot;This is again exactly my point - you're talking about a definition that has a particular meaning to a specific population - developers.&quot;

I'm not a developer. The definition also holds for 3D artists and CAD/MCAD pros, the people who pushed 3D forward in its infancy, long before gaming was a realistic objective for the technology.

&quot;Most consumers don't even know the difference between SGI and a SDK.&quot;

But if people start saying that SDK means &quot;soulful dog kids&quot; (no, I can't think of anything better at the moment), I will not agree with them. It won't matter if the mainstream decides to take that up as their definition; it has already been defined. The Duron issue you bring up is valid; in that instance I would have to cede to the preexisting definition. If I'm talking to a British naval enthusiast, I realize that a &quot;corvette&quot; is a ship; the car came later. The preexisting definition is not replaced because another one is more popular in the mainstream.

&quot;I'll certainly agree that consumers' level of tolerance for imperfections is high, particularly with poor 2D.&quot;

It is much, much higher in 3D than in 2D. If I typed &quot;gbvl upi&quot; and it showed up as &quot;fvck you&quot; I would be pretty ticked at whatever caused it, but this is what happens in 3D. The image fails to be rendered properly and people just smile.

&quot;If you took a poll of V5500 owners and showed them the reference image (not the XOR image) and the V5500-rendered image, do you seriously think that 50% or more would immediately see the differences? Better yet, let them see the image test at low detail running at 60-100 FPS and see if they'd notice.&quot;

Randomly select one hundred people off the street and show them the Mona Lisa and ten different artist-done copies and see how many can tell the difference. Then bring in a group of art professors and do the same. Then try the same groups just walking by. People who care a great deal about quality will likely have very little trouble pointing it out even when moving at 60-100 FPS; those who don't, won't.

&quot;I'll agree that GameBasement is a pretty good site, and I visit there a lot. And you're responsible for that 2nd UT CD actually being taken out of many peoples' jewel case . But you're definitely in the minority, which goes back to audreymi's point for this thread (BTW, what WAS that point ... )&quot;

Thanks for the kind words. I understand that we are in the minority on testing (and the amount we run and never report on is much larger than what is reported), but I think it should be up to the game sites to report on issues with particular games (and that was the original point: why AT didn't cover this).

Can you tell me of an across-the-board issue where one board wins hands down? Even in FSAA, the Radeon and GeForce retain more texture clarity than the V5, something that is thought to be completely in 3dfx's pocket.

Because we deal with games on a game-by-game basis, we can sit down, look long and hard at the graphics, and look at what issues there are. We deal with gaming first and foremost. For us, graphics display quality is very important. For Anand, well, a motherboard doesn't affect the sky in Quake3 any. My point in jumping into this thread is that it isn't the place of hardware sites to deal with every possible issue that could arise with any one game.

In all honesty, if the sky problem had shown up in NOLF instead of Quake3, would we have heard anything at all about it on any hardware site? I highly doubt it (and it is there in 16-bit; not sure when we will have that covered, although there are no problems at all in 32-bit, which means that if we hadn't been looking for problems by testing settings we wouldn't use, we wouldn't have seen it).

Do I think the hardware sites are trying to cover up the problems? No. The only place I have seen the NOLF sky issue discussed, that I can recall, is in email between myself and "wumpus", and it is likely one that end users will never see, as it only shows up in 16-bit color (which should remain the same for every board due to the compression used, though we haven't been able to test that yet).
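For readers wondering why a problem would show up in 16-bit color but not 32-bit: a smooth sky gradient simply has far fewer shades available at the lower bit depth, so banding appears. This is a hypothetical sketch (not anything from the NOLF renderer itself) showing how the common RGB565 16-bit format collapses a 256-step blue ramp down to 32 distinct shades:

```python
# Hypothetical sketch: why a smooth sky gradient bands in 16-bit color.
# RGB565 keeps 5 bits for red and blue and 6 for green, so a 256-step
# 8-bit blue ramp collapses to at most 32 distinct shades.

def quantize_565(r, g, b):
    """Truncate 8-bit channels to RGB565 bit depths, then expand back to 8-bit."""
    r5 = r >> 3          # 8 -> 5 bits
    g6 = g >> 2          # 8 -> 6 bits
    b5 = b >> 3          # 8 -> 5 bits
    # Expand back to 8 bits (replicating the high bits) for comparison.
    return (r5 << 3 | r5 >> 2, g6 << 2 | g6 >> 4, b5 << 3 | b5 >> 2)

# A smooth 8-bit blue ramp (256 values) survives with only 32 levels.
ramp = [quantize_565(0, 0, b)[2] for b in range(256)]
print(len(set(ramp)))   # 32 distinct blue shades instead of 256
```

With only 32 steps across a large, slowly changing area like a sky, each step spans a visible band; at 32-bit color the same gradient keeps all 256 steps per channel and stays smooth.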

"and, with a few gamma and contrast adjustments, is now almost indistinguishable from the Radeon."

You using a Trinitron monitor? What is your gamma set at (I really wish they had numbers for contrast/brightness as well)?
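The gamma adjustment being discussed is just a power-law remap of each color channel: values are normalized to 0-1 and raised to 1/gamma, which lifts midtones without moving black or white. A minimal sketch of that curve (the function name and default value are illustrative, not from any driver):

```python
# Hypothetical sketch of a gamma adjustment: each normalized channel
# value v is remapped to v ** (1.0 / gamma). With gamma > 1, midtones
# brighten while pure black (0) and pure white (255) stay fixed.

def apply_gamma(value_8bit, gamma=1.3):
    """Return an 8-bit channel value after a gamma-curve adjustment."""
    v = value_8bit / 255.0
    return round((v ** (1.0 / gamma)) * 255)

print(apply_gamma(0))     # 0   - black is unchanged
print(apply_gamma(128))   # midtone is lifted above 128
print(apply_gamma(255))   # 255 - white is unchanged
```

This is why a gamma tweak can make two cards' output look nearly identical: it shifts the perceived brightness of the midrange, which is where most of the visible difference between displays sits.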

"Heck, I've even got AGP4X, sidebanding and fast writes enabled on the GeForce on an Abit KT7, for what it's worth."

I do on my K7TPro2A too. In fact, I was curious as to what the he!! kind of problems everyone else was having; it worked with no problems right off for me. < shrugs >

"I was a little disappointed with the Radeon, though, because I got the impression that it's really choked by it's drivers."

Drivers and ATi... < rant > I have an 8MB All-In-Wonder Pro Rage Pro board from early '98 that I dropped over $300 on when new. This board was for a dual-boot system, NT/Win95 (then 98); the NT drivers, to this day, are absolutely horrible. I knew of the existing problems when I bought the card and figured they would fix them given enough time; they never did. They were more concerned with cheating on ZD's 3D WinBench than with making something the consumer could use :| :| < /rant >

"Ultra anyone?"

Don't forget the Pro. It does seem like they think they are leaders in the performance race when at best they are tied for third, or fourth if you are being a bit anal about a couple of FPS.
 