Gee, I'd almost forgotten about this thread and didn't get a subscription update on it. Well, let's get on with it ...
Ben,
"But Ben, subjectivity and accuracy don't go hand in hand."
...
Which do you like better, Quake3 or UT?
I like UT better, which is a subjective choice. So you're saying there's an accurate choice as to whether Quake3 or UT is better? If there were, the only "accuracy" involved would be how "accurately" they comply with subjective criteria.
Would it be of service in a hardware review to make that kind of choice between video cards? Perhaps we are getting to a point where that is needed, but then we need to look at the number of people offering their POV. Odds are that Anand will be handling the review of, say, the first NV20 board that is reviewed here. What if Anand's tastes match mine but not Michael's or yours? What if it is the other way around?
Yes, it would be of service. And like I've said repeatedly, a subjective judgment prefaced by "Maybe it's just us, but we found that ..." certainly qualifies any subsequent judgment as subjective. As for the number of people offering their POV, the more the better. If you have 30 POVs in video card reviews, chances are you'll find consensus rather than 30 totally different opinions.
"What are you defining as "serious inaccuracies" outside of the lack of color, and how would they impede the usage of the board? And don't forget, if there were color issues, they'd probably be fixed by changing FuseTalk's color scheme."
...
Massive dithering. My point in using this as an example is like looking at a title in which, say, the nV boards have a slight flaw because of some Dot3 usage...
Ummm ... my point was in response to your comment about FuseTalk. I don't believe FuseTalk uses Dot3.
Look around these boards for gripes about flickering textures in some games, or dropped textures, or Z errors. Now look for threads about color saturation. That is why I say precise image first, subjective second. The fact that those minor imperfections can become significantly amplified in future games is why I place such a high level of importance on rendering the scene properly now.
Well, let's look at the S3TC sky issue. I was over at GameBasement and read the threads on Wumpus' article. There were some comments about nVidia being fully compliant with the S3TC spec, and there's no reason to doubt this since nVidia actually licensed it from S3. But the broad consensus is that the Radeon, which "cheats" on the sky, looks better. As for color saturation, I wouldn't expect a lot of posts because most people don't notice until they see something better, like the Radeon.
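One often-cited piece of the S3TC story is that DXT1 stores its two endpoint colors as RGB565, so blue only gets 5 bits. As a rough, hypothetical illustration of why a smooth sky gradient can band under that format (the `to_565_blue` helper and the gradient values are invented for the example, and whether a card then interpolates at 16-bit or higher precision is exactly what the nVidia/Radeon debate was about):

```python
# Toy illustration: DXT1/S3TC endpoint colors are RGB565, so an 8-bit
# blue channel is quantized to 5 bits. A smooth strip of sky blues
# collapses to far fewer distinct shades -- visible as banding.

def to_565_blue(b8):
    """Quantize an 8-bit blue value to 5 bits, then expand back to 8 bits."""
    b5 = b8 >> 3                  # keep the top 5 bits
    return (b5 << 3) | (b5 >> 2)  # replicate high bits to fill 8 bits

gradient = list(range(100, 140))              # 40 smooth sky-blue shades
quantized = [to_565_blue(b) for b in gradient]

print(len(set(gradient)))    # 40 distinct input shades
print(len(set(quantized)))   # only 6 survive 5-bit quantization
```

This is only the endpoint-storage half of the picture, but it shows how little headroom a subtle gradient has once it passes through 16-bit color.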
Are developers supposed to plan on out of spec boards forever? Are we to never progress because some boards can't get things right?
I'm not a developer and I freely admit I don't know the answer to this, but my impression was that 3dfx's "severely" inaccurate rendering as exemplified by Sharky's test was not a major hardship on developers. And besides, at the rate things are going, soon you'll only have to deal with nVidia.
In real life, we rely on our eyes more than any other sense when it comes to accomplishing most tasks. Are developers supposed to ignore this and try to make games that are less immersive because of out-of-spec hardware?
Again, my point. We rely on our eyes, and there's more to quality than accuracy. I'd take a vibrant, sharp, well saturated image with 98% image accuracy over a dull image muddled by bad video card signal filters that's rendered with 100% accuracy. I'm not alone either, as exemplified by many happy V5500 owners.
And, BTW, developers do ignore this, because in my opinion their eyesight abruptly ends at the end of their noses at their OWN workstations. It's also why I've taken this discussion this far, because I believe you're representing the developers' viewpoint that "as long as it renders accurately on my system it's fine, and if it looks like crap on a user's system because of non-accuracy issues, that's their problem."

Don't believe me? Then if the "total visual experience," which DOES include more than image accuracy, is so important, why don't developers put test/adjustment tools in their games that let users adjust their gamma/contrast/brightness to be somewhere in the neighborhood of what the developer wanted the game's environment to be? Sort of like scanners' test images - here's a picture, adjust your gamma/brightness/contrast to match this image.

I became painfully aware of this a couple of years ago when I was playing a game (I think it was the original Turok) and was stuck for an hour. I finally turned to a walkthrough, which said to pick up something. I went to the location but couldn't see it. Then I reached for something under the monitor and accidentally moved the brightness control up. Guess what? The item showed up.
It isn't, though. I did not define what 3D image quality implies; it is an industry-standard term. MS has their standard for D3D, SGI and many others have theirs for OpenGL; it isn't new and predates 3D gaming by many years. If you are looking for a different definition, that is something else entirely, but the term has existed for many years now and has not been altered by the industry. Does this agree with the dictionary? Perhaps not, but my volume of Webster's doesn't have Duron in it either, and we all know what that means.
Of course I know what "Duron" means - you're talking about industrial floor coverings, right? Which is exactly what my Canadian cousin-in-law, who's a contractor, thought (with a puzzled look) I was talking about when he overheard me mention I had had a Duron. This is again exactly my point - you're talking about a definition that has a particular meaning to a specific population - developers. Most consumers don't even know the difference between SGI and an SDK.
An awful lot of consumers care a great deal; their levels of tolerance for imperfections are just a bit higher. How many posts have we seen on these and other boards over the years about rendering imperfections? Gamers, and gaming in general right now, do not demand the level of precision that Pro/E or the like does, but if the flaws are great enough, even your average run-of-the-mill gamer will gripe, sometimes quite loudly.
I'll certainly agree that consumers' level of tolerance for imperfections is high, particularly with poor 2D. But I've been talking about the level of "severe inaccuracy" that Sharky's XOR test showed. If you took a poll of V5500 owners and showed them the reference image (not the XOR image) and the V5500-rendered image, do you seriously think that 50% or more would immediately see the differences? Better yet, let them see the image test at low detail running at 60-100 FPS and see if they'd notice.
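Sharky's actual XOR test worked on real framebuffer captures; as a toy illustration of the idea, here's a sketch. The `xor_diff` helper and the 16-pixel "frames" are made up for the example:

```python
# XOR-style image comparison: XOR two framebuffers pixel by pixel, so
# identical pixels come out black (0) and any rendering difference
# lights up. The magnitude hints at how severe the inaccuracy is.

def xor_diff(frame_a, frame_b):
    """Return the per-pixel XOR image and the count of differing pixels."""
    diff = [a ^ b for a, b in zip(frame_a, frame_b)]
    return diff, sum(1 for d in diff if d != 0)

reference = [0x10, 0x80, 0xFF, 0x22] * 4  # reference rasterizer output
card      = [0x10, 0x81, 0xFF, 0x22] * 4  # card output, off in one channel

diff, changed = xor_diff(reference, card)
print(changed)     # 4 of 16 pixels differ
print(max(diff))   # but only by the lowest bit (value 1)
```

Which is the whole point: an XOR image can light up over differences a player at 60-100 FPS would never consciously notice.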
GameBasement does. We aren't a hardware site, but when various issues come up in games we do report on them and try to figure out workarounds.
I'll agree that GameBasement is a pretty good site, and I visit there a lot. And you're responsible for that 2nd UT CD actually being taken out of many people's jewel cases. But you're definitely in the minority, which goes back to audreymi's point for this thread (BTW, what WAS that point ...).
BTW, as evidence of my objectivity between the GeForce and Radeon, I ended up keeping the Elsa GeForce2 GTS - but that was only after I did the complete filter mod a few weeks back. The filter mod fixed the major issues I had with the GeForce, and the 2D is now sharp as a tack. The 3D also benefited from the mod - at defaults, the image is much sharper and better saturated, and, with a few gamma and contrast adjustments, is now almost indistinguishable from the Radeon. With image quality (my definition) a non-issue, the GeForce had it over the Radeon. Heck, I've even got AGP 4X, sidebanding, and fast writes enabled on the GeForce on an Abit KT7, for what it's worth. I was a little disappointed with the Radeon, though, because I got the impression that it's really choked by its drivers. What pretty much tipped it for me was that conference call with ATI's president, where he was asked when they're going to match nVidia, and his response was that they did it last July. Doesn't sound like they're highly motivated (or motivated enough) to me. Ultra, anyone?
PeAK,
Glad to see you here - I used to visit your site regularly when I had an Xpert 128.