RoboTECH-
"agreed, it looks darn good. But again, why can't my GTS look *good* in Q3 with TC, since everyone else's card looks good with TC?"
This could be explained by business choices. Without the cross lawsuits and resultant cross licensing between nVidia and S3, we probably never would have seen S3TC enabled on any of the GF based boards. Because of the license issues, 3dfx developed their own method of texture compression, and while I'm not yet sure what ATi has done, it appears that they are at least using a modified version of S3TC, if they didn't develop their own compression method. It doesn't seem that nVidia ever had any plans on supporting S3TC under OpenGL, but since the hardware was already there (same as DXTC), and they obtained the cross licensing agreement, they enabled it.
Why the long-winded answer? Because it is possible that they simply can't "fix it". 3dfx developed a new solution from the ground up, and ATi also made design choices that enabled a different type, or at least a modified version, of S3TC. nVidia is using the hardware designed to handle DXTC for S3TC, and it doesn't handle on-the-fly compression well.
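To put the quality ceiling in perspective: an S3TC/DXT1 block packs a 4x4 group of texels into 64 bits, using just two 16-bit (5:6:5) endpoint colors plus a 2-bit index per texel that selects one of those endpoints or a blend of them. Here's a rough Python sketch of the decode side (helper names are mine, not anything out of a real driver):

```python
import struct

def rgb565_to_rgb888(c):
    # Expand a packed 16-bit 5:6:5 color to 8-bit-per-channel RGB.
    r = (c >> 11) & 0x1F
    g = (c >> 5) & 0x3F
    b = c & 0x1F
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

def decode_dxt1_block(block):
    # One DXT1 block = 64 bits: two RGB565 endpoints + 32 bits of 2-bit indices.
    c0, c1, bits = struct.unpack('<HHI', block)
    p0, p1 = rgb565_to_rgb888(c0), rgb565_to_rgb888(c1)
    if c0 > c1:
        # Four-color mode: endpoints plus two interpolated colors.
        palette = [p0, p1,
                   tuple((2 * a + b) // 3 for a, b in zip(p0, p1)),
                   tuple((a + 2 * b) // 3 for a, b in zip(p0, p1))]
    else:
        # Three-color mode: midpoint blend plus a transparent/black entry.
        palette = [p0, p1,
                   tuple((a + b) // 2 for a, b in zip(p0, p1)),
                   (0, 0, 0)]
    # Each of the 16 texels picks a palette entry via its 2-bit index.
    return [palette[(bits >> (2 * i)) & 3] for i in range(16)]
```

Since every texel in a block has to come from a four-entry palette derived from two 16-bit colors, the format throws away precision by design; how gracefully that loss shows up on screen then depends on how the hardware computes and filters those palette entries.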
"If we all just say "aw shucks, it's S3's fault, nvidia doesn't need to fix it", then THEY WON'T!!!
If enough people bitch, maybe they'll do something about it. Is that such a bad thing?"
But can they? That is what I'm interested in, and why I find it more important to look at what is causing it. If they can fix it, then I'll gladly start sending emails on it and start up threads griping about it; if they can't due to hardware design, then it won't do any good.
"Looks like ATi and 3Dfx have already done that. If nvidia is so damn revolutionary, why can't they fix this issue?"
3dfx pretty much had two choices: license S3TC or develop their own method for texture compression. They chose the latter and pulled it off with style, besting what was available in terms of visual quality. Without AGP texturing, having their boards in a situation where they are forced to swap from memory would be horrendous for performance, far worse than for anyone else. I'm not sure what ATi did, but like 3dfx they also deserve a big pat on the back for going above and beyond in improving on the standards that were available.
Looking at the time frame when the GF was launched, many people were saying quite loudly that 32MB was plenty of RAM and that more wouldn't be needed (ignoring TC). That was before Q3 final shipped, and the design was finalized before the Test version was out. Where should the R&D dollars go? In retrospect I think it is safe to say that more time devoted to R&D would have been well worth it when looking at TC, but at the time most people would have said it was a waste.
"yeah bro, but the 6.18's have been screwing with my system. They just don't stay stable long enough for me to be bothered. Blah."
Hmm, haven't had any problems myself, what kind of issues are you having?
"Christ dude, if we're going to go back that far, let's talk about the first transistors, eh?
The "truth" is that 3Dfx was the first company to bring fast, high quality 3d-hardware accelerated graphics to personal computer gaming."
If I recall properly, the first 3D hardware ran on vacuum tubes. Yes, 3dfx was a pioneer in PC 3D gaming, not arguing that. They weren't the first, but they were the first with a viable, worth-the-money option.
"if they fix the TC problem, then guess what? I won't have a reason to bash them, would I?"
If the problem is hardware related and the chip is properly following the S3TC standard, then it wasn't nVidia's "fault". It is easy to look back and say coulda, shoulda, woulda, but who knew at the time? If nVidia can fix it, then I'll join in your b!tching quite quickly; I have a 32MB card, I need it more than you do.
"I did. I'd rather see websites use the "real" numbers without TC tho. 16-bit w/TC off, even on the GTS, looks better than it's 32-bit w/TC on."
Agreed, but I think you and I both know that won't stop most people from ignoring the slower numbers and only paying attention to the fastest. When the 6.xx series drivers launched, that is one of the first things I looked for. Comparing the non-compressed numbers from the 6.xx to the compressed numbers from the 5.xx, they are fairly close.
"I didn't buy an S3. I bought an nvidia board. They need to fix it. ATi and 3Dfx both "fixed" it (by developing their own/modifying the S3TC). nvidia can release 900 driver leaks per month, why hasn't ANY of them addressed it?"
They may not be able to.
"that is an excellent point. However, I'm pretty certain that each and every nvidia employee would rather eat cow mucuous than license a technology from 3dfx. Agreed?"
Agreed, though I'm not sure if there is another way, or for that matter if they could implement it in their hardware. I'm not sure if the design is close enough to utilize the current nV hardware by simply releasing a driver revision.
BTW- I don't think you talk too much, I rather enjoy discussing things point by point in long winded threads