mikelou, I think this is an interesting subject, despite your, er....interesting way of conveying your thoughts.
jpprod hit the nail on the head with why peeps are displeased with 3dfx.
1) They said "gamers don't need 32-bit or large textures"
2) They said "gamers don't need T&L"
3) They said "you should use glide"
Now then, were they right? Yes, in some cases. Unfortunately, peeps don't want to be TOLD what they "need". Just give it to them and let them choose for themselves.
The problem is that 3dfx has a bunch of idiots running their PR department (notice how head PR guy Brian Burke left 3dfx and is now working for nvidia...hmmmm....interesting....) and they have, in the past, been run by engineers who are brilliant at logic but thoroughly suck when it comes to business.
Realistically speaking, there were almost no games that supported 32-bit color at the time. Q3 was one of the first big games that had 32-bit support, and it ran slow as hell on a TNT2 Ultra. T&L is STILL just this side of worthless (unless you like 3dMark2000 Arena) and games that use T&L in the future will probably not run that great on today's T&L Hardware.
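To see why 32-bit was such a killer back then, here's a quick back-of-envelope framebuffer-traffic calc. The overdraw factor and z-buffer sizes are illustrative assumptions on my part, not measured numbers, but the 2x ratio is the point:

```python
# Rough sketch: memory traffic per frame at 1024x768, assuming one color
# write plus one z read and one z write per pixel touched, with an
# assumed average overdraw of 3x (typical-ish for a Q3-era shooter).

def frame_traffic_mb(width, height, color_bytes, z_bytes, overdraw=3.0):
    """Approximate framebuffer memory traffic per frame, in MB."""
    pixels_touched = width * height * overdraw
    # color write + z read + z write per pixel touched
    bytes_per_pixel = color_bytes + 2 * z_bytes
    return pixels_touched * bytes_per_pixel / (1024 * 1024)

# 16-bit color with a 16-bit z-buffer vs. 32-bit color with a 32-bit z-buffer
traffic_16 = frame_traffic_mb(1024, 768, color_bytes=2, z_bytes=2)
traffic_32 = frame_traffic_mb(1024, 768, color_bytes=4, z_bytes=4)

print(f"16-bit: {traffic_16:.1f} MB/frame")   # 13.5 MB/frame
print(f"32-bit: {traffic_32:.1f} MB/frame")   # 27.0 MB/frame
```

Double the per-frame traffic under these assumptions, so at 60 fps you're asking for twice the memory bandwidth, which cards of that era simply didn't have to spare. That's the real reason Q3 in 32-bit crawled on a TNT2 Ultra.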
Glide games look far better in 16-bit than any other games out there today, and can rival 32-bit in looks while blowing them away in speed and playability. A V3 can run UT in glide as well if not better than a present day GTS. Criticize glide if you want, but it was pretty great for its time.
Of course, they're not updating glide since 3dfx screwed up with the lawsuits over the glide wrappers (another story entirely).
The true problem with 3dfx is their failure to execute, and that they have tried to tell gamers what they "need" instead of providing EVERYTHING and letting gamers choose.
They have also rested on their laurels and have not progressed as well as they could've/should've/would've (were I in charge, heh...)
Their lack of a true OpenGL ICD makes their performance less than it should be. The fact that they are still shipping a .25u part is testament to their lack of execution. Think about it.
The 5500 SHOULD'VE been released at Christmas '99 to compete with the GeForce, which it would've blown out of the water. The VSA-200 SHOULD'VE been available for a "spring refresh" as a 200 MHz .18u part with trilinear + anisotropic + an extra texel pipeline.
But alas, due to their poor execution, it just didn't happen. I like to think that they are on an upswing here, but I'm not going to hold my breath. If Spectre (not Rampage, peeps) doesn't outclass the NV20, Radeon-II, and G800, then they'll not make it with me, or anyone else for that matter.
It's all about the execution. I think they've gotten past their "gamers don't need blah blah blah" BS. Let's hope they've gotten past their poor execution.