A few comments for the peanut gallery here (from a 64MB GTS owner):
I got my 64MB card from DecoY from here in the forums. Darn impressive card, actually. 220/400 without issue.
Doomguy:
"With the 6.18 drivers nvidia's opengl fsaa is just as good as 3dfx's and the new drivers help nvidia blow 3dfx away in speed. 3dfx is missing T&L, pixel shaders and the bump mapping that nvidia and ATI have. "
1) nvidia's FSAA is not in the same league as 3dfx's. Only a true nvidiot would think it is. I love my GTS, but let's get real here bud.
2) How many games are optimized for T&L? (MDK2)
3) How many games use per-pixel shading? (Evolva)
4) How many games use bump mapping? (Evolva)
wh00ps! A whole 2 games support these crucial, crucial features. And one of them is GHEY!!! So much for that pile of phloeey. Don't let the nvidia marketing machine suck you in. That stuff will be important NEXT YEAR, but not now. :-/
SilverBack:
"I play Tribes. The V5 loses over 30 frames a second to the GTS using opengl. "
Why in god's name would you play Tribes in OGL with a V5? Hell, why not play UT and Deus Ex in OGL while you're at it. *shakes head*
I can certainly agree with your assessment of the 5500's OGL drivers. THEY'RE LAME!
"The worst part is that the V5 also loses COMPLETE graphics. I mean like a whole weapon is a light blue or black. If you would like to see screen shots let me know. I have them. Also using the AA in D3D causes some very interesting problems also.
I have a screen capture where the V5 completely drops whole objects off the image map. DUH. "
Duh? You obviously had something wrong with your system since no one else on the planet has had those issues. I suppose that's cuz most people DON'T run Tribes on a V5 with OGL, they probably use glide.
DUH.
sharkeeper (supreme dolt) :
"Go here and download this demo:
http://www.nvidia.com/Products/demos.nsf/pages/044E9E8336A809108825694E00005666
Let us know how it runs on your woodoo5."
Yeah, that makes sense. Let's see how a V5 runs an nvidia demo. Why the F*** would anyone run an NVIDIA DEMO on a non-nvidia card? Were you beaten as a child? Perhaps you should've been. Please note, I say that with all the love and compassion in the world, of course.
NOS:
"I hear that 3dfx sales are way down. Evidence enough for me as to whom the best really is in the video card world! "
The 5500 was the best-selling card at retail. 3Dfx has owned the retail market for over a year. The V3-2000 and 3000 (AGP and PCI) owned last year's market. Their problem is lack of a worthwhile OEM product. Retail they're doing fine. OEM is what is killing them (they did sign a deal with Micron tho, we'll see how that pans out).
BenSkywalker:
"Quake3 has never, does not, and never will have support for DXTC. This reviewer clearly is utterly clueless on what he is trying to talk about. The issue he brings up involves S3TC, not DXTC which as of yet has very little, if any, support."
oh for God's sakes, it's the same damn thing, quit with your semantics argument. The Q3 engine makes full use of it. Straight up, the GeForce sucks for it. JUST ADMIT IT!! It's okay! I admitted it! I sold my 32MB GTS so I could get a 64MB one BECAUSE TC on the GTS sucks (that and the more overclockable RAM).
I disable TC on my 64MB GTS, what is the big goddamn deal? The GTS is fast enough to take the framerate hit, why are you whining about a FACT????
There is NO reason for nvidia to allow this to happen. WTF? They release 900 goddamn driver revisions a month (Derek Perez sez: "we don't leak drivers"...ha!), why the hell can't they fix this?
Doomguy:
"Also the 64 meg GF2 most likely suffers almost no performance drop with s3tc off because of its ram."
actually, not true, although I noticed the hit was far less substantial in large texture-type scenes (i.e. q3dm9) than on the 32MB version, which got its ass kicked. It still loses a good 10% of its framerate in "easy" demos (i.e. demo001) and about 20% of its framerate in quaver. The 32MB I had dropped to under 30 fps on quaver (ugh!!!), and hit as low as 15-20 in the harsh spots. Ugh....
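Since we're slinging percentages around, here's a quick back-of-the-envelope sketch of what those hits actually work out to. The percentages are my own ballpark observations from my own system, nothing official:

```python
def fps_after_hit(base_fps: float, hit_pct: float) -> float:
    """Framerate left over after losing hit_pct percent of it."""
    return base_fps * (1 - hit_pct / 100)

# "Easy" demo (demo001): roughly a 10% hit with TC disabled
easy = fps_after_hit(100.0, 10)    # ~90 fps

# Heavy demo (quaver): roughly a 20% hit with TC disabled
heavy = fps_after_hit(100.0, 20)   # ~80 fps

print(easy, heavy)
```

So if a board is only pulling ~40 fps in quaver to begin with, a 20% hit is what drags it down into the low 30s and worse in the nasty spots.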
Ben Skywalker:
"With a lowly GF DDR disabling texture compression running Quaver using the standard(by that I mean the accepted term) UHQ settings with all image enhancing game options on I'm hitting 39.8FPS at 1024x768 and 56.1FPS at 800x600. "
WTF? 40 fps in quaver with TC disabled on a DDR? nice. 6.18 I assume? I have a 64MB GTS and I get ~60. The 32MB board just plain died (tho it was using the 5.32). Yeah, don't forget to mention those nice teenage and low-20s framerates you get every time you enter the RL area. Ugh.
"The problem occurs on S3 boards, S3 has made statements about it and talks about the fact that developers should pre compress their texture to ensure that they will display properly"
well, it looks like those 3dfx characters who are "so far behind" seem to have figured out a way to get around that, and those ATi people who can't produce a decent driver have gotten around it also. Their TC doesn't suck. The GeForce's does. Just admit it. It sucks. I am an nvidia owner, and I admit it. Why can't you?
"The first "real" 3D machine was developed by a division of General Electric for NASA when they were working on docking drills for astronaut training."
oh for god's sakes man. WTF does this have to do with the price of tea in china? Are you on dope? Lordie...this has NOTHING to do with 3d-hardware accelerated computer games (and yes, I read your long-winded intro)
<rolls eyes>
3Dfx was the FIRST GRAPHICS COMPANY to bring hardware-accelerated 3d gaming to the PC in any large quantity. Yeah, yeah, the NV1 beat it (laff), as did the Verite (slow), but 3Dfx was the first company to put out a PC chipset graphics adapter that drastically increased the visual quality and speed of 3d games. Again, as an nvidia owner, I admit it. You should too. It'll put hair on your chest.
"With up to 16MB dedicated texture memory and 8MB dedicated frame buffer the StarFighter PCI was a very impressive all around board in its day."
the i740 was, is, and always will be a complete, utter, absolute piece of poop. Really now Benji-boy, remove your head from your hindquarters. It smells much better out here.
"If id had shipped Quake3 with precompressed textures then we wouldn't see the problems. "
and if nvidia could remove their heads outta their asses, we wouldn't see problems on the GeForces. nvidia needs to own up to this. Their competitors don't have problems, nvidia does. Get your head out of the sand man! It's YOUR card! You should DEMAND that nvidia fix this, because IT IS THEIR PROBLEM!!!
It's not Q3, or both 3dfx and ATi would have problems. Don't try to pass the buck. My graphics company needs to get on the ball and fix this poop. I've seen a gazillion "leaked" driver sets. Not one has even bothered to address this in any way, shape or form. In fact, most of them have been miniGL's for Quake3 performance.
"The visual quality I'm not sold on though, the V3 had horrible visual quality compared to most other offerings and yet was viewed as a contender on its own benches. "
The lack of large texture support really killed the visual quality of the V3. However, do NOT compare the V3's 3d-image quality to the 5500's. They are WORLDS apart. I'll agree tho, the V3's 3d image quality (in Quake3) was quite lame. UT-glide sho' looked nice tho. *laff*
"I agree that I would like to see sites post both numbers, but most people truly aren't going to care about the non compressed numbers."
I don't care about the *compressed* numbers, because they mean nothing to me. I find it ironic that so many GTS owners like to pimp the "drastic image quality improvement" of 32-bit over 16-bit, or trilinear/anisotropic over bilinear, yet many ignore how butt-ass ugly the TC is. Humorous....
BlvdKing:
"The Voodoo 3 had really good 16 bit color quality when the gamma was adjusted properly (so that everything didn't look washed out)."
to each his/her own, but I thought the V3 looked pretty crappy in Q3. even with r_gamma nice and low, the texture aliasing looked bloody hellish and blurry. blech. UT sure looked good in glide tho! *g*
"When running a GeForce 2 against another card with texture compression on, the benchmark is in essence an apples to oranges comparison since anyone can purposely lower visual quality to achieve higher frame rates. I would like to see visual quality taken into account when sites benchmark cards. The GeForce with texture compression off roughly compares to an ATI Radeon or Voodoo 5 with texture compression on or off in terms of visual quality "
agreed thoroughly in that case. I saw no need to turn off TC with the 5500, since the ugly-ass artifacting just wasn't there. With the GTS, that's the first line in my graphics.cfg:
r_ext_compress_textures 0
B-/
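For reference, here's the relevant chunk of my graphics.cfg. The first cvar is the one above; the rest are the stock Quake3 texture-quality cvars, and the values are just what I happen to run on the GTS (season to taste):

```
// graphics.cfg excerpt -- kill TC, keep image quality cranked
seta r_ext_compress_textures "0"              // disable texture compression (bye-bye banding)
seta r_texturebits "32"                       // 32-bit textures
seta r_colorbits "32"                         // 32-bit color
seta r_textureMode "GL_LINEAR_MIPMAP_LINEAR"  // trilinear filtering
```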
the o/c'ed 64MB GTS holds up nicely, I'm proud to say. In quaver, tho, it's *not* as fast as the 5500 (o/c'ed also) with TC enabled (again, the 6.18s don't agree with my system, so YMMV)
"The Voodoo 5 still may lose speed benchmarks against a GeForce/Geforce 2 with texture compression off) "
if you are stable with the 6.18 drivers, the 5500 isn't *quite* as fast as the GTS (overclocked). If you are stuck with the 5.32 drivers, the 5500 is faster (on my system, at least, both cards overclocked)
Snicklfritz:
"I have 32MB DDR, so I use the S3TC and the artifacts are not noticeable in actual gameplay."
ugh....are you serious? I assume you use gl_linear_mipmap_nearest and r_colorbits 16 as well? If you can't see the horrid TC dithering/banding/discoloration, then you sure as hell can't tell the difference between 16 and 32-bit, or bilinear/trilinear filtering.
you really can't see a difference? Gadzooks man....get some glasses.
MGallik:
"Yep RoboTECH, these forums have been becoming quite the nVidia fan site."
tell me about it. A quick read of the reviews shows that. I almost died when I noticed Anand mentioned how f***ed up the TC was with the GTS (all of 1/3 of a 20-or-so-page review, heh...) I was impressed.
"Say something against a GF and the jackals attack"
it's all about religion man. A graphics card choice is a deeply personal, religious experience for many, I suppose. I grabbed the GTS cuz it runs Q3 like mad. Big deal, eh? I'm going to replace it with a 5500 because the 5500 does several other things I like that the GTS doesn't do.
Hell man, I'd just be happy if I could get my frickin' GTS to run NFSU decently... :-/
"need help with your nVidia card and very few can offer any real help. What does that say? "
they're too busy comparing 3dMark2000 scores to be bothered, HA!!!!!!
(damn, I'm funny)
What I find hysterical is that 3 of my last 4 card purchases (and 5 of my last 7) have been nvidia cards. I presently own a 64MB GTS card, which replaced a 32MB GTS. I'm quite a fan of nvidia cards, and yet here I am ripping on the nvidiots. I feel like I'm watching an overprotective mother and her son....she just can't see that he can possibly be anything other than perfect.
What I REALLY love is when one of these schleps calls me a 3dfx fanboy, hehehe....just because I don't suck Derek Perez's d*** and bow at the altar of the GeForce. +laff+
I mean, I see on the 3dfx boards some pretty zealous peeps, but nvidiots really take the cake, LORDIE!
THEY'RE GRAPHICS CARDS PEOPLE!!! LIGHTEN UP!!!
oh, and VoodooSupreme? hehehe....you really are out of your league bro.