Weyoun:
<< im sure the engineers at 3dfx have just a slightly better knowledge on the subject than us. i do wish, however, they'd follow public opinion a bit more often instead of dictating the rules (32bpp wont be needed, t&l doesnt matter now) and then getting their rules shoved up their ass when the public actually appreciates those things they left out. >>
I agree. Their marketing department is complete $hit. They do the company a disservice. Unfortunately, I think they have engineers doing all their work for them. They are entirely too logical.
Quick question...when the V3 came out, the TNT2 came out at about the same time. Two major games had support for 32-bit, but until the GeForce DDR came out, 32-bit in Q3 wasn't really very playable (unless you liked slow framerates). So the public didn't really *need* 32-bit at the time, as it was barely usable. However, 3dfx missed a cycle. Had they come out with a card w/32-bit support at the time of the GeForce, they would've been set.
WRT T&L, it will be an outstanding feature, but it makes a difference in only two games right now: MDK2 and Evolva, and only minimally in both. I do agree with you, tho. Don't say "well, this feature isn't all that great"; stick EVERYTHING on there. "Better to have and not need than to need and not have", even if that "need" is merely for marketing purposes. Ya dig?
Sunner, you make a very good point, and I know where you're coming from. The thing is, if you check the GeForce FAQ, there are huge sections dedicated to ironing out the bugs that each driver set creates. Look at it from my point of view: If a client has a GeForce card, and he asks me "what drivers should I use", I'm usually somewhat stumped. With damn near EVERY other piece of computer equipment out there, you can recommend "just get the latest". I can't do that with the GeForce cards. I gotta sit there and give a lengthy explanation, which includes instructions on where to download the older driver set to be ready to install in case the newer drivers don't work. Blah blah blah. That really shouldn't be necessary.
fodd3r: Saying that T&L sucks is just plain ridiculous. It's just not employed right now. Your comments make it very obvious what an ATi fan you are, and how much you dislike nvidia. The T&L unit doesn't suck. What sucks is that nvidia convinced everybody and their mother that there would be a bunch of T&L games right now. Like Chuck D used to say: "don't believe the hype".
jpprod: as always, the voice of reason. That is what I meant to say. They should've made the pixel pipelines multi-texturing. That would put them right alongside the GTS in everything. B-/ As it stands, we get to see just how bandwidth-limited the GTS is. Damn lucky thing for 3dfx that 250 MHz 256-bit QDR doesn't exist right now, or the vanilla GTS would be getting 100 fps @ 1600x1200x32 in Q3!!!
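Just to show why that hypothetical 250 MHz 256-bit QDR memory would be such a monster, here's a quick back-of-the-envelope bandwidth calculation (the 166 MHz 128-bit DDR figure for the stock GTS is from memory, so treat the exact numbers as approximate):

```python
def bandwidth_gbs(clock_mhz, bus_bits, transfers_per_clock):
    """Peak memory bandwidth in GB/s (1 GB = 1e9 bytes)."""
    bytes_per_transfer = bus_bits / 8
    return clock_mhz * 1e6 * bytes_per_transfer * transfers_per_clock / 1e9

# Hypothetical 250 MHz 256-bit QDR (4 transfers per clock):
print(bandwidth_gbs(250, 256, 4))   # 32.0 GB/s
# Stock GeForce2 GTS, roughly 166 MHz 128-bit DDR (2 transfers per clock):
print(bandwidth_gbs(166, 128, 2))   # ~5.3 GB/s
```

Call it roughly a 6x jump in raw bandwidth, which is why a bandwidth-starved GTS would suddenly fly at 1600x1200x32.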
Ben:
<< The only time that the CPU would need to get involved would be if you crashed and then only if the game supports locational damage(showing your fender dented) and even then it only needs to be utilized until the model is updated. That is it. >>
'That is it'??? Dude, that is IT!!! That is EXACTLY what we want to see happen! In racing games, that would just FUGGIN' own!!! Think about firing a rocket at a wall, and seeing it crumble! That would kick ass!!!
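The split Ben describes can be sketched in a few lines (all names here are hypothetical, and this assumes the usual setup where the GPU's T&L unit transforms a static mesh every frame, so the CPU only touches vertex data on a damage event and then re-uploads once):

```python
# Sketch: CPU only deforms the model on a crash; per-frame transform
# and lighting is the T&L unit's job, so a no-crash frame costs the
# CPU nothing. Hypothetical names, not any real engine's API.

class CarModel:
    def __init__(self, vertices):
        self.vertices = list(vertices)  # mirrors a GPU-side vertex buffer
        self.dirty = False              # needs re-upload after damage?

    def apply_damage(self, hit_index, dent_depth):
        # CPU-side work: push the hit vertex inward, flag a re-upload.
        x, y, z = self.vertices[hit_index]
        self.vertices[hit_index] = (x, y - dent_depth, z)
        self.dirty = True

def frame(model, crashed):
    cpu_vertex_work = 0
    if crashed:
        model.apply_damage(0, 0.2)  # one-off CPU cost for the dent
        cpu_vertex_work += 1
    if model.dirty:
        # upload_to_gpu(model.vertices)  # one-time transfer, then done
        model.dirty = False
    # Transform & lighting of the whole mesh happens on the GPU here.
    return cpu_vertex_work

m = CarModel([(0.0, 1.0, 0.0), (1.0, 1.0, 0.0)])
print(frame(m, crashed=False))  # 0 -- CPU idle, T&L carries the frame
print(frame(m, crashed=True))   # 1 -- CPU dents the fender once
```

That's the whole appeal: the geometry work the CPU used to do every single frame collapses to a one-time update when something actually changes.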
Weyoun: as far as implementing a poll goes, this site here is pretty heavily nvidia-based. Not that that's a bad thing, but if I say a "bad thing" about my GTS (usually just the truth), I usually get my head bitten off by several peeps. In other words, we'd need more than just one forum to really do that type of thing justice.
As it stands, a quick perusal of the GeForce FAQ is all that is needed to confirm what several peeps here have been saying all along: The GeForce cards aren't as stable as they should be, due to platform problems, hardware compatibility problems, driver problems, game engine problems, whatever. There are MANY MANY documented issues with the GeForce cards. They are quite a finicky beast. When they run, BOY do they run!!! Unfortunately, they can be rather annoying, as well.
The nv20 looks (assuming the specs are real) to be quite a barnburner. It'll be wasted on me if they don't fix the issues they've had in the past (and present).