I WANT 3DFX BACK!!!!!!


Sunner

Elite Member
Oct 9, 1999
11,641
0
76
PEZ, the difference between Glide and T&L is that once Glide was open-sourced, it was pretty much dead, or at least dying quickly.
Until then it had been a proprietary API.

Do you really expect nVidia and ATI to start developing Glide drivers when Glide was quickly becoming a non-issue?

T&L, OTOH, is a feature that anyone who felt like it could include on their cards.
 

voodood

Member
Nov 3, 2000
64
0
0
Just my two cents... I always had been, and had hoped to remain, a Voodoo/3DFX/Glide fan and user.

Personally, I always thought that the Glide API rendered the best 3D image quality... obviously at 16-bit color. I'm also in the camp that believes that 16-bit color is more than sufficient, and that 32-bit color just isn't that big a deal IF it comes at the expense of frame rates.

IMO 3dfx screwed up when they stopped allowing third-party card makers to produce vid cards, which IMO is the same stupid mistake that Apple made (i.e. not allowing clone makers access to the technology to produce Apple clones). The only reasons Apple survived are that a) people were and are willing to pay a SIGNIFICANT premium to own Apple-built Macs, and b) Microsoft's money.

With the short life cycles of vid cards and the deep discounting that goes on early and often, 3dfx just couldn't produce enough cards at the right price to survive.

Will I miss 3dfx? No. But I will miss Glide. I STILL feel (and yes, this is purely subjective, but then again isn't that REALLY what consumers rely on when they buy radios, televisions, and sound systems?) that the graphics I see on my Voodoo3 are somehow more defined and deeper than what I see on my GeForce DDR GTS.
 

Wingznut

Elite Member
Dec 28, 1999
16,968
2
0
Sunner, not at all.

I'm just saying, why not review the hardware to the best of its ability?
 

gcliv

Banned
Oct 24, 2000
264
0
0
Fordlorider:

Who the heck pissed in your corn flakes?? If I didn't know any better I'd say you worked for nVidia! I agree with Wingznut PEZ, with this many replies, it's a valid topic.
 

*kjm

Platinum Member
Oct 11, 1999
2,222
6
81
"Do you really expect nVidia and ATI to start developing Glide drivers when Glide was quickly becomming a non issue."

I agree with you that the new games may not use Glide as much there are some games still coming out that will. I think that with Glide being open source nVidia and ATI should develop drivers. I know one of the best cards for Diablo2 is a 3DFX card. I have around 70 games at home and about 60 or more I can use Glide in. If I go out and buy a nVidia or ATI card for $250+ shouldn?t I expect all my games to run at their best or should I just toss the old games and go take out a loan and buy 70 new ones...something to think about.
 

Oyeve

Lifer
Oct 18, 1999
21,995
854
126


<< Personally, I always thought that the Glide API rendered the best 3D image quality... obviously at 16-bit color. I'm also in the camp that believes that 16-bit color is more than sufficient, and that 32-bit color just isn't that big a deal IF it comes at the expense of frame rates. >>




Most people today have at least a P3. Playing games in 32-bit should be no problem at all. UT in 32-bit blows away 16-bit. Same goes for Q3 and any recent decent game. 32-bit, at least for me (and I'm sure for a lot of others), is mandatory. 1600x1200, on the other hand, isn't necessary to me. Why they make games with this capability is beyond me. Even with a P4 I bet this would choke.
 

pidge

Banned
Oct 10, 1999
1,519
0
0
Does anyone here know 3dfx's technical support telephone number? Their website seems to be down so I can't look it up anymore.
 

Wingznut

Elite Member
Dec 28, 1999
16,968
2
0
"UT in 32-bit blows away 16-bit"

I haven't used an nVidia card on UT yet... But let me say that I can't tell the difference between 16-bit Glide and 32-bit D3D on my V5. I believe this has to do with 3dfx's "22-bit" color rendering filter.

It's actually pretty rare that I can tell the difference between 16-bit and 32-bit on my V5. And NO, it's not because the 32-bit sucks. (I can see the wheels turning in your head!)
 

borealiss

Senior member
Jun 23, 2000
913
0
0
3dfx had some really kickass products, but I never did buy any of their cards, for one reason: texture resolution. Textures were limited to something like 256x256, and some games (Counter-Strike, for instance) looked like sh!t because the textures were sampled down. Of course the Voodoo 4 and 5 series fixed this, but that was way after the TNT and TNT2 had the texture resolutions required to display the texturing properly. I'm not too disappointed that 3dfx went under. I do agree that 32-bit rendering was hype for a long time, but my main gripe with 3dfx's cards (Voodoo 2 and 3) is that image quality just bit.

The only problem I've ever had with my TNT and GeForce is with the FF7 and FF8 games, but nothing else, and that was an easy patch to fix. And nVidia was also nice enough to make their drivers exist peacefully on SMP systems, something that 3dfx never ever did, unless someone would like to enlighten me on this subject. Granted, it is sad that they are gone, but I just don't see anything they really offered over the competition except FSAA and really good 3D in the earlier generations of video cards.
 

Soccerman

Elite Member
Oct 9, 1999
6,378
0
0
<< I don't like companies telling me what I need. >>

So then you like companies that make you THINK that they care about what you want/need?

nVidia only produced cards with said features (T&L, 32-bit) because the competition didn't have them. Along with their great PR department, they managed to get everyone on the internet saying 3dfx sucks because they don't have this or that...

I think I understand DaveB3D's answers as to why nVidia's T&L sucks: it wasn't fully functional, in that you don't see the full benefit for a few reasons (none of which I know, hey, I don't design chips). Otherwise all games that support it should be running 10x faster!
 

GoldenTiger

Banned
Jan 14, 2001
2,594
0
0
Nvidia cards always lock up my systems, too... I am going to miss 3dfx big time; I always bought their newest cards as soon as they came out for both of my systems. I was looking forward to post-V5 cards... but it cannot be. The V5 has served me well since it came out, and I dunno what I'm going to do for a graphics card now... I share your pain, gcliv...
 

erikistired

Diamond Member
Sep 27, 2000
9,739
0
0
I thought I might chime in on this. I don't play games. I do make a living in the PC industry. I have a stack of nVidia cards at my office, cards I'll never be able to sell: TNT1 cards, TNT2 cards, Vantra (or Vanta? whatever) cards. Why can't I sell them? Not a single one of them is stable, and I don't want to screw my clients. I've used nVidia's drivers, Creative's drivers, etc. I'm not even talking stable in UT or Q3, but in WINDOWS. They hang when using IE, they hang when running ScanDisk, they hang all the time.

I LIKE nVidia, and I'd love to at least be able to use these cards for myself if nothing else (although eating the cost is not something I'm happy about). So when I see people say "don't whine and cry because you can't get your nVidia card to work," it makes me sick. I dunno, maybe I don't have the right drivers for IE loaded (although you'd think the ones that came with it would work), but honestly I'm sick of trying. I'm not stupid; I've been working with computers for 13 years now, and I've never seen such a problem with an entire line of cards. I was going to try one of the cards in my Linux box, but then I read that the new Linux drivers have issues too, what a surprise. Ahh well, maybe I'll glue them to the wall with my collection of original Sound Blaster cards and pre-386 motherboards.

~erik
 

erikistired

Diamond Member
Sep 27, 2000
9,739
0
0
<< I don't like companies telling me what I need. >>

You don't seem to mind Microsoft telling you it's okay to run a broken operating system.

Sorry.

~erik

By the way, I still have an nVidia 128ZX card of some sort (can't remember the chipset name!) running in a machine of mine, perfectly stable for all the web surfing I do.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Wingz-

"So what if the V5 won't have future support? The latest drivers are excellent, and are DX8 compatible. When do you suppose that the lack of support will be noticeable?"

Whistler.

"What nVidia did correctly was make their card perform really well in benchmarks (Q3 and 3DMark)."

3DMark2K and Quake3 came out after the original GeForce. Can anyone list a game that has no Glide support that runs faster on a 3dfx card than a like-generation nVidia board (TNT2/V3 on)? This is an honest question: are there any non-Glide-supporting games that are faster on a comparable-generation 3dfx part (TNT2 on)?

32-bit, Large Textures, Hardware T&L

What do they all have in common? They were all strongly beneficial in certain games, at least before 3dfx had a part out that supported them. One of them, large texture support, was a great disappointment to Valve, who would have utilized it in Half-Life had 3dfx had the support everyone else in the industry did. 3dfx took something away from what is definitely one of the best games ever to come to the PC by failing to support some of these "useless" features that "you don't want anyway" (the fvck I don't :| ).

Soccerman-

"I think I understand DaveB3D's answers as to why nVidia's T&L sucks: it wasn't fully functional, in that you don't see the full benefit for a few reasons (none of which I know, hey, I don't design chips). Otherwise all games that support it should be running 10x faster!"

nVidia's T&L is absolutely fully functional. I hope we will shortly be seeing benches released for a couple of games that are already out. The "reasons" (according to what the B3D crew has been saying) why you won't see the benefits come down to bandwidth, as in upping the resolution too much will eliminate the benefits of having hardware T&L and that is where people will play (not always true), and that the T&L engine is not flexible enough. On flexibility, you can argue that having a more flexible T&L engine will be a lot better, but if you optimize for current T&L (GeForce/Radeon) you can see a very large performance boost (% wise) even at 1600x1200, 32-bit, everything cranked, on the current standard for PC gaming graphics.

If Dave reads this he'll probably jump in, but games are out now that are displaying exactly what T&amp;L proponents have been saying it would since the beginning.
 

Wingznut

Elite Member
Dec 28, 1999
16,968
2
0
Long time, no see Ben! Now all we need is Robo and BFG, and we're all set!

Whistler.. good point. I guess it slipped my mind, since it doesn't interest me personally.

Those examples that I threw out weren't meant to justify anything. They were just things that I'd like to see the card utilize when doing a review. Don't hold anything back; review the card to the best of its ability.

Btw... HL sucks.
 

Soccerman

Elite Member
Oct 9, 1999
6,378
0
0
<< ...but games are out now that are displaying exactly what T&L proponents have been saying it would since the beginning. >>

I've seen a few (in picture format mainly), and they look quite nice; however, their polygon models aren't much more complex than, say, Quake 3's.

The fact that they utilize T&L is great, though, so I wouldn't have to worry (nearly) as much about my CPU bogging down the game, with games running so few polygons. (I don't think they're running that many more; take a look at FiringSquad's look at Independence War 2.)

It runs on DirectX 8 and it's pretty nice, but the increase in polygons compared to the 'previous generation' of games (like Quake 3) isn't anything more than double, more likely 1.5x. I wonder what kind of speed is guaranteed from a GeForce DDR?

Btw, these new games aren't entirely T&L dependent, right? You'd still see a difference between, say, a 400MHz and a 600MHz CPU, right?

If I could get a Kyro 2 to run that, I would! C'mon PowerVR, I'm counting on you!
 

DeViSoR

Senior member
Jun 2, 2000
277
0
0
Just want to say that I love my GeForce... never had a lockup with it. I must be lucky. My main game is UT, and I've tried Mech 4 for two weeks with no problem there. Well, anyway, it is sad that 3dfx is gone. I loved the Voodoo2 and hated the Banshee.

Heh heh... and not spotting the difference between 16-bit and 32-bit in 3D must be caused by a fault in your card.
 

Wingznut

Elite Member
Dec 28, 1999
16,968
2
0
I'm not saying that the 32-bit looks bad... On the contrary, I'm saying that the 16-bit looks good... Real good.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Ah, yet another 3dfx thread. My kind of reading!

I too can attest to 3dfx's greatness. My Voodoo 3 has been rock solid for the ~12 months I have been using it, and 3dfx's drivers are absolutely 5-star. I've never had any problems with any games, and each driver update gave me a small performance boost, which kept me very happy. Also, 3dfx's driver control panel rocks, and the ability to set gamma in each API is great.

Regarding nVidia, I'm disappointed with my experiences with their boards. I have experienced problems with every single Detonator driver I've tried, and both GF2 MXs that I've tried kept locking up in 3D games (in fact, I'm returning the second card tomorrow). Also, I have to say that nVidia's driver interface sucks and is really klutzy to use. 3dfx's interface is really good, and I really like the Radeon's interface with the cool blue buttons.

Sure you can blame my motherboard, CPU, RAM or anything else under the sun, but the fact remains that a lot of people are having a lot of problems with nVidia's boards and their drivers. Making excuses for them doesn't help anyone.

Regarding the 32-bit colour issue, I've just put my Voodoo 3 back in my system. Aside from the lockups disappearing, the thing I noticed immediately was how good my Voodoo 3 running in 16-bit mode looked in Quake 3 compared to the GF2 MX in 32-bit colour. The banding/pixelisation in the smoke/fog/blood/gibs was only very slight and looked almost as good as the MX in full 32-bit colour. A lot of people seriously underestimate how good this post-filter actually is.
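As a rough illustration of the idea behind a post-filter like this: blending adjacent dithered RGB565 pixels at scan-out recovers intermediate shades, which is why 16-bit banding can look so mild. The sketch below is a toy under stated assumptions (a plain two-pixel average with made-up function names), not 3dfx's actual kernel, which was more selective about when it blended.

```c
/* Purely illustrative sketch of a dither-reconstruction post-filter:
 * average each dithered RGB565 pixel with its right-hand neighbour at
 * scan-out so intermediate shades reappear and 16-bit banding softens.
 * NOT 3dfx's actual kernel; function names and the simple two-pixel
 * average are assumptions for illustration only.                      */
#include <stdint.h>

/* Expand a 5:6:5 pixel into 8-bit-per-channel components. */
static void rgb565_to_rgb888(uint16_t p, int *r, int *g, int *b)
{
    *r = ((p >> 11) & 0x1F) << 3;
    *g = ((p >> 5)  & 0x3F) << 2;
    *b = ( p        & 0x1F) << 3;
}

/* Filter one scanline: out[i] is a 0x00RRGGBB value whose channels are
 * the average of pixel i and pixel i+1 (the last pixel reuses itself). */
void filter_scanline(const uint16_t *in, uint32_t *out, int width)
{
    for (int i = 0; i < width; i++) {
        int r0, g0, b0, r1, g1, b1;
        rgb565_to_rgb888(in[i], &r0, &g0, &b0);
        rgb565_to_rgb888(in[(i + 1 < width) ? i + 1 : i], &r1, &g1, &b1);
        out[i] = ((uint32_t)((r0 + r1) / 2) << 16) |
                 ((uint32_t)((g0 + g1) / 2) <<  8) |
                  (uint32_t)((b0 + b1) / 2);
    }
}
```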

Also I agree with Wingznut PEZ's comments about 3DMark and Quake 3. Reviewers see nVidia at the top of the bar charts for those benchmarks and automatically give nVidia first prize for video cards. Sorry, but there's much more to it than that.

I too am saddened by 3dfx's downfall. My next card would have been a Voodoo 5, but now I really can't risk such a purchase when there's no future support. After trying two nVidia cards, my next video card will be a Radeon, but first I'll be watching ATi's driver releases very carefully. After trying ATi's products in the past, you'll understand why I'm a bit apprehensive about their driver support.

Oh and FordLorider, WTF is your problem? Did somebody speak out against the fatherland company that is nVidia? Go and cry about it somewhere else. It seems to me that you're the only troll in this thread.
 

borealiss

Senior member
Jun 23, 2000
913
0
0
Wow! I just read some of the posts that I skipped over (lazy, I know) and I didn't realize so many people had problems with nVidia-based cards. I've owned a TNT, TNT2, and GeForce. All have performed great, and I've never had any compatibility problems in SMP, uniprocessor, Intel, or AMD systems. I've built computers with two GeForce DDRs in them and they're still churning along with no problem with whatever my friends throw at them; one was a Duron and another a BP6-based machine. I've run them on NT, Win2k, Win98, 98SE, and ME, and never experienced any lockups that weren't due to overclocking. I'm guessing this is because I never try the new nVidia experimental drivers that come out on an hourly basis; I use only official ones provided by the card manufacturer. I know that I'm speculating here, but people mention that when they try to install nVidia's drivers, they don't work or cause lockups. Why don't you use the ones that are made by the card vendor? nVidia's drivers come with absolutely no support; they are not obligated to support a product they did not produce (the entire video card). I've had an absolutely horrendous experience with their drivers, which is why I don't use them.



<< I went out and bought a Geforce 2 MX. It was an ok price for a card bought locally and not on the net, $130. I brought it home and plugged it in, jumped on Nvidia's site and downloaded the most recent Detonator drivers. >>


Why didn't you use the ones that came with the video card? Or use the latest ones from the manufacturer's website? When nVidia provides a chip, not every company is going to bend to implement the reference design. There are going to be variants out there that might differ enough that a "one-driver" solution for a chip that runs on who knows how many PCB layouts won't work for all of them. And just because a company's card sucks doesn't mean the chip on it does; it could be the way the company implemented it. Maybe I'm just one of the lucky few that hasn't had a lot of trouble with nVidia, but I just don't see how so many people can have so many problems.
 

DeViSoR

Senior member
Jun 2, 2000
277
0
0
borealiss, you're right. I myself do use the reference drivers, but that is because I have a Leadtek card (their lazy design is just a copy of the reference card, but it performs well and overclocks great). Use the official drivers from your card manufacturer; using reference drivers is always a risk.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
"Sure you can blame my motherboard, CPU, RAM or anything else under the sun, but the fact remains that a lot of people are having a lot of problems with nVidia's boards and their drivers. Making excuses for them doesn't help anyone."

What purpose does making excuses for a complete POS motherboard serve? Look at PIV owners: they can't run a V5 in their mobos at all. Should we be bashing 3dfx for this? Don't buy junk motherboards and you won't run into problems like this.

It is ignorant to bash either nVidia or 3dfx because of compatibility problems with out-of-spec motherboards. Always buy quality, and you won't have a problem.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Wingz-

We are all over in the Video forum, where have you been???

Soccerman-

"I've seen a few (in picture format mainly), and they look quite nice; however, their polygon models aren't much more complex than, say, Quake 3's."

Check out Giants. While the polygon count can be had on a non-T&L board, the FPS suffer quite a bit, particularly with "slower" CPUs (sub-GHz parts are quite slow here).

"I don't think they're running that many more"

Most of them aren't. We only need one single title to show that the unit works as advertised, and there are plenty more than that. With scalable LOD becoming increasingly popular, it will be much harder to get accurate comparisons without decent benchmarks, but there are developers on record stating that they run both higher FPS and higher polygon counts than software T&L on today's CPUs. The detractors said repeatedly that the GeForce's T&L unit couldn't compete with then-current CPUs (November '99), which is laughably inaccurate.

"It runs on DirectX 8 and it's pretty nice, but the increase in polygons compared to the 'previous generation' of games (like Quake 3) isn't anything more than double, more likely 1.5x. I wonder what kind of speed is guaranteed from a GeForce DDR?"

That game isn't a good example (do you think that looks good? The visuals look quite poor to me compared to some games already available). When the game based on Crytek's engine comes out, it will likely be a good showcase. You can also look to the NV15 level when paired with a PIV and dual-channel RAMBUS. Without even using the hardware lighting engine it is quite playable (60FPS) and is closer to a five to six hundred percent increase over a standard Quake3 level.

"Btw, these new games aren't entirely T&L dependent, right? You'd still see a difference between, say, a 400MHz and a 600MHz CPU, right?"

You will see a bigger difference than without T&L. With the T&L strain offloaded from the CPU, the limiting factor becomes the game code for anything out now or in the not-too-distant future. Because you reduce the amount of strain on the CPU, the 600MHz CPU will see a bigger framerate increase when using hardware T&L than when not, until you become either memory bandwidth (system or local) or T&L limited (no game is close to being T&L limited yet).

Figure a game devotes 30% of its time to T&L when running in software mode and runs at 50FPS on a 400MHz CPU. Scale that to a 600MHz CPU and you will be at roughly 75FPS. Moving T&L off the CPU and onto a dedicated unit drops that 30% down to closer to 5%. So the 400MHz CPU would be hitting roughly 62.5FPS, while the 600MHz part would be hitting about 93.75FPS.

We would need an explosion in poly counts to change the type of scenario above. A faster CPU should offer you even greater benefits with hardware T&L than without.
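For anyone who wants to check that scenario, the small sketch below reproduces the quoted figures. Everything in it is an assumption lifted from the paragraph above (the 30%/5% T&L shares, the 50FPS baseline, linear scaling with CPU clock); it illustrates the arithmetic only, not any actual benchmark.

```c
/* Sketch of the arithmetic in the scenario above. Assumed: 30% of
 * frame time on software T&L, ~5% residual with a hardware unit,
 * a 50FPS baseline at 400MHz, and frame rate scaling linearly with
 * CPU clock. The freed CPU share (30% - 5% = 25%) is treated as a
 * straight 25% frame-rate gain, reproducing 62.5FPS and 93.75FPS.  */
#include <stdio.h>

int main(void)
{
    const double fps_400_sw   = 50.0;   /* software T&L, 400MHz        */
    const double sw_tnl_share = 0.30;   /* CPU time on T&L in software */
    const double hw_tnl_share = 0.05;   /* residual share in hardware  */

    const double fps_600_sw = fps_400_sw * (600.0 / 400.0);   /* 75FPS */
    const double gain = 1.0 + (sw_tnl_share - hw_tnl_share);  /* 1.25x */

    printf("400MHz + hardware T&L: %.2f FPS\n", fps_400_sw * gain); /* 62.50 */
    printf("600MHz + hardware T&L: %.2f FPS\n", fps_600_sw * gain); /* 93.75 */
    return 0;
}
```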

"If I could get a Kyro 2 to run that, I would! C'mon PowerVR, I'm counting on you!"

Are you talking about the game you linked to? Well, from looking at the screenshots, the visuals are quite drab: lousy texturing, and the models aren't very well done, although there are some nice particle effects/explosions. If you are thinking that a deferred renderer will help you much, I wouldn't hold your breath. From looking at the screenshots, it looks like this game will have maybe 2x overdraw, if that, the overwhelming majority of the time. Without a sizeable amount of overdraw, the Kyro II is likely to get manhandled by pretty much any other board that is current when it launches. This may not be the case, but I'm hearing a 166MHz core clock; unless there are some serious changes from the Kyro 1, it simply won't be competitive, particularly not in a game like the one you linked to.
 

gcliv

Banned
Oct 24, 2000
264
0
0
borealiss:

Like I said in my original post, I've used a lot of GeForces. Each time I used one, I tried every driver available: old ones, new ones, from the card company, reference ones, and every freaking version of Detonator that exists. They ALL sucked.


UPDATE:

Glory hallelujah! Today I returned that POS MX and got a Radeon 32MB DDR. And it kicks butt! It rocks! It may not be a Voodoo, but it is definitely a worthy successor!!!
 