And still another reason that Nvidia Geforce 2 cards have below-average graphics

VoodooSupreme

Banned
Sep 3, 2000
12
0
0
We all know that Nvidia Geforce 2 cards have the worst graphic quality of all the current cards, and that Nvidia sacrifices quality for speed. I was quite shocked though to see yet another example of this in a recent article.

This is a very well written article by true professionals and I found it extremely helpful and informative. It shows another example of how Nvidia just doesn't care about quality.

I imagine Nvidia is trying to strong-arm these guys, and get them to take this article down. Just like they have done with so many other reviewers who have pointed out the serious flaws with the graphic quality of Geforce2 cards.

http://www.gamebasement.com/pages/home.asp?nav=articles&id=31

Here are a few quotes from this excellent article.

"Given the importance of texture compression as an essential feature, I am more than a little disturbed that the nVidia implementation of DXTC texture compression seems to have serious problems.

"Clearly, the texture compression image quality problem many people have seen is caused by the poor implementation of DXTC in the GeForce series, not by DXTC itself."

"So what exactly is the problem with the GeForce2 and texture compression? Texture discoloration on opaque textures, and heavy artifacting on translucent textures. That's bad.

"I don't know about you, but I'd be pretty pissed if I bought this card and brought it home to see these kinds of problems."

"It's aggravating that one of the most important features on the card appears to be completely broken."

ATI Radeon-"The compressed version, on the right, is basically identical to the non-compressed screenshot. This is the way it's supposed to work, folks. I would call this perfect. I can't see any significant difference between the two shots, which is the way it should be. Bravo to ATI for a perfect implementation of DXTC! But what happens when we use the GeForce 2? Pretty damn crappy. I have no idea why enabling DXTC compression on the GeForce causes these ugly discolored artifacts, but it does. And they're all over the place."

"Disabling texture compression only reduces performance, sometimes drastically!"

"Neither the Voodoo5 nor the ATI Radeon show the obnoxious texture compression artifacts in Quake III that the GeForce does. In fact, if you were to judge texture compression solely on the basis of its performance in Quake III with a GeForce card (probably the most common combination in gamers' hands right now) you would erroneously conclude that texture compression sucks! That's a shame, because texture compression is a very good thing."


Hmmmm, this is probably one of the most honest and insightful articles I have read in a long time. Hats off to the pros who wrote this. This kind of integrity in a gaming site is quite refreshing.

We all knew that the graphics quality of the Geforce was not equal to the Voodoo5, and that Nvidia sacrifices quality for speed, and targets their cards at kids who only look at speed benchmarks. But it's surprising to see yet another way that the Voodoo5 beats the Geforce.

I do prefer Voodoo cards, but I have no bias, or anything against Nvidia of course. But you can't ignore the things that just keep surfacing showing they do not make a well rounded card. They will sacrifice anything for speed, and that's not a wise choice.

I think Nvidia is suing 3dfx because they can't keep up with the superior technology from other companies, and they are just trying to get the attention off their one sided and incomplete products.



 

Doomguy

Platinum Member
May 28, 2000
2,389
1
81
At least the GF2 has true trilinear filtering when multitexturing, unlike the V5. Most games don't even use texture compression. In most apps the GF2 has better image quality without using FSAA.
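For anyone wondering what "true trilinear" actually means: it's a bilinear sample from each of the two nearest mipmap levels, blended by the LOD fraction. A toy single-channel sketch (illustrative only, not any card's actual sampler):

```python
# Sketch of trilinear filtering: bilinear filtering within each of the two
# nearest mip levels, then a linear blend between them by the LOD fraction.
# Toy single-channel textures as nested lists; hardware does this per pixel.

def bilinear(tex, u, v):
    """Sample a 2D list `tex` at fractional coords (u, v in [0,1])."""
    h, w = len(tex), len(tex[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = tex[y0][x0] * (1 - fx) + tex[y0][x1] * fx
    bot = tex[y1][x0] * (1 - fx) + tex[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def trilinear(mip_hi, mip_lo, u, v, lod_frac):
    """Blend bilinear samples from two adjacent mip levels."""
    return bilinear(mip_hi, u, v) * (1 - lod_frac) + bilinear(mip_lo, u, v) * lod_frac

mip0 = [[0.0, 1.0], [1.0, 0.0]]   # 2x2 base level
mip1 = [[0.5]]                     # 1x1 next level (its average)
print(trilinear(mip0, mip1, 0.5, 0.5, 0.5))  # -> 0.5
```

Skipping the between-level blend, or approximating it per texture unit (as 3dfx hardware reportedly did when multitexturing), is what shows up as visible banding at mip transitions.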
 

Doomguy

Platinum Member
May 28, 2000
2,389
1
81
LOL 3dfx is also WAY behind in technology. With the 6.18 drivers, nvidia's OpenGL FSAA is just as good as 3dfx's, and the new drivers help nvidia blow 3dfx away in speed. 3dfx is missing T&L, pixel shaders and the bump mapping that nvidia and ATI have.
 

VoodooSupreme

Banned
Sep 3, 2000
12
0
0
Quake III-The most popular game in the world uses it, and a lot of games will use it in the future. The Voodoo5 is better technology, and Nvidia knows it! That's why they are suing them. 3dfx has always been at the cutting edge, and this is another example. Balance and not just raw speed is the best. That's what makes Voodoo5 better!
 

SilverBack

Golden Member
Oct 10, 1999
1,622
0
0
I'm sorry. I have had both in my system.
The GTS simply has better quality all the way around. It has a WORKING opengl ICD. 3Dfx seems to think they do, but alas they don't.
I play Tribes. The V5 loses over 30 frames a second to the GTS using opengl. The worst part is that the V5 also loses COMPLETE graphics. I mean like a whole weapon is a light blue or black. If you would like to see screen shots let me know. I have them.
Also, using the AA in D3D causes some very interesting problems too.
I have a screen capture where the V5 completely drops whole objects off the image map. DUH.
 

TimTim

Banned
Jul 4, 2000
85
0
0
Hey Voodoo Supreme,

First of all, that article talks mostly about the RADEON'S great image quality, NOT the Voodoo5. Look at your own quotes and then go read the article again. In fact they only mention the Voodoo5 once, and you highlighted it, in bold letters in your post. hehe

Secondly, not to insult the Voodoo5, but the RADEON and the GeForce2 smoke it.

Dude, the Voodoo5 is a pretty good card, but it's not even in the same class as the RADEON and GeForce2. I had a Geforce, and now I have a RADEON 64MB, and believe me.......you don't want to try and compare the Voodoo5 to either one of them!

You were obviously just trying to start a flame war, but it failed because we all love you! hehe
 

Fozzie

Senior member
Oct 9, 1999
512
0
0
MWahaha, I love these people. Obviously on a technical forum it can get a little dry with so much support talk and "which video card should I get" posts. VoodooSupreme must have known that and decided to liven the place up with some witty humor by acting like a moronic troll! In fact I nearly split a gut at this point:

"I do prefer Voodoo cards, but I have no bias, or anything against Nvidia of course."

Hahah! Brilliant! Of course it could be improved though. One may choose to provide a more "realistic" portrayal: "Heluo morun heeds! hear am VoodooSupreme to smak dowm al u nvidia fgotz! eye am goin 2 get bfg on your opangl asees! just lik tinm sweany is mastr kung foo an wil destory DERAK PERAZ!!! eye gett A TEEM ON U!!`!"

Personally I could read that all day! Oh joy. (wipes away a tear of laughter)



Rgrds.
 

han888

Golden Member
Apr 7, 2000
1,586
0
0
The Voodoo5 would be the last card I'd choose. First I'd go for the Geforce2, after that the Radeon, and the Voodoo5 last.
 

TerreApart

Senior member
Aug 30, 2000
231
0
0
You may find this as interesting as i did...

If 3dfx is so cutting edge...

1) why do they need -2- graphics processors to come even close to the -1- processor on a geforce2 card?
2) why do they use the lower-quality/higher-cost way of implementing graphics memory (64MB vs. the Geforce's 32MB) just to get close to its performance level...
3) NVidia is willing to share the geforce2 designs with other companies to allow them to tweak it as they see fit, does 3dfx do this?
4) don't get me started on the BooHoo6 (I mean Voodoo6); it needs 4 processors to compete with the Geforce2 64MB cards--these cards still have 1 processor.

Come on get a clue...

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Junior VoodooSupreme, post this thread in General Hardware Forum for some more unbiased, kind, gentle comments and guidance that will show you the error of your ways.



 

XNice

Golden Member
Jun 24, 2000
1,562
0
76
I have a Geforce 2 GTS and my little brother has a Voodoo3. After the Voodoo3, 3Dfx ran out of genius. Yeah, the Geforce doesn't have the best image quality, but it's damn good. And Quake 3 is not the most popular game in the world. Look at all the ratings: Counter-Strike would beat it and UT does beat it. Quake 3 was just made so other companies can use its engine. SOF 2 anyone?
 

VoodooSupreme

Banned
Sep 3, 2000
12
0
0
Radeon and GeForce, blah blah blah, you think just because they are faster and have more features they are better, blah blah blah. 3dfx are the pioneers, they started it all, you piss ants! Look at the way you react, like baboons. Get a life you F-ing trolls. No disputing the article by anybody, just a bunch of people calling me names.

GeForce sucks for everything but speed and you know it, and I bet the Radeon stole all those cool features and graphic quality from 3dfx, and the next series of Voodoo cards probably would have had them first, but they have just been delayed for a while, that's all.

F you ass wipes, you just can't deal with the truth, because you can't handle the truth, and you all make fun of me because you are jealous. Just because the Radeon and GeForce are better doesn't mean they are better you stupid trolls!

I feel sorry for you people who think that just because everybody says the Radeon and GeForce are better than Voodoo 5, that you believe them. Don't you have minds of your own? Or does your mind go blank just because some reviewer tells you that the Voodoo 5 isn't as good as the other cards. You probably have never even had a Voodoo 5 card, and you are just going by what people say. ATI and Nvidia probably pay people to say the Radeon and GeForce are great, but 3dfx has integrity and class, and they would never bribe reviewers.

F all you piss ants. Drop dead you F-ing fairies!!!!!!!!!

 

AxelRose

Member
Sep 3, 2000
94
0
0
3dfx Blows a big horny goat right up the wazoo!

I used to love em' but they suck now.

Also, what is wrong with sacrificing a little to get a little? Your Voodoo doodoo cards have done this "Sacrifice" for years...


3dfx still does it with Glide. Glide barely supports 32 bit textures or color.
 

TimTim

Banned
Jul 4, 2000
85
0
0
LOOK....LOOK

In the entire time I have been a member of Anandtech, I have never flamed anybody. It hasn't been easy either, let me tell you. Something from the last statement from Voodoo Supreme must be brought to everyone's attention however. This is just too good to resist. For the record, I am still not flaming, but I must point something out.

Quote from Voodoo Supreme's last post:

"Just because the Radeon and GeForce are better doesn't mean they are better you stupid trolls!"

Um........> Just because the Radeon and GeForce are better doesn't mean they are better.....???? hehe

I'm sorry, I just couldn't resist that one.

Like many of us said before VoodooDude, the V5 is a good card, but it's just not in the same league as the Radeon or GeForce2. My regular GeForce was faster, and my new Radeon 64MB is way faster and even looks way better too. Sorry man, but like most people I agree that 3dfx is not the innovator it once was. Stop trying to start flame wars.
 

NOS

Member
Aug 29, 2000
126
0
0
I hear that 3dfx sales are way down. Evidence enough for me as to who the best really is in the video card world!
 

TimTim

Banned
Jul 4, 2000
85
0
0
One more thing VoodooSupreme. ATI can't even keep up with the high demand for the Radeon, and most places sell out within days after getting them, and the GeForce2's are still selling strong after several months. The Voodoo5's on the other hand are gathering dust sitting on store shelves. Hmmmmmmm....What does that tell you?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I have to agree with Fozzie, this guy is funny, but some poor sap might look at this thread and believe what is being said so let's take a look at some of what that article had to say-

"So, given the importance of texture compression as an essential feature, I am more than a little disturbed that the nVidia implementation of DXTC texture compression seems to have serious problems . This problem is most obvious in the game Quake III Arena."

Quake3 never has, does not, and never will support DXTC. This reviewer is clearly utterly clueless about what he is trying to talk about. The issue he brings up involves S3TC, not DXTC, which as of yet has very little, if any, support.

"Let's compare two shots from the game on a nVidia GeForce2 with Detonator 6.18 drivers. The shot on the left is with texture compression disabled; the shot on the right is with texture compression enabled."

What's this?? Texture compression can be disabled on the GF2 boards? There goes that argument; the user can choose quality or speed.

"Notice the radical improvement in the quality of the sky in the second shot! Clearly, the texture compression image quality problem many people have observed in Quake III Arena is caused by the poor implementation of DXTC in the GeForce series, not by DXTC itself. I can't think of any other rational explanation. Can you?"

That reason is absolutely false and contains not a single shred of accuracy. Quake3 doesn't use DXTC, so clearly it has nothing at all to do with DXTC. Ignoring the author's ignorance of the subject matter and sticking to his general point: the issue is with the sky textures in Quake3, and in particular with layering transparent textures over one another, a point which can easily be dealt with by turning texture compression off. The GF-based boards support AGP texturing, which does in fact alleviate much of the performance hit of the swapping in the instances in Quake3 where it happens to take place.

The honest truth in the matter is that S3's own boards, from the company that invented the standard, exhibit the same issues as the GF-based boards do. This quite simply appears to be an S3TC issue that ATi has dealt with via modifications of the standard. This wouldn't shock me, as they were developing the Radeon long after the GF2 and could have been well aware of the problems that would be displayed if they shipped support without any alterations.

Another aspect that the article rather glossed over is that the problems are almost all related to textures where lighting is evident along with transparencies. This would be textures that have lightmaps applied, or those that are dealing with an infinite light source while being transparent themselves (the sky, for instance). This may in fact be caused by the lack of precision; anything with a transparency value should use 32-bit accuracy to ensure accurate reproduction. These artifacts are the type that Carmack brought up when he mentioned the need to expand our current technology to deal with 64-bit color accuracy.
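To make the precision point concrete, here is a minimal sketch of how an S3TC/DXT1 block palette is built (an illustrative model, not any driver's actual code). Block endpoints are stored as 16-bit 5:6:5 colors; hardware that also truncates the two *interpolated* palette entries back to 16 bits (as the GeForce reportedly did) can collapse a subtle gradient, while carrying 8 bits per channel through the interpolation (as ATi's implementation is said to have done) preserves it:

```python
# Minimal DXT1/S3TC palette sketch, illustrating the interpolation-precision
# issue. Endpoints are stored as packed 16-bit 5:6:5 colors; the question is
# the precision used for the two blended palette entries.

def rgb565_to_rgb888(c):
    """Expand a packed 5:6:5 color to 8 bits per channel (bit replication)."""
    r, g, b = (c >> 11) & 0x1F, (c >> 5) & 0x3F, c & 0x1F
    return (r << 3 | r >> 2, g << 2 | g >> 4, b << 3 | b >> 2)

def truncate_to_565(rgb):
    """Round-trip a color through 5:6:5 -- models a 16-bit interpolation path."""
    r, g, b = rgb
    return rgb565_to_rgb888(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3))

def dxt1_palette(c0, c1, interp_16bit):
    """Build the 4-color palette for an opaque DXT1 block."""
    e0, e1 = rgb565_to_rgb888(c0), rgb565_to_rgb888(c1)
    mix = lambda a, b: tuple((2 * x + y + 1) // 3 for x, y in zip(a, b))
    p2, p3 = mix(e0, e1), mix(e1, e0)          # the 2/3-1/3 and 1/3-2/3 blends
    if interp_16bit:
        p2, p3 = truncate_to_565(p2), truncate_to_565(p3)
    return [e0, e1, p2, p3]

# A subtle sky-like gradient: two nearly identical blues as block endpoints.
c0 = (10 << 11) | (20 << 5) | 31
c1 = (11 << 11) | (22 << 5) | 31
full = dxt1_palette(c0, c1, interp_16bit=False)
lossy = dxt1_palette(c0, c1, interp_16bit=True)
print(full)   # four distinct colors -> a smooth gradient survives
print(lossy)  # the two interpolated entries collapse -> visible banding
```

For reference, in Quake3 the whole issue can be sidestepped from the console with `r_ext_compressed_textures 0` followed by `vid_restart`, at the memory and performance cost the article describes.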
 

VoodooSupreme

Banned
Sep 3, 2000
12
0
0
Quotes from Ben Skywalker, the wannabe king of the forums.

"The issue he brings up involves S3TC, not DXTC.."

Prove it! Regardless of it being DXTC or S3TC, the Geforce SUCKS at it dude. Quit making excuses. It looks like crap, it looks like crap, it looks like crap. Crap by any other name is still crap! Strike 1 for Ben Skywalker.

"What's this?? Texture compression can be disabled on the GF2 boards? There goes that argument..."

Perhaps you missed this comment- "Disabling texture compression only reduces performance, sometimes drastically!"
Strike 2 for Ben Skywalker.

"The honest truth in the matter is that S3's own boards, from the company that invented the standard, exhibit the same issues as the GF based boards do."

Wrong. Couldn't be any more wrong if you tried. Strike 3 for Ben Skywalker. He's outta there!
 

VoodooSupreme

Banned
Sep 3, 2000
12
0
0
The article is 100% correct and true. The people who published it are a well known website. The Geforce cards have lots of problems, face it. Ben, let us know when you have your own website, and thousands of readers like they do and please share with us your credentials for being a video card expert.

HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAAHAHAHAHAHHAHA!!!!!!
 

Doomguy

Platinum Member
May 28, 2000
2,389
1
81
LOL Voodoo lost the argument! It's obvious. We won, he lost. We don't care if 3dfx pioneered 3D if they can't keep pace today. Also, the 64 meg GF2 most likely suffers almost no performance drop with S3TC off because of its RAM. If 3dfx is so good, why don't they have true trilinear filtering? Everyone else does. The Radeon is behind the Geforce 2 in speed in 32-bit color with the new 6.18 drivers too.

THE ONLY ONE THAT'S TROLLED AND NAME-CALLED IS YOU!!! We disputed the article and won. Get over it, you're a troll.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
"Prove it! Regardless of it being DXTC or S3TC, the Geforce SUCKS at it dude. Quit making excuses. It looks like crap, it looks like crap, it looks like crap. Crap by any other name is still crap! Strike 1 for Ben Skywalker."

Quake3 runs under OpenGL. DXTC is DirectX Texture Compression; it doesn't take a rocket scientist to figure this one out. Strike one? I assure you not.

"Perhaps you missed this comment- "Disabling texture compression only reduces performance, sometimes drastically!" Strike 2 for Ben Skywalker."

Performance is one thing the GF2 has loads of to spare. With a lowly GF DDR, disabling texture compression and running Quaver at the standard (by that I mean the accepted term) UHQ settings with all image-enhancing game options on, I'm hitting 39.8FPS at 1024x768 and 56.1FPS at 800x600. That's the worst-case scenario, everything cranked. For single player I use 1024x768 with those settings, multiplayer 800x600, and that is with a lowly GF DDR, not a GF2.
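For scale, here's the back-of-the-envelope memory math behind the performance question (illustrative numbers, not benchmarks): S3TC/DXT1 stores opaque textures at 4 bits per pixel versus 32 for uncompressed RGBA, an 8:1 saving, which is why uncompressed textures start swapping over AGP once they overflow local memory.

```python
# Back-of-the-envelope texture memory math (illustrative, not measured).
# S3TC/DXT1 is 4 bits/pixel for opaque textures vs 32 for uncompressed RGBA.

def texture_bytes(w, h, bits_per_pixel, mipmapped=True):
    """Approximate footprint of one texture; a full mip chain adds ~1/3."""
    base = w * h * bits_per_pixel // 8
    return base * 4 // 3 if mipmapped else base

uncompressed = texture_bytes(256, 256, 32)  # 32-bit RGBA
compressed = texture_bytes(256, 256, 4)     # DXT1
print(uncompressed // 1024, "KB vs", compressed // 1024, "KB")  # 341 KB vs 42 KB
```

An 8:1 ratio per texture is the difference between a level's textures fitting in a 32MB card's local memory and spilling over the bus.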

"Wrong. Couldn't be any more wrong if you tried. Strike 3 for Ben Skywalker He's outta there!"

No, we could enter into the level of bit precision and how that relates to artifacts when transparencies are utilized, particularly when compression is used, but by the sounds of it you don't care about the particular technology involved, merely the end results. The problem occurs on S3 boards; S3 has made statements about it and says that developers should pre-compress their textures to ensure that they will display properly.

"Ben, Ben, Ben, I have seen your comments before, and you usually cut and paste things that other people have said or written, and then try to impress us by pretending that you actually know anything about video cards."

Well, it looks like I can pretend a lot better than you. As of yet I have only seen you repost what others have said without any intelligent added commentary.

"You have not dis-proven anything from this article. Those guys are well known, and you.....you.....you are a wannabe, always have been, always will be! Heh!"

I don't care in the least who they are; if they are wrong, they are wrong. I have already posted my thoughts on the situation on their board and am awaiting a reply; we'll see what the article author has to say, since you think so highly of him. I mean no disrespect to him. He isn't very informed on this particular subject, and I assure you I have no problem telling anyone that they are posting misleading information; it doesn't matter who they are. I also have never claimed to be anyone, so how exactly would I qualify as a wannabe? Who is it that I am supposed to want to be? I am quite comfortable with who I am; can you say the same?

I am a poster on a few BBSs, nothing more or less than the majority of posters here. Whatever I say should be judged by what I say, not who I am, just like anyone else.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
You posted while I was typing again-

"Ben, let us know when you have your own website, and thousands of readers like they do and please share with us your credentials for being a video card expert."

Let's hear their credentials. One site I know of whose opinion I can instantly respect, even if I disagree with it, based on their level of expertise on video cards, is Beyond3D. Dave and Kristoff know their stuff, though I do disagree with them on quite a few things; so do many others who also know what they are talking about.

What do my credentials have to do with anything? Again, who I am matters not; what I say is what is relevant. I would wager that I have quite a bit more experience in 3D graphics than the vast majority of the people who run the sites that you speak of, but again, that shouldn't matter. I was more than likely working with 3D technologies before most of these sites' owners were online, but again, that doesn't matter. What matters is the topic at hand, and that alone is what this thread is about. Who is right on the particular issues in this matter is what is relevant. If you want a general discussion on 3D technology where we can discuss broader subject matter, then feel free, but this thread that you started is about S3TC compression quality running on the GF2.
 

DragonFire

Golden Member
Oct 9, 1999
1,042
0
0
VoodooSupreme and everyone else for that matter but mainly VoodooSupreme:

They might have started it all, but they let it go to their heads. They thought they were better, so they came out with Glide to make their cards run faster than anyone else's. Oh yes, they were in the driver's seat at the beginning of 3D. Little did they see that all these small video companies were using OpenGL. 3DFX didn't care about those small people with their lame OpenGL API.

Well, those little companies got bigger, and then came the TNT....oopps....3DFX had its head up too high to see what was climbing up. Then we had all the fun of 32-bit color; 3DFX thought it was better than 32-bit because of its massive fillrate. Then the TNT2..oopps....3DFX panics and rushes the V3 out (after a few delays, I think). By now there are only a few new games that really support the almighty Glide. 3DFX misses a launch and the GeForce slaps the V3 into second. Now we're at the GeForce2, and 3dFX is dying/dead.

What's this about cutting-edge crap? They supported 32-bit color last, they came up with something (FSAA) before it could be used with good framerates, and as pointed out they have to use 2 or more chips now. They're not even trying for cutting edge; they're just trying to stay alive at this point. Now if you look, the same damn thing has happened with Intel and AMD. Intel thought no one could touch them, and then bang, here comes the Athlon.

This is in no way flaming anyone; I just think it's a brief summary of what has happened to 3Dfx and why they aren't that great. "We all know that Nvidia Geforce 2 cards have the worst graphic quality of all the current cards"

The only thing I have to say to that is:

VoodooSupreme, I would like to see you make a better product than the last one every 6 months.
 