And still another reason that Nvidia Geforce 2 cards have below-average graphics


BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Have to reply to one misconception that was stated in this thread: 3dfx in no way started 3D graphics.

The first "real" 3D machine was developed by a division of General Electric for NASA when they were working on docking drills for astronaut training. At the time, they were attempting to use a camera on tracks built in to a remote control but needless to say this caused for a significantly less then ideal simulation.

GE came up with a computer capable of drawing somewhere around 10-100 polygons per second to aid in these simulations; this was roughly 40 years ago. That division was later spun off and formed Real3D, which in recent memory is mainly recalled for aiding in (in reality, mainly developing) the Intel i740 chipset. The i740 was a unique product in itself, as it was designed with the intention of having no onboard texture memory. The board was largely a PR attempt by Intel to show the usefulness of AGP texturing, but it was undermined when they launched a PCI version with an AGP "bridge" chip of sorts, along the lines of what the V5 6K will use though not exactly the same, that allowed "AGP texturing" utilizing onboard RAM (the V5 6K's chip allows for full AGP functionality, if 3dfx were to implement it in the VSA-100), and that was in fact faster in many cases than the AGP versions. With up to 16MB of dedicated texture memory and an 8MB dedicated frame buffer, the StarFighter PCI was a very impressive all-around board in its day. Too bad the true pioneers of 3D technology are no longer a player.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
3dfx sucks because:

No T&L
More and more games are taking advantage of this, but 3dfx insists that "you don't need this". The GF2 Ultra is three times as fast as the V5 5500 in MDK 2. The GF and GF2 MX even beat the V5 in T&L benchmarks like MDK 2.

Multiple CPUs
3dfx need multiple CPUs just to stay in the game, which makes things more expensive, and their boards suck up way too much power. Because of the memory-sharing architecture (each chip needs its own copy of the textures), the 64 MB V5 has only 28 MB of effective memory. The V5 6000 with 128 MB RAM will only have 54.5 MB RAM to use! Also, the multi-CPU design doesn't even help them in performance; it just stops them from completely slipping off the charts.

Not to mention 3dfx's cards are the size of canoe paddles and they drain the national power grid when you turn them on.

Abysmal OpenGL
I don't know how the V5's drivers are, but the V3 sucks with OpenGL games. Half of them won't run, or run incorrectly. They still have the MiniGL crap in their drivers. This is yet another result of 3dfx's "innovative" thinking: forcing Glide down developers' throats.

No AGP Implementation
Once again, "you don't need this" says 3dfx. Benchmarks show when the VRAM is over-filled with textures and they are forced to go out to main memory, the GF based boards are able to achieve three times the level of performance of the V5 5500 when they use AGP texturing and AGP fast writes. The timedemos 1 & 2 don't show this because they don't have many big textures. Try maps like Hero's Keep or The Dredwerkz and watch the V5 5500 suffer while the GF based boards steam-roll ahead.

Slow
The GF2 and GF2 Ultra demolish the V5 5500 and even the GF and GF2 MX beat it in some tests. The price of a V5 5500 is the same as a GF2. I know which one I'd rather have!

No Tri-linear or Anisotropic Filtering, No Rolling Bump Mapping
The V5 only has approximated tri-linear filtering, while the GF-based boards utilise true tri-linear mip-mapping (see the sketch below for the difference). Also, the V5 is unable to do rolling bump mapping, and it scores zero in those tests.
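
(For the curious, here's a rough sketch in C of what "true" tri-linear means; bilinear_sample is an assumed helper, and this is an illustration of the math, not any card's actual hardware path:)

/* "True" tri-linear: take a bilinear sample from each of the two nearest
 * mip levels and blend them by the fractional level-of-detail. The
 * approximated version (mip-map dithering) skips the blend and simply
 * alternates mip levels from pixel to pixel. */
typedef struct { float r, g, b; } Color;

Color bilinear_sample(int mip_level, float u, float v); /* assumed helper */

Color trilinear_sample(float u, float v, float lod)
{
    int   level = (int)lod;           /* nearest coarser mip level     */
    float frac  = lod - (float)level; /* how far toward the next level */

    Color a = bilinear_sample(level, u, v);
    Color b = bilinear_sample(level + 1, u, v);

    Color out = { a.r + (b.r - a.r) * frac,
                  a.g + (b.g - a.g) * frac,
                  a.b + (b.b - a.b) * frac };
    return out;
}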

Higher Res Is Better Than FSAA
Despite what the Zombies tell you, playing games at higher resolutions is far better than the blurry effect of FSAA, and the jaggies go away too. The graphics are sharper and smoother and you can see farther into the distance. The GF2/GF2 Ultra can run higher resolutions more smoothly than the V5, and the GF2 Ultra is able to bring 1600 * 1200 gaming to the desktop.

3dfx are dying because of poor drivers, poor hardware engineering and idiotic ideas about what the industry "standards" are. 3dfx's strategy to beat the competition is to throw more CPUs onto their boards. I seriously doubt whether a Rampage will even be on par with a GF2 Ultra.

A nice roundup of the GF based boards along with the V5 and the Radeon can be found here: http://www.tomshardware.com/graphic/00q3/000814/index.html

The GF2 Ultra doubles the V5's score across the board and triples it in the T&L-enhanced game MDK 2.
 

VoodooSupreme

Banned
Sep 3, 2000
12
0
0
I'm sorry to everyone (except Ben Skywalker). I just get tired of GeForce and Radeon owners talking bad about the Voodoo5. I was just happy to see an article pointing out that Nvidia isn't perfect either. I didn't write the damn article.

Anyway, just let this thread die. If I was rude, I am sorry. So end the flames please, because I have. Except for Ben Skywalker, I have nothing against anybody else, but that guy has been talking down to people for months, and I really don't like the dude. I hope he gets run over by a UPS truck delivering Voodoo cards!
 

Doomguy

Platinum Member
May 28, 2000
2,389
1
81
Voodoo, can you read or not, you little 10 year old? We proved you wrong. You refuse to admit it. Why don't you go to your room and cry about it?
 

VoodooSupreme

Banned
Sep 3, 2000
12
0
0
Actually Doomguy, YOU haven't proven anything, except that you are a punk! Only the lowest form of life would continue to flame after a guy says he's sorry. I did not admit anything about the facts being wrong. The article is the article; I didn't write it. I said I was sorry for being rude, but you just keep insisting on going on. Get over it, dude!
 

Esis

Banned
Sep 3, 2000
12
0
0
Doomguy, just be quiet. The guy apologized for flaming, which is a lot more than most people do. I have a GeForce GTS and I admit the image quality is not always the best, but who the hell cares? It's still good, and my card rocks. He even admitted the GeForce GTS and Radeon are much faster, so why are you still trying to mess with him? Let this thing die now.
 

Doomguy

Platinum Member
May 28, 2000
2,389
1
81
Voodoo, you won't leave Ben alone, so I won't leave you alone. All Ben has done is clear up the misinformation, and you hound him. You're an immature punk that has nothing better to do than troll about his new video card.
 

KarsinTheHutt

Golden Member
Jun 28, 2000
1,687
0
0
Someone call 911!!!

FIRE IN THE FORUMS!

TROLLS ON THE LOOSE!

btw my TNT IS the BOMB!
176 FPS Quake 3 1024x768 @ 32 bit color, so chew on THAT



 

EvilDonnyboy

Banned
Jul 28, 2000
1,103
0
0
Jeez.

90% of the people on these boards are biased against one company or another.

How would a confused guy askin questions get accurate information here?

To me, it seems like the FS boards are the only place you're guaranteed SOME accurate, unbiased info. And these boards are the place to go for mostly stupid debates that leave the people askin questions more confused than before they asked.

just my opinion. don't go berserk on me with all the hate posts.
 

KarsinTheHutt

Golden Member
Jun 28, 2000
1,687
0
0
You are all idiots. Everyone knows that intel 740 graphics are the best because they are made by intel. :Q

VoodooExtreme - are you part of some twisted 3dfx cult? :Q

Ben Skywalker and Crew - hmm. What should I say? Some of you are quite arrogant. You know who you are. :Q

Methinks I need a fire extinguisher!!!
 

CyberSax

Banned
Mar 12, 2000
1,253
0
0
You need to take a lesson from Consumer Reports, and that is to stop making these idiotic allegiances to corporations and their products. When you want to buy something, you should do a little market research and buy the product that offers the best performance/price combination for your budget. 3dfx may have made a good card 4 years ago, but that doesn't mean that you have to keep buying their products today. Buy whatever you think is good, then shut up.
 

jpprod

Platinum Member
Nov 18, 1999
2,373
0
0
Just a few obvious observations in terms of Quake III and texture compression... I'm sure most know that Quake III creates compressed textures out of the uncompressed, 32 bit ones while loading a level, using the display card's OpenGL driver. From my understanding, Quake III doesn't even have to know or specify what compression schemes and algorithms are being used; all that is needed are the relevant compressed-texture OpenGL extensions (GL_S3_S3TC for example), which relay the commands to the video card driver. Now...

1. Voodoo5 supports FXTC and DXTC, but not S3TC because 3dfx hasn't licensed it from S3. Thus Voodoo5 only supports FXTC under OpenGL.
2. Radeon supports DXTC and S3TC - ATi has a license to use S3TC under OpenGL.
3. GeForce and S3 Savage chips support DXTC and S3TC; S3TC licensed.

Of these, the GeForce and S3 Savage series cards suffer from poor compressed transparent-texture quality; the V5 and Radeon don't. What exactly is causing this, then? There are two possibilities. ATi might've created a non-standard way to use S3TC under OpenGL, but that would cause issues when pre-compressed S3TC textures are being used (as would 3dfx's extension emulation via FXTC), unless the driver is a particularly smart one (and remember that it's ATi we're talking about ).

Thus I believe this problem depends solely on the OpenGL display driver's S3TC compression algorithm. S3 and NVidia weren't able to write as efficient a compression algorithm as ATi and compromised quality to get decent level load times, whereas 3dfx uses a more advanced, adaptive compression scheme. It will be interesting to see how well manufacturers with upcoming chips, like Matrox, manage to fit realtime S3TC into their drivers.

And as a GeForce owner I strongly feel that Nvidia should introduce a realtime S3TC compression quality slider into their drivers. I'm not the only one asking for this, NVidia! Too bad the only employee of any video chip manufacturer who occasionally reads these forums is 3dfx's Alf Covey.
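
(To make the mechanism concrete, here is a minimal sketch in C of the driver-side path described above, assuming the GL_S3_s3tc and GL_ARB_texture_compression extensions; it's an illustration, not id's actual code:)

/* The game hands the driver an ordinary uncompressed RGBA image and merely
 * asks for a compressed internal format; the compression algorithm itself
 * lives entirely in the OpenGL driver, which is the point above. */
#include <GL/gl.h>
#include <string.h>

#ifndef GL_RGB4_S3TC
#define GL_RGB4_S3TC 0x83A1            /* from GL_S3_s3tc */
#endif
#ifndef GL_COMPRESSED_RGBA_ARB
#define GL_COMPRESSED_RGBA_ARB 0x84EE  /* from GL_ARB_texture_compression */
#endif

GLuint upload_texture(const unsigned char *rgba, int w, int h)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    GLint internal = GL_RGBA8;         /* fallback: no compression at all */

    if (ext && strstr(ext, "GL_S3_s3tc"))
        internal = GL_RGB4_S3TC;             /* S3TC path (GeForce, Savage) */
    else if (ext && strstr(ext, "GL_ARB_texture_compression"))
        internal = GL_COMPRESSED_RGBA_ARB;   /* generic path (e.g. FXT1 on a V5) */

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    /* The driver compresses right here, at level-load time; the quality of
     * the result depends entirely on the vendor's compressor. */
    glTexImage2D(GL_TEXTURE_2D, 0, internal, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgba);
    return tex;
}

(This is also why a single cvar can kill the whole thing: the game just stops asking for a compressed internal format.)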
 

BlvdKing

Golden Member
Jun 7, 2000
1,173
0
0
S3TC or DXTC, it doesn't matter. There is something wrong with the texture compression on a GeForce card. I don't like seeing the screwed-up sky and artifacts on my GeForce card! I don't want to take the penalty for disabling compressed textures either, especially when other cards can render compressed textures correctly. The fact is, Quake 3 uses some form of texture compression that just doesn't work on the GeForce card and does work on other cards. Obviously, this is a real problem with GeForce cards; I would like to see NVidia fix this somehow in driver updates (if it can be done).
BTW, 39fps in Q3A 1024*768 (32bit color, I'm assuming) sucks. I haven't disabled texture compression yet, but I'm not looking forward to the performance hit it involves, especially since I get 75 fps @ 1024*768 with 32bit color/textures and texture detail at the max. I imagine I will take a large hit due to the fact that I have an MX card.
I'm just saying that this is a problem - NVidia should fix it, or all the GeForce cards should be re-benchmarked with texture compression off so that the visual quality of the cards can be taken into account.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
"The fact is, Quake 3 uses some form of texture compression that just doesn't work on the GeForce card and does work other cards."

Well, the problem is that QuakeIII doesn't use any texture compression itself, it allows boards to do it for it though. The particular type that nVidia uses for Quake3, S3TC, happens to have problems on S3 boards also, and they created the standard. I honestly think that we should be congratulating 3dfx and ATi for coming up with better solutions then bashing nVidia for using a set standard that they didn't come up with.

If id had shipped Quake3 with precompressed textures then we wouldn't see the problems. From what is being said it appears that UT running on a GF based board will display S3TC compressed textures just fine under OpenGL, though as of now that is a combo that only runs under Linux.
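
(By way of contrast, a hypothetical sketch of that precompressed path: the DXT1 blocks come ready-made from the artists' tools, so the driver's realtime compressor never runs. glCompressedTexImage2DARB is the GL_ARB_texture_compression entry point; everything else here is illustrative assumption:)

#include <GL/gl.h>

#ifndef GL_COMPRESSED_RGB_S3TC_DXT1_EXT
#define GL_COMPRESSED_RGB_S3TC_DXT1_EXT 0x83F0  /* GL_EXT_texture_compression_s3tc */
#endif

/* Entry point from GL_ARB_texture_compression; fetch it at runtime with
 * wglGetProcAddress()/glXGetProcAddressARB() before first use. */
typedef void (*PFNGLCOMPRESSEDTEXIMAGE2D)(GLenum target, GLint level,
    GLenum internalformat, GLsizei width, GLsizei height, GLint border,
    GLsizei imageSize, const void *data);
static PFNGLCOMPRESSEDTEXIMAGE2D pglCompressedTexImage2DARB;

/* Upload DXT1 blocks that were compressed offline; quality is whatever the
 * offline tool produced, identical on every card that can decode S3TC. */
void upload_precompressed(const void *dxt1_blocks, int w, int h)
{
    GLsizei size = ((w + 3) / 4) * ((h + 3) / 4) * 8;  /* 8 bytes per 4x4 block */
    pglCompressedTexImage2DARB(GL_TEXTURE_2D, 0,
                               GL_COMPRESSED_RGB_S3TC_DXT1_EXT,
                               w, h, 0, size, dxt1_blocks);
}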

BTW - that 39FPS is with all game options that increase visual quality turned on and sound enabled, running Quaver, not Timedemo 1. I don't believe in posting numbers that do not represent a real gaming environment.

"NVidia should fix it, or all the GeForce cards should be rebenchmarked with texture compression off so that the visual quality of the cards can be taken into account."

If they can fix it I would definitely want to see it, but I'm not sure that re-benching all the boards is a viable option. It would be nice to see how they stacked up at the highest level of visual quality, and the 64MB boards would also clearly distance themselves, making for a good PR spin for nV's higher-end parts. The visual quality argument I'm not sold on, though; the V3 had horrible visual quality compared to most other offerings and yet was viewed as a contender on its own benches. I agree that I would like to see sites post both numbers, but most people truly aren't going to care about the non-compressed numbers.

To test it yourself-

r_ext_compressed_textures 0
vid_restart

(0 = off, 1 = on)
Note that this setting persists even after you restart the game, so remember to switch it back.
 

BlvdKing

Golden Member
Jun 7, 2000
1,173
0
0
Skywalker, 39fps in Crusher is really damn good. I get exactly 25fps (1024*768, 32bit color/textures, full texture detail, all goodies on) in Crusher with texture compression off. When I disabled texture compression, the levels in Q3A looked *a lot* better, but I lost 20fps. I am now getting 55fps instead of 75 in timedemo 1. I surely hope NVidia fixes this problem somehow.
 

BlvdKing

Golden Member
Jun 7, 2000
1,173
0
0
Oops, reading over your post again I noticed you said Quaver. I benchmarked again and got 42.6 fps at 1024*768, 32 bit, all goodies etc. This is with texture compression disabled, also. That's not too bad for an MX on a TBird 700 at 800MHz (114*7). I hope NVidia does something about this texture compression thing, because I was getting ~59 fps before I disabled it.
Edit: The Voodoo 3 was benched against other cards where both were run in 16 bit color, i.e. an apples-to-apples comparison. The Voodoo 3 had really good 16 bit color quality when the gamma was adjusted properly (so that everything didn't look washed out). When running a GeForce 2 against another card with texture compression on, the benchmark is in essence an apples-to-oranges comparison, since anyone can purposely lower visual quality to achieve higher frame rates. I would like to see visual quality taken into account when sites benchmark cards.
The GeForce with texture compression off roughly compares to an ATI Radeon or Voodoo 5 with texture compression on or off in terms of visual quality (since both texture modes look almost the same on the ATI and Voodoo5). I think this would have an effect on sales of the GeForce (I'm assuming the ATI card would look a lot better in the benchmarks with these standards, at least; the Voodoo 5 may still lose speed benchmarks against a GeForce/GeForce 2 with texture compression off) and would therefore get NVidia off their asses to do something about this, so that sites can compare graphics cards at the same level of texture compression. I'm making the following assumptions: the Voodoo 5 uses FXTC and the Radeon uses S3TC in Quake 3 to compress textures on the fly. Just some thoughts.
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
Lordie, lordie lordie.

Such idiocy being spouted from both sides.

Poor Voodoo Supreme. A 3Dfx zealot, under attack from a horde of nvidiots.

hehehe...gotta love Anand's forums.

*rolls eyes*
 

Sunner

Elite Member
Oct 9, 1999
11,641
0
76
People, his name is VoodooSupreme, not Extreme. VoodooExtreme is a site, and a damn good one at that.
 

snickelfritz

Junior Member
Sep 6, 2000
2
0
0
S3TC is implemented in the GeForce series to combat the swapping problem in several Quake3 levels at 32bit max texture detail.
The 64MB cards do not need to use it, since they have sufficient video memory.

I have 32MB DDR, so I use S3TC, and the artifacts are not noticeable in actual gameplay.
I disable compression and enable FSAA when taking screenshots of my levels.
 

MGallik

Golden Member
Oct 9, 1999
1,787
4
81
Yep RoboTECH, these forums have been becoming quite the nVidia fan site.

Way too many bits and pieces conveniently left out or glazed over in these posts.

Say something against a GF and the jackals attack; need help with your nVidia card and very few can offer any real help.

What does that say?
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
A few comments for the peanut gallery here (from a 64MB GTS owner):

I got my 64MB card from DecoY from here in the forums. Darn impressive card, actually. 220/400 without issue.

Doomguy:

"With the 6.18 drivers nvidia's opengl fsaa is just as good as 3dfx's and the new drivers help nvidia blow 3dfx away in speed. 3dfx is missing T&L, pixel shaders and the bump mapping that nvidia and ATI have. "

1) nvidia's FSAA is not in the same league as 3dfx's. Only a true nvidiot would think it is. I love my GTS, but let's get real here bud.
2) How many games are optimized for T&L? (MDK2)
3) How many games use per-pixel shading? (Evolva)
4) How many games use bump mapping? (Evolva)

wh00ps! A whole 2 games support these crucial, crucial features. And one of them is GHEY!!! So much for that pile of phloeey. Don't let the nvidia marketing machine suck you in. That stuff will be important NEXT YEAR, but not now. :-/

SilverBack:

"I play Tribes. The V5 loses over 30 frames a second to the GTS using opengl. "

Why in god's name would you play Tribes in OGL with a V5? Hell, why not play UT and Deus Ex in OGL while you're at it. *shakes head*

I can certainly agree with your assessment of the 5500's OGL drivers. THEY'RE LAME!

"The worst part is that the V5 also loses COMPLETE graphics. I mean like a whole weapon is a light blue or black. If you would like to see screen shots let me know. I have them. Also using the AA in D3D cause some very interesting problems also.
I have a screen capture where the V5 completely drops whole objects off the image map. DUH. "

Duh? You obviously had something wrong with your system, since no one else on the planet has had those issues. I suppose that's cuz most people DON'T run Tribes on a V5 with OGL; they probably use Glide.

DUH.

sharkeeper (supreme dolt):

"Go here and download this demo:

http://www.nvidia.com/Products/demos.nsf/pages/044E9E8336A809108825694E00005666

Let us know how it runs on your woodoo5."

Yeah, that makes sense. Let's see how a V5 runs an nvidia demo. Why the F*** would anyone run an NVIDIA DEMO on a non-nvidia card? Were you beaten as a child? Perhaps you should've been. Please note, I say that with all the love and compassion in the world, of course.

NOS:

"I hear that 3dfx sales are way down. Evidence enough for me as to whom the best really is in the video card world! "

The 5500 was the best-selling card at retail. 3Dfx has owned the retail market for over a year; the V3-2000 and 3000 (AGP and PCI) owned last year's market. Their problem is the lack of a worthwhile OEM product. At retail they're doing fine; OEM is what is killing them (they did sign a deal with Micron tho, we'll see how that pans out).

BenSkywalker:

"Quake3 has never, does not, and never will have support for DXTC. This reviewer clearly is utterly clueless on what he is trying to talk about. The issue he brings up involves S3TC, not DXTC which as of yet has very little, if any, support."

Oh, for God's sakes, it's the same damn thing; quit with your semantics argument. The Q3 engine makes full use of it. Straight up, the GeForce sucks at it. JUST ADMIT IT!! It's okay! I admitted it! I sold my 32MB GTS so I could get a 64MB one BECAUSE TC on the GTS sucks (that and the more overclockable RAM).

I disable TC on my 64MB GTS, what is the big goddamn deal? The GTS is fast enough to take the framerate hit, why are you whining about a FACT????

There is NO reason for nvidia to allow this to happen. WTF? They release 900 goddamn driver revisions a month (Derek Perez sez: "we don't leak drivers"... ha!), why the hell can't they fix this?

Doomguy:

"Also the 64 meg GF2 most likely suffers almost no performance drop with s3tc off because of its ram."

Actually, not true, although I noticed the hit was far less substantial in large-texture scenes (i.e. q3dm9) than on the 32MB version, which got its ass kicked. It still loses a good 10% of its framerate in "easy" demos (i.e. demo001) and about 20% of its framerate in Quaver. The 32MB I had dropped to under 30 fps on Quaver (ugh!!!), and hit as low as 15-20 in the harsh spots. Ugh....

Ben Skywalker:

"With a lowly GF DDR disabling texture compression running Quaver using the standard(by that I mean the accepted term) UHQ settings with all image enhancing game options on I'm hitting 39.8FPS at 1024x768 and 56.1FPS at 800x600. "

WTF? 40 fps in Quaver with TC disabled on a DDR? Nice. 6.18s I assume? I have a 64MB GTS and I get ~60. The 32MB board just plain died (tho it was using the 5.32s). Yeah, don't forget to mention those nice teenage and low-20s framerates you get every time you enter the RL area. Ugh.

"The problem occurs on S3 boards, S3 has made statements about it and talks about the fact that developers should pre compress their texture to ensure that they will display properly"

Well, it looks like those 3dfx characters who are "so far behind" seem to have figured out a way to get around that, and those ATi people who can't produce a decent driver have gotten around it also. Their TC doesn't suck. The GeForce's does. Just admit it. It sucks. I am an nvidia owner, and I admit it. Why can't you?

"The first "real" 3D machine was developed by a division of General Electric for NASA when they were working on docking drills for astronaut training."

Oh, for god's sakes, man. WTF does this have to do with the price of tea in China? Are you on dope? Lordie... this has NOTHING to do with 3D hardware-accelerated computer games (and yes, I read your long-winded intro).
<rolls eyes>

3Dfx was the FIRST GRAPHICS COMPANY to bring hardware-accelerated 3d gaming to the PC in any large quantity. Yeah, yeah, the NV1 beat it (laff), as did the Verite (slow), but 3Dfx was the first company to put out a PC graphics chipset that drastically increased the visual quality and speed of 3d games. Again, as an nvidia owner, I admit it. You should too. It'll put hair on your chest.

"With up to 16MB dedicated texture memory and 8MB dedicated frame buffer the StarFighter PCI was a very impressive all around board in its day."

The i740 was, is, and always will be a complete, utter, absolute piece of poop. Really now, Benji-boy, remove your head from your hindquarters. It smells much better out here.

"If id had shipped Quake3 with precompressed textures then we wouldn't see the problems."

And if nvidia could remove their heads outta their asses, we wouldn't see problems on the GeForces. nvidia needs to own up to this. Their competitors don't have problems; nvidia does. Get your head out of the sand, man! It's YOUR card! You should DEMAND that nvidia fix this, because IT IS THEIR PROBLEM!!!

It's not Q3, or both 3dfx and ATi would have problems. Don't try to pass the buck. My graphics company needs to get on the ball and fix this poop. I've seen a gazillion "leaked" driver sets. Not one has even bothered to address this in any way, shape or form. In fact, most of them have been miniGLs for Quake3 performance.

"The visual quality I'm not sold on though, the V3 had horrible visual quality compared to most other offerings and yet was viewed as a contender on its own benches."

The lack of large-texture support really killed the visual quality of the V3. However, do NOT compare the V3's 3d image quality to the 5500's. They are WORLDS apart. I'll agree tho, the V3's 3d image quality (in Quake3) was quite lame. UT-glide sho' looked nice tho. *laff*

"I agree that I would like to see sites post both numbers, but most people truly aren't going to care about the non compressed numbers."

I don't care about the *compressed* numbers, because they mean nothing to me. I find it ironic that so many GTS owners like to pimp the "drastic image quality improvement" of 32-bit over 16-bit, or trilinear/anisotropic over bilinear, yet many ignore how butt-ass ugly the TC is. Humorous....


BlvdKing:

"The Voodoo 3 had really good 16 bit color quality when the gamma was adjusted properly (so that everything didn't look washed out)."

To each his/her own, but I thought the V3 looked pretty crappy in Q3. Even with r_gamma nice and low, the texture aliasing looked bloody hellish and blurry. Blech. UT sure looked good in glide tho! *g*

"When running a GeForce 2 against another card with texture compression on, the benchmark is in essence an apples to oranges comparison since anyone can purposely lower visual quality to achieve higher frame rates. I would like to see visual quality taken into account when sites benchmark cards. The GeForce with texture compression off roughly compares to an ATI Radeon or Voodoo 5 with texture compression on or off in terms of visual quality"

Agreed thoroughly in that case. I saw no need to turn off TC with the 5500, since the ugly-ass artifacting wasn't there with the 5500. With the GTS, that's the first line in my graphics.cfg:

r_ext_compressed_textures 0

B-/

The o/c'ed 64MB GTS holds up nicely tho, I'm proud to say. In Quaver, tho, it's *not* as fast as the 5500 (o/c'ed also) with TC enabled (again, the 6.18s don't agree with my system, so YMMV).

"The Voodoo 5 still may lose speed benchmarks against a GeForce/Geforce 2 with texture compression off)"

If you are stable with the 6.18 drivers, the 5500 wasn't *quite* as fast as the GTS (overclocked). If you are stuck with the 5.32 drivers, the 5500 was faster (on my system, at least, both cards overclocked).

Snicklfritz:

"I have 32MB DDR, so I use S3TC, and the artifacts are not noticeable in actual gameplay."

Ugh... are you serious? I assume you use gl_linear_mipmap_nearest and r_colorbits 16 as well? If you can't see the horrid TC dithering/banding/discoloration, then you sure as hell can't tell the difference between 16 and 32-bit, or bilinear/trilinear filtering.

You really can't see a difference? Gadzooks, man... get some glasses.

MGallik:

"Yep RoboTECH, these forums have been becoming quite the nVidia fan site."

Tell me about it. A quick read of the reviews shows that. I almost died when I noticed Anand mentioned how f***ed up the TC was with the GTS (all of 1/3 of a 20-or-so-page review, heh...). I was impressed.

"Say something against a GF and the jackals attack"

It's all about religion, man. A graphics card choice is a deeply personal, religious experience for many, I suppose. I grabbed the GTS cuz it runs Q3 like mad. Big deal, eh? I'm going to replace it with a 5500 because the 5500 did several other things that I liked that the GTS doesn't do.

Hell man, I'd just be happy if I could get my frickin' GTS to run NFSU decently... :-/

"need help with your nVidia card and very few can offer any real help. What does that say?"

They're too busy comparing 3DMark2000 scores to be bothered, HA!!!!!!

(damn, I'm funny)

What I find hysterical is that 3 of my last 4 card purchases, and 5 of my last 7, have been nvidia cards. I presently own a 64MB GTS, which replaced a 32MB GTS. I'm quite a fan of nvidia cards, and yet here I am ripping on the nvidiots. I feel like I'm watching an overprotective mother and her son... she just can't see that he can possibly be anything other than perfect.

What I REALLY love is when one of these schleps calls me a 3dfx fanboy, hehehe....just because I don't suck Derek Perez's d*** and bow at the altar of the GeForce. +laff+

I mean, I see on the 3dfx boards some pretty zealous peeps, but nvidiots really take the cake, LORDIE!

THEY'RE GRAPHICS CARDS PEOPLE!!! LIGHTEN UP!!!

Oh, and VoodooSupreme? hehehe... you really are out of your league, bro.
 