GeForce3 Review...WITH BENCHES...TONS OF 'EM!!


Remnant2

Senior member
Dec 31, 1999
567
0
0
propaganda? Hardly. My video card is an ancient V3 3000 that still does pretty well, so don't try to imply that I'm an nvidiot. I'm telling you this simply: sure, a year from now most games will still run sweet on your GeForce2, but you'll be running the few DX8-aware games with the eye candy turned off to do so. It will only get worse after that.

I'm not saying to rush out and buy a GF3 -- I certainly ain't going to do so until it costs about $150. I'm just saying that the opportunities it gives programmers are enormous. Once the price comes down, you ARE going to see it get used. I'm very excited; I wish I had a GF3 to play around with. It's been a long time since I've been excited by the feature set of a video card.
 

JellyBaby

Diamond Member
Apr 21, 2000
9,159
1
81
<< but you'll be running the few DX8-aware games with the eye candy turned off to do so. >>

Remnant,

If all that eye candy is the icing on a very good cake (game), then I will be distressed, as my GF2 isn't putting out. And if several quality games could be made into "A++" material by a simple hardware upgrade, I will certainly purchase GeForce 3 goodness immediately if the price is right. But as you say, not until this is the case...
 

aniki

Senior member
Sep 4, 2000
538
0
0
If you own a GeForce 1 or 2, upgrading to a faster CPU would make more sense right now.
 

Zero

Senior member
Oct 9, 1999
783
0
0
Wow. That article kinda made me feel good about my Radeon 64MB VIVO purchase. With all the hype around the GeForce3, I almost feel honored that my $177 Radeon is even being compared to the GF3. I'm guessing I might be able to wait until the GeForce 4 Ultra Turbo Super Quad comes out to upgrade. My Radeon didn't look half bad. We'll see though.

Zero
 

Soccerman

Elite Member
Oct 9, 1999
6,378
0
0
2D quality on nVidia cards has nothing to do with the chipset itself; it's got more to do with the crappy RAMDAC filters manufacturers use to keep their costs down.

So, by that logic, 3D speed/power doesn't depend on nVidia either; after all, the companies can set their clocks to whatever the hell they want..

oh wait, for some reason they DON'T.. why?

nVidia pressured them into not shipping boards with clock speeds different from 'reference' as the default. You're almost guaranteed SOME sort of overclock anyway, though.

So what does this mean? Well, nVidia COULD pressure the companies into providing better filters as well.. oh wait, we can't blame nVidia, they're perfect ;-)

Conclusion so far: ATi Radeons are sweet deals for today's games (unless you don't like their Win2K performance), and have better support for tomorrow's games than the GeForce 2 does..

The GeForce 3 is fine if you ABSOLUTELY must have the bleeding edge (hey, I'd buy it in a second if I had that kind of money, just like I'd buy a good AMD 760 mobo (or wait for their SMP one), a 21-inch Sony Trinitron, and Ultra160 SCSI + Cheetah X15s in RAID 0+1), and it has a pretty good lifespan ahead of it (my Voodoo 3 looks to be replaced with a Radeon LE soon)..

The question is, what will ATi bring to the table with their next card? We only NOW see the benefits of their architecture (which, though limited compared to the new GeForce 3, are a lot better than nothing), so I'm wondering.. will they pile on T&L performance, work a bit on increasing the efficiency of the rasterizer (tile-based rendering?), or what?
 

NFS4

No Lifer
Oct 9, 1999
72,647
27
91
Soccerman, you lost the war...3dfx is dead.

No need to carry that chip on your shoulder anymore
 

Soccerman

Elite Member
Oct 9, 1999
6,378
0
0
<< Soccerman, you lost the war...3dfx is dead. >>

You love rubbing it in that 3dfx died, don't you?

I never even mentioned 3dfx in that last post, did I?

BTW, how did I lose the war? I wish I could have seen the Rampage in action.. Dave!! When is your NDA up???

I wonder what Kyro has up their sleeves, BTW.. I hope they don't go BitBoys and try to add a whole bunch of these programmable pixel shaders etc.. I want that thing ASAP! Add them later!
 

JayPatel

Diamond Member
Jun 14, 2000
4,488
0
0
Well, if you saw the footage of Virtua Fighter 4 on DailyRadar, then you'll know what the next Kyro is capable of....absolutely amazing.
 

joohang

Lifer
Oct 22, 2000
12,340
1
0
Not to mention I'm looking forward to the XBox. *drool*

I hope the GameCube can do something similar. I can't wait 'til the next Zelda.
 

Katana

Senior member
Jan 8, 2001
561
0
0
Would there be a difference if a faster CPU were used in those benchmarks, or would the GeForce3 still be the bottleneck?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
ElFenix-

"programmable t&l that renders previous t&l obsolete and almost worthless"

Polygons still have to be transformed and lit; that isn't going away with DX8, the GeForce3, magic, or any combination of the three. Hardware T&L on the GF and GF2 is proving its worth right now and will still be useful in games like Doom3. In fact, if you look at those benchmarks carefully you will see that the "flexible" GF3 T&L has trouble keeping up with the horrible "static" GF2U T&L (honestly). Flexibility comes at a price, even if it is worth it overall.
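For anyone who hasn't thought about what "transformed and lit" actually involves, here's a rough C++ sketch of the per-vertex math (my own illustration, nothing like real driver code):

```cpp
// What "T&L" boils down to per vertex: transform the position by a combined
// world-view-projection matrix, then compute a diffuse lighting term.
// Fixed-function hardware does exactly this; a programmable unit like the
// GF3's still has to execute equivalent instructions.
#include <cstdio>

struct Vec3 { float x, y, z; };
struct Vec4 { float x, y, z, w; };
struct Mat4 { float m[4][4]; };  // row-major

// "Transform": position times the world-view-projection matrix.
Vec4 transform(const Mat4& wvp, const Vec3& p) {
    Vec4 o;
    o.x = wvp.m[0][0]*p.x + wvp.m[0][1]*p.y + wvp.m[0][2]*p.z + wvp.m[0][3];
    o.y = wvp.m[1][0]*p.x + wvp.m[1][1]*p.y + wvp.m[1][2]*p.z + wvp.m[1][3];
    o.z = wvp.m[2][0]*p.x + wvp.m[2][1]*p.y + wvp.m[2][2]*p.z + wvp.m[2][3];
    o.w = wvp.m[3][0]*p.x + wvp.m[3][1]*p.y + wvp.m[3][2]*p.z + wvp.m[3][3];
    return o;
}

// "Lit": the classic clamped N-dot-L diffuse term.
float diffuse(const Vec3& n, const Vec3& toLight) {
    float d = n.x*toLight.x + n.y*toLight.y + n.z*toLight.z;
    return d > 0.0f ? d : 0.0f;
}

int main() {
    Mat4 identity = {{{1,0,0,0}, {0,1,0,0}, {0,0,1,0}, {0,0,0,1}}};
    Vec3 pos = {1.0f, 2.0f, 3.0f}, normal = {0,1,0}, light = {0,1,0};
    Vec4 clip = transform(identity, pos);
    printf("clip: (%.1f, %.1f, %.1f, %.1f)  diffuse: %.1f\n",
           clip.x, clip.y, clip.z, clip.w, diffuse(normal, light));
    return 0;
}
```

A GF3 vertex program still executes the equivalent of those multiplies and dot products every frame; the programmability just lets developers rearrange and extend them.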

Soccerman-

"so, that means 3D speed/power doesn't depend on nVidia, after all, the companies can set their clocks to whatever the hell they want.."

That would be a good point, except that nVidia DOES let vendors use different clock rates. Check the nVidia site: they list the GF as having 480 MPixel performance even though the Hercules version had 520 MPixels. The TNT2 Ultra series boards offered a rather huge rift between the fastest available and the default nVidia specs. Apple is currently shipping GF2 MXs clocked above their PC counterparts, and Hercules shipped their GF2s overclocked versus the default nVidia settings (and then had to roll the speed back, but not because of nVidia).

Orbius-

"I'd argue that without T&L those games were pretty good anyway"

I'd argue that Half-Life was still a good game running in software mode.

"but 2 games a feature does not make"

There are many more; I'm just listing two GOTY winners. Ignoring that, how many people here think Voodoo1 and in the same thought have Tomb Raider and/or GLQuake? I know I sure as he!l do.

"Also with the amount of technology 'borrowed' from Ati, I think Nvidia should stamp an 'Ati' logo on one of their chips."

Name a single feature. You might want to take a look at the patents on those "new" features. "Hyper-Z" technology, or rather the actual portions that make it work, was mostly patented right around the launch of the Voodoo1, and it wasn't by ATi (or 3dfx, for that matter).

KarlHungus-

"I honestly doubt the drivers are causing the "low" numbers."

They are definitely having an impact. Look at those charts you posted the links to: only a 4% drop moving from 16-bit to 32-bit running 1600x1200 Quaver benches? The GF2 Ultra is nowhere near its peak theoretical 16-bit or 32-bit performance at that setting, so the raw power edge that it has shouldn't be a factor.
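To put a rough number on why 1600x1200 32-bit should be memory-bound, here's a back-of-the-envelope calc (the 3x overdraw and per-pixel traffic figures are my assumptions, and texture traffic is ignored entirely):

```cpp
// Rough framebuffer bandwidth estimate at 1600x1200, 32-bit color.
// Assumed per drawn pixel: one 4-byte color write plus a 4-byte Z read
// and a 4-byte Z write; assumed average overdraw of 3x.
#include <cstdio>

int main() {
    const double pixels   = 1600.0 * 1200.0;
    const double overdraw = 3.0;                    // assumption
    const double bytesPerPixel = 4.0 + 4.0 + 4.0;   // color + Z read + Z write
    const double perFrame = pixels * overdraw * bytesPerPixel;
    const double gf2uBandwidth = 7.36e9;            // 230 MHz DDR, 128-bit bus

    printf("framebuffer traffic per frame: %.0f MB\n", perFrame / 1e6);
    printf("bus-limited fps ceiling: %.0f\n", gf2uBandwidth / perFrame);
    return 0;
}
```

That works out to roughly 69 MB of framebuffer traffic per frame and a ceiling of about 106 fps from the memory bus alone, which the Ultra never gets near at that setting. Drivers still matter.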

Soccerman Part 2-

"the question is, what will ATi bring to the table with their next card?"

I hope they can do it, but I have very little faith. ATi in the high end, despite what many seem to believe, has not been a threat to nVidia for more than a total of three or four months in the last several years (and part of that was the Rage128 briefly edging out the TNT). The Radeon, for all it does well, was simply manhandled by the GF2 Pro and Ultra, which have been widely available for some time now. The Pro is also priced in the sub-$300 range, which has been considered the "mass market high end" price for video cards. Perhaps ATi will shock the doubters (myself included), but they have yet to show a long-term commitment to competing in the high-end 3D gaming hardware market.

"we only NOW see the benefits to their architecture"

What benefits over the GeForce series of boards? I have been of the mindset that the Radeon's features best the GF's for gaming purposes, but it seems that developers think differently. Carmack has already gone on record stating that Doom3 will look better on a GF than on a Radeon, so what benefits of the Radeon are we supposed to see?

"so I'm wondering.. will they pile on T&L performance, or work a bit on increasing efficiency of the rasterizer (tile based rendering?), or what?"

They need to work on both areas badly. Look at the Radeon in bandwidth-limited situations: the norm is for it to lose out to the GTS despite having a bandwidth edge and all of the "Hyper-Z" technology. Their T&L performance is horrid; in some situations it is only one third that of a GeForce1 SDR, and that is an area that could get worse moving to a more flexible unit (check out the GF2U compared to the GF3 in raw poly throughput).
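For anyone wondering what "Hyper-Z" is actually supposed to buy them, here's a toy sketch of the hierarchical-Z piece of the idea (the details are my guesswork, not ATi's design):

```cpp
// Hierarchical Z in miniature: keep one coarse "farthest depth" value per
// screen tile, so a primitive that is entirely behind a tile can be thrown
// out without reading the per-pixel Z-buffer at all. Skipped Z reads are
// skipped memory traffic, which is the whole point on a bandwidth-bound chip.
#include <cstdio>

struct ZTile { float farthestZ; };  // max stored depth over, say, an 8x8 tile

// Conservative test: if even the nearest point of the incoming primitive
// lies behind the farthest depth stored anywhere in the tile, every
// per-pixel Z test would fail, so the whole tile can be rejected.
bool tileRejected(const ZTile& tile, float primitiveNearestZ) {
    return primitiveNearestZ >= tile.farthestZ;  // smaller Z = closer
}

int main() {
    ZTile tile = {0.5f};
    printf("primitive at z=0.7 rejected: %d\n", tileRejected(tile, 0.7f));
    printf("primitive at z=0.3 rejected: %d\n", tileRejected(tile, 0.3f));
    return 0;
}
```

The catch, as the benchmarks suggest, is that saved bandwidth only helps if the rest of the pipeline can actually use it.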

"I wonder what Kyro has up their sleeves, BTW.. I hope they don't go BitBoys and try to add a whole bunch of these programmable pixel shaders etc.. I want that thing ASAP! Add them later!"

IT (Kyro is just one board) needs to have all of the programmable pixel shaders to match the DX8 specs. With the X-Box on the horizon, having a fully DX8-compliant part isn't a luxury, it is a requirement. I have much higher hopes that they will offer some competition for nVidia than ATi will, although the ArtX acquisition could put ATi in a competitive situation.

Zero-

"My Radeon didn't look half bad."

Were you reading the same review? In the actual game benches the GF3 was absolutely obliterating the Radeon's scores (though they kindly didn't show them head to head). Try 1024x768 32-bit color Quake3 with 2X FSAA enabled: the GF3 is roughly 500% faster than the Radeon 64MB. Even ignoring FSAA and sticking with plain old high resolution (16x12, 32-bit), the GF3 is in the 200% to 300% faster range. I can't wait till we see numbers with release drivers, in actual games, and on boards that aren't an Ultra (a $500 MSRP piece itself, folks).
 

Urinal Mint

Platinum Member
Jan 16, 2000
2,074
0
0
I'm still wondering if the 2D is any better. I mean, can this board make a Sony G400 look as good as, say, a Matrox G400?
 

KarlHungus

Senior member
Nov 16, 1999
638
0
0
Ben -

<< They are definitely having an impact. Look at those charts you posted the links to: only a 4% drop moving from 16-bit to 32-bit running 1600x1200 Quaver benches? The GF2 Ultra is nowhere near its peak theoretical 16-bit or 32-bit performance at that setting, so the raw power edge that it has shouldn't be a factor. >>

First off, that earlier post of mine was a response to an even earlier post; I guess I should've made that clear. As far as your quote goes, I believe I already stated something along those lines. The GF3 has not eliminated the bandwidth problem, but it's done a damn good job; a simple comparison between the 16-bit and 32-bit Quaver scores will tell you that. My entire point was that I doubted the GF3 would ever match the GF2U in 16-bit color at 1280x1024 or 1600x1200. I don't find this issue to be a major detriment for the GF3 because you get 32-bit for (almost) free. BTW, "low" was meant to be sarcastic; I'll be sure to use a smiley next time.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Karl-

"My entire point was that I doubted that the GF3 would ever match the GF2U in 16 bit color at 1280x1024 or 1600x1200."

My main point is that I think it will, at least at those resolutions. Drop it to 640x480 or 800x600 and the less efficient T&L of the GF3 likely won't be able to keep pace with the GF2U, but at resolutions that high I would be surprised if the GF3 can't keep pace with the GF2U once it has proper drivers.
 

KarlHungus

Senior member
Nov 16, 1999
638
0
0
Ben -

I guess only time will tell... I'd bet you a beer on it, but I'm not sure how either of us could collect.
 

Noriaki

Lifer
Jun 3, 2000
13,640
1
71


<< Is it still worth the upgrade though? To shell out an additional $500 to upgrade from a GeForce 2? >>

Not yet. If you don't take the programmable T&L into account (and current games don't), a GF3 is hardly more than a GF2 Ultra with a slightly downclocked core and better FSAA (4x FSAA for the price of 2x; see the rough numbers below). It's really not going to be that impressive in current games. The GF3 will be a bad-ass when we see Doom3 and the like, but until we see such games your GF2 will do you nicely.
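To put rough numbers on "4x for the price of 2x", here's the cost model I have in mind (the relative units and the shade-once assumption are mine, not anything out of nVidia's documentation):

```cpp
// Why multisampled FSAA is cheaper than supersampled FSAA: supersampling
// runs the whole pixel pipeline (texture fetches included) once per
// subsample, while multisampling shades/textures once per pixel and only
// replicates the color/Z writes across subsamples.
#include <cstdio>

int main() {
    const double subsamples = 4.0;
    const double shadeCost = 1.0;   // texture fetch + combine, relative units
    const double writeCost = 1.0;   // color/Z write, relative units

    double supersample = subsamples * (shadeCost + writeCost);  // 4x everything
    double multisample = shadeCost + subsamples * writeCost;    // shade once

    printf("4x supersampling relative cost: %.1f\n", supersample);  // 8.0
    printf("4x multisampling relative cost: %.1f\n", multisample);  // 5.0
    return 0;
}
```

The real ratios depend on how texture-heavy the scene is, but that's the flavor of it.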

nVidia had to produce a programmable T&L engine now or developers never would have started using it, but it will be a while before gamers see the benefits. For now we are fine with our GeForces, Radeons and Voodoo5s (hell, I still have a Voodoo3...though it's starting to feel a little old).

I firmly believe the GF3 is the biggest advancement in consumer 3D graphics since the Voodoo1, but it doesn't have the "wow factor" or the excitement yet. We have to wait to see the benefits, but a programmable T&L engine is a HUGE leap forward. It just takes some time before we land.

I was never really on the T&L bandwagon...but the GF256/GF2 static T&L vs the GF3's programmable T&L is a whole different ball game, if you'll pardon the tired old expression.
 

JellyBaby

Diamond Member
Apr 21, 2000
9,159
1
81
<< It's really not going to be that impressive in current games. >>

Noriaki,

Hmmm, what some developers might do is release enhanced versions of current games to support the GF3. These would likely use some of the new features, but probably not to their fullest extent. So it's a mixed bag: such games will generate hype and a "reason to buy" the GF3, but they also won't show off the GF3 as well as a full-fledged effort would.
 

daddyo

Senior member
Oct 9, 1999
676
0
0
The next "must have" technology in the graphics arena will be when the BitBoys finally release their WonderKard 2000K using XZ-Buffer RMFF RAM and 2 Trillion Capaciductors. They're said to be in talks with "major players" in the video card market as we speak.
 