ANAND'S GeForce FX review: WEAK!!!

Page 3

CurtCold

Golden Member
Aug 15, 2002
1,547
0
0
...And ATI fanboys Rejoice...


Really, now that nVidia is on the .13 micron process, they can only improve it, imo. I think nVidia will be poised to unleash some awesome video cards in the near future. They're not going to let ATI run all over them for too much longer.
 

ondaedg

Junior Member
Sep 30, 2001
7
0
0
Originally posted by: Deeko
Originally posted by: ondaedg
uhm, and how do you know this? Nvidia's driver development team is one of the best in the business. I can't remember a single product they put out that didn't have its performance improved regularly due to timely driver releases. I'll reserve judgment until it officially ships.

I wish some of these sites that previewed the card would have run some benches at 16 bit just to determine if bandwidth really is what is killing the performance of the FX.

Umm... I don't think you get it. The FX is out a half year later, is clocked significantly faster, and is supposed to be more advanced. It should be mopping the floor with the 9700. It can't even MATCH it. Consider the price difference, too. And 16-bit? Are you on crack? Who runs 16-bit with a GeForce FX? The fact that it gets smashed in AA/aniso certainly makes it seem like memory bandwidth is the issue, wouldn't you say? There is no way that the bandwidth of that card can support that fill rate.

Why would nVidia send out their product if they knew the Radeon 9700 Pro would beat it? That doesn't seem logical. I don't even get why they would send AnandTech one if they already knew a Radeon 9700 would beat it. Lots of mystery going on here. I think I'll take the wait-and-see approach.
What are they gonna do? Scrap the project? They still have to release something....3dfx made the mistake of delaying too much with the Voodoo5. Look where it got them.

Hey dude, calm down, it's just video cards, not a World Cup soccer match.... The reason for a 16-bit test is so that the performance delta between 16 and 32 bits can be measured. If performance falls off sharply at 32-bit, then it is obvious that bandwidth is the reason for the NV30's lackluster performance. As for your overzealous attitude about this issue, ATI and nVidia want your money, not your moral support. You won't see them on the sideline cheering for you in your World Cup match.... or whatever it is that you play.....
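The 16-bit vs. 32-bit idea boils down to simple arithmetic, so here's a rough Python sketch of it. The fps numbers and the 0.85 cutoff are made up purely for illustration, not taken from any actual benchmark:

```python
# Sketch of the bandwidth test described above: render the same scene at
# 16-bit and 32-bit color, then compare frame rates. 32-bit roughly doubles
# the color traffic per pixel, so a large drop suggests the card is starved
# for memory bandwidth rather than limited by fill rate.

def bandwidth_bound(fps_16bit: float, fps_32bit: float, threshold: float = 0.85) -> bool:
    """Return True if the 32-bit frame rate falls well below the 16-bit one."""
    return (fps_32bit / fps_16bit) < threshold

# Hypothetical numbers, illustration only:
print(bandwidth_bound(140.0, 95.0))   # big drop at 32-bit -> True (likely bandwidth-bound)
print(bandwidth_bound(120.0, 115.0))  # small drop -> False (likely fill-rate-bound)
```

Cranking resolution or FSAA stresses bandwidth the same way, which is why the AA/aniso results are so telling.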
 

HappyGamer2

Banned
Jun 12, 2000
1,441
0
0
I agree with jiffylube, it reminds me of the Voodoo5: late, expensive, huge, makes lots of heat and draws lots of power.
I remember how the NV boys ripped on the Voodoo5 for all this, but now........?
Oh well, I still run my V5 though.

Yeah, the Voodoo5 wasn't as fast overall as the GeForce 2 standard model, but it wasn't all that much slower, plus its AA was better.

NV kinda slipped on their great 6-month product cycle too.

PS: at least when the Voodoo5 finally hit the street its drivers were good (not perfect though), and the second set came out quickly afterwards and took care of most of the first set's problems.
 

sash1

Diamond Member
Jul 20, 2001
8,896
1
0
Originally posted by: jiffylube1024
Originally posted by: Thor86
Originally posted by: Rudee
Time to load up on ATI stock

Heh, I'm sooo tempted.

I'm tempted as well, but it's a BIG risk right now. Besides the insider trading probe into ATI now (never mind the several probes into nVidia's insider trading practices), the market is just too volatile. ATI shares just keep getting lower and lower, and just dropped another buck, and some would say (myself for one) that this *looks* like a good time to buy. I would definitely recommend against it, however. Best to bite your lip and watch during a recession. Remember the depression...?

If anything, I'd say to sell nVidia stock now (if you own any), the GF FX line looks like it won't be as successful as the GF4 series (which was great).

Just my $0.02
I disagree. The nForce has become a huge part of nVidia's business. As well, the people of AT are only a small part of the population. There are morons who just buy the most expensive card, thinking it's the best. Morons who think nVidia is the best because they don't know crap (I love going around Best Buy and laughing at morons who actually buy stuff there and think they are getting a good deal). I still think nVidia has a strong hold on the market even though ATi is clearly better.

 

HappyGamer2

Banned
Jun 12, 2000
1,441
0
0
I love it: I go out and play in the ME desert for a year, never played a single PC game in that time, and I come back to see this.
Go ATI go!
I wonder if ATI is just sitting back, waiting to put their next card out only a few weeks after NV gets this card to the streets, just to make NV look bad. I hope so.
 

Deeko

Lifer
Jun 16, 2000
30,213
11
81
Hey dude, calm down, it's just video cards, not a World Cup soccer match.... The reason for a 16-bit test is so that the performance delta between 16 and 32 bits can be measured. If performance falls off sharply at 32-bit, then it is obvious that bandwidth is the reason for the NV30's lackluster performance. As for your overzealous attitude about this issue, ATI and nVidia want your money, not your moral support. You won't see them on the sideline cheering for you in your World Cup match.... or whatever it is that you play.....
Who's not calm? Just because I ripped into your argument doesn't mean I'm not calm. And I'm well aware that 32-bit uses much more bandwidth than 16-bit, but that was the test back in the day. Now if you wanna see if bandwidth is a problem, pump up the resolution and the FSAA. And I don't know what you're talking about with moral support, kid. I'm a 'fanboy' of neither company; if I support any single video company it's 3dfx, and they are no more. But don't try to tell me moral support doesn't help: a company with more moral support gets more sales, and gets a higher stock. So I'd say they'd prefer not to be hated.... but regardless, you're still talking out your ass.
 

tbates757

Golden Member
Oct 5, 2002
1,235
0
0
I wouldn't say Anand's review is weak; the product he reviewed, the GeForce FX, was, however.
 

CheapTOFU

Member
Mar 7, 2002
171
0
0
1) What I'm seeing here is that the GFFX is basically a .13 micron GeForce4 with a faster core and added stuff for DX9... so it sux..

2) Because nVidia introduced the first .13 micron tech for GPUs, stock prices will probably go up.. just like when AMD first introduced 1 GHz Athlons.. only for a while

3) nVidia does not care if you buy the GFFX or not... they care about people who buy the GeForce 4 Ti 4200 or GF4 MX (video cards below $200)

4) Who cares!!!... there's no good game to play!!! I'd rather buy an Xbox or PS2 or GameCube and many cheap games with $400..
 

WickedChild

Junior Member
Jan 28, 2003
13
0
0
Look... the fact of the matter is that in its current state, the GFFX isn't doing a very good job... sure, it's the speed king... but heck, it's pricey, hot, noisy... doesn't kick R9700P ass at all if you ask me... I do think they might clear things up with new drivers... but they did have those 6 months after all... I still expected more by now, so I'm kinda skeptical about it even with newer drivers...
About overclocking it, I dunno... some sites tried overclocking and didn't get far... in fact, when they did overclock, it got so hot that it automatically slowed itself down from the 500/1000MHz 3D mode (which was overclocked a very small bit) to the 300/600MHz 2D mode... good protection, but heck... doesn't seem very OCable if you ask me... the R9700, on the other hand, is very OCable and could probably easily close up most of those gaps between it and the GFFX...

Considering the R350 is going to be basically an R300, only faster, smaller and with more features, I'd say it's already got a good base... add the extra MHz and the .13 production, and you're in for a hell of a performer at stock and overclocked speeds... probably hardly as hot as the GFFX...

I hate nVidia... always have... I still think their cards are good... but I'd never touch them... the first time I touched an nVidia product was a little while ago, when I ordered my mom's new computer that had an nForce chipset... the mobo was screwed... I take that as a sign...

About the IQ... I just read at HardOCP that the screenshots showing the poor AA and AF don't actually show the in-game IQ, because they are taken from the frame buffer, whereas by that point the AA and AF work is only half done... it actually continues after the frame is out of the frame buffer... so you really might be looking at degraded pics of the IQ that the GFFX shows... though I'm surprised that the guys here at Anandtech didn't notice that... it's strange...
Tom's review is really strange... and Anand's was way, way, way better if you ask me... but that's just me... and others...

Basically, all in all, the GFFX is a good vid card... but it's too little too late to compete with ATI...
Looking at the R9700 Pro, there's no reason for a card giving that performance (or even better with new drivers) to be that large, hot and power-consuming... you can just as easily OC your R9700, as we pointed out earlier, and get almost identical results at lower noise levels, less heat and less power consumption... so with all its high-speed core and DDR II memory, which really doesn't add anything to it, it doesn't seem worth the buck... I'd rather buy a Ti4600 personally...
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: KnightBreed
Originally posted by: jiffylube1024
From the moment I started reading this review/preview, I got a sense of déjà vu. This is 3dfx all over again (perhaps actually acquiring the core of 3dfx is responsible for this blunder). Just like the Voodoo 5, the GF FX is way late out of the gate, is mysteriously slower than advertised (drivers? who knows), is a total inconvenience to the PC user (the V5's were longer than a city bus, the GF FX takes 2 slots and generates similar heat to a new CPU), etc. I'm not saying it's not a great card - it's the king right now, well actually it's sharing the gold right now, kind of like the Canadian and Russian figure skating champs in the Winter Olympics. However, this is a serious blow to nVidia, as it's not the be-all end-all they proclaimed.
Your 3dfx comparison is flawed.

1) The Voodoo5 was trounced in every performance test.
2) The Voodoo5 was technologically behind the GeforceGTS. No DOT3 or hardware T&L support being most notable.

1) The FX is about on par with the 9700 in most of the tests I've seen. Wins some benches, loses others.
2) The FX is technologically superior to the 9700... barely.

With all that said, am I disappointed with the FX launch? Eh, a bit. I wasn't expecting it to perform leaps and bounds over the 9700. I've noticed most people complaining about the cooler - and rightfully so. That thing is ridiculous.

Just a side note, the V5 was not that long of a card. Hell, the Geforce4 Ti's (4400/4600) are just as long.

I didn't say it's the exact same thing as the V5 launch, just that it's very similar. It's late to the market and not as good as advertised (by a long shot). It was supposed to be out 6 months ago, and would have been the undisputed king (the 9700 Pro would have come shortly after, and ATI would be the one playing catchup).
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
I think the GeForce FX sucks and is arguably NOT technologically superior to the 9700. It is twice as big, three times as heavy, and runs super hot. And it barely inches ahead of the 9700.

What is that? That is simply ridiculous.

I thought tomshardware's review was OK. The card is the new king. It is the fastest.
 

Deeko

Lifer
Jun 16, 2000
30,213
11
81
Originally posted by: tbates757
I wouldn't say Anand's review is weak; the product he reviewed, the GeForce FX, was, however.

I meant the card.
 

HappyGamer2

Banned
Jun 12, 2000
1,441
0
0
Yep, it MAY be the king if you look at it purely by benchmark numbers, but it's not on the street yet, and things may look better or worse when it actually gets into the public's hands.
 

ondaedg

Junior Member
Sep 30, 2001
7
0
0
Originally posted by: Deeko
Hey dude, calm down, it's just video cards, not a World Cup soccer match.... The reason for a 16-bit test is so that the performance delta between 16 and 32 bits can be measured. If performance falls off sharply at 32-bit, then it is obvious that bandwidth is the reason for the NV30's lackluster performance. As for your overzealous attitude about this issue, ATI and nVidia want your money, not your moral support. You won't see them on the sideline cheering for you in your World Cup match.... or whatever it is that you play.....
Who's not calm? Just because I ripped into your argument doesn't mean I'm not calm. And I'm well aware that 32-bit uses much more bandwidth than 16-bit, but that was the test back in the day. Now if you wanna see if bandwidth is a problem, pump up the resolution and the FSAA. And I don't know what you're talking about with moral support, kid. I'm a 'fanboy' of neither company; if I support any single video company it's 3dfx, and they are no more. But don't try to tell me moral support doesn't help: a company with more moral support gets more sales, and gets a higher stock. So I'd say they'd prefer not to be hated.... but regardless, you're still talking out your ass.

I thank you for acting immature. It just makes anything you say even less credible.

 

glenn1

Lifer
Sep 6, 2000
25,383
1,013
126
I agree with jiffylube, it reminds me of the Voodoo5: late, expensive, huge, makes lots of heat and draws lots of power.
I remember how the NV boys ripped on the Voodoo5 for all this, but now........?
Oh well, I still run my V5 though.

Yeah, the Voodoo5 wasn't as fast overall as the GeForce 2 standard model, but it wasn't all that much slower, plus its AA was better.

When I said basically the same thing back in November, I was scoffed at: the original 3dfx-Nvidia comparison thread
 

Deeko

Lifer
Jun 16, 2000
30,213
11
81
Originally posted by: ondaedg
Originally posted by: Deeko
Hey dude, calm down, it's just video cards, not a World Cup soccer match.... The reason for a 16-bit test is so that the performance delta between 16 and 32 bits can be measured. If performance falls off sharply at 32-bit, then it is obvious that bandwidth is the reason for the NV30's lackluster performance. As for your overzealous attitude about this issue, ATI and nVidia want your money, not your moral support. You won't see them on the sideline cheering for you in your World Cup match.... or whatever it is that you play.....
Who's not calm? Just because I ripped into your argument doesn't mean I'm not calm. And I'm well aware that 32-bit uses much more bandwidth than 16-bit, but that was the test back in the day. Now if you wanna see if bandwidth is a problem, pump up the resolution and the FSAA. And I don't know what you're talking about with moral support, kid. I'm a 'fanboy' of neither company; if I support any single video company it's 3dfx, and they are no more. But don't try to tell me moral support doesn't help: a company with more moral support gets more sales, and gets a higher stock. So I'd say they'd prefer not to be hated.... but regardless, you're still talking out your ass.

I thank you for acting immature. It just makes anything you say even less credible.

I thank YOU for calling me immature, for no reason whatsoever. Please attempt to refute my arguments, and then grow up a little. Sorry if I hurt your feelings, don't tell mommy on me
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: glenn1
I agree with jiffylube, it reminds me of the Voodoo5: late, expensive, huge, makes lots of heat and draws lots of power.
I remember how the NV boys ripped on the Voodoo5 for all this, but now........?
Oh well, I still run my V5 though.

Yeah, the Voodoo5 wasn't as fast overall as the GeForce 2 standard model, but it wasn't all that much slower, plus its AA was better.

When I said basically the same thing back in November, I was scoffed at: the original 3dfx-Nvidia comparison thread

I just skimmed through that thread - it's a pretty good thread, with a lot of skeptics...

People weren't ready to accept nVidia not living up to its promises in November. Since the GeForce 1 card, they had been untouchable, and hadn't botched a release (until now). Now it's finally clicking with people.

I guess your oracular foresight just wasn't appreciated.
 

sanstrom

Junior Member
Jan 29, 2003
2
0
0
What about all of the 9700's well-documented compatibility problems? And don't say "I never had any problems" - that don't mean jack. Have you gone into a Best Buy or Fry's or a local store that sells these cards? Look at the return tags on the boxes. I'd say over 2/3 are cards that were returned. To me that says volumes. I actually gave the 9700 a shot. I got two different cards, to no avail. I started with a fresh install of Windows, the latest drivers, and DX9. No dice. 3D would either reboot the two systems I tried them in or just crash to the desktop. This should be a deciding factor in what brand to buy. I would have loved for the 9700 to have worked for me... I do believe (if it didn't have issues) it would be a better buy right now. But... I think I'll stick with the ol' GeForce 3 and contemplate the FX. Maybe if ATI's next card isn't so buggy I'll try it out.......
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Why did you automatically assume it's the video card's fault? Is your power supply powerful enough to handle the 9700? Did you have a resource conflict? Your problem with the 9700 could have been from any number of reasons other than the card itself.
 

FrodoB

Senior member
Apr 5, 2001
299
0
0
It will be very difficult for me to ever buy another ATI card. My old Radeon LE from a couple years back left such a bad taste in my mouth. There were so many compatibility problems with that card. I upgraded to a GeForce 3 Ti 200 after I realized that I was spending more time troubleshooting than actually playing games. I'm hoping the intro of the GeForce FX will lower prices across the entire market. Despite the level playing field between ATI and nVidia, video cards in general are still EXPENSIVE. That's probably why so many new computers come with a GeForce 2 or 4 MX variant, even in these manufacturers' "high end" machines.
 

CoolLight

Member
Dec 26, 2002
88
0
0
Just want to say this,

An overclocked ATI Radeon 9700 Pro RULES!

You can't overclock the nVidia FX at all, which SUX.
 

Bopple

Member
Jan 29, 2003
39
0
0
I don't think it's proper to evaluate a video card solely on overclockability.
But then again, the GFFX seems to be overclocked to the limit already at the factory.
 