Anand's 9800XT and FX5950 review, part 2


Sazar

Member
Oct 1, 2003
62
0
0
better review than the first... certainly more professional...

still some issues but Evan... you can read them in the other forum...

 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Nice review. It was long but it was definitely worth the effort to read it.

I do not believe that nvidia could code 50% to 60% more performance in drivers alone.
I suspect that nVidia have written a whole heap of custom DX 8/9 shaders for current games and replaced the games' shaders with their own. Yes it's cheating but if it helps the customer without degrading IQ then you really can't fault them for trying to make their product more attractive.

The problem is that they're constantly going to have to do this for every new game that comes out.
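To picture what that kind of replacement could look like, here's a toy sketch (the driver hook, the shader strings, and the "tuned" version are all invented for illustration, not nVidia's actual code): the driver fingerprints each shader a game submits and, if it recognizes one, silently swaps in its own hand-tuned version.

```python
import hashlib

def fingerprint(src: str) -> str:
    """Hash the shader text so the driver can recognize it cheaply."""
    return hashlib.md5(src.encode()).hexdigest()

# Hypothetical table of hand-tuned replacements shipped inside the driver,
# keyed by the fingerprint of the game's original shader. The shader text
# and the "tuned" equivalent are both made up for illustration.
GAME_SHADER = "ps_2_0: mul r0, t0, c0"
REPLACEMENTS = {fingerprint(GAME_SHADER): "ps_2_0: hand-tuned equivalent"}

def maybe_substitute(shader_src: str) -> str:
    """Swap in a vendor-tuned shader if this one is recognized; else pass it through."""
    return REPLACEMENTS.get(fingerprint(shader_src), shader_src)

print(maybe_substitute(GAME_SHADER))               # known shader -> tuned version
print(maybe_substitute("ps_2_0: add r0, t0, t1"))  # unknown shader -> unchanged
```

Which is exactly why it doesn't scale: every new game ships new shaders with new fingerprints, so the table has to be rebuilt for every release.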
 

XeoBllaze

Golden Member
Feb 12, 2003
1,414
0
71
Originally posted by: Luagsch
somehow i don't trust nvidia.... don't know why...
i read the whole thing and everything looks fine but the voice in my head says "it's nvidia, watch out"
That is 100% f4nb0y15m!
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Rollo
This is not fanboyism, this is based on the fact that ATI was (and still is) the market leader for the last two years. Just like Nvidia wore the crown from the TNT2 all the way to the GeForce 4. Bring on NV40/R400.
Err, Jiffy, not to nitpick, but you're contradicting yourself here. The 9700Pro started to ship to retail stores 8/19/2002, which isn't even 14 months ago, let alone two years?
Beyond that, when you consider there's no real overriding advantage of the 9800 over the 5900 for the last 4-5 months, you're down to talking about a window where ATI was a clear market leader for 9-10 months, a long way from two years.

And you contradict yourself also. "Not to nitpick", and then you nitpick my post. Alright, so it's more like one year plus. I mean the market leader as the fastest performing card first out the door (until beaten). The Radeon 9700 was first, then the 9800 came. Nvidia has not surpassed the 9800 yet, they've merely matched it (almost). So Nvidia is technically the one playing catch up. That is what I mean.

And where did I say "clear" market leader?
 

yhelothar

Lifer
Dec 11, 2002
18,408
39
91
WOW!
you guys get the award for the "MOST COMPREHENSIVE 9800XT/FX5950 REVIEW"
60+20 pages... whoa

very informative review...
 

Rogodin2

Banned
Jul 2, 2003
3,224
0
0
Good review, but one suggestion regarding flight sims: you NEED TO USE IL-2 FB WITH "PERFECT" SETTINGS as a benchmark - it's a VGA killer at high res, with and without AF and AA.

The XT won about 75% of the tests by more than 10%, and sometimes by up to 40% - so I'm impressed with ati and have the same regard for nvidia as I had before - I don't trust them.

rogo
 

Nebor

Lifer
Jun 24, 2003
29,582
12
76
have the same regard for nvidia as I had before-I don't trust them.

And you never will. That's good, never ever buy an nvidia card. I hereby forbid you to do so.
 

lifeguard1999

Platinum Member
Jul 3, 2000
2,323
1
0
BFG10K wrote
I suspect that nVidia have written a whole heap of custom DX 8/9 shaders for current games and replaced the games' shaders with their own. Yes it's cheating but if it helps the customer without degrading IQ then you really can't fault them for trying to make their product more attractive.

I agree with your suspicion, but disagree with your conclusion that it is cheating. As long as the answers (images) come out the same, what makes it cheating?
 

Rogodin2

Banned
Jul 2, 2003
3,224
0
0
When I buy coffee from a gourmet roaster I expect it to be high-grade arabica - which is what I paid for - and if I found out it wasn't (either by the taste or by looking at their bags) I'd not purchase their products - this analogy holds true for nvidia.

rogo
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: lifeguard1999
BFG10K wrote
I suspect that nVidia have written a whole heap of custom DX 8/9 shaders for current games and replaced the games' shaders with their own. Yes it's cheating but if it helps the customer without degrading IQ then you really can't fault them for trying to make their product more attractive.

I agree with your suspicion, but disagree with your conclusion that it is cheating. As long as the answers (images) come out the same, what makes it cheating?

Exactly... it's very important to remember, as stated in this review, that nVidia and ATI GPU's are VERY different, and use different methods to accomplish the same goal. So in the end, what does it matter HOW the image is put up on the screen, as long as it happens, it looks good, and its performance is acceptable?

BFG, I see what you're saying, that if nVidia does their own thing optimized for their hardware, and ATI does what the game developer tells them to, it's not exactly a level playing field. But... if ATI could write their drivers to get better performance without reducing image quality, would you, as a consumer, want that? You should.
AMD and Intel processors use very different means to achieve the same goals... does that mean one is cheating? Is Intel cheating by using SSE2 in the P4 while the AthlonXP didn't? By your logic, it is. Of course, I disagree. SSE2 is an optimization by Intel, so is Hyper-Threading... neither of which AMD CPU's have been able to take advantage of... if those are "legal" in the computer industry, why can't nVidia do the same things with their GPU????
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Rogodin2
When I buy coffee from a gourmet roaster I expect it to be high-grade arabica - which is what I paid for - and if I found out it wasn't (either by the taste or by looking at their bags) I'd not purchase their products - this analogy holds true for nvidia.

Nvidia's in the coffee business now? I don't see much of a connection with this wonky analogy to Nvidia. Maybe they're cutting corners, maybe they aren't. We shall see...


Originally posted by: BFG10K


The problem is that they're constantly going to have to do this for every new game that comes out.

I agree with this, and it sucks unfortunately. Nvidia doesn't want to fight ATI straight on, so they come out with all this custom "Cg" nonsense that needs to be optimized for every single game. This is BS, and it creates unnecessary work for both Nvidia and game developers (see: Valve with HL2).

 

1ManArmY

Golden Member
Mar 7, 2003
1,333
0
0
Good read for the most part, but I thought there was too much cheerleading going on for Nvidia's improvements.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: 1ManArmY
Good read for the most part, but I thought there was too much cheerleading going on for Nvidia's improvements.

Why not? Competition is good, isn't it?
 

Sazar

Member
Oct 1, 2003
62
0
0
Originally posted by: Jeff7181
Originally posted by: lifeguard1999
BFG10K wrote
I suspect that nVidia have written a whole heap of custom DX 8/9 shaders for current games and replaced the games' shaders with their own. Yes it's cheating but if it helps the customer without degrading IQ then you really can't fault them for trying to make their product more attractive.

I agree with your suspicion, but disagree with your conclusion that it is cheating. As long as the answers (images) come out the same, what makes it cheating?

Exactly... it's very important to remember, as stated in this review, that nVidia and ATI GPU's are VERY different, and use different methods to accomplish the same goal. So in the end, what does it matter HOW the image is put up on the screen, as long as it happens, it looks good, and its performance is acceptable?

BFG, I see what you're saying, that if nVidia does their own thing optimized for their hardware, and ATI does what the game developer tells them to, it's not exactly a level playing field. But... if ATI could write their drivers to get better performance without reducing image quality, would you, as a consumer, want that? You should.
AMD and Intel processors use very different means to achieve the same goals... does that mean one is cheating? Is Intel cheating by using SSE2 in the P4 while the AthlonXP didn't? By your logic, it is. Of course, I disagree. SSE2 is an optimization by Intel, so is Hyper-Threading... neither of which AMD CPU's have been able to take advantage of... if those are "legal" in the computer industry, why can't nVidia do the same things with their GPU????

sse2 is an instruction set... as is sse... it is up to amd to acquire and use the sse2 instruction set...

seeing the situation now... amd's 64 bit instruction set must now be used by intel...

the instruction sets are there for a specific reason... standards and a level playing field, so to speak... the architectures may be different... the end result should be the same...

as it stands... the gf fx is not too different architecturally than the gf4 ti series... undoubtedly it is more complex but the basic marchitecture is not too far removed...

I am hoping that there comes a time when the devs can code to an api... not to a card's individual strengths... it will save them money and time during which they can implement more features perhaps... or *gasp* better storylines
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
I am hoping that there comes a time when the devs can code to an api... not to a card's individual strengths... it will save them money and time during which they can implement more features perhaps... or *gasp* better storylines
I thought it was pretty clear in the article that nVidia plans on making it seamless... or at least trying to. So that the driver can turn regular DX9 instructions into whatever instructions nVidia wants to use for their GPU. In effect, they're programming the GPU to run DX9 instructions... hence the term, "highly programmable GPU."
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Jeff7181
I am hoping that there comes a time when the devs can code to an api... not to a card's individual strengths... it will save them money and time during which they can implement more features perhaps... or *gasp* better storylines
I thought it was pretty clear in the article that nVidia plans on making it seamless... or at least trying to. So that the driver can turn regular DX9 instructions into whatever instructions nVidia wants to use for their GPU. In effect, they're programming the GPU to run DX9 instructions... hence the term, "highly programmable GPU."

No, Nvidia has their own "Cg" compilers that are not DX9 standard.

Taken from Anand's conclusion:

Right now NVIDIA is at a disadvantage; ATI's hardware is much easier to code for and the performance on Microsoft's HLSL compiler clearly favors the R3x0 over the NV3x. NVIDIA has a long road ahead of them in order to improve their compilers to the point where game developers won't have to hand-code special NV3x codepaths, but for now ATI seems to have won the battle.
 

George Powell

Golden Member
Dec 3, 1999
1,265
0
76
Hope I'm not being impatient, but when are we likely to see these new drivers from nvidia? It is one thing to allow benchmarking, but why hold out on the community as a whole?

As it stands ATI has the lead at the moment because the drivers are available and work (mostly), so theirs is the card I'm more likely to buy.

However, those who already own NV35 chips would really like to have the performance and image quality that they bought the card for in the first place.

C'mon nvidia, appease the masses and give something back to the people who help keep you in business.
 

THUGSROOK

Elite Member
Feb 3, 2001
11,847
0
0
very nice review. still a minor problem tho.....
we can't verify the results cause we don't have the 52.xx drivers
NV drivers are so buggy you never know what you will find.

here's a good example: (45.23)
enabling "texture sharpening" (for AA) makes non-AA benchmarks go faster

it's not that i don't trust the AT reviewers' methods, quite the opposite in fact ~ it's Nvidia we don't trust

* are the 52.xx drivers just more BS or not?

 

reever

Senior member
Oct 4, 2003
451
0
0
Originally posted by: virtualgames0
WOW!
you guys get the award for the "MOST COMPREHENSIVE 9800XT/FX5950 REVIEW"
60+20 pages... whoa

very imformative review...

Comprehensive? They don't even provide full-size pictures for basically every game. Not everybody likes having things interpreted for them, and not everybody is going to believe everything a reviewer tells them, especially if they can't see it for themselves.
 

Luagsch

Golden Member
Apr 25, 2003
1,614
0
0
Originally posted by: XeoBllaze
Originally posted by: Luagsch
somehow i don't trust nvidia.... don't know why...
i read the whole thing and everything looks fine but the voice in my head says "it's nvidia, watch out"
That is 100% f4nb0y15m!
your comment actually made me laugh. thnx :beer:
now shut down your faulty "fanboyism"-detector
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
When I buy coffe from a gourmet roaster I expect it to be high grade arabica-which is what I roasted-if I found out (either by the taste or by looking at their bags) I'd not purchase their products-this analogy holds true for nvidia.

This would be the logical fallacy of "false analogy".

However, I have news for you Rogo:
If the coffee looks and tastes the same, and has similar caffeine content, I doubt your customers would care if it's brewed with lima beans.

Since a video card only has one job (generate a picture), nobody really cares if nVidia cards convert the image to an Excel spreadsheet and back to an image, if the end result looks the same as its competitor's product and is rendered as fast.

Can you say "sour latte" for Rogo?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I was hoping that they would cover this in the review but apparently that wasn't in the schedule.

There are two levels of DX9 shader compilers. The one that they talked about in the review, and the one most people hear about, is the MS HLSL compiler and Cg. The one they don't talk about is the driver-level compiler that has to take the outputted assembly-level code and convert it over to machine-level code. All vendors need to do this: ATi, nVidia, S3, everyone.

Comparing SSE/3DNow! etc. to the compilers was a good idea, but they did it in the wrong place. The DX9 HLSL/Cg compilers output straight x86-style code (in terms of this analogy) and can lean more towards one architecture or another. You can have a compiler that is more Athlon- or P4-friendly simply due to their differing approaches to design, ignoring SIMD; same case with nV and ATi at that level. The driver-level compilers are the ones comparable to a particular SIMD set. They are the ones that need to take the x86-style raw code and convert it over to machine-level code optimized for the particular chip. This is sort of like the front end on x86 processors and a SIMD compiler combined: if you are converting the higher-level code over to machine level and do it in a manner poorly suited to your architecture, your performance will be off by an enormous amount.

When the driver takes the compiled HLSL/Cg code and converts it over, it needs to be as optimized as possible. This sounds obvious, but writing a compiler is significantly more complex than writing a driver, and it takes considerable amounts of time. One of the reasons x86 is able to hang in there with the more exotic processor technology is the maturity, the decades of tweaking, of its compilers.

The current situation is that ATi was first out of the gate and they have had considerably more time to work on their driver-level compilers to optimize for their architecture, not to mention that developers have been working within the confines of what works well with ATi's drivers for some time. It is possible that certain shaders were deemed too intensive to be included in titles, because they weren't running at their potential speed eight or nine months ago, that would now work considerably better due to greater optimizations. Another advantage ATi has is that its architecture is less 'twitchy' than nV's. Think of the FX like the P4 in that respect: if you have compiled code with a lot of branches and your BPU misses frequently, the P4 would look quite poor against the Athlon even with both chips using the same less-than-stellar compiled code.

Looking at the core functions and how many cycles it takes to complete certain operations, the NV35 should be faster than the R3X0 core boards at handling certain shaders clock for clock (considerably so in some instances). We have seen none of this yet and likely won't for some time. Between ATi's head start and nV's very poorly optimized driver-level compilers, the ability to exploit any of that potential has been squandered for some time now.

The majority of the 'cheats' I've seen from nVidia seem to be compiler bugs rather than anything close to an actual cheat. Take the DoF effect in TRAoD: they significantly overdid the effect, which is not something you do to pump your framerate up, and now we are seeing the same level of performance but with the IQ issues gone. 3DM2K3 was certainly an example of a cheat - I'm not saying they are free of the label, simply that most of the 'exposing nV's cheats' articles have been based around what seemed to be fairly obvious bugs, not anything they were using to try and boost scores. I expect that they will be working on their compilers for a long while yet; I wouldn't be surprised in the least to see more bugs crop up at some point, and it wouldn't shock me to see ATi end up showing their share when we see games where nV was the lead card for shader effects, but they seem to have improved their compiler significantly. Overall the R3x0 boards should hold on to their edge in DX9 shaders - they do have some advantages that should help them retain that title vs the NV3X line - but what we have been seeing in terms of the percentage rift just didn't follow along with how much of an edge they should have had.
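If it helps, here's a toy picture of the two stages described above (everything in it is invented for illustration; the instruction strings are fake assembly, though "mad" stands for the usual fused multiply-add): the HLSL/Cg stage emits generic assembly, and the driver-level stage re-schedules it for the particular chip, e.g. by fusing a mul/add pair into one mad.

```python
# Stage 1 (hypothetical): the HLSL/Cg compiler turns high-level shader code
# into generic, architecture-neutral assembly.
def hlsl_compile(expr: str) -> list[str]:
    # Pretend "a*b+c" always compiles to this two-instruction sequence.
    return ["mul r0, a, b", "add r0, r0, c"]

# Stage 2 (hypothetical): the driver-level compiler re-schedules the generic
# assembly for the chip, here by fusing mul+add into a single mad instruction.
def driver_compile(asm: list[str]) -> list[str]:
    out, i = [], 0
    while i < len(asm):
        if (i + 1 < len(asm)
                and asm[i] == "mul r0, a, b"
                and asm[i + 1] == "add r0, r0, c"):
            out.append("mad r0, a, b, c")  # one cycle instead of two on this "chip"
            i += 2
        else:
            out.append(asm[i])
            i += 1
    return out

print(driver_compile(hlsl_compile("a*b+c")))  # ['mad r0, a, b, c']
```

The point being: stage 1 is shared and visible, but stage 2 is where all the architecture-specific scheduling lives, and that's the part each vendor has to mature on their own.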
 

skace

Lifer
Jan 23, 2001
14,488
7
81
Awesome job, you guys at anandtech. I like the push for benchmarking as many games as possible. I'd like to think this was to get a bigger picture and to force video card makers not to try and optimize their cards for one game, but to make sure they work well over the whole field.

It seems as if nVidia is still living in an OpenGL era, while ATi is really excelling at DX9 as well as edging out performance in DX8 titles.

My concern at this point, from reading around, is the following question:
Is it true that DX9 at this point is aimed at 24-bit precision and nVidia naturally only does 16- and 32-bit? I was reading an interview at TheFiringSquad where one of the nVidia engineers basically stated that they feel they are at a disadvantage due to this scenario and consider 24-bit only a short stopping point on the way to 32-bit. I wish I had more information on this.

I am also looking forward to a review that benchmarks the 3.8s against the 3.7s to show what the performance gain is from those when they arrive.

Even if the 52.xx drivers are raising the performance bar for the FX line, I still can't see any justification for buying an FX at this point. It is kind of a shame, but I hope nVidia continues to fix this line, and their future lines, to work better with current games. Hopefully these fixes come in the form of legitimate global fixes and not game-specific hacks.
 