Anand's 9800XT and FX5950 review, part 2

Page 3

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: skace
Is it true that DX9 at this point is aimed at 24-bit and nVidia naturally only does 16- and 32-bit? I was reading an interview at TheFiringSquad where one of the nVidia engineers basically stated that they feel they are at a disadvantage due to this scenario, and consider 24-bit only a short stopping point on the way to 32-bit. I wish I had more information on this.

That guy at TheFiringSquad conveniently left out that nVidia can't even do 16-bit as fast as ATi can do 24-bit, which makes his argument rather pointless.


Oh, and no full-screen shots, no absolute numbers for Tomb Raider, and the other points mentioned above all make this one piece of trash review. Considering the last one, I imagine the next one will drive me to ditch AnandTech permanently. There is more discussion on it here:

http://www.beyond3d.com/forum/viewtopic.php?t=8375
 

Sazar

Member
Oct 1, 2003
62
0
0
Originally posted by: skace
Awesome job, you guys at AnandTech. I like the push for benchmarking as many games as possible. I'd like to think this was to get a bigger picture and to force video card makers not to try and optimize their cards for one game but to make sure they work well over the whole field.

It seems as if nVidia is still living in an OpenGL era and ATi is really excelling at DX9, as well as edging out performance in DX8 titles.

My concern at this point, from reading around, is the following question:
Is it true that DX9 at this point is aimed at 24-bit and nVidia naturally only does 16- and 32-bit? I was reading an interview at TheFiringSquad where one of the nVidia engineers basically stated that they feel they are at a disadvantage due to this scenario, and consider 24-bit only a short stopping point on the way to 32-bit. I wish I had more information on this.

I am also looking forward to a review that benchmarks the 3.8s against the 3.7s to show what the performance gain is from those when they arrive.

Even if the 52x drivers are raising the performance bar for the FX line, I still can't see any justification for buying an FX at this point. It is kind of a shame but I hope nVidia continues to fix this line and their future lines to better work with current games. Hopefully these fixes come in the form of legitimate global fixes and not game-specific hacks.

Currently the minimum precision spec of DX9 is FP24...

it will be raised to FP32 in the future...
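To make the precision gap concrete, here is a minimal sketch (mine, not from the article) that simulates the mantissa precision of each format by truncating a 32-bit float. Exponent-range differences are ignored; the bit counts used are the commonly cited ones (10 for FP16, 16 for ATi's FP24, 23 for FP32).

```cpp
#include <cstdint>
#include <cstdio>
#include <cstring>

// Keep only `bits` of a float's 23 mantissa bits, roughly simulating the
// precision (not the range) of narrower shader formats.
float truncate_mantissa(float v, int bits) {
    uint32_t u;
    std::memcpy(&u, &v, sizeof u);
    u &= ~((1u << (23 - bits)) - 1);  // zero the discarded low mantissa bits
    std::memcpy(&v, &u, sizeof u);
    return v;
}

int main() {
    float x = 1.0f / 3.0f;
    std::printf("fp32: %.9f\n", x);                         // 0.333333343
    std::printf("fp24: %.9f\n", truncate_mantissa(x, 16));  // tiny error
    std::printf("fp16: %.9f\n", truncate_mantissa(x, 10));  // visible error
}
```

Errors this small are invisible in a single operation; the usual argument for the FP24 floor is that they accumulate over long shader programs.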

 

Sazar

Member
Oct 1, 2003
62
0
0
From hominid skull... using Anand's midget screenies:

click me

Notice the blurring and general reduction in IQ as you go from Cat 3.7 to Det 45.23 and finally Det 52.14...

Why could Anand not have done FEWER tests and reported more in depth?
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
I have the 52.13 Detonators and they seem decent to me. The fog bugs from the previous 51.75 drivers are gone... PS2.0 speed is definitely improved... but at the cost of vertex shader performance, it seems.
 

George Powell

Golden Member
Dec 3, 1999
1,265
0
76
The actual link from the post above is click me

That picture does tell many, many words; however, looking closely at it, there are some areas that are better on the nVidia 52.14 than the Catalyst 3.7, and the other way around.

It seems that the very distant objects get blurred worse by the ATI drivers than by the nVidia drivers, which seem a little more balanced.

This might be a way of improving performance but raises a very interesting point.

If the drivers determine which areas need or would benefit from AA/AF and then only apply the effects to those areas of the frame, then performance would go up and, rather curiously, so would image quality.

This would surely be a better solution than applying the effects to the whole frame, as I certainly notice that some areas get excessively blurred from AA.
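For what it's worth, adaptive anisotropic filtering already works roughly along these lines at the hardware level: spend filtering samples only where the texture footprint is actually stretched. A minimal sketch of the idea follows; the heuristic and function are invented for illustration, not taken from either vendor's driver.

```cpp
#include <algorithm>
#include <cmath>

// Pick a per-pixel sample count from how stretched the texture footprint is.
// dudx etc. are the texture-coordinate derivatives across the screen; using
// the longer/shorter derivative vector as the footprint's axes is only an
// approximation of the true anisotropy.
int afSamplesForPixel(float dudx, float dvdx, float dudy, float dvdy,
                      int maxSamples) {
    float lenX = std::sqrt(dudx * dudx + dvdx * dvdx);
    float lenY = std::sqrt(dudy * dudy + dvdy * dvdy);
    float major = std::max(lenX, lenY);
    float minor = std::min(lenX, lenY);
    float anisotropy = (minor > 0.0f) ? major / minor : 1.0f;
    // A wall viewed head-on (roughly 1:1 footprint) gets one tap; a floor
    // receding into the distance gets up to maxSamples.
    return std::clamp(static_cast<int>(std::ceil(anisotropy)), 1, maxSamples);
}
```

The catch, as the screenshots in this thread suggest, is that the same cutoff is a convenient knob for trading image quality against benchmark numbers.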

 

ginfest

Golden Member
Feb 22, 2000
1,927
3
81
That picture does tell many, many words; however, looking closely at it, there are some areas that are better on the nVidia 52.14 than the Catalyst 3.7, and the other way around.

It seems that the very distant objects get blurred worse by the ATI drivers than by the nVidia drivers, which seem a little more balanced.

I noticed that too. Of course, the images linked were provided to disprove the claim that the NV drivers have improved performance without sacrificing image quality.
Supposedly the 3DCenter article is "better" than the AT article, at least that's what the FUD-meisters would have us believe.

Might be time to take the 9800 out of my main rig and put in the 5900?
 

Sazar

Member
Oct 1, 2003
62
0
0
Originally posted by: George Powell
The actual link from the post above is click me

That picture does tell many, many words; however, looking closely at it, there are some areas that are better on the nVidia 52.14 than the Catalyst 3.7, and the other way around.

It seems that the very distant objects get blurred worse by the ATI drivers than by the nVidia drivers, which seem a little more balanced.

This might be a way of improving performance but raises a very interesting point.

If the drivers determine which areas need or would benefit from AA/AF and then only apply the effects to those areas of the frame, then performance would go up and, rather curiously, so would image quality.

This would surely be a better solution than applying the effects to the whole frame, as I certainly notice that some areas get excessively blurred from AA.

There are very blatant boundaries with the Dets... the blurriness otherwise is subjective, it would appear... IMO the distant objects are rendered clearer for the most part on the Cat screenies...

Anyway... the boundaries are an issue, and you can see them clearly... and this is backed up by the review at 3DCenter on filtering methods and their results...
 

THUGSROOK

Elite Member
Feb 3, 2001
11,847
0
0
Please stop blaming AT for a "bad review".
They did the best they could to provide as much info as possible, and they did an excellent job.

AT is not the center of this controversy, NV is.
Please keep that in mind.

 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
As long as the answers (images) come out to be the same, what makes it cheating?
Because it cannot exist without prior knowledge and because it isn't running the code that the game asks it to run.

Also what happens if a later game patch changes the way the shader code runs? Whoops, the drivers will still be running the old code and you won't see the new results until a newer version of the drivers arrives. Or worse yet, the game will totally break.
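A minimal sketch of how such detect-and-replace schemes are generally believed to work, and why a game patch defeats them. The code is illustrative only, not from any actual driver.

```cpp
#include <cstdint>
#include <unordered_map>
#include <vector>

using Bytecode = std::vector<uint32_t>;

// FNV-1a over the shader token stream. The "prior knowledge" is a table of
// hand-tuned substitutes keyed by the hash of the shader the game submits.
uint64_t hashShader(const Bytecode& code) {
    uint64_t h = 14695981039346656037ull;
    for (uint32_t tok : code) {
        h ^= tok;
        h *= 1099511628211ull;
    }
    return h;
}

std::unordered_map<uint64_t, Bytecode> g_replacements;

const Bytecode& selectShader(const Bytecode& submitted) {
    auto it = g_replacements.find(hashShader(submitted));
    // If a patch changes the shader even slightly, the hash no longer matches
    // and the swap silently stops applying (back to the slow path). A sloppier
    // match, keyed on the executable name for instance, could keep running
    // stale code against changed assets and break the game outright.
    return (it != g_replacements.end()) ? it->second : submitted;
}
```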
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Also what happens if a later game patch changes the way the shader code runs? Whoops, the drivers will still be running the old code and you won't see the new results until a newer version of the drivers arrives. Or worse yet, the game will totally break.
This is nothing new, I can remember not being able to install game patches until driver updates came out years ago.
 

Blastman

Golden Member
Oct 21, 1999
1,758
0
76
Originally posted by: BenSkywalker

The current situation is that ATi was first out of the gate and they have had considerably more time to work on their driver level compilers to optimize for their architecture not to mention that developers have been working within the confines of what works well with ATi's drivers for some time.
Nvidia had the NV30 core ready in the summer of 2002, about the same time as ATI's R300. The problem is Nvidia didn't want to release it on the 0.15µ process, as a 5800U clocked at the same speed as a 9700 Pro would have been beaten to a pulp in the benchmarks.

While what you say about the P4/Athlon optimization analogy and compilers is true, in the end no amount of optimization can make up for the lack of hardware shader units on the 5800 or other actual hardware deficiencies. The (ATI presentation) states the 5800 has only 2 full-precision shader units whereas the 9700 Pro has 8.
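As a back-of-the-envelope check, using the commonly quoted core clocks (325 MHz for the 9700 Pro, 500 MHz for the 5800 Ultra) and naively counting one full-precision op per unit per clock: 8 × 325 MHz = 2600 Mops/s against 2 × 500 MHz = 1000 Mops/s. Even the 5800U's clock advantage covers only part of a 4:1 unit deficit, assuming the units are comparable per clock (the reply below argues they are not).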

IQ is still being hacked to increase benchmark scores. Just open up these Aquamark images and click back and forth between them. Check the grass below the explosion and the top of the vehicle: much blurrier on the FX card.

Nv aquamark3

ATI aquamark3

 

lifeguard1999

Platinum Member
Jul 3, 2000
2,323
1
0
BFG10K: It sounds like you are saying that effects are pre-computed, and later patches to the game may break them. An example of this would be the ExtremeTech article on NVidia & 3DMark2003. If that is what you are saying, then I would agree.

However, I am thinking along different lines. Mathematically, there is no difference between 2/2 = 1 and 2 * 0.5 = 1. However, when programming, it is well known that divisions take longer to compute, so most programmers go ahead and use the second path and not the first. Actually, many compilers go ahead and do it for them now. Now, I am not a pixel shading programmer, but I make the assumption that ATI could take the first path while NVidia could take the second path, and both end up with the right answer.

Of course, assumptions make an @$$ out of U and ME.
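His example in code, as a minimal sketch (the function names are mine). The two forms agree exactly when the divisor's reciprocal is representable, and can differ in the last bit when it is not, which is exactly the gray zone these optimizations live in.

```cpp
#include <cstdio>

// Mathematically identical expressions, but the divide is several times
// slower on most hardware, so a compiler (or a driver's shader optimizer)
// will happily rewrite one into the other.
float viaDivide(float x, float d)     { return x / d; }
float viaReciprocal(float x, float d) { return x * (1.0f / d); }

int main() {
    // For d = 2 the reciprocal (0.5) is exact, so the results match exactly.
    std::printf("%f  %f\n", viaDivide(2.0f, 2.0f), viaReciprocal(2.0f, 2.0f));
    // For d = 3 the rounded reciprocal makes the results differ in the last
    // bit: "the same answer" is subtler in floating point than on paper.
    std::printf("%.9f  %.9f\n", viaDivide(10.0f, 3.0f), viaReciprocal(10.0f, 3.0f));
}
```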
 

Sazar

Member
Oct 1, 2003
62
0
0
Originally posted by: BFG10K
As long as the answers (images) come out to be the same, what makes it cheating?
Because it cannot exist without prior knowledge and because it isn't running the code that the game asks it to run.

Also what happens if a later game patch changes the way the shader code runs? Whoops, the drivers will still be running the old code and you won't see the new results until a newer version of the drivers arrives. Or worse yet, the game will totally break.

Actually, shader optimizations and whatnot are not really an issue...

it is expected that automated shaders will improve performance... this is not a big deal... the problem is the claim that the IQ stays the same...

compared to a baseline that has itself shown IQ degradation over previous Det versions, that conclusion is pointless...

shader optimizations are likely where performance improvements will come from, but from a consumer standpoint there has to be accountability... if there is IQ degradation, for crying out loud CALL it degradation and leave it at that... it is pointless claiming there is no IQ difference when it is blatantly clear there are differences...

the mipmap boundaries are going to be clearly visible when playing a game and are ridiculously annoying IMO...
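For readers wondering why the boundaries show up at all: with per-level (bilinear-style) filtering, sharpness jumps at the line where the mip level changes, while full trilinear blends the two nearest levels and hides the seam; the reduced modes under dispute only blend in a narrow band around the boundary. A sketch of the difference, illustrative only, with sampleMip as a stub standing in for the hardware's bilinear tap:

```cpp
#include <cmath>
#include <cstdio>

struct Color { float r, g, b; };

// Stub standing in for a hardware bilinear fetch from one mip level.
Color sampleMip(int level, float /*u*/, float /*v*/) {
    float g = 1.0f / (1 + level);  // pretend each level is a flatter gray
    return { g, g, g };
}

// Snap to the nearest level: sharpness jumps where `lod` crosses a
// half-integer; that line is the visible mip boundary.
Color bilinearOnly(float lod, float u, float v) {
    return sampleMip(static_cast<int>(std::lround(lod)), u, v);
}

// Blend the two nearest levels: the transition is continuous, no seam.
Color trilinear(float lod, float u, float v) {
    int lo = static_cast<int>(std::floor(lod));
    float t = lod - static_cast<float>(lo);
    Color a = sampleMip(lo, u, v), b = sampleMip(lo + 1, u, v);
    return { a.r + t * (b.r - a.r),
             a.g + t * (b.g - a.g),
             a.b + t * (b.b - a.b) };
}

int main() {
    std::printf("bilinear : %.3f -> %.3f (jump at the boundary)\n",
                bilinearOnly(1.49f, 0, 0).r, bilinearOnly(1.51f, 0, 0).r);
    std::printf("trilinear: %.3f -> %.3f (smooth)\n",
                trilinear(1.49f, 0, 0).r, trilinear(1.51f, 0, 0).r);
}
```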
 

Sazar

Member
Oct 1, 2003
62
0
0
Originally posted by: THUGSROOK
Please stop blaming AT for a "bad review".
They did the best they could to provide as much info as possible, and they did an excellent job.

AT is not the center of this controversy, NV is.
Please keep that in mind.


Evan promised much for this review...

AT did a lot of work... and kudos to them for this...

but the adage QUALITY over QUANTITY comes to mind...
 

First

Lifer
Jun 3, 2002
10,518
271
136
Originally posted by: Sazar
Originally posted by: THUGSROOK
Please stop blaming AT for a "bad review".
They did the best they could to provide as much info as possible, and they did an excellent job.

AT is not the center of this controversy, NV is.
Please keep that in mind.


Evan promised much for this review...

AT did a lot of work... and kudos to them for this...

but the adage QUALITY over QUANTITY comes to mind...

That's just your opinion, which IMHO is jaded, especially considering how you badmouth AT on every other forum.
 

Sazar

Member
Oct 1, 2003
62
0
0
Originally posted by: Evan Lieb
Originally posted by: Sazar
Originally posted by: THUGSROOK
Please stop blaming AT for a "bad review".
They did the best they could to provide as much info as possible, and they did an excellent job.

AT is not the center of this controversy, NV is.
Please keep that in mind.


Evan promised much for this review...

AT did a lot of work... and kudos to them for this...

but the adage QUALITY over QUANTITY comes to mind...

That's just your opinion, which IMHO is jaded, especially considering how you badmouth AT on every other forum.

FYI, I don't badmouth AT on every forum... the only place you will have seen me take issue is nvnews... and you yourself came and said, if you want additional information or want to provide feedback, do so here... so here I am... on my forums I have posted many links to previous reviews done by this website...

There is little point in pointing things out if the person in question who is liaising happens to have a pre-determined view on things or a lack of understanding of other things...

Tell me how to improve my reviews.

Did you or did you not state many things in the nvnews thread concerning part 1 of the review, Evan? Pointing out many things that would be changed, and telling us to look for certain parts of part 2 that would answer our questions...

As many others, including myself, have already stated, we appreciate that many of the issues raised in part 1 were addressed... but at the same time there are continued issues from part 1, as well as the issue with the images posted (or not posted) and the arbitrary usage of AA and AF throughout the suite of tests...

Seeing as you yourself have stated that you are not going to lead people who don't understand the data from the suite of tests by the finger... I would assume that you would cater to the appetite of enthusiasts who actually have an inkling of the issues and technology... especially the undergrad/grad and postgrad segments...
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Nvidia had the NV30 core ready in the summer of 2002, about the same time as ATI's R300. The problem is Nvidia didn't want to release it on the 0.15µ process, as a 5800U clocked at the same speed as a 9700 Pro would have been beaten to a pulp in the benchmarks.

The NV30 was always designed to be a .13u part; it is far from simple to move over to a coarser build process. nVidia was always shooting for .13u for the NV30.

The (ATI presentation) states the 5800 has only 2 full-precision shader units whereas the 9700 Pro has 8.

And the GF2 doesn't support any FP standards, if we want to discuss out-of-production parts. I'll quote myself:

Looking at the core functions and how many cycles it takes to complete certain operations, the NV35 should be faster than the R3X0.

Also, ATi is playing a bit of a spin game; the shader units for nV are not the same as those on ATi's boards. The nV units are considerably faster clock for clock; they are deeper and more complex than ATi's. ATi has more individual units, but they are less powerful. Also notice that what I said about relative performance was focused at the chip level; in terms of individual shader units it would be a much larger edge in the cases where nV comes out ahead.

IQ is still being hacked to increase benchmark scores. Just open up these Aquamark images and click back and forth between them. Check the grass below the explosion and the top of the vehicle: much blurrier on the FX card.

And of course compressing the nV shot more had nothing to do with that, right? Hell, I'll post shots of an R300 core board compressed to the maximum level that GIF will allow and show you some raw bitmaps of a Voodoo1 if you don't think it matters. IQ is being hacked because one image was compressed more; ATi is clearly destroying IQ and cheating like hell in their drivers, and I've got the GIFs to prove it...

BFG-

Because it cannot exist without prior knowledge and because it isn't running the code that the game asks it to run.

Well, what issues are showing up with the latest Cats? I know ATi is going to extreme measures doing this, but outside of exposing the 'cheats' in TRAoD (the flickering walls), what other issues are cropping up?

Also what happens if a later game patch changes the way the shader code runs? Whoops, the drivers will still be running the old code and you won't see the new results until a newer version of drivers arrive. Or worse yet, the game will totally break.

More than likely the Cats will balloon up over 30MB to account for this. If you are on dial-up this is certainly a downside, but perhaps they will start utilizing better optimizations with more compact code and get a handle on their exploding driver size. nVidia has obviously done this; maybe ATi will too.
 

spam

Member
Jul 3, 2003
141
0
0
I really liked this review; it was thorough and very persuasive in its depth of coverage. I am curious about the means of assessing image quality. It is a subjective test, to say the least. When I looked at the image quality comparison shots I thought the Cat 3.7s seemed sharper and crisper than the Det 52s. Again, that is my subjective opinion.

-How did the reviewers quantify their IQ tests?
-Could they set it up so that people would have to pick first, second and third choice IQ without knowing which image was from which card and driver? (A sketch of one way to run this follows.)
-If this is practical, wouldn't that help to give more substance to IQ evaluations?
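A minimal sketch of the kind of blind ranking harness being proposed; the file names are placeholders, and the real point is only that the viewer never learns which driver produced which shot until after ranking.

```cpp
#include <algorithm>
#include <iostream>
#include <random>
#include <string>
#include <vector>

int main() {
    // Placeholder names; in practice, the full-size shots from each driver.
    std::vector<std::string> shots = { "cat37.png", "det4523.png", "det5214.png" };

    // Shuffle so neither tester nor viewer knows the presentation order.
    std::mt19937 rng{std::random_device{}()};
    std::shuffle(shots.begin(), shots.end(), rng);

    // A real harness would copy shots[i] to A.png, B.png, C.png here so the
    // viewer sees only neutral labels.
    for (std::size_t i = 0; i < shots.size(); ++i)
        std::cout << "View image " << char('A' + i) << "\n";

    std::cout << "Rank best to worst (e.g. B A C): ";
    std::string r1, r2, r3;
    std::cin >> r1 >> r2 >> r3;

    // Unblind only after the ranking has been recorded.
    auto file = [&](const std::string& label) { return shots[label[0] - 'A']; };
    std::cout << "1st: " << file(r1) << "\n2nd: " << file(r2)
              << "\n3rd: " << file(r3) << "\n";
}
```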
 

spam

Member
Jul 3, 2003
141
0
0
Hi Evan,

This is a little off topic, but related. If Nvidia reads these reviews, does that bring any pressure upon them to use standard APIs? This review and others have pointed out the difficulty of application-specific hand-coding. It seems to me that it would be in their own interest to move towards standard APIs, but I do not know why Nv went this route (do you?) and I don't know what will change their plans for the future. The development cost of these drivers must have been much greater than if they had followed industry standards. Hopefully the bottom line on costs will be persuasive.

Can you give us any insight as to Nvidia's future driver plans? Maybe a rumour or two? We won't tell anyone, we promise!
 

Sazar

Member
Oct 1, 2003
62
0
0
Originally posted by: spam
Hi Evan,

This is a little off topic, but related. If Nvidia reads these reviews, does that bring any pressure upon them to use standard APIs? This review and others have pointed out the difficulty of application-specific hand-coding. It seems to me that it would be in their own interest to move towards standard APIs, but I do not know why Nv went this route (do you?) and I don't know what will change their plans for the future. The development cost of these drivers must have been much greater than if they had followed industry standards. Hopefully the bottom line on costs will be persuasive.

Can you give us any insight as to Nvidia's future driver plans? Maybe a rumour or two? We won't tell anyone, we promise!

The move currently is toward automatic shader detection and replacement...

it is in the interests of consumers that this happens... naturally... to be able to get good playable frame rates... the issue is with the IQ, and the glossing over of the fact that the IQ is currently suffering...

as the shader algorithms mature I am sure there will be a better ability to replace shaders more efficiently with little to no IQ loss...

I just wish, as I had stated in the part 1 thread, that a baseline of Det 44.03s had been selected, since it has been consistently shown that the 45.23 drivers had worse IQ than the 44.03s... I don't understand the logic of not using them as a baseline...

And Ben:

And of course compressing the nV shot more had nothing to do with that, right? Hell, I'll post shots of an R300 core board compressed to the maximum level that GIF will allow and show you some raw bitmaps of a Voodoo1 if you don't think it matters. IQ is being hacked because one image was compressed more; ATi is clearly destroying IQ and cheating like hell in their drivers, and I've got the GIFs to prove it...

The IQ on the FX cards IS worse... especially if you are using 45.23 as the baseline... this has been shown in other tests all around the internet...

admittedly AT's images are a little shyte in terms of size... but if they were proper full-screen ones we could all see the differences quite clearly...
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126
According to whom? And where can I get a raw image of what the picture should really look like? So do they have a software renderer option in these games?

So basically it looks like we are basing IQ off what we perceive it should look like. There is no telling which driver is really doing it correctly. But the assumption is the ATI driver is indeed doing it correctly.

Again, a lot of this comes down to such little things; why bother arguing over them?
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Just gonna copy what I posted in the comments section for now:

The article is extremely comprehensive, as one would expect from Anandtech. Some issues of note:

1. It was pointed out that the 5900 and the 5950, in many areas, performed almost identically. This doesn't bode well for nVidia.
2. I'm bothered by the tremendous frame rate difference between ATi and nVidia in some of the titles. It leads me to believe there's something more fundamental going on, and it's not just a simple card/driver issue.
3. It's nice to see the IQ back to where it should be, as visual quality should never be compromised for performance, unless the user makes the adjustments to do so.
4. I will admit it sort of seems that there is some bias towards ATi, but it's not flamingly apparent. Again, it is just my perception, and doesn't necessarily mean that there is.
5. The most accurate remark made in this review is simply that we are not in the world of DX9 games...yet. To that end, DX9 performance is not nearly as important as it will be. When it is, I think things will step up a few notches.



 

reever

Senior member
Oct 4, 2003
451
0
0
According to whom? And where can I get a raw image of what the picture should really look like? So do they have a software renderer option in these games?

Microsoft provides a software renderer for ANY game in one of the DX9 dev tool packages or the SDK that will give you the full, correct image.
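Specifically, that is the DirectX reference rasterizer: create the device as D3DDEVTYPE_REF instead of D3DDEVTYPE_HAL and everything renders through Microsoft's software path, bypassing the vendor's driver entirely (at a few frames per minute). A minimal sketch, assuming a Windows box with the DX9 SDK installed and an existing window handle:

```cpp
#include <d3d9.h>  // DirectX 9 SDK, Windows only

// Returns a device that uses the reference rasterizer, or nullptr on failure.
// hWnd is assumed to be a valid window created elsewhere.
IDirect3DDevice9* createRefDevice(HWND hWnd) {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return nullptr;

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed   = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;

    IDirect3DDevice9* device = nullptr;
    // D3DDEVTYPE_REF instead of D3DDEVTYPE_HAL: rendered in software,
    // producing the spec-correct image the hardware paths are judged against.
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_REF, hWnd,
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING, &pp, &device);
    return device;
}
```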
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Ronin, where do you see a bias toward ATi in Derek's reviews?(!)
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Simple enough. When nVidia did better (when the differences weren't so marginal), you saw ho-hum comments about it. But man, ATi gets 2 FPS better than nVidia and BAM! Glowing comments. Go through each of the tests and see what was said for each comparison. Are there times where he says great things about nVidia? Yes, but not nearly as smashingly as about ATi.

Again, this is MY perception of what I saw. I don't truly favor either card (and I happen to have engineering samples of both).
 