Anand's 9800XT and FX5950 review, part 2


ginfest

Golden Member
Feb 22, 2000
1,927
3
81
Originally posted by: Rogodin2
I'd be careful flinging the "troll" word about, ginfest - your bias shows up very conspicuously in PMs and your system specs.

rogo
Let's see: 9800P in my main box, TI-4400 in the 2nd box, and the 5900 in its box. Now what was my bias again?


Oh, you mean my bias against people who sign up at diff forums just to stir the pot?
Or people who make outrageous OC claims?
Or those who try and push their preferences on others?

Sorry, can't waste any more time with your nonsense tonight.

Mike G


Oh, and you've seen Muzz's "work" elsewhere. I'm surprised someone as "educated" as you would champion his cause.

"birds of a feather" perhaps?
 

muzz

Member
May 17, 2003
27
0
0
Originally posted by: Rollo
Originally posted by: muzz
NV blows, they will fk you fans right in the ass, so I hope ya have a condom and some KY cuz the big rammin' is coming.

They have porked ya for so damn long, that praying for a bs miracle is beyond hope.....

They don't care about you, they never have and they never will.......

So get over it ya fkn clowns.



The "big ramming"?! Spoken like a man who's yet a virgin....

That's right, the big rammin'.... which is EXACTLY what NV fans have been getting since B4 the inception of the NV30.....
You don't agree? That's fine.... but the #'s speak for themselves.... especially if ya use a set of drivers that DON'T cheat.....
Which in and of itself is rather difficult.

 

muzz

Member
May 17, 2003
27
0
0
Gin IDGAS what you think, as OBVIOUSLY you are part of the sheep herd that NV has harvested over the yrs...... I bet that 9800 was given to ya just so you could see what a POS the NV offerings were, and you just can't give it back because it WIPES THE FLOOR with anything that NV has....
Massively cheating drivers aside......., and even then the ATI offering holds its ground pretty well.

I bet you didn't even say a damn word when the AT reviews came out with BOGUS results in favor of the POS NV offerings? 60 pages that are NOT worth the webspace they use... except as a tool to hurt ATi.

Not to mention the other review they did B4 with more BOGUS results because of the filtering (or should I say LACK of filtering) skewed benchmarks, ya know the one where EVERYONE that KNEW better TOLD them MANY times that the results were BOGUS, yet they said they would "look into it" and never said another damn word and deleted the thread.....
Pretty damn pathetic.

Get over it ya fan, your team stinks, cheats, and is a pathetic tool of marketing.
 

Rogodin2

Banned
Jul 2, 2003
3,219
0
0
"Oh, you mean my bias against people who sign up at diff forums just to stir the pot?
Or people who make outrageous OC claims?
Or those who try and push thier preferences on others?"


My motive for registering was valid (and you have no right to question it or imply that it was spurious; that's just immature and a lack of common social grace). The fact that I recommended ATI cards over nvidia during the 3dmark fiasco doesn't invalidate my proposition - you sure never did - and the fact that you are running an ATI 9800 in your main box proves it.

My OC claims are also valid and if you want I can post a pic of the receipt and a link to my 3dmark.

I don't push preferences. If I'm asked I will give mine, because I've run through enough cards to know when an ATI card is suitable for a person - and that's at least 85% of the time.

In retrospect it seems like you blame me for "having to purchase an ATI product" but I don't hold it against you big fella.


http://service.futuremark.com/compare?2k3=1437851

Item Description Quantity Unit Price Extended Price
14-102-292 VGA ATIOEM|RADEON 9800 128M 8X AGP% 1 $249.00 $249.00

Subtotal $249.00
Tax $0.00
Shipping and Handling charge $0.00
Amount Paid $249.00



Tracking your Order with FedEx: 630377792958

Special ordered items are not refundable. All items come with manufacturers warranty only. Refurbished items come with 15-day warranty only. All refunds require 15% restocking fee. Do not remove any labels from any parts or it will result in your warranty being void. Any wrong or damaged item must be reported within 7 days. No refund after 30 days, NO CPU REFUND AFTER 7 DAYS, no exceptions.

If I wasn't a gentleman I'd have to conclude that you're a bit off.

rogo

 

Rogodin2

Banned
Jul 2, 2003
3,219
0
0
Gin

In reply to your "stirring the pot" remark, I'd have to call you on that with your new account over at nvnews.net, registering just to plop some feces on the proverbial "public table".

If you want to talk to me you know where to go.

I'm Rogozhin at nvnews.net
rogodin at rage3d.com
rogodin at il2sturmovik.com

I have no tolerance for people that call me out and propound invalid claims regarding my posting character.


rogo
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: muzz
Originally posted by: Pete
Originally posted by: muzz
[tripe] Tripe my butt......... Talk about magical Det's... gimme a break with that crap.
This is totally unnecessary. Please keep this kind of unconstructive garbage to yourself.
Yes sir!! Is there anything else I can do for you?

I guess telling the truth around here gets the gods a lil pizzed huh.........

I was (obviously, I thought) talking about the tasteless wording of your message, though I don't entirely agree with the content, either.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Genx87-

While the NV30 had obvious issues with balanced and aggressive, the application mode is much better than the highest quality on the 9800 Pro. Same can be said of the 5900 Ultra.

From what I've seen of the newest drivers I would certainly say this is in question as of now. nVidia's AF on the NV2x was a staggering amount better than that on the R2x0 boards. For nV, the problem is that they declined significantly while ATi improved significantly. The R3x0 still can't touch the AF quality that the NV2X boards have, but neither can the FX. None of the latest gen parts have what I would call very good AF; nVidia's was at least better than ATi's, but the latest drivers are showing that that may no longer be the case. This is one of the reasons I'm hoping that Volari will turn out to be better on the driver and filtering front than either of the current boards; they simply don't cut it.

BFG

Then why hide something that they're not doing?

I've already explained this. nVidia has a history of hiding what their drivers are doing since the inception of the company. That is why there has never been a fully functioning open source nV driver. They like to keep what they are doing hidden; look at the number of patents that they have.

think they're doing it to inflate performance which is why nVidia chose to follow their example. ATi and 3dfx obviously quite rightly felt image quality is more important than providing an unusable feature on their cards.

On the high end you would see a ~10% performance difference using S3TC1 versus S3TC3. That is with a lower level of compression. Why do you think they would do something that may increase the performance by maybe 5% in a bench they already dominated?
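For concreteness, the size difference behind that bandwidth argument: DXT1 (S3TC1) stores each 4x4 texel block in 64 bits, while DXT3 (S3TC3) uses 128 bits, so DXT1 textures take half the memory and bandwidth. A minimal sketch of the footprints; the 1024x1024 texture size is made up purely for illustration:

```python
# Rough texture-memory footprint for the S3TC/DXT formats discussed above.
# Block sizes are per the DXT formats: DXT1 = 64 bits, DXT3 = 128 bits per 4x4 block.

def texture_bytes(width: int, height: int, bits_per_block: int) -> int:
    """Footprint of a block-compressed texture, ignoring mipmaps."""
    blocks = (width // 4) * (height // 4)
    return blocks * bits_per_block // 8

w, h = 1024, 1024                  # illustrative texture size
print(texture_bytes(w, h, 64))     # DXT1:  524288 bytes (4 bpp, 8:1 vs 32-bit)
print(texture_bytes(w, h, 128))    # DXT3: 1048576 bytes (8 bpp, 4:1 vs 32-bit)
print(w * h * 4)                   # uncompressed 32-bit: 4194304 bytes
```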

Because it's turned off by default.

Not in JKII and not in JKIII; I have to double check on RtCW and Alice to be sure on those. In the Jedi Knight games it is certainly enabled by default: certain levels' textures are too much for 128MB boards without TC on (the one you have to sneak around in on JKII, as an example).

Also do you have Medal of Honour?

Was going to buy it; played it for about forty-five minutes and had to wash my mouth out afterwards, couldn't believe how insanely linear it was (no, never purchased it, in case you didn't figure).

That's still cheating. If the user requests trilinear AF in the drivers then the drivers have no right to override those settings.

Then ATi is cheating and has been for years. They do not do proper filtering implementations either (AF), and they haven't since the R200 debuted. Their AF has always been adaptive, and with the R200 it also forced bilinear no matter what. If you say nV is cheating on this one, then you have to say ATi is cheating and has been for years. I haven't said ATi was cheating on their filtering, just that it sucked. From what I've seen of the FX with the latest drivers, I'd have to say it looks like it is sucking pretty bad too (although it still doesn't look nearly as bad as the R2X0, it would be hard to be worse than that - and I still didn't call that cheating).

Application detection is bad because it's a fragile optimisation that can break with later versions of the program or by simply renaming the executable for whatever reason. It also removes all control from the user.

App detection nearly never breaks the game with later revisions; if it did, you would have had Kyro owners frothing mad at the number of games that wouldn't work. As far as taking control away from the user, you have never really had full control over your video card. I'm with you that I want as much control as possible, but you aren't getting it with your 9700Pro, you didn't get it with your GF4 prior to that, or any of the boards you have owned since you have been here.
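For readers following the mechanics, executable-name app detection - the technique being argued over here - can be sketched in a few lines. This is purely illustrative: the profile table and tweak names are hypothetical, not any vendor's actual driver code.

```python
# Illustrative sketch of executable-name app detection as debated above.
# The profile table and tweak names are invented, not real driver code.
import os
import sys

APP_PROFILES = {
    "ut2003.exe": {"force_stage_filter": "bilinear"},   # hypothetical tweak
    "quake3.exe": {"texture_compression": "DXT3"},      # hypothetical tweak
}

def profile_for_running_app() -> dict:
    """Select driver tweaks by matching the executable's file name."""
    exe = os.path.basename(sys.argv[0]).lower()
    # Renaming the executable defeats the lookup entirely - which is both
    # why reviewers rename binaries to expose it and why it is fragile.
    return APP_PROFILES.get(exe, {})

if __name__ == "__main__":
    print(profile_for_running_app())
```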

That gives control back to the user instead of just having the drivers to run off and do whatever they like without telling anyone. It also takes away the blame of cheating from the manufacturer since the reviewer can untick it if he/she doesn't like what the option does.

It is nothing new, and overwhelmingly you don't want users to have that kind of access to control. You or I or the overwhelming majority of AT users would be perfectly fine with that level of control; most people wouldn't. Can you imagine the support nightmare it would create having 'I broke my cupholder' users being allowed to assign the Zbuffering properties or caching limitations on their video cards? Of course applying simple filtering implementations is a basic request, but it's not like you are getting pure filtering from ATi now either. For that matter, how does your R9700Pro overclock using the default BIOS and drivers? All of the vendors have certain things that they should implement IMO but don't. I am fully with you on the FX going too far right now; that's why I won't buy one.

And they did, with a mixed mode, hand-optimised, Microsoft compiled rendering path. If that path can't beat ATi's then there's no way the full precision path is going to do so, with or without Microsoft's compiler.

Actually, with the mixed mode and the latest drivers it appears that nV does edge out ATi in some of the benches. But that is neither here nor there. The real reason they did it is for people like you. You now are under the impression that ATi's boards absolutely destroy the FX boards with PS 2.0 performance. That is not the case. ATi has an edge, but if the actual ~25%-35% was shown instead of the 70% they were able to rig up, where would all the ATi hype be?

Why waste time and resources on something as useless as that?

They compiled it for ATi with the compiler that best matched their hardware, but they failed to do it for nVidia. They wanted to show a larger performance difference than there actually is (as Dave mentioned, it was to show the shader performance difference), and they have some people buying it. Look at Halo performance with PS 2.0 - real staggering difference there, isn't it? We have one top notch title shipping with PS 2.0 support and it doesn't show anything remotely like what we saw with HL2; don't you find that the least bit odd?

At some point you just have to accept bad hardware when you see it and move on.

Gearbox, who wasn't paid millions in promotional money from ATi, managed to get decent PS 2.0 performance out of this 'bad hardware'. Doesn't that ring any bells for you? If one company that was paid millions by company X comes out and shows how fast company X is, while another company that wasn't paid millions of dollars ships a finished product that isn't remotely close to showing the same thing, doesn't that imply to you that perhaps you shouldn't jump to conclusions quite yet?
 

DaveBaumann

Member
Mar 24, 2000
164
0
0
They compiled it for ATi with the compiler that best matched their hardware, but they failed to do it for nVidia.

Gawd, you select things as you see fit.

The update for the compiler was not ready at that point in time.

As for your comments about ATI cheating because of AF filtering: again, there is no specification for anisotropic filtering. How many times does that same point need to be said before you will take it in?

And, again, the term "adaptive" filtering does not relate to the Z angle rotation issue.

Gearbox, who wasn't paid millions in promotional money from ATi, managed to get decent PS 2.0 performance out of this 'bad hardware'. Doesn't that ring any bells for you? If one company that was paid millions by company X comes out and shows how fast company X is, while another company that wasn't paid millions of dollars ships a finished product that isn't remotely close to showing the same thing, doesn't that imply to you that perhaps you shouldn't jump to conclusions quite yet?

Yes, Ben, this is all a money-driven conspiracy. If you can't see that there are issues with the interaction between DX9 and their shader model then you are being fairly delusional. Yes, you can improve performance if you work at it, but a lot of that involves doing things that you'd expect to do for DX8 shaders.
 

Originally posted by: muzz
Gin IDGAS what you think, as OBVIOUSLY you are part of the sheep herd that NV has harvested over the yrs...... I bet that 9800 was given to ya just so you could see what a POS the NV offerings were, and you just can't give it back because it WIPES THE FLOOR with anything that NV has....
Massively cheating drivers aside......., and even then the ATI offering holds its ground pretty well.

I bet you didn't even say a damn word when the AT reviews came out with BOGUS results in favor of the POS NV offerings? 60 pages that are NOT worth the webspace they use... except as a tool to hurt ATi.

Not to mention the other review they did B4 with more BOGUS results because of the filtering (or should I say LACK of filtering) skewed benchmarks, ya know the one where EVERYONE that KNEW better TOLD them MANY times that the results were BOGUS, yet they said they would "look into it" and never said another damn word and deleted the thread.....
Pretty damn pathetic.

Get over it ya fan, your team stinks, cheats, and is a pathetic tool of marketing.

This is hysterical.... I can almost hear him saying, "nyah, nyah, nyah nyah, nyah". How childlike and cute. Simple minds, simple pleasures...

GM

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
This is hysterical.... I can almost hear him saying, "nyah, nyah, nyah nyah, nyah". How childlike and cute. Simple minds, simple pleasures...

Agreed. It's pretty pathetic.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Gawd, you select things as you see fit.

The update for the compiler was not ready at that point in time.

But they still managed to get it ready for the MM.

As for your comments about ATI cheating because of AF filtering: again, there is no specification for anisotropic filtering. How many times does that same point need to be said before you will take it in?

And nVidia is taking four samples each from two different mip levels with their brilinear....partly. As I've been saying, Dave, if it is a 'cheat' for nV to do what they are doing, then things like PVR's box filter or -

forcing Anisotropic Filtering via the driver control panel doesn't mean that Trilinear Filtering (if selected by the application) will be utilised on all texture layers. Evidently ATI believe that the control panel forced Anisotropic Filtering is for legacy applications that do not have in-game controls for enabling Anisotropic Filtering, hence they have chosen a compromise of quality and performance in that in "Quality" mode only the first texture layer will be Trilinear filtered and all subsequent layers will only be Bilinear filtered.

That would also be cheating. Either each of them are cheating in terms of filtering, or none of them are.

Yes, Ben, this is all a money-driven conspiracy.

I don't think ATi paid Gabe to get him to do that explicitly; I think Gabe went out of his way to do it on his own to prove that their money was well spent. Take all of your comments, Dave, about Valve trying to send a message, and there being no point in optimizing as much as possible for nV's architecture (sticking with MS's tools, mind you - I'm suggesting nothing close to using Cg); your own statements seem to agree with what I've been saying, minus the conclusion.

If you can't see that there are issues with the interaction between DX9 and their shader model then you are being fairly delusional.

Halo shows this quite nicely, doesn't it? I know, one of the games that has managed to hit the bottom 100 PC titles of all time in GR's database is far more important than Halo. A game that was coded so well the developers patched the game not to use certain features by default due to their massive performance hit. I've already stated multiple times in this thread that the R3x0 boards are faster in PS 2.0 performance than the NV3x parts, but is a 70% difference to be expected? No.

Yes, you can improve performance if you work at it, but a lot of that involves doing things that you'd expect to do for DX8 shaders.

And we are still waiting for a game to show up with the big edge in visuals that justifies this huge performance rift. You have been implying for quite some time that these would be commonplace. We have a small group of games out now that use DX9 level pixel shaders; one of them shows a big performance hit, the rest don't. It's been almost a year now, still waiting.
 

DaveBaumann

Member
Mar 24, 2000
164
0
0
But they still managed to get it ready for the MM.

Because that is the mode that is most important for FX.


And nVidia is taking four samples each from two different mip levels with their brilinear....partly. As I've been saying, Dave, if it is a 'cheat' for nV to do what they are doing, then things like PVR's box filter or -

Forget control panel controlled filtering - something that is forced by the control panel is outside of the application and is something the user selects in order to gain some performance at the expense of image quality. I've said before, the "brilinear" modes are an exceptionally good method for increasing performance whilst still minimising the most obvious pitfalls of plain Bilinear; however, this should be at the behest of the user. The "Brilinear" modes are a highly questionable optimisation when they are forced as the default and full trilinear filtering is not available at all in certain applications, or in all D3D titles as appears to be the case with the 52.xx series.
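A minimal sketch of the "brilinear" idea being described: blend between mip levels only inside a narrow band around each mip transition, and sample a single mip level (plain bilinear) everywhere else. The band-width parameter is invented for illustration; real drivers expose no such knob.

```python
# Sketch of "brilinear" filtering as described above: trilinear blending is
# applied only near each mip transition; elsewhere a single mip level is
# sampled (plain bilinear). The band width is an invented parameter.
import math

def mip_blend_weight(lod: float, band: float = 0.25) -> float:
    """Fraction of the next mip level blended in at a given LOD.

    band=0.5 degenerates to full trilinear (always blending);
    band=0.0 degenerates to plain bilinear (never blending).
    """
    frac = lod - math.floor(lod)        # position between two mip levels, 0..1
    lo, hi = 0.5 - band, 0.5 + band     # blend only around the transition
    if frac <= lo:
        return 0.0                      # sample the nearer mip only
    if frac >= hi:
        return 1.0                      # sample the next mip only
    return (frac - lo) / (hi - lo)      # linear ramp inside the band

# Full trilinear would return `frac` itself; the narrow ramp saves texture
# samples on most pixels while hiding the mip boundary that bilinear shows.
```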

I don't think ATi paid Gabe to get him to do that explicitly; I think Gabe went out of his way to do it on his own to prove that their money was well spent. Take all of your comments, Dave, about Valve trying to send a message, and there being no point in optimizing as much as possible for nV's architecture (sticking with MS's tools, mind you - I'm suggesting nothing close to using Cg); your own statements seem to agree with what I've been saying, minus the conclusion.

Could it not also just be that they don't feel they should need to do this level of development for one IHV, and that perhaps they selected ATI because it ran better? This is, after all, what they had said.

I know they have not been happy with the FX series for a long time. As far as they were concerned they were coding to DX9 specifications and the FX series were billed as DX9 - they probably expected it just to run when they ran it on the FX boards, but as Gary told me they tried it just before E3 on the NV35 and it completely failed. That's when their work started.

I'm fairly sure that the entire tone of the presentation changed very late, since NV got wind of ATI's presentation and dropped those drivers on developers and press the day before - Valve were up into the small hours testing them and that's where their list came from.


Halo shows this quite nicely, doesn't it? I know, one of the games that has managed to hit the bottom 100 PC titles of all time in GR's database is far more important than Halo. A game that was coded so well the developers patched the game not to use certain features by default due to their massive performance hit. I've already stated multiple times in this thread that the R3x0 boards are faster in PS 2.0 performance than the NV3x parts, but is a 70% difference to be expected? No.

And we are still waiting for a game to show up with the big edge in visuals that justifies this huge performance rift. You have been implying for quite some time that these would be commonplace. We have a small group of games out now that use DX9 level pixel shaders; one of them shows a big performance hit, the rest don't. It's been almost a year now, still waiting.

I would contend that Halo still highlights similar issues when PS2.0 is enabled.

http://www.extremetech.com/article2/0,3973,1354067,00.asp

However, it's not just a pure case of "PS2.0 being used" but a case of how much it is used. Take a look at the nature test in 3DMark03 - every pixel is touched by a PS2.0 shader at some point or another, and that initially showed the type of performance differences we've seen; likewise with TR:AoD the PS2.0 DoF effect is full screen, and again we get similar issues. In Aquamark PS2.0 is used for much fewer pixels and the performance difference is minimised. I don't know the composition of the shaders in Halo, but given it's an XBox port I'd wager that the majority of them are DX8 shaders. The performance difference will be very much down to the composition of the shaders and how much of the screen they are used on - so depending on those variables the performance difference can be very large or very small.
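That argument lends itself to a back-of-the-envelope model: the visible gap between two cards scales with the fraction of the frame's pixels that run the slow shaders. A minimal sketch - every cost and coverage figure below is invented purely for illustration:

```python
# Toy model of the argument above: the observed gap between two cards depends
# on what fraction of a frame's pixels run the slow (PS2.0) shaders.
# Every number here is invented for illustration only.

def frame_cost(coverage: float, ps2_cost: float, base_cost: float = 1.0) -> float:
    """Relative per-frame cost when `coverage` of the pixels use PS2.0."""
    return coverage * ps2_cost + (1.0 - coverage) * base_cost

# Suppose card B pays 2x on PS2.0 pixels but matches card A everywhere else:
for coverage in (0.05, 0.5, 1.0):   # Aquamark-like, mixed, 3DMark03-nature-like
    a = frame_cost(coverage, ps2_cost=1.0)
    b = frame_cost(coverage, ps2_cost=2.0)
    print(f"{coverage:.0%} PS2.0 coverage -> card B is {b / a:.2f}x slower")
# 5% coverage -> 1.05x slower; 100% coverage -> 2.00x slower: small or large,
# exactly the variability described above.
```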
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
they probably expected it just to run when they ran it on the FX boards, but as Gary told me they tried it just before E3 on the NV35 and it completely failed. That's when their work started.


Wasn't E3 in May?!?!
They started working on an NV path 4-5 months ago? Or about 3 months before Shader Day? How long has the game been in dev?

Interesting................. I think it kind of shows the level of organization within Valve. Carmack said well before E3 that he was working on a separate path for the FX cards - wasn't it as early as last Nov? Why wouldn't Valve have known about this? I guess at this point, if Valve only put 3 months of work into optimizations for the FX cards, I wouldn't expect them to be anything great.

I guess we will wait and see. Maybe by next E3 we will see how it pans out

Edit: On a side note, how much can we believe Gabe when he said they spent 5 times as much time optimizing for the FX cards compared to the 9800 Pro cards? Unless he is telling us they wrote the shader programs on the ATi card in roughly 2 weeks.
 

DaveBaumann

Member
Mar 24, 2000
164
0
0
Carmack said well before E3 that he was working on a separate path for the FX cards - wasn't it as early as last Nov? Why wouldn't Valve have known about this? I guess at this point, if Valve only put 3 months of work into optimizations for the FX cards, I wouldn't expect them to be anything great.

Why would they have thought they would need to if they were just writing to DX9? They were using MS's API that NVIDIA were telling you, me and the developers they were compliant with.

DX is fundamentally different from OpenGL in that DX9 only had (at that point) a single path. OpenGL didn't have a single fragment (pixel) shader path until more recently, and JC had already started supporting multiple vendor paths in order to get the shader model to work.

Edit: On a side note, how much can we believe Gabe when he said they spent 5 times as much time optimizing for the FX cards compared to the 9800 Pro cards? Unless he is telling us they wrote the shader programs on the ATi card in roughly 2 weeks.

There is a difference between "optimising" and "writing". Optimising implies that they are already written and they are being tuned to run faster.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Why would they have thought they would need to if they were just writing to DX9? They were using MS's API that NVIDIA were telling you, me and the developers they were compliant with.

DX is fundamentally different from OpenGL in that DX9 only had (at that point) a single path. OpenGL didn't have a single fragment (pixel) shader path until more recently, and JC had already started supporting multiple vendor paths in order to get the shader model to work.


Because even Carmack said that in OpenGL the Nvidia cards were requiring their own path to be faster than ATI cards. If Nvidia was going to be slower using a default path in OpenGL (their strong suit), this should have put up all sorts of red flags over at Valve. I am sure Valve had to have some idea about how the Nvidia card was going to perform before E3. But if they really just wrote it and found out at E3, that tells me they were way behind the curve. Didn't they get FX cards before May?

Being compliant and being fast are two different things. Nvidia cards are compliant, obviously, but apparently they aren't fast.

There is a difference between "optimising" and "writing". Optimising implies that they are already written and they are being tuned to run faster.

Ok, let me rephrase that. Valve only spent ~2 weeks optimizing for ATI? That seems like an awfully short time to me. I mean, I understand the ATI arch is supposedly very well fit to DX9, but I have done some coding before and 2 weeks is a short time for just about anything to be optimized, tested, and stamped with a seal of approval.

 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Um, nV only released the 5800 in Feb/Mar, and they only released DX9 drivers a few months after that. So it's not that hard to believe that Valve tried out their DX9 path on FX cards around E3. Valve had 9700P's available for six months by E3, and I believe ATi had released DX9 drivers a few months before E3, too.

But hey, maybe Valve has horrible time management. Who knows? Everything is gossip and innuendo until I'm able to test HL2 on my own PC. I still believe Valve was being honest about the state of their game and nV's drivers when they made their Shader Day presentation, but maybe MS and nV have gotten things in gear since then. The point is that nV hasn't made things as easy as ATi for devs coding to generic DX9, as 3DM03, TR:AoD, and Halo benches have shown. I still don't trust nV's current 3DM03 numbers (I'm awaiting FM's approved driver list by Oct. 31st), but their improvements in Halo and purported improvements in HL2 are promising.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Ben:

I've already explained this. nVidia has a history of hiding what their drivers are doing since the inception of the company.
You didn't explain anything, you pulled a theory out of thin air. "nVidia's history" does not account for the fact that the driver compression just happened to arrive straight after both FutureMark and Unwinder's findings were published. The very next drivers implemented measures to stop them and you're telling me this has been nVidia's goal all along and that the events happening around them had no effect?

I'm sorry but your theory is basically unbelievable.

Why do you think they would do something that may increase the performance by maybe 5% in a bench they already dominated?
But initially the Radeon dominated it and that's even with nVidia's inferior compression! Put it into that perspective and you have something more to think about.

Not in JKII and not in JKIII; I have to double check on RtCW and Alice to be sure on those.
Wonderful, so you're further illustrating my point that Quake III isn't the only title to have inflated results on nVidia cards since other Quake III based games are also using texture compression. How many reviewers are applying the DXT3 fix on pre-NV25 cards when they benchmark the likes of JK2 and RTCW?

Even today we're still seeing inflated results on nVidia cards.

Then ATi is cheating and has been for years.
No, they haven't.

They do not do proper filtering implementations either (AF), and they haven't since the R200 debuted.
Show me the definition of proper AF. Now show me that the NV1x/NV2x is defined as the only card to achieve this proper implementation.

Also show me how ATi's AF has ever been unusably bad to the point of users disabling it completely and/or working around it like nVidia's DXT1 method was. In fact nVidia is more likely to have unusable AF, since it's so damned slow that enabling it eats your performance like Homer Simpson at an all-you-can-eat restaurant.

Their AF has always been adaptive and with the R200 it also forced bilinear no matter what.
That's a hardware design decision that allowed virtually free AF at a time when memory bandwidth was in really short supply. How does nVidia's method of changing from trilinear AF to bilinear AF based on whether the application name is "UT2003.exe" even remotely resemble ATi's situation?

App detection nearly never breaks the game with later revisions,
If the game changes enough then yes, it does. That is the nature of hard coding drivers around a given game based on the state it was when the drivers were made.

Can you imagine the support nightmare it would create having 'I broke my cupholder' users being allowed to assign the Zbuffering properties or caching limitations on their video cards?
That's just an excuse. How is that any different to users cranking up AA and then getting unusably slow performance? Should we now remove all methods of control from the drivers based on stupid users? Hell, let's take away all administrator functions, control panel options and command line options from Windows because one stupid user might screw up their system.

Of course applying simple filtering implementations is a basic request, but it's not like you are getting pure filtering from ATi now either.
If I request trilinear on my 9700 Pro then I get it. I can spend all night trying to change my application to UT2003.exe and it doesn't make a shred of difference.

For that matter, how does your R9700Pro overclock using the default BIOS and drivers?
Irrelevant.

Actually with the mixed mode and the latest drivers it appears that nV does edge out ATi in some of the benches.
Where?

You now are under the impression that ATi's boards absolutely destroy the FX boards with PS 2.0 performance.
And they do. The NV3x has very poor design decisions such as limited temporary registers and a very fiddly rendering engine that relies on things being exactly the way it expects them, otherwise it suffers from abysmal performance. ATi's R3xx beats the NV3x in both brute force and IPC, and it'll quite happily crunch through optimised and unoptimised code like there's no tomorrow.

They compiled it for ATi with the compiler that best matched their hardware, but they failed to do it for nVidia.
Microsoft's compiler was used for mixed mode and it was still unable to beat ATi's full precision path. Again I'll ask: why bother wasting time on an even slower path for even smaller returns?

Look at Halo performance with PS 2.0 - real staggering difference there, isn't it?
Halo? Given it runs on the X-Box's DirectX 8.x hardware, I can't imagine there'd be a staggering amount of PS 2.0 code in there, if any at all.

We have one top notch title shipping with PS 2.0 support and it doesn't show anything remotely like what we saw with HL2, don't you find that the least bit odd?
No, not when other genuine DirectX 9 titles such as 3DMark03 and Tomb Raider show exactly the same abysmal performance.

Gearbox, who wasn't paid millions in promotional money from ATi, managed to get decent PS 2.0 performance out of this 'bad hardware'.
Again I fail to see how PS 2.0 instructions are running on an X-Box.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Dave-

Because that is the mode that is most important for FX.

But you stated they were trying to make a statement to developers. If that was the case, why would they not be honest and come out and explain that the test wasn't optimized using DX9's own compilers for the FX?

Forget control panel controlled filtering - something that is forced by the control panel is outside of the application and is something the user selects in order to gain some performance at the expense of image quality. I've said before, the "brilinear" modes are an exceptionally good method for increasing performance whilst still minimising the most obvious pitfalls of plain Bilinear; however, this should be at the behest of the user. The "Brilinear" modes are a highly questionable optimisation when they are forced as the default and full trilinear filtering is not available at all in certain applications, or in all D3D titles as appears to be the case with the 52.xx series.

So with nV it isn't OK because they can't do it in D3D apps, but with ATi it is OK because they can do it in an even smaller number of apps? I have maybe a dozen games that offer the option of AF in game. For PVR, if you enable TC then you get the box filter whenever you select trilinear; isn't that the same thing? You don't select the box filter, it is enabled by default when you use TC.

Could not also just be that they don't feel that they should need to do this level of development for one IHV

The same company that went on and on about spending five times as much time working with nV as they did with ATi couldn't compile the code they already had for the FX?

As far as they were concerned they were coding to DX9 specifications and the FX series were billed as DX9 - they probably expected it just to run when they ran it on the FX boards, but as Gary told me they tried it just before E3 on the NV35 and it completely failed. That's when their work started.

Funny, all that talk about spending five times as long getting it to work on an FX, when they used the E3 build to bench the game and first attempted it on the FX right before that - not to mention they had spent over half a year getting the game running on an R300.

I'm fairly sure that the entire tone of the presentation changed very late, since NV got wind of ATI's presentation and dropped those drivers on developers and press the day before - Valve were up into the small hours testing them and that's where their list came from.

A list of bugs. What if Carmack had demanded the latest official drivers be used for the Doom3 benches? Would you have thought it was fair to show the R9800Pro running neck and neck with the 5200?

I would contend that Halo still highlights similar issues when PS2.0 is enabled.

http://www.extremetech.com/article2/0,3973,1354067,00.asp

Halo doesn't show anything remotely like HL2. The performance gap using the old drivers is comparatively mild, not to mention it reverses itself with the latest drivers (which are now available for public scrutiny). Also, the guys over at ET should really do a better job of reporting on the bench and how it compares to gameplay, particularly that the benchmark itself factors the load times between levels into its equation. Gameplay shows considerably higher results. Oh yeah, they also need to fix their PS 1.1 chart; they have 16x12 and 10x7 reversed.

Take a look at the nature test in 3DMark03 - every pixel is touched by a PS2.0 shader at some point or another, and that initially showed the type of performance differences we've seen

I consider 3DM2K3 to be utterly irrelevant for numerous reasons, mainly that it is far too slow on all the DX8 tests versus what you see on screen (particularly GT3, which looks sub-par to D3 or even Amp2 and yet runs significantly slower). Also, their bench wasn't mapped to the FX properly, although that is most certainly not their fault, as the compiler was not ready when they released it - which was not the case with Valve.

likewise with TR:AoD the PS2.0 DoF effect is full screen, and again we get similar issues.

The one game out that shows a major performance rift. You look at the game, the visuals, and the performance, and you can tell me with a straight face that it is close to decently done? The game runs significantly slower than either D3 or HL2 and looks like a DX7 title in comparison (ignoring the horrendous game itself completely).

I don't know the composition of the shaders in Halo, but given it's an XBox port I'd wager that the majority of them are DX8 shaders.

Gearbox has talked decently about how many of the shaders they redid to improve visuals using DX9 shaders. And Halo has plenty of levels that have nigh every pixel running a shader too.

so depending on those variables the performance difference can be very large or very small.

So you are going on record saying that 70% should be what people should expect? This is what Valve showed, and as you stated Valve was trying to send a message. I would really like to see you go on record stating this.

Pete-

So it's not that hard to believe that Valve tried out their DX9 path on FX cards around E3.

The bench was from the E3 build, and Valve said they spent five times longer optimizing for nVidia's parts. What does that tell you?

The point is that nV hasn't made things as easy as ATi for devs coding to generic DX9, as 3DM03, TR:AoD, and Halo benches have shown.

The 52.16s are public now (although not official quite yet); the 5900U is faster than the R9800Pro in Halo, and the public is allowed to scrutinize the drivers all they like. This did not require any extra effort on Gearbox's part.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
BFG-

You didn't explain anything, you pulled a theory out of thin air. "nVidia's history" does not account for the fact that the driver compression just happened to arrive straight after both FutureMark and Unwinder's findings were published.

You think they managed to get them up and running that quickly? That seems awfully hard to believe to me.

Wonderful, so you're further illustrating my point that Quake III isn't the only title to have inflated results on nVidia cards since other Quake III based games are also using texture compression. How many reviewers are applying the DXT3 fix on pre-NV25 cards when they benchmark the likes of JK2 and RTCW?

JK2 is CPU limited anyway, but that would be a valid point for the older boards that would be vid limited.

If the game changes enough then yes, it does. That is the nature of hard coding drivers around a given game based on the state it was when the drivers were made.

Not necessarily. Tell me: how would the filtering optimisations ever break a game?

Also show me how ATi's AF has ever been unusably bad to the point of users disabling it completely and/or working around it like nVidia's DXT1 method was.

Look at a R2x0 board running AF, it is horrendous.

That's just an excuse. How is that any different to users cranking up AA and then getting unusably slow performance? Should we now remove all methods of control from the drivers based on stupid users?

That's why we have apps like RivaTuner and RageTweaker. Also, when ATi excludes a feature that d@mn near 100% of the users who want control over their hardware use, you claim it is irrelevant......?

If I request trilinear on my 9700 Pro then I get it. I can spend all night trying to change my application to UT2003.exe and it doesn't make a shred of difference.

Fire up any game that doesn't have an AF option in game and force enable Quality mode in the control panel with the 3.8s. You get trilinear on one texture stage, and that's it. All the rest are straight bilinear, not even a brilinear hack. Have you not noticed this blatant 'cheating' yet?
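A minimal sketch of the per-stage behaviour being described; the stage count is illustrative and this is not driver code:

```python
# Sketch of the per-stage compromise described above: with AF forced from the
# control panel, only texture stage 0 receives trilinear filtering and every
# later stage falls back to bilinear. The stage count is illustrative.

def forced_quality_af_filter(stage: int) -> str:
    """Filter mode applied to a texture stage under forced 'Quality' AF."""
    return "trilinear" if stage == 0 else "bilinear"

print([forced_quality_af_filter(s) for s in range(4)])
# ['trilinear', 'bilinear', 'bilinear', 'bilinear']
```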

And they do. The NV3x has very poor design decisions such as limited temporary registers and a very fiddly rendering engine that relies on things being exactly the way it expects them, otherwise it suffers from abysmal performance.

Let's see how your R300 runs the FX compiled code when it is the only option. You may not quite think the way you do now.

ATi's R3xx beats the NV3x in both brute force and IPC and it'll quite happily crunch through optimised and unoptimised code like there's no tomorrow.

IPC? Please, let's keep it reasonable here, no need to go Mac user. And I'm sure you would be real happy with code that would not run at all on your board that was optimized for the FX.

Microsoft's compiler was used for mixed mode and it was still unable to beat ATi's full precision path. Again I'll ask, why bother wasting time on an even slower path for even less diminishing returns?

They were trying to convince developers and people like you that there was a much larger performance gap than there actually is. Apparently it at least partly worked.

Halo? Given it runs on the X-Box's Direct 8.x hardware I can't imagine there'd be a staggering amount of PS 2.0 code in there if any at all.

TRAoD runs on the PS2's DX5 level hardware, you sure you want to use that line? OK, we'll go with it.

No, not when other genuine DirectX 9 titles such as 3DMark03 and Tomb Raider show exactly the same abysmal performance.

But according to you TRAoD is DX5, right?

Again I fail to see how PS 2.0 instructions are running on an X-Box.

I fail to see how DX6, DX7 or DX8 are running on the PS2, let alone DX9.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Why do these discussions always end in nit-picking, bickering and flame wars? At least I know what a jiffy is, I suppose.

**FLAME SUIT EQUIPPED**

If only the laws of capitalism could be broken...then we'd get to see all companies abide by pure ethics and such. Only then would ATi stop firing all of their driver programmers while Nvidia stopped programming their GPUs to do illicit things. After all that, we could possibly all play patsy together while involved in a shindig.

 

DaveBaumann

Member
Mar 24, 2000
164
0
0
But you stated they were trying to make a statement to developers. If that was the case, why would they not be honest and come out and explain that the test wasn't optimized using DX9's own compilers for the FX?

They have - they told us.

So with nV it isn't OK because they can't do it in D3D apps, but with ATi is it OK because they can do it in an even smaller amount of apps?

Whatever occurs in the control panel is up to the suppliers of the drivers - some of the things that are put in there may be cheesy or they may not, but this is outside of the application and there is essentially no right or wrong here. What should always be right is to do what the app asks for.

The same company that went on and on about spending five times as much time working with nV as they did with ATi couldn't compile the code they already had for the FX?

...figuring it would be more useful in the path they actually coded for it...

Funny the talk about spending five times as long getting it to work with a FX while they used the E3 build to bench the game and they first attempted the game right before that, not to mention they had spent over half a year working on getting the game running on a R300.

Again, writing != optimising. The benchmark they used was complete with the MixedMode path by the time it was benchmarked.

I consider 3DM2K3 to be utterly irrelevant for numerous reasons, mainly it is far too slow on all the DX8 tests versus what you see on screen(particularly GT3

I don't really care whether you think it's irrelevant or not. Fact is, every pixel is touched by a PS2.0 shader and we know what happened there.

The one game out that shows a major performance rift. You look at the game, the visuals, and the performance and you can tell me with a straight face that it is close to decently done?

Ben, you appear to be building a habit of ignoring or disparaging anything that shows something you don't want it to. It's a little tedious. Does this also mean that the countless other PS2 benchmarks that have been generated, such as ShaderMark, ShaderMark2, Marko FillRate Tester, RightMark3D, etc., are all wrong/poorly coded/bad?

However, regardless of the rest of the game, the DoF effect in TR is quite well done IMO.

And Halo has plenty of levels that have nigh every pixel running a shader too.

But what shader?

So you are going on record saying that 70% should be what people should expect?

Nice, Ben, now you're putting words in people's mouths.
 

spam

Member
Jul 3, 2003
141
0
0
I finished reading the whole of this thread, and if I could mention one point that seems to be agreed upon, it is that Nvidia has been cheating. Some have gone on to point out that ATI has done the same thing. However, ATI has a different response to its critics than Nvidia: when caught with optimizations (shader execution order in 3dmark), it stated that it would take them out. Nvidia has had a far more murky response to the documentation of its cheats. It has made promises not to sacrifice IQ for FPS and yet clearly it has done so.

I think that currently ATI can afford to show greater integrity; they have the better hardware and software design. Nvidia cannot afford to show integrity, honesty or forthrightness because they will lose market share. The bottom line is the ALMIGHTY $; any moral compass is pulled askew by the dollar's influence. That is just sad.

For our part as consumers, we can defend our value for dollar by buying only the part that offers the best value for the money. That means we try to penetrate through the IQ versus FPS issues; the ALMIGHTY $ is the bottom line.

For my part, I will add honesty and integrity to my buying decision. ATI has shown the greater amount of honesty, IQ and performance.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Um, nV only released the 5800 in Feb/Mar, and they only released DX9 drivers a few months after that. So it's not that hard to believe that Valve tried out their DX9 path on FX cards around E3. Valve had 9700P's available for six months by E3, and I believe ATi had released DX9 drivers a few months before E3, too.

Devs will get Dev cards well before launch. I believe carmack was talking about issues back last winter well before the release of the cards.

But hey, maybe Valve has horrible time management. Who knows? Everything is gossip and innuendo until I'm able to test HL2 on my own PC. I still believe Valve was being honest about the state of their game and nV's drivers when they made their Shader Day presentation, but maybe MS and nV have gotten things in gear since then. The point is that nV hasn't made things as easy as ATi for devs coding to generic DX9, as 3DM03, TR:AoD, and Halo benches have shown. I still don't trust nV's current 3DM03 numbers (I'm awaiting FM's approved driver list by Oct. 31st), but their improvements in Halo and purported improvements in HL2 are promising.

I am not terribly convinced Valve really made much of an effort to work on the NV path. And if Dave is correct, they didn't even start working on an NV path until after E3. That means they spent all of about 12-15 weeks trying to optimize all their shaders for the FX cards. To me that doesn't seem like a lot of time. Then you tack on the marketing blitz by Gabe, and them basically telling fibs about the release date, and I am not impressed.

 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: spam
I finished reading the whole of this thread, and if I could mention one point that seems to be agreed upon, it is that Nvidia has been cheating. Some have gone on to point out that ATI has done the same thing. However, ATI has a different response to its critics than Nvidia: when caught with optimizations (shader execution order in 3dmark), it stated that it would take them out. Nvidia has had a far more murky response to the documentation of its cheats. It has made promises not to sacrifice IQ for FPS and yet clearly it has done so.

I think that currently ATI can afford to show greater integrity; they have the better hardware and software design. Nvidia cannot afford to show integrity, honesty or forthrightness because they will lose market share. The bottom line is the ALMIGHTY $; any moral compass is pulled askew by the dollar's influence. That is just sad.

For our part as consumers, we can defend our value for dollar by buying only the part that offers the best value for the money. That means we try to penetrate through the IQ versus FPS issues; the ALMIGHTY $ is the bottom line.

For my part, I will add honesty and integrity to my buying decision. ATI has shown the greater amount of honesty, IQ and performance.



 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Dave-

They have - they told us.

How long after their PR event?

Whatever occurs in the control panel is up to the suppliers of the drivers - some of the things that are put in there may be cheesey or they may not, but this is outside of the application and there is essentially no right or wrong here. What should always be the right is to do what the app asks for.

So then the R2x0 was cheating? Whenever AF and trilinear were selected, you didn't get it.

...figuring it would be more useful in the path they actually coded for it...

You said they wanted to demonstrate the relative performance of ATi v nV in terms of shader performance. Are you now saying that you didn't mean that?

I don't really care whether you think its irrelavent or not.

Then why bother replying? Civility is certainly escaping you here.

Ben, you appear to be building a habbit of ignoring or disparaging anything that shows something that you don't want it to. Its a little tedious. Does this also mean that the countless other PS2 benchmarks that have been generated such as SahderMark, ShaderMark2, Marko FillRate Tester, RightMark3D etc, etc, are all wrong/poorly coded/bad?

Your ignoring what I have actually said became tiring some time ago. Yet again in this thread I will say that I expect the R3X0 to be in the area of 25%-35% faster than the comparable FX part in PS2 performance on average. HL2 showed a 70% rift, which you have been staunchly supporting. I have not claimed that the FX is faster than, or even comparable to, the R3X0 in PS 2.0 support, and no amount of your attempts to read otherwise will change that. The pure synthetic PS 2.0 tests show particular shader performance, period. They sure as he!l don't render back to front to intentionally increase the load on the video card. Any developer that did that for a game, with the intention of slowing it down, I would consider to be extraordinarily stupid - yet it isn't supposed to be poorly coded by your standards.

But what shader?

Ask Gearbox about it. They aren't getting any TWIMTBP or ATi PR money, so they should be able to give you a straight-up answer on that one. Performance drops a fair amount on both ATi and nV parts when enabling the PS 2.0 shaders, just nowhere near the catastrophic gap that you seem to want everyone to think will happen if they try and run a PS 2.0 game on nV hardware.

Nice, Ben, now you're putting word in peoples mouths.

so depending on those variables the performance difference can be very large or very small.

I will ask you then: roughly what level of performance difference do you expect between FX and R3X hardware when running DX9 games? It has been a year now; we have one title that shows a big rift, one that shows a slight edge going the other way, and all of the other games fail to use the new uber gfx features. I've been saying, repeatedly, that I expect the FX to be a bit slower, but nothing close to the 70% that Valve was trying to send a message with.
 