Originally posted by: BenSkywalker
Why are you mentioning PS/VS 3.0?
Because it is part of the DX 9.0 spec.
You're right, my mistake.
Carmack was asking for FP16 all along; he commented numerous times that he was happy with FP16. And as far as coding a special path goes, so is ARB2. The OpenGL 2.0 spec was years away from completion when Carmack started coding Doom3.
But Carmack has also had to resort to FX12 for the NV3x. The NV3x can't run the standard ARB2 path at anywhere near the speed of the R3x0.
The point is that the NV3x needs special considerations that deviate from the standard APIs, whereas the R3x0 does not. That's just the way it is. The fact that Nvidia renders with 8 more bits of precision in full-precision mode (FP32 vs. FP24) is no excuse for the extremely poor performance. ATI's FP24 is as fast as or faster than Nvidia's FP16, and FP24 was deemed enough for full precision in DX9.
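To put rough numbers on that - this is a minimal sketch assuming the commonly cited bit layouts (FP16 = s10e5, FP24 = s16e7, FP32 = s23e8), not anything pulled from either vendor's docs - comparing the mantissa widths and the resulting precision near 1.0:

    // Rough comparison of the shader float formats in question.
    // Bit layouts assumed: FP16 = s10e5, FP24 = s16e7, FP32 = s23e8
    // (sign / mantissa bits / exponent bits).
    #include <cmath>
    #include <cstdio>

    int main() {
        struct Format { const char* name; int mantissaBits; };
        const Format formats[] = {
            { "FP16 (NV3x partial precision)", 10 },
            { "FP24 (R3x0 full precision)",    16 },
            { "FP32 (NV3x full precision)",    23 },
        };
        for (const Format& f : formats) {
            // Relative step between adjacent representable values near 1.0:
            // one unit in the last place, i.e. 2^-mantissaBits.
            double ulp = std::ldexp(1.0, -f.mantissaBits);
            std::printf("%-30s %2d mantissa bits, step near 1.0 ~ %.2g\n",
                        f.name, f.mantissaBits, ulp);
        }
        return 0;
    }

The jump from 10 to 16 mantissa bits is the reason FP24 qualified as full precision under DX9 while FP16 only counts as partial precision; the 8 extra bits FP32 carries over FP24 (one more exponent bit, seven more mantissa bits, under these assumed layouts) are what Nvidia pays for in speed.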
Nvidia made some bad design decisions that sacrificed too much floating-point shader speed.
Compiler optimizations will gain SOME speed, but it's obvious that both Nvidia and game developers feel that the NV3x doesn't have enough shader power to run full DX9-level code.
nV feels that way? Why do I seriously doubt that...
Apparently they do, or they wouldn't have gotten developers to rewrite DX9 content to lower standards. The HL2 mixed-mode path reportedly doesn't even use effects like HDR.
Another reason is, of course, all the driver cheating involving shader and precision hacks.
Nvidia denounced 3DMark03 as unrepresentative of DX9 because it uses more PS1.4 than PS2.0, but now they're promoting the use of PS1.4 as a substitute for PS2.0!
It can't even run floating-point-only code!
Yes it can.
I meant satisfactorily. That's why the first couple of DX9 games are actually defaulting to DX8 mode for the low-to-mid-range cards, and why mixed-mode paths have had to be implemented. Either the game will reduce precision for these cards, or the drivers will. Then there's Doom3, which uses FX12/FP16 for the NV3x path.
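For what it's worth, the kind of fallback these games are doing isn't exotic. Here's a sketch of the idea in C++ - decideRenderPath() and the preferPartialPrecision flag are hypothetical, but the version packing mirrors the DirectX 9 SDK's D3DPS_VERSION macro:

    #include <cstdio>

    // Same packing as the DX9 SDK's D3DPS_VERSION(major, minor) macro.
    unsigned psVersion(unsigned major, unsigned minor) {
        return 0xFFFF0000u | (major << 8) | minor;
    }

    // Hypothetical path selection based on the device's reported
    // pixel shader version (as found in D3DCAPS9::PixelShaderVersion).
    const char* decideRenderPath(unsigned version, bool preferPartialPrecision) {
        if (version >= psVersion(2, 0)) {
            // Full PS2.0 hardware; an NV3x board may still request a
            // mixed-mode path (lower precision, PS1.x substitutes) for speed.
            return preferPartialPrecision ? "DX9 mixed-mode path"
                                          : "full DX9 path";
        }
        if (version >= psVersion(1, 4)) return "DX8.1 path (ps_1_4)";
        return "DX8 path (ps_1_1)";
    }

    int main() {
        std::printf("%s\n", decideRenderPath(psVersion(2, 0), true));  // NV3x-style
        std::printf("%s\n", decideRenderPath(psVersion(2, 0), false)); // R3x0-style
        return 0;
    }

And that's exactly the problem: on paper the NV3x passes the version check for the full path; the decision to drop it to mixed mode comes from benchmarking, not from the caps bits.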
The DX9 spec doesn't include integer precision,
Yes it does.
Okay, I should be more precise: the PS2.0 spec doesn't!
Forget about whether or not any noticeable quality is lost - the NV3x simply can't handle pure DX9 at a level of performance that is acceptable to developers and gamers.
Performance is the viable topic of discussion here. Everything else you have stated is a question of what level of performance is viable, not whether it can run it at all.
Yes, but I'm not arguing about whether it can. My point is that if it can't practically, then it practically can't. Nvidia is promoting DX9+ compliance in its marketing, but sub-DX9 compliance in the way its cards are measured against the competition and used in games.
As I explained, why would developers go to all the extra trouble of writing a special path if there was any confidence that the NV3x could get up to acceptable speed by the time the games ship?
Why did UbiSoft have to add a dumbed-down version of their code for Splinter Cell for ATi when even nV's DX8 parts can run the game as it was meant to be played?
Nvidia-specific coding.
There are many games that ATI can't run as well as Nvidia (e.g., NWN, Tribes 2), and that has nothing to do with any inability on ATI's part to cope with the API standards.
It's funny how the NV3x is marketed as being "beyond DX9" while developers are being forced to make compromises for even standard DX9. Actually, it's not funny. People are getting ripped off if they want full DX9 functionality. How will the 5200/5600-class products offer that?
Why do you only take into consideration games that are PS 2.0 limited as being fully DX9 compliant?
I'm not. My statement is accurate. Half-Life 2 is an example of where Nvidia can't even handle the standard DX9 requirements - according to the developers, and even Nvidia themselves apparently, since they had to work with Valve to implement a separate path. Anyway, PS2.0 is the standout feature of DX9.
Apparently, Nvidia has GREAT DX9 speed! Or are they deceiving the public, and that's why people are so upset?
Look at Giants and Serious Sam and compare them using the Kyro2. In Serious Sam the Kyro2 could best the GeForce2 Ultra, while in Giants it got whipped by a GeForce1 SDR. Was PVR fooling the public?
I don't know - was PowerVR misrepresenting the Kyro2's performance by claiming, or letting people expect, one thing and then delivering another?
Pete-
Seriously, I'd appreciate their superior feature set if they had drivers to match.
Absolutely, they need drivers that will bring their pixel shader performance up. I'm talking about people saying the NV3x isn't DX9 compliant; that is no more true than it is of the R3x0 parts. I don't recall you stating the NV3x wasn't DX9 compliant, but if you did, then I certainly disagree with that strongly. Do you think the 64-bit 5200NU is going to have problems running Longhorn?
I don't deny that the NV3x is fully DX9 compliant. I'm just saying that it's not being treated as such in the real world (or won't be) - by Nvidia, by developers, and naturally by consumers.