HL2 screenshot differences


Genx87

Lifer
Apr 8, 2002
Doubt the R3xx will be any faster using DX8, because its pixel shaders always run at FP24 no matter what.

And I will have to wait for the movie or whatever they are going to release on the 30th. To be honest, the pictures are nothing special, and all the apologists are saying you have to see it in motion. So I'll reserve judgement before declaring whether it is a big deal.

 

bunnyfubbles

Lifer
Sep 3, 2001
Originally posted by: oldfart
I think it is hard to tell the difference with screenshots. You need to see the game in motion to see the effects. I read that there will be a new version of the tech demo in DX8.0 or 8.1 mode for comparison.

I totally agree, you can't really tell the differences from a simple screenshot; you need to see the game being played. I trust it when I read articles (like the ones here at AT) that say things like: "Although it looks much worse than the other competitors, the GeForce4 Ti 4600 running in its DX8 code path manages to offer fairly decent frame rates".

Sure, the game will probably still look pretty damn good, maybe not as jaw-droppingly good, but playable just the same, which is the important part. Gameplay is what will make the game fun, not super-pretty graphics.
 

BenSkywalker

Diamond Member
Oct 9, 1999
This is not the case with DX9. There are several DX9 games on the way, and I can't see how someone could in good faith recommend a card that is not DX9 compliant.

I've seen this stated numerous times and have been wondering what is meant by it. The R3x0 boards aren't close to having full DX9 support; most of the R3x0 boards are actually further off than the NV3X series, so what is the issue with compliance?
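
(For concreteness: "DX9 compliant" in this argument ultimately comes down to what a game sees when it queries the card's capabilities at startup. Below is a minimal sketch of the standard Direct3D 9 caps query; the path-selection thresholds are illustrative, not taken from any particular game.)

    #include <cstdio>
    #include <d3d9.h>

    int main() {
        // Create the D3D9 object and read back the HAL device's capabilities.
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;

        D3DCAPS9 caps;
        d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

        // "DX9 class" in the sense argued in this thread: shader model 2.0.
        if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0) &&
            caps.VertexShaderVersion >= D3DVS_VERSION(2, 0))
            std::printf("PS/VS 2.0 reported - a DX9 code path is available\n");
        else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
            std::printf("PS 1.x only - fall back to a DX8-level path\n");
        else
            std::printf("No pixel shaders - fixed-function fallback\n");

        d3d->Release();
        return 0;
    }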
 

TheMouse

Senior member
Sep 11, 2002
I figure... if you're going to get a card now, you might as well get one that works well with DirectX 9 (I mean, unless that's out of your price range)... I needed an upgrade from a TNT2, and I don't think anyone can tell me I did the wrong thing by picking up a soft-moddable 9500np in an effort to be able to play upcoming games like HL2 and Doom3.
 

RogerAdam

Member
Oct 14, 2001
Originally posted by: BenSkywalker
This is not the case with DX9. There are several DX9 games on the way, and I can't see how someone could in good faith recommend a card that is not DX9 compliant.

I've seen this stated numerous times and have been wondering what is meant by it. The R3x0 boards aren't close to having full DX9 support; most of the R3x0 boards are actually further off than the NV3X series, so what is the issue with compliance?

As per M$, the ATI DX9 parts were built to conform to the standard, i.e. M$ endorses that fact. Nv is closer? Then why is it that the Nv3x is having so much trouble implementing DX9 features "better" than ATI? What is it, driver bugs??? Everyone who runs an Nv product always makes it a point to point out the superior driver support; now how can that be?

 

IQfreak

Junior Member
Sep 15, 2003
Originally posted by: BenSkywalker
This is not the case with DX9. There are several DX9 games on the way, and I can't see how someone could in good faith recommend a card that is not DX9 compliant.

I've seen this stated numerous times and have been wondering what is meant by it. The R3x0 boards aren't close to having full DX9 support; most of the R3x0 boards are actually further off than the NV3X series, so what is the issue with compliance?

The issue with compliance is simply that Nvidia is completely contradicting their marketing claims by doing everything they can to lower quality and features below DX9 standards: by having developers rewrite a lot of DX9 content (lowering precision all the way down to FX12 in many cases, converting PS2.0 shaders to DX8-level, substituting textures for shaders), and by "optimizing" their drivers.

Nvidia touts features that are practically unusable by its gaming customers. They comply on paper, but not in real-world benchmarks and games.

A person who buys an NV3x-based card cannot use DX9 features the way they would expect, and Nvidia is doing its part to make sure the person doesn't even have a choice to use full DX9, through driver tricks that lower quality.

Any advantages in DX9 features that the NV3x has over the R3x0 are, for all intents and purposes, theoretical. From a gamer's standpoint, the NV3x is essentially non-compliant. And that's all that matters, not BS semantics.



 

jiffylube1024

Diamond Member
Feb 17, 2002
Originally posted by: IQfreak
Originally posted by: BenSkywalker
This is not the case with DX9. There are several DX9 games on the way, and I can't see how someone could in good faith recommend a card that is not DX9 compliant.

I've seen this stated numerous times and have been wondering what is meant by it. The R3x0 boards aren't close to having full DX9 support; most of the R3x0 boards are actually further off than the NV3X series, so what is the issue with compliance?

The issue with compliance is simply that Nvidia is completely contradicting their marketing claims by doing everything they can to lower quality and features below DX9 standards: by having developers rewrite a lot of DX9 content (lowering precision all the way down to FX12 in many cases, converting PS2.0 shaders to DX8-level, substituting textures for shaders), and by "optimizing" their drivers.

Nvidia touts features that are practically unusable by its gaming customers. They comply on paper, but not in real-world benchmarks and games.

A person who buys an NV3x-based card cannot use DX9 features the way they would expect, and Nvidia is doing its part to make sure the person doesn't even have a choice to use full DX9, through driver tricks that lower quality.

Any advantages in DX9 features that the NV3x has over the R3x0 are, for all intents and purposes, theoretical. From a gamer's standpoint, the NV3x is essentially non-compliant. And that's all that matters, not BS semantics.

Exactly! Finally someone who gets it! Well said, IQFreak.
 

Rogodin2

Banned
Jul 2, 2003
Ben

Please enlighten us as to why ATI has sub-DX9 support - all the gurus at Beyond3D, Valve, nvnews.net, and polytechnichii are saying the exact opposite.

rogo
 

BenSkywalker

Diamond Member
Oct 9, 1999
Please enlighten us as to why ATI has sub-DX9 support

PS/VS 3 support - try reading the sites you like to talk about so much; that way I won't have to point out the basics so much.

Nv is closer?

Than most of the R3x0 line, yes. Their shader count limit and functionality are closer to what is required to match PS/VS 3 standards than most of the ATi parts.

Then why is it that the Nv3x is having so much trouble implementing DX9 features "better" than ATI? What is it, driver bugs??? Everyone who runs an Nv product always makes it a point to point out the superior driver support; now how can that be?

This is actually the most telling aspect as to what most of the die-hard ATi loyalists think about their drivers. If anything is wrong with nV, it HAS to be intentional, even in beta drivers. Creating a compiler for a new shader architecture can't possibly be something nV would have bugs in? Even in beta state, their driver team is so much better than ATi's that it isn't possible? That seems to be what people are saying.

Any advantages in DX9 features that the NV3x has over the R3x0 are, for all intents and purposes, theoretical. From a gamer's standpoint, the NV3x is essentially non-compliant. And that's all that matters, not BS semantics.

Which games are out now showing problems with the last released drivers? If HL2 has shipped, I have to talk to EB, as my pre-order sure as hell didn't ship. Haven't seen my copy of Doom3 or Halo either. Tomb Raider is faster under ATi (we will leave the quality of the title out of the discussion, since the developers sure as hell left the quality out of the game); is that reason to say that nV doesn't support the features? Is the fact that ATi can be faster at something that much of a shock to people? If even ATi can do it faster, then nV must have something broken??
 

ZimZum

Golden Member
Aug 2, 2001
Than most of the R3x0 line, yes. Their shader count limit and functionality are closer to what is required to match PS/VS 3 standards than most of the ATi parts.

Are you comparing the entire product lines? Or are you saying that the 5900U's DX9 implementation is superior to that of the 9800 Pro?
 

IQfreak

Junior Member
Sep 15, 2003
Originally posted by: BenSkywalker

Than most of the R3x0 line, yes. Their shader count limit and functionality are closer to what is required to match PS/VS 3 standards than most of the ATi parts.

Why are you mentioning PS/VS 3.0? We're talking about DX9, not DX9.1. What does it matter to a gamer if the NV3x can support something like 1024 instructions, when it's too slow to handle even a tenth of that in real games? And longer shaders require full precision, but the NV3x's FP32 speed is severely handicapped.

ATI made all the right compromises for a fully usable DX9 consumer chip. Why design a DX9+ chip when it can't even handle DX9 "the way it's meant to be played"?


This is actually the most telling aspect as to what most of the die-hard ATi loyalists think about their drivers. If anything is wrong with nV, it HAS to be intentional, even in beta drivers. Creating a compiler for a new shader architecture can't possibly be something nV would have bugs in? Even in beta state, their driver team is so much better than ATi's that it isn't possible? That seems to be what people are saying.


If they're just driver bugs, then why is Nvidia resorting to all these artificial optimizations to gain speed? Why did Id and Valve have to spend so much time writing a special path, if Nvidia's drivers are just "buggy" and not representative of their true shader speed? Do you expect Nvidia to fix the speed and then tell Valve that all the extra work they did for the mixed-mode path was in vain?

Compiler optimizations will gain SOME speed, but it's obvious that both Nvidia and game developers feel that the NV3x doesn't have enough shader power to run full DX9-level code. It can't even run pure floating-point! The DX9 spec doesn't include integer precision, but that's exactly what the NV3x *needs*. Forget about whether or not any noticeable quality is lost - the NV3x simply can't handle pure DX9 at a level of performance that is acceptable to developers and gamers.



Any advantages in DX9 features that the NV3x has over the R3x0 are, for all intents and purposes, theoretical. From a gamer's standpoint, the NV3x is essentially non-compliant. And that's all that matters, not BS semantics.

Which games are out now showing problems with the last released drivers? If HL2 has shipped, I have to talk to EB, as my pre-order sure as hell didn't ship. Haven't seen my copy of Doom3 or Halo either. Tomb Raider is faster under ATi (we will leave the quality of the title out of the discussion, since the developers sure as hell left the quality out of the game); is that reason to say that nV doesn't support the features? Is the fact that ATi can be faster at something that much of a shock to people? If even ATi can do it faster, then nV must have something broken??


As I explained, why would developers go to all the extra trouble of writing a special path if there was any confidence that the NV3x could get up to acceptable speed by the time the games ship? Why would Nvidia push for such extreme accommodations?

It's funny how the NV3x is marketed as being "beyond DX9", but developers are being forced to make compromises for even standard DX9. Actually, it's not funny. People are getting ripped off if they want full DX9 functionality. How will the 5200/5600-class products offer that?

By the way, how does the mass market know ATI is faster? The most popular DX9 benchmark (3dmark03) shows Nvidia as the best! Aquamark with the Det50s shows Nvidia is just as fast as ATI! And we didn't get to see HL2 performance with the Det50s. Apparently, Nvidia has GREAT DX9 speed! Or are they deceiving the public, and that's why people are so upset?

 

Vonkhan

Diamond Member
Feb 27, 2003
Haven't seen my copy of Doom3 or Halo either

You been living under a rock where you never heard of Kazaa?



Halo, the leaked version, is good but not impressive, and runs better on my 9800pro than on my roomie's 5900u.
 

Pete

Diamond Member
Oct 10, 1999
Originally posted by: BenSkywalker
Please enlighten us as to why ATI has sub-DX9 support

PS/VS 3 support - try reading the sites you like to talk about so much; that way I won't have to point out the basics so much.

Nv is closer?

Than most of the R3x0 line, yes. Their shader count limit and functionality are closer to what is required to match PS/VS 3 standards than most of the ATi parts.

Superb. Another "almost" feature from nVidia to add to their marketing checklist. Wake me when nV gets the currently-exposed features of DX9 working at a decent speed...

Seriously, I'd appreciate their superior feature set if they had drivers to match. As it stands, ATi has had far superior DX9 support and speed for its DX9 parts for far longer than nV has. It's been six months since release and we're still seeing drivers that increase performance by huge leaps. Just because we don't have many DX9 titles available yet doesn't mean nV can get away with selling a "DX9+" video card that's anemically slow at mere DX9 settings.
 

BenSkywalker

Diamond Member
Oct 9, 1999
Why are you mentioning PS/VS 3.0?

Because it is part of the DX 9.0 spec.

If they're just driver bugs, then why is Nvidia resorting to all these artificial optimizations to gain speed? Why did Id and Valve have to spend so much time writing a special path

Carmack was asking for FP16 all along; he commented numerous times that he was happy with FP16. As far as coding a special path goes, ARB2 is one too - the OGL 2.0 spec was years away from completion when Carmack started coding Doom3.

Compiler optimizations will gain SOME speed, but it's obvious that both Nvidia and game developers feel that the NV3x doesn't have enough shader power to run full DX9-level code.

nV feels that way? Why do I seriously doubt that...

It can't even run pure floating-point!

Yes it can.

The DX9 spec doesn't include integer precision,

Yes it does.

Forget about whether or not any noticeable quality is lost - the NV3x simply can't handle pure DX9 at a level of performance that is acceptable to developers and gamers.

Performance is the viable topic of discussion here. Everything else you have stated is a question of what level of performance is viable, not whether it can run at all.

As I explained, why would developers go to all the extra trouble of writing a special path if there was any confidence that the NV3x could get up to acceptable speed by the time the games ship?

Why did UbiSoft have to add in a dumbed-down version of their code for Splinter Cell for ATi, when even nV's DX8 parts can run the game as it was meant to be played?

It's funny how the NV3x is marketed as being "beyond DX9", but developers are being forced to make compromises for even standard DX9. Actually, it's not funny. People are getting ripped off if they want full DX9 functionality. How will the 5200/5600-class products offer that?

Why do you only consider games that are PS 2.0 limited to be fully DX9 compliant?

Apparently, Nvidia has GREAT DX9 speed! Or are they deceiving the public, and that's why people are so upset?

Look at Giants and Serious Sam and compare them using the Kyro2. In SS the Kyro2 could best the GeForce2 Ultra, while in Giants it got whipped by a GeForce1 SDR. Was PVR fooling the public?

Pete-

Seriously, I'd appreciate their superior feature set if they had drivers to match.

Absolutely, they need drivers that will bring their Pixel Shader performance up. I'm talking about people saying the NV3X isn't DX9 compliant; that is no more true than it is of the R3x0 parts. I don't recall you stating the NV3X wasn't DX9 compliant, but if you did, then I certainly disagree with that strongly. You think that the 64-bit 5200NU is going to have problems running Longhorn?
 

Pete

Diamond Member
Oct 10, 1999
I think the upper FX line is DX9 compliant, though I somewhat disagree (in principle, probably not in practice given current manufacturing and economic realities) with the 5200 being able to do only FP16 and still being labelled DX9. I'm not even going to get into needing a 3D card for a 2D OS...

Anyway, back to Rollo's original point that DX8 may look as good as DX9 in HL2. I found this quote in FiringSquad's second HL2 benchmark article interesting: "The shadows in Half-Life 2's DX9 mode are soft shadows, whereas the DX8 shadows have hard edges." I think soft-edged shadows would constitute a significant IQ enhancement.

FS also had something to say about HDR:
If the performance is there for RADEON users, HDR could be a real selling point for ATI, as it's definitely one of those features that you're not going to want to turn off once you've seen it in action. Remember the first time you saw the sun in Gran Turismo 3? To the uninitiated, HDR is similar to that effect, only it's about two times more effective. In addition, the light reflects off of reflective or shiny surfaces. When you combine this with the Half-Life 2 water (which is the most accurate representation of water we've seen in a game to date), can you imagine how good HDR would look at sunset over a large body of water? HDR will also be used for effects like muzzle flashes.
Sounds good to me.
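
(A rough illustration of what HDR means technically: shading keeps luminance values above 1.0 instead of clamping them to the framebuffer's [0,1] range, and only compresses them for display at the very end. The Reinhard-style curve below is a common textbook choice, not necessarily what Valve uses.)

    #include <cstdio>

    // With a fixed [0,1] framebuffer, anything brighter than 1.0 clamps and
    // the detail is gone. HDR keeps the wide-range value through shading and
    // only compresses at display time, e.g. with a Reinhard-style x/(1+x).
    float tonemap(float x) { return x / (1.0f + x); }

    int main() {
        const float scene[] = {0.2f, 1.0f, 4.0f, 16.0f};  // linear luminances
        for (float v : scene)
            std::printf("scene %5.1f -> clamped %.2f, tonemapped %.2f\n",
                        v, v > 1.0f ? 1.0f : v, tonemap(v));
        return 0;
    }

Note how 4.0 and 16.0 both clamp to the same 1.00 without HDR, while the tone-mapped values still distinguish them - that is the bright-light detail the FiringSquad quote is describing.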
 

IQfreak

Junior Member
Sep 15, 2003
12
0
0
Originally posted by: BenSkywalker
Why are you mentioning PS/VS 3.0?

Because it is part of the DX 9.0 spec.


You're right, my mistake.


Carmack was asking for FP16 all along; he commented numerous times that he was happy with FP16. As far as coding a special path goes, ARB2 is one too - the OGL 2.0 spec was years away from completion when Carmack started coding Doom3.


But Carmack has also had to resort to FX12 for the NV3x. Nvidia can't run the standard ARB2 path at anywhere close to the speed of the R3x0.

The point is that the NV3x needs special considerations that deviate from the standard APIs, whereas the R3x0 does not. That's just the way it is. The fact that Nvidia renders 8 bits more precision in full precision mode is no excuse for the extremely poor performance. ATI's FP24 is as fast or faster than Nvidia's FP16. And FP24 was deemed to be enough for full-precision in DX9.
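
(To put rough numbers on the FP16/FP24/FP32 comparison: the commonly cited layouts carry about 10, 16, and 23 explicit mantissa bits respectively. A minimal sketch that rounds a value to each mantissa width, assuming those layouts and ignoring exponent-range differences.)

    #include <cstdio>
    #include <cmath>

    // Round x to an n-bit explicit mantissa (plus the implicit leading bit),
    // approximating how FP16/FP24/FP32 storage would quantize it.
    double round_mantissa(double x, int bits) {
        if (x == 0.0) return 0.0;
        int e;
        double m = std::frexp(x, &e);          // x = m * 2^e, m in [0.5, 1)
        double s = std::ldexp(1.0, bits + 1);  // quantization steps for m
        return std::ldexp(std::round(m * s) / s, e);
    }

    int main() {
        double x = 1.0 / 3.0;
        std::printf("fp16 (~10 bits): %.9f\n", round_mantissa(x, 10));
        std::printf("fp24 (~16 bits): %.9f\n", round_mantissa(x, 16));
        std::printf("fp32 ( 23 bits): %.9f\n", round_mantissa(x, 23));
        return 0;
    }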

Nvidia made some bad design decisions that sacrificed too much floating-point shader speed.


Compiler optimizations will gain SOME speed, but it's obvious that both Nvidia and game developers feel that the NV3x doesn't have enough shader power to run full DX9-level code.

nV feels that way? Why do I seriously doubt that...


Apparently they do, or they wouldn't have gotten developers to rewrite DX9 content to lower standards. The HL2 mixed-mode path reportedly doesn't even use effects like HDR.

Another reason is, of course, all the driver cheating involving shader and precision hacks.

Nvidia denounced 3dmark03 for not being representative of DX9 because it uses more PS1.4 than PS2.0, but now they're promoting the use of PS1.4 as a substitute for PS2.0!


It can't even run pure floating-point!

Yes it can.


I meant satisfactorily. That's why the first couple of DX9 games are actually defaulting to DX8 mode for the low-to-mid range cards, and why mixed-mode paths have had to be implemented. Either the game will reduce precision for the cards, or the drivers will. Then there's Doom3, which uses FX12/FP16 for the NV3x path.


The DX9 spec doesn't include integer precision,

Yes it does.


Okay, I should be more precise. The PS2.0 spec!


Forget about whether or not any noticeable quality is lost - the NV3x simply can't handle pure DX9 at a level of performance that is acceptable to developers and gamers.

Performance is the viable topic of discussion here. Everything else you have stated is a question of what level of performance is viable, not whether it can run at all.


Yes, but I'm not arguing about whether it can. My point is that if it can't practically, then it practically can't. Nvidia is promoting DX9+ compliance in their marketing, but sub-DX9 compliance in the way their cards are measured against the competition, and used in games.


As I explained, why would developers go to all the extra trouble of writing a special path if there was any confidence that the NV3x could get up to acceptable speed by the time the games ship?

Why did UbiSoft have to add in a dumbed-down version of their code for Splinter Cell for ATi, when even nV's DX8 parts can run the game as it was meant to be played?


Nvidia-specific coding.

There are many games that ATI can't run as well as Nvidia (ex: NWN, Tribes 2), and it has nothing to do with ATI's inability to cope with the API standards.


It's funny how the NV3x is marketed as being "beyond DX9", but developers are being forced to make compromises for even standard DX9. Actually, it's not funny. People are getting ripped off if they want full DX9 functionality. How will the 5200/5600-class products offer that?

Why do you only consider games that are PS 2.0 limited to be fully DX9 compliant?


I'm not. My statement is accurate. Half-Life 2 is an example of where Nvidia can't even handle the standard DX9 requirements - according to the developers, and even Nvidia themselves apparently, since they had to work with Valve to implement a separate path. Anyway, PS2.0 is the standout feature of DX9.


Apparently, Nvidia has GREAT DX9 speed! Or are they deceiving the public, and that's why people are so upset?

Look at Giants and Serious Sam and compare them using the Kyro2. In SS the Kyro2 could best the GeForce2 Ultra, while in Giants it got whipped by a GeForce1 SDR. Was PVR fooling the public?


I don't know, was Kyro misrepresenting their performance by claiming or being expected/believed to do one thing, and then doing something else?


Pete-

Seriously, I'd appreciate their superior feature set if they had drivers to match.

Absolutely, they need drivers that will bring their Pixel Shader performance up. I'm talking about people saying the NV3X isn't DX9 compliant; that is no more true than it is of the R3x0 parts. I don't recall you stating the NV3X wasn't DX9 compliant, but if you did, then I certainly disagree with that strongly. You think that the 64-bit 5200NU is going to have problems running Longhorn?


I don't deny that the NV3x is fully DX9 compliant. I'm just saying that it's not being treated as such in the real world (or won't be) - by Nvidia, by developers, and naturally by consumers.

 

nemesismk2

Diamond Member
Sep 29, 2001
Originally posted by: BenSkywalker


Look at Giants and Serious Sam and compare them using the Kyro2. In SS the Kyro2 could best the GeForce2 Ultra, while in Giants it got whipped by a GeForce1 SDR. Was PVR fooling the public?

The problem with the Kyro 2 was that most game companies did very little testing with it.
The performance with Serious Sam wasn't great until a patch was released for it, and that became a common thing with the Kyro 2.
 

BenSkywalker

Diamond Member
Oct 9, 1999
I think the upper FX line is DX9 compliant, though I somewhat disagree (in principle, probably not in practice given current manufacturing and economic realities) with the 5200 being able to do only FP16 and still being labelled DX9. I'm not even going to get into needing a 3D card for a 2D OS...

If my mother was upgrading her gfx card today, I'd pick her up a 5200 because of Longhorn (she insisted on purchasing a WinXP license despite my having seven unused Win2K licenses to my name, just so she can run the latest version of Windows, and she insists on being able to run with everything on). For what the 5200 is going to see for the majority of its life in the systems that will be running it, people will be pleased with it.

Anyway, back to Rollo's original point that DX8 may look as good as DX9 in HL2. I found this quote in FiringSquad's second HL2 benchmark article interesting: "The shadows in Half-Life 2's DX9 mode are soft shadows, whereas the DX8 shadows have hard edges." I think soft-edged shadows would constitute a significant IQ enhancement.

I guess where I would rank it would depend on how well they were done. It certainly has the potential to be a significant IQ enhancer.

FS also had something to say about HDR:

Not expressly about that in particular, but are you disheartened by the lack of NV30 numbers for HL2? It really would have helped get a gauge on how much potential the NV35 has with updated drivers.

IQF-

But Carmack has also had to resort to FX12 for the NV3x.

Had to or did are two different things. If a shader gains no benefit from running at a higher level of precision, then why do it? Using FX12 where it works does two things: one, it keeps the NV30/31/34 running faster when the game is out; two, it helps all of those with DX8 boards. What point is there running a shader in FP24 if FX12 gives the exact same results for that shader?
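
(To make the FX12 trade-off concrete: FX12 is commonly described as a 12-bit signed fixed-point format covering roughly [-2, 2). For a simple color modulate in [0,1] the rounding error is invisible, but wide-range intermediates clamp, which is the catch hiding in "where it works". A sketch under that assumed format.)

    #include <cstdio>
    #include <cmath>

    // Quantize to an assumed FX12 layout: 12-bit signed fixed point with
    // 10 fractional bits, i.e. representable values in [-2.0, 2047/1024].
    float to_fx12(float x) {
        const float scale = 1024.0f;          // 2^10 fractional steps
        float q = std::round(x * scale);
        if (q >  2047.0f) q =  2047.0f;       // clamp to 12-bit signed range
        if (q < -2048.0f) q = -2048.0f;
        return q / scale;
    }

    int main() {
        // A color modulate in [0,1] survives FX12 essentially unchanged...
        float c = 0.73f * 0.41f;
        std::printf("fp32: %.6f  fx12: %.6f\n", c, to_fx12(c));
        // ...but anything outside the range clamps, which is why long math
        // chains and wide-range intermediates need floating point.
        std::printf("fp32: %.3f  fx12: %.3f\n", 7.5f, to_fx12(7.5f));
        return 0;
    }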

The point is that the NV3x needs special considerations that deviate from the standard APIs, whereas the R3x0 does not.

For Doom3, the entire ARB2 path deviates from the API; OGL 2.0 was not finalized when the NV3x core was finished, let alone the R3x0. Carmack has explained quite nicely that the reason for making an nV path was that he could: nV had extensions which allowed him to do what he wanted. He also has a code path for the NV2X line, the NV1X line and the R2X0 line. You could easily make the argument that the ARB2 path is the R3x0's code path, just one that the NV3X boards can run if they choose to (though with his statements that there is no discernible quality difference, I can't see why you would want to).

Nvidia made some bad design decisions that sacrificed too much floating-point shader speed.

Should we say the same about ATi and their shadowing speed?

I'm not. My statement is accurate.

Freelancer is DX9; where is the big performance edge there? My Ti4200 ran it better than my 9500Pro. The game didn't utilise PS 2.0, and the edge wasn't there.

There are many games that ATI can't run as well as Nvidia (ex: NWN, Tribes 2), and it has nothing to do with ATI's inability to cope with the API standards.

You mean the lead development card can have an edge over the competition? Imagine if the Radeon 9700 Pro had been the lead dev card for HL2 and most of the other current DX9 games prior to the launch of the NV3X; they might end up showing a sizeable performance gap, with the developers blaming nVidia for not running it right...
 

jjjayb

Member
Jul 4, 2001
Heh, Skywalker, you have the funniest posts I've seen for some time. What is the old saying? Something along the lines of "dazzle them with bulls*it"? Do you work for Nvidia? You are posting half-truths and outright lies. Just throw in a couple of technical terms and nobody will be the wiser, huh?
 

Pete

Diamond Member
Oct 10, 1999
Ben: Well, the 5200 may be a success for people who upgrade to Longhorn, but I don't consider paying $80 for an otherwise pokey 3D card to run a future OS very successful. I'm not paying $80 to get 3D effects in a $150+ OS. And most people won't upgrade to Longhorn when it debuts, just like I'm sure a lot of people are still using Win98--they stick with what they got when they bought their PC. By the time Longhorn debuts, the 5200 will hopefully be a vague, discomfiting memory.

It would've been nice to see how the 5800 fares compared to the 5900, and it would've given us a clue as to how the NV36 will fare compared to the 5600. I'm kind of disappointed in the HL2 benchmarks as they were presented: reviewers were forced to review systems that were not their own, were not able to present screenshots showing IQ differences among modes, and not that many cards were benched (and the only one close to my 9100, the 9200, had rendering issues). But I guess we don't have long to wait before we see some serious articles - maybe a week after the HL2 benchmark is publicly available.

I'm sure Freelancer "requires" DX9 mainly to force people to update, and not for D3D9 specifically. Heck, the game has been in development since probably before DX9 was even conceived!
 

IQfreak

Junior Member
Sep 15, 2003
Originally posted by: BenSkywalker

But Carmack has also had to resort to FX12 for the NV3x.

Had to or did are two different things. If a shader gains no benefit from running at a higher level of precision, then why do it? Using FX12 where it works does two things: one, it keeps the NV30/31/34 running faster when the game is out; two, it helps all of those with DX8 boards. What point is there running a shader in FP24 if FX12 gives the exact same results for that shader?


You're being hypothetical. Carmack has said the quality is slightly lower with the NV3x path. Obviously some tradeoffs must have been made for performance, however small. The NV3x lineup *needs* FX12 to run optimally, so developers like Valve have had to make concessions in their standard DX9 path for integer precision. The NV3x won't be running full floating-point. Whether FX12 is used with no quality loss is irrelevant. The NV3x has significant resources for FX12, so they must be utilized (and ARE) to even approach the performance of the R3x0.


The point is that the NV3x needs special considerations that deviate from the standard APIs, whereas the R3x0 does not.

For Doom3, the entire ARB2 path deviates from the API; OGL 2.0 was not finalized when the NV3x core was finished, let alone the R3x0. Carmack has explained quite nicely that the reason for making an nV path was that he could: nV had extensions which allowed him to do what he wanted. He also has a code path for the NV2X line, the NV1X line and the R2X0 line. You could easily make the argument that the ARB2 path is the R3x0's code path, just one that the NV3X boards can run if they choose to (though with his statements that there is no discernible quality difference, I can't see why you would want to).


So you're disagreeing with my statement that the NV3x *needs* special considerations outside of the standard APIs? I'm not just talking about Doom3. My assertion is that the NV3x can't handle standard code acceptably. Forget about taking advantage of additional features not exposed in the APIs - the NV3x suffers *too much* without architecture-specific optimizations.


Nvidia made some bad design decisions that sacrificed too much floating-point shader speed.

Should we say the same about ATi and their shadowing speed?


Why? Was the Doom3 engine a standard that ATI had input on? Besides, what's wrong with ATI's shadowing speed? Don't even bring up the Doom3 benchmark, which was just a one-time PR stunt put on by Nvidia, that ATI didn't even know about or optimize for. And ATI still won the high quality tests.


I'm not. My statement is accurate.

Freelancer is DX9; where is the big performance edge there? My Ti4200 ran it better than my 9500Pro. The game didn't utilise PS 2.0, and the edge wasn't there.


What DX9-exclusive features does Freelancer use? Would you say it's a worthy demonstration of DX9 on the basis of DX9 content, or does it just have token DX9 effects?


There are many games that ATI can't run as well as Nvidia (ex: NWN, Tribes 2), and it has nothing to do with ATI's inability to cope with the API standards.


You mean the lead development card can have an edge over the competition? Imagine if the Radeon 9700 Pro had been the lead dev card for HL2 and most of the other current DX9 games prior to the launch of the NV3X; they might end up showing a sizeable performance gap, with the developers blaming nVidia for not running it right...


Oh definitely. Except in the case of HL2, no IHV-specific optimizations or features were used. All you need is a good DX9 card that can cope with the standards. Unlike NWN and Tribes 2.

 

SilverTrine

Senior member
May 27, 2003
If the GF FX line is superior in DX9 to the Radeon line, why is Nvidia paying Eidos presumably tens of thousands of dollars to remove any benchmarking capabilities from their DX9 Tomb Raider game?
Don't debate Skywalker point by point; it's like trying to make a snake do tricks. Make him answer the fact that Nvidia's FX line doesn't have the FP24 precision that is necessary to be called a DirectX 9 card. You don't need VS3 to be DX9; you do need FP24 precision, and Nvidia doesn't have it.

Also, to even imply that the GF FX is more DX9 compliant in the face of the Half-Life 2 numbers is so absurd it's laughable. What part of "ATi's DX9 parts take Nvidia out behind the woodshed and manhandle them" do you have problems understanding? Time for some remedial reading skills, Ben.

The fact of the matter is, all true DX9 games down the road will rely heavily upon pixel shaders to get good performance. The basic truth is that ATi's DX9 shaders are far superior in every way to the GeForce FX's shaders. It's as simple as that; if you plan to play DX9 games down the road, the Half-Life 2 benchmarks don't lie.
 

Lonyo

Lifer
Aug 10, 2002
A summary of the DX9 spec

Anandtech obviously thought it was OK to put this in their 9700 preview, and it shows DX9 calling for PS/VS 2.0, not 3.0.

NVIDIA has also introduced support for DirectX 9's Vertex Shader 2.0 spec.
Once again, NVIDIA went above and beyond the DX9 specification for pixel shaders and introduced what they call their Pixel Shader 2.0+ support.
From the NV30 preview.
Seems DX9 calls for PS2.0.

Or am I confused by these 3 things which all suggest PS2.0 in DX9, and not PS3.0?
 

Pete

Diamond Member
Oct 10, 1999
DX9 also includes PS/VS3.0, but those specs will only be revealed later. DX9 was made to last for quite a while, until Longhorn debuts with DX10.
 

jjjayb

Member
Jul 4, 2001
Read John Carmack's latest comments here:

http://english.bonusweb.cz/interviews/carmackgfx.html

Here is what it says for those too lazy to click the link:

Hi John,

No doubt you heard about GeForce FX fiasco in Half-Life 2. In your opinion, are these results representative for future DX9 games (including Doom III) or is it just a special case of HL2 code preferring ATI features, as NVIDIA suggests?


Unfortunately, it will probably be representative of most DX9 games. Doom has a custom back end that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it is a lot slower. The precision doesn't really matter to Doom, but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec.

John Carmack

Does that help clear things up?
 