AnandTech Half-Life 2 Info :)

Page 8

Nebor

Lifer
Jun 24, 2003
29,582
12
76
You know what really gets me... You guys take Valve's word as fact.... Are you the same people that strung out several 500+ post Apple G5 threads?

You didn't believe that gross performance exaggeration, because the PC was crippled for it... Yet you just accept that these ATI cards, with slower clockspeeds, thrash the FX series? Riiiiiight.

You guys should put your skepticism hats back on.
 

UlricT

Golden Member
Jul 21, 2002
1,966
0
0
Originally posted by: Nebor
You know what really gets me... You guys take Valve's word as fact.... Are you the same people that strung out several 500+ post Apple G5 threads?

You didn't believe that gross performance exaggeration, because the PC was crippled for it... Yet you just accept that these ATI cards, with slower clockspeeds, thrash the FX series? Riiiiiight.

You guys should put your skepticism hats back on.

I just wanna know something, Nebor.... would you have said anything against this whole thing if it was about the GeForce FX thrashing the R3xx's ass?

And what is this about clockspeed? Do you really think that is the only thing that determines how efficiently hardware can run? If I could clock my TNT2 @ 1GHz, I could get some zippy framerates (I know, the DX support isn't there... so sue me )
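A quick back-of-the-envelope illustration of that point. The specs below are the commonly quoted figures of the day (roughly 380 MHz with 8 pixel pipes for the Radeon 9800 Pro, 450 MHz with the 4x2 layout mentioned later in this thread for the FX 5900 Ultra), not numbers taken from this thread, so treat them as from-memory assumptions:

# Rough fill-rate arithmetic: clock speed alone doesn't tell the story.
def pixel_fill_rate_mpix(core_mhz, pixel_pipes):
    # Theoretical Mpixels/s, assuming one colour pixel per pipe per clock.
    return core_mhz * pixel_pipes

print(pixel_fill_rate_mpix(380, 8))   # 3040 -> Radeon 9800 Pro (8x1)
print(pixel_fill_rate_mpix(450, 4))   # 1800 -> GeForce FX 5900 Ultra (4x2)

Even spotting the FX its clock advantage, the wider chip comes out ahead on paper, before per-clock shader throughput even enters into it.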
 
shady06

Apr 17, 2003
37,622
0
76
What you people commenting on Doom 3 don't understand is that it is far from completion, as opposed to HL2, so results may change drastically by the time it is released.
 

UlricT

Golden Member
Jul 21, 2002
1,966
0
0
Originally posted by: shady06
What you people commenting on Doom 3 don't understand is that it is far from completion, as opposed to HL2, so results may change drastically by the time it is released.

Well... Doom 3 is based on OpenGL 2.0, which ATi is admittedly weak at. Maybe the GeForce FX should be marketed as an OpenGL 2.0 card instead of a DirectX 9.0 card! That would pull in the public, wouldn't it?
 
KnightBreed

Jun 18, 2000
11,140
722
126
First and foremost, the G5 was an unproven machine. Nobody would believe that Apple would come out swinging like they did.

Secondly, VALVe's performance numbers correlate perfectly with past findings when testing PS2.0 shaders. Do you think these HL2 benches are the first time somebody's pitted a 9800 against a FX5900 in a shader test? Remember the results of the PS2.0 tests in 3DMark2003? Remember the Tomb Raider tests? Remember the AquaMark tests?

Guess who won these. I'll give you a hint. It wasn't nVidia.
 

Nebor

Lifer
Jun 24, 2003
29,582
12
76
No, I would have expected the FX to thrash the 9800.... much higher clockspeeds, 256mb of ram (that WAS an ultra they tested, RIGHT?) Plus I'm an nvidia fan....

See, all of you guys are acting like the Mac fans.... Believing these frankly unbelievable benchmarks... I'm the voice of reason, the PC fan, who said, "Why does that Dell you're using have PC2700 in it?" "Why is hyperthreading disabled?"

You guys fell for the marketing, hook line and sinker.

Besides, I'm a very proud, stubborn fanboi. I would honestly buy an awful card and suffer with it out of spite. I'm strange like that...
 
shady06

Apr 17, 2003
37,622
0
76
Originally posted by: KnightBreed
First and foremost, the G5 was an unproven machine. Nobody would believe that Apple would come out swinging like they did.

Secondly, VALVe's performance numbers correlate perfectly with past findings when testing PS2.0 shaders. Do you think these HL2 benches are the first time somebody's pitted a 9800 against a FX5900 in a shader test? Remember the results of the PS2.0 tests in 3DMark2003? Remember the Tomb Raider tests? Remember the AquaMark tests?

Guess who won these. I'll give you a hint. It wasn't nVidia.

People would rather not consider this, and just come to the conclusion that Valve and ATi are in bed with each other.
 

UlricT

Golden Member
Jul 21, 2002
1,966
0
0
Originally posted by: Nebor
No, I would have expected the FX to thrash the 9800.... much higher clockspeeds, 256mb of ram (that WAS an ultra they tested, RIGHT?) Plus I'm an nvidia fan....

See, all of you guys are acting like the Mac fans.... Believing these frankly unbelievable benchmarks... I'm the voice of reason, the PC fan, who said, "Why does that Dell you're using have PC2700 in it?" "Why is hyperthreading disabled?"

You guys fell for the marketing, hook line and sinker.

Besides, I'm a very proud, stubborn fanboi. I would honestly buy an awful card and suffer with it out of spite. I'm strange like that...

Good for you! Now that's what I call dedication! How long have you been at this? Since the NV1?
 

BoberFett

Lifer
Oct 9, 1999
37,563
9
81
Originally posted by: NOX
This is getting worse than the console market... next we'll have games which will only run on ATI or Nvidia cards.

If that happens, then it's only the fault of the GPU designers. Don't you think that the software developers would love it if their software ran on every piece of hardware perfectly? Imagine the headaches saved.

If the graphics chip companies can't agree on a standard, game companies are forced to either choose one platform and develop for it, or invest extra money in writing specific code paths for each platform. That directly digs into the developers' profit. Either way, it's a lose/lose situation for consumers and the game companies.
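To make the "separate code paths" cost concrete, here is a minimal sketch of what an engine ends up deciding at startup. The function and path names are hypothetical, loosely modeled on the full-DX9 / mixed-mode / DX8.1 paths discussed elsewhere in this thread, not on any real engine's code:

# Hypothetical renderer-path selection: every branch below is another
# code path somebody has to write, optimize and test.
def pick_render_path(vendor, supports_ps20):
    if not supports_ps20:
        return "dx81_fallback"        # older hardware: DX8.1-class shaders only
    if vendor == "ATI":
        return "dx9_full_precision"   # straight PS2.0 path
    if vendor == "NVIDIA":
        return "dx9_mixed_mode"       # PS2.0 mixed with PS1.4 / partial precision
    return "dx81_fallback"            # unknown vendors get the safe path

for vendor, ps20 in [("ATI", True), ("NVIDIA", True), ("Matrox", False)]:
    print(vendor, "->", pick_render_path(vendor, ps20))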
 
KnightBreed

Jun 18, 2000
11,140
722
126
Originally posted by: Nebor
No, I would have expected the FX to thrash the 9800.... much higher clockspeeds, 256mb of ram (that WAS an ultra they tested, RIGHT?) Plus I'm an nvidia fan....

See, all of you guys are acting like the Mac fans.... Believing these frankly unbelievable benchmarks... I'm the voice of reason, the PC fan, who said, "Why does that Dell you're using have PC2700 in it?" "Why is hyperthreading disabled?"

You guys fell for the marketing, hook line and sinker.

Besides, I'm a very proud, stubborn fanboi. I would honestly buy an awful card and suffer with it out of spite. I'm strange like that...
Ok, I get it - you're just yanking our chain. I admit it, you actually had me believing you were the stupidest person I've ever met.

I'm not getting strung along in this flamefest. *bows out*
 

Nebor

Lifer
Jun 24, 2003
29,582
12
76
Well, when Nvidia became the maker of my Voodoo 5 card, my business followed 3dfx on to its new home....

Um... GeForce, GeForce2, GeForce2 GTS, GeForce3 Ti 500, GeForce4 Ti 4400, soon to be a GeForce FX 5900.
 

Alkali

Senior member
Aug 14, 2002
483
0
0
The reason the GeForce FX cards are having trouble seems quite obvious to me. nVidia have done the following three things, which have ultimately proved daft, but it was probably worth the risk. If they had got it right, they would have sewn up the market.

1. 32-bit floating point precision. The DX9 Spec states 24-bit.
2. 4x2 pixel pipelines. The DX9 spec states 8x1.
3. Cg. They tried to introduce a new graphics language, which, if it had worked, might have ground ATi into the mud, but it didn't. I have seen no evidence that it has actively helped in the coding, especially seeing as Valve have had to resort to DX8.1 and mixed code rather than write some nVidia-specific Cg code.


ATi is now the only real player with a serious DX9 card on the market, so obviously, because ATi have stuck like glue to the DX9 spec, they win easily.

Although theoretically nVidia's cards do higher precision, it's like filling a half-pint glass with a pint of lager. A lot of effort gets wasted, and the end result is exactly the same. Therefore ATi lose nothing as they use 24-bit, and nVidia waste loads because they use 32-bit.

Now of course, nVidia's card can also run in non-full-precision mode, i.e. 16-bit, but to take that analogy again, that would be a half-full half-pint glass, and you can definitely see the effect it has had on the 'quality' of whatever it has tried to render. The nVidia cards cannot do 24-bit precision, so they will have to make 32-bit EXTREMELY fast, or build a new chipset.

We will see. I am just glad I have ATi; I feel sorry for anyone who bought the FX 5900 Ultra for over £500.
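To put a rough number on Alkali's half-pint analogy, here is a quick illustration using NumPy's float16 and float32 as stand-ins (there is no 24-bit float type on a CPU, so fp24 itself can't be shown; the values are made up purely for illustration). A fine offset that survives at 32-bit precision simply vanishes at 16-bit:

import numpy as np

big, small = 1000.0, 0.2   # e.g. a coordinate far from the origin plus a small shader offset

print(np.float16(big) + np.float16(small))   # 1000.0 -- the offset is lost at 16-bit
print(np.float32(big) + np.float32(small))   # 1000.2 -- preserved at 32-bit

fp24's mantissa (16 bits on the R300, if memory serves) sits much closer to fp32's 23 bits than to fp16's 10, which is roughly why ATi gives up nothing by sticking to the DX9 minimum while forcing 16-bit on the FX can visibly show.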
 

UlricT

Golden Member
Jul 21, 2002
1,966
0
0
Originally posted by: BoberFett
Originally posted by: NOX
This is getting worse than the console market... next we'll have games which will only run on ATI or Nvidia cards.

If that happens, then it's only the fault of the GPU designers. Don't you think that the software developers would love it if their software ran on every piece of hardware perfectly? Imagine the headaches saved.

If the graphics chip companies can't agree on a standard, game companies are forced to either choose one platform and develop for it, or invest extra money in writing specific code paths for each platform. That directly digs into the developers' profit. Either way, it's a lose/lose situation for consumers and the game companies.

This is the major mistake Nvidia has made. When there are two very good APIs out there, they go ahead and create hardware that runs well only using proprietary code? They could have been forgiven if the proprietary code had added performance, rather than merely letting them ALMOST reach the competition! I don't get it... there is something very, very wrong here...!


/EDIT: reading Alkali's post gives me an idea as to what went wrong with Nvidia's implementation. Thx dude!
 
KnightBreed

Jun 18, 2000
11,140
722
126
Originally posted by: UlricT
This is the major mistake Nvidia has made. When there are two very good APIs out there, they go ahead and create hardware that runs well only using proprietary code? They could have been forgiven if the proprietary code had added performance, rather than merely letting them ALMOST reach the competition! I don't get it... there is something very, very wrong here...!

/EDIT: reading Alkali's post gives me an idea as to what went wrong with Nvidia's implementation. Thx dude!
Cg is meant to be an alternative to the DirectX shading language (HLSL). It is not an entire API, simply a middleware tool to aid in developing shader code. A developer would write a shader in Cg, and the Cg compiler would turn it into either (1) generic assembly shader code that is vendor neutral (!) or (2) nVidia-specific "to-the-metal" code optimized for their hardware. Considering shaders are but a minuscule portion of an entire game engine, Cg would have to be used in conjunction with DirectX or OpenGL.

Companies were free to write a custom compiler to plug into the backend of Cg for their own products. Nobody bothered because Microsoft's higher level shading language was, syntactically, almost identical.

Cg would have been great if Microsoft hadn't created its own language for DX9.

Well, that's how I've always understood nVidia's Cg.
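A toy sketch of that "one source, pluggable backends" idea. The profile names are only loosely modeled on Cg's real targets of the era (a vendor-neutral ARB assembly profile and an NV30-specific one); nothing here is the actual Cg toolchain, just the shape of it:

# Toy model of a shader front end with swappable code-generation backends.
CG_LIKE_SOURCE = "color = diffuse * max(dot(N, L), 0.0);"   # pretend shader source

BACKENDS = {
    "arbfp1": "generic ARB fragment-program assembly (vendor neutral)",
    "fp30":   "NV30-family assembly, tuned for GeForce FX",
}

def compile_shader(source, profile):
    # A real compiler would parse and generate code here; we only label the output.
    if profile not in BACKENDS:
        raise ValueError("no backend registered for profile " + repr(profile))
    return "[" + profile + "] " + str(len(source)) + " chars of source -> " + BACKENDS[profile]

for profile in BACKENDS:
    print(compile_shader(CG_LIKE_SOURCE, profile))

The point above stands: any vendor could have registered its own entry in that table, but once Microsoft's HLSL covered the same ground there was little reason to.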
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126
nVidia should have put more effort into a proper DX9 feature set instead of trying to push Cg. Can anyone say "Glide 2"? HL2 will not be the only DX9 game that won't run well on nV hardware. How is that Valve's fault?


Cg was proposed for inclusion in the OpenGL spec, but ATI and 3Dlabs, fearing losing their grip, proposed an alternate standard for OpenGL 2.0. Apparently they proposed it promising compilers were on the way (which they weren't), while Nvidia already had its compilers up and running. So right now OpenGL is sitting with a thumb up its ass.

Cg is open to anybody who wants to license it (I believe it is even free). It isn't like nobody can use it. From the few people who have something serious to say about it, the path OpenGL is taking is not a good one, and using Cg would have been a cleaner, wiser move.

Politics can be nasty hehe.

 

Czar

Lifer
Oct 9, 1999
28,510
0
0
Originally posted by: Nebor
You know what really gets me... You guys take Valve's word as fact.... Are you the same people that strung out several 500+ post Apple G5 threads?

You didn't believe that gross performance exaggeration, because the PC was crippled for it... Yet you just accept that these ATI cards, with slower clockspeeds, thrash the FX series? Riiiiiight.

You guys should put your skepticism hats back on.
Difference is that Valve is not making and selling Radeons

 

Nebor

Lifer
Jun 24, 2003
29,582
12
76
Originally posted by: Czar
Originally posted by: Nebor
You know what really gets me... You guys take Valve's word as fact.... Are you the same people that strung out several 500+ post Apple G5 threads?

You didn't believe that gross performance exaggeration, because the PC was crippled for it... Yet you just accept that these ATI cards, with slower clockspeeds, thrash the FX series? Riiiiiight.

You guys should put your skepticism hats back on.
Difference is that Valve is not making and selling Radeons

Well, Radeon sales are now tied to HL2 sales... For every Radeon sold, a copy of HL2 is sold.... Valve has a motive to sell Radeons.
 

Regs

Lifer
Aug 9, 2002
16,665
21
81
Originally posted by: Nebor
No, I would have expected the FX to thrash the 9800.... much higher clockspeeds, 256mb of ram (that WAS an ultra they tested, RIGHT?) Plus I'm an nvidia fan....

See, all of you guys are acting like the Mac fans.... Believing these frankly unbelievable benchmarks... I'm the voice of reason, the PC fan, who said, "Why does that Dell you're using have PC2700 in it?" "Why is hyperthreading disabled?"

You guys fell for the marketing, hook line and sinker.

Besides, I'm a very proud, stubborn fanboi. I would honestly buy an awful card and suffer with it out of spite. I'm strange like that...

Just remember, there is no rule of thumb stating that higher clock speeds always mean better performance. But then again, I might have missed the sarcasm in your post.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Nvidia responds:

Over the last 24 hours, there has been quite a bit of controversy over comments made by Gabe Newell of Valve at ATI's Shader Day.

During the entire development of Half Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed.

We're confused as to why Valve chose to use Release 45 (Rel. 45) - because up to two weeks prior to the Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.

Regarding the Half Life2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half Life 2 and other new games are included in our Rel.50 drivers - which reviewers currently have a beta version of today. Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly-programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers.

Pending detailed information from Valve, we are only aware of one bug with Rel. 50 and the version of Half Life 2 that we currently have - this is the fog issue that Gabe referred to in his presentation. It is not a cheat or an over-optimization. Our current drop of Half Life 2 is more than 2 weeks old. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.

The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel.50 driver release. Many other improvements have also been included in Rel.50, and these were all created either in response to, or in anticipation of the first wave of shipping DirectX 9 titles, such as Half Life 2.

We are committed to working with Gabe to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half Life 2 on NVIDIA hardware.
 

Regs

Lifer
Aug 9, 2002
16,665
21
81
Nvidia's response is basically, "They didn't let us optimize it for it to work right." Do they really have to optimize every damn game to get it to work properly on their hardware?
 

sash1

Diamond Member
Jul 20, 2001
8,897
1
0
Originally posted by: Regs
Nvidia's response is basically, "They didn't let us optimize it for it to work right." Do they really have to optimize every damn game to get it to work properly on their hardware?
This is what I was thinking too.

As soon as news hits that ATi is slaughtering nVidia, they're quick to say they didn't have time to optimize for the game, or that the wrong drivers were used.

~Aunix

 