Originally posted by: Nebor
You know what really gets me... You guys take Valve's word as fact.... Are you the same people that strung out several 500+ post Apple G5 threads?
You didn't believe that gross performance exaggeration, because the PC used in the comparison was crippled... Yet you just accept that these ATI cards, with slower clock speeds, thrash the FX series? Riiiiiight.
You guys should put your skepticism hats back on.
Originally posted by: shady06
What you people commenting on Doom don't understand is that it is far from completion, as opposed to HL2, so results may change drastically by the time it is released.
Originally posted by: KnightBreed
First and foremost, the G5 was an unproven machine. Nobody would believe that Apple would come out swinging like they did.
Secondly, VALVe's performance numbers correlate perfectly with past findings when testing PS2.0 shaders. Do you think these HL2 benches are the first time somebody's pitted a 9800 against a FX5900 in a shader test? Remember the results of the PS2.0 tests in 3DMark2003? Remember the Tomb Raider tests? Remember the AquaMark tests?
Guess who won these. I'll give you a hint. It wasn't nVidia.
Originally posted by: Nebor
No, I would have expected the FX to thrash the 9800.... much higher clock speeds, 256MB of RAM (that WAS an ultra they tested, RIGHT?) Plus I'm an nvidia fan....
See, all of you guys are acting like the Mac fans.... Believing these frankly unbelievable benchmarks... I'm the voice of reason, the PC fan, who said, "Why does that Dell you're using have PC2700 in it?" "Why is hyperthreading disabled?"
You guys fell for the marketing, hook, line, and sinker.
Besides, I'm a very proud, stubborn fanboi. I would honestly buy an awful card and suffer with it out of spite. I'm strange like that...
Originally posted by: NOX
This is getting worse than the console market... next we'll have games which will only run on ATI or Nvidia cards.
Ok, I get it - you're just yanking our chain. I admit it, you actually had me believing you were the stupidest person I've ever met.
Originally posted by: Nebor
No, I would have expected the FX to thrash the 9800.... much higher clock speeds, 256MB of RAM (that WAS an ultra they tested, RIGHT?) Plus I'm an nvidia fan....
See, all of you guys are acting like the Mac fans.... Believing these frankly unbelievable benchmarks... I'm the voice of reason, the PC fan, who said, "Why does that Dell you're using have PC2700 in it?" "Why is hyperthreading disabled?"
You guys fell for the marketing, hook, line, and sinker.
Besides, I'm a very proud, stubborn fanboi. I would honestly buy an awful card and suffer with it out of spite. I'm strange like that...
Originally posted by: BoberFett
Originally posted by: NOX
This is getting worse than the console market... next we'll have games which will only run on ATI or Nvidia cards.
If that happens, then it's only the fault of the GPU designers. Don't you think that the software developers would love it if their software ran on every piece of hardware perfectly? Imagine the headaches saved.
If the graphics chip companies can't settle on a standard, game companies are forced to either choose one platform and develop for it, or invest extra money in writing specific code paths for each. That digs directly into the developers' profit. Either way, it's a lose/lose situation for consumers and the game companies.
Cg is meant to be an alternative to the DirectX shading language (HLSL). It is not an entire API, simply a middleware tool to aid in developing shader code. A developer writes a shader in Cg, and the Cg compiler turns it into either (1) generic, vendor-neutral assembly shader code (!) or (2) nVidia-specific "to-the-metal" code optimized for their hardware. Considering shaders are but a minuscule portion of an entire game engine, Cg has to be used in conjunction with DirectX or OpenGL.
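A minimal sketch of how that works in practice, assuming a trivial made-up shader (the file name basic.cg and the parameters baseMap and tint are purely illustrative, not from any real engine):

float4 main(float2 uv : TEXCOORD0,
            uniform sampler2D baseMap,
            uniform float4 tint) : COLOR
{
    // Sample the base texture and modulate it by a constant tint colour.
    return tex2D(baseMap, uv) * tint;
}

// The same source then goes through cgc, NVIDIA's command-line Cg compiler,
// and only the target profile changes between the two kinds of output, e.g.:
//   cgc -profile ps_2_0 basic.cg   -> vendor-neutral DirectX 9 pixel shader assembly
//   cgc -profile fp30   basic.cg   -> NVIDIA-specific NV30 "to-the-metal" fragment program

The shader itself is identical either way; Cg only handles the shading step, which is why it still has to sit on top of DirectX or OpenGL for everything else.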
Originally posted by: UlricT
This is the major mistake Nvidia has made. When there are two very good APIs out there, they go ahead and create hardware that runs well only with proprietary code? They could have been forgiven if the proprietary code added extra performance on top, instead of merely letting them ALMOST reach the competition! I don't get it... there is something very very wrong here...!
/EDIT: reading Alkali's post gives me an idea as to what went wrong with Nvidia's implementation. Thx dude!
Originally posted by: Czar
Difference is that Valve is not making and selling Radeons
Originally posted by: Nebor
You know what really gets me... You guys take Valve's word as fact.... Are you the same people that strung out several 500+ post Apple G5 threads?
You didn't believe that gross performance exaggeration, because the PC used in the comparison was crippled... Yet you just accept that these ATI cards, with slower clock speeds, thrash the FX series? Riiiiiight.
You guys should put your skepticism hats back on.
Originally posted by: Nebor
No, I would have expected the FX to thrash the 9800.... much higher clock speeds, 256MB of RAM (that WAS an ultra they tested, RIGHT?) Plus I'm an nvidia fan....
See, all of you guys are acting like the Mac fans.... Believing these frankly unbelievable benchmarks... I'm the voice of reason, the PC fan, who said, "Why does that Dell you're using have PC2700 in it?" "Why is hyperthreading disabled?"
You guys fell for the marketing, hook, line, and sinker.
Besides, I'm a very proud, stubborn fanboi. I would honestly buy an awful card and suffer with it out of spite. I'm strange like that...
NVIDIA's response:
Over the last 24 hours, there has been quite a bit of controversy over comments made by Gabe Newell of Valve at ATI's Shader Day.
During the entire development of Half Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed.
We're confused as to why Valve chose to use Release 45 (Rel. 45) - because up to two weeks prior to the Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.
Regarding the Half Life 2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half Life 2 and other new games are included in our Rel. 50 drivers - of which reviewers currently have a beta version. Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly-programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers.
Pending detailed information from Valve, we are only aware of one bug with Rel. 50 and the version of Half Life 2 that we currently have - this is the fog issue that Gabe referred to in his presentation. It is not a cheat or an over-optimization. Our current drop of Half Life 2 is more than 2 weeks old. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.
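(As an aside on what that precision conversion looks like at the source level - this is a minimal made-up fragment, not anything from Half Life 2 or NVIDIA's drivers. In HLSL/Cg the difference is simply declaring values as half rather than float, which lets the compiler emit partial-precision FP16 instructions on GeForce FX hardware:

// Full 32-bit float precision version.
float4 lit_full(float2 uv : TEXCOORD0,
                uniform sampler2D baseMap,
                uniform float4 lightColor) : COLOR
{
    float4 albedo = tex2D(baseMap, uv);
    return albedo * lightColor;
}

// Same math declared with 'half' (16-bit); in ps_2_0 assembly this shows up
// as the _pp partial-precision modifier on the generated instructions.
half4 lit_half(float2 uv : TEXCOORD0,
               uniform sampler2D baseMap,
               uniform half4 lightColor) : COLOR
{
    half4 albedo = tex2D(baseMap, uv);
    return albedo * lightColor;
}

For simple colour math like this, FP16's 10-bit mantissa is generally indistinguishable on screen - the "no image quality degradation" case described above - while long dependent calculation chains or texture-coordinate math are where FP16 can start to show banding.)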
The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.
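(Again as a hedged sketch of what "separate code paths" can mean at the shader level - the NV_PATH macro and the layout are made up for illustration, not how Valve actually structures Source's shaders. One HLSL/Cg file can carry both variants and simply be compiled twice with different defines, e.g. fxc /D NV_PATH=1 for the GeForce FX-oriented build and no define for the generic DirectX 9 build:

#ifdef NV_PATH
    // GeForce FX path: partial precision throughout.
    #define real4 half4
#else
    // Generic DirectX 9 path: full 32-bit float precision.
    #define real4 float4
#endif

real4 main(float2 uv : TEXCOORD0, uniform sampler2D baseMap) : COLOR
{
    real4 c = tex2D(baseMap, uv);
    return c;
}

Benchmarking both vendors through one fixed variant then measures the fit of that single path as much as the hardware itself, which is the point being made here.)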
In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel. 50 driver release. Many other improvements have also been included in Rel. 50, and these were all created either in response to, or in anticipation of, the first wave of shipping DirectX 9 titles, such as Half Life 2.
We are committed to working with Gabe to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half Life 2 on NVIDIA hardware.
Originally posted by: Compddd
I love the BS that spews from Nvidia
This is what I was thinking too.
Originally posted by: Regs
Nvidia's response is basically, "They didn't let us optimize it for it to work right." Do they really have to optimize every damn game to get it to work properly on their hardware?