Originally posted by: spam
Do you think Nvidia would regain some customer trust if it corrected the labelling of the 5200 and 5600 as DX9 compatible? It should be changed to DX8.
Originally posted by: Genx87
The only thing Gabe Newell has had enough of is the cheeseburgers bought by that $8 million check ATI wrote him.
This is kind of sad if Valve really did put their game up for auction. And if they did, and ATI paid $8 million for it plus something like 1000 copies, do you honestly think Valve gave it the old college try when they optimized for the NV3.x?!
John Carmack appears to be doing just fine and getting good results from the NV3.x cards. So what is wrong with Valve? I can't honestly believe, after all this ruckus, that they really tried to get Nvidia cards up and running to their potential.
Originally posted by: Genx87
Like other posters have said, the GeForce FX is DX9 compliant, so there is no need to change the labeling.
Originally posted by: peter7921
Originally posted by: Genx87
The only thing Gabe Newell has had enough of is the cheeseburgers bought by that $8 million check ATI wrote him.
This is kind of sad if Valve really did put their game up for auction. And if they did, and ATI paid $8 million for it plus something like 1000 copies, do you honestly think Valve gave it the old college try when they optimized for the NV3.x?!
John Carmack appears to be doing just fine and getting good results from the NV3.x cards. So what is wrong with Valve? I can't honestly believe, after all this ruckus, that they really tried to get Nvidia cards up and running to their potential.
The topic clearly was about DirectX 9, not OpenGL. Carmack is doing just fine because he is using the OpenGL API, not Direct3D, so what you said is irrelevant. BTW, if you look at Halo you see similar results, and in an article I saw at Gamers Depot you can clearly see from the screenshots that the Det. 50 drivers lower quality significantly to speed up performance.
Here's the link. Make sure to look at the picture differences; it's quite amazing.
http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_rel50/001.htm
Like other posters have said, the GeForce FX is DX9 compliant, so there is no need to change the labeling.
Originally posted by: spam
Originally posted by: Genx87
Like other posters have said, the GeForce FX is DX9 compliant, so there is no need to change the labeling.
I think the Half-Life developers would disagree with you. I think people who bought a 5600 Ultra or lower would disagree with you.
Originally posted by: RogerAdam
Originally posted by: peter7921
Originally posted by: Genx87
The only thing Gabe Newell has had enough of is the cheeseburgers bought by that $8 million check ATI wrote him.
This is kind of sad if Valve really did put their game up for auction. And if they did, and ATI paid $8 million for it plus something like 1000 copies, do you honestly think Valve gave it the old college try when they optimized for the NV3.x?!
John Carmack appears to be doing just fine and getting good results from the NV3.x cards. So what is wrong with Valve? I can't honestly believe, after all this ruckus, that they really tried to get Nvidia cards up and running to their potential.
The topic clearly was about DirectX 9, not OpenGL. Carmack is doing just fine because he is using the OpenGL API, not Direct3D, so what you said is irrelevant. BTW, if you look at Halo you see similar results, and in an article I saw at Gamers Depot you can clearly see from the screenshots that the Det. 50 drivers lower quality significantly to speed up performance.
Here's the link. Make sure to look at the picture differences; it's quite amazing.
http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_rel50/001.htm
Like other posters have said, the GeForce FX is DX9 compliant, so there is no need to change the labeling.
FUD! Carmack clearly stated the NV3x was not using the ARB2 path; he has coded for the NV3x by lowering precision (FX12/FP16) within his "special NV3x path", i.e. few if any advanced shaders, lower precision, etc. He did say that ATI cards run the ARB2 path slightly slower than his special NV3x path, but at higher precision: that's ATI FP24 vs. NV3x FX12/FP16. Duh, yeah, the Nvidia path is faster here, but the ATI path has higher IQ.
Nvidia is pulling the 51.75s (obviously). The driver was "leaked" most likely for the AquaMark bench. They got the numbers posted with some negative scrutiny, but the negative remarks about IQ and the performance hacks in the driver are a small price to pay, since they got the numbers into the fray. And with people posting around who have no clue about the NV3x problems because of their loyalty to the NV brand, I'd bet they're counting on them to cloud the perspectives of those (especially developers) who actually DO KNOW.
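To see why FP16 versus higher-precision shading affects image quality, here is a minimal Python sketch of the precision gap. Caveat: FP24 is not an IEEE format and Python's struct module only supports half (FP16) and single (FP32), so FP32 stands in for the higher-precision path; the value 0.1234567 is just an arbitrary example of a shader-style quantity.

```python
import struct

def round_to_half(x: float) -> float:
    """Round x through IEEE 754 half precision (FP16, 10-bit mantissa)."""
    return struct.unpack('e', struct.pack('e', x))[0]

def round_to_single(x: float) -> float:
    """Round x through IEEE 754 single precision (FP32, 23-bit mantissa)."""
    return struct.unpack('f', struct.pack('f', x))[0]

# An arbitrary shader-style value (e.g. a lighting or blend factor).
x = 0.1234567

half_err = abs(x - round_to_half(x))
single_err = abs(x - round_to_single(x))

print(f"FP16 value: {round_to_half(x):.7f}  error: {half_err:.2e}")
print(f"FP32 value: {round_to_single(x):.7f}  error: {single_err:.2e}")
```

The FP16 error is orders of magnitude larger than the FP32 error, and in a shader those rounding errors accumulate across many operations per pixel, which is the kind of quality difference the screenshots show.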
Originally posted by: peter7921
Originally posted by: RogerAdam
Originally posted by: peter7921
Originally posted by: Genx87
The only thing Gabe Newell has had enough of is the cheeseburgers bought by that $8 million check ATI wrote him.
This is kind of sad if Valve really did put their game up for auction. And if they did, and ATI paid $8 million for it plus something like 1000 copies, do you honestly think Valve gave it the old college try when they optimized for the NV3.x?!
John Carmack appears to be doing just fine and getting good results from the NV3.x cards. So what is wrong with Valve? I can't honestly believe, after all this ruckus, that they really tried to get Nvidia cards up and running to their potential.
The topic clearly was about DirectX 9, not OpenGL. Carmack is doing just fine because he is using the OpenGL API, not Direct3D, so what you said is irrelevant. BTW, if you look at Halo you see similar results, and in an article I saw at Gamers Depot you can clearly see from the screenshots that the Det. 50 drivers lower quality significantly to speed up performance.
Here's the link. Make sure to look at the picture differences; it's quite amazing.
http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_rel50/001.htm
Like other posters have said, the GeForce FX is DX9 compliant, so there is no need to change the labeling.
FUD! Carmack clearly stated the NV3x was not using the ARB2 path; he has coded for the NV3x by lowering precision (FX12/FP16) within his "special NV3x path", i.e. few if any advanced shaders, lower precision, etc. He did say that ATI cards run the ARB2 path slightly slower than his special NV3x path, but at higher precision: that's ATI FP24 vs. NV3x FX12/FP16. Duh, yeah, the Nvidia path is faster here, but the ATI path has higher IQ.
Nvidia is pulling the 51.75s (obviously). The driver was "leaked" most likely for the AquaMark bench. They got the numbers posted with some negative scrutiny, but the negative remarks about IQ and the performance hacks in the driver are a small price to pay, since they got the numbers into the fray. And with people posting around who have no clue about the NV3x problems because of their loyalty to the NV brand, I'd bet they're counting on them to cloud the perspectives of those (especially developers) who actually DO KNOW.
Was that targeted at my post or Genx87's? Because I never mentioned anything about Doom 3 other than that it uses a different API than HL2 does. I completely agree with you; I read the same article as you.
Originally posted by: spam
To peter7921,
Remember, the point of this thread is how Nvidia can re-establish trust. Do you think that labelling FX parts as DX9 compliant builds customer confidence and trust? I am sure you do not! Let me ask you: what corrections does Nvidia have to make to restore confidence? My suggestion would be a dramatic display of a change in policy and direction. What would be your solution?
I do see your point, though, because people who buy a 5600 or 5200 hoping to get good DX9 performance will be extremely disappointed. But people who buy products without doing research deserve what they get, in my opinion.