Take DX9 labeling off of fx chips?

spam

Member
Jul 3, 2003
141
0
0
Do you think Nvidia would regain some customer trust if it corrected the labelling of the 5200 and 5600 as DX9 compatible? It should be changed to DX8.
 

Curley

Senior member
Oct 30, 1999
368
3
76
Nvidia got what they gave to 3DFX. They not only bought 3DFX, they humiliated them. Who is humiliated now? Nvidia makes a good card, not a great card.

The curse of 3DFX is upon you.
 

spam

Member
Jul 3, 2003
141
0
0
Technically, yes, it is compliant; practically, no, it is not! Anybody who buys a 5600 Ultra thinking they are ready to play DX9 games with the DX9 settings enabled is going to be disappointed and mad. Far better for Nvidia to be conservative in its claims from this time forward. I think they have suffered lately from their marketing department's exaggerations of product performance.
Right now, buying an Nvidia card is too much like shopping at a used car lot! Their claims are not to be taken without a grain of salt.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: spam
Do you think Nvidia would regain some customer trust if it corrected the labelling of the 5200 and 5600 as DX9 compatible? It should be changed to DX8.

As the previous post mentions, they are DX9 compatible. They have the ability to run DX9 instructions just like the FX 5900 Ultra... however, they do it MUCH slower. So they shouldn't have their labeling corrected; they should have their model names corrected... maybe the 5200 should be changed to 3200, and the 5600 changed to 5000.
 

modedepe

Diamond Member
May 11, 2003
3,474
0
0
No, they should not have it removed. They ARE DX9 parts. It's up to consumers to check what their DX9 performance is like.
 

spam

Member
Jul 3, 2003
141
0
0
Look at the example of AMD and the PR rating on their chips. They earned consumer trust by keeping their PR (performance rating) at a conservative level. Many have said that actual performance can be superior to the given PR figures. Nvidia needs to restore consumer confidence in its products' performance. If they are smart, they will not undermine themselves with this misleading labeling of DX9 parts.
 

spam

Member
Jul 3, 2003
141
0
0
And another thing....
Look at the proprietary designs of Nvidia. Instead of using the industry-standard APIs for DX9 and OpenGL, they have gone off on their own. In all likelihood, the delays we see in the release of Half-Life 2 and Doom 3 are due in part to Nvidia's code-customization requirements. It seems Gabe Newell has had enough, and it sounds like he will not delay the release of HL2 despite the lack of Nvidia drivers able to perform decently on DX9 cards. Remember, Gabe says that HL2 will select DX8.1 settings as the default for the FX cards.
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126
The only thing Gabe Newell has had enough of is the cheeseburgers bought with that 8-million-dollar check ATI wrote out to him.

This is kind of sad if Valve really did put their game up for auction. And if they did, and ATI paid 8 million for it plus something like 1,000 copies, do you honestly think Valve gave it the old college try when they optimized for the NV3x?!

John Carmack appears to be doing just fine and getting good results from the NV3x cards. So what is wrong with Valve? I can't honestly believe, after all this ruckus, that they really tried to get Nvidia cards up and running to their potential.

 

1ManArmY

Golden Member
Mar 7, 2003
1,333
0
0
Valve doesn't need ATI to sell HL2; the product speaks for itself and will sell accordingly.
 

spam

Member
Jul 3, 2003
141
0
0
Think about what you just said for a moment: why would any game developer bundle their game with a card that could not run it? It would be even worse for Nvidia if they did! What a laughing stock Nvidia would be then! "The Way It's Meant to Be Played"? Nvidia would not touch HL2 given its current FX problems!

The Inquirer itself says it cannot establish the truth of this rumor. I think it likely reflects Nvidia's sour grapes over HL2. It certainly fits the pattern of disinformation we have seen in Nvidia's recent history (deny, deflect, denigrate, and dismiss your critics). If you read the Inquirer article, it really sounds like the Inquirer is being USED, all while Nvidia preserves plausible deniability.

I think you can add AquaMark3, Tomb Raider: AOD, and Halo PC as further examples of poor performance in DX9.
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126
I saw 50 FPS on the FX running its DX9 path.
Are you telling me 50 FPS is not playable?!

And if you think ATI and Valve didn't exchange some money, you are a little naive. What is the purpose of Valve going out of their way to smear Nvidia? Price/performance ratios really sealed Valve's fate on this one.

 

spam

Member
Jul 3, 2003
141
0
0
What you saw was the 5900 Ultra running at 50 FPS in DX9, and if I recall, that was with the Det 50s, which are not acceptable to the Half-Life 2 developers because of optimizations that lower IQ for the sake of FPS. It contradicts Nvidia's declaration that they would not sacrifice IQ for FPS! It is a desperate time for them: saying the right thing while doing the opposite. I hope somebody at Nvidia will begin to think long-term about what they are doing to their reputation. It is a classic case of being trapped in their own web of deceit, and they can't get out. Truth and honesty are the only way to cut through this mess. Do you remember the saying "cheaters never prosper"? It applies here.
 

peter7921

Senior member
Jun 24, 2002
225
0
0
Originally posted by: Genx87
The only thing Gabe Newell has had enough of is the cheeseburgers bought with that 8-million-dollar check ATI wrote out to him.

This is kind of sad if Valve really did put their game up for auction. And if they did, and ATI paid 8 million for it plus something like 1,000 copies, do you honestly think Valve gave it the old college try when they optimized for the NV3x?!

John Carmack appears to be doing just fine and getting good results from the NV3x cards. So what is wrong with Valve? I can't honestly believe, after all this ruckus, that they really tried to get Nvidia cards up and running to their potential.

The topic clearly was about DirectX 9, not OpenGL. Carmack is doing just fine because he is using the OpenGL API, not Direct3D, so what you said is irrelevant. BTW, if you look at Halo you see similar results, and in an article I saw at Gamers Depot you can clearly see, with screenshots, that the Det 50 drivers lower quality significantly to speed up performance.

Here's the link. Make sure to look at the picture differences; it's quite amazing.
http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_rel50/001.htm



Like other posters have said, the GeForce FX is DX9 compliant, so there is no need to change the labeling.
 

Kongzi

Member
Jul 6, 2003
50
0
0
Originally posted by: Genx87
John Carmack appears to be doing just fine and getting good results from the NV3x cards.

Umm... maybe that has to do with the fact that John writes games with a different API.
 

spam

Member
Jul 3, 2003
141
0
0
Originally posted by: Genx87

Like other posters have said, the GeForce FX is DX9 compliant, so there is no need to change the labeling.

I think the Half-Life 2 developers would disagree with you. I think people who bought the 5600 Ultra and lower would disagree with you.

 

RogerAdam

Member
Oct 14, 2001
29
0
0
Originally posted by: peter7921
Originally posted by: Genx87
The only thing Gabe Newell has had enough of is the cheeseburgers bought with that 8-million-dollar check ATI wrote out to him.

This is kind of sad if Valve really did put their game up for auction. And if they did, and ATI paid 8 million for it plus something like 1,000 copies, do you honestly think Valve gave it the old college try when they optimized for the NV3x?!

John Carmack appears to be doing just fine and getting good results from the NV3x cards. So what is wrong with Valve? I can't honestly believe, after all this ruckus, that they really tried to get Nvidia cards up and running to their potential.

The topic clearly was about DirectX 9, not OpenGL. Carmack is doing just fine because he is using the OpenGL API, not Direct3D, so what you said is irrelevant. BTW, if you look at Halo you see similar results, and in an article I saw at Gamers Depot you can clearly see, with screenshots, that the Det 50 drivers lower quality significantly to speed up performance.

Here's the link. Make sure to look at the picture differences; it's quite amazing.
http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_rel50/001.htm

Like other posters have said, the GeForce FX is DX9 compliant, so there is no need to change the labeling.

FUD! Carmack clearly stated the NV3x was not using the ARB2 path, and he has coded for the NV3x by lowering precision (FX12/FP16) within his "special NV3x path", i.e. few if any advanced shaders, lower precision, etc. He did say that ATI cards running the ARB2 path are slightly slower than his special NV3x path, but at higher precision: ATI FP24 vs. NV3x FX12/FP16. Duh! Yeah, the Nvidia path is faster here, but the ATI path is of higher IQ.

Nvidia is pulling the 51.75s (obviously). The driver was "leaked" most likely for the AquaMark bench: they got the numbers posted with some negative scrutiny, but negative remarks about IQ and performance hacks in the driver are a small price to pay, since they got the numbers into the fray. And with people posting around who have no clue about the NV3x problems because of their loyalty to the NV brand, I'd bet Nvidia is counting on them to cloud the perspectives of those (especially developers) who actually DO KNOW.
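For anyone wondering what the FP16-vs-FP24 argument means in practice, here is a rough sketch (my own, not from Carmack or either vendor). Python's struct module can round-trip a value through IEEE half precision, which stands in for FP16; single precision stands in for FP24 and up, since a 24-bit float isn't directly representable in struct:

```python
import struct

def roundtrip(x, fmt):
    # Pack a float at the given precision and unpack it again,
    # simulating storage in a lower-precision shader register.
    return struct.unpack(fmt, struct.pack(fmt, x))[0]

x = 1.0 / 3.0
fp16 = roundtrip(x, "e")  # half precision: 10-bit mantissa, like FP16
fp32 = roundtrip(x, "f")  # single precision: 23-bit mantissa

print(abs(fp16 - x))  # error on the order of 1e-4
print(abs(fp32 - x))  # error on the order of 1e-8
```

The half-precision error is thousands of times larger, and a long pixel-shader program accumulates that error at every operation, which is the kind of IQ difference the screenshots in these comparisons were pointing at.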

 

peter7921

Senior member
Jun 24, 2002
225
0
0
Originally posted by: spam
Originally posted by: Genx87

Like other posters have said, the GeForce FX is DX9 compliant, so there is no need to change the labeling.

I think the Half-Life 2 developers would disagree with you. I think people who bought the 5600 Ultra and lower would disagree with you.


I never said the performance was as good as the ATI counterparts', and I never said that Nvidia hasn't misled the public. Whether you get 1 FPS or 1000 FPS is irrelevant; technically, it is DX9 compliant.
 

peter7921

Senior member
Jun 24, 2002
225
0
0
Originally posted by: RogerAdam
Originally posted by: peter7921
Originally posted by: Genx87
The only thing Gabe Newell has had enough of is the cheeseburgers bought with that 8-million-dollar check ATI wrote out to him.

This is kind of sad if Valve really did put their game up for auction. And if they did, and ATI paid 8 million for it plus something like 1,000 copies, do you honestly think Valve gave it the old college try when they optimized for the NV3x?!

John Carmack appears to be doing just fine and getting good results from the NV3x cards. So what is wrong with Valve? I can't honestly believe, after all this ruckus, that they really tried to get Nvidia cards up and running to their potential.

The topic clearly was about DirectX 9, not OpenGL. Carmack is doing just fine because he is using the OpenGL API, not Direct3D, so what you said is irrelevant. BTW, if you look at Halo you see similar results, and in an article I saw at Gamers Depot you can clearly see, with screenshots, that the Det 50 drivers lower quality significantly to speed up performance.

Here's the link. Make sure to look at the picture differences; it's quite amazing.
http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_rel50/001.htm

Like other posters have said, the GeForce FX is DX9 compliant, so there is no need to change the labeling.

FUD! Carmack clearly stated the NV3x was not using the ARB2 path, and he has coded for the NV3x by lowering precision (FX12/FP16) within his "special NV3x path", i.e. few if any advanced shaders, lower precision, etc. He did say that ATI cards running the ARB2 path are slightly slower than his special NV3x path, but at higher precision: ATI FP24 vs. NV3x FX12/FP16. Duh! Yeah, the Nvidia path is faster here, but the ATI path is of higher IQ.

Nvidia is pulling the 51.75s (obviously). The driver was "leaked" most likely for the AquaMark bench: they got the numbers posted with some negative scrutiny, but negative remarks about IQ and performance hacks in the driver are a small price to pay, since they got the numbers into the fray. And with people posting around who have no clue about the NV3x problems because of their loyalty to the NV brand, I'd bet Nvidia is counting on them to cloud the perspectives of those (especially developers) who actually DO KNOW.


Was that targeted at my post or Genx87's? Because I never mentioned anything about Doom 3 other than that it uses a different API than HL2 does. I completely agree with you; I read the same article you did.
 

RogerAdam

Member
Oct 14, 2001
29
0
0
Originally posted by: peter7921
Originally posted by: RogerAdam
Originally posted by: peter7921
Originally posted by: Genx87
The only thing Gabe Newell has had enough of is the cheeseburgers bought with that 8-million-dollar check ATI wrote out to him.

This is kind of sad if Valve really did put their game up for auction. And if they did, and ATI paid 8 million for it plus something like 1,000 copies, do you honestly think Valve gave it the old college try when they optimized for the NV3x?!

John Carmack appears to be doing just fine and getting good results from the NV3x cards. So what is wrong with Valve? I can't honestly believe, after all this ruckus, that they really tried to get Nvidia cards up and running to their potential.

The topic clearly was about DirectX 9, not OpenGL. Carmack is doing just fine because he is using the OpenGL API, not Direct3D, so what you said is irrelevant. BTW, if you look at Halo you see similar results, and in an article I saw at Gamers Depot you can clearly see, with screenshots, that the Det 50 drivers lower quality significantly to speed up performance.

Here's the link. Make sure to look at the picture differences; it's quite amazing.
http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_rel50/001.htm

Like other posters have said, the GeForce FX is DX9 compliant, so there is no need to change the labeling.

FUD! Carmack clearly stated the NV3x was not using the ARB2 path, and he has coded for the NV3x by lowering precision (FX12/FP16) within his "special NV3x path", i.e. few if any advanced shaders, lower precision, etc. He did say that ATI cards running the ARB2 path are slightly slower than his special NV3x path, but at higher precision: ATI FP24 vs. NV3x FX12/FP16. Duh! Yeah, the Nvidia path is faster here, but the ATI path is of higher IQ.

Nvidia is pulling the 51.75s (obviously). The driver was "leaked" most likely for the AquaMark bench: they got the numbers posted with some negative scrutiny, but negative remarks about IQ and performance hacks in the driver are a small price to pay, since they got the numbers into the fray. And with people posting around who have no clue about the NV3x problems because of their loyalty to the NV brand, I'd bet Nvidia is counting on them to cloud the perspectives of those (especially developers) who actually DO KNOW.


Was that targeted at my post or Genx87's? Because I never mentioned anything about Doom 3 other than that it uses a different API than HL2 does. I completely agree with you; I read the same article you did.


Oops! Sorry, I quoted the wrong post. I was replying to whoever said that D3 was better on the NV3x; that is nothing but fantasy. Sorry.

 

spam

Member
Jul 3, 2003
141
0
0
To peter7921,

Remember, the point of this thread is how Nvidia can re-establish trust. Do you think that labelling FX parts as DX9 compliant builds customer confidence and trust? I am sure you do not! Let me ask you: what corrections does Nvidia have to make to restore confidence? My suggestion would be a dramatic display of a change in policy and direction. What would be your solution?
 

peter7921

Senior member
Jun 24, 2002
225
0
0
Originally posted by: spam
To peter7921,

Remember, the point of this thread is how Nvidia can re-establish trust. Do you think that labelling FX parts as DX9 compliant builds customer confidence and trust? I am sure you do not! Let me ask you: what corrections does Nvidia have to make to restore confidence? My suggestion would be a dramatic display of a change in policy and direction. What would be your solution?

I completely agree that Nvidia has lost a lot of trust for cheating. What they need to do is stop limiting IQ in favour of performance and then trying to hide the fact. People haven't lost faith in Nvidia for claiming DX9 compliance, but for all the cheating.

People have lost faith in Nvidia because of the lies, not because it underperforms in DX9 compared to ATI.
Example: the Radeon 8500 is a DirectX 8 card, right? So is the GeForce 4600. Now, the fact that the 4600 greatly outperforms the 8500 doesn't mean that the 8500 isn't DirectX 8 compliant or is in some way mislabeled; it's just slower.

What I am trying to say is that Nvidia doesn't need to re-label its chips, because they are DX9 compliant; it just needs to stop these so-called optimizations that ruin IQ.

I do see your point, though, because people who buy a 5600 or 5200 hoping to get good DX9 performance will be extremely disappointed. But people who buy products without doing research deserve what they get, in my opinion.
 

Alkali

Senior member
Aug 14, 2002
483
0
0
The top FX cards are still compliant with DX9 (if you bend the rules slightly on pixel pipelines); they are just very slow under certain conditions.
 

spam

Member
Jul 3, 2003
141
0
0


Originally posted by: peter7921
I do see your point, though, because people who buy a 5600 or 5200 hoping to get good DX9 performance will be extremely disappointed. But people who buy products without doing research deserve what they get, in my opinion.

They will be especially mad if they bought their cards prior to the release of DX9 benchmarks (excepting the earlier release of 3DMark03)! I would feel like I had been taken advantage of.

Perhaps re-labelling back to DX8 would be bad for another reason: litigation. Class-action lawsuits and all that good stuff would be a likely outcome. In fact, litigation issues may be a reason that Nvidia has not said "We are sorry"; any admission of guilt could result in very expensive court cases.
 

beserkfury15

Member
Jun 25, 2003
91
0
0
Yeah, I sure as hell feel cheated; I got a 5600. It's pretty good right now and plays well, but with the DX9 benchmarks... especially seeing how badly a 5900 Ultra performs, how the hell am I supposed to get good frames with a 5600? Anyone here know of a mid-range ATI card with VIVO? The main reason I chose a 5600 was the VIVO.
 