I suspect that nVidia have written a whole heap of custom DX 8/9 shaders for current games and replaced the games' shaders with their own. Yes, it's cheating, but if it helps the customer without degrading IQ then you really can't fault them for trying to make their product more attractive.

I do not believe that nVidia could get 50% to 60% more performance from drivers alone.
That is 100% f4nb0y15m!

Originally posted by: Luagsch
Somehow I don't trust nvidia... don't know why...
I read the whole thing and everything looks fine, but the voice in my head says "it's nvidia, watch out."
Originally posted by: Rollo
Err, Jiffy, not to nitpick, but you're contradicting yourself here. The 9700 Pro started to ship to retail stores 8/19/2002, which isn't even 14 months ago, let alone two years.

This is not fanboyism; this is based on the fact that ATI was (and still is) the market leader for the last two years, just like nVidia wore the crown from the TNT2 all the way to the GeForce 4. Bring on NV40/R400.
Beyond that, when you consider there's no real overriding advantage of the 9800 over the 5900 for the last 4-5 months, you're down to talking about a window where ATI was a clear market leader for 9-10 months, a long way from two years.
I have the same regard for nvidia as I had before: I don't trust them.
Originally posted by: lifeguard1999
BFG10K wrote
I suspect that nVidia have written a whole heap of custom DX 8/9 shaders for current games and replaced the games' shaders with their own. Yes it's cheating but if it helps the customer without degrading IQ then you really can't fault them for trying to make their product more attractive.
I agree with your suspicion, but disagree with your conclusion that it is cheating. As long as the answer (the images) comes out the same, what makes it cheating?
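The mechanism being speculated about here, where a driver recognizes a game's shader and silently swaps in a hand-tuned equivalent, can be sketched roughly as a lookup keyed on a hash of the shader source. All names, opcodes, and the replacement table below are hypothetical, purely for illustration; this is not how any real driver is implemented.

```python
import hashlib

# Hypothetical table mapping the hash of a game's original shader
# source to a hand-tuned replacement written by the driver team.
REPLACEMENTS = {
    hashlib.md5(b"mul r0, v0, c0\nadd r0, r0, c1").hexdigest():
        "mad r0, v0, c0, c1",  # fused hand-tuned variant
}

def compile_shader(source: bytes) -> str:
    """Return the hand-tuned shader if this one is recognized,
    otherwise pass the game's own shader through unchanged."""
    key = hashlib.md5(source).hexdigest()
    if key in REPLACEMENTS:
        return REPLACEMENTS[key]   # driver substitutes its own code
    return source.decode()         # unknown shader: left alone

print(compile_shader(b"mul r0, v0, c0\nadd r0, r0, c1"))  # substituted
print(compile_shader(b"add r1, v1, c2"))                  # untouched
```

The debated question is exactly the one visible in the sketch: the substituted shader can compute the same image, but only the shaders the vendor bothered to recognize get the treatment.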
Originally posted by: Rogodin2
When I buy coffee from a gourmet roaster I expect it to be high-grade arabica, which is what I ordered. If I found out it wasn't (either by the taste or by looking at their bags) I'd not purchase their products again. This analogy holds true for nvidia.
Originally posted by: BFG10K
The problem is that they're constantly going to have to do this for every new game that comes out.
Originally posted by: 1ManArmY
Good read for the most part, but I thought there was too much cheerleading going on for nVidia's improvements.
Originally posted by: Jeff7181
Originally posted by: lifeguard1999
BFG10K wrote
I suspect that nVidia have written a whole heap of custom DX 8/9 shaders for current games and replaced the games' shaders with their own. Yes it's cheating but if it helps the customer without degrading IQ then you really can't fault them for trying to make their product more attractive.
I agree with your suspicion, but disagree with your conclusion that it is cheating. As long as the answer (the images) comes out the same, what makes it cheating?
Exactly... it's very important to remember, as stated in this review, that nVidia and ATI GPUs are VERY different and use different methods to accomplish the same goal. So in the end, what does it matter HOW the image is put up on the screen, as long as it happens, it looks good, and its performance is acceptable?
BFG, I see what you're saying, that if nVidia does their own thing optimized for their hardware, and ATI does what the game developer tells them to, it's not exactly a level playing field. But... if ATI could write their drivers to get better performance without reducing image quality, would you, as a consumer, want that? You should.
AMD and Intel processors use very different means to achieve the same goals... does that mean one is cheating? Is Intel cheating by using SSE2 in the P4 while the Athlon XP didn't have it? By your logic, it is. Of course, I disagree. SSE2 is an optimization by Intel, and so is Hyper-Threading... neither of which AMD CPUs have been able to take advantage of. If those are "legal" in the computer industry, why can't nVidia do the same thing with their GPU?
I thought it was pretty clear in the article that nVidia plans on making it seamless... or at least trying to. So that the driver can turn regular DX9 instructions into whatever instructions nVidia wants to use for their GPU. In effect, they're programming the GPU to run DX9 instructions... hence the term "highly programmable GPU."

I am hoping that there comes a time when the devs can code to an API, not to a card's individual strengths... it will save them money and time, during which they can implement more features perhaps... or *gasp* better storylines.
Originally posted by: Jeff7181
I thought it was pretty clear in the article that nVidia plans on making it seamless... or at least trying to. So that the driver can turn regular DX9 instructions into whatever instructions nVidia wants to use for their GPU. In effect, they're programming the GPU to run DX9 instructions... hence the term "highly programmable GPU."

I am hoping that there comes a time when the devs can code to an API, not to a card's individual strengths... it will save them money and time, during which they can implement more features perhaps... or *gasp* better storylines.
Right now NVIDIA is at a disadvantage; ATI's hardware is much easier to code for and the performance on Microsoft's HLSL compiler clearly favors the R3x0 over the NV3x. NVIDIA has a long road ahead of them in order to improve their compilers to the point where game developers won't have to hand-code special NV3x codepaths, but for now ATI seems to have won the battle.
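The kind of driver-side recompilation described above, rewriting a generic instruction stream into a form the hardware prefers, is essentially a peephole pass. Here is a toy sketch under invented assumptions: a made-up instruction format where a multiply followed by a dependent add is fused into a single "mad", the classic example of an architecture-friendly rewrite. None of this reflects any real driver's internals.

```python
# Toy peephole pass: ("mul", d, a, b) followed by ("add", e, d, c)
# is fused into ("mad", e, a, b, c). Instructions are tuples of
# (opcode, dest, src1, src2[, src3]); all opcodes are hypothetical.

def optimize(instrs):
    """Fuse mul+dependent-add pairs into mad; copy everything else."""
    out, i = [], 0
    while i < len(instrs):
        if (i + 1 < len(instrs)
                and instrs[i][0] == "mul"
                and instrs[i + 1][0] == "add"
                and instrs[i + 1][2] == instrs[i][1]):  # add reads mul's dest
            mul, add = instrs[i], instrs[i + 1]
            out.append(("mad", add[1], mul[2], mul[3], add[3]))
            i += 2  # consumed both instructions
        else:
            out.append(instrs[i])
            i += 1
    return out

program = [("mul", "r0", "v0", "c0"), ("add", "r1", "r0", "c1")]
print(optimize(program))  # one fused mad instruction
```

The real task facing a compiler team is of course far harder (register pressure, scheduling, precision rules), which is why hand-coded NV3x paths were still winning at the time.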
Originally posted by: virtualgames0
WOW!
you guys get the award for the "MOST COMPREHENSIVE 9800XT/FX5950 REVIEW"
60+20 pages... whoa
very informative review...
Your comment actually made me laugh. thnx :beer:

Originally posted by: XeoBllaze
That is 100% f4nb0y15m!

Originally posted by: Luagsch
Somehow I don't trust nvidia... don't know why...
I read the whole thing and everything looks fine, but the voice in my head says "it's nvidia, watch out."