They were cheating in 3DMark2K3; I've said it numerous times.
Wonderful, so you've finally admitted that all eight of FutureMark's findings (which included application detection) are cheats. Thus we can finally move on.
Why do you make that leap?
Because they took great steps to hide it.
List some. List the exact title, exactly what the cheat is and what it causes.
We can start off with the link I gave you at Beyond3D, and I'll respond to your point later in this post.
Can I please see a link to a press release, or anything else coming straight from nVidia (i.e. not just a rumour site saying this is what we heard when we put a glass to the wall of nVidia's conference room), in which nVidia admitted to cheating?
You honestly think they could come up with a scheme that the community couldn't hack through inside of a few days?
I don't have enough information about the strength of nVidia's driver team to make such a guess. However, given that they likely started implementing the cheats after the NV30's lackluster debut, it's quite possible they also planned measures to counteract any applications that could detect them.
Besides, I believe the latest patch scripts from RivaTuner have broken it, although they don't work on all versions of the drivers.
If they could, the entertainment industry would gladly pay nVidia's driver team millions and millions of dollars to sort out their piracy issue.
What are you talking about? 128-bit keys form the basis of most modern encryption and are pretty much unbreakable unless you apply brute force. nVidia obviously didn't implement anything even that strong, otherwise Unwinder couldn't have broken it, which further suggests a rush job rather than a long-term plan.
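Just to put "brute force" in perspective, here's a back-of-the-envelope sketch; the keys-per-second rate is a number I'm making up purely for illustration:

```c
#include <stdio.h>
#include <math.h>

/* Rough brute-force math for a 128-bit key: 2^128 candidate keys.
   The rate below (1e9 keys/sec) is an assumed figure, not a
   measurement of any real hardware. */
int main(void)
{
    double keys  = pow(2.0, 128.0);              /* ~3.4e38 keys */
    double rate  = 1e9;                          /* assumed keys tried per second */
    double secs  = keys / rate;
    double years = secs / (365.25 * 24.0 * 3600.0);

    printf("keyspace: %.3e keys\n", keys);
    printf("time at %.0e keys/sec: %.3e years\n", rate, years);
    return 0;
}
```

That works out to roughly 10^22 years, which is why "unbreakable unless you apply brute force" really means unbreakable in practice.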
Not quite. nVidia's DXTC3 was superior to ATi's DXTC1 in terms of image quality.
Yes, it was slightly superior, and it also came with a 10% performance hit. Also, for titles like UT you were simply SOL on nVidia cards: in order to enjoy the richer textures you also had to enjoy rainbow-coloured artifacts in coronas, skies and everything else that had a hint of transparency.
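For anyone wondering why transparency is the sticking point, the standard S3TC block layouts make it obvious. A sketch in C (the layouts follow the published format; the comments are my take on where the artifacts come from):

```c
#include <stdint.h>

/* DXT1/DXTC1: 8 bytes per 4x4 texel block. Alpha is at best 1 bit,
   signalled by the ordering of color0/color1, so anything with partial
   transparency (coronas, glow sprites) gets mangled. The endpoints are
   16-bit 5:6:5 colors, which is where the banding in skies comes from. */
typedef struct {
    uint16_t color0;   /* RGB 5:6:5 endpoint */
    uint16_t color1;   /* RGB 5:6:5 endpoint */
    uint32_t indices;  /* 2 bits per texel selecting an interpolated color */
} dxt1_block;

/* DXT3/DXTC3: 16 bytes per block. Same color half, plus 4 bits of
   explicit alpha per texel -- hence the better quality on transparent
   surfaces, and twice the texture data, which lines up with the
   performance hit. */
typedef struct {
    uint64_t alpha;    /* 4 bits of explicit alpha for each of 16 texels */
    uint16_t color0;
    uint16_t color1;
    uint32_t indices;
} dxt3_block;
```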
That was what S3 did, and they created the standard.
And it doesn't make any sense. The angle I'm approaching it from is that there's a performance gain from doing it, but I can't see why that would be the case.
How many games does it impact?
All Quake 3-engined games, UT, etc.
They knew they had an issue, so they implemented a switch in the registry to force the use of S3TC3 for those that wanted it (unfortunately that would not work for UT, as its textures were all precompressed, unlike the other titles, which compressed at run time).
Yes, and the switch was applied after the benchmarks had been run and the user had installed the card and found abysmal image quality. How many websites applied the switch before benchmarking? Again, it's just another example of nVidia artificially inflating benchmark results through methods that are essentially not available to end-users.
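For reference, the mechanics of that kind of switch are trivial. Something along these lines, where the key and value names are purely hypothetical and not the ones nVidia actually used:

```c
#include <windows.h>

/* Hypothetical sketch of a driver-side registry switch: read a flag at
   startup and remap DXT1 decoding to DXT3 if it's set. The key and
   value names are made up for illustration only. */
int force_dxt3_enabled(void)
{
    HKEY  key;
    DWORD value = 0, size = sizeof(value);

    if (RegOpenKeyExA(HKEY_LOCAL_MACHINE,
                      "SOFTWARE\\ExampleVendor\\Display",  /* hypothetical key */
                      0, KEY_READ, &key) != ERROR_SUCCESS)
        return 0;  /* default: switch off, DXT1 decoded as-is */

    RegQueryValueExA(key, "ForceDXT3Decode", NULL, NULL,   /* hypothetical value */
                     (LPBYTE)&value, &size);
    RegCloseKey(key);
    return value != 0;
}
```

The point being: off by default, so every out-of-the-box benchmark ran without it.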
Same with S3's; why aren't you bashing them about it?
Because S3 were long dead when the Radeon/GTS/Voodoo5 benchmarks were in full swing.
That sure as hell wasn't the case at the Basement.
I never said or implied that it was.
Same deal but not a factor as they weren't in the game.
Using that same logic, the R9700Pro has faulty PS2.0 support.
No, because running PS 2.0 on a Radeon 9700 Pro doesn't create an unusable experience, nor does it inflate benchmark performance.
My issue with nVidia is that they included an unusable feature and enabled it by default to inflate benchmark numbers. 99% of nVidia users would have either applied the DXT3 fix or turned off texture compression, and both methods have a negative impact on performance, invalidating the benchmarks they initially used to make a purchase decision.
If I applied your logic then absolutely. I don't apply your logic however.
I still fail to see how a PS 3.0 application wouldn't work on your 9500 Pro but would work on your Ti4600. Or is this a hypothetical example?
They aren't using app detection for that, they do it for all D3D games.
So why did the unaltered Direct3D app that Dave initially tested use trilinear filtering until he renamed the executable to "UT2003.exe"?
As far as using app detection for optimizations, PowerVR does this an incredible amount. Are they cheating in d@mn near every game you have heard of?
Yes, but again it's like S3 and isn't as much of a factor since they're out of the game. However, if they released a Kyro4 that showed comparable performance to today's high-end cards, and further investigation then showed application detection, shader substitution and the like, then hell yes, I'd be all over it like a rash.
But yes you're quite right, application detection is cheating no matter who does it. You shouldn't be able to change the behaviour of a driver based on the executable name that it's running.
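And just to show how little it takes, here's the whole technique sketched in C. This illustrates executable-name detection in general; it's not actual driver code, and the mode names are made up:

```c
#include <windows.h>
#include <string.h>

/* Minimal sketch of executable-name detection: look at what the
   process is called and flip filtering behaviour to match. */
typedef enum { FILTER_TRILINEAR, FILTER_PSEUDO_TRILINEAR } filter_mode;

filter_mode pick_filter_mode(void)
{
    char path[MAX_PATH];
    const char *exe;

    /* Name of the executable hosting this driver/DLL. */
    GetModuleFileNameA(NULL, path, sizeof(path));
    exe = strrchr(path, '\\');
    exe = exe ? exe + 1 : path;

    /* Behaviour changes based purely on the binary's name --
       rename the executable and you get a different result. */
    if (_stricmp(exe, "UT2003.exe") == 0)
        return FILTER_PSEUDO_TRILINEAR;  /* cheaper blend, higher score */

    return FILTER_TRILINEAR;             /* what everything else gets */
}
```

That's exactly why Dave's rename test was so damning: the same rendering workload got different treatment depending on nothing but the filename.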