Originally posted by: TheSnowman
Originally posted by: Insomniak
Originally posted by: TheSnowman
Originally posted by: edmundoab
really hope futuremark can do something to disable whatever optimization that Both Nvidia and ATi can do.
Ati doesn't do anything like that
ROFLICOPTER!!!1
Well, they don't. They did it for 3DMark2001, but then they stopped doing that, and they now have a policy not to do it for any synthetic benchmarks.
Saying they don't do "anything like that" implies that ATi doesn't optimize. They most certainly do. Do they do app detection? Not that anyone has been able to discern yet, but in my mind that doesn't make them any less guilty than Nvidia. Saying "well, they optimize in different ways..." is like saying "well, he used a steak knife for the murder instead of a lead pipe...."
The point is, they both "optimize," and frankly I'm not happy about that regardless of which camp we're talking about. Let the APIs optimize for the chips, and let the developers optimize for the APIs. It works better that way.