Actually, that isn't strictly true. Since all the chips in a given range (say, GeForce3) have the same pin layout no matter which manufacturer makes the card, any design/layout changes on the cards must remain directly compatible with each other and with the reference drivers. Remember, it's Nvidia who makes the chips, not MSI or Asus, etc.
The main differences that may get tweaked out in the drivers are the differences in the BIOS used on the cards. Few people ever update their graphics card BIOS, so I assume a lot of fixes get worked in at the driver level instead.
Also, as for so-called cheating in drivers, ATI are not the first and I doubt they will be the last:
1) Sticking with Quake3: the S3 Savage 2000 had stunning 32-bit performance compared with its technical equal (the GF1 SDR card).
Why? Mainly because by default it used a 16-bit Z-buffer rather than a 32-bit one, and hence economised on memory bandwidth. Clever, but definitely a 'cheat'. Although you could always force a deeper Z-buffer manually...
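To see why the Z-buffer depth matters, here's a rough back-of-the-envelope sketch (the resolution and overdraw figures are my own assumptions, not measurements from either card):

```python
# Illustrative arithmetic only: estimate per-frame Z-buffer traffic
# at an assumed Quake3-era resolution of 1024x768.
WIDTH, HEIGHT = 1024, 768
PIXELS = WIDTH * HEIGHT

def z_traffic_mb(bytes_per_z, overdraw=3):
    """Approximate Z read+write traffic per frame, in megabytes.
    'overdraw' is a guessed average number of depth tests per pixel;
    each test costs roughly one Z read plus (on pass) one Z write."""
    return PIXELS * overdraw * 2 * bytes_per_z / (1024 * 1024)

print(f"16-bit Z: {z_traffic_mb(2):.1f} MB/frame")   # 9.0 MB/frame
print(f"32-bit Z: {z_traffic_mb(4):.1f} MB/frame")   # 18.0 MB/frame
```

Halving the Z-buffer depth halves the Z traffic, and at 60+ fps that difference adds up to hundreds of MB/s of memory bandwidth saved, which is exactly where cards of that era were bottlenecked.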
2) Nvidia's turn, I think. Their success today rests mainly on the fact that the TNT and TNT2 were such good cards for their time. Without a doubt, these are the chips that lifted them above the competition and started them on the road to market domination. However, just don't expect any decent texture filtering from those cards. Fair enough, Nvidia never declared they could do proper tri-linear filtering, but presenting the option as though they could is really a 'cheat'.
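For anyone unclear on what was being faked: proper tri-linear filtering blends bilinear samples from two adjacent mipmap levels, which is what hides the visible banding where mip levels change. A minimal sketch of the idea (greyscale textures as nested lists, my own simplified coordinate convention):

```python
def lerp(a, b, t):
    """Linear interpolation between a and b."""
    return a + (b - a) * t

def bilinear(tex, u, v):
    """Bilinear sample of a 2D texture 'tex' at continuous coords (u, v)."""
    x0, y0 = int(u), int(v)
    x1 = min(x0 + 1, len(tex[0]) - 1)
    y1 = min(y0 + 1, len(tex) - 1)
    fx, fy = u - x0, v - y0
    top = lerp(tex[y0][x0], tex[y0][x1], fx)
    bottom = lerp(tex[y1][x0], tex[y1][x1], fx)
    return lerp(top, bottom, fy)

def trilinear(mips, level, u, v):
    """Blend bilinear samples from two adjacent mip levels.
    A bilinear-only card effectively snaps 'level' to a whole number
    and samples just one mip, leaving a hard seam between levels."""
    lo = int(level)
    hi = min(lo + 1, len(mips) - 1)
    frac = level - lo
    # texel coordinates shrink by half with each mip level
    s_lo = bilinear(mips[lo], u / 2**lo, v / 2**lo)
    s_hi = bilinear(mips[hi], u / 2**hi, v / 2**hi)
    return lerp(s_lo, s_hi, frac)
```

The extra cost is obvious from the code: two bilinear lookups (eight texel fetches) plus a blend, per pixel, which is why faking it with plain bilinear was tempting.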
Also, I still think the image quality of the TNT/TNT2 and early GeForce cards was pretty poor compared to the competition, no matter which settings you chose (i.e. Best Quality or Best Speed, which basically just adjust the LOD bias). That's a bit of a 'cheat' in itself, especially as it was always noticeably bad.