If you are doing graphics software development with DirectX, I'd consider the ATI option. Their shader handling is more "to spec" than Nvidia's. For example, if you accidentally forget to write an alpha channel, Nvidia's drivers will assume it's fully opaque, while ATI will leave it at 0 (completely transparent), which is what the variable defaults to anyway. In short, Nvidia assumes things for you; ATI does what it's supposed to do.
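To make this concrete, here's a minimal sketch of the defensive habit that sidesteps the whole issue. It assumes a Direct3D 10-style HLSL pixel shader; the texture, sampler, and entry point names are made up for illustration:

    Texture2D diffuseTex : register(t0);
    SamplerState linearSamp : register(s0);

    float4 main(float2 uv : TEXCOORD0) : SV_Target
    {
        // Initialize every component up front instead of relying on
        // driver behavior for unwritten channels.
        float4 color = float4(0.0, 0.0, 0.0, 0.0);

        color.rgb = diffuseTex.Sample(linearSamp, uv).rgb;

        // Write alpha explicitly. If you leave it unwritten, the spec
        // gives you no guarantee: one driver may treat it as 1.0
        // (opaque), another as 0.0 (transparent).
        color.a = 1.0;

        return color;
    }

If you always write all four components yourself, the same shader renders the same on both vendors' drivers.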
I used to use Nvidia cards for game programming, and when I'd then run my code on ATI I'd find a lot of "bugs" in it, only to discover the bugs were there on the Nvidia side too; I had just been programming against Nvidia's assumptions instead of following the spec. I now only program on ATI cards for that reason.
But maybe Nvidia drivers have changed since then.