I would go with the Radeon.
To quote Tom from www.tomshardware.com:
"
First of all I'd like to note that Radeon's benchmark results at 16-bit color look worse than they are, simply because numbers can't give you a feel for game play. I understand if some of you might complain about Radeon's 16-bit performance, but the 16-bit scores of Radeon are only an issue if you happen to play at 1280x1024x16 or 1600x1200x16, because Radeon's scores are definitely good enough in all the lower 16-bit color resolutions. In those two resolutions GeForce 2 GTS is clearly and utterly beating the Radeon. This doesn't come as a surprise, because the NVIDIA chip does not suffer from memory bandwidth limitations as much in 16-bit color as it does in 32-bit color, making it able to reach at least 70% of its claimed fill rate. Radeon's HyperZ is also not too effective in 16-bit color, which is why Radeon's scores are almost identical in 16-bit and 32-bit color.
Things are a lot different when you look at the results in 32-bit color. You might be missing the scores at 1152x864 and 1280x1024, but believe me, as soon as the resolution goes past 1024x768, Radeon is ahead of the rest, thanks to its HyperZ feature. The same goes for FSAA. Let's be honest, why should somebody who is so much into image quality that he is using FSAA settle for anything worse than 32-bit color?
I personally like the Radeon, which is mainly due to the elegant 'HyperZ' feature with the stupid name. ATi has shown that the memory bandwidth issue can be tackled in a different way than with pure brute force. It's like a light and fast sports car with much better fuel economy thanks to smart technology.
Radeon is indeed up there with the cream of the crop when it comes to 32-bit performance, and the chip comes with a wealth of new 3D features. Additionally, you get the best integrated video, DVD and HDTV solution that money can buy right now.
What is NVIDIA supposed to say? That 16-bit color is more important than 32-bit color? I don't think so, since it was NVIDIA who told us how important 32-bit color was back in 1999, when 3dfx's Voodoo3 was unable to support it.
I am pretty sure that Radeon's performance will further improve once the drivers have matured a bit. I certainly look forward to the luxurious 'All-In-One' Radeon that's supposed to be released in early fall of this year. The SDR Radeon might shake up NVIDIA's GeForce 2 MX sales as well, because it is meant to offer performance close to the DDR Radeon's for a rather low price.
As I already said, I like the Radeon. I like it because I prefer intelligent technology to brute force. That's why I also prefer a Porsche 911 Turbo to a Dodge Viper."
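
To put rough numbers behind Tom's bandwidth argument, here is a quick back-of-the-envelope sketch in Python. The spec figures (an 800 Mpixel/s claimed fill rate for the GeForce 2 GTS and roughly 5.3 GB/s from its 166 MHz, 128-bit DDR memory) are the commonly published numbers, and the three-accesses-per-pixel traffic model is my own simplification that ignores texture fetches and overdraw, so treat the output as illustrative only:

# Rough model of why 32-bit color starves a GPU for memory bandwidth.
# Spec figures are published numbers; the per-pixel traffic model is a
# deliberate simplification (no texture fetches, no overdraw).
CLAIMED_FILL = 800e6   # GeForce 2 GTS claimed fill rate, pixels/sec
MEM_BW = 5.3e9         # approx. bytes/sec from 166 MHz 128-bit DDR

for bytes_per_px, label in [(2, "16-bit"), (4, "32-bit")]:
    # Each drawn pixel costs one color write, one Z read and one Z write.
    traffic = bytes_per_px * 3
    limit = MEM_BW / traffic
    reachable = min(limit, CLAIMED_FILL) / CLAIMED_FILL
    print(f"{label}: ~{limit / 1e6:.0f} Mpix/s sustainable, "
          f"about {reachable:.0%} of the claimed fill rate")

Even this crude model shows the effect Tom describes: at 16-bit the memory system can roughly keep up with the claimed fill rate, while at 32-bit it caps the chip at about half of it. HyperZ attacks exactly the Z read/write term in that model, since its hierarchical Z rejects hidden pixels before they cost any bandwidth and its Z compression shrinks the traffic that remains, which is why it pays off most in 32-bit color.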
I still remember when I got my first GeForce DDR card. The drivers for it were good, but NVIDIA let the card's power consumption fall out of AGP spec, and it would not work with two brand-new motherboards. Of course they didn't correct this until the GeForce2 cards; they left the owners of the GeForce SDR and GeForce DDR high and dry.
People who bash the Radeon's drivers don't own the card and haven't even tried it.
I am running a beta version of the W2K drivers right now and they are great. I am using the AIW version as well, so my card has more on it than the plain 64MB version does.