Originally posted by: Yuriman
Long ago, perfect color came to computers. 24-bit. Tell me, is a video card that supports 24-bit color still top-of-the-line?
I'd suggest reading up on the X-Fi before commenting on it.
That's not a valid analogy. 2^24 = 16,777,216 colors. The human eye can only distinguish somewhere on the order of a few million colors (estimates vary), so 24-bit color is, for all intents and purposes, "perfect."
With that in mind, the main reasons video cards moved to 32-bit color are ease of programming and speed. 32-bit color dedicates a tidy 4-byte chunk of memory to each pixel. Since memory sizes and addresses are powers of two, 4-byte chunks divide the memory neatly and every pixel lands on an aligned word boundary. 24-bit color results in awkward 3-byte chunks that straddle word boundaries and take more computational power to address.
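To make that concrete, here's a minimal sketch in C of the two addressing schemes. The function names and byte layout are mine, purely for illustration, not any particular card's format:

    #include <stdint.h>

    /* 32bpp: each pixel is one aligned 4-byte word, so the offset
       is a simple multiply and a single aligned load. */
    uint32_t read_pixel_32(const uint32_t *fb, int width, int x, int y)
    {
        return fb[y * width + x];
    }

    /* 24bpp: pixels are packed 3-byte groups that can straddle word
       boundaries, so the bytes have to be fetched and reassembled
       one at a time. */
    uint32_t read_pixel_24(const uint8_t *fb, int width, int x, int y)
    {
        const uint8_t *p = fb + (y * width + x) * 3;
        return (uint32_t)p[0] | ((uint32_t)p[1] << 8) | ((uint32_t)p[2] << 16);
    }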
Back when having 4 MB of RAM on a video card was a premium feature, you couldn't just burn 1/3 more memory to make your life easier, so you sacrificed speed for space and used 24-bit color. But once 16 MB became the norm, a double-buffered 1600 x 1200 framebuffer at 32 bits per pixel fit comfortably: 1600 x 1200 x 4 bytes/pixel x 2 buffers is roughly 15 MB. 32-bit color became the norm.
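If you want to check the arithmetic yourself, here's a quick back-of-the-envelope program; the resolution, depths, and buffer count are just the examples from this post:

    #include <stdio.h>

    int main(void)
    {
        const long w = 1600, h = 1200, buffers = 2;  /* double buffering */
        const int depths[] = { 24, 32 };             /* bits per pixel   */

        for (int i = 0; i < 2; i++) {
            long bytes = w * h * (depths[i] / 8) * buffers;
            printf("%ldx%ld @ %d-bit, %ld buffers: %.1f MB\n",
                   w, h, depths[i], buffers, bytes / (1024.0 * 1024.0));
        }
        return 0;
    }

It prints about 11.0 MB for 24-bit and about 14.6 MB for 32-bit, so both fit on a 16 MB card, and the extra byte per pixel buys you the aligned addressing described above.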
A full-color digital photo displayed on an old Matrox Millennium II video card will look just as good as it does on a brand-spanking-new GeForce 7800 GTX. Better, maybe, because converting digital information to analog output is never a "perfect" process; that's the same reason M-Audio and other exotic audiophile cards sound better with plain stereo sources.
However, Doom 3 will look significantly better on a 7800 GTX, which can do more PROCESSING to the image. That's why the X-Fi is a better card for gaming: it can process more 3D sound effects, with better quality and more features, than anything else. Static vs. interactive content.