Originally posted by: Matthias99
Originally posted by: Cuular
ATI doesn't usually need it because their picture already looks "vibrant" and colorful compared to nvidia's. If you compare an nvidia display with Digital Vibrance turned off side by side with an ATI one, you'll see it. Nvidia added the feature for a reason: their default color settings look somewhat "washed out". ATI has similar boosting capabilities in their color correction screen; it just requires you to set 2 or 3 settings instead of a single slider that does them all for you.
Also, as noted by several posters, Digital Vibrance screws badly with the accuracy of your color rendering. It's along the lines of the 'red push' that TV manufacturers use to make displays look 'better' sitting in a showroom.
If you put two monitors side-by-side, people will naturally prefer the one that is brighter or more highly saturated (and, to some extent, the one with a 'warmer' color palette). If you ever check out the TVs on display in Best Buy / Circuit City / wherever, you'll generally find that the brightness, contrast, and color saturation are all cranked WAY up. While this gives the picture a nice 'wow' factor (especially if it's sitting next to another set that's not totally out of whack), it greatly reduces the set's ability to display colors accurately. If you get a calibration DVD like AVIA or Digital Video Essentials, you can see how bad the default settings really make them look (I'm sure it would also show how much DV is messing with the color settings).
If you like seeing wrong (but more highly saturated) colors, DV is great. If you prefer actually seeing things rendered as they should be, not so much.
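For what it's worth, here's a rough sketch in Python of what a DV-style boost effectively does. Nvidia has never published the actual Digital Vibrance algorithm, so the "scale saturation in HSV space" approach here is just my stand-in for a generic saturation boost; the point is the clipping, not the exact math:

```python
# Rough sketch of a Digital Vibrance-style saturation boost.
# NOT nvidia's actual algorithm (that's proprietary) -- just a generic
# "scale saturation in HSV space" approximation to show why boosting
# saturation distorts color accuracy.
import colorsys

def boost_saturation(r, g, b, factor):
    """Scale the HSV saturation of an RGB color (components in 0.0-1.0)."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    s = min(s * factor, 1.0)  # saturation clips at 1.0 -- detail dies here
    return colorsys.hsv_to_rgb(h, s, v)

# Two distinct warm tones that a calibrated display renders differently:
a = (0.90, 0.50, 0.40)
b = (0.90, 0.45, 0.30)

for factor in (1.0, 1.5, 2.0):
    print(factor, boost_saturation(*a, factor), boost_saturation(*b, factor))
```

At a 2x boost, both colors hit the saturation ceiling: the blue channel gets crushed to zero in each and the two outputs land nearly on top of each other. That's the accuracy loss in a nutshell: everything looks punchier, but colors that were supposed to be different converge on the same oversaturated value.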