Originally posted by: BenSkywalker
There's no reason there should be an "easily noticeable quality dropoff", unless the VGA output of the card is substandard to begin with or the adapter cable is defective.
I have yet to see a DVI-VGA adapter that didn't clearly degrade image quality at higher settings compared to using the straight VGA output. My monitor has dual inputs, so I can directly compare them at the exact same settings, and DVI adapters destroy IQ, particularly at higher resolutions and refresh rates. I should qualify this by saying I've only seen it on about a dozen different boards, all of them ATi and nVidia parts, no Matrox.
It's no different electrically than, say, going through a KVM switch.
Absolutely correct, and KVMs have a horrific impact on IQ. If you can't see it using one, daisy-chain half a dozen KVMs together sometime and even the legally blind should be able to see the difference. Although really, all you need to do is set your monitor to something like 1920x1440@85Hz and you should have no problem at all seeing the impact (it's quite noticeable even at 1600x1200@85Hz on every part I've seen).