Question... is HDMI supposed to offer better image quality than DVI? I tried hooking up my monitor to the HDMI port and it looked like crap. I quickly switched back to the DVI connection.
Until the most recent versions of HDMI, HDMI and DVI carried virtually the same video signal - they're just different connectors.
HDMI also has stricter specs for some features that are optional on DVI:
- HDCP anti-piracy technology is optional for DVI connections, but mandatory for HDMI. (This means that not all DVI monitors/TVs support Blu-ray playback, but all HDMI monitors/TVs do.)
- Audio. HDMI connections must support digital audio signals; this is optional for (and very rarely supported by) DVI.
However, over the last 3 or 4 years, a number of upgrades have been made to the HDMI spec in preparation for higher resolutions, color depths, refresh rates, 3D, etc. As a result, these latest HDMI signals are not backward compatible with DVI. (But if you have a DVI signal on HDMI pins, any HDMI monitor/TV should accept it.)
The reason your HDMI input looked poor is processing in your monitor/TV. TV broadcasts (and therefore DVDs, etc.) are produced on the assumption that the edges of the image will not be displayed. This is called 'overscan', and it's historical: the edges of an analog broadcast picture were often unstable and full of artefacts, so TVs zoomed in on the centre of the image, cutting off the edges.
This overscan isn't necessary today, but it is kept for tradition. So when a 1920x1080 TV gets a 1920x1080 HDMI signal, it tends to treat it like a TV signal: it crops the image down to roughly 1720x980 and then scales it back up to 1920x1080, losing the edges and introducing lots of scaling artefacts (since the image is no longer at native resolution). Because DVI ports tend to be used by computers, that would be unacceptable, so most TVs don't overscan on DVI ports. Even so, most TVs should have an option to disable overscan and force native-resolution (1:1 pixel mapping) mode.
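If you want to see the effect for yourself, here's a minimal Python sketch of that crop-then-upscale step (using Pillow; the filename `desktop.png` and the exact crop size are just illustrative assumptions):

```python
# Rough simulation of what an overscanning TV does to a 1920x1080 signal.
# Assumes Pillow is installed and "desktop.png" is a 1920x1080 screenshot.
from PIL import Image

src = Image.open("desktop.png")   # native 1920x1080 frame
w, h = src.size                   # expected: (1920, 1080)

# Step 1: the TV throws away the edges, keeping only the centre
# (the roughly 1720x980 region mentioned above).
crop_w, crop_h = 1720, 980
left, top = (w - crop_w) // 2, (h - crop_h) // 2
centre = src.crop((left, top, left + crop_w, top + crop_h))

# Step 2: it stretches that region back up to the panel's native
# resolution. This resample is where the softness and scaling
# artefacts come from - pixels no longer map 1:1 onto the panel.
overscanned = centre.resize((w, h), Image.BILINEAR)
overscanned.save("desktop_overscanned.png")
```

Compare the two files side by side: fine text and one-pixel lines in the output will be visibly blurred, which is essentially what the TV's overscan processing does to a computer desktop.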