Digital (DVI), like BNC connectors on a CRT, will in theory provide a better picture. However, in practice, or in a blind test, I find it very difficult to tell the difference. Much also depends on the quality of the video signal provided by the video card: some cards are notorious for poor analog signal quality, so the analog interface takes the rap when the real culprit is poor design on specific products.
The myth that DVI (digital) does not need to be converted to analog is just that, a myth. On an analog system, the RAMDAC (the chip that generates the video signal on the video card) has been integrated into the graphics controller chip for years now. Adding DVI means adding a DVI transmitter chip to the video card and a DVI receiver chip in the monitor.
In order to transmit the digital data from the video card in raw parallel form, the graphics chip would need a dedicated output, and the cable a separate wire, for each bit. The cable would need to contain more than 27 wires; you can imagine how thick it would be. Instead, DVI converts the parallel data into a small number of digital serial channels (the number varies with single- versus dual-link operation). The serial bit stream is then converted back on the monitor side, where the signal must be sampled using the pixel clock. You could argue that DVI actually increases the number of times the signal is processed, but since it is a digital signal this is rather a moot point.
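The parallel-versus-serial trade-off above can be sketched with some rough arithmetic. The figures below (24-bit color, three TMDS data channels, 10 link bits per 8-bit byte, 165 MHz single-link pixel clock) are standard single-link DVI numbers, not claims made by this article:

```python
# Rough sketch of single-link DVI (TMDS) serialization math.
BITS_PER_PIXEL = 24          # 8 bits each for R, G, B
TMDS_CHANNELS = 3            # one serial channel per color
TMDS_BITS_PER_BYTE = 10      # 8 data bits encoded into 10 link bits
PIXEL_CLOCK_HZ = 165e6       # single-link DVI maximum

# Raw parallel transmission would need one wire per bit, plus
# pixel clock, hsync, and vsync:
parallel_wires = BITS_PER_PIXEL + 3

# Serial transmission instead runs each TMDS channel at
# 10x the pixel clock:
bit_rate_per_channel = PIXEL_CLOCK_HZ * TMDS_BITS_PER_BYTE

print(parallel_wires)              # 27
print(bit_rate_per_channel / 1e9)  # 1.65 (Gbit/s per channel)
```

This is why three fast serial pairs plus a clock pair can replace a 27-plus-wire parallel bundle.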
Even monitors with a DVI interface convert to analog at the LCD driver level. The digital signal must be converted to an analog voltage in order to achieve the 16M colors. If the LCD were driven with purely digital (on/off) signals, only eight colors would be achievable. In order to generate the 16M colors, each red, green, and blue cell must be capable of stepping through 256 shades, and this is an analog function. In fact, most LCDs maintain the video signal in analog form through to the pixel drivers (NEC was the most notable producer of these).
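The eight-color versus 16M-color claim is simple combinatorics over the three subpixels per pixel:

```python
# Why on/off-only drive gives 8 colors while 256 shades per
# channel gives the familiar "16M" figure.
SUBPIXELS = 3                      # red, green, blue cells per pixel

binary_colors = 2 ** SUBPIXELS     # each cell fully on or fully off
shaded_colors = 256 ** SUBPIXELS   # each cell steps through 256 shades

print(binary_colors)   # 8
print(shaded_colors)   # 16777216 (~16.7M)
```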
Most of today's DVI implementations are rather limited: DVI driver chips top out at 1600 x 1200 at a 60 Hz refresh rate. Keep this in mind if you plan on upgrading. LCDs do not suffer from flicker, so 60 Hz is not a big deal unless you are playing games and want higher frame rates. Faster DVI transmitter and receiver chips are being developed, but upgrading to them means replacing both your video card and your monitor. With an analog connection you can upgrade either one without replacing the other. This is a major reason DVI will not show up on CRT monitors.
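A quick calculation shows why 1600 x 1200 at 60 Hz sits right at the single-link DVI ceiling. The blanking totals used here (2160 x 1250) are the standard VESA timing for this mode, and 165 MHz is the single-link clock limit; neither figure comes from the article itself:

```python
# Pixel clock budget for 1600 x 1200 @ 60 Hz over single-link DVI.
H_TOTAL = 2160        # active 1600 pixels + horizontal blanking
V_TOTAL = 1250        # active 1200 lines + vertical blanking
REFRESH_HZ = 60
SINGLE_LINK_LIMIT_HZ = 165e6

pixel_clock_hz = H_TOTAL * V_TOTAL * REFRESH_HZ
print(pixel_clock_hz / 1e6)   # 162.0 (MHz), just under the 165 MHz limit
```

There is essentially no headroom left for a higher resolution or refresh rate on a single link.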
Also, if you use DVI-I or DVI-D, I do not recommend hot swapping (unplugging) the monitor. Turn the computer and monitor off before unplugging it.
The chief advantage of DVI is improved image stability. The DVI interface provides a pixel clock, whereas with the analog interface the clock used to sample the analog video must be derived from the horizontal sync.
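The clock-recovery step can be sketched as follows. Over analog, all the monitor directly measures is the hsync rate; a PLL must multiply it by the assumed number of pixels per line to reconstruct a sampling clock, and any error in that multiplication shows up as misplaced samples. The timing totals below are the standard 1600 x 1200 @ 60 Hz numbers, used purely for illustration:

```python
# How an analog monitor reconstructs the pixel clock from hsync alone.
H_TOTAL = 2160                    # pixels per line, including blanking
V_TOTAL = 1250                    # lines per frame, including blanking
REFRESH_HZ = 60

hsync_hz = V_TOTAL * REFRESH_HZ       # the only clock the monitor sees
pixel_clock_hz = hsync_hz * H_TOTAL   # PLL multiplies hsync by H_TOTAL

print(hsync_hz / 1e3)        # 75.0 (kHz)
print(pixel_clock_hz / 1e6)  # 162.0 (MHz) recovered sampling clock
```

Over DVI none of this reconstruction is needed, because the pixel clock is transmitted on its own channel.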
Even though pixel jitter is possible with the analog interface, it is extremely rare, so if you are seeing noise, odds are something else is wrong.