Originally posted by: jiffylube1024
Larry, I don't know if you're being intentionally obtuse to piss off Jack, or are ignorant of the facts, or just want to be argumentative.
I was just trying to throw his argument back at him, to point out how pointless it is.
Originally posted by: jiffylube1024
Comparing running dual flat screen monitors with one DVI and one VGA port versus one or two CRT monitors using DVI-VGA adaptors is apples to oranges.
Well, the signal issues with running an LCD via VGA may be more pronounced, subjectively, for some people. But the point was that there is degradation - in either case.
Originally posted by: jiffylube1024
Personally, on my dual LCD setup, the LCD using the VGA port is not only less clear and sharp than the DVI one, but it also seems to have vaguely discernible "refresh" lines going down the screen (although this is probably due to the shielding on my (poor) VGA cable).
That's some other problem, probably with the panel itself. It sounds like it's not syncing up correctly. (Is it anything like those older TV sets with "rolling" pictures, caused by a misadjusted vertical-hold control?) Others have reported that on higher-quality panels, there is no discernible difference between the inputs.
Originally posted by: jiffylube1024
Running a DVI-equipped monitor through a VGA port is completely different from running a CRT monitor through a DVI port via an adaptor (DVI-I to DVI-A). Sure, there might be ever-so-slight signal degradation, since you're using a $2 part that most definitely doesn't have top-of-the-line circuitry inside, but even the basic DVI-VGA adaptors that come with your average GeForce/Radeon card should produce an artifact-free image.
Not at high resolutions, like 1600x1200 or 1920x1440 or thereabouts.
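To put some rough numbers on that (a quick back-of-the-envelope Python sketch; the 1.4x blanking factor is my own crude stand-in for the real VESA GTF/CVT timing formulas, so treat the figures as approximate):

# Estimate the analog pixel clock a given mode demands at 60Hz.
# The 1.4x blanking factor is an assumed rough overhead, not spec-exact.
BLANKING = 1.4

def pixel_clock_mhz(w, h, hz=60):
    return w * h * hz * BLANKING / 1e6

for w, h in [(1024, 768), (1280, 1024), (1600, 1200), (1920, 1440)]:
    print(f"{w}x{h}@60Hz -> roughly {pixel_clock_mhz(w, h):.0f} MHz")

That puts 1600x1200@60Hz somewhere around 160 MHz through the DAC, the adaptor, and the cable, versus about 65 MHz at 1024x768 - which is why a marginal analog path that looks fine at low resolutions starts to smear edges at the high ones.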
Originally posted by: jiffylube1024
If you're having problems I'd suspect your VGA cable as much as the adaptor, or try another adaptor to see if you have a bad adaptor. Perhaps a better quality VGA cable would fix your problem, because as JackBurton said, high-end 3d rendering cards come with all DVI ports and work fine with DVI-VGA adaptors.
It's not a bad problem, mind you, but a discernible one, for me. For text on a high-contrast background, at high resolution and smaller point sizes, you can see slight degrading of the edges of the vertical strokes in the characters. It doesn't make things unreadable, but it is noticeable. It's the same sort of effect that adding a mechanical analog VGA switch box inline with the signals will cause, although that effect is generally even more pronounced than the DVI-to-VGA adaptor case.
But by the same argument you are using, I could quote posts from other people who have done A/B testing with their newer higher-end LCDs, between the DVI and VGA ports, and can't see any difference at all. I could also suggest that your LCD panel is not operating properly with the VGA input, and that you should consider getting a better-quality panel that doesn't suffer from that problem. (For the sake of logical argument here - I'm not realistically suggesting that. But it could just as easily be true.)
Originally posted by: jiffylube1024
If you're that suspect of the tiny run of copper wires in a DVI-VGA adaptor degrading your signal then why not take it a step further and suspect the copper going from the DAC to the port on the back of the video card?
Well, technically, that has just as much to do with signal quality as well. In fact, that stage is just as important for DVI-D signals, and it is often lacking, as the Tom's Hardware and ExtremeTech DVI shootouts revealed. In those cases (documented here in the past as well), you might well be better off using the VGA output from your card to drive your LCD panel, if: 1) the TMDS outputs from your card had poor signal quality, and 2) the VGA input on the LCD panel was of sufficient quality not to cause perceptible visual degradation, as compared to an otherwise sufficient-quality TMDS/DVI-D signal, at the resolutions the user views it at.
Originally posted by: jiffylube1024
I agree completely with JackBurton's initial statement - that all video cards (at least ones priced $100 and up) should come with two single-link DVI-I ports. The debate about higher-end cards coming with one or two (more expensive) dual-link ports is a different debate altogether.
Well then, I'm going to have to lump you in with Jack's position in this debate as well. Please don't inconvenience me and degrade my signal quality, or add costs to the video card that I have purchased, just to satisfy the demands of a minority of the current total users out there.
I guess what really gets me is the blatant hypocrisy of Jack's position on the matter: that all cards should be DVI/DVI and there shouldn't be any DVI/VGA cards. He also refuses to admit that one can easily choose to drive dual LCD displays, one using DVI and the other using VGA, if one so chooses, and claims that my position of not removing the VGA output is somehow preventing him from running his displays that way.
Originally posted by: jiffylube1024
One bit of confusion that I got from your posts Larry is that you stated earlier something to the effect of "why not have a single dual-link DVI port on the back of video cards along with a VGA port and then just split up the dual-link DVI into two single-link connectors".
From what I have read (and I've read a fair bit), dual-link DVI, although it does have two TMDS transmitters, cannot drive two monitors. All the two TMDS transmitters on the card do is offer twice the bandwidth to a single monitor, for the purpose of driving monitors at resolutions that need more than 165MHz of bandwidth (like 20xx by 15xx).
Perhaps there is something inherent in the signalling protocol that prevents it, but I don't know the details. If one single TMDS link can drive one monitor, then why can't two links drive two monitors, whether they are output on one physical DVI port or two? As long as all of the signals are present, it would seem logical that they could be split out that way, much like a dual-line phone RJ11 jack can be split into two single-line RJ11 jacks and used with two single-line phones.
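To make the bandwidth side of this concrete (same hedged Python sketch and the same assumed 1.4x blanking overhead as above; the 165 MHz figure is the single-link pixel-clock ceiling from the DVI spec):

# Does a given mode fit in one 165 MHz TMDS link, or would it need both?
SINGLE_LINK_MAX_MHZ = 165  # single-link pixel-clock ceiling per the DVI spec
BLANKING = 1.4             # same rough blanking-overhead assumption as before

def links_needed(w, h, hz=60):
    clk = w * h * hz * BLANKING / 1e6
    return clk, (1 if clk <= SINGLE_LINK_MAX_MHZ else 2)

for w, h in [(1600, 1200), (1920, 1440), (2048, 1536)]:
    clk, links = links_needed(w, h)
    print(f"{w}x{h}@60Hz ~ {clk:.0f} MHz -> {links} TMDS link(s)")

So the second link exists purely to get one display past that 165 MHz ceiling; whether its pins could instead carry an independent second single-link stream is exactly the sync question below.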
Now, the current implementation of dual-link TMDS transmitters may slave the sync signals of both links to the same outputs coming off of the CRTC on the GPU; in that case, no, you couldn't easily use them to drive two separate displays. But if the cards were changed to support the config that I proposed, then this issue could be fixed as well. (I assume.)