jiffylube1024
Diamond Member
Originally posted by: VirtualLarry
It's not a bad problem, mind you, but a discernible one, for me. For text on a highly contrasting background, at high resolution and smaller point sizes, you can see slight degrading along the edges of the vertical lines of the characters. It doesn't make things unreadable, but it is noticeable. It's the same sort of thing that adding a mechanical analog VGA switch box inline with the signals will cause, although that effect is generally even more pronounced than with the DVI-to-VGA adaptor.
Originally posted by: jiffylube1024
If you're having problems, I'd suspect your VGA cable as much as the adaptor; try another adaptor to see if you have a bad one. Perhaps a better-quality VGA cable would fix your problem, because as JackBurton said, high-end 3D rendering cards come with all-DVI ports and work fine with DVI-VGA adaptors.
But by the same argument you are using, I could quote posts from other people who have done A/B testing with their new higher-end LCDs, between the DVI and VGA ports, and they can't see any difference at all. I could also suggest that your LCD panel is not operating properly with the VGA input, and that you should consider getting a better-quality panel that doesn't suffer from that problem. (For the sake of logical argument here - I'm not realistically suggesting that. But it could just as easily be true.)
Larry, I see where you're going with this argument, but I'm not talking about subjective tests here - I'm talking about how the technologies work objectively.
LCD monitors require extra digital-to-analog conversions to run on a VGA connection. Whether the user perceives a difference or not is irrelevant to my argument - when you run an LCD monitor over a VGA cable, the signal has to be translated to analog for the cable run, then reconverted to digital by the monitor's circuitry.
With a DVI-VGA adaptor on a DVI-I port, all you are doing is adding another 1.5" or so of cable between the video card and the monitor. This is nothing like adding a digital-analog-digital conversion. Like I said before, if the DVI-VGA adaptor is reducing the quality of the image on a CRT monitor, the culprit is poor shielding or wiring on the video card or in the adaptor - a problem that's fixable, because nothing in the technology requires it. Done correctly (i.e. not with cheap filters/cables), it should be no different from a straight VGA connector.
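To illustrate why the adaptor is purely passive, here's a rough sketch (my own illustration, not from any spec excerpt in this thread) of the pin rewiring a DVI-I-to-VGA adaptor performs. The pin assignments are to the best of my recollection of the DVI and VGA specs, so treat them as approximate:

```python
# Sketch: a passive DVI-I -> VGA adaptor only remaps the analog pins that
# already exist on the DVI-I connector onto the VGA connector. There is
# no conversion circuitry in the adaptor at all.
# (Pin assignments below are my best recollection of the specs.)
DVI_I_TO_VGA = {
    "C1":    "VGA pin 1 (Red)",
    "C2":    "VGA pin 2 (Green)",
    "C3":    "VGA pin 3 (Blue)",
    "C4":    "VGA pin 13 (Horizontal sync)",
    "C5":    "VGA pins 6/7/8 (RGB ground returns)",
    "Pin 8": "VGA pin 14 (Vertical sync)",
}

for dvi_pin, vga_pin in DVI_I_TO_VGA.items():
    print(f"DVI-I {dvi_pin:>5} -> {vga_pin}")
```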
LCD on VGA, however, is suboptimal no matter how you look at it: it adds an unnecessary conversion step to a digital image.
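For anyone who wants to see the asymmetry concretely, here's a toy Python sketch of the two signal paths (the 0.7 V swing and the noise figure are assumptions for illustration, not measurements): the VGA path converts to analog, picks up cable noise, and re-quantizes, while the DVI path just passes bits through.

```python
import random

# Toy model: an 8-bit pixel over the VGA path is converted to an analog
# voltage (DAC), picks up a little noise on the cable run, then is
# re-sampled to 8 bits by the LCD's ADC. The DVI path copies bits as-is.
V_SWING = 0.7      # assumed analog video swing in volts
NOISE_V = 0.004    # assumed peak cable noise in volts

def vga_round_trip(pixel):
    volts = pixel / 255 * V_SWING               # DAC on the video card
    volts += random.uniform(-NOISE_V, NOISE_V)  # analog cable run
    return max(0, min(255, round(volts / V_SWING * 255)))  # ADC in the LCD

def dvi_path(pixel):
    return pixel  # digital end to end: no conversion, no added noise

pixels = [0, 1, 127, 128, 254, 255]
print([vga_round_trip(p) for p in pixels])  # slightly off, e.g. [0, 2, 126, ...]
print([dvi_path(p) for p in pixels])        # always exactly [0, 1, 127, 128, 254, 255]
```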
Well then, I'm going to have to lump you in with Jack's position in this debate as well. Please don't inconvenience me and degrade my signal quality, or add costs to the video card that I have purchased, just to satisfy the demands of a minority of the current total users out there.
Originally posted by: jiffylube1024
I agree completely with JackBurton's initial statement - that all video cards (at least ones priced $100 and up) should come with two single-link DVI-I ports. Whether higher-end cards should come with one or two (more expensive) dual-link ports is a different debate altogether.
Then we'll have to agree to disagree on this point, because comparing an extra 1.5" on the run to a VGA monitor with converting a digital signal to analog and back to digital again is comparing a molehill to a mountain.
I guess what really gets me is the blatant hypocrisy of Jack's position on the matter: that all cards should be DVI/DVI and there shouldn't be any DVI/VGA cards. He also refuses to admit that one can easily choose to drive dual LCD displays, one over DVI and the other over VGA, yet claims that my position of not removing the VGA output is somehow preventing him from running his displays that way.
Would you want to watch a DVD player through composite video if you had S-Video or component video plugs on your TV? While this isn't a directly analogous situation (for starters, all three of those interfaces are analog), it's similar. Why degrade your image quality through an inferior connection when adding the better connection type costs virtually nothing more and (contrary to your claims) doesn't affect the signal quality of the old connection type?
Perhaps there is something inherent in the signalling protocol that prevents it, but I don't know the details. If a single TMDS link can drive one monitor, then why can't two links drive two monitors, whether they are output on one physical DVI port or two? As long as all of the signals are present, it would seem logical that they could be split out that way, much like a dual-line RJ11 phone jack can be split into two single-line RJ11 jacks and used with two single-line phones.
Originally posted by: jiffylube1024
One bit of confusion I got from your posts, Larry, is that you stated earlier something to the effect of "why not have a single dual-link DVI port on the back of video cards along with a VGA port and then just split up the dual-link DVI into two single-link connectors".
From what I have read (and I've read a fair bit), dual-link DVI, although it does have two TMDS transmitters, cannot drive two monitors. All the two TMDS transmitters on the card do is offer twice the bandwidth to a single monitor, for the purpose of driving displays at resolutions that need more than 165 MHz of bandwidth (like 20xx by 15xx).
That's just the way dual-link works. The two TMDS transmitters are tied together to drive a single display at higher bandwidth. If I'm wrong on this, please provide me with a link to some technical documentation, because the last time I checked, dual-link DVI could drive only a single display (at any resolution, even low-bandwidth ones).
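To put rough numbers behind the 165 MHz single-link limit, here's a back-of-the-envelope calculation (the 25% blanking overhead is an assumed figure; real timings come from the GTF/CVT formulas, so the exact crossover point will differ):

```python
SINGLE_LINK_MHZ = 165      # max TMDS pixel clock per DVI link
BLANKING_OVERHEAD = 1.25   # assumed allowance for horizontal/vertical blanking

def pixel_clock_mhz(width, height, refresh_hz=60):
    # Approximate pixel clock needed to drive a given mode.
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h in [(1280, 1024), (1600, 1200), (2048, 1536)]:
    clk = pixel_clock_mhz(w, h)
    verdict = "single link OK" if clk <= SINGLE_LINK_MHZ else "needs dual link"
    print(f"{w}x{h} @ 60 Hz: ~{clk:.0f} MHz -> {verdict}")
```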
Now, the current implementation of dual-link TMDS transmitters may slave the sync signals of both to the same outputs coming off the CRTC on the GPU; in that case, no, you couldn't easily use them to drive two separate displays. But if the cards were changed to support the config that I proposed, then this issue could be fixed as well. (I assume.)
That's changing the way the interface works - if it could be done cheaply, it would make sense, but I'm pretty sure they set up dual-link DVI the way it is for a reason (simplicity/cost).