Seriously, why the hell are video cards even being equipped with a VGA connector?


Capt Caveman

Lifer
Jan 30, 2005
34,543
651
126
My two cents. I'm about to build a new rig and just got the ATI X800XL for $290. However, I am seriously considering returning it and getting the Powercolor X800XL for $370 b/c it has dual DVI, VIVO and possibly faster DDR3. I currently have a dual display set-up w/ a 23" LCD HP L2335 at 1920x1200 and a 19" CRT Viewsonic GS790 at 1600x1200. Even though the Viewsonic is a CRT Graphics model, I prefer my HP. After moving up to a widescreen LCD, even with the higher pixel count and dot pitch on the CRT, the larger image size at such a high resolution makes everything so much nicer to view. From two and a half feet away, my LCD has a more vivid picture. Eventually, I'll dork out and pick up a 24" Dell LCD and I won't miss a CRT at all.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,544
10,171
126
Originally posted by: JackBurton
You're almost doing that. Like I said, CRT users can do dual displays using the VGA connector AND the DVI connector via an adapter. LCD users (with DVI connections) can't do dual displays AT ALL unless they have a card that supports dual DVI connections. So you're basically taking away my ability to run dual LCD monitors whereas if I had my way, all you'd have is a 2.5" adapter sticking out of your video card but you'll be able to do everything you've always done before.
What the he** are you talking about? You're the one that suggested getting rid of the VGA altogether. My suggestion was to use a splitter off of the DVI port, to power two DVI displays. Did I suggest getting rid of the DVI port altogether, in the same manner that you did the VGA port? No.
Originally posted by: JackBurton
I don't have a problem using a dongle. But LCDs are getting better and better VERY quickly. If a manufacturer uses a single dual link DVI connection, I'll still only be able to run one Apple 30" LCD (you aren't going to be able to use a dongle to push two 30" LCDs on a single DVI connection). Although I most likely won't get another 30" LCD, I'd at least like to have that option, ESPECIALLY with a $500-$600 video card.
Oh, so now you don't just want two DVIs, you want two dual-link DVI ports. Please, make up your mind. Conveniently, you change the specs mid-discussion, to thwart the possibility of simply being happy with a splitter. So now you want four TMDS transmitters altogether. Oh, btw, guess what - none of the current GPUs support four integrated TMDS transmitters. So are you implying that not only should all mainstream users subsidize dual DVIs, now they should also subsidize dual dual-link DVIs? IOW, requiring *two* additional discrete TMDS transmitter chips onboard, besides the two (currently) default integrated ones? Your position just gets more humorous by the post.
Originally posted by: JackBurton
Let me clue you in - the market tends to favor the majority, when it comes to cost-benefit and incremental-cost issues. If you are in the minority, you will pay more, that's just how it is.
Let me clue you in, CRT users are going to quickly become part of the minority whether you like it or not. Like I said, Sony and Mitsubishi have pretty much dropped out of the CRT business. And those were the TOP dogs in CRT manufacturing. So I think the only person that needs to get a clue here is you.
Hey, I can only point things out to you, if you choose not to accept them, then so be it. But at least, stop whining about your incorrect purchase, and blaming everyone else for it. CRTs and/or the analog VGA connector, aren't going away anytime soon. At least not in the next 3-5 years, anyways.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,544
10,171
126
Originally posted by: whoisrube
BNC connectors for the win!

^^^ Amen! Preach it brutha! BNC or bust!

(Actually, he does have a point, they generally do offer the best display/signal quality out of all of the current display cabling solutions. My pair of 20" Sony tubes prefer them too.)
 

ribbon13

Diamond Member
Feb 1, 2005
9,343
0
0
Holy fvcking flamers, Batman, where's the Nomex?

Here's your high end dual DVI right here
HD15 can't touch DVI for hi-res. "Max Resolution@32bit Color: 3840 x 2400"

But the truth is CRTs are more economical, so the majority has CRT. The major segment of gamers prefers CRT. Thus the largest target demographic still uses a native HD15 connector... so DVI+HD15 is the common setup, and it makes sense business-wise.

http://www.newegg.com/app/ViewProductDesc.asp?description=14-127-156&depa=1
http://www.monarchcomputer.com/Merchant...=PROD&Store_Code=M&Product_Code=190671
 

VirtualLarry

No Lifer
Aug 25, 2001
56,544
10,171
126
Originally posted by: JackBurton
Ok, I think you are missing the point. Instead of making a DVI/VGA card AND a DVI/DVI card, wouldn't it be easier and possibly cheaper to just make one version (DVI/DVI)? The people with CRTs will have no problem with the included adapter, and the LCD crowd will be happy that they can add another LCD in the future. What's the down side here?

I just noticed this post of yours - this makes it clear to me that you aren't arguing for dual-DVI at all, but rather - you are directly arguing for the disappearance of CRTs altogether. Why didn't you just say that? After all, even if they offered two models, a DVI/VGA (for us), and a DVI/DVI model (for you), wouldn't that make you happy? But no, for some reason, those of us with CRTs should be denied a VGA analog connector completely, according to you, right? Truly bizarre thinking at work here...

PS. Why the heck are you at all worried about "cheaper", if you can afford dual 30" Apple Cinema Display LCDs? Surely you can also afford to plunk down another $600 for a dual dual-link DVI video card to drive them, right? If you can eat 'cake', why do you insist on trying to deny "the common man" their 'bread'? Is it not good enough that you have cake for yourself?
 

dragonballgtz

Banned
Mar 9, 2001
2,334
0
0
Originally posted by: LTC8K6
Most people are still using onboard video.
Most people are still using CRTs.
CRTs are still better than LCDs.

The DVI standard is a ways off, methinks.

Originally posted by: Jeff7181
Hey shut up... don't take away my 15-pin VGA until 17 inch LCD's cost $100.

 

JackBurton

Lifer
Jul 18, 2000
15,993
14
81
Originally posted by: VirtualLarry
Originally posted by: JackBurton
Ok, I think you are missing the point. Instead of making a DVI/VGA card AND a DVI/DVI card, wouldn't it be easier and possibly cheaper to just make one version (DVI/DVI)? The people with CRTs will have no problem with the included adapter, and the LCD crowd will be happy that they can add another LCD in the future. What's the down side here?

I just noticed this post of yours - this makes it clear to me that you aren't arguing for dual-DVI at all, but rather - you are directly arguing for the disappearance of CRTs altogether. Why didn't you just say that? After all, even if they offered two models, a DVI/VGA (for us), and a DVI/DVI model (for you), wouldn't that make you happy? But no, for some reason, those of us with CRTs should be denied a VGA analog connector completely, according to you, right? Truly bizarre thinking at work here...

PS. Why the heck are you at all worried about "cheaper", if you can afford dual 30" Apple Cinema Display LCDs? Surely you can also afford to plunk down another $600 for a dual dual-link DVI video card to drive them, right? If you can eat 'cake', why do you insist on trying to deny "the common man" their 'bread'? Is it not good enough that you have cake for yourself?
Do you not fvcking read my posts? I DO want to replace the VGA connector with a DVI connector, but I DON'T want to keep CRT users from using high end gaming cards (they can always use THE SUPPLIED DVI>VGA adapter). You have this idea that I'm trying to get rid of CRTs, which I'm not. The solution I'm providing NEVER alienates CRT users. YOU on the other hand want to keep an inevitable move from happening (CRT>LCD). I SAID I HAVE A FVCKING SONY G500! I'M PLANNING TO USE THE DVI>VGA DONGLE MYSELF UNTIL I MOVE FULLY TO DUAL LCDS. When I get the card I'll be running my G500 AND an LCD. But I would like to have the option to move to dual LCDs in the future. I can't do that with a fvcking VGA/DVI card. Ideally, I'd like to have a dual link Apple 6800U card made for PCs. That would be the perfect card. And Apple, which is known for its high prices, is selling the card for a very reasonable $599. So don't give me this whining about doing away with CRTs, because I've NEVER pushed for a solution in this thread to keep people from using a CRT on the type of card I'd like to become standard. The WORST thing that would happen to CRT users is that they'd have to use a dongle. Guess what, I'll be using that same adapter for a while if I get a dual DVI card.
After all, even if they offered two models, a DVI/VGA (for us), and a DVI/DVI model (for you), wouldn't that make you happy?
That would make me extremely happy. I'd be completely fine with that. Unfortunately, like I said WAY back in this thread, a slower dual DVI card (X800XT PE) is going for WAY more than a faster card (X850XT) that uses DVI/VGA ($600+ vs $480). That's my problem. If manufacturers can put out two different cards (DVI/VGA, DVI/DVI) at the same price, that would be perfect for me. HOWEVER, I don't think manufacturers would do something like that because of the added cost of producing two different models. But if they can, hey, I'd be completely happy with that.
 

JackBurton

Lifer
Jul 18, 2000
15,993
14
81
Originally posted by: ribbon13
Holy fvcking flamers, Batman, where's the Nomex?

Here's your high end dual DVI right here
HD15 can't touch DVI for hi-res. "Max Resolution@32bit Color: 3840 x 2400"
The little problem with that card is that it sucks for games. Workstation cards have had dual link for a while. Hopefully the next generation of high end gaming video cards will have dual link.
 

JackBurton

Lifer
Jul 18, 2000
15,993
14
81
Originally posted by: VirtualLarry
Originally posted by: JackBurton
I don't have a problem using a dongle. But LCDs are getting better and better VERY quickly. If a manufacturer uses a single dual link DVI connection, I'll still only be able to run one Apple 30" LCD (you aren't going to be able to use a dongle to push two 30" LCDs on a single DVI connection). Although I most likely won't get another 30" LCD, I'd at least like to have that option, ESPECIALLY with a $500-$600 video card.
Oh, so now you don't just want two DVIs, you want two dual-link DVI ports. Please, make up your mind. Conveniently, you change the specs mid-discussion, to thwart the possibility of simply being happy with a splitter. So now you want four TMDS transmitters altogether. Oh, btw, guess what - none of the current GPUs support four integrated TMDS transmitters. So are you implying that not only should all mainstream users subsidize dual DVIs, now they should also subsidize dual dual-link DVIs? IOW, requiring *two* additional discrete TMDS transmitter chips onboard, besides the two (currently) default integrated ones? Your position just gets more humorous by the post.
I needed to comment on this last statement you made. IDEALLY I'd like a card to support two 30" Apple displays. I just like to keep my options open when I pay $600 for a video card. Are you saying there are no gaming video cards out now that support dual 30" LCD displays?

The way I see the future is dual link SLI video cards that can push gaming resolution to a new level. I think it will be a few years before we get there, but I think that's the way we will be moving.
 

L00PY

Golden Member
Sep 14, 2001
1,101
0
0
Originally posted by: VirtualLarry
Originally posted by: L00PY
As for the CRTs are faster than LCDs argument, it totally depends on how you slice it. I'd even argue that it's physically impossible for a CRT to be faster than an LCD.
What are you talking about? You have that exactly backwards.
Actually, that's almost my point. This concept of "faster" is rather subjective and hard to compare across different technologies.

I'm sure you'll agree that a CRT refreshing at 60Hz looks like crap compared to a "faster" CRT capable of refreshing at 75Hz. I can see the difference between 75Hz and 85Hz so an even "faster" CRT capable of 85Hz is a must for my eyes. I need for a CRT to be fast enough so that I don't see that annoying refresh rate flicker.

Due to the difference in technologies, an LCD refreshing at 60Hz, despite the slower rate, is "faster" to me than a CRT refreshing at 75Hz. I can "see" the phosphors turning on and off for a CRT at 75Hz. In a static image, the LCD pixels are always on and I never "see" them turn off. It is physically impossible for a CRT to refresh a phosphor faster than an LCD that never turns off the pixel. It's all a matter of how you slice it (and what qualifiers you choose to use).
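Just to put numbers on it, here's a quick sketch (plain arithmetic, nothing more) of how long each refresh interval is at the usual CRT rates - the gap a phosphor has to decay across, and the interval an LCD pixel simply holds its value through:

```python
# Time between refreshes at common CRT rates -- the gap during which a CRT
# phosphor decays (perceived as flicker), while an LCD pixel simply holds
# its last value for the whole frame.

for hz in (60, 75, 85):
    print(f"{hz} Hz -> one refresh every {1000 / hz:.1f} ms")
```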
 

gobucks

Golden Member
Oct 22, 2004
1,166
0
0
Alright, I see a lot of people flipping out about the prospect of having to buy a dual DVI card. What's the problem? Any dual DVI card ships with DVI->D-SUB adaptors, so if you have a CRT, you can hook it up, no problem. I had my old CRT hooked up to my 6600GT via this arrangement, before I got my L90D+, and the CRT looked just fine. No image degradation, nothing. This makes sense, of course, because the GPU creates the image digitally (being a PC component) and then has to convert it to analog for a CRT. LCDs can work directly with the digital signal; the only thing required is a means of conveyance, so basically DVI just sends the data straight to the LCD. In the case of the adaptor, it just takes the digital signal, converts it to analog, and sends it to the CRT. No problem. It hasn't been changed in any way along the DVI interface; it's the exact same signal you'd get with a D-SUB. As such, why would anyone be super pissed if GPUs started coming standard with dual DVI? After all, eventually LCDs will have even lower response times, and if not, then OLEDs will. Then the single reason people are holding on to CRTs will be gone, so people will be moving to LCDs. Why not allow yourself to be prepared for this? It doesn't hurt you to use a little adaptor dongle, but it does hurt people like me with 2 LCDs if we DON'T have dual DVI.
 

Bar81

Banned
Mar 25, 2004
1,835
0
0
This is a silly argument. If you want dual DVI, buy a dual DVI card and stop bitching about it; just don't try to convert the world to your point of view, and don't cry about having to pay a little extra for a feature 99% of users don't need.

As an aside, it probably costs more to put a DVI connector on a card versus VGA, so that probably explains it. Also, I wouldn't be so sure that the D/A conversion on those small adapters is as good as that used on the card itself.
 

RelaxTheMind

Platinum Member
Oct 15, 2002
2,245
0
76
Dual DVI will be more common in the future. Don't wet your pants just yet. Give it a while. If you can't wait, why don't you just manufacture your own cards?

I still have my 6 year old 32" Mitsubishi CRT along with my 19" Acer 99SL CRT. I had a cheapo Newegg Acer 19" LCD but I gave it to my parents after 2 weeks of use. Yeah, I hate the DVI>VGA adapter. My gaming years on the Mitsubishi have so far prevented me from getting an LCD. Then again, these monitors together weigh about 2 tons, take up a huge amount of desk space and I can't use a conventional desk. There was a noticeable difference in the electric bill as well when I acquired the 32".

Ghosting didn't bother me until I started playing darker FPS/RPG games.

32" CRT acquired from a school closing = Free

Don't hate.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,544
10,171
126
Originally posted by: JackBurton
Do you not fvcking read my posts? I DO want to replace the VGA connector with a DVI connector, but I DON'T want to keep CRT users from using high end gaming cards (they can always use THE SUPPLIED DVI>VGA adapter). You have this idea that I'm trying to get rid of CRTs, which I'm not. The solution I'm providing NEVER alienates CRT users. YOU on the other hand want to keep an inevitable move from happening (CRT>LCD). I SAID I HAVE A FVCKING SONY G500! I'M PLANNING TO USE THE DVI>VGA DONGLE MYSELF UNTIL I MOVE FULLY TO DUAL LCDS. When I get the card I'll be running my G500 AND an LCD. But I would like to have the option to move to dual LCDs in the future. I can't do that with a fvcking VGA/DVI card.
Sure you can. Just use the VGA input on one of the LCDs. Sure, that's sub-optimal in terms of signal quality, but then again, so is forcing an analog CRT user to use a DVI-to-VGA dongle. Touché! (And I could always bring out some comments from other LCD users here on this board, with higher-end LCDs, that apparently don't see any real difference between the DVI connection and the VGA connection. Personally, I find that hard to believe, but it proves that the "degradation", although real, is a subjective matter of whether the viewer finds it acceptable or not.)
Originally posted by: JackBurton
Ideally, I'd like to have a dual link Apple 6800U card made for PCs. That would be the perfect card. And Apple, which is known for its high prices, is selling the card for a very reasonable $599. So don't give me this whining about doing away with CRTs, because I've NEVER pushed for a solution in this thread to keep people from using a CRT on the type of card I'd like to become standard. The WORST thing that would happen to CRT users is that they'd have to use a dongle. Guess what, I'll be using that same adapter for a while if I get a dual DVI card.
Well, guess what, I can turn around and use the same argument against your position - I'm not stopping anyone from running dual LCD displays, just use the VGA analog port to drive one of them!
Originally posted by: JackBurton
After all, even if they offered two models, a DVI/VGA (for us), and a DVI/DVI model (for you), wouldn't that make you happy?
That would make me extremely happy. I'd be completely fine with that.
Unfortunately, like I said WAY back in this thread, a slower dual DVI card (X800XT PE) is going for WAY more than a faster card (X850XT) that uses DVI/VGA ($600+ vs $480). That's my problem. If manufacturers can put out two different cards (DVI/VGA, DVI/DVI) at the same price, that would be perfect for me. HOWEVER, I don't think manufacturers would do something like that because of the added cost of producing two different models. But if they can, hey, I'd be completely happy with that.
Here's a clue - what you are asking for, for mfgs to add support for dual dual-link DVI ports, would add to the cost of mfg a lot more than just having DVI/VGA and DVI/DVI configurations.

Btw, for those that are wondering, those DVI-to-VGA dongles don't have RAMDACs inside of them; the video card itself generates the analog signal, which is sent to the DVI-I connector on the card, and the DVI-to-VGA adaptor just connects those wires carrying the analog RGB signals already present to the proper pins on the VGA HD15 connector side. The adaptors are nothing more than a cheap chunk of plastic with a few soldered wires wrapped in foil for shielding. Very cheap. That's also why there is a (slight) signal degradation caused by the adaptor. If you hooked the two signals up to a decent O-scope with 200MHz/channel bandwidth, I'm sure that you could probably see the difference. I wish I had one because I would post some pics to help people understand.
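To make the "just wires" point concrete, here's a rough sketch of the pass-through wiring a passive adaptor like that does. The pin numbers are from memory and purely illustrative - treat them as an assumption and check the actual DVI-I and HD15 pinout tables before relying on them:

```python
# Rough sketch: what a passive DVI-I -> VGA (HD15) adaptor actually does.
# There is no RAMDAC or any active circuitry in the dongle; the card's own
# DAC drives the analog pins of the DVI-I connector, and the adaptor simply
# wires them through to the corresponding HD15 pins.
#
# Pin assignments below are illustrative and from memory -- check the DVI 1.0
# and VGA pinout tables before relying on them.

DVI_I_TO_HD15 = {
    "C1 (analog red)":       "HD15 pin 1 (red)",
    "C2 (analog green)":     "HD15 pin 2 (green)",
    "C3 (analog blue)":      "HD15 pin 3 (blue)",
    "C4 (analog h-sync)":    "HD15 pin 13 (h-sync)",
    "C5 (analog ground)":    "HD15 pins 6/7/8 (RGB returns)",
    "pin 8 (analog v-sync)": "HD15 pin 14 (v-sync)",
    "pin 6 (DDC clock)":     "HD15 pin 15 (DDC clock)",
    "pin 7 (DDC data)":      "HD15 pin 12 (DDC data)",
}

for dvi_pin, vga_pin in DVI_I_TO_HD15.items():
    print(f"{dvi_pin:28s} -> {vga_pin}")
```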
 

imported_humey

Senior member
Nov 9, 2004
863
0
0
I got a Chaintech 6800U with dual DVI and use one dongle to connect to my CRT, as I hate LCDs.

I see no quality loss and I'm more than happy; completed Far Cry a second time round and HL2, and never saw any of these funny artifacts like the water or grass messed up in HL2.

Yes, if you want dual DVI, there are plenty of cards; even my pal's POS 5600 non-Ultra 256MB card has dual DVI, I think it's a Leadtek.
 

L00PY

Golden Member
Sep 14, 2001
1,101
0
0
Originally posted by: ribbon13
Actually it's cheaper to make Dual DVI cards because you don't need a DAC circuit.
Yes and no. It would be cheaper if you were to make cards with dual DVI-D connectors. Unfortunately we're talking about Dual DVI-I cards which by definition require DAC circuits.

 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: VirtualLarry
Originally posted by: JackBurton
Do you not fvcking read my posts? I DO want to replace the VGA connector with a DVI connector, but I DON'T want to keep CRT users from using high end gaming cards (they can always use THE SUPPLIED DVI>VGA adapter). You have this idea that I'm trying to get rid of CRTs, which I'm not. The solution I'm providing NEVER alienates CRT users. YOU on the other hand want to keep an inevitable move from happening (CRT>LCD). I SAID I HAVE A FVCKING SONY G500! I'M PLANNING TO USE THE DVI>VGA DONGLE MYSELF UNTIL I MOVE FULLY TO DUAL LCDS. When I get the card I'll be running my G500 AND an LCD. But I would like to have the option to move to dual LCDs in the future. I can't do that with a fvcking VGA/DVI card.
Sure you can. Just use the VGA input on one of the LCDs. Sure, that's sub-optimal in terms of signal quality, but then again, so is forcing an analog CRT user to use a DVI-to-VGA dongle. Touché! (And I could always bring out some comments from other LCD users here on this board, with higher-end LCDs, that apparently don't see any real difference between the DVI connection and the VGA connection. Personally, I find that hard to believe, but it proves that the "degradation", although real, is a subjective matter of whether the viewer finds it acceptable or not.)

Larry, I don't know if you're being intentionally obtuse to piss off Jack, or are ignorant of the facts, or just want to be argumentative.

Comparing running dual flat screen monitors with one DVI and one VGA port versus one or two CRT monitors using DVI-VGA adaptors is apples to oranges.

Running an LCD in a VGA port is a big step down in clarity and sharpness since you're running a digital monitor through an analog connection.

Personally, on my dual LCD setup, the LCD using the VGA port is not only less clear or sharp than the DVI one, but it seems to also have vaguely discernible "refresh" lines going down the screen (although this is probably due to the shielding on my (poor) VGA cable).

The DVI port on the back of video cards (or more correctly, the single-link DVI-I port found on most Radeon/GeForce cards) carries both a DVI connection and a standard VGA connection (that's what those four pins plus the straight line are for, although I'm sure you know this already).

Running a DVI equipped monitor through a VGA port is completely different to running a CRT monitor through a DVI port via an adaptor (DVI-I to DVI-A). Sure there might be ever so slight signal degradation due to the fact that you're using a $2 part with most definitely not top-of-the-line circuitry inside, but even the basic DVI-VGA adaptors that come with your average GeForce/Radeon card should produce an artifact-free image. If you're having problems I'd suspect your VGA cable as much as the adaptor, or try another adaptor to see if you have a bad adaptor. Perhaps a better quality VGA cable would fix your problem, because as JackBurton said, high-end 3d rendering cards come with all DVI ports and work fine with DVI-VGA adaptors. Otherwise you can always buy a third party DVI-VGA adaptor for $10-20, or perhaps a snake-oil powered $50 branded DVI-VGA cable to ease your perceived 'artifacts'.

If you're that suspect of the tiny run of copper wires in a DVI-VGA adaptor degrading your signal then why not take it a step further and suspect the copper going from the DAC to the port on the back of the video card as well?


I agree completely with JackBurton's initial statement- that all video cards (at least ones priced $100 and up) should come with two single-link DVI-I ports. The debate about higher end cards coming with one or two (more expensive) dual-link ports is an entirely different debate altogether.

One bit of confusion that I got from your posts, Larry, is that you stated earlier something to the effect of "why not have a single dual-link DVI port on the back of video cards along with a VGA port and then just split up the dual-link DVI into two single-link connectors".

From what I have read (and I've read a fair bit) dual-link DVI, although it does have two TMDS transmitters, cannot drive two monitors. All the two TMDS transmitters on the card do is offer twice the bandwidth to a single monitor for the purpose of driving monitors at resolutions that need more than 165MHz of bandwidth (like 20xx by 15xx).
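To put a rough number on where that 165MHz single-link limit bites, here's a back-of-the-envelope pixel-clock estimate. The ~30% blanking overhead is an assumption (reduced-blanking timings need noticeably less), so the exact figures are ballpark only:

```python
# Back-of-the-envelope pixel-clock estimate vs. the single-link DVI limit.
# Assumes roughly 30% extra for blanking intervals (GTF-style timings);
# reduced-blanking modes need noticeably less, so these are ballpark figures.
# For instance, 1920x1200@60 is borderline: it fits a single link with
# reduced blanking (~154 MHz) even though a GTF-style estimate exceeds 165 MHz.

SINGLE_LINK_LIMIT_MHZ = 165.0   # one TMDS link
BLANKING_OVERHEAD     = 1.30    # rough assumption

def pixel_clock_mhz(width, height, refresh_hz):
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h, hz in [(1600, 1200, 60), (2048, 1536, 60), (2560, 1600, 60)]:
    clk = pixel_clock_mhz(w, h, hz)
    link = "single link" if clk <= SINGLE_LINK_LIMIT_MHZ else "dual link"
    print(f"{w}x{h}@{hz}Hz ~ {clk:.0f} MHz -> needs {link}")
```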
 

Bar81

Banned
Mar 25, 2004
1,835
0
0
Originally posted by: L00PY
Originally posted by: ribbon13
Actually it's cheaper to make Dual DVI cards because you don't need a DAC circuit.
Yes and no. It would be cheaper if you were to make cards with dual DVI-D connectors. Unfortunately we're talking about Dual DVI-I cards which by definition require DAC circuits.


Interesting, I hadn't realized that they were using DVI-I, which would explain why those cheesy adapters work. I'm quite surprised the port isn't DVI-D as they can cut out the expense of the DAC. I'm guessing they'd rather have universal compatibility for those looking to drive two CRT monitors but also give the LCD crowd something to like. It makes sense when you think about the number of CRT users, even gamers, out there in comparison to LCD converts.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Regarding "better" replacements to LCD, there seems to be a lot of misinformation or half-truths.

First of all, as of right now, in production, there are two kinds of computer monitors the average user can buy from the store: CRTs and LCDs. That won't change much over the next five years, so get used to picking one or the other, because OLED/SED/"Thin CRT" (most probably referring to FEDs - "Field Emission Displays") are years away from commercial production.

Perhaps HDMI (which uses the same spec for display as DVI, just with a simpler pinout, thinner cables and more bandwidth, and which can carry sound as well as video) may make it onto video cards in the future, but I bet DVI will be the standard for another generation at the very least, since the benefits of HDMI are not nearly as applicable to the desktop PC market as to the TV/home theatre market. DVI monitors still work the same with an HDMI connection, they just need an adaptor, and it's a standard change that's unnecessary for computer monitors.

-----------------------------

Regarding screen technologies, here's what I've read:

OLED (organic LEDs) are still years away from production as computer monitors. OLEDs are already being mass-produced in sizes up to (2" or 4", I forget which one), and are used in some cell phones, etc already. However, a commercial display based on OLED is still years off of production, and nobody has a working OLED above ~15" or so.

SEDs are a very interesting new "thin CRT" technology from Toshiba that will launch in 2006 (supposed to be late this year but that will be in extremely limited quantities). SED is a derivative of the FED technology and will occupy a niche in the 40" to 60" + TV market. Due to the nature of SEDs (oversimplified, SEDs are analogous to having thousands of tiny thin transistor-driven CRT TV's in a single display), a screen below 40" is not commercially feasible.

FED technology, apart from SEDs, has produced nothing yet, although from what I've read this is the technology Samsung, Sony and many other companies will be using to make thin CRTs (although I've only read about thin CRTs for the TV market, and even then they aren't ready for production yet).

I don't see what the real problem is for including two DVI-I ports on video cards and supplying 1-2 DVI-VGA adaptors (don't give me the "slight video degradation" argument for DVI-VGA adaptors please).

Regarding the "most users have onboard video driving a single CRT" argument, it's moot for video cards - those users obviously don't have onboard video, they are spending $200-500 for a new one, and are quite often running dual displays these days.


Although CRT monitors still have their followers, such as certain picky gamers, be their perceptions of ghosting on <=12ms screens real or merely imagined, LCD is a technology that is only getting better, bigger and cheaper. LCDs are a lot more practical for 99% of joe-schmoe computer users, since they offer comparable performance in a much smaller footprint (with lower power consumption as another benefit).

Why exactly does an internet surfing housewife need to use a bulky power-hungry CRT when she can use a smaller (and soon to be cheaper) LCD screen that doesn't eat up her entire desk?

For office work, LCDs are far more practical, greatly reducing power consumption and requiring a smaller workspace.

Personally, as a gamer, I'm just as happy with my 12ms Samsung 710T as I ever was with a CRT, which would turn my room into an oven and didn't give me any desk space to work with.

LCDs, until something better comes around (in production, not on paper), are the logical, more practical replacement for CRTs. It's only stubborn people who've already made up their minds regarding LCDs before they've ever seen the latest/greatest that don't believe that LCD *should* replace CRT. Why use a display that takes up 5X more space and uses 2X the power (3X on smaller screens)?
 

VirtualLarry

No Lifer
Aug 25, 2001
56,544
10,171
126
Originally posted by: jiffylube1024
Larry, I don't know if you're being intentionally obtuse to piss off Jack, or are ignorant of the facts, or just want to be argumentative.
I was just trying to throw his argument back at him, to point out how pointless it is.

Originally posted by: jiffylube1024
Comparing running dual flat screen monitors with one DVI and one VGA port versus one or two CRT monitors using DVI-VGA adaptors is apples to oranges.
Well, the signal issues with running an LCD via VGA may be more subjectively pronounced to some. But the point was that there is degradation - in either case.

Originally posted by: jiffylube1024
Personally, on my dual LCD setup, the LCD using the VGA port is not only less clear or sharp than the DVI one, but it seems to also have vaguely discernible "refresh" lines going down the screen (although this is probably due to the shielding on my (poor) VGA cable).
That's some other problem, probably something with the panel itself. Sounds like it's not syncing up exactly correctly or something. (Is it kind of like those older TV sets that had "rolling" programs, due to not properly adjusting the vertical-hold control?) Others have reported that on higher-quality panels, there is no discernible difference between the inputs.

Originally posted by: jiffylube1024
Running a DVI equipped monitor through a VGA port is completely different to running a CRT monitor through a DVI port via an adaptor (DVI-I to DVI-A). Sure there might be ever so slight signal degradation due to the fact that you're using a $2 part with most definitely not top-of-the-line circuitry inside, but even the basic DVI-VGA adaptors that come with your average GeForce/Radeon card should produce an artifact-free image.
Not at high resolutions, like 1600x1200 or 1920x1480 (?) or something.

Originally posted by: jiffylube1024
If you're having problems I'd suspect your VGA cable as much as the adaptor, or try another adaptor to see if you have a bad adaptor. Perhaps a better quality VGA cable would fix your problem, because as JackBurton said, high-end 3d rendering cards come with all DVI ports and work fine with DVI-VGA adaptors.
It's not a bad problem, mind you, but a discernible one, for me. For text on a highly-contrasting background, at high resolution and smaller point sizes, you can see a slight degrading in the edges of the vertical lines on the characters. It doesn't make things unreadable, but it is noticeable. It's the same sort of thing that adding a mechanical analog VGA switch box inline with the signals will cause, although that effect is generally even more pronounced than the DVI-to-VGA adaptor usage.

But by the same argument that you are using, I could quote some posts of other people that have done A/B testing with their new higher-end LCDs, between the DVI and VGA ports, and they can't see any difference at all. I could also suggest that your LCD panel is not operating properly with the VGA input, and that you should consider getting a better-quality panel that doesn't suffer from that problem. (For the sake of logical argument here - I'm not realistically suggesting that. But it could easily just as well be true.)

Originally posted by: jiffylube1024
If you're that suspect of the tiny run of copper wires in a DVI-VGA adaptor degrading your signal then why not take it a step further and suspect the copper going from the DAC to the port on the back of the video card?
Well, that has just as much to do with the signal-quality, technically, as well. In fact, that stage is just as important for DVI-D signals too, and often is lacking, as the TH and ExtremeTech DVI shootouts revealed. In those cases (and as has been documented in the past here as well), you might well be better off using the VGA outputs from your card to drive your LCD panel, if: 1) the TMDS outputs from your card had poor signal quality, and 2) the VGA input on the LCD panel was of sufficient quality to not cause perceptible visual degradation as compared to an otherwise sufficient-quality TMDS/DVI-D signal, at the resolutions that the user was to view it at.

Originally posted by: jiffylube1024
I agree completely with JackBurton's initial statement- that all video cards (at least ones priced $100 and up) should come with two single-link DVI-I ports. The debate about higher end cards coming with one or two (more expensive) dual-link ports is an entirely different debate altogether.
Well then, I'm going to have to lump you in with Jack's position in this debate as well. Please don't inconvenience me and degrade my signal quality, or add costs to the video card that I have purchased, just to satisfy the demands of a minority of the current total users out there.

I guess what really gets me is the blatant hypocrisy of Jack's position on the matter: that all cards should be DVI/DVI and there shouldn't be any DVI/VGA cards, while he also refuses to admit that one can easily choose to drive dual LCD displays, one using the DVI and another using VGA, if one so chooses, claiming that somehow my position of not removing the VGA output is somehow preventing him from running his displays that way.

Originally posted by: jiffylube1024
One bit of confusion that I got from your posts, Larry, is that you stated earlier something to the effect of "why not have a single dual-link DVI port on the back of video cards along with a VGA port and then just split up the dual-link DVI into two single-link connectors".
From what I have read (and I've read a fair bit) dual-link DVI, although it does have two TMDS transmitters, cannot drive two monitors. All the two TMDS transmitters on the card do is offer twice the bandwidth to a single monitor for the purpose of driving monitors at resolutions that need more than 165MHz of bandwidth (like 20xx by 15xx).
Perhaps there is something inherent in the signalling protocol that prevents it, but I don't know the details. If one single TMDS link can drive one monitor, then why can't two links drive two monitors? Whether or not they are output on one physical DVI port or two? As long as all of the signals are present, then it would seem logical that they could be split out that way, much like a dual-line phone RJ11 jack can be split out into two single-line RJ11 jacks and used that way with two single-line phones.

Now, the current implementation of single dual-link TMDS transmitters may slave the sync signals of both of them to the same outputs coming off of the CRTC on the GPU; in that case, no, you couldn't easily use them to drive two separate displays, but if the cards are changed to support the config that I proposed, then this issue could be changed and fixed as well. (I assume.)
 

VirtualLarry

No Lifer
Aug 25, 2001
56,544
10,171
126
Originally posted by: jiffylube1024
LCDs, until something better comes around (in production, not on paper), are the logical, more practical replacement for CRTs. It's only stubborn people who've already made up their minds regarding LCDs before they've ever seen the latest/greatest that don't believe that LCD *should* replace CRT. Why use a display that takes up 5X more space and uses 2X the power (3X on smaller screens)?

Well, you're wrong there; the reason that I've rejected (thus far, although I do keep an open mind) all of the LCD displays that I've seen is that I am "picky" about my displays, and very much so. Being both a "gamer" (although on a budget) and someone that needs high-resolution text capability (though I don't do graphic design, so accurate color calibration isn't critical to me), there haven't been any LCDs that have impressed me yet. Sure, on the outside, they look "damn sexy" compared to a big, ol' "bulky" CRT, but overall, CRTs are still far, far superior.

For example, I have a friend with an (admittedly older) FP1800 (I think) LCD panel. The owner of it loves it to pieces, even though the front of the screen is covered in dirty fingerprints and tar residue from being in a smoker's house. He uses it for gaming, and swears that he can't see any ghosting on it, but when we play LAN games, I can see it ghosting all to heck playing UT. I could never use that panel. Likewise, if I am sitting next to him, and he's trying to explain something about whatever new MMORPG-of-the-week that he's playing, I notice the color-shift issue due to viewing angle in a fairly pronounced way. Not to mention, the issues of the glow from the backlight creeping around the edges of the screen, and the lack of "true black" of the panel. Slightly newer panels are better in many of those regards (although the backlight showing through seems to be a recurring issue still on many panels), but they're still not up to the standards that I've chosen, standards which most high-end CRTs meet for me. I'll admit it, I'm picky as heck about some things. All of my (current) displays are either NEC or Sony tubes, even my TV. Other people may not be as picky, and/or may find some attributes of CRTs displeasing to them (size/power/weight being one of the major ones). If they're happy with their LCDs, then fine, but that doesn't mean that CRT users should be rounded up and forced to give up their displays, either.

It's kind of like the split between a laptop and a desktop. In terms of performance, modern-day laptops are certainly competitive, in that aspect, just as LCDs are now finally competitive with CRTs for some tasks. But overall, a "power desktop" still has the overall performance edge, and as long as you aren't space/power constrained, they still have the advantage. I'm not going to suggest that everyone has to continue to use a desktop because of that - not at all. Those that have chosen to replace their desktops with laptops are generally happy with that choice. But neither does that phenomenon mean that: 1) desktops are going away completely, or that 2) in the future, everyone will be using a laptop instead of a desktop, or that 3) current desktop users should be forced to give them up and switch to a laptop, like the rest of the crowd that has already switched. It's just not going to happen. Not while desktops (and by analogy here, CRTs) still have that upper hand.

Now, obviously, from your viewpoint, you are a "switcher". But please don't try to force everyone else to jump on your bandwagon because of it.

PS. I wonder... why hasn't anyone come up with a "digital VGA" cable - one that plugs into a DVI-D signal source at the card, and then has a RAMDAC in the cable itself, and converts to analog right at the interface to the CRT? Granted, some issues like analog ground-plane reference stability, especially in the presence of analog noise coupled by the cable, might be an issue, but assuming those issues could be overcome on the CRT side... why not? That would actually likely minimize the analog signal-quality degradation as much as possible, assuming that the TMDS signalling could handle being used that way, much like a digital audio out used to drive a component stereo system/decoder setup. Hmm.

Edited to add that anecdote about my friend's LCD panel in the middle, and the idea I had at the end.
 