Seriously, why the hell are video cards even being equipped with a VGA connector?

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: VirtualLarry
Originally posted by: jiffylube1024
If you're having problems I'd suspect your VGA cable as much as the adaptor, or try another adaptor to see if you have a bad adaptor. Perhaps a better quality VGA cable would fix your problem, because as JackBurton said, high-end 3d rendering cards come with all DVI ports and work fine with DVI-VGA adaptors.
It's not a bad problem, mind you, but a discernible one, for me. For text on a highly-contrasting background, at high resolution and smaller point sizes, you can see a slight degrading of the edges of the vertical lines on the characters. It doesn't make things unreadable, but it is noticeable. It's the same sort of thing that adding a mechanical analog VGA switch box inline with the signals will cause, although that effect is generally even more pronounced than the DVI-to-VGA adaptor usage.

But by the same argument that you are using, I could quote some posts of other people that have done A/B testing with their new higher-end LCDs, between the DVI and VGA ports, and they can't see any difference at all. I could also suggest that your LCD panel is not operating properly with the VGA input, and that you should consider getting a better-quality panel that doesn't suffer from that problem. (For the sake of logical argument here - I'm not realistically suggesting that. But it could easily just as well be true.)

Larry, I see where you're going with this argument, but I'm not talking about subjective tests here - I'm talking about how the technologies work objectively.

LCD monitors require extra digital-to-analog conversions to run on a VGA connection. Whether the user perceives a difference or not is irrelevant to my argument - when you run an LCD monitor over a VGA cable, the signal needs to be translated to an analog signal for the cable run, then gets reconverted into digital by the monitor's circuitry.

With a DVI-VGA adaptor from a DVI-I port, all you are doing is adding another 1.5" or so of cable between the video card and the monitor. This is nothing like adding a digital-analog-digital conversion. Like I said before, if the DVI-VGA adaptor is reducing the quality of the image on the CRT monitor, then it's poor shielding/wiring in either the video card or the adaptor, but it's a problem that's fixable based on the technology. Done correctly (i.e. not with cheap filters/cables), it should be no different from a straight VGA connector.

LCD on VGA, however, is suboptimal no matter how you look at it. It adds an unnecessary conversion step to the digital image.
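To put a rough number on that extra round trip, here's a quick sketch (nothing rigorous - the 0.7 V swing is the standard VGA level, but the 2 mV noise figure is an assumption I made up for illustration, not a measurement of any real cable):

import numpy as np

# Sketch of the VGA path: the card's DAC turns 8-bit pixel values into
# analog voltages, the cable run adds noise, and the LCD's own ADC
# re-quantizes the result back to 8 bits.
rng = np.random.default_rng(0)
pixels = rng.integers(0, 256, size=100_000)               # 8-bit source values

volts = pixels / 255 * 0.7                                # DAC: 0-0.7 V VGA swing
volts += rng.normal(0, 0.002, size=volts.size)            # assumed 2 mV RMS cable noise
recovered = np.clip(np.round(volts / 0.7 * 255), 0, 255)  # monitor's ADC re-quantizes

errors = np.count_nonzero(recovered != pixels)
print(f"{errors / pixels.size:.1%} of pixels shifted by at least one level")
# Over DVI the 8-bit values arrive intact, so this error source never appears.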


Originally posted by: jiffylube1024
I agree completely with JackBurton's initial statement - that all video cards (at least ones priced $100 and up) should come with two single-link DVI-I ports. The debate about higher-end cards coming with one or two (more expensive) dual-link ports is a different debate altogether.
Well then, I'm going to have to lump you in with Jack's position in this debate as well. Please don't inconvenience me and degrade my signal quality, or add costs to the video card that I have purchased, just to satisfy the demands of a minority of the current total users out there.

Then we'll have to agree to disagree on this point, because comparing adding 1.5" to the run to a VGA monitor with converting a digital signal to analog and then back to digital is comparing a molehill to a mountain.


I guess what really gets me is the blatant hypocrisy of Jack's position on the matter: that all cards should be DVI/DVI and there shouldn't be any DVI/VGA cards, while he also refuses to admit that one can easily choose to drive dual LCD displays, one using DVI and the other using VGA, and claims that my position of not removing the VGA output is somehow preventing him from running his displays that way.

Would you want to watch a DVD player through composite video if you had S-Video or component video plugs on your TV? While this isn't a directly analogous situation (for starters, all three of those interfaces are analog), it's similar. Why degrade your image quality through an inferior connection when adding the better connection type costs virtually nothing more and (contrary to your claims) doesn't affect the signal quality of the old type of connection?

Originally posted by: jiffylube1024
One bit of confusion that I got from your posts, Larry, is that you stated earlier something to the effect of "why not have a single dual-link DVI port on the back of video cards along with a VGA port and then just split up the dual-link DVI into two single-link connectors".
From what I have read (and I've read a fair bit), dual-link DVI, although it does have two TMDS transmitters, cannot drive two monitors. All the two TMDS transmitters on the card do is offer twice the bandwidth to a single monitor, for the purpose of driving monitors at resolutions that need more than 165MHz of bandwidth (like 20xx by 15xx).
Perhaps there is something inherent in the signalling protocol that prevents it, but I don't know the details. If one single TMDS link can drive one monitor, then why can't two links drive two monitors, whether they are output on one physical DVI port or two? As long as all of the signals are present, then it would seem logical that they could be split out that way, much like a dual-line phone RJ11 jack can be split out into two single-line RJ11 jacks and used that way with two single-line phones.

That's just the way dual-link works. The two TMDS transmitters are tied together to drive a single display at higher bandwidth. If I'm wrong on this, please provide me with a link to some technical documentation, because last I checked, dual-link DVI could drive only a single display (at any resolution, even low-bandwidth ones).

Now, the current implementation of dual-link TMDS transmitters may slave the sync signals of both of them to the same outputs coming off of the CRTC on the GPU; in that case, no, you couldn't easily use them to drive two separate displays. But if the cards are changed to support the config that I proposed, then this issue could be fixed as well. (I assume.)

That's changing the way the interface works - if it could be done cheaply it would make sense, but I'm pretty sure they set up dual-link DVI the way it is for a reason (simplicity/cost).
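Back-of-the-envelope, here's why those 20xx by 15xx modes blow past a single link (a sketch assuming roughly 25% blanking overhead - real timing standards vary, and reduced-blanking modes can squeeze borderline resolutions under the limit):

SINGLE_LINK_MAX_MHZ = 165  # single-link TMDS ceiling per the DVI 1.0 spec

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.25):
    # Approximate pixel clock a mode needs, in MHz (blanking factor assumed).
    return width * height * refresh_hz * blanking / 1e6

for mode in [(1600, 1200, 60), (2048, 1536, 60), (2560, 1600, 60)]:
    clk = pixel_clock_mhz(*mode)
    verdict = "single link OK" if clk <= SINGLE_LINK_MAX_MHZ else "needs dual link"
    print(f"{mode[0]}x{mode[1]} @ {mode[2]} Hz -> ~{clk:.0f} MHz ({verdict})")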
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: VirtualLarry
Originally posted by: jiffylube1024
LCDs, until something better comes around (in production, not on paper), are the logical, more practical replacement for CRTs. It's only stubborn people who've already made up their minds regarding LCDs before they've ever seen the latest and greatest who don't believe that LCD *should* replace CRT. Why use a display that takes up 5X more space and uses 2X the power (3X on smaller screens)?

Well, you're wrong there. The reason that I've rejected (thus far, although I do keep an open mind) all of the LCD displays that I've seen is that I am "picky" about my displays, and very much so. Being both a "gamer" (although on a budget) and someone who needs high-resolution text capability (though I don't do graphic design, so accurate color calibration isn't critical to me), there haven't been any LCDs that have impressed me yet. Sure, on the outside they look "damn sexy" compared to a big, ol' "bulky" CRT, but overall, CRTs are still far, far superior.

I know your intentions aren't malicious, but lumping LCD users into a group of bandwagon jumpers is a pretty direct way to kill your credibility in an argument.

Very few of the regulars here on AT that I have spoken to bought their LCDs for looks alone. I (like, I'm sure, 99% of the rest of the intelligent AT populace) bought my LCD because it was clearer and sharper than my CRT ever was (and the improved desk space and reduced heat helped). I don't give a crap what my LCD looks like; just give me a bezel that's not absurdly thick and I'm happy. I didn't buy the so-called "sexy" Samsung 172X; I bought the cheaper, better-performing and "uglier" 710T, because when I compared the display on those two screens and my previous 19" Sony CRT, I liked the image quality on the 710T the best (text looks sharper on the LCD as well).

It's kind of like the split between a laptop and a desktop. In terms of performance, modern-day laptops are certainly competitive, just as LCDs are now finally competitive with CRTs for some tasks. But overall, a "power desktop" still has the performance edge, and as long as you aren't space- or power-constrained, desktops still have the advantage. I'm not going to suggest that everyone has to continue to use a desktop because of that - not at all. Those that have chosen to replace their desktops with laptops are generally happy with that choice. But neither does that phenomenon mean that: 1) desktops are going away completely, or that 2) in the future, everyone will be using a laptop instead of a desktop, or that 3) current desktop users should be forced to give them up and switch to a laptop, like the rest of the crowd that has already switched. It's just not going to happen. Not while desktops (and by analogy here, CRTs) still have that upper hand.

Now, obviously, from your viewpoint, you are a "switcher". But please don't try to force everyone else to jump on your bandwagon because of it.


Coming soon to theatres: 'Switchers'. Rated R. A group of computer rebels who tastelessly upgraded their monitors are hunted down and killed for being the bandwagon-hopping dogs they are. Warning: this film contains scenes of graphic gore and violence.

Honestly, the way you describe it, it sounds like LCD adopters shot your dog or something. 'Please don't try to force your beliefs on me'... Sheesh!!!
 

VirtualLarry

No Lifer
Aug 25, 2001
56,544
10,171
126
Originally posted by: jiffylube1024
Larry, I see where you're going with this argument, but I'm not talking about subjective tests here - I'm talking about how the technologies work objectively.
I'm quite aware of how the technologies work, objectively. Both have signal-degradation issues, objectively speaking. Whether or not those are perceptible to the user is a subjective question, and we could both argue that until all of the oxygen on earth was used up, I'm sure.

Originally posted by: jiffylube1024
LCD monitors require extra digital-to-analog conversions to run on a VGA connection. Whether the user perceives a difference or not is irrelevant to my argument - when you run an LCD monitor over a VGA cable, the signal needs to be translated to an analog signal for the cable run, then gets reconverted into digital by the monitor's circuitry.
Yes, it results in some level of signal degradation. It is more pronounced when running at higher resolutions/refresh-rates, is it not?

I'm not entirely certain about the need for additional A-to-D and then again D-to-A stages in all cases, though. The LCD panel itself, physically, is still analog, not digital. I'm actually wondering whether, on lower-end panels, they haven't implemented a way to drive those panels directly using the analog signal, cutting out a bit of additional cost as well as additional signal-processing steps.

Originally posted by: jiffylube1024
With a DVI-VGA adaptor from a DVI-I port, all you are doing is adding another 1.5" or so of cable between the video card and the monitor. This is nothing like adding a digital-analog-digital conversion.
No, but again, "objectively speaking", it still results in some level of signal degradation. It changes the transmission-line characteristics, reducing the overall effective analog bandwidth, which will only become noticeable if one tries to use that level of bandwidth, which only happens when running at higher resolutions/refresh rates.
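To put rough numbers on it: 1024x768 at 60Hz only needs a pixel clock on the order of 65MHz, while 1600x1200 at 85Hz needs roughly 230MHz once blanking is counted, so the same small impedance discontinuity from the adaptor eats a much larger fraction of the available margin in the second case. (Ballpark figures only.)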

Originally posted by: jiffylube1024
Like I said before, if the DVI-VGA adaptor is reducing the quality of the image on the CRT monitor, then it's poor shielding/wiring in either the video card or the adaptor, but it's a problem that's fixable based on the technology. Done correctly (i.e. not with cheap filters/cables), it should be no different from a straight VGA connector.
The exact same argument could be made for driving an LCD panel based on an analog input signal. If it results in noticeably increased image degradation when using a VGA analog input to the panel, then obviously your problem is that you are simply using a low-quality panel, and you should use a better-quality one.

Originally posted by: jiffylube1024
LCD on VGA, however, is suboptimal no matter how you look at it. It adds an unnecessary conversion step to the digital image.
Again, the LCD panel itself is still physically analog. We don't live in "The Matrix" yet.

Originally posted by: jiffylube1024
Then we'll have to agree to disagree on this point, because comparing adding 1.5" to the run to a VGA monitor with converting a digital signal to analog and then back to digital is comparing a molehill to a mountain.
The simple point, which was apparently overlooked, is that they both degrade the image quality of the signal.

The corresponding point was: why should I make a sacrifice in terms of my display's signal/image quality for someone else's display needs? Therefore, the best solution would be to have two different lines of video cards. The facts that: 1) a dual lineup of video cards isn't currently the norm, 2) one DVI-I and one VGA port is currently the most cost-efficient solution in the market, and 3) dual DVI-I outputs would cost more, all seem to have been overlooked by Jack.

Claiming that DVI-to-VGA adaptors do not degrade the signal at all, whilst at the same time claiming that forcing LCD users to use the analog VGA input on their panel would degrade the signal far too much, is both blatantly, objectively false and subjectively dismissive and hypocritical.

Originally posted by: jiffylube1024
Would you want to watch a DVD player through composite video if you had S-Video or component video plugs on your TV?
While this isn't a directly analogous situation (for starters, all three of those interfaces are analog), it's similar. Why degrade your image quality through an inferior connection when adding the better connection type costs virtually nothing more and (contrary to your claims) doesn't affect the signal quality of the old type of connection?
I'm not sure what you mean by "contrary to my claims", because I made no claims about composite/S-Video/component-video signal quality in this thread, that I am aware of. And adding those additional analog input types does not cost "virtually nothing" more; otherwise we would have been seeing televisions, even the cheapo made-in-China ones sold at Wal-Mart, with included S-Video inputs for a long time now. Yet even today, that seems to be a rarity until you move up to the "home theater" quality TV sets.

I guess what I am saying is, if what you are claiming about nearly-zero incremental cost is true, then why aren't all of the lower-end LCD displays sporting DVI inputs? (And then, by extension, even the lower-end video cards would also have the necessary DVI outputs to drive them included by default as well.) But they aren't, are they? I wonder why... (hint - the answer is "cost").

I guess the point is, subjectively speaking, NO, I wouldn't want to watch anything via a means that would result in a degraded signal, if I had a better-quality means of obtaining the signal available. So in that case, the ideal solution would be two separate sets of cards, one with DVI/VGA, and another with DVI/DVI.

(Or my suggested solution: run two DVI-D TMDS signals out via one DVI physical connector, and then split them via a dongle. Since the signal is still digital at that point, it shouldn't result in the same sorts of degradation that the analog DVI-I to VGA adaptor causes, because the digital logic receivers should still be able to discern the proper thresholds for the digital signal levels, and reject minor analog noise overlaid on that signal.)
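A quick sketch of why a digital link shrugs off that kind of noise (idealized - real TMDS also does transition-minimized encoding and DC balancing, and the 50 mV noise figure is just an assumption for illustration):

import numpy as np

rng = np.random.default_rng(1)
bits = rng.integers(0, 2, size=1_000_000)

tx = np.where(bits == 1, 0.5, -0.5)            # idealized differential swing, volts
rx = tx + rng.normal(0, 0.05, size=tx.size)    # assumed 50 mV RMS channel noise
recovered = (rx > 0).astype(int)               # receiver only decides high vs. low

print("bit errors:", np.count_nonzero(recovered != bits))  # ~0 at this noise level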

Originally posted by: jiffylube1024
That's just the way dual-link works. The two TMDS transmitters are tied together to drive a single display at higher bandwidth. If I'm wrong on this, please provide me with a link to some technical documentation, because last I checked, dual-link DVI could drive only a single display (at any resolution, even low-bandwidth ones).

That's changing the way the interface works - if it could be done cheaply it would make sense, but I'm pretty sure they set up dual-link DVI the way it is for a reason (simplicity/cost).
But I'm saying that there is nothing inherent in either the signalling specs or the physical connector that requires that, as far as I am aware - only that the current implementation of it is likely done that way for cost reasons.
I will research that some more; I'm a bit curious myself. Indeed, it would be a relatively nice feature of modern video cards if this is possible - they could run dual TMDS links to one DVI connector, then split them off and drive one LCD/DVI display while using the other for an HDMI display - basically getting a TV-out for more advanced HDTVs without having to add another physical DVI port on the backplate. That could be a win in the future, and of course also useful for dual DVI LCD setups too.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,544
10,171
126
Originally posted by: jiffylube1024
I know your intentions aren't malicious, but lumping LCD users into a group of bandwagon jumpers is a pretty direct way to kill your credibility in an argument.
Well, when they (ok, Jack at least) start to say things like, "CRTs are dead! All the major CRT makers are leaving the market! You will end up using an LCD, they are the wave of the future, you'll see!"... what am I supposed to think?
Originally posted by: jiffylube1024
Very few of the regulars here on AT that I have spoken to bought their LCDs because they do what their CRTs did for them better.
Alright, that supports my position that LCDs haven't totally eclipsed CRTs yet.
Originally posted by: jiffylube1024
I (like, I'm sure, 99% of the rest of the intelligent AT populace) bought my LCD because it was clearer and sharper than my CRT ever was (and the improved desk space and reduced heat helped). I don't give a crap what my LCD looks like; just give me a bezel that's not absurdly thick and I'm happy. I didn't buy the so-called "sexy" Samsung 172X; I bought the cheaper, better-performing and "uglier" 710T, because when I compared the display on those two screens and my previous 19" Sony CRT, I liked the image quality on the 710T the best (text looks sharper on the LCD as well).
Well, that's commendable then. Some people have (apparently) replaced their CRTs with LCDs purely as a fashion statement, as if it were the "in" thing to do. That disturbs me slightly, but then again, I've never been much of a "fashion" person.
Originally posted by: jiffylube1024
Now, obviously, from your viewpoint, you are a "switcher". But please don't try to force everyone else to jump on your bandwagon because of it.
Coming soon to theatres: 'Switchers'. Rated R. A group of computer rebels who tastelessly upgraded their monitors are hunted down and killed for being the bandwagon-hopping dogs they are. Warning: this film contains scenes of graphic gore and violence.
Honestly, the way you describe it, it sounds like LCD adopters shot your dog or something. 'Please don't try to force your beliefs on me'... Sheesh!!!
LOL. I think that you glossed over some of the statements that Jack made that were quite clearly "anti-CRT", although a bit obliquely stated.
PS. Apple Computer would likely attempt to sue any theatre showing that movie, for obvious reasons.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,544
10,171
126
Ok, this is for Jack, regarding driving two 30" Apple Cinema LCD displays. From here:
ATI's inability to drive two 30" Cinema Displays is what gave NVIDIA the design win with the 2.5GHz G5 system, but ATI insists that very few users will have two 30" displays and thus the flexibility of one ADC and one dual-link DVI connector is more than enough for the majority of the market. So immediately off the bat, if you want to run two 30" Cinema Displays, then your only option continues to be the GeForce 6800 Ultra DDL.
Sorry, that does look like your only option on the market right now, as far as gaming cards go. Best of luck.

For Jiffy, I found this, which talks about both dual- and single-link DVI, and HDMI, and it seems to indicate that current dual-link DVI cables only include seven signal channels instead of eight. I'm going to double-check the dual-link DVI-D pinouts and see if the connector has provisions for eight or only seven as well. From that, it seems that each of the RGB data signals gets its own "channel", and then there is a fourth control channel, for a single-link DVI-D signal. Dual-link adds another set of three RGB "channels", but no additional control channel, which I would assume is the key necessary to implementing a dual-link to dual single-link DVI splitter like I had envisioned. Edit: That may not be right - this link talks about the HDMI spec, and it appears that what the first doc referred to as a "control" channel is actually a clock signal, with three (RGB) digital data signals strobed along with it. Still, without two separate clock-signal channels, dual single-link from one connector wouldn't be feasible.
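Here's the channel layout as I understand it from those docs (a sketch of my reading, not authoritative):

# TMDS channels on a DVI-D connector, as best I can tell from the pinout.
SINGLE_LINK = {
    "data0": "blue + HSync/VSync",
    "data1": "green",
    "data2": "red",
    "clock": "TMDS clock (shared)",
}
DUAL_LINK_EXTRA = {
    "data3": "blue, link 2",
    "data4": "green, link 2",
    "data5": "red, link 2",
    # no second clock pair - both links strobe off the one shared clock,
    # which is what sinks the dual-to-two-single splitter dongle idea
}
print(len(SINGLE_LINK) + len(DUAL_LINK_EXTRA), "channels total")  # the seven signals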

That page also includes some very interesting info on HDMI-input HDTVs, and digital audio signals, and why there might be an issue with driving them from a PC.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: slash196
Man, this is a flamewar between people with WAAAY too much disposable income.

and time... 50 thousand quotes on this page alone...
 

g8wayrebel

Senior member
Nov 15, 2004
694
0
0
Originally posted by: JackBurton
Originally posted by: LTC8K6
I found several high-end cards with dual DVI at Newegg. X800XL, 6800GT, etc. Some in AGP.
I need one using PCI-E. If I pay $400+ for a video card, I think dual DVI isn't too much to ask for. If you can find a PCI-E X800XT PE with dual DVI for under $500, please let me know.

Buy a system from the Dell outlet with the card you want, replace it with the cheaper one, then sell the system for what you paid, less the cost of the replacement card. DONE!
 

housecat

Banned
Oct 20, 2004
1,426
0
0
All this debate and you know what the sad (for some) truth is?

Everyone is moving to dual DVI. Not DVI/VGA.
Jack is right, you guys are wrong.

It's much worse to go from analog to DVI than it is to go from DVI to analog.

Companies are choosing not to cater to the tightwads and they are putting dual DVIs on the $400 cards.
That trend will only continue.

Too bad.
Enjoy your CRT and analog. It looks like crap.

Toodles.
 

Spike

Diamond Member
Aug 27, 2001
6,770
1
81
Originally posted by: housecat
All this debate and you know what the sad (for some) truth is?

Everyone is moving to dual DVI. Not DVI/VGA.
Jack is right, you guys are wrong.

It's much worse to go from analog to DVI than it is to go from DVI to analog.

Companies are choosing not to cater to the tightwads and they are putting dual DVIs on the $400 cards.
That trend will only continue.

Too bad.
Enjoy your CRT and analog. It looks like crap.

Toodles.

This makes me wonder if you have ever owned a nice CRT. Here is my current monitor setup:

BFG 6800GT
Nokia 445xpro 21" (20" viewable) running at 1600x1200 @ 86 Hz over VGA
Dell 2005fpw 20.1" running at 1680x1050 @ 60 Hz over DVI

I have found pros and cons for both. Size and weight aside, the Dell has better text, and that's about the only main category it wins on. For movies it is cool being widescreen (my CRT is 4:3), but the picture actually looks slightly worse than on my CRT. I am not sure if that is from running a smaller-res signal on a fixed-res LCD or what.

For gaming it is a toss-up. I love gaming in widescreen, but the scaling is better on the CRT, as not all games will run at 1680x1050 smoothly (Doom 3, anyone?).

For picture viewing the LCD wins: brighter and crisper colors to my eyes. For IQ as a whole it is a toss-up again. If I look at the LCD from some angles (normal viewing angles, not some strange, standing-on-my-head-eating-Chex angles) I see color distortions, mainly red tinting (this is my second 2005; both had this issue).

Overall, for what I do, I love both my monitors and would not give either one up. Because of this I LOVE having a DVI and a VGA port so I can run both without an adapter. Personally I feel they should continue to offer cards with both, as a previous experience of mine running my CRT on the DVI with the adapter resulted in some strange IQ issues.

just my $.02

Important note: My Nokia CRT is 5 years old and is an invar shadow-mask CRT. It is NOT one of the newer, ultra-sharp displays that you can get. I imagine if I had one of those, the LCD would not do as well against it as it has against my old baby.

-spike

 

MisterChief

Banned
Dec 26, 2004
1,128
0
0
"Sit back and watch this thread turn into a BBQ"

CRT monitors are better for gaming. That's all there is to it. LCDs ghost. CRTs don't. LCDs get dead pixels. CRTs don't.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
I was trying to be an arse. But I'll reply to your legitimate claim.
Yes, the CRTs can be sweet. But I wouldn't say a Diamondtron can match a 30-inch LCD like the Apple Cinema, or a 24" Dell... or (IMO) even the 2005/2001FP.

They've just been outclassed. But yeah, the Diamondtrons are awesome... they're just now discontinued. With those gone... it's all over for CRT for me.

So we have:
everyone going to dual DVI
Diamondtron (best CRTs) discontinued

Summing everything up, Jack is right here. Defending DVI/VGA cards and CRTs is a battle that is already lost.

It will just take time for people to wake up and realize it.
 

Spike

Diamond Member
Aug 27, 2001
6,770
1
81
Originally posted by: housecat
I was trying to be an arse. But I'll reply to your legitimate claim.
Yes, the CRTs can be sweet. But I wouldn't say a Diamondtron can match a 30-inch LCD like the Apple Cinema, or a 24" Dell... or (IMO) even the 2005/2001FP.

They've just been outclassed. But yeah, the Diamondtrons are awesome... they're just now discontinued. With those gone... it's all over for CRT for me.

So we have:
everyone going to dual DVI
Diamondtron (best CRTs) discontinued

Summing everything up, Jack is right here. Defending DVI/VGA cards and CRTs is a battle that is already lost.

It will just take time for people to wake up and realize it.


This sounds a lot like the thread where you told everyone that AGP was dead and that to buy one would mean you are a moron. I seem to remember you reversing your claim and admitting that you can get a better AGP system right now for less than a PCIe one. Not saying it won't be the future, but AGP still has a place.

I think the LCD/CRT question is a closer match than even the GPU interface question, but remember how that last thread went. Just because you are "certain" does not mean it's the truth. Things happen that NONE of us nerds can predict. You like LCDs better and have no problem with dual DVI output. I like both and want both outputs; those are just our opinions.

-spike
 

housecat

Banned
Oct 20, 2004
1,426
0
0
There is no predicting needed, Spike.

The best CRTs are gone. All the high-end cards are going dual DVI.
It's not that I am certain; it's going off what has already happened... and what will only go further.
 

imported_humey

Senior member
Nov 9, 2004
863
0
0
I want you all to look at my monitor. I said I'd buy it for years and have now finally done so (would I really buy this this late in the day if LCD was better?).

http://www.necmitsubishi.com/products/P...lassificationFamily=1&Classification=1

2048 x 1536 @ 50 to 86 Hz

This one below seems special, and it's new to the site by a matter of months, so CRT manufacturing ain't dead, but it's too expensive.

http://www.necmitsubishi.com/products/P...lassificationFamily=1&Classification=1

Some kind of ultra-wide color gamut CRT monitor, with a price of:

Estimated Street Price(US) : $4999.99
Estimated Street Price(CAD): $7899.00

:Q

I know no TFT LCD can compete with that for gaming, or if I choose to watch movies on it, though I normally use TV-out anyhow.

That will tide me over till 2006, when Samsung and others said shallow-tube CRTs will be in our stores. Why do we still need CRT? Well, it's because HDTV is going to take off, especially in the UK in 2006-07, and we need resolutions up to 19**x1080, and LCD is a POS IMO for that, and plasma is worse; the only thing is they can make bigger screens. No idea what size in inches the new CRTs will be available in.
 

JackBurton

Lifer
Jul 18, 2000
15,993
14
81
Originally posted by: VirtualLarry
Ok, this is for Jack, regarding driving two 30" Apple Cinema LCD displays. From here:
ATI's inability to drive two 30" Cinema Displays is what gave NVIDIA the design win with the 2.5GHz G5 system, but ATI insists that very few users will have two 30" displays and thus the flexibility of one ADC and one dual-link DVI connector is more than enough for the majority of the market. So immediately off the bat, if you want to run two 30" Cinema Displays, then your only option continues to be the GeForce 6800 Ultra DDL.
Sorry, that does look like your only option on the market right now, as far as gaming cards go. Best of luck.
I see that my last question made you go out and do some research before answering too quickly and sticking your foot in your mouth:
Are you saying there are no gaming video cards out now that support dual 30" LCD displays?
I could have told you Apple's the only kid on the block that offers that. I find it hard to believe that someone who knows so much about LCDs :roll: would not have known about that. Weird.
 

JackBurton

Lifer
Jul 18, 2000
15,993
14
81
Ok, I'm going to finish this up, because I'm not going to waste any more time quoting Larry and then replying. Dual DVI WILL become the standard, like it or not. Newer cards WON'T be moving backwards and offering dual VGA connections. As the number of CRT users shrinks ever so quickly, due to the fact that VERY few manufacturers will be producing CRTs, it is inevitable that dual DVI will become the standard. If you like your VGA connection, I suggest you keep the video card you have now. My prediction is that the next generation of high-end video cards will offer more DVI/DVI and fewer DVI/VGA cards. Hopefully the next-gen cards will offer dual-link DVI. That's all that's holding me back from buying my beautiful 30" display.
 

JackBurton

Lifer
Jul 18, 2000
15,993
14
81
Originally posted by: humey
I want you all to look at my monitor. I said I'd buy it for years and have now finally done so (would I really buy this this late in the day if LCD was better?).

http://www.necmitsubishi.com/products/P...lassificationFamily=1&Classification=1

2048 x 1536 @ 50 to 86 Hz

This one below seems special, and it's new to the site by a matter of months, so CRT manufacturing ain't dead, but it's too expensive.

http://www.necmitsubishi.com/products/P...lassificationFamily=1&Classification=1

Some kind of ultra-wide color gamut CRT monitor, with a price of:

Estimated Street Price(US) : $4999.99
Estimated Street Price(CAD): $7899.00

:Q

I know no TFT LCD can compete with that for gaming, or if I choose to watch movies on it, though I normally use TV-out anyhow.

That will tide me over till 2006, when Samsung and others said shallow-tube CRTs will be in our stores. Why do we still need CRT? Well, it's because HDTV is going to take off, especially in the UK in 2006-07, and we need resolutions up to 19**x1080, and LCD is a POS IMO for that, and plasma is worse; the only thing is they can make bigger screens. No idea what size in inches the new CRTs will be available in.
You buy your little 20" (viewable) low-res $4999 CRT, and I'll take my high-res (2560 x 1600) 30" LCD. You'll be happy, and I'll be happier.

Oh yeah man, the monitor you just bought? I bought one with specs like that... 5 YRS AGO!

Sony CPD-G500
2048 x 1536 @ 75Hz
 

housecat

Banned
Oct 20, 2004
1,426
0
0
No Diamondtrons + card companies who know better than you guys going for dual DVI = not good for the CRT market or analog-connection fanboys.


LCDs overtook not only on these merits, but on size. That 30" Apple and the Dell simply dwarf those "22-inch" (20-inch viewable) CRTs.

*looks at 2005FPW*

Yep, no dead pixels. What a shame.
 

Bar81

Banned
Mar 25, 2004
1,835
0
0
Originally posted by: JackBurton
Ok, I'm going to finish this up, because I'm not going to waste any more time quoting Larry and then replying. Dual DVI WILL become the standard, like it or not. Newer cards WON'T be moving backwards and offering dual VGA connections. As the number of CRT users shrinks ever so quickly, due to the fact that VERY few manufacturers will be producing CRTs, it is inevitable that dual DVI will become the standard. If you like your VGA connection, I suggest you keep the video card you have now. My prediction is that the next generation of high-end video cards will offer more DVI/DVI and fewer DVI/VGA cards. Hopefully the next-gen cards will offer dual-link DVI. That's all that's holding me back from buying my beautiful 30" display.


That's great and all, but it doesn't explain why manufacturers should equip video cards with dual DVI as a standard. When LCDs start taking up a substantial share of the market, your theory will make sense, but until then the DVI/VGA combination is the best solution.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
What YOU think they should do, and what they really are doing...

are two completely different things.
 

Bar81

Banned
Mar 25, 2004
1,835
0
0
Except they're not doing what YOU think they are doing. Try taking a look around the marketplace for the reality of the situation.
 

imported_humey

Senior member
Nov 9, 2004
863
0
0
JackBurton, don't talk out your butthole and try to pick a fight with me as well as others. That monitor wasn't out 5 yrs ago, and CRT is like 75 yrs old - the TV was invented here in SCOTLAND. I never bought the bottom URL one that's thousands of dollars - are you mad? You obviously ain't reading my posts properly. You can stick your big LCD up your butt sideways.

I am playing, btw, but less of the cheek. IMO LCD is a POS for gaming (8ms / 50fps); go read all over the web, even the 3dman site, and his hot chick of a wife will tell you she won't get an LCD till response times are down. I really don't give a toss about this flame war going on, but IMO I'll stick to CRT, and later on, shallow CRT.
 

alius

Member
Jan 13, 2003
82
0
0
I couldn't help but respond to this oft-mentioned idea from the first page.

Since when are $300+ cards mainstream? Onboard graphics are the norm, and your lower- to mid-end prebuilt machines generally come with budget cards.

 

LoserSlayer

Senior member
Nov 8, 2004
464
0
0
To those saying LCDs suck: my little HP 15" LCD is sweet. It's better than the 15" CRT flat screen on the family computer. Besides the fact that the picture looks different at an angle, and that it is naturally a bit brighter than a CRT, they're OK. It took less than two weeks for me to get used to it. Then there is the problem of applying pressure to the screen, but you shouldn't have your hand on the screen anyway, unless you are cleaning it.
 