Originally posted by: cbehnken
You do realize it only takes about 3 months before a CRT is hopelessly out of focus don't you?
Wow. I stopped reading after that comment. I'll let others make their own decisions about the credibility of your further statements; I've already made up my mind.
Originally posted by: cbehnken
For someone who seems to witness "artifacts" with VGA/DVI adapters so easy you must have an exceptional tolerance to blurryness.
No, I've just used high-quality CRTs for most of my computing experience. When you decide to pay more than $10 for a cheapo 14" CRT, perhaps you too will realize the fallacy of your first statement.
Originally posted by: cbehnken
You are trying to say it is more likely for a LCD panel and it's hardware contollers to fail than a high voltage transformer and a CRT tube to fail?
No, I'm saying that it is just as likely, more or less, because aside from the tube (on a CRT) and the panel and backlight (on an LCD), the rest of the components are similar. Did you even realize that LCD backlights require a high-voltage step-up transformer too? Apparently not.
Originally posted by: cbehnken
Why don't you see how long other tubes last comparison to stolid state circuits. Take amplifiers for instance. It is not uncommon to replace tubes every 6-12 months. Rectifiers on the other hand last years.
Good CRT tubes have a roughly estimated lifespan on the order of 8-10 years for the tube itself. Obviously, over that lifetime, the brightness will diminish somewhat. OTOH, the backlight used in most consumer LCDs has an estimated lifespan of somewhere around 3-5 years, last time I checked, and it also diminishes in brightness over its lifetime. BTW, in order to produce a "brighter", "higher-contrast" LCD, they also have to drive the backlight harder, which lowers the overall lifespan. (The "super-bright" modes on some CRTs do the same thing to the tube, which is why I'd recommend using them sparingly. In either case, turn off the backlight on an LCD, or the tube on a CRT, if you're not going to be using it for a while. Both have finite lifespans.)
Originally posted by: cbehnken
Your arguements about the carb vs fuel injection are quite invalid also. The reliability of fuel injection is unquestionable higher than carburetors. Fuel injection has better emissions, better fuel economy, better performance, and better cold starting ability. Also, anyone who $100.00 can get a code scanner and find out almost precisely what is wrong with their engine. Also, with shops like autozone allowing consumers to use their code scanners for free I really see this as a non issue.
I'm going to drop the automotive analogy in this thread, since it really has no bearing whatsoever on the current discussion.
Originally posted by: cbehnken
I'm not sure how you are trying to sale LCDs have shorten lifespans and grow darker over time.
Modern backlights have a MTBF of around 50,000 hrs. This is almost six years of continuous use. Show me a CRT with a higher MTBF.
You also try to say that they will grow darker. While I have never seen one do this myself I suppose it is possible. Although the flourescent bulbs they are based on are rarely known to do this. Even if they did, how many people have to run LCDs are 100% brightness? I have mine set at 0% and it is still brighter than most CRTs on MAX brightness. I think I have a little room to go if indeed it ever does get dimmer.
Well, once it finally starts to "go dim", I have a feeling the falloff curve isn't as gently sloped as a CRT's, but I don't have any hard data at this point to back that up. I am completely certain, though, that LCDs have a limited, diminishing lifespan as well, and that it is, on average, shorter than that of a (decent-quality) CRT tube.
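For what it's worth, your "50,000-hour MTBF is almost six years" arithmetic does check out, but only for continuous 24/7 use, which almost nobody does. A quick back-of-envelope sketch (the duty cycles are illustrative assumptions, not manufacturer figures):

```python
# Back-of-envelope: how long is a 50,000-hour backlight MTBF in practice?
# Duty-cycle figures below are illustrative assumptions, not spec-sheet data.
MTBF_HOURS = 50_000

def lifespan_years(hours_per_day: float, mtbf_hours: float = MTBF_HOURS) -> float:
    """Convert an MTBF in hours into calendar years at a given duty cycle."""
    return mtbf_hours / (hours_per_day * 365.25)

print(f"24 h/day: {lifespan_years(24):.1f} years")  # continuous use
print(f" 8 h/day: {lifespan_years(8):.1f} years")   # typical office use
```

So at office-hours usage the backlight's rated life stretches well past a decade, but either way the point stands: it is a consumable with a finite rating, just like a tube.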
Originally posted by: cbehnken
I have manually adjusted the focus on many CRTs, but usually if a screen is more than 1 year old it is impossible to get the focus perfect again.
First of all, unless you are a trained factory or service tech, you shouldn't be adjusting the focus control, unless you mean the OSD control rather than the "inner" physical controls on the CRT's yoke PCB. Second, if your CRTs are going out of focus in only one year, either you are abusing the heck out of them (temperature/environmental extremes), or you are simply using crap-quality CRTs.

I've got some bloody ancient DEC workstation monitors based on Sony Trinitron tubes. I got them used, and they're *still* sharp; text is perfectly readable at 1920 pixels across, when driven by a decent-quality video card, of course (a Matrox Millennium PCI). I've used plenty of "used" CRTs, some with production dates as much as ten years old, and rarely have they been that severely out of focus, if at all.

The key here is that I'm talking about high-end CRTs with Sony/Hitachi/NEC tubes, properly factory-adjusted in the first place and built with high-quality supporting circuitry. I have no doubt that a Wal-Mart-special CRT would go out of adjustment within a year and die not much later. Then again, you can purchase bottom-of-the-barrel LCDs too, and end up with dozens of bad pixels, a backlight that fails early, a dead power supply, or whatnot. In either case, I was trying to compare "quality" displays of each type; comparing the bottom of the barrel of one to the top end of the other wouldn't be fair.
Originally posted by: cbehnken
Perfect is quite relative because I can look at ANY crt ever made and see the bluriness.
Perhaps the problem is not the CRT, but the quality of the analog output coming from your video card? If you are using an NV card, that is likely the problem. Interestingly, in the case of LCDs using DVI inputs, even high-end NV cards have had problems, and you might actually get a better display output on those cards from the analog VGA port, if your LCD supports both input types.
Originally posted by: cbehnken
LCDs can NOT get blurry with DVI. It simply is not possible with one pixel being represented by ONE pixel in the matrix.
I was speaking more of analog VGA LCDs, which are quite common and can indeed have that problem. I'll have to do some more research on whether it can affect LCDs that use DVI; there are some issues with clock-signal recovery, de-skew of the data signals, and the PLL. The DVI interface has some issues too; it's hardly perfect. (Witness the TH and ExtremeTech DVI-quality exposé articles.)
Originally posted by: cbehnken
You also seem to question the reliability of PLLs. What a laugh. Every computer componet that is driven by a clock signal is run by a PLL. If there were as unrealiable as you speak then our processors would never function.
Are the PLLs driving your CPU "perfect"? Actually, no, they are not. Most people simply accept that an output clock-frequency error of 5-10 MHz either way is "acceptable". But were that clock signal used for clocking display data, you would see errors. Most PLLs require an external timing reference, either a crystal, an R-C circuit, or both (I think). Those passive components can drift out of tolerance over time, and eventually the factory adjustment fades.
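To put a number on that: a clock error that's harmless for a CPU is enormous when you're sampling pixel data. A rough sketch, using the standard VESA figures for 1600x1200@60 (162 MHz pixel clock, about 2160 total clocks per scanline) and the hypothetical 5 MHz error from above:

```python
# Sketch: a clock error "acceptable" for a CPU is huge for pixel sampling.
# Mode figures (162 MHz, 2160 clocks/line for 1600x1200@60) are the standard
# VESA timings; the 5 MHz error is the hypothetical tolerance from the text.
def drift_at_line_end(clock_error_hz: float, pixel_clock_hz: float,
                      clocks_per_line: int) -> float:
    """Pixels of sampling-point drift accumulated by the end of one scanline."""
    return clock_error_hz / pixel_clock_hz * clocks_per_line

err = drift_at_line_end(5e6, 162e6, 2160)
print(f"~{err:.0f} pixels of drift by the right edge of the screen")
```

Even with hsync resynchronizing the sampling every line, a far smaller error than that shows up as shimmer and smearing toward one edge of the panel, which is exactly why display PLLs have to be held far tighter than CPU clocks.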
Originally posted by: cbehnken
In short, if you do not value your eyes and don't mind having headaches go ahead and keep using your CRT. We don't care.
Actually, the extreme brightness of most LCD backlights gives me headaches, but that's subjective and beside the point. And if you really didn't care, you wouldn't have bothered to respond.
Originally posted by: cbehnken
But to say we shouldn't have dual DVI because adapters cause quality issues on your already poor quality CRTs is absurb.
You obviously know nothing about the physics of analog transmission lines and the issue of bandwidth. Ironically, it's precisely on the low-end CRTs you must be talking about that the reduction in analog bandwidth caused by those DVI-to-VGA adaptor dongles would likely not be noticeable. It's only at higher resolutions and refresh rates that such problems become immediately evident. If you had ever used a high-end CRT, you would know that.
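The scaling is easy to sketch: the analog path needs bandwidth on the order of the pixel clock, which grows with resolution and refresh rate. The ~1.35 horizontal and ~1.07 vertical blanking overheads below are rough GTF-era approximations, not exact mode timings:

```python
# Rough rule of thumb: required analog bandwidth scales with the pixel clock.
# The 1.35 / 1.07 blanking overheads are approximate GTF-style assumptions,
# so these are ballpark figures, not exact VESA mode timings.
def approx_pixel_clock_mhz(width: int, height: int, refresh_hz: int,
                           h_overhead: float = 1.35,
                           v_overhead: float = 1.07) -> float:
    """Estimate the pixel clock (MHz) for a given mode, including blanking."""
    return width * h_overhead * height * v_overhead * refresh_hz / 1e6

for mode in [(1024, 768, 85), (1600, 1200, 85), (1920, 1440, 85)]:
    w, h, r = mode
    print(f"{w}x{h}@{r}: ~{approx_pixel_clock_mhz(w, h, r):.0f} MHz pixel clock")
```

A budget CRT run at 1024x768 never stresses an adaptor dongle, but 1920x1440@85 needs several times the bandwidth, which is exactly where a marginal adaptor's rolloff shows up as smeared text.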
Originally posted by: cbehnken
They are so poorly focused and the geometry is so bad even on the best adjusted CRTs that there's a good chance any issues caused by an adapter make actually make your radiation boxes look better.
Now that's just FUD. LCDs give off radiation too, you know. It's likely less than most CRTs (although CRTs are generally more heavily shielded to compensate), but an LCD's emissions are definitely non-zero.
Originally posted by: cbehnken
Use a CRT for 16 hours a day and then use an LCD for the same amount of time and tell me which is easier on your eyes.
Honestly? A CRT is, for my eyes at least. I also adjust the brightness and contrast downward, to suit me.