View Full Version : Do CRT monitors and televisions lose image quality over time?


aidynphoenix
09-06-2011, 11:44 AM
I have been comparing my 21" CRT monitors to this little cheap-looking flat panel I pulled out of the trash and fixed, and the little Gateway LCD just knocks my socks off at how much clearer and better images look on it.

Is there anything that can be done to bring these two CRT monitors back to life, or have they seen their day and are they just junk now?

6GH8cowboy
09-06-2011, 12:26 PM
CRTs do get what we called 'soft' back in the day. Cathode emission drops off, and the result can be poor focus, bad greyscale tracking at various brightness levels, color smearing, and of course a just plain dim picture. You may touch up the focus and G2 adjustments on the HVT, but that's about it. Flat-panel screens can look very good indeed, but they come with their own set of aging hazards.

ppppenguin
09-08-2011, 02:00 AM
The difference between analogue RGB and DVI interconnects is significant on a big monitor. It may not matter much at 1024x768, but at 1600x1200 it can be very visible. With an analogue interconnect, cable quality matters too. I've had some 15-pin sub-D cables that have made the picture look truly horrible. I'm not suggesting using oxygen-free cryogenic cables crafted by Peruvian virgins under the full moon, just ordinary decent cables with individually screened cores of a sensible 75-ohm impedance.

aidynphoenix
09-08-2011, 09:28 AM
Thanks, but there shouldn't be big differences between VGA and DVI, right?
My CRT monitors are VGA, and the LCD uses whichever is plugged in, either VGA or DVI.

ppppenguin
09-09-2011, 01:40 AM
If I switch my 1600x1200 high-quality LCD monitor from VGA to DVI, the improvement is very obvious. The VGA image looks good, but it's a whole lot sharper with DVI. At that end of the market, top-quality analogue CRT monitors and graphics cards often had separate BNC sockets rather than a D-sub 15 connector. This must have helped minimise the quality loss in the interconnect.

At 1024x768 on a 17" display there should be no problems with an analogue interconnect unless the cable is of horrible quality.
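To put rough numbers on why the interconnect matters so much more at 1600x1200 than at 1024x768, here's a back-of-the-envelope pixel-clock estimate in Python (a sketch only: the 60 Hz refresh and ~25% blanking overhead are assumptions, and the real VESA timings come out a little higher, roughly 65 MHz and 162 MHz):

# Rough pixel-clock estimate for an analog RGB signal. The 25% blanking
# overhead and 60 Hz refresh are assumed typical figures, not exact timings.
def approx_pixel_clock_mhz(width, height, refresh_hz=60, blanking_overhead=1.25):
    """Approximate pixel clock in MHz for a given video mode."""
    return width * height * refresh_hz * blanking_overhead / 1e6

for width, height in [(1024, 768), (1600, 1200)]:
    mhz = approx_pixel_clock_mhz(width, height)
    print(f"{width}x{height} @ 60 Hz -> roughly {mhz:.0f} MHz pixel clock")

The cable has to pass those frequencies (and their fast edges) cleanly, which is why a mediocre lead that looks fine at 1024x768 can fall apart at 1600x1200.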

aidynphoenix
09-10-2011, 09:41 AM
It doesn't matter if my monitor smells like apples or oranges, I just want a pretty picture.
I was thinking of getting this one: http://www.newegg.com/Product/Product.aspx?Item=N82E16824116483

But I'm saving up while waiting for it to go on sale.
What do you guys think? See anything wrong with that monitor?

ChrisW6ATV
09-10-2011, 07:55 PM
the little Gateway LCD just knocks my socks off at how much clearer and better images look on it...
This is very normal for a computer display. The computer signal is digital data (that is, specific values for specific pixel locations) that will match up perfectly to a flat-panel display (with its discrete pixels), even if you use a VGA connection from the computer. CRT monitors are analog, so even the best one will have dot-pitch and response-time limitations (as well as possible convergence errors) and will never look as crisp as a flat-panel display.
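As a rough illustration of the dot-pitch limit, here's a quick Python check of how many phosphor triads a typical 21" CRT has across its visible width (the 20" viewable diagonal and 0.25 mm pitch are assumed example figures, not the specs of any particular monitor):

# Back-of-the-envelope check of whether a CRT's dot pitch can resolve a given
# horizontal resolution. The diagonal and pitch below are assumed examples.
INCH_MM = 25.4

def horizontal_triads(viewable_diag_inches, dot_pitch_mm, aspect=(4, 3)):
    """Approximate number of phosphor triads across the visible screen width."""
    w, h = aspect
    width_in = viewable_diag_inches * w / (w**2 + h**2) ** 0.5
    return width_in * INCH_MM / dot_pitch_mm

triads = horizontal_triads(20.0, 0.25)
print(f"~{triads:.0f} triads across the screen width")
print("1600 horizontal pixels is", "within" if triads >= 1600 else "beyond", "that limit")

On those assumed figures you get roughly 1,600 triads across the tube, so a 1600x1200 image is already right at the edge of what the shadow mask can resolve, before focus and convergence errors even enter into it.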

aidynphoenix
09-10-2011, 10:39 PM
What about a flat panel that's also using the VGA cord? Would it still suffer from the same flaws that you mentioned, the dot pitch, response time, and possible convergence errors?
Or does the VGA cord use both analog and digital signals?

Electronic M
09-10-2011, 11:53 PM
VGA is analog video with digital control signals to control sleep mode etc.
I know because years back I tried to get VGA to display on an old color set. I could get quadruple images in monochrome, but I did not want to try to duplicate the digital signal that the monitor sends back to the computer, and that stopped me from trying to improve the analog side.

Tom C.

zenith2134
09-12-2011, 02:10 PM
Chris, I think you summed things up best: with DVI (or any modern digital video format) you're essentially enjoying an exact mapping of the signal onto the LCD's pixel positions, while even the most advanced CRT will still have inherent problems that require more setup to correct over time.

I'm as much of a video Luddite as anyone, but if it were the early nineties and I wanted a state-of-the-art computing system, I'd be lusting after an LCD monitor. That being said, I used VGA and a CRT until the year 2000 with an old Dell running Windows Me (that OS was a huge p.o.s. if ya ask me).

ChrisW6ATV
09-15-2011, 02:36 AM
What about a flat panel that's also using the VGA cord? Would it still suffer from the same flaws that you mentioned, the dot pitch, response time, and possible convergence errors?
Or does the VGA cord use both analog and digital signals?
A VGA cable carries analog signals, but it is "digital" in the sense that the signal still ultimately represents specific pixel values for each of the three colors (R, G, B). A flat-panel monitor typically has an "auto-adjust" feature, and that circuitry adjusts the timing of the panel's display and sync circuits to match each pixel of the analog input signal exactly to its corresponding pixel in the LCD panel.

Dot pitch, response time, and convergence will not be problems on an LCD panel even with an analog input connection, but if the resolution of that input signal is high enough and a long video cable is used, some smearing of the video may be seen on vertical edges on the screen. That smearing occurs because the high-frequency response of longer (or thinner) cables gets weaker.
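As a rough way to see why that happens, you can compare the time one pixel occupies at 1600x1200 with the rise time a bandwidth-limited cable can deliver (a sketch: the 162 MHz pixel clock is the standard 1600x1200@60 figure, the 100 MHz cable bandwidth is an assumed example for a long or thin lead, and 0.35/bandwidth is the usual single-pole rise-time estimate):

# Compare one pixel's duration with the rise time of a bandwidth-limited cable.
# The 100 MHz cable bandwidth is an assumed example figure, not a measurement.
def pixel_time_ns(pixel_clock_mhz):
    """Time one pixel occupies, in nanoseconds."""
    return 1e3 / pixel_clock_mhz

def rise_time_ns(bandwidth_mhz):
    """Approximate 10-90% rise time from the single-pole estimate t = 0.35 / BW."""
    return 0.35 / (bandwidth_mhz * 1e6) * 1e9

print(f"Pixel duration at a 162 MHz pixel clock: {pixel_time_ns(162):.1f} ns")
print(f"Rise time through a 100 MHz cable:       {rise_time_ns(100):.1f} ns")
# When the rise time is a large fraction of the pixel time, black-to-white
# transitions bleed into neighbouring pixels and vertical edges look smeared.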

All of these "exact pixel" comments, whether using a DVI/HDMI input or a VGA/RGB input, only apply if the computer's resolution is set to the same resolution as the LCD monitor, whether that is 1920x1200, 1920x1080, 1680x1050, 1440x900, 1366x768 or another number (check the specs of the monitor). Flat-panel monitors can accept different resolutions, but anything other than the "native" resolution will be scaled by the monitor and will not be anywhere near as sharp.
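If you want to see that scaling softness for yourself, here's a small Python sketch using Pillow as a stand-in for the monitor's internal scaler (an analogy only: real panels use their own scaling hardware, and the 1280x1024 input and 1680x1050 native resolution are just assumed example figures):

from PIL import Image  # pip install Pillow

# One-pixel-wide black/white vertical stripes at a non-native 1280x1024...
src = Image.new("L", (1280, 1024))
src.putdata([255 * (x % 2) for y in range(1024) for x in range(1280)])

# ...stretched to an assumed 1680x1050 native panel resolution.
scaled = src.resize((1680, 1050), Image.BILINEAR)

# The source contains only pure black (0) and pure white (255); after scaling,
# many pixels land on in-between greys, i.e. the single-pixel detail has blurred.
print("source grey levels:", sorted(set(src.getdata())))
print("scaled grey levels (first few):", sorted(set(scaled.getdata()))[:8])

Run the computer at the panel's native resolution instead and every input pixel maps one-to-one, so nothing gets interpolated.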