FSTargetDrone wrote:No, please, go on. It's all very interesting.
One of my biggest "complaints" with the CRT I have now is its footprint. Aside from some weird issues where lately it seems it can't hold the refresh rate I want, the thing is more than 16 inches deep and throws off quite a bit of heat. I would take a slight picture quality downgrade if only to free up some desk space and cut power consumption!
Very well then.
There are four main attributes to consider when it comes to actually viewing a monitor. These are: accuracy/adjustability, resolution, contrast/brightness, and viewing angle.
When referring to accuracy and adjustability, I mean the ability of the monitor to reproduce the colors exactly as output by the machine. All CRTs suffer from the fact that their cathode/anode/phosphor delivery system is susceptible to environmental effects, such as humidity, temperature, or magnetic and electrical fields. You can even demonstrate this effect dramatically by bringing a magnet close to your screen... just make sure you either have a degausser or a degauss function on the monitor first. Humidity and temperature create less noticeable effects (and are usually shielded against), but they remain pertinent, as they can cause the color of the entire monitor to shift. These are mainly failings of the CRT system, but they can also be failings of the analog information delivery system in the computer itself.
LCD monitors, on the other hand, do not suffer from these problems (or at least not to the same extent, since they are not built around highly sensitive, electrically charged phosphors), due to the nature of the liquid crystals and the manipulation thereof. The analog delivery system for the information, though, is still a problem. Even though the monitor itself is not susceptible to humidity and temperature, the card, memory, system, and cables can still be. New video cards and monitors can deliver a pure digital signal and reproduce images in exact detail, but these are as of yet pricey and not very compatible with most systems.
So LCDs are more accurate, aye? All in favor of the mighty LCD, eh? Not quite. While LCDs are naturally more accurate in receiving true color from the machine and displaying it, CRT monitors have a myriad of adjustments and features that (in my opinion, at least) more than make up for it. If the color is slightly off on a CRT, you can adjust the fluorescence levels and the output of the rays to compensate, thereby getting much better color, defining a color, and boosting certain colors for certain situations. So, in general, CRTs have adjusted accuracy equal to LCDs, but with *far* more adaptability and usefulness.
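The kind of per-channel compensation I'm describing can be sketched in a few lines of code. This is purely an illustration of the idea, not anything a real monitor runs; the function and its numbers are made up:

```python
# Hypothetical sketch: per-channel gain plus a gamma curve, the digital
# analogue of a CRT's RGB "gun" adjustments. Names/values are illustrative.

def adjust_pixel(rgb, gains=(1.0, 1.0, 1.0), gamma=1.0):
    """Scale each channel by a gain, then apply a gamma curve.

    rgb: (r, g, b) with each component in 0..255.
    """
    out = []
    for value, gain in zip(rgb, gains):
        # Normalize to 0..1, apply gain, clamp, apply gamma, rescale.
        v = min(max((value / 255.0) * gain, 0.0), 1.0) ** gamma
        out.append(round(v * 255))
    return tuple(out)

# A screen that renders too red could be compensated by lowering red gain:
print(adjust_pixel((200, 180, 160), gains=(0.9, 1.0, 1.0)))  # -> (180, 180, 160)
```

On a CRT you turn knobs (or OSD sliders) to get this effect in the analog domain; the point is simply that having those knobs at all is the adjustability advantage.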
Secondly, resolution can be a defining issue. You see, CRTs can adjust their ray output in order to resolve the picture at different sizes (800x600, 1024x768, etc. pixels being lit up), whereas LCDs... can't. As Stark mentioned, they are very, very restrictive. Why is this? To put it simply, LCDs have a set number of colored subpixels in a set area. (If you don't understand what that means, skip ahead to contrast and brightness and come back.) When you want more pixels on a CRT, you just have it fluoresce the phosphors in smaller areas. When you want more pixels on an LCD, however, you need software that "guesses" what the picture will look like when shrunk and activates the crystals accordingly. Want the screen smaller? A CRT can blow up your icons by simply using larger phosphor blocks as pixels, but the LCD either (most commonly) shrinks your display window and uses the same size icons, or uses software to extrapolate. The extrapolation for both shrinking and enlarging can create a massive blur effect, sometimes to the point of unreadability. Needless to say, this is not a good thing for graphics processing or gameplay.
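To make the "guessing" concrete, here is a toy sketch of the simplest possible resampling an LCD scaler might do when fed a non-native resolution. Real scaler hardware is far more sophisticated; this is just nearest-neighbour picking, written out by hand:

```python
# Toy model of a fixed-pixel-grid scaler: resample a source grid onto a
# grid of a different size by picking the nearest source pixel.

def nearest_neighbor_scale(src, new_w, new_h):
    """src is a list of rows (lists) of pixel values."""
    old_h, old_w = len(src), len(src[0])
    return [
        [src[y * old_h // new_h][x * old_w // new_w] for x in range(new_w)]
        for y in range(new_h)
    ]

# Scaling a 2x2 checkerboard to 3x3: some source pixels get duplicated
# and others don't, so the pattern comes out uneven.
src = [[0, 1],
       [1, 0]]
for row in nearest_neighbor_scale(src, 3, 3):
    print(row)
# -> [0, 0, 1]
#    [0, 0, 1]
#    [1, 1, 0]
```

At integer ratios (2x2 to 4x4) the blocks duplicate cleanly, but at non-integer ratios you get exactly the unevenness described above; fancier scalers blend neighbouring pixels instead, which trades the unevenness for blur.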
To the third topic, contrast and brightness, I have only this to say: Backlight.
Alright, so I lied, I have a lot to say about it. Backlighting is how you see anything displayed on an LCD screen. It is, quite literally, a light behind the crystal matrix. This light shines through the crystal matrix (which is composed of colored sections and crystals) and is directed by the formation of the crystals through specific colored areas on the monitor, producing images. How the crystals are formed is determined by specific electrical currents, which cause them to align into precise shapes and patterns. I love modern technology.

Regardless, the backlight coming through is responsible for the increased brightness of LCD monitors in relation to CRTs, which have to actually fork out the juice to excite the phosphors and display colors. The backlight method is much more consistent, but it also creates huge contrast issues. I mean, it's basically just a big light that is always on, so it can't exactly produce shades of light and dark by itself. This means the difference between 0,0,0 and 255,255,255 (black and white) is completely controlled by the crystals, which can't block the light as completely as a phosphor that simply isn't lit. This makes the differences in brightness much less recognizable, and by extension the differences in colors seem much less pronounced... "washed out", even. The more expensive the LCD, the more work is put into ensuring that the contrast ratio is good... and to be honest, there are some LCDs out there with exceptionally good contrast. They are also hideously expensive, though (think 10k+), and still don't surpass the best CRTs.
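The contrast-ratio arithmetic behind all this is simple enough to show directly. A quick sketch, with made-up luminance numbers for illustration (not measurements of any real monitor):

```python
# Contrast ratio is just the luminance of full white divided by the
# luminance of full black. All figures below are hypothetical.

def contrast_ratio(white_nits, black_nits):
    """Ratio of brightest white to darkest black (both in cd/m^2)."""
    return white_nits / black_nits

# A CRT can cut the beam almost entirely, so its black level is tiny;
# an LCD's backlight leaks through even "closed" crystals.
crt = contrast_ratio(120.0, 0.05)   # hypothetical CRT figures
lcd = contrast_ratio(250.0, 0.8)    # hypothetical LCD figures
print(f"CRT {crt:.0f}:1 vs LCD {lcd:.0f}:1")
```

Notice that the LCD can be brighter overall and still have the worse ratio: it's the leaky black level, not the white, that kills the contrast.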
As to viewing angles. Heh. LCDs get slaughtered here. You can only correctly view an LCD from directly in front of it, and the more you move to the side, the more the picture blurs and darkens. Why is this? Simply the nature of the crystals, my friend. They configure in specific patterns to shine the light through the colored areas, and those patterns direct it straight out. This removes the beneficial "bleed off" effect found on CRTs at the same time it sharpens the image, and if you view it from even a slight angle, the colors can merge and distort. This is bad for graphics design, but downright untenable for gaming.
Finally, I will wrap up with two more brief items that are just as relevant as those above, if not directly related to the display. Cost and Dead Pixels.
LCDs are a relatively new technology, while cathode ray tubes have been around for decades. This drives the price of LCDs up and makes manufacturers worry about reliability, shortening your warranty in comparison to CRTs. As for dead pixels, those occur when the crystals get locked or become unresponsive and freeze up, displaying one color for EVER. There is no way to fix this, and if you get several of them, it can really destroy your work on whatever you are doing. This applies especially to (you guessed it) graphics or games.
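For what it's worth, spotting stuck pixels is straightforward in principle: flash a few solid test colors and see which pixels never change. A toy sketch of the idea (the frames here are fabricated 2x2 grids, obviously not real panel data):

```python
# Flash the panel with different solid colours and flag any pixel whose
# value never changes across the captures -- a likely stuck pixel.

def find_stuck_pixels(frames):
    """frames: list of 2-D grids captured while showing different colours.
    Returns (row, col) positions whose value never changed."""
    first = frames[0]
    stuck = []
    for y, row in enumerate(first):
        for x, value in enumerate(row):
            if all(frame[y][x] == value for frame in frames[1:]):
                stuck.append((y, x))
    return stuck

# Two test frames (all-black, then all-white); pixel (0, 1) stays red.
black = [[(0, 0, 0), (255, 0, 0)], [(0, 0, 0), (0, 0, 0)]]
white = [[(255, 255, 255), (255, 0, 0)], [(255, 255, 255), (255, 255, 255)]]
print(find_stuck_pixels([black, white]))  # -> [(0, 1)]
```

Detection is the easy part, of course; as said above, there's nothing you can do about the pixel once you've found it.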
Now, as to your comment on the size, heat, and refresh rate. If your refresh rate is dropping, it means your monitor is not functioning quite as well as it used to (if it used to be able to hit that rate, that is). Without seeing exactly how bad or good it is, knowing the brand, or all the other yadda yadda variables, etc., I can't say how long it has left. Some monitors keep chugging along for years after their refresh rate starts dropping; some die a week later. It's all highly variable.
Regarding the size/heat, I would say as long as you are not terribly concerned with massive action games requiring split-second reactions, advanced graphics work, or anything like that... go for it. LCD monitors excel in environments where there is no massive demand on their resources, and a large number of people find them aesthetically pleasing. Not this person, however. I prefer the look of CRTs, as well as the superior high-end performance. Meh, to each his own.