"Could someone tell me if there's a noticeable difference in picture quality between analog and digital monitors? Is digital worth the extra money?" There is no inherent reason for a digital monitor to have a better picture but as a practical matter, I would expect this to be the case in the vast majority of monitors - especially models from the same manufacturer. The digital monitors will be the ones that the designers concentrate on. Digital controls (both those you can access and those used only during setup at the time of manufacturing or servicing) permit more flexibility in setting parameters and automated more consistent adjustments on the assembly line (at least this is possible in principle). For the average not terribly fussy PC user, the major difference is in the convenience of not having to adjust size and position whenever the scan rate changes. In my opinion, while the price difference between monitors having analog or digital controls but with the same screen size, resolution, and scan range specifications may seem excessive, the added convenience of digital controls and scan rate parameter memory makes the added cost well worthwhile.
This question arises in a PC software development environment where the programmer needs to go back and forth between a Windows display and a DOS debugger, for example. Obviously, without knowing the precise design of your monitor, there can be no definitive answer. It is true that some older monitors blew up if you looked at them the wrong way. Newer monitors from well known manufacturers like Nokia, NEC, and many others are designed with a moderate amount of scan switching in mind. However, this is stressful for the monitor's power supply and deflection circuitry. I would suggest that you use a dedicated mono monitor for debugging if you really are switching multiple times per minute. If you cannot afford the space, you can probably assume that if the first few days of this kind of treatment have not induced a failure, the monitor is robust enough to withstand it indefinitely. If you really are switching many times per minute, 8 hours or more a day, then what may wear out are the internal relays (the clicks you hear are from these). Even then, you are still talking about years - they are rated in the 100s of thousands or millions of operations when used within their ratings. Or, just go for the peace of mind of an extended warranty or service contract.
(From: Bob Myers (myers@fc.hp.com)). Video bandwidth is an indication of the frequency range over which the monitor's video amplifiers are capable of doing their job, which is to translate the video signal at the monitor inputs (about 0.7 volt, peak-to-peak) to something like 35-40 V peak-to-peak at the CRT cathodes. Higher bandwidths ARE better, UP TO A POINT. The bandwidth required is NOT given by multiplying the numbers in the format (what most call the "resolution") by the refresh rate; even allowing for the required blanking time, what THAT gives you is the pixel rate or "pixel clock". As the fastest thing that happens in a video signal is one dot on followed by one dot off, the fastest FUNDAMENTAL frequency in the video signal is half the pixel clock. Normally, you might think you'd want to cover some of the harmonics to "sharpen up" the pixel edges, but that's actually less important than you might think (in part due to the fact that the CRT screen itself, being made up of discrete dots of color, already has the effect of "sharpening up" the image AND limiting how sharp it's going to get, anyway). There's also the problem of "bandwidth" not being measured or spec'd consistently by all manufacturers, making it difficult to compare one product to another. Some simply give a "max. video rate supported" number, which is about as useless a spec as one can imagine. (It's just telling you the pixel rate of the fastest timing supported - but says nothing about the image quality at that timing!) Still, a claimed bandwidth of about 2/3 to 3/4 of the fastest pixel rate to be used should indicate adequate performance - beyond that, you need to compare products with the good ol' Mark I eyeball. Using this rule of thumb, a monitor intended for use at 1280 x 1024, 75 Hz (a 135 MHz pixel rate) needs a spec'd amp bandwidth of around 100 MHz. (But just to show how far you can trust this particular number, I know of a product which does a very nice job of displaying 1600 x 1200 at 75 Hz - slightly more than a 200 MHz pixel rate - but which has a video amp bandwidth of only about 100 MHz, if measured per certain definitions!) I find the rise and fall time of a full-scale (white to black or black to white) video signal, as measured at the cathode, to be a much better spec, and here would look for something not slower than 2/3 of the pixel period for the timing of interest. But these numbers are rarely quoted in consumer-oriented spec sheets, and even these take some care in applying.
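To make the rule of thumb concrete, here is a minimal sketch of the arithmetic (in Python); the 35% blanking overhead factor is an assumption - the exact figure depends on the timing standard:

    # Minimal sketch of the bandwidth rule of thumb described above.
    # The 1.35 blanking overhead factor is an assumed typical value
    # (roughly 25-40% depending on the timing standard).
    def pixel_clock_mhz(h_pixels, v_pixels, refresh_hz, blanking=1.35):
        return h_pixels * v_pixels * refresh_hz * blanking / 1e6

    def bandwidth_rule_mhz(pclk_mhz):
        # A claimed amp bandwidth of ~2/3 to 3/4 of the pixel rate
        # should indicate adequate performance, per the text above.
        return (2.0 / 3.0) * pclk_mhz, 0.75 * pclk_mhz

    pclk = pixel_clock_mhz(1280, 1024, 75)   # ~133 MHz (quoted as 135)
    lo, hi = bandwidth_rule_mhz(pclk)
    print(f"pixel clock ~{pclk:.0f} MHz; look for ~{lo:.0f}-{hi:.0f} MHz")

For the 1280 x 1024, 75 Hz example this lands at roughly 90-100 MHz, matching the figure quoted above.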
The ultimate sharpness of the picture on your monitor depends on many factors including but not limited to:

1. Focus of the electron beam spot(s) at the face of the CRT. Affected by: quality of the CRT and its supporting circuitry, and adjustment of the focus control(s).

2. Convergence of the RGB electron beams at each point on the face of the CRT. Affected by: quality of the CRT, deflection components, and how carefully the convergence adjustments were done during manufacture (or repair). In many cases, it is this last item that is most critical. Bad quality control during final setup can ruin a monitor manufacturer's reputation - and has.

3. Moire reduction (if present and enabled), which reduces the effective sharpness of the electron beam either through actual defocusing or a high frequency dither. IMO, the net effect is almost always bad. Affected by: enabling and magnitude of moire reduction.

Items (1) through (3) are somewhat independent (though not entirely) of scan rate. The newest high-end monitors have a fairly comprehensive set of digital (on-screen) adjustments for these but may still not produce acceptable results for every monitor.

4. Bandwidth of the video amplifiers in the monitor - essentially how quickly the intensity can be altered by the video signal. Affected by: design of the video amplifier circuitry and circuit board layout. This used to be much more of an art than it is today; integrated circuits have replaced many of the discrete components used in the past, resulting in simple designs with clean circuit board layouts.

5. Bandwidth of the digital to analog converter (D/A, DAC, or RAMDAC) of the video card. Affected by: DAC or RAMDAC chip used, supporting circuitry, and video card board layout. As with (4), these are largely cookbook designs these days.

6. Dispersion in the video cable - how smeared out the video signal becomes traveling through the cable. Affected by: quality and length of the video cable. Since cables often come attached to the monitor nowadays, you don't have much control over this. Just don't add problems such as switchboxes.

7. Reflections from any impedance discontinuities in the cable - video card DAC, video card connector, monitor connector, monitor video amplifier input, monitor termination. All of these will introduce at least a bit of mismatch - or perhaps much more - which will add up to anything from barely detectable fuzziness to totally unacceptable ghosting or ringing at vertical edges. Affected by: connectors and circuit board layouts of both the video card and monitor input, as well as any additional connectors or a switchbox.

Items (4) through (7) are heavily dependent on scan rate since higher scan rates translate into higher video bandwidth. Any degradation of the edges of the video signal - transitions from black to white, for example - will be much more visible at the higher scan rates: the edges will be spread out, resulting in pronounced blurring, ghosting, or ringing. Thus, it is critical to use the highest quality components wherever possible. While you don't have control over what is on your video card and inside your monitor, selecting a high quality video card and monitor should help. If you have the option of using a BNC cable (i.e., your monitor has BNC jacks on the back), try a high quality BNC cable - you may be pleasantly surprised at the improvement in edge definition and overall sharpness.
(From: Bob Myers (myers@fc.hp.com)). This isn't as simple as it may appear. 'Ghosts' are caused by reflections of the video signal edges, caused by impedance mismatches between the driver (graphics card), the video cable, and the monitor video inputs. Add in the problems caused by the video connectors, and you wind up having to say that this is really (most often) a system problem, and all the parts get some of the blame. With that said, the practical answer is that you should avoid using anything other than a single, reasonably-good-quality video cable, with decent connectors, between your PC and monitor, this being the part that you have the most control over. The more breaks in the cable - adding extension cables, switchboxes, etc. - the more chances you have for a mismatch in the line. BNC connectors (or the new VESA EVC connector) are MUCH better in this regard than the 15-pin D "VGA" connector (although if you're getting good results with the D connector, don't worry about it). Also, do NOT make the mistake of using anything other than 75 ohm coax for your video cables. Just to mention one common mistake, LAN cable is *50* ohms, so it's NOT going to work here! If you've done all you can with the cable, the next place to go is the monitor itself; there's probably something wrong with the video input termination. By the way, a simple way to confirm that what you're seeing IS a ghosting (reflections) sort of problem is to use a DIFFERENT LENGTH of video cable. Since the ghost is the result of a reflection going from the monitor back to the PC and then back up the line, the length of the cable affects where the ghost appears relative to the edge which caused it. Inserting a longer cable moves the ghost out (to the right), while a shorter one will move it closer in (to the left). If you change cable lengths and the ghost doesn't move, you most likely have a problem within the monitor itself, past the video inputs. BTW, longer cables may also make the ghost less distinct, due to the increased attenuation of the signal by the cable. Unfortunately, the longer cable also means more attenuation of the video signals that you WANT, in addition to the reflections.
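To get a feel for the numbers, here is a rough sketch (in Python) of how far out a ghost should land; the 0.66 velocity factor for typical coax and the 135 MHz pixel clock are illustrative assumptions, not measured specs:

    # Rough sketch: horizontal offset of a ghost whose reflection makes
    # one extra round trip down the cable and back.
    C = 3.0e8  # speed of light, m/s

    def ghost_offset_pixels(cable_m, pixel_clock_hz, velocity_factor=0.66):
        round_trip_s = 2.0 * cable_m / (velocity_factor * C)
        return round_trip_s * pixel_clock_hz

    # A 2 m cable at a 135 MHz pixel clock: ~20 ns round trip, so the
    # ghost appears roughly 2.7 pixels to the right of the edge.
    print(ghost_offset_pixels(2.0, 135e6))

Doubling the cable doubles the round trip and moves the ghost twice as far out, which is exactly the diagnostic described above.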
(From: Bob Myers (myers@fc.hp.com)). With an extension cable, there is the chance that this ghost is being caused by an impedance mismatch AT THE CONNECTOR OF THE EXTENSION; unless the cable is completely the wrong impedance, it is unlikely that the cable itself (meaning the actual "wire") is the culprit. But any break in the cable (connectors, switchboxes, etc.) is a chance for a mismatch. But before blaming the cable, there's another possibility to check out. One common source of ghosting is a poor termination of the line at the monitor itself or at the graphics card driving it. It can look worse with an extension simply due to the extra cable length moving the "ghost" farther away from the image causing it. (The ghost is, after all, just a reflected signal that went back DOWN the cable, got reflected again at the controller, and was sent back up to the monitor. Added cable length makes this round trip longer, and moves the ghost farther to the right of the original edge in the displayed image.) If this is the case, then you will also see the ghost without the extension - it'll simply be much closer to the original edge that it's "ghosting". In that case, a better extension cable can actually make the appearance of the ghost worse - a lower-loss cable means that more of the reflection will get back through to the monitor! If it is being caused by the extension cable, you may get better results by using BNC connections instead of the D-sub at the point where the cables mate. The D-sub is a pretty poor connector in terms of providing the proper impedance. Using a pair of 15D-to-5-BNC cables back to back may give better results.
Where BNC monitors are involved and daisychaining is acceptable, additional circuitry is generally not required for reasonable distances. BNC cables for R, G, B, and possibly H and V sync, are run from the source to each monitor in turn with only the last one being terminated in 75 ohms (the others MUST be Hi-Z). However, it is not possible to drive multiple monitors in a star configuration without buffering the signals. In addition, some newer BNC monitors do not have a Hi-Z option for termination, so daisychaining is not even an option with these. In either of these cases, what is needed is a distribution buffer amplifier. One such circuit is shown at:

* http://www.anatekcorp.com/driving.htm

This includes simple emitter follower circuits for each high speed signal.
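Just to make the termination rule explicit, here is a trivial sketch (in Python) that checks a daisy chain configuration; the function and list format are made up purely for illustration:

    # Toy check of the daisychain termination rule described above:
    # every monitor along the line set to Hi-Z, with only the last
    # one terminating the cable in 75 ohms.
    def check_daisy_chain(terminations):
        """terminations: settings in cable order, e.g. ["hi-z", "75"]."""
        *middle, last = terminations
        if last != "75":
            return "FAIL: the last monitor must terminate in 75 ohms"
        if any(t != "hi-z" for t in middle):
            return "FAIL: all other monitors must be set to Hi-Z"
        return "OK"

    print(check_daisy_chain(["hi-z", "hi-z", "75"]))  # OK
    print(check_daisy_chain(["75", "75"]))            # FAIL - double termination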
Almost any PC with at least a medium performance SVGA video card can be programmed for a wide range of resolution options, dot clocks, horizontal and vertical sync timing, and sync polarity. Some can be programmed to generate composite sync and sync-on-green as well. DOS/Windows/Win95 will suffice for most PC applications using drivers supplied by the video card manufacturer but for complete flexibility, run under Linux - take a look at the XFree86 documentation for more details. Test patterns can be created with any graphics application and then saved for rapid recall (see the sketch at the end of this section). The following web sites also have some test pattern programs available for download (comments from: Byron Miller (byron13@pacwest.net)):

* http://www.nokia.com/products/monitor_test.html - Very good, color, thorough, professional.

* http://www.zdnet.com/cshopper/shopguid/0695/subt.html#download - Small and very basic program in B/W.

Of course, for different output levels and impedances you will need some extra electronics. A normal SVGA card only produces R, G, B video and H and V sync signals compatible with doubly terminated 75 ohm cables. As noted, some will generate composite sync and/or sync-on-green. See the "Sync-on-Green FAQ" for more information on how to do this if your card is not capable of it. For NTSC/PAL video generation, additional hardware will be needed. See the section: "Displaying computer video on a TV".
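For example, a crosshatch test pattern (handy for the convergence checks discussed later) takes only a few lines of code to generate. This sketch assumes the Python Pillow imaging library; the resolution and grid spacing are arbitrary choices:

    # Generate a white-on-black crosshatch test pattern and save it
    # to a file for rapid recall.
    from PIL import Image, ImageDraw

    def crosshatch(width=1280, height=1024, spacing=64):
        img = Image.new("RGB", (width, height), "black")
        draw = ImageDraw.Draw(img)
        for x in range(0, width, spacing):       # vertical lines
            draw.line([(x, 0), (x, height - 1)], fill="white")
        for y in range(0, height, spacing):      # horizontal lines
            draw.line([(0, y), (width - 1, y)], fill="white")
        return img

    crosshatch().save("crosshatch.bmp")  # display full screen when testing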
These ISA, EISA, or PCI cards put TV programs or other NTSC/PAL source material into a window on your PC's monitor screen. The question has come up as to whether this will damage the monitor in the long term. I would not think that there should be any problems unless you tend to turn the brightness up much higher than normally used for computer activities. If anything, the constantly changing picture will be better than a stationary window. However, moving it to different locations every so often will not hurt. Similar comments apply to other types of image and video captures as well. IMHO, I still think it is silly to use an expensive PC and monitor to watch TV.
Some monitors have the capability of selecting or adjusting the 'color temperature' of the display. NEC AccuColor on the 4/5/6FG series of monitors is one example. The terminology refers to the spectral output of an ideal black body source at that actual physical temperature. It essentially sets the appearance of a white screen. For example, a color temperature of 9300K will appear blue-white while 6300K will appear yellow-white. It only affects the relative balance of R, G, and B and has nothing to do with refresh rates or anything performance related. Unless you are doing work where the exact colors matter or are using multiple monitors where the colors need to match, use whichever setting is more pleasing.
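As a rough illustration of how a color temperature translates into R, G, B balance, here is a sketch using a published blackbody curve fit (Tanner Helland's approximation, with constants rounded); treat it as illustrative, not calibration-grade:

    import math

    # Approximate RGB balance of a blackbody white point (rounded
    # constants from a common published curve fit; illustration only).
    def blackbody_rgb(kelvin):
        t = kelvin / 100.0
        r = 255 if t <= 66 else 329.70 * (t - 60) ** -0.1332
        g = (99.47 * math.log(t) - 161.12 if t <= 66
             else 288.12 * (t - 60) ** -0.0755)
        b = 255 if t >= 66 else (
            0 if t <= 19 else 138.52 * math.log(t - 10) - 305.04)
        clamp = lambda v: int(max(0, min(255, v)))
        return clamp(r), clamp(g), clamp(b)

    print(blackbody_rgb(9300))  # blue at full scale, red rolled off
    print(blackbody_rgb(6300))  # red at full scale, blue slightly down

9300K comes out blue-leaning (blue at full scale, red reduced) while 6300K comes out slightly warm, consistent with the blue-white versus yellow-white description above.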
That goop is probably glue and generally harmless - it is there to hold down the components against vibration. I have heard of it sometimes decomposing and shorting stuff out but I doubt you have that problem. Therefore, unless you find a bad cap in the focus or related circuit, we are still looking at a flyback problem.
The typical flyback or Line OutPut Transformer (LOPT) consists of two parts:

1. A special transformer which, in conjunction with the horizontal output transistor/deflection circuits, boosts the B+ (120 V typical for a TV) of the low voltage power supply to the 20 to 30 kV for the CRT, as well as providing various secondary lower voltages for other circuits. A HV rectifier turns the high voltage pulses into DC and the CRT capacitance smooths it. The HV may be developed from a single winding with many, many turns of wire, or from a lower voltage winding and a diode-capacitor voltage multiplier. The various secondary voltages power the logic, tuner, video signal, vertical deflection circuits, and CRT filaments. In fact, with many TV designs, the only power not derived from the flyback is for the keep-alive circuitry needed to maintain channel memory and provide startup drive to the horizontal deflection/high voltage system.

2. A voltage divider that provides the focus and screen supplies. The pots are in this divider network - and these things fail, resulting in poor focus, uncontrolled brightness, or fluctuating focus and/or brightness. A total short could also result in failure of other components like the horizontal output transistor.

In some monitors, the focus and screen divider and/or controls are external to the flyback and susceptible to dust and problems, particularly on humid days. The resistance of these circuits is so high that dirt or other contamination can easily provide a bypass path to ground, especially when slightly damp.
(From: ard12@eng.cam.ac.uk (A.R. Duell)). The older delta-gun tubes (3 guns in a triangle, not in a line) can give **excellent** pictures, with very good convergence, provided:

1. You've set those 20-or-so presets correctly - a right pain as they interact to some extent.

2. The CRT is set up in its final position - this type of tube is more sensitive to external fields than the PIL type.

Both my delta-gun sets (a B&O 3200 chassis and a Barco CDCT2/51) have very clearly set out and labeled convergence panels, and you don't need a service manual to do them. The instructions in the Barco manual are something like: "Apply crosshatch, and adjust the controls on the convergence board in the numbered order to converge the picture. The diagrams by each control show the effect".

Here's a very quick guide to delta gun convergence where the settings are done using various adjustments on the neck of the CRT (if you don't have a service manual but do know what each control does, and where they all are - otherwise, follow the instructions in the service manual --- sam):

1. Apply a white crosshatch or dot pattern to the set. Don't try and converge on anything else - you'll go insane. It's useful to be able to switch between those 2 patterns.

2. Before you start, set the height, width, linearity, pincushion, etc. They will interact with the convergence. Also check PSU voltages, and the EHT voltage if it's adjustable. That's where you do need a service manual, I guess.

3. Turn off the blue gun using the A1 switch, and use the red and green static radial controls to get a yellow crosshatch in the middle of the screen. These controls may be electrical presets, or may be movable magnets on the radial convergence yoke (the Y-shaped thing behind the deflection yoke).

4. Turn on the blue gun and use the 2 blue static controls (radial and lateral) to align the blue and yellow crosshatches at the center of the screen. Some manufacturers recommend turning off the green gun when doing this, and aligning red with blue (using *only* the blue controls, of course), but I prefer to align blue with yellow, as it gives a check on the overall convergence of the tube.

5. Turn off the blue gun again. Now the fun starts - dynamic convergence. The first adjustments align the red and green crosshatches near the edges - I normally do the top and bottom first. There will be 2 controls for this, either a top and a bottom, or a shift and a linearity. The second type is a *pain* to do, as it's not uncommon for it to affect the static convergence.

6. Getting the red and green verticals aligned near the edges is a similar process.

7. You now have (hopefully) a yellow crosshatch over the entire screen.

8. Now to align the blue. This is a lot worse, although the principle is the same. Turn on the blue gun again, and check the static (center) convergence.

9. To align the blue lines with the yellow ones, you'll find not only shift controls, but also slope controls. Use the shift controls to align the centers of the lines and the slope controls to get the endpoints right. These interact to some extent. You'll need to fiddle with the controls for a bit to work out what they do, even if you have the manual.

The convergence over the entire screen should now be good....

A word of warning here... The purity is set by ring magnets on almost all colour CRTs, but on PIL tubes, there are other ring magnets as well - like static convergence. Make sure you know what you are adjusting.
(From: Jerry G. (jerryg@total.net)). Convergence alignment is not something you can do yourself unless you have the proper calibration instruments and skills. It takes lots of experience and time. There are published specs for most of the good monitors. Most of the time they are as follows: there is the 'A area', 'B area', and 'C area'. On a 15 inch monitor the A area would be a diameter of about 4 inches. The B area would be about 7.5 inches. The C area would be the outside areas including the corners. These numbers are approximate. There are actually standard specs for these areas. They are expressed in percentages of the screen viewing area, so the inch figures would vary with the CRT size. The higher the price (quality) of the monitor CRT, yoke, and scanning control circuits, the tighter the convergence can be aligned by the technician.

For the A area on a good monitor, the maximum error should not exceed 0.1 mm. For the B area it should not exceed about 0.25 mm. And for the C area, up to about 0.3 mm can be allowed. Most of the monitors that I have repaired, seen, and used did not meet these specs unless they were rather expensive. With these specs there would not be any real visible misconvergence unless you put your nose very close to the screen... A lot of the ones in the medium price range had about 0.15 mm error in the A area, about 0.4 mm in the B area, and more in the C area. This annoys me because I am very critical. If one has the skills and test gear, he or she can do a better job on most monitors. It is a question of the time involved.

To see the convergence errors, a grating or crosshatch pattern is used. A full raster color generator is required for the purity adjustments as well. This is necessary to align the landing points of the CRT guns. The exact center reference and purity adjustments are done with the ring magnets on the CRT neck. The yoke position angle adjustments are also done for the side and top-bottom skewing as well. Everything interacts! The corners are done with various sorts of slip or edge magnets. As for corner convergence skewing, button magnets are used. The color purity will be affected as you go, and must also be corrected. These adjustments interact with one another, and the process continues until the convergence and purity are good at the same time...!

I don't recommend that the amateur, hobbyist, or even the do-it-yourselfer attempt this alignment procedure. The test gear would exceed the cost of a really good monitor anyways...!!! And without the proper skills, he or she would only make it worse anyways...

As for purity specs, the color change from any corner to any corner must not exceed 200 degrees Kelvin. The error in the B area should not exceed 300 degrees Kelvin. This applies to a white raster. Most of the monitors I see don't get better than about 300 degrees Kelvin. And some are even 1000 degrees out! The purity errors are best checked with a full red raster at 100% saturation. Then the other color vector angles are checked with cyan, and then magenta. The color temperature stability should be the same in all aspects. A color spectrometer should be used to judge this error factor. As far as the eye is concerned, it will see a purity error of more than about 500 degrees Kelvin if the person knows what to look for... When changing the CRT, this alignment must be done completely.
Most shops do not even employ people with the skills to do a proper alignment, or do not own the instruments to do it right, and the poor customer gets back a monitor that is not in spec...!
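As a footnote to the A/B/C areas above: since the zones are specified as percentages of the viewing area, the inch figures scale with CRT size. A trivial sketch, with the ratios inferred from the 15 inch example above (an assumption, not a published spec):

    # Scale the A/B zone diameters quoted for a 15" CRT to other sizes,
    # assuming the zones are a fixed fraction of the diagonal.
    def zone_diameters(diagonal_inches):
        a = diagonal_inches * (4.0 / 15.0)   # A area: ~4" on a 15" CRT
        b = diagonal_inches * (7.5 / 15.0)   # B area: ~7.5" on a 15" CRT
        return a, b                          # C area: everything outside B

    print(zone_diameters(17))  # ~(4.5, 8.5) inches on a 17" monitor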
Should you always use a surge suppressor outlet strip or line conditioner? Sure, it shouldn't hurt. Just don't depend on these to provide protection under all circumstances. Some are better than others and the marketing blurb is at best of little help in making an informed selection. Product literature - unless it is backed up by testing from a reputable lab - is usually pretty useless and often confusing. Line filters can also be useful if power in your area is noisy or prone to spikes or dips. However, keep in mind that most well designed electronic equipment already includes both surge suppressors like MOVs as well as L-C line filters. More is not necessarily better but may move the point of failure to a readily accessible outlet strip rather than the innards of your equipment if damage occurs. Very effective protection is possible through the use of a UPS (Uninterruptible Power Supply) of the type which always runs the equipment off its battery via the internal inverter (not all do). This provides very effective isolation from power line problems as the battery acts as a huge capacitor. If something is damaged, it will likely be the UPS and not your expensive equipment. Another option is to use a constant voltage transformer (SOLA) which provides voltage regulation, line conditioning, and isolation from power spikes and surges. It is still best to unplug everything if the air raid sirens go off or you see an elephant wearing thick glasses running through the neighborhood (or a lightning storm is impending).
Ground Fault Circuit Interrupters (GFCIs) are very important for minimizing shock hazards in kitchens, bathrooms, outdoors, and other potentially wet areas. They are now generally required by the NEC (National Electrical Code) in these locations. However, what the GFCI detects to protect people - an imbalance in the currents in the Hot and Neutral wires, caused possibly by someone touching a live conductor - may exist safely by design in 3 wire grounded electronic equipment and result in false tripping of the GFCI. The reason is that there are usually small capacitors between all three wires - Hot, Neutral, and Ground - in the RFI line filters of computer monitors, PCs, and printers. At power-on and even while operating, there may be enough leakage current through the capacitors between Hot and Ground in particular to trip the GFCI. Even for ungrounded 2 wire devices, the power-on surge into inductive or capacitive loads like switching power supplies may falsely trip the GFCI. This is more likely to happen with multiple devices plugged into the same GFCI protected outlet, especially if they are controlled by a common power switch. Therefore, I do not recommend the use of a GFCI for computer equipment as long as all 3 wire devices are connected to properly grounded circuits. The safety ground provides all the protection that is needed.
Using a monitor on a different voltage or frequency is usually not a serious problem. Your PC and monitor should be fine, requiring at most a transformer (not just the sort of adapter intended only for heating appliances) to convert the voltage. They both use switching power supplies which don't care about the line frequency. Some power supplies are universal - they automatically adapt to the voltage they are fed without requiring even a transformer - but don't assume this: check your user manual or contact the manufacturer(s) to determine if jumpers or switches need to be changed. You could blow up the PC or monitor by attempting to run it on 220 VAC when set for 115 VAC. If you are lucky, only a fuse will blow but don't count on it. For non-switching power supply devices like printers and wall adapters that use line power transformers, in addition to matching the voltage (or setting jumpers or switches), running on a lower line frequency may be a problem. There is a slight chance that the power transformer will overheat on 50 Hz if designed for 60 Hz. (The other way around should be fine.) It is best to check the nameplate - it should tell you. If it does not, then it is best to contact the manufacturer.
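The reason 50 Hz operation can overheat a transformer designed for 60 Hz is that core flux is proportional to V/f: the same voltage at 5/6 the frequency means about 20% more flux, possibly pushing the core toward saturation. A quick illustration (the 120 V / 60 Hz ratings are just example values):

    # Relative core flux versus nameplate rating: flux ~ V/f.
    def relative_flux(volts, freq_hz, rated_volts=120.0, rated_freq=60.0):
        return (volts / rated_volts) * (rated_freq / freq_hz)

    print(relative_flux(120, 50))  # 1.2 -> ~20% over rated flux
    print(relative_flux(120, 60))  # 1.0 -> as designed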
(From: Bob Myers (myers@fc.hp.com)). Most manufacturers will quote an MTBF (Mean Time Between Failures) of somewhere in the 30,000 to 60,000 hour range, EXCLUSIVE OF the CRT. The typical CRT, without an extended-life cathode, is usually good for 10,000 to 15,000 hours before it reaches half of its initial brightness. Note that, if you leave your monitor on all the time, a year is just about 8,000 hours. The only "tuneup" that a monitor should need, exclusive of adjustments needed following replacement of a failed component, would be video amplifier and/or CRT biasing adjustments to compensate for the aging of the tube. These are usually done only if you're using the thing in an application where exact color/brightness matching is important. Regular degaussing of the unit may be needed, of course, but I'm not considering that a "tuneup" or adjustment.
(Portions from: Bob Myers (myers@fc.hp.com)). If the monitor complies with the VESA DPMS (Display Power Management Signaling) standard, it will go into power saving modes when either horizontal or vertical sync is disabled. Different combinations of the sync signals indicate different levels of power management, distinguished by how much the power is reduced and the expected recovery time. The greater the power savings, the longer the recovery time is expected to be. For instance, one thing that may distinguish the deeper power saving states is turning off the CRT filament, something that you don't recover from in just a second or two. You can tell which power saving mode is active by how long the monitor takes to come back to life:

1. Video blanking - the image will appear instantly when any key is pressed since this is just a logic level inhibiting the video drivers.

2. Full shutdown - a warmup period of around 15 seconds will be needed for the image to reappear since the filaments of the CRT need to warm up.
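The DPMS signalling convention itself is simple enough to write down directly - the presence or absence of each sync signal selects the state. The mapping below is per the VESA DPMS standard; the recovery-time notes are typical behavior, not guarantees:

    # VESA DPMS power states as signaled by H and V sync presence.
    def dpms_state(h_sync, v_sync):
        states = {
            (True,  True):  "on      - full power",
            (False, True):  "standby - minimal savings, fast recovery",
            (True,  False): "suspend - bigger savings, slower recovery",
            (False, False): "off     - maximum savings, filament off",
        }
        return states[(h_sync, v_sync)]

    print(dpms_state(False, False))  # expect the ~15 second warmup case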
A common misconception about the care and feeding of computer monitors is that they should be left on all the time. While there are some advantages to this, there are many more disadvantages:

1. CRT life: The life of a monitor is determined by the life of the CRT. The CRT is by far the most expensive single part and it is usually not worth repairing a monitor in which the CRT requires replacement. The brightness half-life of a CRT is usually about 10,000 to 15,000 hours of on time, independent of what is being displayed on the screen. 10,000 hours is only a little more than a year of continuous operation. By not turning the monitor off at night, you are reducing the life of the monitor by a factor of 2-3. Screen savers do not make any substantial difference, especially with modern displays using X-Windows or MS Windows where the screen layout is not fixed. With video display terminals, the text always came up in the same position and eventually burned impressions into the screen phosphor.

2. Component life: The heat generated inside a monitor tends to dry out parts like electrolytic capacitors, thus shortening their life. These effects are particularly severe at night during the summer when the air conditioning may be off, but it is still a consideration year round.

3. Safety: While electronic equipment designed and manufactured in accordance with the National Electrical Code is very safe, there is always a small risk of catastrophic failure resulting in a fire. With no one around, even with sprinklers and smoke alarms, such a failure could be much more disastrous.

4. Energy use: While modern monitors use a lot less energy than their older cousins, the aggregate energy usage is not something to be ignored. A typical monitor uses between 60 and 200 Watts. Thus, at a $0.10 per kWh electric rate, such a monitor will cost between $48 and $160 a year for electricity (see the sketch at the end of this section). During the night, 1/2 to 2/3 of this is wasted for every monitor that is left on. If air conditioning is on during the night, then there is the additional energy needed to remove this heat as well - probably about half the cost of the electricity to run the monitor.

The popular rationalization for what is most often just laziness is that power-on is a stressful time for any electronic device and reducing the number of power cycles will prolong the life of the monitor. With a properly designed monitor, this is rarely an issue. Can you recall the last time a monitor blew up when it was turned on? The other argument, which has more basis in reality, is that the thermal cycling resulting from turning a monitor on and off will shorten its life. It is true that such thermal stress can contribute to various kinds of failures due to bad solder connections. However, these can be easily repaired and do not affect the monitor's heart - the CRT. You wouldn't leave your TV on 24 hours a day, would you? Also see the section: "Thermal cycling and component life".

Some of the newest ('green') monitors have energy conserving capabilities. However, it is necessary for the software to trigger these power reduction or power down modes. Few monitors in actual use and fewer workstations or PCs are set up to support these features. If you have such a monitor and a computer to support it, by all means set up the necessary power off/power down timers. However, using the power saving modes of a 'green' PC with an older monitor can potentially cause damage since some of the modes disable the sync signals.
A 'green' monitor which can detect a blank screen and use this as a trigger can easily be used with a screen saver which can be set to display a blank screen - on any PC or workstation. Even if the monitor does not support power saving modes, a blank screen or dark picture will reduce stress on the CRT and power supply. Electronic components will run cooler and last longer. Please make it a habit to turn your monitors off at night. This will extend the life of the monitor (and your investment) and is good for the environment as well. For workstations, there are good reasons to leave the system unit on all the time. However, the monitor should be turned off using its power switch. For PCs, my recommendation is that the entire unit be turned off at night since the boot process is very quick and PCs are generally not required to be accessible over a network 24 hours a day.
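Here is the back-of-the-envelope arithmetic behind the dollar figures quoted earlier, assuming roughly 8,000 on-hours per year and the $0.10/kWh rate used above:

    # Yearly electricity cost of a monitor left on continuously.
    def yearly_cost_dollars(watts, hours=8000, rate_per_kwh=0.10):
        return watts / 1000.0 * hours * rate_per_kwh

    for w in (60, 200):
        print(f"{w} W monitor: ${yearly_cost_dollars(w):.0f} per year")
    # 60 W -> $48/year, 200 W -> $160/year; 1/2 to 2/3 of that is
    # wasted overnight if the monitor is never switched off.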