Money’s tight right now, and my roommates and I were brainstorming ways to cut down on household spending. One thing that bothers me is that our electric bill is consistently high. We’ve vowed to stop leaving lights on, and another idea I came up with was to shut down our desktop computers when we’re not using them, instead of putting them into sleep mode. Does a computer use enough energy when it’s sleeping that shutting it down will make a difference to our monthly electric bill?
I watch my energy bill too and always turn off my computer between uses. If this is too troublesome for you, turn off the monitors. I’ve heard that the monitor uses most of the energy (but my bill is a bit lower since I started turning the whole thing off). My perspective is, anything that’s on uses electricity. Why pay for it when it’s not in use?
Not much. According to this page, a computer and monitor in sleep mode will draw between 15 and 30 watts. Over a month’s time, this works out to an energy usage of 22.3 kilowatt-hours (kWh) at 30 W. If we take a high average electrical cost of 10 cents/kWh (the national average is around 8-9 cents, IIRC), that works out to a monthly savings of a whole $2.23 per computer system. And that’s assuming it’s ALWAYS in sleep mode. Your actual savings will be less, depending on how much you actually use the machine.
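For anyone who wants to plug in their own electric rate, the arithmetic above is easy to sketch. A minimal back-of-the-envelope script, using the figures from the post (30 W draw, a 31-day month, 10 cents/kWh):

```python
# Sleep-mode cost estimate, assuming the machine sleeps around the clock.
watts = 30
hours_per_month = 24 * 31             # 31-day month
kwh = watts * hours_per_month / 1000  # convert watt-hours to kWh
cost_per_kwh = 0.10                   # dollars
monthly_cost = kwh * cost_per_kwh

print(f"{kwh:.1f} kWh/month, ${monthly_cost:.2f}/month")
# -> 22.3 kWh/month, $2.23/month
```

Swap in your own rate from the bill to see what sleep mode actually costs you.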
I should have added that the reason I wasn’t shutting it off was that I’d heard that frequently shutting down and booting up wasn’t good for the computer, but I don’t know if that’s still a concern. I own an iMac G5, so shutting down the (20-inch, thank-you-very-much!) monitor separately isn’t an option.
What does “not good for the computer” mean? Hard on the power supply? Hard drives? I’d be very surprised if there was any evidence of a significant deterioration in the lifetime of any of these caused by turning it off when you finish using it, especially compared with the normal lifetime of a personal computer. Computers are made to be turned off and on.
Computers (like most electronic things) are made up of different materials that expand and contract at different rates. Turning the computer on and off therefore creates a lot of stresses where dissimilar materials come together. If you’ve got a bad solder joint somewhere, the thermal stresses will make it fail sooner than it would if it stayed at a constant temperature. Inside all of the chips are itty bitty wires that connect the silicon inside the chip to the metal pins on the outside of the chip. Over time, all of the thermal stresses tend to cause these teeny tiny wires to lift up off of their pads, causing the chip to fail. There’s also a bunch of devices inside the computer that draw more current when they start up (like disk drive motors, for example). It’s kind of like a light bulb. They get the most stress at start-up, so that’s when they are more likely to fail.
So, basically, if you turn your computer on and off a lot there’s a whole bunch of things trying to kill it. This leads people to think you are better off leaving the computer on all the time. However, if you leave your computer on all the time there’s also a bunch of different things trying to kill it.
Moving parts (like disk drive platters and fans) wear out faster under constant use. Computers that are on all the time are also at greater risk of damage due to power fluctuations and noise spikes on the power line. In the quest for faster and faster computers that still need to be as cheap as possible, many computers are inadequately cooled. Components that run hot 24/7 are more likely to fail.
CRT type monitors last longer when you shut them off. I’ve never read any long term reliability studies about flat panel monitors, but I suspect they don’t care so much either way.
Whether it’s better to leave your computer on all the time or shut it off when not in use is going to depend a lot on your specific computer and the environment it is in. I personally have had a lot more things over the years damaged by power fluctuations than anything else. YMMV.
Both. And nearly everything else in there, too. The main problem is thermal stress. Turning on a cold computer relatively quickly heats up various components, particularly things like power supplies and CPUs. Leaving it on, OTOH, allows the temperature to stabilize at some point, after which it changes very little, resulting in lower thermal stress. Some studies have shown that leaving it on full time gives your PC nearly double (yes, I said DOUBLE) the life span of one which is turned on and off several times a day, on average. See this article for more information.
Did you read your own cite? While the number of powered-on hours is doubled when you leave it on all the time, the operational lifetime is in fact half as long if you leave it on continuously, according to your cite. Not to mention that it’s an article from 1993, and all the numbers given are made-up estimates.
The question is not whether the thermal stress caused by turning your computer on and off is bad for your computer, but whether it is likely to cause a noticeable decrease in the effective lifetime of your computer before you replace it. I would argue that computers are engineered to a tolerance that takes into account the possibility that a user may turn it on and off, and thus leaving it on for fear of breaking it is as silly as entering your car via the window to avoid undue wear on the door hinges.
Say what you will, Giraffe, but my PC has been on nearly continuously for four years and counting. Yes, I do reboot periodically because Windows is just stupid that way, and I need to do periodic maintenance like cleaning out the dust. Thermal stress is a big issue in electronics, which is why in high-rel applications which are subject to large thermal swings, such as in aircraft components, extensive thermal cycle testing is done to ensure the reliability of the parts under those conditions. Many parts we made at EWC were used in aerospace applications, and were heavily tested for reliability under temperature swings from, typically, -70 to +105 C. Granted this is far more severe than your typical PC will encounter, but the point still stands that thermal stress is a killer for electronic components and assemblies.
Looking over Q.E.D.'s article, and taking what engineer_comp_geek said into consideration, it looks like I’d be best turning it off during the week (when I don’t use it every day) and leaving it on during the weekend (when I do most of my computer work).
Giraffe, I heard the thing about powering up and down being bad for the computer about 6 or 7 years ago, when my mom was having computer problems and a friend of hers who handled all her computer problems said it was partially due to being shut down and restarted too frequently - my brother, step-sister and I were all using her computer and the MO at the time was to always shut it down when you were done.
You would probably save more energy by pulling out your refrigerator and cleaning the dust off its coils than anything you would do with your PC. It uses much more electricity than a PC does, and is on all the time. Refrigerators & freezers are the largest energy users in most houses.
Especially if your roommates stand in front of one with the door open trying to decide what to eat!
You should probably get this meter.
I love mine. No more debates about what different things use, or cost.
single-plug usage meter: "Electricity usage monitor connects to appliances and assesses efficiency. Large LCD display counts consumption by the kilowatt-hour. Calculates electricity expenses by the day, week, month, or year. Displays volts, amps, and wattage within 0.2 percent accuracy."
I haven’t asserted that leaving a computer on is bad for it. Every work computer I’ve had for the last ten years has been left on continuously without a problem, and every home computer I’ve had for the last 15 years has been turned on and off multiple times/day without a problem. My assertion is that in the absence of any evidence to the contrary, it’s safe to assume that daily power cycling is within the engineering tolerances of personal computer components. Thus, both leaving a computer on and turning it off are fine.
Sure, and the point stands that friction can destroy mountains. Should I stop opening my car door for fear of wearing out the hinges?
What evidence is there that the amount of thermal stress due to power cycling a computer under normal conditions is sufficient to cause a noticeable decrease in the computer lifetime?
Define “normal conditions” and I’ll get back to you. If you use it a LOT, like I do, you’ll definitely gain by leaving it on. If you use it very infrequently, like the OP, you’ll benefit more by turning it off when not needed.
I would argue normal conditions is power-cycling it once/day, leaving it on for 3-4 hours on weekdays and 8-10 hours on weekends. For argument’s sake, we could also consider the more extreme case where a user power-cycles it 5-6 times/day, just to get an upper bound.
You also need to consider the life expectancy of the board-level components in a computer. I’ve spent the last 11 years working with capacitors, resistors, inductors, etc.
Many of the parts used in computers, if the computer is running at about 55°C, have an expected life of 16k hrs. Approximately double that for every 10°C cooler that you run. At 55°C, that gives you 667 days, under 2 years. At 45°C, it would be 3.7 years.
This isn’t an absolute time frame, but it’s all that the component manufacturers will guarantee at those temps.
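That doubling-per-10°C rule of thumb is easy to play around with. A rough sketch, where the 16,000-hour rating at 55°C is the figure quoted above (and, as noted, a guarantee floor from the component makers, not an absolute lifetime):

```python
# Component life estimate: rated life roughly doubles for every
# 10 degrees C cooler the part runs, per the rule of thumb above.
def expected_life_hours(temp_c, rated_hours=16_000, rated_temp_c=55):
    return rated_hours * 2 ** ((rated_temp_c - temp_c) / 10)

for t in (55, 45):
    hrs = expected_life_hours(t)
    print(f"{t} C: {hrs:,.0f} hrs = {hrs / 24 / 365:.1f} years")
# -> 55 C: 16,000 hrs = 1.8 years
# -> 45 C: 32,000 hrs = 3.7 years
```

This matches the 667-day and 3.7-year figures in the post, and shows why a few degrees of extra cooling matters more than it might seem.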
My own recommendation in situations like this is to turn off the computer at night, but otherwise have it go into “sleep mode” during extended non-use periods during the day (assuming you have Windows XP; sleep mode/hibernation is BAD on any other version of Windows).
Also make sure that the monitor is set to turn itself off after a reasonable time of not being used–an hour is good for most people, but YMMV. The monitor does use most of the electricity of the computer system, especially if you have an inexpensive CRT monitor.
There are other, bigger drains on electricity, though. TVs suck power all the time, to keep their clocks going, and so that 21st-century viewers don’t have to wait for the “start up time” that 20th-century viewers had to endure.
“Wall warts”–the transformers that allow you to plug battery-powered devices into an AC outlet–are constantly transforming power, even if the cell phone/laptop/palm computer/Game Boy isn’t actually plugged into it. Unplug them if you aren’t actually using them.
We’re talking a few to several tens of milliwatts, typically. You gain very little in the way of savings by unplugging them.
Again, we’re talking very little. The draw of a transformer depends, of course, on the load. Ideally, a transformer would draw zero power when there is no load, but real-world transformers have losses which show up as the no-load current draw. For small transformers, the no-load draw can be very, very small: typically a few milliamps for the sorts used for small consumer electronic devices, such as you mentioned above. Again, very little gain from unplugging them when not in use–though very cheaply-made ones might benefit from an increase in lifespan.
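To put a number on it: here is a quick sketch of the annual cost of one idle wall wart. The 50 mW no-load draw is an illustrative assumption picked from the "few to several tens of milliwatts" range mentioned above, and the 10 cents/kWh rate is the one used earlier in the thread:

```python
# Annual cost of one wall wart left plugged in with nothing attached.
no_load_watts = 0.05                           # assumed 50 mW no-load draw
kwh_per_year = no_load_watts * 24 * 365 / 1000 # watt-hours -> kWh
annual_cost = kwh_per_year * 0.10              # at $0.10/kWh

print(f"{kwh_per_year:.3f} kWh/yr, about {annual_cost * 100:.0f} cents/yr")
# -> 0.438 kWh/yr, about 4 cents/yr
```

A few cents a year per transformer, which is why unplugging them barely moves the bill.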
Another way to use less electricity, which I’ve undertaken recently in response to another thread here, is to replace as many light bulbs as possible with compact fluorescent lamps. You won’t save in the short term, because the bulbs are much more expensive on average than incandescent lamps. But they last much longer, too, and use between one-fifth and one-third the amount of power to generate the same amount of light. So you save in the long run.
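The long-run savings are easy to estimate. A rough payback sketch: the 60 W incandescent vs. 15 W CFL wattages are a common like-for-like swap (within the one-fifth to one-third range above), while the 4 hours/day of use and the $3 price premium per CFL are illustrative assumptions, not figures from the post:

```python
# Payback time for swapping one incandescent bulb for a CFL.
incandescent_w, cfl_w = 60, 15   # CFL at roughly 1/4 the power
hours_per_day = 4                # assumed daily use
rate = 0.10                      # $/kWh, as used earlier in the thread

saved_kwh_per_year = (incandescent_w - cfl_w) * hours_per_day * 365 / 1000
saved_dollars = saved_kwh_per_year * rate
cfl_premium = 3.00               # assumed extra up-front cost of a CFL
payback_years = cfl_premium / saved_dollars

print(f"saves ${saved_dollars:.2f}/yr, pays back in {payback_years:.1f} years")
# -> saves $6.57/yr, pays back in 0.5 years
```

Even with conservative assumptions, a frequently-used bulb pays for itself well within its lifetime.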
On the downside, some CF lamps don’t come on instantly, or start out dim and take a minute or so to come to full brightness. But some are instant on, and at almost full brightness. Either way, it’s mildly annoying at first, but you get used to it.
I’ve replaced about 30 bulbs in my 3-BR condo. Just about the only ones I haven’t done are the track lights in my kitchen. So far there don’t seem to be any CFLs for track lights. I haven’t checked my electric bill yet, but since I just made the change I expect the difference will be masked by the change of seasons. But I’m sure I’m using a lot less electricity. Doing my little part to reduce our dependence on foreign oil.
Hold off on the compact fluorescents. I’ve had 3 die on me within 3 months of purchase. Dunno why, since I have others in close proximity still going strong. Might just be a manufacturing fluke.