I know someone out there must be knowledgeable about computer energy consumption. Help me here: I know that computers “power-down” or go into “power-saver mode” to save energy. I also know that when I turn on my computer, I hear things spinning and wheezing and chunking around in there, obviously using a bit of energy to get things started in the morning. There is an urban legend that you should just leave your computer on overnight, because the amount of energy consumed in power saver mode is less than the energy consumed to start it up. I think it’s bullshit, but at some point it has to be better to leave it on. At what point should you have just turned the damn thing off? 2 hours? 8 hours? 24 hours?
Six or eight hours is the conventional wisdom I’ve always heard, but as you suspect, I doubt there’s any science behind it. Even if there is, I suspect someone measured one particular computer back in the 1980s and the figure has been repeated ever since.
Any real answer is going to depend on so many variables that it almost certainly won’t generalize from one system to another. If you want to know for sure, you can buy metered outlet doohickies – they plug into the outlet, you plug the system into them, and they give you a reading like your electric meter.
But that’s just going to tell you about electricity/energy efficiency. Granted, that’s what you asked, but it’s not all you need to be concerned about. The biggie is thermal expansion and contraction: lots of things in a computer get hot when the computer is on, then cool when it turns off. This causes expansion and contraction of components, which I’ve heard can be blamed for nearly all computer part failures that aren’t because of brownouts. Leaving the system on most of the time limits the damage here.
On the other hand: hard drives, LCD backlights, and a few other components have useful lives that are shortened by continuous use (although I’d be interested to know which is worse for LCDs: repeated power cycling or continuous use). That’s why most systems give you the option to power down the drives and monitor separately from the whole system.
Me? I put LCD monitors on everything (vastly lower energy use than CRTs), set everything to sleep after a few hours, and leave 'em on all the time. Some XP systems don’t like to wake from sleep – it seems to be an industry-wide issue – so those stay fully powered, but I still tell 'em to sleep their drives and monitor.
My UPS unit reports my computer is currently using 180W of power. Of that, about 40W is the monitor, so the computer itself is using about 140W. It has a 500W power supply, so even during power-up, it can’t be using more than 500W. If we assume the power-up procedure consumes 500W and takes 1 minute, it uses less energy than 4 minutes of normal operation.
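If you want to sanity-check that arithmetic, here’s a quick Python sketch using the wattages from my post above (they’re assumptions about my machine, not yours – plug in your own numbers):

# Back-of-the-envelope check of the numbers in the post above.
# These wattages are assumptions about one machine, not measurements of yours.
STARTUP_POWER_W = 500      # worst case: the full rating of the power supply
STARTUP_TIME_MIN = 1.0     # assumed time spent booting
NORMAL_POWER_W = 140       # computer alone, excluding the 40 W monitor

startup_energy_wmin = STARTUP_POWER_W * STARTUP_TIME_MIN   # watt-minutes for one boot
break_even_min = startup_energy_wmin / NORMAL_POWER_W
print(f"One power-up costs about {break_even_min:.1f} minutes of normal operation")
# Prints roughly 3.6, i.e. less than 4 minutes of normal use.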
scr4 sounds right.
I think the worst thing about turning the power off, at least when it’s off for an hour or more, is the thermal cycling of all the parts.
There’s some break-even point – my guess is a few hours – where turning it off becomes cheaper than leaving it on. Going by power alone, it’s probably a matter of minutes; going by wear and tear, it’s probably more like days.
I leave mine on if I expect to use them in the next few days. I leave them on over a weekend without use but turn them off for a week without use. I set the screen blanking and hard drive powerdown at around a half hour.
Thank you for your help! You brought up several good points that I hadn’t considered, especially with regard to overall computer deterioration as a result of temperature variation. Computers present their own set of environmental issues during construction and disposal.
I’m just a freeloader right now with my trial membership - I hadn’t thought it would actually be this useful…
On the other hand, most computers become obsolete before they break down, so using extra energy to prolong their lives may not be a good idea.
It’s coming on winter here in the northern hemisphere.
If you heat with electricity, then a left-on computer is just a heater. If you turn it off, then your heating system will end up using the energy that your computer would otherwise have used.
True, but the computer is much less efficient as a heater than a modern home furnace. So in that sense, it would be better to turn the computer off, and let the furnace keep the house warm.
Personally, I leave my computer on all the time, and run some BOINC (http://boinc.berkeley.edu/) network computing tasks to cure diseases, study global warming, discover pulsars, etc. This probably makes my computer use more electricity than if it were asleep, but I feel that it’s a worthwhile contribution to society.
If your furnace is electric, then the computer is exactly as efficient as a furnace. All the electricity used by a computer comes out as waste heat. Better yet, you get to do something useful with it first. Heck, run Seti@home all winter if you heat with electricity!
No, even with electric heat (which is real rare around here), the furnace is more efficient at heating a house than your computer – it has fans & ductwork to distribute the heat around the house, a thermostat to turn it on and off as needed, etc. My PC is under a desk, with the fan blowing the waste heat onto an outside wall – doesn’t do nearly as much to warm up the room as the heating vent.
Yes, I agree. (I have nearly 20,000 CPU hours in Classic SETI, and over 50,000 credits in the BOINC version. Plus credits in various other BOINC projects.) Check out http://boinc.berkeley.edu/
The computers I have measured use more power when starting up than when just sitting there, but not a lot more – perhaps 25% more at most compared to the computer sitting idle with no application running. (As long as an application is not doing anything, the computer does not use noticeably more energy than it does with only the operating system running.) Hopefully it takes 5 minutes or less to start up your computer (my G5 takes about 30 seconds), so it only takes a few minutes of idle time to make it worth turning the computer off from an energy point of view. However, if you are only going to be gone for a short time, it will not be worth the hassle and you should just put the computer to sleep instead. My computer is set to sleep after 10 minutes of idle time and only uses a couple of watts in that mode. It wakes up in just a couple of seconds. After I turn it on, I leave it on until I go to sleep, unless I am going to be gone for several hours.
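To put that in numbers, here is a rough break-even sketch in Python; the idle wattage, the 25% startup overhead, and the boot time are assumptions based on my measurements, so substitute your own:

# Rough break-even model: how long do you have to be away before shutting
# down beats leaving the computer idling?  All figures below are assumptions.
IDLE_POWER_W = 120                    # computer sitting idle
BOOT_POWER_W = IDLE_POWER_W * 1.25    # roughly 25% more power while starting up
BOOT_TIME_MIN = 5.0                   # a slow 5-minute boot (my G5 is ~0.5 min)

def break_even_minutes(idle_w, boot_w, boot_min):
    """Minutes away after which shutting down saves energy.

    Leaving the machine idling for t minutes costs idle_w * t watt-minutes.
    Shutting down costs roughly one boot, boot_w * boot_min, treating the
    power draw while off as zero.
    """
    return (boot_w * boot_min) / idle_w

t = break_even_minutes(IDLE_POWER_W, BOOT_POWER_W, BOOT_TIME_MIN)
print(f"Shutting down pays off after about {t:.1f} minutes away")
# About 6 minutes with a 5-minute boot; well under a minute for a 30-second boot.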
When running, a desktop computer uses 50 to 300 Watts depending on the make and model (faster models, especially those with two processors, use a lot more power). A CRT monitor uses about 60 to 175 Watts (14 to 22 inch), highly dependent on size and somewhat dependent on brand (Sony monitors tend to be more efficient, for example). An LCD monitor uses 20 to 90 Watts for an average size monitor (14 to 22 inch), highly dependent on size, make and model.
There is a large variation in how much power computers use when they are put to sleep. Computers put together piece by piece, as opposed to something like a Dell, HP or Apple, seem to use the most energy; they often cannot go to sleep or hibernate properly and keep the hard disk spinning and the fan(s) running. I do not think that turning your computer off is significantly harder on it than leaving it on, but if you insist on leaving it on, you should at least put the computer to sleep. If it does not go to sleep properly, only the monitor will go off, but the monitor uses a lot of power, especially if it is a CRT, so it is still worth it (a small LCD might not use a ton of power – maybe 20 watts for a very good 14-inch monitor). If the computer does go to sleep, the monitor and computer combined should use less than 20 watts, and ideally 5 or 10. How long an idle time should trigger sleep depends on how you use the computer and how long it takes to wake up; I think 10 to 20 minutes is good, and up to 45 minutes is reasonable, assuming the computer wakes at a normal speed.
I strongly recommend turning off your printer and speakers when you are not using them. They can use a lot of power even when not printing or playing music. A good printer might use 2 watts at idle, but I have measured ones that use 30 watts. A printer can be from a quality brand, expensive, and good at printing, but that does not mean it is good on energy use (the same goes for other products). However, some excellent printers use very little energy. A good, average-sized speaker system might use 10 watts when there is no sound; bad ones can use up to 30, more if they are big. These numbers might not sound big, but when they are on 24 hours a day it adds up.
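To see how those standby watts add up over a month, here is a small Python sketch; the 30-watt printer and 10-watt speakers are just the example figures above, and the 10-cents-per-kilowatt-hour rate is an assumption:

# How much do always-on peripherals cost over a month?  Example figures only.
PRINTER_IDLE_W = 30        # a power-hungry printer doing nothing
SPEAKERS_IDLE_W = 10       # a typical speaker system with no sound playing
RATE_PER_KWH = 0.10        # assumed electricity price in dollars
HOURS_PER_MONTH = 24 * 30

kwh = (PRINTER_IDLE_W + SPEAKERS_IDLE_W) * HOURS_PER_MONTH / 1000
print(f"{kwh:.0f} kWh a month, about ${kwh * RATE_PER_KWH:.2f}")
# Roughly 29 kWh and $2.88 a month just for gear that is sitting idle.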
A refrigerator built after 1992 uses less energy than a 75-watt light bulb left on 24 hours a day (when the compressor is running, a refrigerator draws hundreds of watts, but it is off much of the time). A very good, full-size refrigerator built today averages about 40 watts. An average computer might use 120 watts just sitting there and an average LCD monitor might use 50, so that’s about 170 watts around the clock, which would cost about $12 a month at 10 cents a kilowatt-hour. The energy you save by turning your printer and speakers off when not in use, and by either turning off your computer and monitor or putting them to sleep, is more than enough to run a new refrigerator – assuming you sleep at some point and are not constantly listening to loud music while printing and playing a graphics-intensive game.
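And the same arithmetic for the computer-versus-refrigerator comparison (again, the wattages and the 10-cent rate are the assumptions used above):

# The $12-a-month figure, worked out.  Wattages and rate are assumptions.
COMPUTER_IDLE_W = 120
LCD_MONITOR_W = 50
FRIDGE_AVERAGE_W = 40      # a very good, full-size modern refrigerator
RATE_PER_KWH = 0.10
HOURS_PER_MONTH = 24 * 30

def monthly_cost(watts):
    return watts * HOURS_PER_MONTH / 1000 * RATE_PER_KWH

print(f"Computer + LCD, 24/7: ${monthly_cost(COMPUTER_IDLE_W + LCD_MONITOR_W):.2f}")
print(f"New refrigerator:     ${monthly_cost(FRIDGE_AVERAGE_W):.2f}")
# Roughly $12.24 versus $2.88 – the savings from sleeping or shutting down
# the computer easily cover a new refrigerator, as claimed above.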