I heard that it takes far more energy to start a computer than to keep it running once it’s started. Is this true? If so, how long would the running computer’s energy use take to equal the energy used in starting it?
I don’t think it’s the amount of energy used so much as the stress on the components, which is greater when starting a machine than when keeping it in a steady state.
I have heard from a few reasonably reliable sources that leaving a computer on is better for it than turning it on/off when you need it.
The same goes for light bulbs.
>> If so, how long would the running computer’s energy use take to equal the energy use for starting a computer?
Let us assume the worst case: the computer uses twice as much energy while starting (which is just for the sake of argument), and it takes a minute to start. So it uses the energy of two minutes of idling. Anyone who says keeping it on saves energy does not have the faintest idea of what he’s talking about.
On the other hand, much (most?) of the power used in running your computer goes to the monitor, and of the rest, consumption is lower when it’s not actively doing calculations. So an “idling” computer with the monitor off won’t eat much power.
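To put that startup-versus-idling arithmetic into concrete numbers, here is a minimal sketch in Python, assuming made-up figures for the startup surge (double the power for one minute, per the worst case above):

    # Break-even idle time: how long the machine must sit idle before the
    # energy saved by shutting down exceeds the extra energy used at startup.
    # The surge figures are hypothetical, just to illustrate the arithmetic.
    idle_watts = 75            # steady-state draw while idling
    surge_extra_watts = 75     # assumed extra draw during startup (i.e. double)
    surge_seconds = 60         # assumed length of the startup surge

    extra_startup_joules = surge_extra_watts * surge_seconds   # 4500 J
    breakeven_seconds = extra_startup_joules / idle_watts
    print(breakeven_seconds)   # 60.0 -- shut it off for anything longer than
                               # a minute and you come out ahead on energy

In other words, under the “twice the power for one minute” assumption, the startup energy pays for itself after a single minute of avoided idling.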
Can I hijack this thread just a little? I would like some real answers to the question: “What is good operational practice?
(a) Never shutting down/restarting the computer until necessary, OR (b) Shutting down/restarting regularly, or maybe even infrequently?”
One aspect of this is energy, but the key aspect seems to be wear and tear on the components.
If you turn the computer on and off a lot, you create a lot of thermal stress, caused by different materials expanding and contracting at different rates. The most likely failure is in the little pieces of metal that connect the silicon to the metal contacts inside the chips; they tend to eventually lift up off their pads and disconnect, causing the chip to fail.
On the other hand, if you leave the computer running all the time, all of the mechanical parts that are constantly moving (fans, hard drive platters, etc) tend to wear out much faster. You are also more vulnerable to problems caused by power (brownouts, spikes, etc).
Which is better for your particular system is definitely not trivial to calculate. You definitely don’t want to turn the computer on and off too frequently (like every 10 minutes). If you have cheap fans (like most computers do these days), then leaving the computer running may be worse. Many CPUs run a little hot (so the manufacturer can save a few cents by using a smaller fan/heatsink), and in a computer heat kills, so if your CPU does run above 35 deg C it is a bit more likely to die an early death if you run it 24/7.
On the other hand, if your computer is adequately cooled, and you live in an area that has few power problems (or you’ve got a heck of a UPS and power filter setup) then maybe it would be better for your system to leave it on all the time.
Laptops definitely run very hot. I wouldn’t leave a laptop on 24/7 or it will die young. It will probably die young anyway just because of the way it is made (compared to a desktop).
Monitors last longer if they are turned off.
Most people upgrade their computers after about 3 years, which kind of makes most of the discussion rather moot. Turn off monitors and laptops (but not so frequently that you are constantly turning them on and off) and don’t worry so much about the rest.
I’ve been trying to keep my roommate’s computer from frying while she’s away, and I figured a good bet is to turn the computer on when I get home from work, if I’m going to use it, and turn it off when I go to bed.
She, on the other hand, leaves it on all the time.
I just can’t do that in California, it costs a fortune for electricity.
This is a rather old topic of sorts, so maybe I can sort it out.
In the old days of computers (1980 to the mid-’90s) the power supplies were crap, so it made more sense to keep the computer on all the time. But modern computers have better power supplies, so they can handle the on/off cycling fine.
I once called WD, the HD people. I asked a tech whether it was better for the HD to keep the computer on all the time or not. He said that in theory it should be the same, but in practice they found it much better to turn the computer off at night. There ya are, right from the HD company itself.
Well, if your computer is set up properly, it should be shutting the hard drive off when not in use anyway. My suggestion is to leave the system running and have a program sucking up the unused CPU power and putting it to good use. Examples of such programs are the United Devices “Cure for cancer” program and the Distributed.net client. Power usage should be about equal to one or two light bulbs at worst, so your power bill won’t go up TOO much.
A 75 watt bulb going 24/7 for a month is 54 kWh. That is a non-trivial amount of electricity. Assuming you use the computer 4 hours a day, that is 45 kWh extra a month.
Power prices range from about 8 cents a kWh (US average) to about 13 cents a kWh (New York and CA); that is about 43 to 70 bucks extra a year for no particularly good reason.
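For anyone who wants to check the arithmetic, here is a quick back-of-the-envelope calculation using the 75 W steady-state figure and the quoted rates:

    # Extra electricity from leaving a 75 W machine on 24/7 when you only
    # actually use it 4 hours a day, at the quoted 8-13 cents/kWh rates.
    watts = 75
    hours_used_per_day = 4
    extra_hours_per_day = 24 - hours_used_per_day          # 20 hours
    extra_kwh_per_month = watts * extra_hours_per_day * 30 / 1000
    print(extra_kwh_per_month)                             # 45.0 kWh

    for rate in (0.08, 0.13):                              # $/kWh
        print(round(extra_kwh_per_month * 12 * rate, 2))   # 43.2 and 70.2 $/year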
PS: here is a site that directly answers the OP and where I got the 75 W figure from. It’s a guy who measured his computer’s power draw at turn-on and in steady state.
http://w9if.net/iweb/cpupower/index.shtml
As noted above, it’s healthier to run a computer 24/7 anyway. The additional power used by a computer running at 100% CPU usage, as opposed to sitting idle, will probably be only a dozen watts or so, which is not a significant amount. (I say probably because idle power usage varies significantly even among computers with the same processor, based on the power management logic used by the operating system and BIOS.) Besides, the extra power is being expended for a good reason, depending on the task you set it to.
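To put a rough number on “not significant,” here is the same kind of estimate for an extra dozen watts of CPU load, using the 8–13 cents/kWh rates quoted earlier (an assumption on my part):

    # Yearly cost of an extra ~12 W of CPU load running around the clock,
    # at roughly 8-13 cents/kWh.
    extra_watts = 12
    kwh_per_year = extra_watts * 24 * 365 / 1000    # about 105 kWh
    for rate in (0.08, 0.13):
        print(round(kwh_per_year * rate, 2))        # roughly $8 to $14 a year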
Interesting post, gazpacho. Slight hijack here: I thought new computers have power factor correction in their switching power supplies. Does anyone know how good that is? Also, is it correct to assume the power factor is leading (capacitive)?
“Plug “leaking energy” in electronics. Many new TVs, VCRs, chargers, computer peripherals and other electronics use electricity even when they are switched “off.” Although these “standby losses” are only a few watts each, they add up to over 50 watts in a typical home that is consumed all the time. If possible, unplug electronic devices and chargers that have a block-shaped transformer on the plug when they are not in use. For computer scanners, printers and other devices that are plugged into a power strip, simply switch off the power strip after shutting down your computer. The best way to minimize these losses of electricity is to purchase Energy Star® products.”
http://www.consumerenergycenter.org/flex/tips.html
That’s for California. They are trying to get us to conserve.
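For perspective, 50 watts of standby load running around the clock adds up. A rough estimate at the California rate quoted earlier:

    # Yearly cost of ~50 W of standby ("leaking") load, at roughly 13 cents/kWh.
    standby_watts = 50
    kwh_per_year = standby_watts * 24 * 365 / 1000   # about 438 kWh
    print(round(kwh_per_year * 0.13, 2))             # roughly $57 a year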
Addressing the OP:
I can think of 4 main differences between startup consumption and standard (non-idle) elec. usage.
- The power supply kicking on. A switching PS has a special current fed to it to get it “kicking” (controlled by a MOV or some such). Probably not a noticeable amount of elec.
- The hard drive spinning up. (And CD-ROM if it’s one that has to spin to determine that no disk is in.) A fairly small amount of elec. Not worth worrying about.
- The CPU, memory etc. go thru a horrendous amount of work during boot up. Far greater than during normal use. But again, not worth worrying about. (But …)
- A CRT monitor starting up will draw a noticeable amount of extra power in order to kick on: firing up the cathode, getting the flyback transformer going, charging all the caps, getting the deflection coils going, etc. My WAG is that it’s more like the equivalent of running the monitor for a couple minutes. Hardly in the hour-equivalent range.
Now, as to that “But …”: if you have a weak PS, it’s the startup that’s the problem. Once everything is up and running, PS problems are less likely to interfere. So the startup drain, in toto, has to be significant.
If the issue were solely “leave it on for X or shut it off to save power (= $),” I think the break-even X would be an hour at the very most, and possibly as little as 5 minutes; for any absence longer than that, shutting it off saves money.
As others have mentioned, Energy Star and the like mean that a lot of the equipment can be set up to virtually shut down when not in use.