Office Equipment and Electrical Usage

Is it more efficient to leave machines turned on, if they go into a sort of “sleep” mode (power conservation), or turn them off at the end of the day and back on? Does this vary by machine? (Color Digital Print Engine vs Desktop computer, for example)

How about leaving lights on in a room throughout the day versus turning the lights off and on each time you enter a room?

Are you talking specifically about power consumption (at the office), or are you trying to take into consideration whether or not cycling the equipment is going to shorten its life and thereby cost more in the long run?

I was talking about power consumption, but I’m fine with also considering overall cost/benefit.

Yes, it varies by machine. With a PC it depends on which mode it goes into: do you have it set to turn off the monitor and the hard drives after a certain time?
If you don't set it to go to sleep and just have the screen saver come on overnight, then you're not saving any money. Have it all but turn itself off and you should be OK. I don't like turning the PCs off every night because I believe it leads to premature hardware failure.
Some printers you can turn off every night too, unless you use the thing as a fax and need it on all the time. It is not wise to cycle printers during the work day, because they go to great lengths to warm up and get ready to print.

You should try to turn the lights off when the room is not occupied; however, with fluorescent lighting it is not wise to over-cycle the lamps and ballasts. You'll save energy in the short term, but the cost to replace ballasts and lamps is quite high.
I would go with a timer/motion sensor that 'learns' how the room is used. They will optimise the light usage depending on how often people come and go.
Also, if you buy a cheaper control unit, set it to go off 15 minutes after it last senses motion. Less cycling that way.

This seems to be a hot topic making the rounds via e-mail lately. People are hearing (don’t know from where) that most of the energy consumed in their house is due to standby losses in devices that are always on but rarely used. So I did a little math:

A 1200 watt microwave oven, assuming it uses about 4 watts in standby mode:

1 month = 720 hours
(assuming) electricity cost = 10¢/KWh

Typical usage = 15 minutes/day = 450 min/mo = 7.5 hours/mo
Standby = 1425 minutes/day = 42750 min/mo = 712.5 hours/mo

Therefore,
Usage cost = 1.2 kW × 7.5 hours × 10¢/kWh = 90¢
Standby cost = 0.004 kW × 712.5 hours × 10¢/kWh ≈ 29¢

Total monthly cost = 90¢ + 29¢ = $1.19

So the standby cost of 29¢ is about a quarter of the total monthly cost of powering the microwave (and roughly a third of what the active use itself costs). You could call that a big fraction, depending on how you define "big"; maybe "significant" would be a better term. But is it worth constantly plugging and unplugging it to save less than 30¢ a month? That's up to you. The really big, power-hungry appliances like water heaters don't have a standby cycle; they're either on or off.
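If you want to play with the numbers, here's a quick sketch of the same arithmetic in Python (assuming a 30-day month and 10¢/kWh, same as above; the function name is just for illustration):

RATE_CENTS_PER_KWH = 10
HOURS_PER_MONTH = 30 * 24  # 720 hours

def monthly_cost_cents(active_watts, standby_watts, active_hours):
    # Split the month into "in use" and "standing by" and price each part.
    standby_hours = HOURS_PER_MONTH - active_hours
    active_cost = active_watts / 1000 * active_hours * RATE_CENTS_PER_KWH
    standby_cost = standby_watts / 1000 * standby_hours * RATE_CENTS_PER_KWH
    return active_cost, standby_cost

# Microwave: 1200 W active, 4 W standby, 15 min/day = 7.5 hours/month
active, standby = monthly_cost_cents(1200, 4, 7.5)
print(f"active {active:.0f}c, standby {standby:.1f}c, total {active + standby:.1f}c")
# -> active 90c, standby 28.5c, total 118.5c (i.e. roughly the $1.19 above)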

Where the savings could really start to add up is on a larger geographic scale. So you can see why the DoE and Green Advocacy groups might be vocal about these things.

Anything else that has a low percentage of use (compared to the total time it's plugged in) would probably come out the same. VCRs are plugged in 24×7, but you probably only use them 2 or 3 hours a week. Since VCRs use a lot less power than microwave ovens when they're being used, the standby cost will be much higher as a raw percentage, but the potential dollar savings will probably be about the same. Printers spend 99% of their life standing by idle, drawing an average of 5 watts (unless they are Energy Star compliant). But 99% of standby time ≠ 99% of power consumed.
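To get a feel for other low-duty-cycle gadgets, here's the same arithmetic again with some purely illustrative wattages and usage hours (these are guesses, not measured figures):

RATE = 10          # cents per kWh (same assumption as above)
HOURS = 30 * 24    # hours in a 30-day month

# (active watts, standby watts, hours of actual use per month) -- all guesses
devices = {
    "VCR":     (17, 4, 10),   # maybe 2-3 hours of playback a week
    "printer": (300, 5, 7),   # idle something like 99% of the time
}

for name, (active_w, standby_w, active_h) in devices.items():
    active_cost = active_w / 1000 * active_h * RATE
    standby_cost = standby_w / 1000 * (HOURS - active_h) * RATE
    total = active_cost + standby_cost
    print(f"{name}: standby {standby_cost:.0f}c of {total:.0f}c/month "
          f"({100 * standby_cost / total:.0f}% of the cost)")

With guesses like these, standby comes out as the vast majority of the percentage but still only pennies per month in absolute terms, which is the point.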