So how much energy do computers burn?

I work in an office where there are 10 or 12 computers. When people go home, they generally leave their computers running, although the screens usually switch themselves off to save power.

How much energy will a computer burn if left like this for 12 hours or so? Is this significant compared to the energy used during the other half or so of the day, when all the computers are in active use with screens on, plus fax machines, lights, etc.? Roughly how much might 10 computers idling all night cost in the United States?

Googling suggests that an “active” PC dissipates around 100 watts, which can drop to 20 watts (or less) in standby mode. In the US, power typically costs around $0.10/kWh, so 12 hours at 20 watts would cost around $0.025.
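
If you want to sanity-check that arithmetic, here's a quick back-of-the-envelope in Python (the 20-watt standby draw and $0.10/kWh rate are just the assumptions above, not measurements):

standby_watts = 20        # assumed standby draw per PC
hours_idle = 12           # overnight idle period
rate_per_kwh = 0.10       # assumed US electricity price, $/kWh

kwh_per_night = standby_watts / 1000 * hours_idle    # 0.24 kWh per PC
cost_per_pc = kwh_per_night * rate_per_kwh           # about $0.024
print(f"One PC overnight: ${cost_per_pc:.3f}")
print(f"Ten PCs overnight: ${10 * cost_per_pc:.2f}")

For ten machines in standby, that works out to roughly a quarter a night, or around $7 a month.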

Great! I’ll forget ever turning it off, then…

Is this true? I thought this was primarily to save wear on the screens themselves and power wasn’t a major concern. Seems like the type of thing someone answering this thread would know…

Screen burn isn’t as much of an issue with modern CRT monitors as it was with older ones. With LCDs, even less so. But the power savings from switching them off overnight aren’t insignificant. Typical CRT monitors draw between 70 and 90 watts or so, while LCDs consume around 20-30 watts. Assuming Xema’s $0.10/kWh, shutting them off saves as much as $43 a month for 10 CRT monitors, given 16 hours of idle time per day. Probably a drop in the bucket for the total electricity bill of a largish office like that, but still, it all adds up.
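
For what it’s worth, that $43 figure checks out if you take the high end of the CRT range (a rough sketch, assuming 90-watt monitors and the same $0.10/kWh):

monitor_watts = 90        # high end of the CRT range above
num_monitors = 10
hours_idle = 16           # idle hours per day
rate_per_kwh = 0.10       # assumed $/kWh

kwh_per_month = monitor_watts / 1000 * num_monitors * hours_idle * 30
print(f"{kwh_per_month:.0f} kWh/month, about ${kwh_per_month * rate_per_kwh:.0f}/month")

That comes to about 432 kWh a month, or roughly $43.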

The type of computer can make a noticeable difference - higher-end Athlon 64-based machines use about 120 watts at idle, while Pentium 4-based machines draw about 160 watts at idle.
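
In dollar terms, using the same assumed $0.10/kWh and a 12-hour night (note these are the idle figures quoted above, not standby):

rate_per_kwh = 0.10       # assumed $/kWh
hours = 12                # overnight idle period
for name, watts in [("Athlon 64", 120), ("Pentium 4", 160)]:
    nightly = watts / 1000 * hours * rate_per_kwh
    print(f"{name}: about ${nightly:.2f} a night, ${nightly * 30:.2f} a month")

So a machine left fully idling, rather than dropping into standby, costs more like $4-6 a month per box.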