How much power does my PC draw?

Two times in the past week I’ve heard someone on national TV say that the current energy crunch is partly due to people turning on their computers and leaving them on all day.

I’m thinking that a PC doesn’t draw much power, but before I decide that the speaker is an idiot I thought I’d ask: does anyone know how much a typical PC draws?

I believe a typical PC’s power supply is rated for a maximum output power of around 250 W. If we assume a PC’s power supply is actually putting out 150 W (I have no idea what a typical value is - this is a WAG), and if we assume the power supply’s efficiency is 70% (another WAG), then the input power would be 214 W. I just looked behind my monitor and the label says it draws 1.3 A, which is 156 W, bringing the grand total to around 370 W. That’s equivalent to about six 60 W light bulbs.
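The arithmetic above can be checked in a few lines. This is just a sanity check of the estimate, using the same guessed values (the 150 W output and 70% efficiency are WAGs, as noted, and 120 V wall voltage is assumed):

```python
# Back-of-the-envelope check of the wattage estimate above.
supply_output_w = 150        # assumed DC output of the PC's power supply (a WAG)
efficiency = 0.70            # assumed power-supply efficiency (another WAG)
supply_input_w = supply_output_w / efficiency   # power drawn from the wall, ~214 W

line_voltage = 120           # assumed US wall voltage
monitor_amps = 1.3           # current from the label on the back of the monitor
monitor_w = line_voltage * monitor_amps         # ~156 W

total_w = supply_input_w + monitor_w
print(round(total_w))        # ~370 W, about six 60 W light bulbs
```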

Look at the back of the monitor and look at the manual; they all list power requirements. It might be rated at 250 watts, but that’s the maximum output when you have a ton of crap in it drawing power, not the power it actually uses.

So basically it is about the same as what happens when you leave the kids home alone. :o “Whatta you think, I’m made of money? Turn off those lights!”

That’s where I’ll categorize those guys’ opinions.

It is not just the PC. The monitor can draw up to 2X the power your PC tower does. Don’t leave the monitor on if you are just downloading an 80 MB file.

Then there are also other components such as the printer (mine does about 45 watts when printing), and the external modem. The modem could be a wildcard. There have been some reports of defective broadband modems causing spikes in the electrical wires throughout the house.

My manual states that in standby it uses 35 watts.

Thanks Handy

I guess I’ll have to go with my first impression. IDIOT :smiley:

Once upon a time, two or three years ago, most PC power supplies had an output plug to power the monitor in addition to the PC itself. So I believe that 250 W rating is four or five times greater than what the PC alone is actually drawing.

Also, the power drawn by the monitor is closely tied to what’s being displayed. If the image has a lot of white it takes a lot more power than if it’s mostly black. Therefore, for example, SETI@Home doesn’t take much power even if you leave the monitor on.

Jerry Pournelle once hooked up some meters and measured a couple of typical machines and monitors. Somewhere between one and two light bulbs for PC+monitor is a good rule of thumb.

Computer and monitor 400 watts typical consumption, 4 cents cost per hour
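The 4-cents-an-hour figure checks out if you assume an electricity rate of about $0.10 per kWh (the rate isn’t stated in the post, so that’s an assumption here):

```python
# Sanity check of the "400 watts, 4 cents per hour" claim above.
watts = 400                   # claimed combined draw of computer and monitor
rate_per_kwh = 0.10           # assumed electricity rate, dollars per kWh

kwh_per_hour = watts / 1000   # 400 W for one hour = 0.4 kWh
cost_per_hour = kwh_per_hour * rate_per_kwh
print(f"${cost_per_hour:.2f}/hour")   # $0.04/hour, i.e. 4 cents
```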

Two times in the past week I’ve heard someone on national TV say that the current energy crunch is partly due to people turning on their computers and leaving them on all day.

Sorry to hijack, but if the TV companies were really concerned about the power crunch, they’d turn off their towers that use millions of watts instead of telling us to turn off our stuff. It’s getting ridiculous. Now they’re telling us to unplug VCRs when not using them. (Yeah, like the clocks in VCRs really use that much juice.)

I’m pretty sure those power supplies with the output plug for the monitor don’t include the monitor in their output ratings, since the monitor takes 110 V wall power just like the power supply does; the plug is just a convenience so you don’t have to use a power strip.

I bet if you opened up one of those power supplies, the output plug would be wired directly to the input plug.

(yeah, like the clocks in
VCRs really use that much juice)

2-4 watts.

PCs don’t use 400 watts. They are capable of that, though.

Hmm, this has me wondering. I have a LAN with lots of devices: a desktop, 2 monitors, 2 routers, 2 broadband modems, a switch, scanner, printer, speakers, and 1 or 2 laptops.

Through a few power strips, they all funnel into 2 outlets.

Is there any device (i.e., a consumer product) that one could interpose between a plug and socket to measure the power being drawn? Then I could accurately measure the aggregate and see what it’s costing me to leave all this stuff on all the time.

For most of those other things, you can get an upper bound by looking at the rating on the AC adapter. A lot of them are pretty low already, since the manufacturer isn’t going to spend extra money for more transformer than they need to supply the peak demand of the device - the adapter for my speakers says 18W, the one for my DSL modem says 30W.

Depending on the nature of the device we’re talking about, the “idle” power may be a good bit less. I doubt that my Palm Pilot cradle is drawing very much of the 12W implied by reading the AC adapter when nothing is in the cradle. And I doubt that my ancient printer is drawing very much of its 48W rating when not printing something.

Bottom line - shut off the monitor when you’re not using the PC, and you cut a very significant fraction of the usage.

Lessee, this machine has two power strips plugged into a UPS, one of which I turn off when not using the machine (which does a few background tasks on a scheduled basis, so I want to leave it on anyway). The strip I turn off has the monitor, the printer, the speakers and the Pilot cradle plugged into it. The computer, the DSL modem and an external Zip drive I’m leaving on. Come to think of it, I only have the Zip drive unswitched because I ran out of room on the power strip I want to switch on and off. It would probably be a bit better to swap it with the normally empty Pilot cradle as the Zip drive probably does draw a bit of power.

Well, when we do cooling loads for buildings, each PC counts as a 250 watt heat load (including monitor), so that’s probably worst-case. If not, I have to smack my boss and we have to revise our methodology :wink:

I too am interested in such a device. I imagine a small version of an electric meter such as that mounted on the house, but one I could move from outlet to outlet that I could plug various appliances into to get an actual idea of why my damn power bills are so high.

Does anyone know of such a device?

A server, perhaps…

We have 5-6 pcs downstairs and 2 running upstairs (I married a geek)… We leave the two main monitors on always, but they drop into standby mode after 10 minutes.

What seems to spike our electric bill up and down is use of the house fan… it’s a fair bet you use more wattage getting ready for work in the morning than you do running your pc.

Measuring power… is this maybe what you had in mind?

Meg

Here’s something more along the lines of what you’re looking for.

Meg

Meg, yer a wiz! I’m curious as to what you used for search queries. I looked all over. I’m gonna get one of those things coming my way.

Actually, if you get an Uninterruptible Power Supply (UPS) it’ll tell you what’s being drawn. I had forgotten that I have a 650W UPS which reports to one of my machines. The power here glitches often enough that I’m really glad I have that UPS. They’ve gotten pretty inexpensive. You might consider getting one for yourself. Most models have serial and USB versions. Well worth the money, IMHO.

Two PCs, a 19" monitor and a 21" monitor drew more than my UPS was rated for. I’m not at those machines right now, but I think it was about 110-120%. I think I ended up moving the 21" monitor off of the UPS and leaving the 19" on it. The UPS reports that the rest draw just over 50% of the rated output. There’s also an inkjet printer, speakers and subwoofers, etc on there, and these numbers are with those sitting idle.

A friend of mine had a smaller UPS, I think 150W, that handled his PC+monitor. But it would complain when his laser printer first powered up. After the initial power up, it handled the printer fine.