Folding@Home power consumption

I have a Dell Inspiron B130 laptop with a 1.73 GHz Pentium M processor. I used to hibernate the computer when I wasn’t using it, which was about 12-18 hrs/day (sometimes more). Now it runs 22-24 hrs/day at 100% CPU utilization. How much electricity am I using by doing this?

Oops! Forgot the link.
Folding@Home. It’s a distributed computing project that lets you donate unused CPU cycles to simulate protein folding and misfolding, helping scientists understand and develop cures for diseases like Alzheimer’s, mad cow disease, Parkinson’s, Huntington’s, and many cancers.

It’s a very worthy cause and it costs you absolutely nothing! (Except the cost of the electrical power, which I’m guessing is minimal. But we’ll find out.) You should do it, too!

Quick guess, without looking anything up…

Pentium M processors are pretty efficient, as modern processors go… when at 100% usage, they consume something on the order of 40 watts. To run F@H, the processor would be running as fast as it can, most of the other general circuitry would also be running and draining power, and the hard drive probably isn’t left alone long enough to spin down.

So my ballpark guess is no more than 100 watts. Probably not even that much… maybe around 60 watts?

Either way, it’s about as much as leaving a light on all day.

If you want to reduce the power you use, you might be able to set the laptop to turn off other devices. No reason to have the screen open and sucking energy, or the wireless card, or various other drives and devices. My Sony Vaio, at least, lets me turn off power to the optical drive and assorted ports to save energy.

You could probably figure the power usage more directly by measuring the current coming from the wall (with the right equipment). You could also see how long it runs on just the battery, and figure the power drain from the runtime and the battery capacity. That would give you a rough but good-enough answer.
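Here’s roughly how that battery-runtime estimate works, as a quick Python sketch. Both numbers are made-up examples, not measurements; check your own battery’s watt-hour rating on its label:

```python
# Rough average power draw from a full-charge runtime test.
# Both numbers here are hypothetical, not measurements.
battery_capacity_wh = 48.0   # capacity in watt-hours, from the battery label
runtime_hours = 1.5          # measured time from full charge to shutdown

avg_power_w = battery_capacity_wh / runtime_hours
print(f"Average draw: about {avg_power_w:.0f} W")  # -> about 32 W
```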

If you are using this 60 Watt (3.16 A @ 19 VDC) power supply, then the 6-10 hour daily increase will cost exactly the same as leaving a 60 Watt light bulb on for the same period.

Your electric bill almost certainly lists your monthly energy consumption in kilowatt-hours (kWh). Six hours a day at 60 Watts is 0.36 kWh per day; ten hours a day is 0.6 kWh per day. Over a 30-day billing cycle, you will be increasing your kWh figure by between 10.8 kWh and 18 kWh. If each kWh costs a dime, then you will be paying between $1.08 and $1.80 to help Folding@Home.
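Here’s that arithmetic as a short Python sketch, using the same assumptions as above (a constant 60 W draw and a dime per kWh):

```python
# Monthly cost of the extra runtime, assuming a constant 60 W draw
# (the supply's rating) and an assumed rate of $0.10 per kWh.
POWER_W = 60
RATE_PER_KWH = 0.10
DAYS = 30

for extra_hours in (6, 10):
    monthly_kwh = POWER_W * extra_hours * DAYS / 1000  # W * h -> kWh
    print(f"{extra_hours} h/day: {monthly_kwh:.1f} kWh, "
          f"${monthly_kwh * RATE_PER_KWH:.2f} per month")
# -> 6 h/day: 10.8 kWh, $1.08 per month
# -> 10 h/day: 18.0 kWh, $1.80 per month
```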

Cool! This also confirms that my mom was nuts about making me turn off every single lightbulb if I left a room. :wink:

Does this mean the power consumption is the same no matter what the processor (and the monitor and the disc drive…) are doing? What happens to the extra power when it isn’t actually powering anything? Or is 60 W just the maximum power draw?

You could get a Kill A Watt meter and actually measure it …

The laptop power consumption will vary depending on how you have the power-saving options configured, as lazybratsche mentioned. The power supply brick itself will also waste some power (see if it’s warm - the warmer it is, the more power it’s wasting).

In general, any power a computer draws ends up as heat. The 60W rating of the power supply brick is the maximum it can deliver, not what the laptop actually draws; I’d be surprised if a normal laptop came anywhere near that under a typical load.

My Thinkpad actually has a battery gauge that shows the real-time power being taken from the battery. Right now I’m using 1.70A at 12.15V, which is about 21 Watts. That’s with the LCD display at full brightness. I’d be surprised if a laptop could actually consume as much as 60W - that would need some serious fans to keep from overheating! It’d be like having a 60W bulb in your lap.

Arjuna34