Electricity isn't free, dammit!

An unspecified Doper has this in his sig:

(the emphasis is mine)

This statement is a LIE. It does NOT cost nothing to run the United Devices “cancer research” screensaver, just as it does not cost nothing to run the Seti@Home screensaver or the RC5 background distributed encryption cracker. All of these programs will work your CPU at maximum when it would otherwise be idle. This means your computer uses extra electricity doing that computation. The computation generates heat, which causes your processor fan to turn on or spin faster, consuming even more electricity. On top of that, the heat gets dispersed into your environment, which (in the summer at least) means your room gets hotter and your air conditioner runs more.

This adds up. Just the added processor power load is the equivalent of leaving on as many as five 60-watt light bulbs all the time. 300 watts 24 hours a day is 7.2 kw-h. At the Commonwealth-Edison rate for residential customers, that’s almost $8 a day in power consumed.

These programs also shorten your computer’s life (again, mainly due to thermal loading), but that’s more speculative costwise.

These programs are not free and I’m tired of people claiming they are. Do you want to make an $8/day donation to cancer research, or not? Personally, I don’t consider $2500 a year “free”.

Cite please.

What you say is undoubtedly true, but perhaps a little bit of an overreaction?

I think most of us could have worked out that, yes, indeed, extra load on the CPU translates into more electricity used. I notice you’re using figures very much at the high end to make your point - I don’t have the figures handy for full-use vs. idle power consumption on my machine, but a) it isn’t switched on 24/7 anyway, and b) quite often, when it is switched on, those CPU cycles are being dedicated to boring old work-related things quite a lot of the time. So, if I were running one of these programs, it wouldn’t cost me anything like $8 per day, or whatever that is in proper money. Yes, it would be costing me something - I know that. Surely the decision to incur that cost is up to me?

If the inaccuracy really offends you, perhaps an email to the person with the sig line might be more to the point?

Electric bill a little high this month?

$240/month to run SETI@Home?

The power supply on an average desktop system is usually rated between 250 and 400 watts. Servers often have larger supplies (some of ours have dual 1200 watt supplies). I’ve slapped ammeters on computers (one of my job duties includes load-balancing the power in our computing center). We’ve found that server systems draw about 30% of their rated power at “moderate load” and about twice that at “high load”. These numbers obviously vary from system to system; desktops have a wider range of performance (standby draw will be much less than maximum, perhaps as low as 10% of the rating, and actual maximum draw is likely to be closer to the supply’s rated max, since desktops are not built to be as failure-tolerant as servers).
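
As a rough illustration of those load fractions (the 300 W rating and the percentages are just the figures quoted above, not measurements of any particular machine):

```python
# Rough draw estimates from a supply's nameplate rating, using the load
# fractions quoted above.  Illustrative only; real draw varies widely.
RATED_WATTS = 300  # a typical desktop supply rating

load_fractions = {
    "standby":       0.10,  # "perhaps as low as 10%"
    "moderate load": 0.30,  # "about 30% of their rated power"
    "high load":     0.60,  # "about twice that at high load"
}

for label, fraction in load_fractions.items():
    print(f"{label:>14}: ~{RATED_WATTS * fraction:.0f} W")
```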

On the processor side, your typical Pentium III-class processor draws between 50 and 70 watts DC at maximum CPU load, based on some AMD documentation I have, versus 15 to 25 watts at standby. (Different processors have different performance.) Subsidiary hardware adds additional power consumption, as do cooling fans and hard drives. The power supplies used in computers are not particularly efficient, either, so you have to factor that in as well. Citing exact numbers is almost pointless, since there is so much variety in computer hardware out there and the power performance of a given computer is highly dependent on the particulars of the installation. The 300 watt figure I gave is about the most you can expect to see for a single-processor desktop system. Many systems will be lower, but few will show less than a 100-watt difference between standby and full power, and most of those will be laptops. Computers use a lot of power.

I assume you don’t need a cite on the math to get from 300 watts at 24 hours a day to 7.2 kw-h, or the application of the well-known 11 cent per kwh rate ComEd charges for residential power to get almost $8 a day.

11 cents per kwh * 7.2 kwh = 79.2 cents.

Someone is slipping a decimal place here. It may be me, I’m an arts graduate.
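
For what it’s worth, the arithmetic is easy to check. A quick sketch, assuming the 300 W draw and the 11 cents/kWh ComEd rate quoted above (both assumptions from the thread, not measurements):

```python
# Sanity check on the daily cost, using the 300 W delta and 11 cents/kWh
# figures quoted in the thread.
extra_watts = 300
hours_per_day = 24
rate_per_kwh = 0.11

kwh_per_day = extra_watts * hours_per_day / 1000   # 7.2 kWh
cost_per_day = kwh_per_day * rate_per_kwh          # ~$0.79, not $8

print(f"{kwh_per_day:.1f} kWh/day -> ${cost_per_day:.2f}/day")
```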

Steve Wright: Of course it’s your choice to make. And yes, I cited a high-end figure. Even at 100 watts, you’re still paying quite a bit each day. (Hm, I think I may have dropped a decimal point originally, so the per diem and per annum costs are off. Sue me. It’s still not free.) I am just a bit peeved at people who wrongfully blather that running screensaver distributed computing engines “cost nothing”. And, yes, I could email the individual in question, but I felt rather more like ranting. So there.

JuanitaTech: Actually, yes, but not because of screensaver programs. I don’t run screensavers at all, just screen blanking. And my power bill was high not because of my computers but because of my crappy air conditioner, the hot weather, and the plant lights upstairs.

Well, a simple check of the power bill would seem to disprove this; I’ve run various distributed computing programs and my power bill has never gone over $60/month, even in the dead of a Canadian winter (I don’t currently see my power bill, but I’m quite certain my landlord would mention something if it suddenly jumped to $300 or more).

I foresee an Emily Litella moment coming up soon for someone.

Actually, to make a really air-tight comparison, what you’d want to do is compare the load these programs put on the CPU against the screen saver someone would normally run. For you, that’s a fairly simple comparison; you don’t run a screen saver, so you’re incurring a net cost. I wonder if it’s true for everyone, though - I see a lot of my colleagues absent in meetings for hours, while photo-realistic fish swim about on their desktops, making the CPU render more polygons in a second than Van Gogh did in a lifetime. Would they lose any money by running a distributed engine? (Just asking. Seriously. I don’t know the answer.)

Um. I shared an apartment with two other guys, both with computers, at least one of which was on almost 24/7; I’m sure we averaged 20 hours of computer usage a day.

Our total monthly electric bill was never near what you’re suggesting.

Not that I’m arguing that it’s not free, just that the actual cost to the individual cannot be nearly as high as you are figuring.

If you want to argue about the collective cost of all these computers around the world running, that’d be interesting.

I did drop a decimal point; it’s actually about 80 cents a day, or close to $300 a year. Still not nothing, and enough to make a difference. My beef with the “costs nothing” claim remains.

I still have a problem even after you move the decimal – according to your figures, I should see an increase in my power bill of about $24 a month (power’s a bit cheaper here, so it’s more like $15 for me, but it’s the same principle), which just didn’t happen when I started running these programs. The $60/mo I mentioned is a high end for the middle of winter – it’s more like $35/mo normally (AC being virtually nonexistent here). There’s no way my power bill would be that low if it cost what you think it does.

Raygun99, I did say that your cost will vary depending on a number of factors. There’s also the issue that Steve Wright pointed out: you may have been running an expensive screensaver beforehand, so you were already wasting money anyway. Also, older computers don’t have the power management features of newer ones, so you won’t lose as much money on older hardware (these systems draw at about the same load all the time).

The newer your computer, the more the difference between standby and full power, especially with Pentium III and Pentium 4 processors (the higher speed P3s can draw anywhere from 5 to 90 watts DC, depending on load). The number I quoted assumes a relatively new, relatively fast computer with advanced power management, not running a screensaver (other than “blank screen”) and hard drives spun down when the computer is idle.
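
To put rough numbers on how much the answer depends on those assumptions, here’s a small sketch comparing a few scenarios. The wattage deltas and hours are illustrative, loosely based on figures quoted in this thread, and the 11 cents/kWh rate is the ComEd figure from above:

```python
# Illustrative cost comparison for a few idle-vs-full-load scenarios.
# Wattage deltas and hours are assumptions, not measurements of any
# particular machine.
RATE_PER_KWH = 0.11  # ComEd residential rate cited above

scenarios = [
    ("worst case: 300 W delta, on 24 h/day",      300, 24),
    ("newer desktop: 100 W delta, 24 h/day",      100, 24),
    ("modest delta, workday use: 50 W, 8 h/day",   50,  8),
    ("screensaver replacement: 10 W, 8 h/day",     10,  8),
]

for label, delta_watts, hours in scenarios:
    kwh_per_day = delta_watts * hours / 1000
    dollars_per_month = kwh_per_day * RATE_PER_KWH * 30
    print(f"{label}: ~${dollars_per_month:.2f}/month")
```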

Dammit, Steve, ya beat me to it. Evidently we do need a cite to do the math.

I am also a liberal arts graduate, but damn, at least I caught that howler.

Here are the specifics: Athlon 800 TBird; I’ve never run screensavers; monitor and HD power down after ~30 min and ~3 hours respectively.

KellyM, I would like to respectfully point out that you are completely full of shit. Yes, the CPU cycles you are donating to distributed computing are not precisely free, but their cost is very small (close to free for many people/businesses) compared with the value donated.

I think your wattage figures are wrong. First of all, PCs with a 300W power supply don’t draw 300 watts continuously, but typically a much smaller fraction of that (I’ve heard between 25-50%). The higher rating is headroom so the supply can hold its voltages under peak load; it isn’t a measure of average consumption. Second, the cost of these programs, if used as intended (as a higher-CPU version of a screensaver), is a small fraction of that. Running Seti@home instead of a bouncing ball on your screen doesn’t change the power draw of the peripherals (HD, video card, etc.) or the monitor at all. Just a small increase in the CPU power consumption. Say 10 watts. (I just made that up.)

If you are in a situation where the computer is left on with a screensaver regularly (e.g. a business or computer lab), this is a pretty negligible difference. Of course you shouldn’t leave your computer on just to run Seti@home instead of turning it off – that’s not what the program is for. It’s supposed to use spare cycles on a running machine that would otherwise go to waste.

But the biggest thing you’re missing is the value of the CPU cycles themselves. Say you paid $2000 for a PC. What you paid for is the capability to do calculations at a certain speed. Normal users, running standard applications a small fraction of the time, probably only use a few percent of the maximum capability.

At the same time, your taxes fund research. A lot of research is done on large computing clusters, which are expensive to run and maintain, and whose resources are constantly maxed out by an overwhelming demand for CPU power. In research, an hour of CPU time is a unit of currency no different from a dollar bill. Meanwhile, millions of PCs sit idle, running screensavers, under continuously burning fluorescent lights. Voluntary distributed computing has the potential to provide a large amount of CPU power at a negligible cost. For many people, the increased power consumption isn’t even noticeable.
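
To put that in perspective, here’s a rough sketch of the marginal electricity cost per donated CPU-hour, assuming the made-up 10 W increment above and the 11 cents/kWh rate cited earlier:

```python
# Rough marginal electricity cost per donated CPU-hour, assuming a 10 W
# increase over an ordinary screensaver and an 11 cents/kWh rate.
# Both numbers are assumptions from this thread, not measurements.
EXTRA_WATTS = 10
RATE_PER_KWH = 0.11

cost_per_cpu_hour = EXTRA_WATTS / 1000 * RATE_PER_KWH
print(f"~{cost_per_cpu_hour * 100:.2f} cents per donated CPU-hour")
# ~0.11 cents/hour, i.e. roughly a dollar per thousand CPU-hours
```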

No one is advocating using power you wouldn’t otherwise use, as you seem to think. If you were going to leave the computer on anyway, the cost difference is negligible, so calling it “free” isn’t that crazy. And the cost per CPU hour is almost certainly cheaper than that of large supercomputers, so you’re saving everybody money in the long run.

Why are you insisting on the 300-watt, 24-hour worst case? Is it an attempt to validate your self-righteousness? Assuming the computer is actually being used for its intended purpose, the time spent on CPU sharing will be no more than 8-16 hours a day. The actual cost, while not free, is less than the cost of a can of soda a day. A cheap generic can of soda.

It would be a more fruitful use of your frightful wrath to rage against screensavers and the leaving of monitors on at night. Or maybe against the design of processors that chase GHz ratings rather than actual processing power, resulting in excessive power demands.

Another point is that UD, at least (the cancer cure program) doesn’t run only as a screensaver. It actually runs full-time on the computer, using the extra cycles when you’re slowly scrolling down through a huge page of Straight Dope goodness and plotting your reply.

I don’t see any energy-conscious way to NOT have the monitor on while I’m reading online, but I don’t see a reason not to use that extra processing power, and as for the cost…

My electric bill when I lived alone with no computer = $90 during the summer.
My electric bill now = $105 during the summer. I have a computer, which is on about 16 hours a day; the Raven has a computer, likewise. We have a third computer which is on for internet browsing and IMing during gaming sessions, about 4-6 hours a day, and I have a laptop which I use to read before bed, 1-2 hours a day. That’s a total of about 36 computer-hours a day, at a cost of about 50 cents a day, which gives me 1.4 cents per computer-hour.

If I make the wild-arsed assertion that a full HALF of my computing power goes to rotating those little molecules (which is obviously way too high, c’mon, I play EverQuest here) then we’re looking at a cost of $7.50 a month tops, or $90 a year. That is for FOUR computers in a highly geeky household. Using the same numbers and 6 hours of computer usage a day, I get $15 a year.
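
A quick sketch of that household arithmetic, using the poster’s own figures and the half-share assumption above:

```python
# Reproducing the household estimate above.  All inputs are the poster's
# own figures and assumptions, not measurements.
BILL_INCREASE = 105 - 90      # dollars/month: summer bill with vs. without computers
COMPUTER_HOURS_PER_DAY = 36   # total across four machines, per the post

cents_per_hour = BILL_INCREASE * 100 / (30 * COMPUTER_HOURS_PER_DAY)
print(f"~{cents_per_hour:.1f} cents per computer-hour")                    # ~1.4 cents

share = 0.5                   # "wild-arsed" assumption: half the power goes to the client
print(f"household: ~${BILL_INCREASE * share:.2f}/month, "
      f"~${BILL_INCREASE * share * 12:.0f}/year")                          # ~$7.50, ~$90

single = cents_per_hour / 100 * 6 * share * 365                            # one machine, 6 h/day
print(f"one computer at 6 h/day: ~${single:.0f}/year")                     # ~$15
```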

No, $15 a year isn’t “free.” It’s the cost of a magazine subscription or a pair of sale shoes. But it’s also the amount you could save by recycling every soda can you use, picking up every penny you drop, or quitting smoking for a week.

Hey, now there’s a thought. Think we could get people to quit smoking, just for a week, for a cure for cancer?

Corr

I ran that for a month or so, but had to uninstall it because, for a distributed computing project, it was an intolerable resource hog. I have a fairly pokey PII-266, but when UD was running on it, my kernel processor usage shot up to 100% – even if no other applications (apart from the resource meter) were running. MP3s would stutter and skip. Everything bogged down.

Obviously, Kelly’s computations about energy consumption are flaky, but, from a resource POV, I did find UD too costly to run.

Possibly this was just a bug that has since been worked out, but my experience running it was not good. (Although I did like the spiffy looking graphics it generated while it was working.)