Electricity isn't free, dammit!

Uneducated Dickhead checking in here:

Let's say you have a rooftop full of solar arrays. Granted, they are a fixed cost, so any power they generate will never be free. However, once you get to the point where they have broken even, and are now delivering power to your home at no additional cost to you (we won't get into the selling back of unused electricity, which you can do where I live), and the sun shines all day “for free”, can't you claim that indeed, Electricity is FREE!!??

I'm only playing Devil's advocate here because the wife and I are considering going completely solar and so far, the cost/benefit analysis looks promising. Sure, the initial investment is steep, but over time (and I got lots of that, at least I hope) we will enjoy “free” electricity. We could generate more than we could use and even sell some back to the power company (this is the part I am still struggling with, conceptually and philosophically). If we are constantly in a surplus of power with no further outlay of funds, that seems like “free” to me.
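Just to put rough numbers on the break-even point, here's a quick Python sketch. Every figure in it (the installed cost, the bill it offsets) is a made-up placeholder, not a quote for any real system:

```
# Rough solar payback sketch -- all figures are made-up placeholders.
installed_cost = 15000.00       # hypothetical up-front cost of the array, in dollars
monthly_bill_offset = 100.00    # hypothetical electric bill the panels replace each month

months = installed_cost / monthly_bill_offset
print(f"Break-even after about {months / 12:.1f} years")
# After that point each additional kWh costs nothing at the margin, which is
# the sense in which the power starts to feel "free" (ignoring maintenance).
```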

Anyway, continue with the debate. I, for one, have no idea what this screensaver business is about anyway. I have noticed a decent decrease in the power bill by shutting off all the pinball machines when we aren't playing them, but the gameroom just doesn't look right…


I behave as if stupidity were a virtue.

OK, geek who has (some) hard facts about CPUs checking in.

The most power-hungry PC CPU ever to be released that I know of is the AMD Athlon 1400 MHz, at 72 W (the AMD Athlon XP 2100+ also uses this amount; the 2200+ uses 67.9 W). It can be argued that the original Pentium 4 used more, but Intel does not publish power usage figures for its processors. Now, without a monitor or any peripherals, I'd guess a computer using one of these processors can hit a theoretical maximum of about 150 W of power usage. The only other appreciable contributor to power usage is the video card, and they average around 20-30 W at maximum load. It is true that much larger power supplies are required, but this is because the load isn't spread evenly across the different voltages that the power supply can put out, but concentrated on the +5 V and +12 V rails.

At idle with a software CPU idler running (or integrated into the BIOS and/or chipset), the CPU will consume a negligible amount of power. My Thunderbird overclocked to 1 GHz runs 1-2 °C above ambient temperature when idle, and I have very inefficient air cooling. I'm gonna make an educated guess and say that a completely idle computer with all drives spun down will consume under 25 W. In reality, software cooling is rarely employed, so the CPU will never be running truly idle. A supposedly “idle” CPU draws about 75% of its full-load power usage.

My estimation is that, with the CPU and the system bus being in maximum use by an application such as the Distributed.net client, the system as a whole will use around 100 W. We're talking one light bulb's worth. This is NOT a significant amount.
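To put a dollar figure on that light bulb, here's a quick back-of-the-envelope in Python. The electricity rate is an assumed typical residential figure, not something anyone in this thread quoted:

```
# Monthly cost of a ~100 W box running 24/7 -- the rate is an assumed
# typical residential figure, not a number from this thread.
watts = 100            # estimated full-load draw from the post above
rate_per_kwh = 0.10    # assumed price per kilowatt-hour, in dollars

kwh_per_month = watts / 1000 * 24 * 30
print(f"{kwh_per_month:.0f} kWh/month, about ${kwh_per_month * rate_per_kwh:.2f}/month")
# Roughly 72 kWh and $7 a month under those assumptions.
```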

As for shortening the system's life, it is true that every 10 °C increase in CPU temperature halves the operating lifespan. However, a properly functioning CPU will never even approach its operating limits, even under maximum load. I have a Pentium 120 that ran at maximum load, overclocked to 133 MHz, for about 4 years, mostly 24/7, and never failed. My Athlon 900@1028 has worked under similar conditions for 1.5 years without issue.
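If you want to see what that halving rule actually implies, here's a tiny Python sketch; the baseline lifespan and both temperatures are invented purely for illustration:

```
# Rule of thumb from the post: lifespan halves for every 10 C rise.
# Baseline lifespan and both temperatures are invented for illustration.
baseline_life_years = 10.0   # assumed lifespan at the baseline temperature
baseline_temp_c = 40.0       # assumed baseline CPU temperature
load_temp_c = 55.0           # assumed temperature under sustained load

life_years = baseline_life_years * 0.5 ** ((load_temp_c - baseline_temp_c) / 10)
print(f"Estimated lifespan: {life_years:.1f} years")
# A 15 C rise takes the estimate from 10 years to about 3.5 -- still far
# longer than most people keep a CPU, which is the point made above.
```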

In conclusion, you do pay a small amount to run a distributed computing application, but it's such a small amount as to not be worth worrying about. The societal benefit is arguably worth it.

It's not a bug, you just don't have a good enough processor.

It's good enough to run Photoshop, 20 or 30 instances of Opera, play .mp3s, and calculate deep-zoomed fractals at 1200x1200 simultaneously; it should be plenty good enough to run a simple distributed computing app that is designed to run in the background. No probs with the SETI project.

Or were you joking?

My electricity bill is usually fifteen dollars a month (I live in a shack the size of a largeish matchbox), and I run two computers with these programs on them, twenty four hours a day. I guess that means that the electric company is actually giving me money or something. Cool.

Not for high values of 11. :wink:

Sheesh, I run two TVs virtually all day long (while I am awake), two lights that are on 24/7, and I have been running many fans in my house, plus my computer, monitor, speakers, modem, firewall/router, and my 1.4 GHz server… oh, about 96% of the time. Oh wait, I still have things like my dishwasher, washer, dryer, my landlord/friend running power tools to fix the fence, hair dryer, stereo on 24/7… not to mention the two UPS units that also draw power.

My last power bill (and I have a ton of programs running on a weenie-ass 600 MHz machine along with a server that runs programs) was only $37.52 for the month, a total of 539 kilowatt-hours over 30 days. That is approximately $1.25 a damn day.
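For anyone who wants to check that arithmetic, the figures in the bill itself are enough; nothing below is assumed beyond what's quoted above:

```
# Sanity check on the bill quoted above -- no assumed figures here.
bill_dollars = 37.52
kwh_used = 539
days = 30

print(f"Effective rate: ${bill_dollars / kwh_used:.3f} per kWh")  # about $0.070/kWh
print(f"Cost per day:   ${bill_dollars / days:.2f}")              # about $1.25/day
```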

Huh.

Sounds like a minimal issue to run a computer to me. But then again, I just deal with computers and a kick-ass server… not really into the details, but I think my power usage is minimal with all the shit that is run in this house.

Keep in mind I am self-employed; I am here a good majority of my life.

PS sweetie

I have a 300 watt power supply on one machine and a 400 watt power supply on the other.

HUH, guess I should be paying about $16 or more a day on just the computers alone???
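For what it's worth, the wattage on the power supply is a ceiling, not what the machine actually draws. Even at the full nameplate rating, and at the roughly $0.07/kWh rate implied by the bill above, the two boxes come nowhere near $16 a day:

```
# Worst case: both supplies running flat out, which real machines rarely do.
# The rate is the ~$0.07/kWh implied by the bill quoted earlier.
rated_watts = 300 + 400
rate_per_kwh = 0.07

kwh_per_day = rated_watts / 1000 * 24
print(f"{kwh_per_day:.1f} kWh/day, about ${kwh_per_day * rate_per_kwh:.2f}/day")
# About 16.8 kWh and $1.18 a day even at full rating; actual draw is
# usually a small fraction of that.
```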

I would say “no,” because you still have capital costs.

Let me illustrate: Suppose you give me $100 and in return I agree to pay you $1.00 a year for 1000 years. After the first hundred years, are you getting “free” money? No, you're getting screwed.

I would imagine that in most cases, you’d be better off taking the money you would have used to buy solar panels, putting it in a CD, and using the interest to pay your electric bills.
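Here's what that opportunity-cost comparison looks like with numbers plugged in. Every figure below (installation cost, CD rate, the size of the bill) is a made-up placeholder, not advice:

```
# Opportunity cost: buy the panels, or leave the money in a CD?
# Every figure is a made-up placeholder.
panel_cost = 15000.00           # hypothetical cost of the solar installation
cd_rate = 0.04                  # assumed annual CD interest rate
annual_electric_bill = 1200.00  # assumed yearly bill the panels would offset

cd_interest = panel_cost * cd_rate
print(f"CD interest per year:   ${cd_interest:.2f}")
print(f"Bill the panels offset: ${annual_electric_bill:.2f}")
# Whichever number is bigger wins; either way the power isn't "free",
# because the capital tied up in the panels could have been earning interest.
```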

I think the difference between this situation and donation of CPU cycles is that you already have a computer whether you want to give away the CPU cycles or not.

Electricity is free.
.
.
.

They only bill me to push it along the wires to my house.