Computer Electricity Costs - I7 Processor

I am buying a new computer. The salesman said that if I get an i7 processor instead of a Quad Core processor, I would save $300 per year in electricity costs. That sounds questionable to me; I'm surprised a computer even uses that much electricity in a year.

So does anyone have any idea how much the electricity cost savings would be from buying the i7 instead of the other choice?

Caveats - I may not be using all the terms correctly, but hopefully you can figure out what I mean. I know there are probably a million variables, so feel free to offer a back-of-the-envelope guesstimate.


Sounds like nonsense to me. Our total electricity bill (for a 3-bedroom house) for a year will be comfortably under $1500 in Florida. Given that more than half of that is air conditioning and refrigerator, the salesman is asserting that the difference between two processors amounts to as much power as every non-refrigerant appliance in my house combined.

I would be surprised if the difference was even $30.

$300/year at $0.10/kWh = 3000 kWh/year; 3000 kWh ÷ 8760 hours = 342 W, running 24/7.

A large PC running full-out might consume this much power, but most don't run at full power all the time (my massive G5 Quad idles at 150 W and consumes up to 400 W when running with all 4 cores at 100%).

So, the numbers are a stretch.
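The arithmetic above can be checked with a few lines; the $0.10/kWh rate is the assumption used in the post, not a universal figure:

```python
# Back-of-the-envelope check: what continuous wattage would a
# $300/year electricity saving imply at $0.10/kWh?
annual_savings_usd = 300.0
rate_usd_per_kwh = 0.10       # assumed rate from the post
hours_per_year = 24 * 365     # 8760

kwh_saved = annual_savings_usd / rate_usd_per_kwh   # 3000 kWh/year
avg_watts = kwh_saved * 1000 / hours_per_year       # implied continuous draw

print(f"{avg_watts:.0f} W, 24/7")  # -> 342 W, 24/7
```

That 342 W difference would have to hold around the clock, which no desktop CPU comes close to.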

There's no way. These processors manage power consumption a lot better than the previous generation, and they use less power when taking performance into account as well. But the difference is probably not going to be that large.

The current Core i7 processors have a thermal design power (TDP) of 130 W, and most Core 2 Quad processors are 95 W. The actual power consumption depends on usage, but TDP is how much heat the cooling system is designed to handle, so it gives us some indication of peak power consumption.

However, I understand the i7 has a more advanced power management system. (Power management in a CPU basically means it reduces power draw when CPU load is low.) So I suspect the difference in actual power consumption is negligible.

I’m calling BS on that one as well. I just moved my office back home. My wife and I work full time (more than full time, really) and use computers all the time. With two computers (each with two LCD monitors), a headless server (on 24/7), the lights and the extra heating, our monthly electric bill has only gone up $60. If we look at the wattages I know, we’ve got 440 W for lighting and 1500 W for heating (it’s on most of the office hours during these winter months). I doubt the computers are more than 1000 W combined.

Thus, I’d conclude 3 dual-core computers and 4 LCD monitors have got to be less than $20/month.
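A rough sanity check on that "$20/month" figure; the rate, daily hours, and workdays here are assumptions for illustration, not numbers from the thread:

```python
# Hypothetical office-usage estimate for ~1000 W of combined computers.
rate = 0.10          # $/kWh (assumed)
hours_per_day = 9    # office hours per day (assumed)
days_per_month = 22  # workdays per month (assumed)

computer_watts = 1000  # combined estimate from the post
kwh = computer_watts / 1000 * hours_per_day * days_per_month
cost = kwh * rate
print(f"${cost:.2f}/month")  # -> $19.80/month under these assumptions
```

So ~1000 W of computers run during office hours lands right around $20/month, consistent with the conclusion above.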

Thanks to all who replied. I figured that what I was told did not sound right, but I could not validate it myself.

If your heating is electric resistance heating, you are not saving anything in winter by getting a computer that uses less electricity; the "wasted" power just becomes heat your furnace doesn't have to supply. You will only realize a net savings during those times when heating is not required.

I thought the i7 was a quad-core processor, isn't it?

Never mind, I see you are probably referring to something like a QX6700.

Yes, it does have 4 cores. However, Intel says it has advanced power-saving features, so it will use less power than previous 4-core chips.

Here's a test showing that the i7 uses about 20 W less power than a comparable quad-core at idle, and roughly the same power under load. So assuming your computer is idle 24/7 all year (the best-case scenario), that's about 175 kWh in power savings for the year. At $0.10/kWh, that's roughly $17.50.

Of course, if you really wanted power savings, you could just go with a decent efficient dual-core CPU, or buy a laptop.
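That savings figure works out as follows (the 20 W idle delta is from the linked test; the $0.10/kWh rate is an assumption):

```python
# Annual savings from a 20 W lower idle draw, machine idle around
# the clock (best case), electricity at an assumed $0.10/kWh.
idle_delta_watts = 20
hours_per_year = 24 * 365
rate = 0.10  # $/kWh (assumed)

kwh_per_year = idle_delta_watts * hours_per_year / 1000  # 175.2 kWh
savings = kwh_per_year * rate
print(f"{kwh_per_year:.0f} kWh -> ${savings:.2f}/year")  # -> 175 kWh -> $17.52/year
```

And that's the ceiling; any hours the machine is off, or under load (where the two chips draw about the same), shrink it further.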

True, although during summer you're going to pay twice for an inefficient computer if you have the A/C on: once for the wasted CPU energy, and again to remove the extra heat.