I like to keep a desktop PC at home running all the time, but I’ve noticed the energy consumption is non-negligible (judging from how warm that room stays, even in winter). An obvious solution is to use a laptop instead, but I don’t want to give up my dual-monitor setup with two 20-inch LCD monitors. Any thoughts on what I should upgrade to, if anything?
FYI the specs of my current PC are:
[ul]
[li]ASUS A8N-SLI Premium motherboard[/li]
[li]AMD A64 X2 3800+[/li]
[li]Four 300GB HDDs (RAID 0+1)[/li]
[li]GeForce 8600 GT video card (Gigabyte GV-NX86T256H)[/li]
[li]Windows XP[/li]
[/ul]
I could just give up the RAID setup, but I’m not convinced removing two HDDs would make that much of a difference. I don’t need anything particularly fast; I don’t play games.
Any savings you get from a more power-efficient computer are going to be wiped out by the cost of buying the new one for that reason in the first place. There’s nothing wrong with power efficiency, but computers aren’t that power-hungry, especially in some modes. How about setting it to go to sleep after some interval of non-use? That’s more practical, and it’s how most people deal with the issue.
Financially speaking, I’m sure you’re right. But I don’t want to feel guilty about my carbon footprint every time I walk into my room. Also, I’m sure the room would be uncomfortably warm in summer, or at least present a non-negligible extra load on the air conditioner.
I will if I have to, but it would be an inconvenience: having to close all applications every time I leave the desk (because some of them seem to keep the computer awake), having to download podcasts manually, having to go to my room and turn the PC on every time I want to print from my laptop, etc.
Unfortunately, those monitors and that video card are probably pulling more juice than anything else; even if you trimmed the RAID, I doubt you would see a significant difference in power drain.
If you are really that worried about it, you can find one of those nifty little plug-in power meters and plug your power bar into it. Then try unplugging a monitor, etc., to see what difference, if any, the various sleep/suspend/power-off functions make.
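Once you have readings off the meter, turning a wattage difference into dollars is simple arithmetic. Here’s a quick sketch in Python; the $0.12/kWh rate is just an assumption, so plug in whatever your utility actually charges.

[code]
# Estimate the annual cost of a constant measured power draw.
RATE_PER_KWH = 0.12  # dollars per kWh (assumed rate; use your own)

def annual_cost(watts, hours_per_day=24.0):
    """Return dollars per year for a constant draw of `watts`."""
    kwh_per_year = watts * hours_per_day * 365 / 1000.0
    return kwh_per_year * RATE_PER_KWH

# Example: a monitor drawing 40 W, left on around the clock
print(f"${annual_cost(40):.0f}/year")  # about $42/year
[/code]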
Far as I’m aware, computer hardware manufacturers don’t really concern themselves with building energy-saving parts. Sure, they’ll try to optimize the power drain, but mostly it’s about either performance or cost. The primary exception to this is monitors; LCDs take less energy (IIRC) and give off less heat than CRTs.
If you have a CRT monitor, switching to LCD is probably the biggest step you can take. Beyond that, I should think software solutions for sleep mode and the like are your best bet to control idle power consumption.
Your AMD processor should have a power-saving feature designed to reduce processor clock speed and voltage in accordance with the load (Cool'n'Quiet), which helps you. Your four hard drives are also going to draw a fair amount of power. You may also want to clean the inside of the computer: a build-up of dust will insulate your components, making the case fans work harder and use more energy. Are you keeping your computer on all the time just for convenience, or is there a task that it's performing (downloading or defragmenting, for instance)?
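To give a sense of why Cool'n'Quiet matters: dynamic CPU power scales roughly with frequency times voltage squared. Here’s a back-of-the-envelope sketch; the P-state figures below are rough numbers for an A64 X2 3800+, not official specs.

[code]
# Rough estimate of Cool'n'Quiet savings.
# Dynamic power scales approximately as P ~ f * V^2.
FULL_GHZ, FULL_V = 2.0, 1.35   # full-speed P-state (assumed figures)
IDLE_GHZ, IDLE_V = 1.0, 1.10   # lowest P-state (assumed figures)

ratio = (IDLE_GHZ / FULL_GHZ) * (IDLE_V / FULL_V) ** 2
print(f"Idle dynamic power is roughly {ratio:.0%} of full-speed power")
# -> roughly 33%
[/code]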
I would also suggest turning off your monitors and printer when not in use (not just letting them sleep, actually shutting them down). Better cooling for your PC might also allow you to throttle back on energy consumption. If you don’t already do so, clean your intakes, fans, filters and heatsinks.
I bought a new computer when Oblivion first came out, an expensive Alienware. What I didn’t expect was that the 600-watt power supply would make it unbearable to run during the summer. Worse yet, it turned out to be on the same circuit as my refrigerator (!), a combination which fried the wiring and then fried the main breaker on my circuit panel. Total cost for electrical repairs and the computer: $30K. Yep.
Then your wiring had been installed by an incompetent boob and was a time bomb waiting to go off. A 600 W power supply will only draw 5 A at 120 V, and I can all but guarantee it wasn’t running even close to capacity. Refrigerator loads vary, but typically run around 6-9 A. With a 20 A breaker on the line, the total load is still just below the 80% derating of 16 A. In any case, even if the load was too great for the wiring to handle, an appropriately sized circuit breaker should have tripped long before the wiring got hot enough to cause damage. And to fry the main breaker? To be charitable, I find that a little difficult to swallow.
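For anyone who wants to check those numbers, here’s the arithmetic spelled out; the 9 A figure takes the high end of the refrigerator range above, and the 20 A breaker is the assumption from my post.

[code]
# Sanity check on the circuit-load claim above.
PSU_WATTS = 600      # power supply rating
LINE_VOLTS = 120     # typical US branch circuit
FRIDGE_AMPS = 9      # high end of the 6-9 A range quoted above
BREAKER_AMPS = 20    # assumed breaker size
DERATING = 0.80      # continuous-load derating rule of thumb

psu_amps = PSU_WATTS / LINE_VOLTS      # 5.0 A at full rated output
total_amps = psu_amps + FRIDGE_AMPS    # 14.0 A worst case
limit = BREAKER_AMPS * DERATING        # 16.0 A continuous limit

print(f"PSU: {psu_amps:.1f} A, total: {total_amps:.1f} A, limit: {limit:.1f} A")
print("Within limit" if total_amps <= limit else "Over limit")
[/code]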
I remembered that my UPS reports power usage, so I set it up again: my system uses about 180 W with one monitor on, and 140 W with the monitor off, which isn’t as bad as I feared. And that’s after I figured out how to turn on the CPU power management (thanks, Flander). I think I’ll get some HDD trays so I can keep two of the drives as offline backups instead of running RAID. And I may still try enabling sleep mode to see how much of an inconvenience it really is.
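For the record, here’s what those readings work out to over a year, using the same assumed $0.12/kWh rate as earlier (adjust for your own utility):

[code]
# Annualize the measured UPS readings.
RATE_PER_KWH = 0.12  # dollars per kWh (assumed rate)

for label, watts in [("monitor on", 180), ("monitor off", 140)]:
    kwh = watts * 24 * 365 / 1000
    print(f"{label}: {kwh:.0f} kWh/year, about ${kwh * RATE_PER_KWH:.0f}/year")
# monitor on: 1577 kWh/year, about $189/year
# monitor off: 1226 kWh/year, about $147/year
[/code]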