Compared to, say, a 100-watt lightbulb, how much energy does a running Dell PC consume in an hour (with the monitor on)? An estimation is all I am looking for.
If I am downloading, playing music, watching a DVD, and doing all kinds of other things, does it use more energy than if the PC were just sitting idle with the disks and monitor on?
I think it would vary greatly depending on your PC’s configuration and peripherals. I found something that said between 50 and 100 watts on average, from this site: http://www.wyse.com/overview/energy/
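To put that 50–100 watt figure next to the 100 W bulb from the original question, here is a rough back-of-the-envelope sketch. The specific wattages (idle draw, monitor draw) are assumptions for illustration only, not measurements of any particular Dell:

```python
# Rough comparison of a PC's hourly energy use vs. a 100 W lightbulb.
# All wattage figures below are illustrative assumptions, not measured values.

def energy_kwh(watts: float, hours: float) -> float:
    """Energy (kWh) = power (kW) x time (h)."""
    return watts / 1000.0 * hours

bulb = energy_kwh(100, 1)       # 100 W bulb for one hour -> 0.10 kWh
pc_low = energy_kwh(50, 1)      # low end of the quoted 50-100 W range
pc_high = energy_kwh(100, 1)    # high end of the range
monitor = energy_kwh(80, 1)     # assumed draw for a monitor left on

print(f"bulb, 1 h:            {bulb:.2f} kWh")
print(f"PC alone, 1 h:        {pc_low:.2f} to {pc_high:.2f} kWh")
print(f"PC + monitor, 1 h:    {pc_low + monitor:.2f} to {pc_high + monitor:.2f} kWh")
```

So on these assumed numbers, a busy PC plus monitor for an hour is in the same ballpark as one or two 100 W bulbs.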
Not enough to make a decent impact unless you are really pushing the system (running a 3D-intensive game, for instance), but computers generally spin down the drives and turn off the monitor when idle.
Picky, picky. Tell that to the author of the website I got the figure from; I never claimed to know the least bit about electricity. All I know is that it’s not the volts that kill you, it’s the amps.
I don’t know. I feel he has a right to be picky about the site, and I don’t think he’s being picky with you per se. It irritates people like sailor and myself when people who ought to know better (like people making energy-oriented sales pitches) don’t know what they are talking about.
My understanding (which could be wrong, but I was told this by someone who should have known) is that a 300 W power supply draws 300 watts. The usual rule of thumb is that the maximum it can supply is half of what it draws, so it can supply up to 150 W. Of course, if less is consumed, then it draws less.
If it weren’t rated by what it draws, you could not add it up with whatever else is on the circuit to see if you have overloaded it. Although in theory a single 15-amp, 110-volt circuit (in North America) could supply 1,650 watts, in practice it is good to limit it to 1,500.
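The "add it up to see if you've overloaded the circuit" check above can be sketched in a few lines. This assumes the 15 A × 110 V = 1,650 W North American circuit and the 1,500 W practical ceiling mentioned in the post; the example loads are hypothetical:

```python
# Sketch of the circuit-loading check described above.
# Assumes a 15 A, 110 V North American branch circuit, with the
# post's suggested practical ceiling of 1,500 W.

CIRCUIT_MAX_W = 15 * 110     # theoretical maximum: 1650 W
PRACTICAL_MAX_W = 1500       # rule-of-thumb safety margin

def overloaded(loads_w):
    """Return True if the summed nameplate draws exceed the practical limit."""
    return sum(loads_w) > PRACTICAL_MAX_W

# Hypothetical loads: a 300 W PC supply, a monitor, and a space heater.
print(overloaded([300, 150, 1200]))  # 1650 W total -> over the 1500 W limit
print(overloaded([300, 150]))        # 450 W total  -> fine
```

This is exactly why rating by draw is convenient: the nameplate numbers sum directly against the circuit's limit.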