According to the serial-number plates on my CPU and monitor, the one uses 5/3 of an amp and the other 1 2/3 amps. According to definitions in my dictionary and World Almanac, this means together they consume 380 watts per hour. Is this right? My brother, who gave the computer hardware to me, assures me that the CPU (Compaq Deskpro, “intel inside”) and the monitor (NEC) are designed for “low energy consumption.” (He rebuilt them himself.)
Watts = Volts x Amps.
Household current in the US is generally between 110 and 120 volts.
3/5 of an amp + 1.66 amps = 2.26 amps; at 110 volts, that's a grand total of approximately 250 watts.
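If you want to check the arithmetic yourself, here's a quick Python sketch using the figures quoted above (3/5 A, 1 2/3 A, 110 V); these are just the nameplate readings, nothing measured:

```python
# Back-of-the-envelope: watts = volts x amps, using the figures quoted above.
volts = 110               # typical US household voltage (110-120 V)
amps_cpu = 3 / 5          # CPU nameplate as read above (the OP later says 5/3)
amps_monitor = 5 / 3      # monitor nameplate, 1 2/3 A
total_amps = amps_cpu + amps_monitor

print(round(total_amps, 2))        # ~2.27 A
print(round(total_amps * volts))   # ~249 W, i.e. roughly 250 W
```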
380 Watts? Sort of, in theory… but not really, in practice.
The amperages listed on the outside of computer hardware are typically peak current draw, which usually occurs at startup. Power is related to current and voltage by P=VI. My monitor, for example, is rated at 2.0A, which would work out to 240W at 120V. However, its average operating power is only about 135W, closer to half the peak value.
For the computer itself, it looks like you have a power supply capable of delivering up to 200W (1.66A * 120V). That's just a power capacity; actual power use depends on how fast your processor is, how much RAM you have, how many drives are installed, and so on. Without knowing exactly what you have inside, I'd venture a guess (with a wide margin of error) that the monitor and computer together consume, on average, about 50% of their rated power, or roughly 200W.
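Putting those two points together as a rough sketch (the 2.0 A rating, the 135 W average figure, and the 50% rule of thumb are the numbers above; anything else is an assumption about this particular machine):

```python
# Rated (peak) power vs. an estimated average draw, per the discussion above.
volts = 120
monitor_rated_w = 2.0 * volts        # 2.0 A nameplate -> 240 W
computer_rated_w = 1.66 * volts      # ~200 W power-supply capacity
rated_total_w = monitor_rated_w + computer_rated_w

estimated_avg_w = 0.5 * rated_total_w   # the ~50%-of-rated rule of thumb above
print(round(rated_total_w))     # ~439 W combined rated capacity
print(round(estimated_avg_w))   # ~220 W, in the same ballpark as the ~200 W guess
```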
“Low energy consumption” typically refers to a computer’s power management… being able to shut off the monitor, drives, and other internal components when nobody is using it. These settings will usually be found in the Windows control panel.
[small nitpick]
Watts are a measure of power, that is, energy per unit time. Saying "Watts per hour" is like saying "Miles per hour per hour"… unless you're describing how power changes over time, you just need Watts.
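In other words, watts already include the "per unit time"; multiply by hours and you get energy (watt-hours), which is what the utility bills for. A tiny illustration with a made-up 200 W load:

```python
# Power (watts) vs. energy (watt-hours): a hypothetical 200 W load running 3 hours.
power_w = 200
hours = 3
energy_wh = power_w * hours
print(energy_wh)          # 600 Wh
print(energy_wh / 1000)   # 0.6 kWh, the unit on your electric bill
```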
Is this a trick question?
5/3 is the same as 1 2/3.
[EE lecture] "Watts" is a rate: the amount of energy used per unit time, which is power… "Watts per hour" isn't a valid unit. In alternating-current (AC) circuits there is also a quantity called "power factor" that enters into the computation of the power. The actual formula is voltage × current × cosine of the phase angle between voltage and current. [/EE lecture]
What you computed was volt-amperes, which (when multiplied by the time of use) will, in AC circuits, nearly always be more than what is registered on the watt-hour meter in your house. The volt-ampere figure can equal the real power involved, but it can never be less than it.
When I run your numbers for volt-amperes using 120 volts, I get 400 VA.
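To make the volt-ampere vs. watt point concrete: the 400 VA comes straight from the numbers above (two 5/3 A loads at 120 V), while the phase angle below is purely an illustrative assumption, not anything measured on this hardware:

```python
import math

# Apparent power (volt-amperes) from the numbers above.
volts = 120
total_amps = 5 / 3 + 5 / 3                 # both nameplates read 5/3 A
apparent_power_va = volts * total_amps
print(round(apparent_power_va))            # 400 VA

# Real power = V x I x cos(phase angle). The 45-degree angle is an assumption
# for illustration only; the actual power factor of this hardware is unknown.
phase_angle_deg = 45
power_factor = math.cos(math.radians(phase_angle_deg))   # ~0.71
real_power_w = apparent_power_va * power_factor
print(round(real_power_w))                 # ~283 W, what the meter would register
```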
Please, no thanks are necessary for the pedantry.
You may be right… but I am positive that the CPU indicates 5/3 of an amp. I forgot the reading I got from the monitor, but I do remember that I calculated the total (the two units added together) as 380 watts. I didn't mean this as a trick question. I was using the 120-volt current as a basis.
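For what it's worth, you can back out what the forgotten monitor reading would have to be for the 380 W total to work out at 120 V; this is just a consistency check, not a claim about what the NEC plate actually says:

```python
# Consistency check: what monitor amperage would give 380 W total at 120 V,
# given a CPU nameplate of 5/3 A? (Purely inferred, not read off the plate.)
volts = 120
total_w = 380
cpu_amps = 5 / 3
monitor_amps = total_w / volts - cpu_amps
print(round(monitor_amps, 2))   # ~1.5 A implied for the monitor
```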
My power supply has a power rating on it of 250 watts. But doesn't that only indicate what it CAN put out, not what it IS using (because I'm not using all of the power it can put out)? If you want to, unplug everything else around the house & keep the computer on, then run out & look at your electrical meter & see what it's using.
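If you do try the meter experiment, the arithmetic afterwards is just energy divided by time; the readings below are made-up placeholders, not real meter values:

```python
# Turning two watt-hour meter readings into an average power figure.
kwh_before = 12345.0   # hypothetical meter reading at the start of the test
kwh_after = 12345.2    # hypothetical reading one hour later, computer as the only load
hours = 1.0

energy_kwh = kwh_after - kwh_before
average_w = energy_kwh * 1000 / hours
print(round(average_w))   # ~200 W average draw during the test
```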