Amps vs. Watts

I should know this, but I haven’t needed it in a while. The book that would tell me is in storage.

How many amps are drawn for a given wattage? For example, let’s say I wanted to put three 1,000 watt tungsten lights on a circuit. How many amps should the fuse be rated for? (My lights are in storage too, so I can’t just look at the label.) ISTR that the lights would draw 8.5 amps per 1,000 watts, so the hypothetical three lights would draw 25.5 amps and would require a 30 amp fuse. I also seem to recall that allowing 10 amps per 1,000 watts makes the figuring easier and allows a wider margin of error.

Do I have this right?

For resistive loads like lightbulbs, W = V * A, so you have Amps = Watts / Volts.
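
A minimal sketch in Python to check the arithmetic from the question (the 120 V figure is an assumption for nominal US line voltage; substitute your own):

```python
# Current drawn by a resistive load (e.g. an incandescent lamp): I = P / V.
SUPPLY_VOLTS = 120  # assumed nominal US line voltage; adjust for your service

def amps_for_watts(watts, volts=SUPPLY_VOLTS):
    """Current in amperes for a purely resistive load of the given wattage."""
    return watts / volts

total_watts = 3 * 1000  # three 1,000 W tungsten lights on one circuit
total_amps = amps_for_watts(total_watts)
print(f"{total_watts} W at {SUPPLY_VOLTS} V draws {total_amps:.1f} A")
# -> 3000 W at 120 V draws 25.0 A; a 30 A fuse leaves some headroom.
```

At 115 V the same three lights draw about 26.1 A, which is why the 10 amps per 1,000 watts rule of thumb is a comfortable over-estimate.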

No. Well, sort of.

The current drawn for a given wattage depends on the voltage supplied. (P=VI)

Therefore, to find the current drawn by a 1,000 watt light, divide the wattage by the voltage.

IIRC the voltage in the States is 115 V, so your light would indeed draw around 8.7 amperes. That depends, of course, on whether or not you actually live in the US.
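
For instance, a quick check at the two common nominal figures for US service (115 V and 120 V):

```python
# I = P / V for one 1,000 W lamp at two common nominal US voltages.
for volts in (115, 120):
    print(f"1000 W at {volts} V -> {1000 / volts:.2f} A")
# 1000 W at 115 V -> 8.70 A
# 1000 W at 120 V -> 8.33 A
```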

It’s easy as pie for direct current circuits such as batteries and the like. Current (amps) is just the power (watts) divided by the voltage (volts).

For AC circuits it is a little different. In an AC circuit the current and voltage can have different phases. That is, the sinusoid of current can cross zero going positive, for example, at a different time than the voltage does. This means that a direct computation of current requires that you know the phase angle between current and voltage for the particular device in question. Incandescent lamps are mostly a resistive load, which means that current and voltage are nearly in phase, so the computation is the same as for direct current, *i.e.* watts/volts.
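
As a sketch of the AC case (the 0.8 power factor below is just an illustrative value for a motor load, not a figure from any particular appliance):

```python
def amps_for_watts_ac(watts, volts=120.0, power_factor=1.0):
    """Current for an AC load: I = P / (V * cos(phi)).
    The power factor cos(phi) is ~1.0 for a resistive load like a lamp."""
    return watts / (volts * power_factor)

print(f"1000 W lamp  (pf 1.0): {amps_for_watts_ac(1000):.1f} A")                   # ~8.3 A
print(f"1000 W motor (pf 0.8): {amps_for_watts_ac(1000, power_factor=0.8):.1f} A") # ~10.4 A
```

The lower the power factor, the more current the device draws for the same real power, which is one reason the nameplate is the safer source for anything with a motor in it.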

For other things, like appliances using motors, the best way to be sure is to read the current off the nameplate on the appliance.

Thanks! :)