Watts, volts and amps. I still don't get it

At Home Depot today in the Mid-Atlantic, I saw a battery-powered lawn edger/trimmer rated at 40 volts. Next aisle over, I saw an electric (corded) snowthrower that operates on 13.5 amps of AC current. Nearby were LED lightbulbs that consume 13 watts of electricity per hour.

All of these devices operate off electricity, yet each is advertised by a different unit of electrical consumption.

Momma always said I wasn’t the sharpest fork in the knife block. Please explain at the sub-idiot level.

Cars all run on gasoline* but different cars are advertised using different metrics – towing capacity for trucks, mpg for economy cars, horsepower for sports cars, as an example.

Likewise here, each of these numbers is advertising some relevant information specific to the type of product. For power tools, think of volts like horsepower – 40V is a big beefy trimmer that will be able to cut through thicker weeds than, say, an 18V trimmer. A 13.5 amp snowblower can operate on a normal 15 amp household circuit, so the buyer knows they don’t need a special circuit. And a 13 watt LED bulb gets good gas mileage compared to, say, a 60W incandescent.

*This analogy pre-dates hybrids and electrics, just go with it.

It helps if you think in terms of water.

You can think of current as current, i.e. how much water is flowing. You don’t need a firehose’s worth of flow to brush your teeth, so how much current you need can vary.

Voltage is like pressure. High voltage is like high water pressure. Low voltage is like low pressure. How much pressure/voltage you need will vary.

And Watts are sort of a combination of the two. High power (wattage) can mean a small current with lots of pressure or a large current but not a lot of pressure or lots of both.

What comes out of your wall will be ~120V and how much total power you can safely draw will be limited per circuit. So you can power a lot of low amp stuff or a few things at high amp. That’s why there tend to be several independent circuits in kitchens. You can’t run all that high amp/high power stuff (dishwashers, blenders, microwaves, etc) on the same circuit safely so they get split up across different circuits.
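The circuit-loading logic above can be sketched in a few lines. This is a minimal illustration, not an electrical-code calculation; the appliance names and amp draws are made-up assumptions.

```python
# Hypothetical sketch: can a set of appliances share one 15 A
# household circuit? Amp draws below are illustrative, not real ratings.

BREAKER_AMPS = 15

appliances = {
    "microwave": 12.5,  # amps
    "blender": 3.0,
    "toaster": 7.0,
}

def fits_on_circuit(loads, limit=BREAKER_AMPS):
    """Return True if the combined current draw stays within the breaker limit."""
    return sum(loads) <= limit

print(fits_on_circuit([appliances["microwave"], appliances["blender"]]))  # 15.5 A total: False
print(fits_on_circuit([appliances["blender"], appliances["toaster"]]))    # 10.0 A total: True
```

This is why the kitchen gets several circuits: the microwave plus almost anything else pushes past the breaker, so the big loads are split up.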

For things with independent power (motors, batteries, etc), how much current/voltage/wattage they can support will depend on the design.

The 40V trimmer is advertised as 40V because people associate high voltage with more power, although that isn’t always true.
The 13.5A snowblower is about the biggest load that you can run on a normal house outlet, without tripping the breaker. AC tools are almost always assumed to run on the same voltage (110-120V), so just listing the Amps is enough to know how powerful the device is.
The 13W (NOT 13 Watts/hour) LED is rated by how much power it consumes. Also a marketing thing, because LED lamps have a wide range of efficiencies, so a 13W lamp may not be 2x as bright as a 6.5W one, but it at least gives you an idea.

Note that Watts are a unit of power. When you run a device for a period of time, you consume energy, which is Watts * time. They are obviously related, but they are not the same thing. You can be billed both for the power you are consuming at any given time, and always for the amount of energy you use over a month.
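The power-versus-energy distinction above is just multiplication. A small sketch, with an assumed utility rate and assumed hours of use:

```python
# Power is a rate (watts); energy is power * time (watt-hours).
# The $0.15/kWh rate and the usage hours are assumptions for illustration.

power_watts = 13        # the LED bulb's power draw (a rate)
hours_on = 5 * 30       # 5 hours a night for a 30-day month

energy_wh = power_watts * hours_on   # watt-hours consumed
energy_kwh = energy_wh / 1000

price_per_kwh = 0.15                 # assumed utility rate
monthly_cost = energy_kwh * price_per_kwh

print(f"{energy_kwh} kWh, ${monthly_cost:.2f}/month")  # → 1.95 kWh, $0.29/month
```

Your meter bills you for the kilowatt-hours (energy), even though the bulb is labeled in watts (power).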

I think because if they called it a 40v string trimmer they’re hoping you like that big number enough that you don’t notice it’s being sold with a 2 Ah battery and is significantly less powerful than a two-stroke trimmer.

Why call it a 3/4hp snowblower when you can say it’s a 13.5A snowblower? Again, a big number to counter the gas-powered models’ bigger numbers, or at least that’s my theory.

Those things should really be rated in HP in my opinion, but I’m afraid that if they were they’d fall victim to the same bullshit math that air compressors did.

The LED is a little different in that wattage was until recently used to label most lamps. I really wish they’d start putting the lumens in the big print instead of wattage. Power consumption really isn’t a good way to determine how bright something is going to be.

Watts are a measure of power, which is a rate of doing work. All of these devices consume power.

In the case of the lawn edger/trimmer, the voltage rating tells you what type of battery pack you need to put in it. That’s important to know if it comes without a battery pack, or if you need to replace the pack. Also, generally speaking, the higher the voltage rating, the more powerful the edger will be. This isn’t a law of physics - it would be possible to make a weak edger that runs on high voltage - but no company would do this. A higher voltage requires a battery pack with more cells in it, which is more expensive, so why do this if you don’t get more power out of it? Also, the number of watts a battery-powered device can draw depends on the condition of the battery (a new, freshly charged battery can produce more power than an old, partially discharged battery).

In the case of the snowthrower, it’s important to know how much current it draws (i.e. amps) if you’re going to plug it into an electrical outlet. If the outlet is on a 15 amp circuit along with, say, a microwave oven, running both at the same time will blow the circuit breaker. Also, if you’re going to use an extension cord, it’s important for the cord to be rated to carry enough current for the snowthrower. If you use a 13.5 amp snowthrower with an extension cord that’s rated for 10 amps, you could overheat the cord.

Also, for devices that run on household current, the power is approximately the current rating times 120 (which is the voltage of the outlet). So the snowthrower in question will draw around 1,620 watts. That doesn’t mean it will produce that much power at the business end, because nothing is 100% efficient (some of the power will be wasted as heat). Also, there’s something called the “power factor” that I won’t go into here.
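The estimate described above is a one-line formula. A minimal sketch (it ignores power factor and efficiency, as the post notes):

```python
# For household devices: power (watts) ≈ current (amps) × line voltage.
# 120 V is the standard US outlet voltage assumed in the post above.

LINE_VOLTAGE = 120  # volts

def approx_watts(amps, volts=LINE_VOLTAGE):
    """Approximate power drawn from a standard outlet, ignoring power factor."""
    return amps * volts

print(approx_watts(13.5))  # the snowthrower: 1620.0 W
```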

Things you plug into household power outlets in the U.S. always run on 120 volts, which is the standard household voltage in this country. There’s no reason for a manufacturer to put this on the box.

The LED lightbulbs are rated in watts because that’s directly related to how much light they produce. It’s not really important to know how many amps it draws, because it’s so low that it should never trip a circuit breaker (for a 13 watt bulb it’s about 0.11 amps). Most LED lights are also rated in lumens (i.e. how much light they put out), and in the wattage an incandescent bulb would draw to produce the same output.
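The amp-draw arithmetic above is the same formula run backwards: current = power / voltage. A quick sketch:

```python
# Current drawn from a 120 V outlet, given the device's power rating.

def amps_drawn(watts, volts=120):
    """Current (amps) = power (watts) / voltage (volts)."""
    return watts / volts

print(round(amps_drawn(13), 2))    # 13 W LED bulb: ~0.11 A
print(round(amps_drawn(1620), 1))  # ~1620 W snowthrower: 13.5 A
```

The bulb draws about a hundredth of what the snowthrower does, which is why nobody prints the amps on a lightbulb box.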

BTW, LED bulbs are rated in watts, not watts per hour. A watt is a rate of energy usage or production (it’s defined as one joule per second, where a joule is a unit of energy). A watt per hour would be one joule per second per hour, which doesn’t make sense in most situations.

An important issue with the different devices is that information is providing some critical aspects about the device you need to know when deciding on a purchase.

40 volt device. Tells you it is only compatible with that brand’s 40v rechargeable batteries, and won’t work with the 18v or 12v batteries the brand probably also sells. You would hate to go home and realise it won’t work with your already purchased battery system.

13.5 amp device. Tells you it can be run from a standard outlet. You would not want to buy a higher current draw device only to find you need a new extension cable and still don’t have an outlet that can run it. You know what voltage it is designed to run on, as it is implicit that it uses mains power.

13 watt LED. Tells you how bright it will be. At least modern lamps usually also have a lumen rating, which is more useful. It’s the same way incandescent lamps were sold by watt rating as a surrogate for brightness.

I don’t have a feel for how bright a lumen is, but I do have a feel for how bright a traditional 40-watt or 60-watt or 100-watt bulb is. I suspect that many of us would say the same. LED light bulbs seem to be labeled according to both how many watts they actually use, and the incandescent-bulb wattage they’re equivalent to.

You would if they started labeling them like that. If they can label in both new watts and old watts, they can label in lumens and solve the broader problem: they’re conflating light output with power input, which I’m sure will continually need adjustment.

Things that plug into the wall are rated in amps/watts to let you know if it’s safe to plug them into your outlet.

Things with batteries are rated in volts, because generally speaking, larger batteries last longer and sometimes it can mean more power from the tool.

Voltage in a given country is always constant, so in the US you can safely assume 120 VAC unless you’re talking about things like ACs, electric stoves, etc.

There are a ton of excellent posts in this thread, but I feel the need to call out this one as extremely well written and informative. Very illuminating.


If it’s safe to plug into your wall outlet it will come with a cord suitable to plug into your wall outlet.

Just because it has a higher voltage doesn’t mean the battery is “bigger”. We’re talking about a 40V string trimmer with a 2 Ah capacity. That is not a big battery.

24, 120, 208, 240, 277, 480. Common AC operating voltages in the US alone.

——

The numbers they put on consumer items, in the US at least are likely designed to make consumers think the product is somehow superior, and will still work if you plug it into your 5-15 or 5-20 at home. Many times I have encountered things marketed at X number of amps or horsepower that were, plainly, lies.

I agree, but believe a clarification is in order since “power” can mean different things. On the electricity side, “power” is how fast you use energy. It does not mean how fast a device can perform work. When Watts are quoted on a device, it is only the rate at which it uses electrical energy. It is not related to the output of the device (this gets a little more complicated for things like audio amplifiers, but let’s keep talking about snow blowers).

[quote=“Francis_Vaughan, post:7, topic:954692”]
13 watt LED. Tells you how bright it will be. [/quote]
Yes but make sure you compare LEDs to LEDs, CFLs to CFLs, and incandescent to incandescent.

I think it’s obvious that by “bigger” I meant higher voltage.

Also, just because you can plug something into an outlet doesn’t mean that you should. If the 15a outlet circuit is already loaded to 12a, then it’s not a great idea to plug in something that will end up exceeding the circuit capacity, which is why things with cords have the ampacity on the label. Or do you have some other magical answer?

Amps are the amount of electricity used.
Volts are the energy level.
Watts are the amount of power consumed.
Amps times volts equals watts, or power.

The 40 volt rating is the rating of the battery. Normally the higher-voltage battery equipment is stronger.
The amp rating tells you what kind of outlet you can plug it into, and the type of electrical cord.
Watts will tell you how powerful the device is.

Nitpick on the above few posts.

  • Voltage (E) is the electrostatic potential difference between two nodes. It is not a measure of energy. E is not derived from the word energy; it is derived from the word electrostatic, or just electrical. Units are Volts.
  • Current (I) is the movement of charge per unit time. Coulombs per second or Amperes.
  • Charge is measured in Coulombs. One Coulomb = 6.25 \times 10^{18} electrons.

Power is energy per unit time. Units are Watts. Power = E \times I
Energy is measured in Joules. Energy = E\times I \times t

The energy capacity of a battery is often expressed in Watt hours, or implicitly as Ampere hours, where the voltage of the battery is known.
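The battery-capacity relationship above is worth a quick sketch, because it shows why a high-voltage pack isn’t necessarily a big one. The pack figures below are the 40 V / 2 Ah trimmer pack mentioned elsewhere in the thread, plus an assumed 18 V / 5 Ah pack for comparison:

```python
# Energy capacity (watt-hours) = voltage (V) × charge capacity (Ah).

def watt_hours(volts, amp_hours):
    """Battery energy capacity in watt-hours."""
    return volts * amp_hours

print(watt_hours(40, 2))  # the 40 V / 2 Ah trimmer pack: 80 Wh
print(watt_hours(18, 5))  # an assumed 18 V / 5 Ah pack: 90 Wh
```

The lower-voltage pack here actually stores more energy, which is the point made about the “voltage war” marketing.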

In practice, most electrical power supplies produce a constant voltage. For instance, in the US, a wall outlet is always either 120 or 108 volts (there are two slightly different systems used in different places, but they’re close enough for most purposes). A 9 volt battery will always produce 9 volts (at least, until it starts dying), and a 24 volt battery will always be 24 volts, and so on. One could in principle produce power supplies that worked other ways, like a constant current or a constant power, but there are practical reasons why we most often use constant voltage.

What you attach to that power supply will then determine what current that device will draw. In the simplest model, what you’re attaching is just a resistor, and the current will just be the voltage divided by the resistance. Most real devices are more complicated than resistors, but there will still be some amount of current drawn, determined by the voltage and the properties of the device. If you attach no device at all (or equivalently, a device with an infinite resistance), then you’ll draw zero current.
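The simplest resistor model above is just Ohm’s law. A minimal sketch, with made-up resistance values:

```python
# Ohm's law for the simple resistor model: current = voltage / resistance.
# An infinite resistance models "nothing plugged in".

def current_amps(volts, ohms):
    """Current drawn by a purely resistive load."""
    if ohms == float("inf"):
        return 0.0  # open circuit: no current flows
    return volts / ohms

print(current_amps(120, 10))            # a 10-ohm load on 120 V draws 12.0 A
print(current_amps(120, float("inf")))  # no device attached: 0.0 A
```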

The voltage times the current is the power, also called “wattage” because watts are a unit of power. That tells you how much energy the device is using per time. Ideally, all of the energy that’s being used is being converted into some useful form: Energy in turning a motor, or light, or whatever. For most electric motors, the efficiency is fairly high, and so that’s a good approximation, but for (for instance) an incandescent light bulb, the efficiency is low, and so you’re making use of much less energy than you’re consuming.

What do you mean by “stronger?” Voltage is not power.

Unfortunately there appears to be a “voltage war” for tools and yard implements, and it’s really stupid. There are pros and cons with having a motor that operates at a high voltage (all else being equal), and the cons often outweigh the pros.

First the pro: all else being equal, higher voltage → less current → smaller gauge wire (less weight) or less I²R losses for the same gauge wire. Which is nice, obviously.
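The I²R point can be made concrete with a rough sketch. The 400 W load and 0.05 Ω wire resistance are assumptions for illustration only:

```python
# For the same power delivered through the same wire, higher voltage
# means less current, and resistive loss falls with the square of current.

def i2r_loss(power_w, volts, wire_ohms):
    """Watts wasted as heat in the wiring for a given delivered power."""
    current = power_w / volts        # I = P / V
    return current**2 * wire_ohms    # loss = I² × R

print(round(i2r_loss(400, 18, 0.05), 1))  # assumed 18 V tool: ~24.7 W lost
print(round(i2r_loss(400, 40, 0.05), 1))  # assumed 40 V tool: ~5.0 W lost
```

Same job, same wire, but the 40 V version wastes roughly a fifth of the heat in its wiring.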

And now for the con: the battery pack needs to be charged. Higher-voltage packs are composed of many cells connected in series. Unless a fancy and expensive circuit is used, the cells will become unbalanced during charging; some will be overcharged and some will be undercharged. The pack won’t last long. When it comes to charging cells in series, the lower the number the better. I think Milwaukee made a smart choice to not play the voltage war game with their M18 series of battery packs. Anecdotal, but I use them in my chainsaws, and have never had a problem with them. OTOH, the 40 V battery pack for my EGO trimmer lasted one season. YMMV.

I think you are either misunderstanding or have a typo here. In North America, typical outlets are always 120V line to neutral. This voltage is derived from either a center-tapped transformer, which gives 240V from phase to phase, or 2 phases from a 3-phase system, which is 208V phase to phase. Higher-draw items (stove, water heater, dryer) are typically supplied with 240V or 208V.

240/120V systems are the standard for single family houses with 208/120V being common in high rises and some parts of older dense cities like New York.

I know of no place that has 108V as the standard.