I have read that charging an electric car with 120v is something like a third less efficient than charging it with 240v. So if some car takes, say, 3 hours to charge with 240v, it probably takes about 9 hours to charge with 120v.
What I don’t understand is why this is so. Can someone explain to a dummy why the electrons from a household plug are some 30% more shiftless than the electrons that come out of a dryer plug?
Think of it like filling the tank with fuel. The 240v hose is twice the diameter of the 120 so the tank fills faster. Actually it’s more like the same hose but more pressure, but the result is the same.
I do know that electricity doesn’t really work like liquid in a pipe, but it’s a good analogy for someone with as little understanding as the OP.
I appreciate it, but that doesn’t explain it. If the pipe were simply twice the size, I would expect that it would take twice as long to fill a battery with 120v as compared to 240v. But it seems to take three times longer.
Although note that the 240V figure for the Tesla isn’t from a standard 30 amp 240V dryer plug, but from a 50A one like you might find on an RV hookup (or that a lot of Tesla owners get specially wired into their garages). A normal dryer plug would charge at about 7.2 kW.
Thanks - that makes a lot of sense. But as always the case, your answer leads me to another question.
Looking at that web page, there’s a charging calculator at the bottom that lets you estimate charging with different variables. If you drive 40 miles a day, and use a 240v outlet, it takes you just under 90 minutes to top off your battery, and it says that 13.2 kWh are used. For a 110v outlet, it says 17.7 kWh are used.
I think this may be the root of my question: why 4.5 kWh more on a 110v outlet? Shouldn’t the total energy delivered be the same if you drive 40 miles per day, regardless of what type of plug you have in your garage?
This. For a given current flow, if you use 240 instead of 120, you’ve doubled the power. If you also double the current, then you’ve doubled the power again.
Note that the typical 240V outlet is wired to supply more current than the typical 120V outlet. So if you’re plugging into 240, you generally will be able to draw more current than if you’re plugging into 120.
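Putting rough numbers to that, since power is just volts times amps. A minimal sketch; the 12 A and 24 A figures are purely illustrative, while the 30 A and 50 A ratings are the plug types mentioned a few posts up:

```python
# Power drawn (in watts) is volts times amps.
def watts(volts, amps):
    return volts * amps

print(watts(120, 12))   # 1440 W: a modest draw from a 120 V outlet
print(watts(240, 12))   # 2880 W: double the voltage, double the power
print(watts(240, 24))   # 5760 W: double the current too, double it again
print(watts(240, 30))   # 7200 W: the 7.2 kW dryer-plug figure quoted above
print(watts(240, 50))   # 12000 W: a 50 A RV-style hookup at its nameplate rating
```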
AIUI, the “charger” on a Model S is built into the car: you plug it into 120/240, and the charger takes that and does whatever it has to do to make the battery happy. The Tesla’s battery is pretty high voltage; it may be that the process for converting 120V power into ~400V power is less efficient than the process for converting 240V power.
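For what it’s worth, the two calculator numbers quoted a few posts up already put a rough number on that. This is just arithmetic on the 13.2 and 17.7 kWh figures, under the assumption (not stated in the thread) that the battery ends up storing the same energy for the same 40 miles either way:

```python
kwh_240 = 13.2   # wall energy for a 40-mile top-off at 240 V (from the calculator)
kwh_120 = 17.7   # wall energy for the same top-off at 120 V

extra = kwh_120 - kwh_240
print(f"{extra:.1f} kWh extra on 120 V ({extra / kwh_240:.0%} more)")
# 4.5 kWh extra on 120 V (34% more) -- i.e. the 120 V path wastes roughly a
# third more energy, consistent with the "a third less efficient" figure in the OP.
```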
I hope an electrical engineer comes and explains it better, because my understanding is far from complete… but it has to do with how electricity is delivered. You have 2 hot wires, and a neutral. 120V is between 1 hot and the neutral, and 240V is hot-hot.
For some reason pulling power from 1 hot to neutral is less efficient than pulling equally from each hot wire. I’ve heard the term “balancing the load” when doing the layout in breaker boxes, meaning you try to distribute the current draw (on the 120V side) to each hot wire.
Would love to get a better understanding of why this is.
What you’re talking about has zero to do with the OP’s issues.
There is nothing “less efficient” about pulling between hot & neutral versus two hots. At least not directly.
Here’s what’s really going on: electricity is delivered to US homes on three wires, each at a different voltage. The “neutral” is in the middle between the other two, so going from either of the others to neutral gives you 120V, and between the two non-neutrals (the “hots”) there is 240V.

An analogy would be standing on a step in the middle of a staircase. From there you can put one foot up one step, put one foot down one step, or put one foot in each direction so that your feet end up two steps apart. In US home electricity, each step is 120V apart from the next.
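If it helps to see the staircase in actual numbers, here is a minimal numeric sketch. Nothing below comes from the thread except the 120/240 figures; the 180-degree phase relationship between the two hots is the standard US split-phase arrangement.

```python
import math

# Minimal numeric sketch of US split-phase service: hot A and hot B are
# 180 degrees out of phase, measured against the neutral (center tap).
PEAK = 120 * math.sqrt(2)                      # ~170 V peak for a 120 V RMS leg
ts = [i / 1000 / 60 for i in range(1000)]      # one 60 Hz cycle, 1000 samples

def rms(vals):
    return math.sqrt(sum(v * v for v in vals) / len(vals))

v_a  = [ PEAK * math.sin(2 * math.pi * 60 * t) for t in ts]   # hot A to neutral
v_b  = [-PEAK * math.sin(2 * math.pi * 60 * t) for t in ts]   # hot B to neutral
v_ab = [a - b for a, b in zip(v_a, v_b)]                      # hot A to hot B

print(round(rms(v_a)), round(rms(v_b)), round(rms(v_ab)))     # 120 120 240
```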
For supplying any electrical device there is a tradeoff: do we supply it with higher voltage and less amperage (= current), or lower voltage and more amperage? It turns out that motors run more efficiently, and wiring requires less material, if we go the more-voltage, less-current route. At least for reasonable values of more and less. Back in Edison’s day we settled on 120V as “less voltage” and 240V as “more voltage” for household supply; higher voltages are used in commercial and industrial settings.
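To put a rough number on the “less material” point: for the same delivered power, doubling the voltage halves the current, and the heat wasted in the wiring goes as the square of the current. A sketch with a made-up wire resistance (the 0.1 ohm figure is purely illustrative):

```python
# Same power delivered to the load, two different supply voltages.
R_WIRE = 0.1        # ohms of round-trip wire resistance (made-up value)
POWER = 2880.0      # watts delivered to the load

for volts in (120, 240):
    amps = POWER / volts
    loss = amps ** 2 * R_WIRE             # I^2 * R heating in the wires
    print(f"{volts} V: {amps:.0f} A, about {loss:.0f} W lost in the wiring")
# 120 V: 24 A, about 58 W lost
# 240 V: 12 A, about 14 W lost (a quarter of the loss, so thinner wire will do)
```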
So the big high-draw items in your house, such as heaters, air conditioners, electric clothes dryers, ovens, & big power tools will be configured to draw between the two hots & will see 240V between their input wires.
Meantime everything else in your house will run on 120V pulled from one or the other hot versus the neutral. Your computers, lamps, stereo, TV, toaster, fridge, microwave, phone charger, etc., etc.
“Balancing the load” is the idea of hanging about the same amount of 120V stuff in your house off one hot as the other. So maybe the lamps and outlets in the living room are wired into hot A, whereas the lamps and outlets in two bedrooms and bathrooms are wired into hot B. Typically kitchens have several circuit breakers and will also be split between A & B. So the fridge might be on A while the microwave ends up on B.
In a typical breaker box, each circuit breaker position alternates which hot it draws from: A, B, A, B, etc. This is true going down both sides of the typical modern two-column breaker box. By choosing which loads connect to which breakers the electricians achieve a rough balance between the A & B hots.
In the typical breaker box you’ll also find some double circuit breakers that take up two adjacent positions. Those are the 240V breakers which pass both A & B down separate wires to the load. The switch handles are connected so if either side overloads, both are switched off.
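A toy illustration of that balancing, with invented 120V circuits and draws (nothing here is from a real panel):

```python
# Breaker positions alternate legs going down the panel: A, B, A, B, ...
# Every circuit name and amp figure below is made up for the example.
circuits = [("living room outlets", 8), ("bedroom outlets", 6),
            ("kitchen counter 1", 12), ("kitchen counter 2", 10),
            ("bathroom", 9), ("garage", 7)]

legs = {"A": 0, "B": 0}
for position, (name, amps) in enumerate(circuits):
    leg = "A" if position % 2 == 0 else "B"
    legs[leg] += amps
    print(f"breaker {position + 1}: {name} on leg {leg} ({amps} A)")

print(legs)   # {'A': 29, 'B': 23} -- roughly balanced between the two hots
```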
If you pick 120V instead of 240V, the calculator shows that more kWh will be used (meaning your electric bill will be higher) for the same kWh going into the Tesla battery. So why is it more efficient to use 240V? I’m not talking speed here; it’s clear to me why 240V is faster.
Edit: I think I may be getting at power factor here. I know that pulling more on one phase of a 3-phase circuit is wasteful, but is the same true for a single-phase-with-neutral setup?
IANA electrical engineer, but it seems to me that you are ignoring the efficiency of the charging circuits, which are an important part of the equation. Without specs, you cannot assume that the charger circuitry for 240V is the same design as for 120V. For all we know, the 240V charger is more efficient than the 120V one simply because it was designed to be so. To wit:
What it is saying is that your battery will get more juice from the 240V in the same amount of charging time. So if you use 240V, you will charge fewer times in a month for the same amount of driving. So your electric bill will be the same.
No, it doesn’t. Scroll down to the section labeled “calculators”. Change “Which type of outlet will you use” from 240V to 120V and the kWh for a 40-mile charge goes from 13.2 kWh to 17.7 kWh, and the cost (assuming $0.12 per kWh) goes from $1.58 to $2.12.
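Just to confirm those costs are the same arithmetic at $0.12 per kWh:

```python
rate = 0.12   # dollars per kWh, as assumed by the calculator
for volts, kwh in ((240, 13.2), (120, 17.7)):
    print(f"{volts} V: {kwh} kWh -> ${kwh * rate:.2f}")
print(f"Difference per 40-mile charge: ${(17.7 - 13.2) * rate:.2f}")
# 240 V: 13.2 kWh -> $1.58
# 120 V: 17.7 kWh -> $2.12
# Difference per 40-mile charge: $0.54
```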