Explain electric car charging science to me

It may be the same charger, but don’t you think it senses the voltage? Without a schematic, how can we tell if the circuit is the same, end-to-end? And even if the circuit is the same, not all circuits are equally efficient at all parameters. Maybe this one was optimized for 240.

My Mitsubishi i-MiEV charges approximately three times faster on a 220 Level 2 charger than it does on a 110 Level 1 charger. From 1/8 full to 7/8 full takes roughly 14 hours at 110V but only 5 hours at 220V. I don't know what the explanation is, but I seriously doubt it could be blamed on the battery charge being less efficient. I know that lead-acid batteries tend to charge more efficiently at slower rates, and I'd be surprised if the reverse were true for lithium-ion batteries.
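Just to put rough numbers on that, here's a back-of-the-envelope sketch. The 16 kWh figure is the commonly quoted nominal pack size for the i-MiEV, and charger losses are ignored, so treat it as illustrative only:

```python
# Rough back-of-the-envelope using the charge times quoted above.
# ASSUMPTION: the i-MiEV's commonly quoted 16 kWh nominal pack, with
# "1/8 full to 7/8 full" taken as 3/4 of that. Losses are ignored.
pack_kwh = 16.0
energy_added_kwh = pack_kwh * (7 / 8 - 1 / 8)   # ~12 kWh into the battery

for label, hours in [("110 V Level 1", 14), ("220 V Level 2", 5)]:
    avg_kw = energy_added_kwh / hours
    print(f"{label}: ~{avg_kw:.1f} kW average into the pack over {hours} h")

# The time ratio is 14 / 5 = 2.8, so any efficiency difference between the
# two feeds is already baked into that comparison.
```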

There could be a few things going on. Machine Elf mentioned one: that the upconversion from 120 V to the Tesla's internal voltage is less efficient than conversion from 240 V, since it's a larger step.

Another is that any load that comes close to the maximum wire rating is going to be somewhat inefficient, and this becomes worse at low voltages. The reason is that if the wire is rated to dissipate X watts per unit length, then it will experience a fixed voltage drop at the maximum current regardless of the system voltage. Suppose this is 10 volts for a given setup. 120 V dropping to 110 V is twice the fractional loss of 240 V dropping to 230 V.

Also, the Tesla has onboard systems that need to be on when the car is charging. The longer these are on, the more power is wasted. Suppose these systems use 200 watts–that’s a much higher fraction of 1.4 kW than it is of 10 kW.
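A quick sketch of both effects, using the hypothetical numbers above (the 10-volt wiring drop and 200-watt overhead are the post's illustrative figures, not measured Tesla values):

```python
# Sketch of the two effects described above. The 10 V drop and 200 W of
# always-on overhead are hypothetical figures, not measured Tesla values.

drop_v = 10.0  # fixed resistive drop in the supply wiring at max current
for supply_v in (120, 240):
    print(f"{supply_v} V supply: ~{drop_v / supply_v:.1%} lost in the wiring")
# -> ~8.3% at 120 V vs ~4.2% at 240 V: same volts dropped, double the fraction

overhead_w = 200.0  # assumed always-on vehicle systems while charging
for charge_kw in (1.4, 10.0):
    frac = overhead_w / (charge_kw * 1000)
    print(f"{charge_kw} kW feed: overhead is ~{frac:.1%} of input power")
# -> ~14% of a 1.4 kW Level 1 feed vs ~2% of a 10 kW Level 2 feed
```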

Sorry I misunderstood your real point of confusion. I thought you didn’t get ordinary split-phase domestic power distribution as used in standard homes.

IMO, as folks have said, the bottom line is that the Tesla charging system is less efficient when fed 12 or 15A @ 120V than when fed 40A @ 240V. Simplistic notions of Ohm's Law or power factor or whatever are immaterial.

One plausible cause is that the native input to the charging innards is 240V, and when fed 120V an up-converter has to be switched in that burns some fraction of the total.

Another thought is that the car has to be at least partly awake while being charged. If a 120V charge takes an extra 3 hours, there's that much extra incremental consumption.

Another factor could be that the final electricity-to-chemistry conversion process isn't 100% efficient, and the total power output of the 120V feed is less than that of the 240V feed. So they may be unable to push power into the battery at the optimal rate to create the most efficient electrochemistry, and more total watt-hours of electricity have to go in to produce the same total watt-hours of chemical potential.

That would also explain why the charging was slower than the simple 120/240 = double or half ratio we’d all naively expect.
For sure the specifics are down inside Tesla’s detailed technology that nobody but one of their engineers or maybe repair techs will be able to answer in any detail.

Late edit: Replace
“That would also explain why the charging was slower than the simple 120/240 = double or half ratio we’d all naively expect.”

with

“That would also explain why the charging was slower than the simple 15*120 vs 40 * 240 = 1:5.33 ratio we’d all naively expect.”

Makes sense that it comes down to the Tesla charger being less efficient when fed 120V. I see this quite a bit with computer power supplies. They are usually noticeably less efficient on 120V compared to 240V.

The difference in charging time can be explained by the higher power available from a 240V/40A line vs a 120V/20A line.
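Putting numbers on that (ignoring losses and any continuous-duty derating):

$$P_{240} = 240\ \text{V} \times 40\ \text{A} = 9.6\ \text{kW}, \qquad P_{120} = 120\ \text{V} \times 20\ \text{A} = 2.4\ \text{kW},$$

so the 240V circuit can supply roughly four times the power before efficiency even enters the picture.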

The question of efficiency is a bit more complicated. Assuming a relatively conventional circuit topology, the steps involved in converting 120V/240V AC to a DC voltage sufficient to charge a high-voltage battery bank are pretty straightforward: first rectify the AC to DC (using either passive or active rectifiers), then use what is called a boost converter to increase the resulting DC voltage. These circuits generally consist of a number of high-power switching devices and one or more inductors.

There is always some power lost in such a conversion due to a number of causes:

  1. There is always some sort of fixed overhead circuitry which is active. This circuitry will monitor the output voltage and adjust the timing on the various switching devices in order to maintain the desired output voltage regardless of the input voltage or the output current draw (within limits). If the battery charges faster at a higher input voltage, then this control circuitry is not on as long.

  2. There are resistive losses in the switches and inductors (as well as eddy current losses in the inductors), as well as power dissipated in passive rectifiers (if they are used). These losses generally (depending on the specific circuit topology) are worse (as a percentage) for lower input voltage, since the resistive voltage drop is a larger fraction of the input voltage (for the same current).

  3. At a higher input voltage the circuit may be operating at a lower duty cycle (i.e., the power switching devices are spending less time in the "on" state, where they are dissipating power in their internal resistance, than in the "off" state). A rough numeric sketch of points 2 and 3 follows below.

So it is not surprising to me that the efficiency of the charging circuitry is better at a higher input voltage. I am somewhat surprised by the magnitude of the difference, but although I am an electrical engineer, power management is not my area of expertise.
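Here is a minimal numeric sketch of points 2 and 3, assuming a generic, idealized boost front end. The DC link voltage, series resistance, and output power are invented for illustration and are not Tesla's actual design values:

```python
import math

# Minimal sketch of points 2 and 3 above for a generic, idealized boost
# converter front end. All values are invented for illustration; a real
# EV charger (PFC stage, isolation stage, synchronous rectification, etc.)
# is far more complicated.

V_BUS = 400.0   # assumed DC link voltage the boost stage regulates to
R_PATH = 0.05   # assumed total series resistance of switch + inductor (ohms)
P_OUT = 1400.0  # compare both inputs at the same output power

for v_rms in (120.0, 240.0):
    v_in_dc = v_rms * math.sqrt(2) * 2 / math.pi  # average of the rectified sine
    duty = 1 - v_in_dc / V_BUS                    # ideal boost: Vout = Vin / (1 - D)
    i_in = P_OUT / v_in_dc                        # input current needed for P_OUT
    p_cond = i_in ** 2 * R_PATH                   # conduction (I^2 * R) loss
    print(f"{v_rms:.0f} V input: duty ~{duty:.2f}, current ~{i_in:.1f} A, "
          f"conduction loss ~{p_cond:.1f} W ({p_cond / P_OUT:.1%} of output)")
```

Because the loss term goes as current squared, halving the input voltage roughly quadruples the conduction loss as a percentage of the same output power.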

Why would they convert to DC before raising the voltage? The easiest way to change AC voltage is through a transformer. Rectifying AC to DC at any voltage is a relative piece of cake; we do it at work for a MW induction furnace that runs in the 2300 volt range. Maybe it's because of weight, seeing as the charger is in the vehicle? I'd think the loss of efficiency is in changing the voltage to what the battery requires. Possibly more heat loss with a lower supply voltage due to the amperage increase through the power supply.

*Note to previous post: Some households in the US are powered from 208/120 volt systems. Rare, but done. It happens any time three wires go to three transformers, with multiple houses each being fed from a single line. Possibly larger apartment complexes also.

I am not an engineer but an industrial electrician. I make it work correctly after the engineers walk away.

Transformers to do that at 60 Hz are huge (and heavy). It’s more cost efficient to convert to DC, chop the output to multi-kHz, then run that through a much smaller transformer.

For industrial settings that run at extremely high voltages, the needed semiconductors start to get really expensive and old-school transformers may still be more cost efficient, but even this is starting to change (see high-voltage DC transmission lines).
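The usual way to see why higher frequency shrinks the transformer is the textbook transformer EMF relation (stated loosely here):

$$V_\text{rms} \approx 4.44 \, f \, N \, A_c \, B_\text{max}$$

For a given voltage rating and flux-density limit, the product of turns and core area scales as 1/f, so going from 60 Hz to tens or hundreds of kHz shrinks the magnetics (and the copper around them) enormously.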

Not 208v and 104v?

Nope. 208/120.

Instead of receiving two 120V AC sine waves 180 degrees out of phase from the power pole outside, the pole has three 120V AC sine waves 120 degrees out of phase with each other. Then any given house gets two of the three. So you get phases A & B, the next house gets B & C, and the next house gets C & A. Lather, rinse, repeat.

Which means you still have 120V between either of your hot phases and the neutral.

But the voltage between your two 120-degree offset phases is an RMS “average” of 208V. In effect your higher voltage AC is lopsided, but it still works fine.
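The 208 figure falls straight out of the phasor arithmetic: the line-to-line voltage is the difference of two 120 V phasors spaced 120 degrees apart,

$$V_{ab} = \left|120\angle 0^\circ - 120\angle(-120^\circ)\right| = 120\sqrt{3} \approx 208\ \text{V},$$

while each hot leg to neutral remains 120 V.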

This. See switched-mode power supplies for a detailed explanation of how they do their thing.

Most plug-in power supplies (e.g. for cell-phone chargers) these days are SMPSs. The hype about “energy vampires” isn’t as urgent these days because SMPSs consume very little stand-by power compared to a conventional transformer-based power supply.

The total energy, the kWh, put through will be the same; it's just the rate at which it can be delivered that varies.

But I guess the 110 volt system is just to plug in anywhere, and the 240 volt one is for wired-in use… on its own circuit and all. So it's not only the voltage that is different…

Well anyway, the differences occur because that's what they are providing/selling, not because 110 volts can't do better…

How big and heavy and expensive would the transformer be?

That’s one of the “big wins” for switching regulators - small inductors.

No, not lopsided at all. The sum (or difference) of two sine waves of the same frequency but different phases is still a sine wave of the same frequency (but with a different magnitude and phase).
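A quick numerical check of that, as a throwaway script:

```python
import numpy as np

# Difference of two 120 V (RMS) sinusoids 120 degrees apart: still a clean
# sinusoid at the same frequency, with RMS of 120 * sqrt(3) ~ 208 V.
t = np.linspace(0, 1 / 60, 10_000, endpoint=False)  # one 60 Hz cycle
va = 120 * np.sqrt(2) * np.sin(2 * np.pi * 60 * t)
vb = 120 * np.sqrt(2) * np.sin(2 * np.pi * 60 * t - 2 * np.pi / 3)

vab = va - vb                        # line-to-line voltage
rms = np.sqrt(np.mean(vab ** 2))
print(f"line-to-line RMS ~ {rms:.1f} V")   # ~207.8 V

# A pure sine has crest factor sqrt(2); nothing "lopsided" here:
print(f"crest factor ~ {vab.max() / rms:.4f} vs sqrt(2) = {np.sqrt(2):.4f}")
```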

Well, just grabbing the first data I could find: a Square-D 10kVA transformer (440v to 220v step-down, but the step-up can be achieved by reversing) weighs in at 165 lbs…

In general, ANY "flow" system - electricity, hydraulic, or pneumatic - even mechanical torque vs. RPM - has more efficient power transfer at higher "pressure" (voltage). High "flow" (current) incurs more losses from "friction", for lack of a better term.
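In electrical terms that "friction" is the I²R conduction loss: for a fixed power P pushed through a fixed series resistance R,

$$P_\text{loss} = I^2 R = \left(\frac{P}{V}\right)^2 R,$$

so doubling the voltage at the same delivered power cuts the conduction loss to a quarter.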

Upon learning that electric car chargers are switching power supplies, I'm guessing that they are designed for best efficiency at the higher voltage. Ignoring efficiency, for a given battery resistance versus power transferred, 2X voltage will transfer equivalent charging power in 1/2 the time.

But it isn’t, at all. You always need to ask for two of the related variables, can’t assume any kind of constants.