AC/DC and transformers

I recently visited Croatia, including a visit to Nikola Tesla’s birthplace, which is now a museum.

As you probably know, Tesla and Edison fought the AC/DC wars, which Tesla won. One of the reasons that AC is superior for large-scale use is that it can be transmitted at higher voltage, therefore lower current, therefore less power loss along the way. The reason it can be transmitted at higher voltage is that it is easier to step the voltage up for transmission and back down for delivery.

But I can’t find any description that gets any more technical than this. I am wondering how AC voltage transformers work, and why this principle does not work for DC. When I search these kinds of terms I just get the general explanation I gave above.

IANAEE but I understand basic principles of electricity and I did take EE 101.

The important thing is that MOVING magnetic fields generate electricity. So hook a transformer up to DC and nothing happens once the magnetic field stabilizes.
The constant reversing of the magnetic field is how transformers work for AC. Step up the voltage to step down the current so transmission lines don’t need to be 3 feet in diameter.
There are various ways to do DC to DC conversion, but none were practical until fairly recently. So now high voltage DC power lines do exist.

Brian

Consider a different way of changing voltage. Let’s connect an electric motor to a generator. If you have a DC motor and a DC generator you can step up or down. But it isn’t cheap and involves rotating machines. A DC machine involves commutators and stuff that wears. So not great.

The need for the commutators is key. Inside the motor, magnetic fields force the armature around. But in order to do that, the direction of the field needs to change. A simple motor, like the common toy ones, has a pair of magnets around the armature. We run current through the coils in the armature, and things are designed so that the field forces rotation. But once the armature has rotated around, the field is no longer in the right direction. So we swap the polarity of the current, and the field continues to rotate the armature.

But we can make an AC motor that doesn’t need brushes and commutation. If it rotates as fast as the AC current changes polarity, we can put the magnets on the armature and fix the coils around it. The motor runs without pesky commutators.

Now do the same for the generator. A simple DC motor will work as a generator, but needs a commutator for the same reason. As we rotate the armature the fixed magnets induce a current into the coils - which is fed out as the desired electrical power. But once the armature has rotated part way around the field direction is wrong and the polarity thus wrong.

So we make an AC generator in the same way we made the motor. Coils on the outside, magnets rotating around. The generator rotates at the speed of the AC we want, the coils output an AC current, and we are happy.

So we have two sets of magnets - one immersed in a rotating magnetic field and being driven around, and the other being spun around, creating a rotating magnetic field and inducing current into a set of coils.

So let’s cut out the middleman and just place our two coils together. The rotating magnetic field of one creates an electric current in the other. Bingo - an AC transformer. Vastly smaller than the motor-generator, cheaper, and no moving parts.

A DC current generates a constant magnetic field with constant energy density. An electron coming into that field will be affected; an electron leaving that field will be affected. A secondary wire in that field will not be affected: the energy change at one end, where electrons leave, will exactly balance the energy change at the other end, where electrons enter.

A moving electron is associated with a magnetic field (because relativity). Changing the speed or direction of the electron changes the energy in the magnetic field. There is no easy way to tap this magnetic field, but you can easily tap off changes in the magnetic field.

Accelerating an electron pumps more energy into the field (because relativity); changing magnetic fields accelerate electrons (because relativity). A changing field will accelerate all the electrons in an adjacent wire: more energy in the electrons coming in, more energy in the electrons going out.

And because a changing magnetic field affects electrons, the system will naturally find an energy balance where less energy is required, by moving two electrons in an equal and opposite manner rather than pumping up the magnetic field (because thermodynamics: systems always find low-energy states).

You can pump energy into one side of a transformer, and energy will spill out the other side because that’s easier than energizing a magnetic field in the transformer core.

Real transformers do have to energize the transformer core at the best of times: we don’t get perfect lossless energy transfer. And if the current through the load would require more energy than pumping up the magnetic field, then the energy just goes into, and out of, the magnetic field, and the transformer is just an inductor with core and wire losses.

To be clearer about this, here are a couple of important phenomena that make AC transformers work:

  • Changing the strength of the magnetic field passing through the hole made by a loop of wire generates an electromotive force in that loop of wire. If you have an electrical device connected in series in that loop of wire, this means you can develop a voltage across that device, and you can deliver electrical power to it.

  • Pushing current through a loop of wire (by applying a voltage to it) generates a magnetic field through the hole made by that loop of wire. The magnetic field generated in this manner scales with the amount of electrical current going around the loop.

So now you lay two loops of wire right on top of each other. In the first (“primary”) loop, you apply a voltage, current flows, and it generates a magnetic field. The magnetic field generated by that first loop passes through the second (“secondary”) loop, and induces a voltage and/or current in that second loop - but only while that magnetic field is actually changing. If the magnetic field strength in that secondary loop is constant over time, then no EMF is produced in that loop. The time-varying nature of that magnetic field is critically important: it’s why conventional transformers only work when supplied with AC power. If you apply a constant (DC) voltage to the primary side of a transformer, you’ll get an initial spike of EMF in the secondary while the current in the primary rises to its impedance-limited max value; once that current in the primary levels off, the EMF in the secondary will fall to zero.
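To put rough numbers on that DC transient, here is a quick sketch (every component value below is made up purely for illustration): the primary current rises toward its resistance-limited maximum, and the secondary EMF spikes at switch-on, then decays to zero.

```python
import math

# Hypothetical values, chosen only for illustration.
V_DC = 12.0   # DC voltage applied to the primary (volts)
R = 4.0       # primary winding resistance (ohms)
L = 0.10      # primary inductance (henries)
M = 0.08      # mutual inductance between primary and secondary (henries)

tau = L / R   # time constant of the primary current rise

def primary_current(t):
    """Primary current after a DC step at t = 0: rises toward V/R."""
    return (V_DC / R) * (1.0 - math.exp(-t / tau))

def secondary_emf(t):
    """EMF induced in the secondary, proportional to dI/dt in the primary."""
    # dI/dt = (V_DC / L) * exp(-t / tau)
    return M * (V_DC / L) * math.exp(-t / tau)

for t in [0.0, tau, 5 * tau, 20 * tau]:
    print(f"t = {t:8.4f} s  I = {primary_current(t):5.3f} A  "
          f"secondary EMF = {secondary_emf(t):.6f} V")
```

The initial spike and the decay to essentially zero once the primary current levels off are exactly the behavior described above.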

As @N9IWP and @Francis_Vaughan point out, DC-to-DC conversion is possible by a variety of means. Switched mode power supplies utilize high speed switching to generate AC voltage from a DC source, then use clever arrangements of inductors and capacitors to manipulate this into higher or lower DC voltage at the output. Mechanical converters can take DC input to spin a motor, which turns an alternator to make AC power (which can then be put through a rectifier to make DC), or turn a proper commutated DC generator to directly provide DC power at a voltage other than what was provided at the input of the DC motor.

While AC has historically been more flexible for cross-country power distribution, high-voltage DC has other advantages, and technology for switching between the two is making it possible and worthwhile to implement HVDC systems.

And here I thought this was going to be a thread about a rock band vs. Cybertron toys that morphed into fighting machines. So disappointed.


When delivering power over a considerable distance, it is indeed true that you want to do it at a high voltage. Because the higher the voltage is, the lower the current needs to be. And the lower the current, the smaller the wire needs to be.

But this is true for AC and DC.
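A back-of-the-envelope calculation makes the point concrete (all numbers here are made up for illustration): delivering the same power at two different transmission voltages, the line loss follows I²R.

```python
# Rough, purely illustrative numbers: deliver 10 MW over a line with
# 5 ohms of total resistance, at two different transmission voltages.
P_load = 10e6   # power to deliver (watts)
R_line = 5.0    # total line resistance (ohms)

for V in (10e3, 500e3):          # 10 kV vs 500 kV transmission
    I = P_load / V               # current needed (P = V * I)
    P_loss = I**2 * R_line       # resistive loss in the line (I^2 * R)
    print(f"{V/1e3:6.0f} kV: I = {I:7.1f} A, "
          f"line loss = {P_loss/1e3:8.1f} kW "
          f"({100 * P_loss / P_load:.3f}% of delivered power)")
```

A 50x increase in voltage cuts the current 50x and the loss 2500x, and that holds whether the waveform is AC or DC.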

The power plant has a generator. Regardless of whether it’s AC or DC, we need to boost the voltage after the generator, transfer the electricity via wires for many miles, and then decrease the voltage just before it gets to the load.

So why have power lines traditionally been AC? Two reasons:

  1. It is a lot easier and more efficient to boost AC voltage and then decrease it vs. DC.

  2. A lot of motors, including all three phase motors, run on AC, not DC.

But here’s an interesting tidbit of info: if we look at just the many miles of wire where the voltage is very high, DC is more efficient than AC. If the wires are really, really long, it is actually more advantageous to use DC vs. AC. There are some disadvantages to doing this, not surprisingly. But advancing technology is fixing those problems.

This is the whole question I’m asking. This is what I keep reading when I google this topic but none of the sources explain why it’s easier. I haven’t read all the posts upthread yet but I’m hoping there’s an answer up there.

A very quick explanation: transformers are simple passive devices consisting of two coils of wire in proximity to each other. A varying electric current in one coil produces a varying electric current in the other coil. A differing number of turns of wire in the coils will increase or decrease the voltage in the second coil. That doesn’t work if the current is not varying. Alternating current is varying to start with. Increasing the voltage of a direct current is more difficult; it can’t be done practically with a passive device.

Others have answered it using physics, and they’re correct, of course. Here’s my take of more “layman’s” explanation:

The voltage coming from a generator is a sinewave; it is always changing over time, up and down, up and down, up and down, etc. Using nothing more than wires and iron, a transformer is able to magically change the amplitude of the sinewave. No electronics is needed, and there are no moving parts - just wire and cheap iron. It can make the voltage higher, lower, or keep it the same. How it does this requires some physics, which others have already gotten into.

A question might naturally arise from this: “How is it able to (passively) boost voltage? Isn’t that like creating ‘free’ energy?” No, we’re not creating free energy here. Yes, the output of the transformer has a higher voltage, but the available power at the output of the transformer will always be a little bit less vs. not using a transformer at all. In other words, there is a “price” to be paid: slightly less available power vs. a much higher voltage. But it’s worth it, since the power lines can now use wire that is much smaller.
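A quick sketch of that trade-off with illustrative numbers (the 98% efficiency is a made-up but plausible figure; real distribution transformers are typically in the mid-90s to high-90s percent range): stepping the voltage up 30x steps the current down by about 30x, and the output power is always a bit less than the input power.

```python
# Hypothetical step-up transformer: 240 V in, 30x turns ratio, 98% efficient.
V_in, I_in = 240.0, 300.0   # primary voltage and current
efficiency = 0.98           # fraction of power that survives the transformer
turns_ratio = 30            # secondary has 30x the primary turns

P_in = V_in * I_in          # power into the primary
V_out = V_in * turns_ratio  # voltage is stepped up 30x
P_out = P_in * efficiency   # a little power is lost in the core and windings
I_out = P_out / V_out       # so current is stepped down roughly 30x

print(f"in : {V_in:7.0f} V x {I_in:6.2f} A = {P_in/1e3:6.2f} kW")
print(f"out: {V_out:7.0f} V x {I_out:6.2f} A = {P_out/1e3:6.2f} kW")
```

No free energy: the higher voltage is paid for with lower current and a small efficiency loss.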

Steady-state DC does not work with transformers; only AC.

The gist of it is, if you have a current flowing in a wire, and you wrap that wire around a piece of iron, it creates a magnet. An AC current creates a magnet with a changing magnetic field. So, your input AC power can be turned into a changing magnetic field by wrapping it around some iron.

If you have a changing magnetic field, and wrap a different wire around it, it will create a current in the new wire. The trick is… the voltage in the new wire depends on how much of that wire is in the magnetic field, more wire (more turns around the iron) = higher voltage.

When you put these two together, you have a transformer: on one side, the input wire generates a magnetic field in the iron, which generates a voltage in the output wire, and the new voltage is proportional to the input voltage and the ratio of input and output turns. If the output has twice as many turns as the input, you double the voltage; if it has half as many turns, you halve the voltage.
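That turns-ratio rule for an ideal (lossless) transformer can be sketched in a few lines:

```python
def secondary_voltage(v_primary, n_primary, n_secondary):
    """Ideal-transformer voltage ratio: V_s / V_p = N_s / N_p."""
    return v_primary * (n_secondary / n_primary)

# Twice the turns on the output: voltage doubles.
print(secondary_voltage(120.0, 100, 200))   # -> 240.0
# Half the turns on the output: voltage halves.
print(secondary_voltage(120.0, 100, 50))    # -> 60.0
```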

Thanks–I understand a bit about how induced current works but I did not understand that this works for AC and not DC.

There just isn’t anything simpler or cheaper than a plain old iron core transformer. You can make one at home with a nail and some lamp cord. It’s literally a hunk of iron and some wire, nothing more.

DC won’t work directly with an iron core transformer, for reasons explained upthread. Since an iron core transformer is the easiest method to convert between high and low voltage, any other method will not be the easiest method, and converting between high/low-voltage DC will require one of those other not-the-easiest methods.

Hmm, let’s try it this way – if you have a magnet (the kind that sticks to a fridge) and a coil of wire, no electricity is generated if the magnet just sits there.
If you hook up DC to a different (call this the primary) coil, it acts just like that fridge magnet (ignoring the short time after initial hook-up).
AC transformers work because the magnetic field generated by the primary coil CHANGES, and that CHANGE is picked up by the secondary coil.
In math terms, the electricity generated is based on the derivative of the magnetic field. With DC, the derivative is 0, as the field is constant.
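That derivative relationship is easy to demonstrate numerically (units here are arbitrary; the point is only the contrast between a changing field and a constant one):

```python
import math

def induced_emf(field, t, dt=1e-6):
    """Induced EMF is proportional to dB/dt; estimate it numerically."""
    return (field(t + dt) - field(t - dt)) / (2 * dt)

ac_field = lambda t: math.sin(2 * math.pi * 60 * t)   # 60 Hz sinusoidal field
dc_field = lambda t: 1.0                              # constant (DC) field

print(f"AC: EMF at t=0 is about {induced_emf(ac_field, 0.0):.1f} (arbitrary units)")
print(f"DC: EMF at t=0 is exactly {induced_emf(dc_field, 0.0):.1f}")
```

The sinusoidal field induces a large EMF; the constant field induces exactly nothing.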

Brian

I’ll just complicate matters and point out that it is not only possible to build a DC transformer, but they have been built.

Superconducting Transformer for Superconducting Cable Testing up to 45 kA (pdf)

40 kA Superconducting DC Transformer for the FRESCA test station (pdf)

In my post above, I was careful to use the term “steady state DC” instead of just “DC.” The reason is that “DC” can have a couple different meanings.

One definition of “DC” is that the current is always in the same direction, but it is still allowed to change with time. Consider a sine wave that has a max amplitude of +100 amps and a min amplitude of 0 amps. Technically that could be considered “DC” since the current is always in the same direction. Could you use this with a regular transformer? Yes. But because there is such a large “DC component” to the sine wave (50 amps), the primary windings will get very hot due to I²R heating. But… what if the primary winding were a superconductor? Then it might work, and I think that’s sorta the idea behind those two articles.
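A quick numerical sketch of that decomposition, using the 0-to-100-amp sine from above: averaging over one cycle pulls out the 50 A DC component, and what remains is a 50 A peak AC swing that never reverses polarity.

```python
import math

# A "DC" waveform in the constant-polarity sense: a sine from 0 to 100 A.
def current(t, freq=60.0):
    return 50.0 + 50.0 * math.sin(2 * math.pi * freq * t)

# Sample one full 60 Hz cycle and split it into its DC and AC parts.
n = 1000
samples = [current(i / (n * 60.0)) for i in range(n)]
dc_component = sum(samples) / n                     # the average: ~50 A
ac_peak = max(s - dc_component for s in samples)    # the swing: ~50 A peak

print(f"DC component ~ {dc_component:.1f} A, AC peak ~ {ac_peak:.1f} A")
print(f"always same polarity: {min(samples) >= 0.0}")
```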

The other definition of “DC” is when the current is always in the same direction and the current does not change with time, a.k.a. “steady state DC.” In other words, just a flat, boring, current. I don’t think this current can be used with any transformer, superconducting or otherwise.

I would classify that as AC with a DC offset.
In all my posts above DC means constant voltage.

Brian

Until @Crafter_Man’s post, I would have said the same as you. Wikipedia does, however, acknowledge that there are varying definitions of DC, with the common element being that the current never goes in the opposite direction:

Although DC stands for “direct current”, DC often refers to “constant polarity”. Under this definition, DC voltages can vary in time, as seen in the raw output of a rectifier or the fluctuating voice signal on a telephone line.

Oh, and - almost everyone has some experience with a DC step-up transformer.
There’s one (or more) in every car - the ignition coil.
But, even though it works on DC, that current needs to be periodically interrupted to generate the very high voltage. That’s what the “points” do (in old cars) or electronic ignition controller (in new ones).
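A rough order-of-magnitude sketch of why interrupting the current works, using V = L·dI/dt (all coil values below are made-up but plausible ballpark figures):

```python
# Hypothetical ballpark numbers for a classic points-style ignition coil.
L_primary = 8e-3    # primary inductance (henries)
I_break = 4.0       # current flowing when the points open (amps)
dt = 100e-6         # how quickly the current collapses (seconds)
turns_ratio = 100   # secondary-to-primary turns (order of magnitude)

# Interrupting the current makes dI/dt huge, so V = L * dI/dt spikes:
v_primary_spike = L_primary * I_break / dt
v_secondary = v_primary_spike * turns_ratio

print(f"primary spike ~ {v_primary_spike:.0f} V, "
      f"secondary ~ {v_secondary/1e3:.0f} kV at the spark plug")
```

Steady DC through the coil does nothing; it’s the abrupt collapse of the current, and hence the field, that produces the spark.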

This vid nicely demonstrates how the coils can couple in a practical way. A few turns of stout cable step it down to a few volts.

How to Build a Spot Welder out of an Old Microwave