AC versus DC Power?

I heard somewhere that we use AC to transmit power because it is more efficient than DC. Why is this so?

AC transmission won out because it’s easy to change voltage with a transformer. Power can be sent at high voltage/low current, which is more efficient (less power loss due to resistance heating, IIRC), and then converted to lower voltages suitable for end users. There aren’t any cheap and simple devices for doing this with DC on a large scale.

There are now, but there certainly weren’t in the 19th century, when the decision was made.

The history of AC vs DC power is rather interesting, and basically came down to a war between Westinghouse’s AC system and Edison’s DC system. While the history books like to gloss over the details, the truth is it was an ugly battle fought not only along technical lines but also political ones. Just to give you an idea of how ugly the battle was, Edison hired groups of people to tour around the country electrocuting animals with AC, just to show how dangerous it was. For a while, some areas had AC systems and other areas had DC systems. Even the voltages and frequencies used by these systems were different, more so than even the mess we have now with parts of the world on 220 and parts on 110.

In the end, mostly because of the efficiency and simplicity of the AC-to-AC voltage transformer (which is just two coils of wire around an iron core), AC won out over DC. The transformer allows the voltage on the transmission lines to be greatly increased. Since the power lost in the wire depends on the current going through the wire and not the total power going through the wire (the power loss is the current squared multiplied by the resistance, if you happen to be curious), it’s much more efficient to increase the voltage. The laws of physics say that the power into your transformer has to equal the power coming out of your transformer (power is voltage times current), so if you double the voltage you cut the current in half. Make the voltage ten times greater, and you make the current one tenth.
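
To put rough numbers on that, here’s a toy calculation in Python (all the figures are invented for illustration; real grids differ):

[code]
# Toy illustration of why high-voltage transmission wastes less power.
# All numbers are made up for the example.

power = 1_000_000.0    # watts delivered by the plant (1 MW)
line_resistance = 5.0  # total line resistance in ohms (assumed)

for voltage in (10_000.0, 100_000.0):      # 10 kV vs 100 kV
    current = power / voltage              # P = V * I, so I = P / V
    loss = current ** 2 * line_resistance  # loss = I^2 * R
    print(f"{voltage / 1000:6.0f} kV: I = {current:6.1f} A, "
          f"loss = {loss / 1000:6.1f} kW ({100 * loss / power:.2f}% of P)")
[/code]

Ten times the voltage means one tenth the current and one hundredth the loss.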

The same thing works with DC. If you increase the voltage you can decrease the current and make the whole system more efficient. The problem with DC is that, as Desmostylus said, in the 19th century and well up into the 20th century, cheap and efficient DC to DC transformers didn’t exist. The old 2 coils of wire around an iron core trick doesn’t work for DC.

Another trick that only works for AC is 3 phase power. If you look at most power lines, you’ll see 3 big wires and one small one. These are 3 separate circuits, with the phase of each sine wave on the AC line 1/3 of a cycle apart. Why would anyone do such a silly thing? Well, you may remember from your high school math classes (or you may not) that if you add 3 sine waves together that are 1/3 of a cycle apart, you get zero. So, instead of having 6 wires for your 3 circuits (electricity flows in a circuit, or a circle, so you have to have a wire going out and a wire coming back) you can simply tie all of the return wires together, and all 3 sine waves add up to zero and you don’t have any current. If you don’t have any current, you don’t need any wire. In the real world, all of the loads aren’t going to be perfectly balanced, so you can’t get rid of the return wire completely, but you can make it a lot smaller. Instead of having 6 heavy wires, now you have 3 heavy wires and 1 light one, which is a lot cheaper.
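
If you don’t trust your high school math, here’s a quick numerical check (just a throwaway Python sketch):

[code]
# Three sine waves spaced 1/3 of a cycle apart sum to (essentially) zero.
import math

for i in range(8):                 # sample points across one full cycle
    t = i / 8
    total = sum(math.sin(2 * math.pi * (t + k / 3)) for k in range(3))
    print(f"t = {t:.3f} cycles: sum = {total:+.1e}")
[/code]

Every printed sum is zero to within floating-point rounding.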

DC power transmission is in fact used in parts of the world, especially on very high voltage long distance transmission lines. At very high voltages, DC is actually more efficient, and these efficiencies can make up for the more complicated transformers required at either end of the line. Systems that have DC transmission lines end up converting the power back to AC before it gets to the end user.

Excellent post, ecg. Just one nitpick:

The fourth wire isn’t a neutral return, it’s an “overhead earth wire” (OHEW). Its design function is to provide lightning protection: it’s the highest conductor, and so should in theory be hit by lightning rather than the other three.

The OHEW is usually steel, rather than the aluminium or steel-cored aluminium that’s used for the three phase conductors. The OHEW is also generally not continuous along the length of a transmission line: a section of it will start at a particular transmission tower, and be connected to ground there. The OHEW will run for a few miles, and then stop. A new section will commence from there. It’s done that way to prevent currents flowing in the OHEW, except in case of lightning strike.

Well, that’s technically true, but ultimately, it’s not why DC lost out to AC.

The fundamental problem with DC is that it cannot be transmitted very far without losses. To a DC generator, a transmission line looks like a resistor… a very, very long one that you can tap into the middle of. So the further down the line you go, the lower the voltage you see.
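
To illustrate that resistor picture with a quick sketch (the resistance and load numbers here are pure invention):

[code]
# A DC line as a plain resistive voltage divider: the farther the load,
# the lower the voltage it sees. All values are assumed for illustration.

source_voltage = 110.0  # volts at the generator
ohms_per_km = 0.5       # line resistance per km, out and back (assumed)
load_resistance = 10.0  # ohms at the far end (assumed)

for km in (1, 5, 10, 20):
    line_r = ohms_per_km * km
    load_voltage = source_voltage * load_resistance / (line_r + load_resistance)
    print(f"{km:3d} km: {load_voltage:5.1f} V at the load")
[/code]

At Edison-era voltages the delivered voltage collapses within a few miles of line, which is why DC plants had to sit so close to their customers.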

However, to an AC generator, the same transmission line looks like a combination of resistors, capacitors, and inductors. The latter two components are also called “reactive” components, meaning that they can store energy during part of the AC cycle, and release it at other parts. Now that you know this, you can tune the capacitances and the inductances of the transmission line so that the energy is stored and released during precisely the right times in the cycle, and no voltage loss appears at the other end of even a very very long transmission line.

See, it’s the same transmission line for both the AC and DC generators. But DC doesn’t even see the capacitance and inductance of the line, except at the moment you apply or remove power. Once the generator has turned on, and the voltage settles, the reactive components have no effect on the circuit.

Sure, you could switch the DC line off and on, repeatedly, over and over, fast enough for the reactive components of the transmission line to give you a benefit. That would work. Why? Because by switching off and on repeatedly, you’re power cycling. In other words, you’re making DC into AC.

This is the real, practical reason why Edison’s DC power distribution systems failed. You needed to build amplifier substations every few blocks in a major city. It was practically infeasible to light an entire city with DC, especially one the size of Manhattan or Chicago. AC is much, much better suited. You can build one massive coal-fired generator far out of town, and bring the power in on major wires.

Sure, AC is more dangerous than DC. But it didn’t have to be. This is the real mystery.

Someone picked 60 cycles per second as the standard AC power frequency. This also happens to be the precise frequency that is most lethal. So why isn’t some other, much safer frequency used?

In the earliest days of electrical innovation, the engineers liked to use 573 Hz, because it made the conversion between cycles and radians much easier. But when Westinghouse began installing AC power distribution systems, this frequency became 60 Hz. Now 60 Hz is also the frequency that is best transmitted by the nervous system, making the nerves the path of least resistance when some poor idiot becomes part of an AC power circuit. Thus, when you stick a finger in each hole of an AC outlet, the electricity prefers to pass thru your nerves, and right thru your heart, causing it to stop working. And this is usually fatal.

Now coincidentally, Nikola Tesla was famous for passing electricity thru his body. But he was also an expert with AC. During the Edison/Westinghouse feud, George Westinghouse lured Tesla away from Thomas Edison’s company… easily enough, once it became clear that Edison wouldn’t promote AC. As a college student, Tesla invented the AC induction motor when the general expert consensus was that it couldn’t be done. He invented the Tesla coil, which generated extremely high frequency, high voltage, low current electricity; this is what he liked to pass thru his body and shoot across stages to demonstrate its safety (OK, and also because it was impressive-looking). So why would he implement the most lethal frequency of AC power?

Now, Tesla was also arguably insane. At the very least he was bitter and cynical from having to constantly fight greed and ignorance, even among his inventor and engineer peers. But he was also brilliant when it came to AC, and clearly he knew how lethal 60 Hz power is. Tesla also had grand schemes for wireless transmission of AC power through the earth, using giant transformers to pump charge in and out of the ground.

But Edison’s people had the most experience with the lethality of AC power… they were going around killing people with it. Surely they learned quickly which frequency was the most lethal. And they had a huge vested interest in giving AC a terribly bad reputation. There was a lot of money to be made in building DC generators and repeaters every few blocks throughout every major city in the US.

Did Edison plant someone at Westinghouse to try and sabotage AC power distribution? Was Tesla this person? Or was Tesla sufficiently insane to design this flaw into what became the most ubiquitous public utility since running water? Did he build in a flaw that he could later “overcome” with his wireless power transmitter?

This is one of the most intriguing and entertaining mysteries in the history of technology, and so very few people know of it. It would make an awesome “steampunk” style movie… I’m shocked that no one has produced such a story.

This is not really true. The wires look like long resistors to AC also. In order to transmit lots of power you need to have high current or high voltage: P = IV. The power lost in transmission is I[sup]2[/sup]R for both AC and DC. If you can keep the current low you do not lose much power. AC was chosen because it was easier to have the long distance power lines at a high voltage.
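
A quick sanity check on that claim (a rough numerical sketch; the resistance and current values are arbitrary): for the same RMS current, AC and DC waste the same average power in the same resistance.

[code]
# Average I^2 * R loss is identical for DC and for AC at the same RMS current.
import math

R = 5.0       # line resistance in ohms (arbitrary)
i_dc = 10.0   # DC current in amps (arbitrary)

dc_loss = i_dc ** 2 * R

n = 100_000
peak = i_dc * math.sqrt(2)  # a sine wave with RMS value equal to i_dc
ac_loss = sum((peak * math.sin(2 * math.pi * k / n)) ** 2 * R
              for k in range(n)) / n

print(f"DC loss: {dc_loss:.1f} W, AC loss: {ac_loss:.1f} W")  # both ~500 W
[/code]

Both print about 500 W: averaged over a cycle, a sine with the same RMS value heats the wire exactly as much as the DC current does.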

This isn’t exactly true. Transmission lines are fairly complex little buggers. It’s true that the typical transmission line models we engineers use have reactive components, but the resistance is still in series in the circuit and you’re not going to be able to tune your way around it. Capacitor banks are used in modern power systems to tune the system, but this is due more to the fact that residential loads tend to be slightly inductive (due to vacuum cleaner motors, hair dryers, refrigerator motors, etc.), not some inductance in the lines that they are trying to compensate for.

Here’s an example of a DC transmission line system currently in use:
http://www.hydro.mb.ca/our_facilities/ts_nelson.shtml
They have two DC transmission lines, each approximately 900 km long (a fair distance if you ask me, which pretty much disproves any argument that you can’t transmit DC over long distances). If you poke around on their site you’ll see that they use DC to AC converter substations in their system. The entire system is not DC, only parts of it.

I have heard from a couple of different sources that 60 Hz comes from a demonstration system. It was supposed to run at 50 Hz but they couldn’t get enough power out of it, so they cranked it up to 60 Hz. I’ve never found a good cite for this though.

It wasn’t clearly known until much later that the heart’s rhythm is most easily disrupted at right around 60 Hz. Electrocutions do most of their deadly work by heat (literally cooking the body parts to death), which is not frequency dependent.

Tesla’s sanity may have been a bit questionable at times, but it’s fairly clear that he didn’t intentionally design lethality into his system.

It is absolutely true.

It’s called the distributed element transmission line model and it has been a standard part of electrical engineering education for, ohhh, about a century now.

Transmission lines are very accurately modelled as networks of R’s, C’s, and L’s. As your transmission frequency goes to DC, the C’s and L’s don’t matter, except as they affect step responses, which you hope to avoid anyway.
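
For anyone who wants to poke at it, here’s the per-unit-length version of that model in Python (the R, L, G, C values are invented for illustration):

[code]
# Distributed-element (telegrapher's equation) line parameters.
# Z0 = sqrt((R + jwL) / (G + jwC)), gamma = sqrt((R + jwL) * (G + jwC)).
# At DC (w = 0) the jw terms vanish and only R and G remain.
import cmath, math

R = 0.05  # series resistance, ohms per km (assumed)
L = 1e-3  # series inductance, henries per km (assumed)
G = 1e-8  # shunt conductance, siemens per km (assumed)
C = 1e-8  # shunt capacitance, farads per km (assumed)

for f in (0.0, 60.0):  # DC versus 60 Hz
    w = 2 * math.pi * f
    z = R + 1j * w * L          # series impedance per km
    y = G + 1j * w * C          # shunt admittance per km
    z0 = cmath.sqrt(z / y)      # characteristic impedance
    gamma = cmath.sqrt(z * y)   # propagation constant
    print(f"f = {f:4.0f} Hz: |Z0| = {abs(z0):7.1f} ohms, "
          f"|gamma| = {abs(gamma):.2e} per km")
[/code]

At f = 0 the shunt C and the series L drop out of both numbers, leaving only R and G, which is the point above.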

While your math is correct, your conclusion is not. Ohm’s law is correct for both AC and DC. However, the addition of the reactivity of the transmission line in the AC domain enables AC to be transmitted without voltage loss over very, very long distances. Power will still be dissipated, since some current will be diverted to the return via the capacitance of the line, but overall the power efficiency is better with AC, since power is also proportional to V[sup]2[/sup]/R.

This site has an interesting discussion of AC vs. DC:
http://www.cga.state.ct.us/2003/olrdata/et/rpt/2003-R-0530.htm

They have one particular application that they are evaluating, but they do give a general overview of the issues of AC vs. DC.

OK, my previous response was in reply to gazpacho.

In reply to eng_comp_geek: no, it’s not exactly true that there are absolutely no voltage losses. But in a properly tuned and matched transmission line, the capacitive reactive load almost completely makes up for the voltage drop across the series resistance of the line.

My original statement was a simplified explanation for the layman. But the distinction between voltage loss and power loss remains valid. And DC is still fundamentally unsuited for long distance power distribution as long as resistive losses exist in conductors.

The two brief pages on the Nelson facility don’t have enough information to enlighten us on how they do it or what problems they had to overcome. But obviously, if the technology didn’t exist until the sixties and such systems aren’t in common use, that tells you something about the practicality of DC power transmission.

Yes, it can be done. But it’s easier to use AC. That’s the whole point.

I read (“somewhere”, yeah great cite I know) that the 60 cycle decision was made by Edison, not Westinghouse/Tesla.

As you say, Edison was pushing DC while Westinghouse/Tesla were pushing AC. At some point, Tesla goes to Europe to look at their systems there, all of which ran at 50 Hz.

Edison finally comes to the realization that DC isn’t going to work and so converts all of his work to AC. Realizing that whoever gets there first makes the standards, he deliberately chose 60 Hz just so that his system would be incompatible with the work Westinghouse had been doing to develop their system, forcing them to lose time and money.

Again, I can’t provide a cite for this so take it for whatever you want.

Well, as far as 50 vs. 60, it’s fairly irrelevant when it comes to the difference of conduction thru nerve or thru something else. That range of frequencies is right at the passband for myelinated nerves.

The higher frequency, 573 Hz, was common in Edison’s lab where Tesla first worked, because the engineers there did all their computations by hand and with slide rules. They frequently used Fourier transforms, which use frequencies in units of radians/sec. But in practical systems, cycles/sec was much more useful, cause it’s a helluva lot easier to count cycles than radians.

Converting between the two involved the ratio of 180/pi, or about 57.3. But 57.3 Hz resulted in a lot of accidental deaths, and they found that 573 Hz was much safer, as it tended to conduct thru the skin rather than thru the heart.

The biographical accounts I read in college asserted that Tesla forced the Edison people to switch to 57.3 Hz, because he liked the sensation of current passing thru him at that frequency.

Well, you can probably make the case that it wasn’t medically known, but the earliest experiments in galvanic response by physicists involved electrocuting small animals, both dead and alive, so it’s a safe bet that the people who worked with electricity knew which frequencies were the most deadly.

And again, Tesla himself learned a great deal, firsthand, about the transmission of electricity thru flesh and bone… namely, his!

Not when you complete the circuit by contact using both hands. Then the path of conductance goes straight thru your heart, causing arrhythmia and arrest, which is usually more immediately fatal than burns.

But even with the heat damage, DC conducts mostly thru the skin, especially at high current, while 60 Hz AC goes thru the nerves, causing deep burns. Very high frequency AC reverts to skin-effect transmission. Deep burns, whether from electricity or radiation, are especially deadly, and when survived, healing is slow and problematic.

Another advantage of AC over DC is that AC current is easier to switch off, i.e. for a given switch/circuit breaker/relay the maximum AC current that can be reliably switched off (as opposed to a permanent arc forming, destroying the contacts and possibly melting the whole switch or causing a fire) is higher than the maximum switchable DC current. That’s because with AC the current crosses zero twice every cycle, enabling the electrical arc to cool off enough for the air between the contacts to become nonconducting.

I have read the above posts and am utterly confused.

I am not an electrical distribution engineer, but it has always been my understanding that:

  1. AC is more efficient than DC primarily because transformers can be used with AC, and high voltage/low current is much more efficient to transmit than high current/low voltage.

  2. For the transmission line itself, DC is inherently more efficient than AC. While both suffer from I[sup]2[/sup]R losses, an AC system additionally dissipates power via hysteresis and dielectric losses (the capacitance’s dissipation factor and the finite Q-value of the inductance).

I must admit, this is the first time I have heard that the distributed inductance and capacitance of a power distribution line actually helps the efficiency when AC is used. I guess if you model the system as a distributed parameter system (wherein the voltage is a function of distance), then there is a possibility this could happen.

Do you have a better link that supports this notion, bughunter? The link you supplied is nothing more than a basic primer on distributed parameter systems. I would like to see a link that says the distributed inductance and capacitance of a power distribution line actually helps the efficiency when AC is used. Until then I will remain skeptical but open-minded.

bughunter, are you seriously saying that AC does not suffer from resistive losses and DC does? That is what my response is about.

I also disagree with bughunter. I don’t believe that there is any difference in the inherent efficiency of DC vs AC, except for the small I[sup]2[/sup]R loss, which is basically power dissipated as heat along the line. I’m familiar with the lumped-element model of a transmission line, as well as the distributed element model.

Both AC and DC will suffer the same amount of energy lost to line resistance. And if energy is lost, the voltage is reduced at the load end.

To expand on tschild’s post:

If your load is even the least bit inductive (which is almost always the case), a spark will exist while the contacts are opening. In an AC system the spark will usually extinguish during the next zero crossing, and zero crossings occur every 8.3 ms for a 60 Hz system. After the zero crossing the spark will usually not return. A DC system does not have any zero crossings, so the spark lasts “as long as it wants.”
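
That 8.3 ms figure is just half a period, i.e. 1/(2f); in Python, for the curious:

[code]
# Time between zero crossings is half the period: 1 / (2 * f).
for f in (50, 60):
    print(f"{f} Hz: a zero crossing every {1000 / (2 * f):.1f} ms")
[/code]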

A couple more things to note:

  1. Sparks will also occur for a capacitive load when the switch closes.

  2. In addition to sparks being caused by a reactive component, there is another “component” to creating a spark: breakdown voltage of air. As the contacts get closer and closer together, a point will be reached where the voltage will exceed the dielectric breakdown voltage of air. This will occur even if the load is 100% resistive.

I think what he’s saying is that there’s a “voltage pump” situation going on; if the parameters are “just right”, and if the timing is “just right”, the voltage across the inductors will “add” to the voltage across the capacitors, thereby counteracting the I[sup]2[/sup]R losses.

As stated in my post, I’m not ready to say this is a bunch of BS, at least yet. Let’s wait and see if he can come up with a cite.

As stated in my first post, a DC power transmission line must be more efficient than an AC power transmission line. This is because there’s no such thing as an inductor or a capacitor with an infinite Q factor. A capacitor will dissipate energy due to loss tangent and ESR, and an inductor will dissipate energy due to magnetic hysteresis losses. It’s probably not very significant, but it’s real nonetheless.