Electricity question: Why did countries choose the voltages they did? And where's DC?

The nice thing about standards is that there are so many to choose from! Why did countries choose the voltages that they did for their electrical supply grids? Why did the US choose 120, while most of Europe chose 220-230, Australia went with 240, and Japan decided on 100 (must make for some really hefty power cables)? And why didn’t they agree on what line frequency to use either? Why did some go with 50 Hz while others chose 60 Hz (and why did Japan choose to go with both, with 50 Hz in the east and 60 Hz in the west)?

And finally as a bonus round:

“Pocket Ref”, aka the little black book by Thomas J Glover, Third Ed, states in its section on electricity that:

“Of the 204 countries listed in the following tables, virtually all use AC power. DC power sources are rare and will not operate AC devices”.

Am I… to understand… that somewhere out there are municipalities that provide DC power to their consumers? If so, where? I’m not talking about battery-powered devices, but about supplying electricity to homes by power cables. What’s the straight dope?

Apparently, as of January 2005, Consolidated Edison still had a handful of customers on direct current service:

From http://www.5boroelectric.com/main/coned.htm

From Wiki:

I’m guessing that most buildings that got DC service to run their old elevators also got normal AC service for the rest of the building. Otherwise, they would have had to buy hefty power inverters to operate anything modern in the building.

Here’s a thread I started asking about the impracticality of DC power:

Lots of internet sites out there, here’s one…

http://www.eei.org/industry_issues/industry_overview_and_statistics/history/

When electrical distribution systems were first developed, it was found very quickly that the longer the cable, the greater the losses: you could churn out 600 V DC at one end, and if you needed a large current, you could lose well over 10% of that voltage by the far end.

This was for what would be called relatively short cable runs of a few hundred yards, and certainly not suitable for city wide systems.

The problem comes from the following:

V (the volt drop along the cable) = I (the current going down the cable) X R (the electrical resistance of the cable per yard multiplied by the number of yards of cable)

or

V = IR

and the power lost as heat in the cable is P = I[sup]2[/sup]R.

This loss of voltage along a cable meant that the cables were losing energy in the form of heat, so the cables got warm, and this in turn limited the amount of power you could supply to your users.

If you can increase your supply voltage, then you can deliver the same amount of power using a lower current, and the lower the current, the lower the losses in your cable. So if the early power companies could have sent, say, 6000 volts DC instead of 600 volts DC down the lines, they could have reduced the current needed by a factor of 10, and reduced the power lost in the cable by a factor of 100.
Unfortunately there was a problem with this: 6000-volt generation was not realistically possible in those early days, as the insulators required in a rotating generator were nothing like as good as they are today.
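
To put rough numbers on that factor-of-10 / factor-of-100 point, here is a small sketch in Python. The 600 V / 6000 V figures come from the example above, but the load and cable resistance are just values I made up for illustration:

[code]
# Rough illustration of why raising the supply voltage cuts cable losses.
# The load and cable resistance are made-up example values, not historical data.

P_LOAD = 60_000.0   # power the customers want, in watts (assumed)
R_CABLE = 1.0       # total resistance of the cable run, in ohms (assumed)

for supply_voltage in (600.0, 6000.0):
    current = P_LOAD / supply_voltage      # I = P / V
    volt_drop = current * R_CABLE          # V = I * R
    heat_loss = current ** 2 * R_CABLE     # P(loss) = I^2 * R
    print(f"{supply_voltage:6.0f} V: I = {current:5.1f} A, "
          f"drop = {volt_drop:5.1f} V, loss = {heat_loss:8.1f} W")

# 600 V:  I = 100 A, drop = 100 V, loss = 10000 W
# 6000 V: I =  10 A, drop =  10 V, loss =   100 W
# Ten times the voltage -> one tenth the current -> one hundredth the loss.
[/code]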

The other problem, of course, is that even if this were possible, 6000 volts is not a very safe way to supply customers directly, so you need to find a way to reduce this voltage to more manageable levels.

There is a way to do this: use the supply voltage to drive another generator producing a safer, lower voltage. But back then this would have been utterly impractical and uneconomic, and there was no other way to do it with DC except to use huge resistors to drop the voltage, which would have been extremely wasteful of power and uneconomic, as well as not being an intrinsically safe way to do it, plus a few other reasons I won’t go into in this post.

It so happens that the answer to the problem was to use AC, because with a continuously varying voltage you can use transformers.

A transformer is a device that transfers power through it; it can increase the voltage, which in turn reduces the current, so the power remains the same, since

P = V x I

or Power = Voltage X Current.

For power to remain constant, if the voltage rises then the current must fall.

This was an ideal solution at the time: you could generate power at, say, 600 volts AC, put it through a transformer to step it up to 6000 volts, and at the other end of the cable step it back down to 120 volts using another transformer.
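
As a sketch of how the numbers work out for that 600 V to 6000 V to 120 V chain, assuming ideal, lossless transformers (real ones lose a few percent) and a made-up 12 kW load:

[code]
# Ideal-transformer arithmetic for the 600 V -> 6000 V -> 120 V example above.
# Assumes lossless transformers, so P = V x I is the same on both sides.

def secondary_current(v_in, i_in, v_out):
    """Current on the secondary of an ideal transformer."""
    power = v_in * i_in      # P = V x I is conserved
    return power / v_out     # so I_out = P / V_out

P = 12_000.0                                       # example load: 12 kW (assumed)
i_gen = P / 600.0                                  # 20 A out of the generator at 600 V
i_line = secondary_current(600.0, i_gen, 6000.0)   # 2 A on the 6000 V transmission line
i_cust = secondary_current(6000.0, i_line, 120.0)  # 100 A at the customer's 120 V

print(i_gen, i_line, i_cust)   # 20.0 2.0 100.0
# The long cable only has to carry 2 A, so its I^2 R heating is tiny.
[/code]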

At the time, and today as well actually, it was possible to make cables that could handle far higher voltages than could be generated in a large, fast rotating machine like a generator.

As a result of the use of transformers, the current transmitted along long cables was reduced, and this reduced the losses in those cables, so they could be made far thinner and cheaper.

The entire electrical distribution system of virtually every nation used AC power because of this.

There are some disadvantages to AC over DC, and nowadays we have the high-power, high-current, and high-voltage electronics to distribute power using DC; however, it will not happen, as it would be immensely costly to change over.

AC power generation is also more easily carried out using alternators, which can churn out more power than can the DC generator - the dynamo.

I have greatly simplified this. I hope that other posters will not come in and make it too technical; I have observed that this happens often in electrical/electronics threads, and it does not take long for the OP to get overwhelmed by a huge amount of information.

Heh. You think that’s a mess, you should read the history of power just in the US.

It all started with George Westinghouse and Thomas Edison battling it out for who was going to provide electric light services. This was NOT the nice, clean businessman’s fight that you read about in history books. This was a low-down dirty fight that included folks going around and electrocuting animals to show how dangerous their opponent’s system was. Edison had some hand in the creation of the electric chair, under the stipulation that it had to use Westinghouse’s “dangerous” AC system instead of his “safe” DC system. Edison even went so far as to make sure that reporters said that victims were “Westinghoused” when they were fried in the electric chair.

Still, AC won over DC, due mostly to the fact that there is no cheap and easy equivalent of an AC transformer for DC.

But standards? Hah. Early US systems ran on a variety of voltages and frequencies. Most common was 50 Hz and 60 Hz, but there were things like 25 Hz too. Standardization on 120 V and 60 Hz didn’t come until much later.

There was an interesting DC system operated somewhere in Europe (I don’t recall where), many years ago. It consisted of generators and motors that were all wired IN SERIES. The motors at each location would then be used to drive another generator which would provide the final current, which I think was AC. The obvious benefit of this is that with everything wired in series, you only had 1 wire going around in a big loop. The disadvantage of course was that since everything was in series, if one thing broke, the whole system went down on its butt. I don’t know how long this system was used, but it’s been gone for a long time now.

I’ve been told (but never able to confirm) that the way we ended up with 60 Hz system was that there was a demonstration system built that ran on 50 Hz. Unfortunately, it couldn’t quite provide the power that they needed, so they cranked the generators up to 60 Hz. The customer bought it, and of course everything was built to match the test system, so it ran at 60 Hz. A lot of the people playing around with power systems at the time preferred 50 Hz since it was a more even number. 60 Hz systems just ended up being more popular in the US.

One note about frequency. Higher frequencies are better, since they allow you to make smaller transformers, but they also require generators that spin faster or have more poles in their windings. You also want a frequency that’s low enough that the hum from electrical equipment isn’t really freakin annoying, which means you’ll probably want to keep the system below 100 Hz. 50 or 60 Hz ends up being a pretty good frequency in this range, though coincidentally, both sit in the narrow band of frequencies the human heart is most sensitive to (in terms of a shock triggering fibrillation). From a safety standpoint, you almost couldn’t have picked a worse frequency.
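
For the "spin faster or have more poles" point, the standard relation for a synchronous generator is f = poles x RPM / 120. A quick sketch (the pole counts are picked just for illustration):

[code]
# Synchronous generator speed for a given frequency and pole count:
#   f = poles * rpm / 120   =>   rpm = 120 * f / poles

for freq in (50, 60, 400):
    for poles in (2, 4, 12):
        rpm = 120 * freq / poles
        print(f"{freq:3d} Hz with {poles:2d} poles -> {rpm:7.0f} RPM")

# 60 Hz with 2 poles needs 3600 RPM; 50 Hz with 2 poles needs 3000 RPM.
# 400 Hz with 2 poles would need 24,000 RPM, which is part of why 400 Hz
# stays in small machines (aircraft) rather than grid-sized generators.
[/code]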

I’m not aware of anyone that uses DC distribution systems these days. The phone company uses 48 volt DC systems, but they aren’t shipping appreciable amounts of current all over the place with it. DC voltage is often used in manufacturing plants; 24, 48, and 72 volt DC distribution is fairly common in manufacturing. There are a few very high voltage DC transmission lines in use throughout the world. These have advantages over high voltage AC lines, like the fact that you can send more power over the same sized wire, but have a disadvantage in that more complicated converter equipment is required at either end of the line to change between AC and DC.
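
On the "more power over the same sized wire" point, one simplified way to see it (ignoring skin effect and reactive/charging current, which also favour DC): a line insulated for a given peak voltage can sit at that full voltage on DC, but only at peak/sqrt(2) RMS on AC. A rough sketch with made-up line ratings:

[code]
import math

# A line whose insulation is rated for a 500 kV peak, carrying the same
# conductor current either way (example numbers, not any real line).
V_PEAK = 500e3     # volts, insulation limit (assumed)
I_LINE = 2000.0    # amps the conductor can carry (assumed)

p_dc = V_PEAK * I_LINE                    # DC can run at the full peak voltage
p_ac = (V_PEAK / math.sqrt(2)) * I_LINE   # AC RMS voltage is peak / sqrt(2)

print(f"DC: {p_dc / 1e6:.0f} MW   AC: {p_ac / 1e6:.0f} MW")
# DC: 1000 MW vs AC: ~707 MW on the same conductors, before even counting
# skin effect and charging current losses on the AC line.
[/code]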

Very informative, thank you!

I need to clarify my question though: The cite about there still being customers that receive DC straight from the power company for things like elevator gear is fascinating. From that guide book, I was wondering: since “virtually all” doesn’t mean “all 100%”, and since this book is a reference for travellers and consumers, are there places then where the electric outlets in the walls provide household DC power? Yes, that would be HIGHLY unusual, which is why I’m wondering whether it is true.

The other question was of course about the final voltage consumers tap into from their outlets and why there is such a huge gap in the differences. I know there are varied high-voltage distribution systems, but why couldn’t they all just get along and agree on the final household voltage? Especially a place like Australia: if it’s so closely tied to the Commonwealth and considers GB its home, why didn’t it adopt the motherland’s plug standards and voltage?

Finally, about higher frequencies being better, and generator design limitations for different frequencies: is that why passenger aircraft use 115 V AC at 400 Hz? I also understand that ground connections for aircraft use 400 Hz too. If so, how does an airport produce AC at that frequency? Motor-generators?

I used to work building and testing aircraft transformers. All other things being equal, the higher the frequency you run at, the more power a given transformer core can handle. In an aircraft, weight is everything; every little bit of mass you can shave off adds to the fuel economy of the aircraft. By running at 400 Hz, generators and transformers can be made smaller and lighter than would be needed for the same power-handling capability at 60 Hz.
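
A rough way to see the size savings is the standard transformer EMF equation, V(rms) = 4.44 x f x N x A x B(max): for a fixed voltage, number of turns, and flux density, the required core cross-section scales as 1/f. A sketch with made-up winding values (not from any real aircraft part):

[code]
# Transformer EMF equation: V_rms = 4.44 * f * N * A_core * B_max
# => for the same voltage, turns and flux density, core area ~ 1/f.
# The winding values below are illustrative assumptions.

V_RMS = 115.0    # volts on the winding
N_TURNS = 200    # number of turns (assumed)
B_MAX = 1.2      # tesla, peak flux density in the core (typical for steel)

for freq in (60.0, 400.0):
    area_m2 = V_RMS / (4.44 * freq * N_TURNS * B_MAX)
    print(f"{freq:5.0f} Hz -> core area about {area_m2 * 1e4:.1f} cm^2")

# 60 Hz needs roughly 18 cm^2; 400 Hz needs roughly 2.7 cm^2, about 1/6.7 of
# the cross-section, which is (part of) why aircraft transformers are lighter.
[/code]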

I don’t know specifically how airports’ ground facilities provide 400 Hz power, but for our testing purposes we used a piece of equipment called a frequency converter. It could take up to 400 W of 60 Hz power at ~115 V and output it anywhere between 20 Hz and 1 kHz, at any voltage level between 0 and 120 V. Airports may use something similar instead of an actual generator.

In addition, the Navy uses 1000 V at 400 Hz to distribute power for much of the radar equipment (at least they did in the late 80’s). The power would then be stepped back down to 115 V or 440 V, but 400 Hz provided less line noise, from what I remember. The high voltage reduced power loss in transmission from the generator rooms below the waterline to the bridge areas.

Jim (ex-EM3)

Errrr, they do have the same voltage. There are a couple of useful maps on this Wikipedia entry, which show how the voltage levels split along a geographic divide between the Americas and the rest of the world, with isolated exceptions. Plugs & sockets follow similar trends, but with less uniformity.

And there was never a good reason to enforce a worldwide standard, only local ones. Never was there a huge problem with selling goods in different places - just fit them with different transformers, equivalent to building right- and left-hand-drive cars. Nowadays, most portable equipment (laptop adaptors etc.) can cope with any mains voltage.

I meant, what was the logic behind choosing the voltage levels that they did? I can see the advantage of 220 or 240 over 120 (smaller cables needed), so then why did Japan choose 100 V, at both 50 and 60 Hz in different parts of the country?

The most public and dramatic of which were enshrined in the Electricity Building at the 1893 World’s Columbian Exposition in Chicago.

Corrected link: http://en.wikipedia.org/wiki/List_of_countries_with_mains_power_plugs%2C_voltages_and_frequencies

I’ve always assumed the situation with Japan was that two separate distribution systems developed independently of one another for geographic reasons (i.e. a lot of mountains in the way!). This may be way off the mark, however.

Often decisions about standardisation are affected by the situation that has already developed, preferring the dominant choice at that moment rather than the ideal solution. The British standardisation of railways on the 4’ 8.5" gauge, rather than Brunel’s seven-foot gauge which offered huge benefits, came about because far more companies and far more mileage were then using the narrower one. If the most powerful generating company was using 100 V, for whatever reason (probably dating back to its very first generator), then this would likely become the standard for the area.

Early on some of the government and commercial powers were acting against standardization.

It is much easier to tax a thing if it is not available in other countries. It is much easier to fragment markets if devices can be made to only operate in specific countries.

These were not huge forces, but enough so that the logic of a world standard never took hold, and now there is such a legacy that it will never change.

If you think this is a conspiracy theory, just consider how upset the drug companies are over Canadian-purchased pharmaceuticals. If they could come up with a way to make them unusable by Americans, it would happen like yesterday.

I don’t buy this as a deciding factor in the way standardisation actually happened. If it were, then we would not have the situation where most areas of standardisation do follow logical geographic lines. We would see far more fragmentation.

This isn’t to say that there wasn’t pressure to use electrical standards as a restrictive measure, only that it wasn’t a significant factor in the choice, in most cases across continents, of standard electrical supplies.

There’s plenty of precedent for this. There are many standards that were written to protect the European market from American and Japanese goods.

I thought it was because of the Babylonians!

For a recent example, look at the ‘region codes’ on DVDs. Those are designed only to prevent a world-wide market in DVDs, so the companies can charge higher prices or restrict releases in certain markets.

It should be noted that the DC used by phone companies is generated on site and not transmitted from the electric company. The phone company’s DC systems are fed by a commercial AC line.

And depending on the office you could also find 24 VDC and/or 140 VDC distribution systems (although those are somewhat rare these days).

Precedents? That means coming from at least the first couple of decades of the 1900s, if not before. Have you got examples?