When Thomas Edison tested the first lightbulb, where did he plug it in?

>> DC transmission is being used for very high voltage transmission (1MV+) over long distances

This is correct. At such voltages and distances, AC losses become too great. I am not sure how they do the AC/DC and DC/AC conversion, but it must be pretty interesting.

Keith T. & Sailor are 100% correct on the AC/DC thing. DC is used here in California only for very high voltage, long-range transmission, specifically to connect two systems operating at different frequencies, usually with either long underground or underwater sections. The DC lines are over 500 miles long, and while at first they were 700,000 volts, now they are all 1,000,000 volts. Under these conditions the line losses are less than AC's, the towers are cheaper, and there are only two conductors, so the wire is cheaper. The drawbacks are that the converter stations (a rectifier for AC to DC, and an inverter for DC to AC) are really expensive, and it isn't cost-effective to have any radial feed circuits running from the main line.

I also want to be clear that I wasn't saying that it turned out Edison was right and Tesla was wrong, 'cause clearly that isn't the case. AC is definitely the way to go for all but very specialized long-range transmission.

During the AC versus DC thing Edison did horrible things in a futile attempt to discredit AC power. His behavioral changes following that lost crusade show how deeply it affected him. Most biographies of Edison gloss over this period and focus on his earlier successes. Edison said "genius is 1% inspiration and 99% perspiration." I always thought people that sweaty shouldn't work around electrical equipment.
Tesla was clearly a genius. His inventions speak for themselves. Later in life, though, after Westinghouse got really rich off of his work, he was, based on what I have read, nuts. Many such geniuses (constructive/inventive) were obsessive-compulsive, unlike the traditional artistic geniuses, who tend toward manic depression.
Edison, Tesla, and Westinghouse were all such major players in the development of electrical utilities that their personalities are key to understanding some of the decisions concerning our current electrical infrastructure.

I can't resist…

They made a movie about power distribution starring Raquel Welch.

One Million Volts D.C.

:slight_smile:

I wasn’t aware some high voltage lines were DC. Interesting.

Most power distribution lines, of course, are AC. As many have already pointed out, the primary reason is that transformers can be used (transformers can't be used with DC).

But the FUNDAMENTAL reason is that insulation is cheaper than copper. Let me explain with an overly simplistic example:

Let's say you want to transfer 100,000 watts from point A to point B. A nifty way to do this is to use electricity. But at what current and voltage?? (Don't worry about AC vs. DC for now.) Power = Voltage x Current. So for a given power we can use a high voltage and low current (100,000 V at 1 amp, for example), a low voltage at high current (1 volt at 100,000 amps, for example), or anything in between.

A high current at low voltage sucks because I^2R losses in the conductors become prohibitively large unless you use extremely large-gauge conductors. This would be very expensive, and the cables would need supports about every 10 feet.

A high voltage at low current allows the use of much smaller conductors. The only price you pay is that you must use special techniques for holding off the high voltage (special insulation, minimum distances, etc.). But this is much cheaper than using 0000 gauge wire.
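To put rough numbers on this, here is a minimal sketch; the 0.5-ohm line resistance is just an assumed figure for illustration, not any real line's value:

```python
# Same 100,000 W delivered, two choices of voltage/current,
# over a line with an assumed total resistance of 0.5 ohms.
LINE_RESISTANCE = 0.5  # ohms, purely illustrative

for volts, amps in [(100_000, 1), (1, 100_000)]:
    loss = amps ** 2 * LINE_RESISTANCE  # I^2 * R heating in the conductors
    print(f"{volts:>7,} V at {amps:>7,} A -> line loss = {loss:,.1f} W")

# 100,000 V at       1 A -> line loss = 0.5 W (negligible)
#       1 V at 100,000 A -> line loss = 5,000,000,000.0 W (hopeless)
```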

So one thing is settled: Transferring a high voltage at low current is much more efficient than transferring a high current at low voltage. But should it be AC or DC?

Let’s say a power station generates electricity and somehow boosts it up to 100,000 VDC (at 1 amp). Sure, the line losses are relatively low because of the low current, but how do you convert the 100,000 VDC to something you can use at your house or business, like 120 VDC (or 120 VAC, for that matter)? Answer: It ain’t easy. There is no simple, reliable, and efficient way to do it! Resistor dividers (and linear regulators) are horribly inefficient. And switched-mode converters are complex, unreliable, and expensive. (Note that I glossed over the part of how the power company boosted the voltage to 100,000 VDC from the generators, but the same problem would apply there also.)

But with AC it’s a different story. The power company can take whatever AC voltage & current is produced by their generators, easily & efficiently boost it to a higher voltage & lower current using a transformer, transmit the power over small conductors, and easily & efficiently reduce the voltage before it gets into your home with the use of another transformer.
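Here is a rough sketch of the ideal-transformer arithmetic behind that step-up/step-down trick; the voltages and turns ratios are made-up illustrative numbers, not any utility's actual figures:

```python
def transform(v_in, i_in, turns_ratio):
    """Ideal, lossless transformer: voltage scales by the turns ratio,
    and current scales the other way, so power in equals power out."""
    return v_in * turns_ratio, i_in / turns_ratio

v, i = 10_000, 10                # generator output (illustrative): 10 kV at 10 A = 100 kW
v, i = transform(v, i, 10)       # step up for the line: 100 kV at 1 A
print(v, i, v * i)               # -> 100000 1.0 100000.0 (power unchanged)
v, i = transform(v, i, 1 / 833)  # step back down near the customer
print(round(v), round(i))        # -> roughly 120 V at 833 A, still about 100 kW
```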

Conclusion: High voltage is more efficient because insulation is cheaper than copper. But this requires boosting the voltage at the power station, and lowering the voltage at businesses and homes. AC is used because transformers can be used to do this job - there is no simple, reliable, and efficient way to boost/lower DC voltages.

Good point, Crafter Man. That clearly shows why the majority, if not all, of any power distribution system would be AC. Since both ends of the DC transmission lines are AC, the transformers are located before the rectifier and after the inverter, where the power is AC. Roughly the same size transformers are needed for either AC or DC transmission lines, so that wasn't a big factor in determining cost-effectiveness. Technology related to power electronics, i.e., low-loss rectifiers and inverters, is what really made the DC lines possible. It wouldn't work at all on a totally DC system, though.

HELLLLLLLOOOOOOO…!!!

When you guys are through talking about Edison vs. Tesla, and electric chairs, and power lines, SOMEBODY really should go back to the OP and try to give J-man an answer to his question. The poor guy has sat quietly by as everybody hijacked his thread without really addressing his Q.

How about it?

Well, I got so wrapped up in rambling about Edison’s estate that I forgot what I intended to say.

The lightbulbs I mentioned that were supposedly made in the lab at Fort Myers did have a metal base and they did screw into a ceramic socket.

I believe that the original, first bulbs were simply pinched off glass tubes or bottles, with a filament inside–basically the same technique used to build early prototype radio tubes. The wires required to operate them simply exited the bottom of the device and were hard wired to the power source. Of course, a vacuum was drawn on the tube before the bottom was pinched off. I am almost positive that my father recited all this from a “tour leaders guide” of some sort. I wish I could be absolutely positive.

He is now ninety years old and hasn’t done the tour thing for a lot of years. His memory comes and goes but I will try to ask. Note the “try,” my memory comes and goes, too.

Thanks Engineer Don.

I’m still unclear why a very high voltage DC transmission line would be more efficient than a pure AC system. So I’m trying to figure it out…

Correct me if I’m wrong, but I don’t believe AC has any advantage over DC when it comes to I^2R losses (heat losses). 1 amp of DC current produces the same heat loss as 1 amp RMS of AC current in a given conductor.
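A quick numerical sanity check of that claim, as a sketch with an arbitrary 1-ohm resistor and a sine wave whose peak is chosen so its RMS value is exactly 1 amp:

```python
import math

R = 1.0                # ohms, arbitrary test resistance
I_DC = 1.0             # steady DC current, in amps
I_PEAK = math.sqrt(2)  # peak of a sine wave whose RMS value is 1 A

# Average heating of the sinusoid over one full cycle, computed numerically
N = 100_000
avg_ac_power = sum((I_PEAK * math.sin(2 * math.pi * k / N)) ** 2 * R for k in range(N)) / N

print(I_DC ** 2 * R)           # 1.0 W of heat from 1 A of DC
print(round(avg_ac_power, 6))  # 1.0 W average from 1 A RMS of AC -- the same heat
```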

Now an AC system will exhibit a reactance loss which is not present in a DC system, but this will only be significant on very long runs for a 60 Hz system. I still can't imagine there would be much benefit to going DC, even if the runs were long…

You also mention that it was used “to connect two systems operating at different frequencies.” Hmmm. Perhaps this is the primary reason? If power station “A” is sending electricity to power station “B”, but “B” wants it at a different frequency, then I suppose you could do one of two things:

  1. Power station “A” could send AC to “B”, but “B” would have to rectify it to DC, then reconvert it to AC at whatever frequency it wants.
  2. Power station “A” could rectify the AC to DC, send the electricity to “B” in DC, then “B” could reconvert it back to AC at whatever frequency it wants.

Two conversions take place in each of the above cases, so it’s a wash. BUT, #2 would be a little more efficient because there are no reactance losses during the transmission. So is this why they do it?

Now let's assume station “A” is sending power to stations “B” and “C”, and both want it at a different frequency than what “A” is supplying. Two options:

  1. Power station “A” could send AC to “B” and “C”. “B” would have to rectify it to DC, then reconvert it to AC. “C” would also have to rectify it to DC, then reconvert it to AC.
  2. Power station “A” could rectify the AC to DC. “B” would have to convert it to AC. “C” would also have to convert it to AC.

So let’s tally up the score… There are 4 conversions in scenario #1, while only 3 conversions in scenario #2! Not only that, but #2 doesn’t have any reactance loss.

And the more power stations that “A” delivers to, the more efficient a DC system becomes. Assuming, of course, that each station wants it at a different frequency.
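A toy tally of the conversion counts for N receiving stations, under the same assumptions as above (each rectification or inversion counts as one conversion, and nothing else is considered):

```python
def conversions_with_ac_trunk(n_stations):
    # Each receiving station rectifies and then re-inverts: 2 conversions per station.
    return 2 * n_stations

def conversions_with_dc_trunk(n_stations):
    # One rectification at the sending end, plus one inversion per receiving station.
    return 1 + n_stations

for n in (1, 2, 5, 10):
    print(n, conversions_with_ac_trunk(n), conversions_with_dc_trunk(n))
# 1 station: 2 vs 2 (a wash); 2: 4 vs 3; 5: 10 vs 6; 10: 20 vs 11
```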

Is my reasoning correct? Or am I off-course?

If it’s correct, then I have another question: Why would another power station want it at a different frequency?? I thought all systems in the U.S. ran on 60 Hz. Or would the customer be someone other than the U.S.??

At the risk of sounding stupid, what the hell IS AC DC, in a simple, Electricity for Dummies explanation?

Also: Didn’t Tesla wipe out a whole forest in Siberia doing a test once?

My understanding is that an AC current can use transformers to step up its voltage while reducing its current, and that a DC current cannot. As long as v[sub]in[/sub]i[sub]in[/sub] = v[sub]out[/sub]i[sub]out[/sub], the total power remains the same, but the power loss, i[sup]2[/sup]r, falls quadratically as the current is reduced. Then, close to the user, the voltage can be stepped down, raising the current inversely as before.

Why the same thing doesn't work for DC, I don't know. I can't think of a reason myself, and I can't find one in my searches of easily accessible resources.

DC = Direct Current, a positive and a negative, like a battery.
AC = Alternating Current; it switches from positive to negative to positive 60 times per second in US households. Generators (or alternators) produce AC.

Originally posted by Guinastasia:
“At the risk of sounding stupid, what the hell IS AC DC, in a simple, Electricity for Dummies explanation?”
AC stands for “alternating current,” and is used to describe current that flows in both directions in a conductor (not at the same time, of course; the current alternates back-and-forth over time).

DC stands for “direct current,” and is used to describe current that flows in one direction in a conductor.

Personally, I think the term “AC” shouldn’t be used to describe what’s going on at your ordinary 115 VAC outlet. I think it should be “AV” (alternating voltage).

Why?

The voltage at your outlet is ALWAYS alternating, no matter what the load. But the current can be AC, DC, zero - pretty much anything - depending on the load. For example, half-wave rectification causes a pulsating DC current to flow in your power line. And what if there is no load at all? The voltage is still present (and is alternating), but the current is zero! (Yeah, I know, there's a tiny bit of current due to line capacitance. But I'm trying to keep this simple.)

Wanna hear something else? Contrary to what I and others have said, it IS possible to use a transformer with DC, IF the DC is a pulsating signal. That's how the ignition coil in an automobile works.

The confusion, of course, is that we envision a steady, non-changing signal when we think of “DC.” But a pulsating/changing signal can also be defined as “DC” as long as the current is always traveling in the same direction.
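For anyone who wants to see the difference, here is a small sketch comparing ordinary 60 Hz AC with the "pulsating DC" an ideal half-wave rectifier would produce; real diodes drop a bit of voltage, which is ignored here, and the 170 V peak is roughly what a 120 V RMS household waveform reaches:

```python
import math

F = 60        # Hz, US line frequency
V_PEAK = 170  # volts, roughly the peak of a 120 V RMS household waveform

def ac(t):
    """Ordinary sinusoidal AC: swings both positive and negative."""
    return V_PEAK * math.sin(2 * math.pi * F * t)

def half_wave_rectified(t):
    """Ideal half-wave rectifier output: never negative, but still pulsating."""
    return max(ac(t), 0.0)

for k in range(8):          # eight samples across one 60 Hz cycle
    t = k / (F * 8)
    print(f"t = {t * 1000:5.2f} ms   AC = {ac(t):7.1f} V   rectified = {half_wave_rectified(t):6.1f} V")
```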