The line voltage is the difference in electrical potential between the different conductors on the line. The limitations are the breakdown voltage of the air for an air-insulated line and the breakdown voltage of the insulators that support the line on the towers. In addition, there is the breakdown voltage of the transformers or other equipment that supplies power to the line or takes power from it.
So yes. There is a limit to the voltage that can be maintained on a transmission line. There are several DC power transmission lines that operate at 1.2 to 1.5 million volts. They have that voltage between the lines, and they are arranged with one line at +½ the line-to-line voltage and the other at -½. That way, the benefit of the high voltage in reducing line loss is gained without the need for such high-voltage insulation from the line to the tower and in the line driving and receiving equipment.
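To put some rough numbers on that, here’s a quick Python sketch (the power and resistance figures are made up for illustration, not taken from any real line) showing how the loss falls off as the voltage goes up, and how the ±½ arrangement halves the insulation needed to ground:

[code]
# Back-of-the-envelope: for a fixed power delivered, raising the line
# voltage cuts the current, so the I^2 * R loss falls with the square
# of the voltage. All numbers below are illustrative.

def line_loss_watts(power_w, line_voltage_v, line_resistance_ohm):
    """I^2 * R loss for a given delivered power and line voltage."""
    current_a = power_w / line_voltage_v
    return current_a ** 2 * line_resistance_ohm

POWER = 1e9            # 1 GW delivered (made up)
RESISTANCE = 10.0      # total loop resistance, ohms (made up)

for line_kv in (500, 1000, 1500):
    loss = line_loss_watts(POWER, line_kv * 1000.0, RESISTANCE)
    # A bipolar line puts +V/2 on one conductor and -V/2 on the other,
    # so each conductor only has to be insulated for half the
    # line-to-line voltage with respect to the grounded towers.
    print(f"{line_kv} kV line-to-line (+/-{line_kv // 2} kV to ground): "
          f"loss = {loss / 1e6:.1f} MW")
[/code]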
I know a lot of folks like the water analogy because it helps people understand electricity. I personally have always hated the water analogy because electricity isn’t water, and doesn’t behave much like water except in a few simple ways. This often leads to some confusion about electricity, like now.
That said, let me try and answer your questions.
Higher voltages will “arc” a farther distance. 120 volts isn’t going to jump very far, but a few thousand volts will easily jump the typical distance that wires are spaced apart on household circuits. The higher the voltage, the farther you have to keep the conductors apart so that they don’t arc together. You also have to use larger insulators so that the electricity doesn’t arc over to the towers that hold the wires up and get wasted into the ground. For something like household wire, the higher the voltage, the more insulation you need around it. If you ran a few thousand volts through your typical house wiring, it would punch through the insulation and arc from one wire to another all over the place.
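For a ballpark feel: dry air in a uniform field breaks down at roughly 3 kV per millimeter. Real gaps, humidity, and pointed conductors behave differently, so treat this Python sketch as order-of-magnitude only:

[code]
# Rule of thumb: dry air in a uniform field breaks down around 3 kV/mm
# (3 MV/m). Long gaps and sharp electrodes can arc at much lower
# average fields, so these are order-of-magnitude numbers only.

BREAKDOWN_KV_PER_MM = 3.0

def rough_gap_mm(volts):
    """Approximate uniform-field gap (mm) a given voltage can jump."""
    return (volts / 1000.0) / BREAKDOWN_KV_PER_MM

for v in (120, 2400, 240_000):
    print(f"{v:>7} V: roughly {rough_gap_mm(v):7.2f} mm")
[/code]

Real high-voltage insulators are much longer than these numbers suggest, mostly so they can survive voltage surges, rain, and dirt.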
The highest practical transmission lines are up in the several hundred thousand volt range. These are dedicated transmission lines used to get a big freakin’ bunch of electricity from one place to another. Keep in mind that lightning involves tens of millions of volts or more, and it will arc all the way from the clouds down to the ground. Running transmission lines in the million volt range can be difficult and a bit dangerous. Many people are killed just by climbing high voltage transmission line towers. They think they are safe as long as they don’t touch the wires. They are wrong. All they have to do is get close enough that the electricity can arc over from the wires to them, and they get zapped. Bug zappers work on exactly the same principle.
Even though higher voltages are generally better, typical power distribution systems only run at a few thousand volts. There will be a transmission line running at several tens of thousands of volts which carries the power a long distance, and will end in a big substation somewhere. Transformers in the substation will take the voltage down to a much lower level (a few thousand volts at most) which will carry the electricity out to all of the neighborhoods. The line running in front of your house may be at something like 2400 volts. Then there will be a transformer which serves your house and maybe a few of your neighbors, which drops the voltage down to 240. It’s not really practical for the power company to run their lines at much higher voltages than this.
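That last step-down is just the ideal-transformer ratio at work. A minimal sketch, assuming the 2400 V and 240 V figures above and ignoring losses (the load current is made up):

[code]
# Ideal transformer: V_secondary / V_primary = N_secondary / N_primary,
# and the current scales the other way, so power in ~= power out.
# Voltages are from the post above; the load current is made up.

V_PRIMARY = 2400.0     # distribution line in front of the house
V_SECONDARY = 240.0    # household service

turns_ratio = V_PRIMARY / V_SECONDARY          # 10:1
load_current = 100.0                           # amps into the house (made up)
primary_current = load_current / turns_ratio   # only 10 A on the 2400 V side

print(f"turns ratio {turns_ratio:.0f}:1 -- {load_current:.0f} A at "
      f"{V_SECONDARY:.0f} V is only {primary_current:.0f} A at "
      f"{V_PRIMARY:.0f} V")
[/code]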
The current through the wire also causes problems because of the heat it generates. The power converted into heat is the square of the current multiplied by the resistance of the wire (P = I[sup]2[/sup]R), so the more current that goes through the wire, the hotter it gets: double the current and you get four times the heat. Try and pump too much current through a wire and it will melt.
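Here’s a small Python sketch of that I[sup]2[/sup]R heating. The resistance per meter is a standard ballpark for 12 AWG copper; the length and currents are made up:

[code]
# I^2 * R heating in a run of household wire. ~5.2 milliohms per meter
# is a standard ballpark for 12 AWG copper; the rest is illustrative.

R_PER_M = 0.0052    # ohms per meter, roughly 12 AWG copper
LENGTH_M = 30.0     # length of the run (made up)

resistance = R_PER_M * LENGTH_M

for current in (5, 15, 30, 60):
    heat_w = current ** 2 * resistance
    print(f"{current:>3} A -> {heat_w:6.1f} W of heat in the wire")
[/code]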
I could be wrong, but I thought that what the OP was asking was, is there a limit to the voltage levels that any given wire can take, assuming sufficient insulation to prevent short-circuiting via arcs, and a low enough current flow that the wire doesn’t melt? Obviously too much current will destroy a wire (and how much is “too much” depends on the resistance of the wire) but is there any comparable limit on voltage?
That’s just the way I read it. Perhaps Enola Straight could specify whether he/she means it the way I read it, or was speaking of some situation in particular.
For a given wire with a given resistance, the current is proportional to the voltage; in fact, you can write the power dissipated in the wire as P = V[sup]2[/sup]/R, without any reference to the current at all (except indirectly, via Ohm’s Law.)
However, this is only the rate at which heat is being produced, so if we manage somehow to get rid of that heat at a sufficient rate, then we won’t have enough energy accumulate in the wire to melt it. How to do this, exactly, is left as an exercise for the reader.
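For the curious, here’s one crude way to attempt that exercise in Python: a lumped model where the wire settles at whatever temperature makes the heat shed to the surrounding air equal the V[sup]2[/sup]/R heat coming in. All the numbers are made up, and radiation and conduction are ignored:

[code]
# Crude steady-state heat balance (Newton's law of cooling):
#   T_wire = T_ambient + P / (h * A)
# where P = V^2 / R is the heat generated and A is the wire's surface
# area. Everything here is illustrative.

import math

RESISTANCE = 0.5     # ohms (made up)
DIAMETER = 0.002     # wire diameter, meters
LENGTH = 1.0         # meters
H = 25.0             # convective coefficient, W/(m^2 K) (made up)
T_AMBIENT = 20.0     # deg C

area = math.pi * DIAMETER * LENGTH   # surface area shedding the heat

for voltage in (3.0, 12.0):
    power = voltage ** 2 / RESISTANCE
    t_wire = T_AMBIENT + power / (H * area)
    verdict = "melts (copper goes at ~1085 C)" if t_wire > 1085 else "survives"
    print(f"{voltage:4.0f} V -> {power:5.0f} W -> about {t_wire:5.0f} C, {verdict}")
[/code]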
Let’s be clear about this. There is the voltage between the two conductors of a transmission line. There is also a different voltage, developed along the length of each conductor, which arises from the conductor’s resistance when it is carrying current.
The voltage referred to in this post is the second of those two voltages. As MikeS said, there is also a limit to this voltage for the reason he gave.
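To put made-up numbers on that distinction:

[code]
# The voltage *between* the conductors is set by the system; the voltage
# *along* each conductor is just I * R of that conductor. Made-up numbers:

V_LINE_TO_LINE = 500_000.0   # volts between the two conductors
CURRENT = 1000.0             # amps through the line
R_CONDUCTOR = 5.0            # ohms, one conductor end to end

drop_along = CURRENT * R_CONDUCTOR   # 5,000 V in this example

print(f"{V_LINE_TO_LINE:,.0f} V between the conductors, but only "
      f"{drop_along:,.0f} V developed along each one")
[/code]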