Wire gauge and amperage.

Is the needed wire gauge determined only by the current traveling through the wire? In other words, if a cable can carry 10 amps at 10 volts, does this mean it could also carry 10 amps at 100 volts, or that a 100 mA fuse will blow at 100 mA no matter what the voltage is?

I realise that the resistance of the cable will be a factor here as well, but I am just wondering how it would work in a perfect world where wires have no resistance.

Thanks in advance

The maximum rated current a wire can carry is determined by the maximum rated temperature of the conductor and insulation (primarily the latter) when the wire is operated at maximum ambient temperature.

To add to/illuminate this, the limiting factor is the amount of heat the wire needs to dissipate, where heat is power in watts. W=I[sup]2[/sup]R; I is current in amperes, and R is the resistance of the wire in Ohms.

Resistance of a wire is a function of the alloy used, the diameter, and the length. For a given off-the-shelf wire, the alloy and diameter are fixed, so the resistance scales directly with the length.
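To put numbers on that, here's a rough Python sketch (my own illustration, not anything out of a wiring code) combining the two relationships: R = rho x L / A for a round conductor, and the I[sup]2[/sup]R heating mentioned above. The copper resistivity figure is approximate.

[code]
import math

RHO_COPPER = 1.68e-8  # approximate resistivity of copper at room temperature, ohm-metres

def wire_resistance(length_m, diameter_mm, rho=RHO_COPPER):
    """Resistance in ohms of a solid round conductor: R = rho * L / A."""
    area_m2 = math.pi * (diameter_mm / 1000.0 / 2.0) ** 2
    return rho * length_m / area_m2

def heat_dissipated(current_a, resistance_ohm):
    """Power turned into heat in the wire, in watts: I^2 * R."""
    return current_a ** 2 * resistance_ohm

# Example: 10 m of 1 mm diameter copper carrying 10 A
r = wire_resistance(10, 1.0)
print(f"R = {r:.3f} ohm, heat = {heat_dissipated(10, r):.1f} W")
[/code]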

The type of insulation is a secondary factor. It does not add to the resistance, but it does affect how well the wire can shed heat.

Upshot is you can run a lot of current in a fine wire for a short distance. Happens in integrated circuits all the time. When you have to send the same current a distance long enough to have to think in terms of transmission lines, the wire diameter gets bigger very fast.

So if I think a bit further: if the wire were a superconductor it would have no resistance and would therefore not heat up, so a superconducting wire 1 mm in diameter should be able to carry an infinite amount of current, although I'm sure there is some limitation here as well.

In the real world, though, wire has resistance, and this resistance causes the wire to heat up, which can make the conductor and insulation melt or catch fire. Also, thicker wires have less resistance per given length and so can carry higher currents, but to push a specific current through a wire the voltage has to be at a certain minimum; otherwise the resistance of the wire will not allow that current to flow.

For example, assume a wire has a resistance of 0.1 ohms/metre; that means 100 m of wire will have a resistance of 10 ohms. If I put this 100 m of wire across a 10 V source, the maximum current that could flow would be only 1 A, or 10 W, no matter what I connected it to. If I used a 100 V source then I could carry a maximum of 10 A, and this wire would now transfer 1000 W and might melt. So this would indicate to me that voltage is a consideration.
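Here is my example as a small Python sketch. It treats the far end of the wire as a dead short, so the wire's own 10 ohms is the only thing limiting the current (the worst case):

[code]
# My 100 m example with the far end shorted: the wire's resistance sets the
# current (I = V / R) and all of the power goes into heating the wire.
resistance_per_metre = 0.1                 # ohms per metre
length_m = 100
r_wire = resistance_per_metre * length_m   # 10 ohms

for volts in (10, 100):
    current = volts / r_wire               # 1 A or 10 A
    power = current ** 2 * r_wire          # 10 W or 1000 W, all in the wire
    print(f"{volts} V source: {current:.0f} A, {power:.0f} W heating the wire")
[/code]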

All my calculations are using DC because I don't know how to do the AC ones.

Is my thinking correct?

Voltage is more or less irrelevant if you are fixing the amperage to a particular level. 10 amps is 10 amps regardless of whether it’s from a 12V battery or a 240V circuit.

The total heat generated by the wire is I x R[sup]2[/sup], and the wire’s resistance is fixed by its gauge and length. Voltage need not be considered at all, except that the voltage through wire & device has to be the right voltage to generate the fixed amperage you desire.

Residential power cables are rated only by amperage: 15 A, 20 A, 30 A, etc. These ratings are limited by a maximum voltage, usually 600 V, and a maximum temperature of 60 °C or 90 °C. Exceed those maximums and the insulation will fail, causing a short.

No, it’s I[sup]2[/sup]R.

Damned edit timeouts.

While I’m nitpicking, I screwed up when giving the power equation by using W(atts) instead of P(ower). The correct equation is P=I[sup]2[/sup] x R.

Not quite.

As others have stated, the maximum current through the wire is determined by how much you heat up the wire, which is related to the wire material and thickness as well as the insulation (mostly the wire though).

The voltage rating of the wire depends mostly on its insulation. Wire rated for 12 volts has much less insulation on it than wire rated for 120 volts.

Let’s get back to your 100 m wire. Let’s say this particular wire can carry a maximum of 1 amp before it overheats and melts. Your 10 ohms of resistance is kinda high for wire, as you’ll discover if you try to power something from a 10 volt source. Your power supply is 10 volts and the max current of your wire is 1 amp, so you should be able to handle a load as low as 10 ohms, since 10 ohms driven by 10 volts should work out to 1 amp (V=IR, or I=V/R).

But what happens if you actually hook this up? You’ve got your 10 ohm load in series with your 10 ohm wire. Now you’ve got a total resistance of 20 ohms, so at the load you only end up with 5 volts at half an amp. Your wire is sucking up as much power as your load is. You are delivering 2.5 watts to your load instead of 10 watts, and your wire is also sucking up 2.5 watts and turning it into waste heat.

Here’s another place where you are getting confused. What happens if we try to drive 100 volts to a 100 ohm load?

Now you’ve got 100 volts going to 110 ohms (100 ohms for the load plus 10 ohms for the wire), so you end up with about 0.9 amps of current (.9090 repeating) which is a lot closer to your design goal of 1 amp. You will have a voltage drop of 9 volts (V = IR, or .9 amps x 10 ohms) across the wire, so the load sees 91 volts at .9 amps. Your load now gets about 81 watts instead of 100, and your wire is wasting about 9 watts as heat.

To actually get 1 amp at 10 volts out of this wire, you’d have to give it pretty much a dead short. This makes it useless to transfer power at that low of a voltage. To get 1 amp at 100 volts, you’d have to give it a load of 90 ohms (90 ohms plus 10 ohms of wire is a total load of 100 ohms, which at 100 volts yields a current of 1 amp). However, even though you now have 100 watts of power instead of 10 watts of power, only 10 watts of this 100 are getting dissipated by the wire as heat. So in either case (10 volts going to a dead short or 100 volts going to a 90 ohm resistor) the wire is dissipating 10 watts of heat and the wire gets equally hot in both cases.
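Here's that arithmetic as a short Python sketch so you can plug in other combinations of source voltage and load (the exact figures come out slightly different from the rounded ones above):

[code]
# Source -> 10 ohm wire -> load, all in series. The wire and load form a
# voltage divider, and the same current flows through both.
def analyse(source_v, r_wire, r_load):
    current = source_v / (r_wire + r_load)
    p_load = current ** 2 * r_load     # power actually delivered to the load
    p_wire = current ** 2 * r_wire     # power wasted as heat in the wire
    return current, p_load, p_wire

for source_v, r_load in ((10, 10), (100, 100), (100, 90)):
    i, p_load, p_wire = analyse(source_v, 10, r_load)
    print(f"{source_v} V into a {r_load} ohm load: {i:.2f} A, "
          f"{p_load:.1f} W to the load, {p_wire:.1f} W lost in the wire")
[/code]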

What you’ve hit on here is the reason that power companies step up the voltage as high as is practical. The higher voltages not only reduce the losses, but they also allow more power to be transferred from one side of the wire to the other for the same size of wire. However, the higher voltages also require more insulation and a greater distance between wires (bigger power poles, which cost more money).
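A quick made-up comparison shows how strong the effect is: delivering the same power over the same wire at three different voltages (Python sketch, illustrative numbers only):

[code]
# Delivering the same power over the same wire at different voltages.
# Higher voltage -> lower current for the same power -> much lower I^2*R loss.
def line_loss(power_w, voltage_v, r_wire_ohm):
    current = power_w / voltage_v       # I = P / V (ignoring the drop in the wire itself)
    return current ** 2 * r_wire_ohm    # watts wasted as heat in the wire

power_w = 10_000    # 10 kW to deliver (made-up figure)
r_wire = 1.0        # 1 ohm of total wire resistance (made-up figure)
for volts in (240, 2_400, 24_000):
    print(f"{volts:>6} V: {line_loss(power_w, volts, r_wire):.2f} W lost in the wire")
[/code]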

Just to add about fuses: a fuse will always open at some time after reaching its fault current. A small overload will take longer to open the fuse than a large overload. Different fuses have different time-delay characteristics.
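For large overloads this is often approximated with a constant I[sup]2[/sup]t "melting energy"; here's a very rough Python sketch of the idea (the I[sup]2[/sup]t value is made up, and real fuses publish full time-current curves rather than a single number):

[code]
# Very rough inverse-time approximation using a constant I^2 * t melting energy.
# Only meaningful for large overloads; real fuses are specified by time-current curves.
def approx_opening_time(fault_current_a, i2t_rating):
    """Approximate time to open, in seconds, for a large overload."""
    return i2t_rating / fault_current_a ** 2

I2T = 0.25   # amp^2-seconds, a made-up value for illustration
for amps in (1, 2, 5, 10):
    print(f"{amps:>2} A fault: opens in roughly {approx_opening_time(amps, I2T):.3f} s")
[/code]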

The voltage rating on a fuse indicates the maximum voltage at which it can safely contain and quench the arc generated when it opens.

True, but it’s worth noting that (for the most part) the length doesn’t matter when it comes to a wire’s maximum current capability. The temperature rise of the conductor and insulation per unit length is a function of the RMS current and the resistance per unit length. The latter is a function of the cross-sectional area of the wire and the conductor material (usually copper). The temperature rise of the wire per unit length is also a function of:

  • type of insulation
  • insulation thickness
  • thermal conductivity of the insulation
  • ambient temperature and air speed
  • proximity to other heat sources (e.g. other current carrying wires in the same conduit or cable tray)
  • orientation of the wire relative to gravity (due to convection currents)

Note also that I’m only talking about ampacity from a safety aspect (i.e. temperature rise of the wire). As stated by ECG, even if the wire is operated below its maximum rated temperature, you may still have to use a larger gauge wire if the voltage drop of the wire(s) is unacceptably large.
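To make that last point concrete, here's a quick Python sketch of a voltage-drop check (the resistance-per-metre figure and the 3% limit are just example numbers, not a code requirement):

[code]
# Checking whether a run of wire has an acceptable voltage drop, which is a
# separate question from whether the wire stays within its temperature rating.
def voltage_drop_percent(current_a, ohms_per_metre, one_way_length_m, supply_v):
    r_round_trip = ohms_per_metre * one_way_length_m * 2   # out and back
    return 100.0 * current_a * r_round_trip / supply_v

# Example numbers only: 15 A over a 30 m one-way run at 120 V
drop = voltage_drop_percent(15, 0.008, 30, 120)
print(f"Voltage drop: {drop:.1f}%  ({'OK' if drop <= 3 else 'consider a larger gauge'})")
[/code]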