Electric Wire and Speaker Wire!

Hello all.
So I was working on building a lamp, and I realized (when I looked at the spool) that I was trying to wire the lamp with speaker wire. I quickly stopped and had to come on and ask…
Can you use speaker wire for electrical wiring?

It looks exactly the same as electrical wire: same color, same texture, and the wire itself looks the same on the inside. But I wasn't sure if it would carry the same current, be safe to use, and do the same job as electrical wire.

Acrylic Vessel

“Yes, Frasier! I’m off to live in the Eco-pod.”

They look alike because most speaker wire is just repackaged lamp cord. A lamp cord would need to handle more voltage than a speaker wire, since the speaker is pretty low voltage, but 120 volts isn’t much anyway. However, they’re both in the same ballpark for how much current they need to carry. A 100 watt light bulb draws about one amp of current, which is low for a metal wire.
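For reference, here's the back-of-the-envelope arithmetic behind that "about one amp" figure (assuming a 120 V circuit and treating the bulb as a simple resistive load):

    # Rough check, assuming 120 V mains and a plain incandescent bulb
    P = 100.0   # watts, the bulb's rating
    V = 120.0   # volts rms, nominal mains voltage
    I = P / V   # current in amps
    print(f"{I:.2f} A")  # -> 0.83 A, i.e. "about one amp"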

I’m glad to hear you didn’t get suckered by the stereo store telling you that you need expensive speaker wire.

Get the right wire for the job. Does the label on your spool of speaker wire claim that it’s good for 120-250 volts? If not, the insulation may not stand up to long-term use at that voltage. I’ve used speaker wire to carry 120 volts as a temporary expedient, and it was fine, but if it’s not rated for the voltage, it’s likely not tested for the voltage.

Actually, most mid- to large-sized stereo speakers operate in the neighborhood of 90 volts or so, so it’s not that far off in that respect, either.

Lots of kinds of speaker wire are quite capable of withstanding mains voltages. Lots of other kinds aren’t.

A layman can’t look at two pieces of figure-eight wire and determine which is capable of withstanding mains voltage.

If it isn’t stamped to comply with the relevant standards, then your insurance company could deny a payout if the house burns down, regardless of whether the wire was the source of the fire or not.

The standard wire in the electrical industry is labeled: gas and oil resistant, THHN, 600 volts. THHN (or THWN, MTW, etc.) is industry-standard labeling for the durability and usage of the insulation. Armed with that info, I would not use speaker wire or light-duty lamp cord for electrical wiring purposes other than what they're listed for (lamps and speakers). However, I myself have used lamp cord in lieu of speaker wire because it is generally cheaper.

The 600 volt rating is needed to cover any 480 volt applications, and the insulation must be good enough to carry any ground faults back to the circuit-interrupting device (e.g., breakers or fuses) without breaking down. Lamp cord MAY be similar to speaker wire, but there could be a difference in the actual insulation around the wire depending on what brand you have and where you got it. I would keep the uses separate, except that using lamp cord for speaker wire is OK. Don't do the reverse.

I agree with everyone else who says that it would probably work fine. But.

A few feet of real, approved, tested, designed-for-the-purpose lamp cord from the hardware store might set you back all of US$0.75 or so.

I’d just use real lamp cord and put my mind at ease.

I agree with those suggesting getting rated lamp cord. In my time at Radio Shack (many years ago) I seem to recall that the spools of lamp cord had a slightly thicker insulation than the (very similar looking) spools of speaker wire. Don’t take any chances, get the right wire for the job.

Any cord used in a lamp should be UL-rated for that purpose. This means:

  1. Heat from the bulb is conducted into the copper conductors. The insulation must be able to withstand this heat. Speaker wire may or may not meet this requirement.

  2. A lot of people don’t know this, but the insulation used in UL-rated lamp cord does not sustain a flame. In other words, if you stick the insulation in a flame it will smoke & burn, but if you remove the flame the insulation will not continue to burn. Speaker wire may or may not meet this requirement.

Not to nitpick, but whenever you specify the breakdown voltage of a dielectric or insulator, you must consider the maximum instantaneous voltage. For a lamp cord on 120 V rms mains that would be around 170 V (which still isn’t much).
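If anyone wants to see where that 170 V comes from, it's just the peak of a 120 V rms sinewave:

    import math

    V_rms = 120.0                  # volts rms, nominal mains voltage
    V_peak = V_rms * math.sqrt(2)  # instantaneous peak of a sinewave
    print(f"{V_peak:.0f} V")       # -> 170 V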

Re: the discussion about 90 V vs. 120 V: it's irrelevant. It ain't the volts that kill you, it's the amps.

Secondly, the National Electrical Code, to my knowledge, doesn't address speaker wire. Electrical wire must be rated for its use. By using a non-rated cord for a lamp, you may be nullifying your fire insurance.

No, the voltage is relevant. The wire, not you, is the one carrying the current, so what does or doesn’t kill you doesn’t matter much here.

Actually, excess voltage could cause a failure in the insulation that sparks and starts a fire, which could kill you too, so maybe it does matter.

However, the other factors mentioned (esp. those by Crafter Man) are more important in this case than the maximum voltage rating, which is likely to be acceptable.

Just a nitpick, but I’m pretty sure that fire insurance covers stupidity.

I’m not suggesting that the OP is stupid. I personally would use the wire and not worry about it. That makes me the stupid one I guess!

You’re wrong.

It seems every month or so someone says, “It’s not the voltage that kills you, it’s the current.” And then we spend countless posts trying to educate otherwise.

For the record, voltage does matter when it comes to risk of electrocution. Using your logic, a car battery (which is capable of producing 500 amps) is much more lethal than your wall outlet (which is on a 15 or 20 amp breaker). But the opposite is true.

Sorry it took me so long to get back to everyone.

Thanx for all of the advice. I’m not going to use that speaker wire for anything but listening purposes. Thanx again…

You’re comparing AC and DC and might as well be comparing mountains and trees.

Crafter_Man undoubtedly chose a car battery as his example because it’s something that people are familiar with. Although DC and AC aren’t the same thing, it’s more like comparing red apples to green apples than mountains to trees.

He’s right to call bullshit on the classic “It’s not the voltage that kills you, it’s the current.” (INTVTKYITC) misunderstanding. Whereas it is true on a physiological level that tissue damage by electricity follows a current model rather than a voltage one, it takes a certain voltage to produce that current.

Since you didn’t like Crafter_Man’s battery example because it was DC, here’s an AC one. A 25 W true-rms audio power amplifier (very low power in this day and age, but it keeps the math simple) can feed a 14 V rms sinewave into an 8 Ohm loudspeaker (P = V²/R = 14 × 14 / 8 ≈ 25 W). Such an amplifier must be capable of providing at least 1.75 Amps rms (i.e. 14/8), which is about 2.5 Amps peak.
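Here are those numbers worked out, for anyone who wants to check the arithmetic (treating the speaker as an ideal 8 Ohm resistor, which real speakers aren't):

    import math

    P = 25.0                       # watts rms, amplifier rating
    R = 8.0                        # ohms, nominal speaker impedance
    V_rms = math.sqrt(P * R)       # ~14.1 V rms at the output
    I_rms = V_rms / R              # ~1.77 A rms into the load
    V_peak = V_rms * math.sqrt(2)  # ~20 V peak (40 V peak-to-peak)
    I_peak = I_rms * math.sqrt(2)  # ~2.5 A peak
    print(f"{V_rms:.1f} V rms, {I_rms:.2f} A rms, {V_peak:.0f} V peak, {I_peak:.1f} A peak")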

So, we unplug the speaker and connect a sinewave generator to the input. We can see a 14V rms (20V peak, 40Vpk-pk) sinewave at the output. Now, is it safe to touch the amplifier output with your two thumbs?

A true INTVTKYITC believer would say no, since this amplifier can pump 1.75 Amps rms through our poor bodies, instantly killing us.

However, as Crafter_Man and others are well aware, a 14V AC signal is perfectly safe to touch, unless you have unnaturally low skin resistance and an abnormal heart condition.

So, it may be the current that kills you, but it’s voltages that are dangerous. Without the necessary voltage available from the source, the required dangerous current just isn’t going to flow.

Ah, my point exactly. I can certainly agree that it’s the combination that is dangerous. A driving force of 5,000 volts behind 1 amp is certainly more likely to cause death or severe burn damage than 1 amp driven by 50 volts. However, voltage alone does not kill. I have worked live-line, bare-hand technique on primary lines. Were voltage the only factor, I would not be writing this. And yes, I know that current is flowing through the line while I’m working on it, but it isn’t flowing through me.

Hey, at least we’re not engaging in the “does electricity flow from positive to negative?” debate. :smiley:

I’m not sure I would agree with you there, at least in terms of speakers in regular home installations.

I assume that you mean 90 V rms (since you’re comparing it to a 120 V mains source, which is ~120 V rms). For a nominal 8 Ohm speaker (I’m aware that speakers aren’t perfect resistive loads, I’m just running some numbers here) this would give P = V²/R = 90 × 90 / 8 ≈ 1000 Watts.

If you’re used to dealing with home speakers that routinely handle 1000 Watts rms, then you and I move in very different circles.
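Running that number (again treating the speaker as an ideal 8 Ohm resistor):

    V_rms = 90.0      # volts rms, the claimed speaker voltage
    R = 8.0           # ohms, nominal speaker impedance
    P = V_rms**2 / R  # ~1012 W, i.e. roughly a kilowatt
    print(f"{P:.0f} W")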

Chefguy, there’s really no such thing as “5,000 volts behind 1 amp” and “1 amp driven by 50 volts”; these conditions only occur at one specific load impedance. In other words, you cannot connect an arbitrary load to a power supply and dial up “5000 volts at 1 amp” or “1 amp at 50 volts”.

This is the 16-millionth time I’ve posted this, but for the record: in order to get electrocuted, you need a certain amount of current to flow through you. There’s no specific threshold, but it’s generally agreed that anything over 10 mA is risky, and anything over 30 mA is getting dangerous.

So how do you get this level of current to flow through you?

Using Ohm’s law, I = V/R, where I is the current flowing through you. If we assume the power supply is modeled as a constant voltage source, then V would be the power supply’s voltage and R would be the path resistance through your body.

As you can plainly see, voltage is quite important. If we assume R is constant over voltage, then the current through your body is directly proportional to the power supply’s voltage.

R is equally important. It’s also very difficult to quantify. This is because the path resistance through your body depends on many factors. It can vary anywhere from tens of ohms to over 100,000 ohms. Because of this, it is very difficult to come up with a “safe working voltage.” Depending on the path resistance through your body, 6 V could kill you and 60 V could be relatively safe.
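To put some purely illustrative numbers on that (the body resistances below are assumptions for the sake of example, not measurements):

    # Illustrative only: real body resistance varies hugely with skin
    # condition, contact area, and current path.
    def body_current_mA(volts, body_ohms):
        """Ohm's law: current through the body, in milliamps."""
        return volts / body_ohms * 1000.0

    print(body_current_mA(60.0, 100_000.0))  # 60 V across dry skin (~100 kohm): 0.6 mA, barely felt
    print(body_current_mA(6.0, 100.0))       # 6 V across a very low-resistance path (~100 ohm): 60 mA, dangerous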

But it gets even more complicated than this. In my example above, I assumed the power supply was modeled as a constant voltage source with very low (or zero) source impedance. (In other words, it is capable of sourcing any amount of current.) But this is often not the case.

For example, I have a Spellman high voltage power supply at work. The specs read "15,000 V, 1 mA, programmable voltage." This means that I can program the voltage to any value between 0 V and 15,000 V, and that it is capable of producing up to 1 mA. (The actual current is determined by the load resistance and voltage from Ohm's Law.) Is this a dangerous power supply to work around? Theoretically, no. This is because it is only capable of sourcing up to 1 mA of current. If the load "demands" more current (based on Ohm's Law), the current will remain at 1 mA and the voltage will decrease. In effect it becomes a constant current source.

So – theoretically – I can program the power supply to 15,000 volts, touch common with one hand, touch the power supply's output with the other hand, and not be harmed. (But this assumes there's not much output capacitance. If there is a fair amount of output capacitance, the capacitance might be capable of sourcing a transient current > 1 mA.)
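A quick sketch of why a current-limited supply behaves that way (the 15 kV / 1 mA figures are from the post above; the load resistances are made up for illustration):

    def supply_output(V_set, I_limit, R_load):
        """Idealized current-limited supply: constant voltage until the
        load would draw more than I_limit, then constant current."""
        I = V_set / R_load
        if I <= I_limit:
            return V_set, I                  # constant-voltage region
        return I_limit * R_load, I_limit     # current-limited region

    # Supply set to 15 kV with a 1 mA limit
    print(supply_output(15_000.0, 0.001, 100_000_000.0))  # 100 Mohm load: 15000 V at 0.15 mA
    print(supply_output(15_000.0, 0.001, 10_000.0))       # 10 kohm load: collapses to 10 V at 1 mA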

And then there’s the issue of R being non-linear (i.e. not constant w/ current), but that really complicates things and I won’t get into it.