Stupidly simple electronics question; working with a salvaged LED...

I have a salvaged LED that I wish to re-use.

I do not know the spec of the LED, except that, being in a clear package, I can identify the anode and cathode. I have in my arsenal a multimeter and a fairly diverse assortment of discrete components including potentiometers.

One further complication; the LED is an IR emitter; it does not produce visible output.

In this particular case, I want to power it from a cut-off USB cable (so ~5V, <500mA).

How to proceed? - I realise the shortest answer might be to simply go out and buy a new LED, but I’m interested in this as much as a general method as for this specific case, because my component box also contains a lot of new, but unknown LEDs.

There are a couple of ways to do this. You need around 0.7V forward voltage to operate it, so what you need to do is somehow lose the other 4.3V.

You could use a dropper resistor. You'll have to guess the typical operating current, but if you look in trade catalogues you should get some realistic parameters.
It would help to know what it came from and the application: if it's a cheap remote then it'll probably be at the lower end of the specifications, whereas if it's meant for a digital transmission link it may well have a different requirement. Either way, this should give you a good idea of what to look for in a similar device in a catalogue.

The other way of dropping voltage is to use a chain of diodes; since each one will drop about 0.7V across its forward-biased junction, you don't need to worry as much about the current.
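As a rough back-of-envelope, here's what both approaches look like as Python arithmetic; the 0.7V drop and 20mA operating current are only placeholder guesses (later replies suggest IR LEDs drop more), not measured values:

# Rough sizing sketch for the two approaches above; the drop and current are guesses, not specs.
SUPPLY_V = 5.0       # USB supply
VF_GUESS = 0.7       # forward-drop guess used in this post
IF_GUESS = 0.020     # 20 mA operating-current guess

# Dropper resistor: it absorbs whatever voltage the LED doesn't.
r_drop = (SUPPLY_V - VF_GUESS) / IF_GUESS
power_mw = (SUPPLY_V - VF_GUESS) * IF_GUESS * 1000
print("series resistor ~ %.0f ohm, dissipating ~ %.0f mW" % (r_drop, power_mw))

# Diode chain: how many ~0.7 V silicon drops would be needed to burn off the excess.
n_diodes = round((SUPPLY_V - VF_GUESS) / 0.7)
print("roughly %d silicon diodes in series to drop the remaining voltage" % n_diodes)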

This particular LED is salvaged from a TV remote control; I bought a job lot of electronic junk that contained several dozen of them.

I will be able to verify that the device is emitting IR, because I have a webcam that I have modified by removing the IR filter element, but I don't think I'll be able to reliably gauge where the brightness reaches a plateau (which I've seen suggested for testing/identifying LEDs in the past).

The 0.7V mentioned by casdave is typical for Si diodes, but not for LEDs. The forward voltage drop for an LED depends on its emission wavelength (as well as the dissipation limit); for IR it's usually 1.3-2.0 V. Your IR camera detector would work well for estimating this: use a 200 ohm resistor R (serving both to limit the forward current to 25mA and to allow easy current measurement) and a ~1 kohm pot in series with the diode, and your 5V source; measure I (= V_R/R) vs. V_D as you vary the pot. If it doesn't work, use a smaller R. (First make sure you've got the diode forward-biased, though.)
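A sketch of the bookkeeping for that measurement; the 200 ohm sense resistor matches the suggestion above, but the meter readings are made up for illustration:

# Sketch of the Vf measurement described above; the readings are hypothetical, not real data.
R_SENSE = 200.0  # ohms, the fixed resistor in series with the pot and the LED

# (volts across R, volts across the LED) pairs jotted down as the pot is turned
readings = [
    (0.7, 1.12),
    (1.5, 1.20),
    (3.0, 1.28),
    (3.6, 1.32),
]

for v_r, v_d in readings:
    i_ma = v_r / R_SENSE * 1000.0   # I = V_R / R
    print("Vf = %.2f V at %.1f mA" % (v_d, i_ma))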

Allowable forward current can vary from tens of milliamps up. The 25mA limit used above is probably safe (just guessing from looking at various quoted specs). I don’t think there are any really good ways of estimating this from electrical measurements, since it depends partially on the packaging; your webcam brightness-plateau idea seems as good as any. Or, if you’ve got several to spare, you could test one to destruction and then back off by a factor of 2 or so.

If your webcam is too sensitive (if the IR LED saturates the image so that you can’t tell whether it’s getting brighter), try shining the LED against a wall or something partially reflective, to reduce its brightness.

You might also be able to examine the circuit it came out of; if the LED is driven from a transistor amplifier, for example, there may be a series resistor to limit the current (though probably assuming 3V source, if it’s from a 2AA-battery remote). But they may also have just put two IR LEDs in series, since Vf is probably close to 1.5V.

How big physically is the LED? Most of the “average” sized ones operate at around 10 to 20 mA. A “jumbo” LED might operate around 25 or 30 mA.

I have shone IR LEDs at old video cameras in the past, and even though they aren’t designed to pick up IR, they did give you a pretty reliable indication of how bright the IR LED was shining. If your webcam is similar, you shouldn’t have too much trouble figuring out where the LED output starts to saturate.

Another way I've measured LED output is to use a photodiode in series with a resistor and measure the current flowing through the photodiode. Photodiodes pick up a wide range of light, all the way from visible up to IR, so you may have to be a bit careful with your rig so that you don't end up measuring room light and shadows from your hand. Measuring the current directly requires a very precise multimeter (like a $500 Fluke specifically selected for its ability to read low current levels). It's much easier to measure the voltage drop across the resistor that the photodiode is in series with.
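A sketch of that conversion; the load resistor and both meter readings are hypothetical, and the dark reading is subtracted to cancel out room light:

# Sketch: turning the voltage across the photodiode's series resistor into a photocurrent.
# The resistor value and both readings are hypothetical.
R_LOAD = 10_000.0    # ohms in series with the photodiode

v_dark = 0.012       # volts measured with the IR LED off (room light + leakage)
v_lit  = 0.470       # volts measured with the IR LED on

i_photo_ua = (v_lit - v_dark) / R_LOAD * 1e6   # I = V / R, in microamps
print("photocurrent from the LED ~ %.1f uA" % i_photo_ua)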

I had a pretty neat rig set up at one time that I made out of plywood. It was basically a turntable with the degrees of the circle marked on the rotating part. You could mount the emitter or detector in the center of the turntable, and the other part on the stationary outer ring. Then you can make very precise measurements as you rotate one or the other device, and end up with essentially the “radiation pattern” of the LED. Using this you can figure out if the LED has a narrow or broad emission pattern. The narrow ones usually have a fresnel lens inside the LED which gives you greater range but a narrower beam.

• Most IR LEDs I have seen were rated for 30 to 50 mA and dropped about 1.3 volts. Hook up however-many-volts-plus-load it takes to get that much juice through it (visible LEDs are usually 20-25 mA, somewhat less…).
• Alternately, you can search the Digi-Key catalog for similar components and make a guess from their specs: http://www.digikey.com/

You only need a single resistor to bias your LED, and it should work OK over a range of resistance values. You need to know how much voltage the LED drops, and as you don't know the LED type, you're going to have to measure it. DougC's estimate of 1.3V is a good place to start, so initially use a 180 ohm resistor to bias it to about 20mA, and measure the actual voltage drop. Use the real voltage drop to recalculate the resistor value if it's much different from 1.3V.
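As arithmetic, that recalculation looks like this; the 1.45V "measured" drop here is just an example figure, not a real reading:

# Sketch of the resistor recalculation described above; the 1.45 V "measured" drop is made up.
SUPPLY_V = 5.0
TARGET_MA = 20.0

def series_r(vf):
    return (SUPPLY_V - vf) / (TARGET_MA / 1000.0)

print("first guess (Vf = 1.3 V): %.0f ohm" % series_r(1.3))          # ~185 ohm, so use 180
print("recalculated (Vf measured at 1.45 V): %.0f ohm" % series_r(1.45))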

Don't use series diodes or zeners to drop the voltage. It's a fallacy that Si diodes always drop 0.7V: at no current they drop zero volts, and at high currents they might drop over 1V, so they make very poor series regulators. A diode will only drop 0.7V at a particular current and temperature.
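To show how much the drop actually moves around, here's a sketch using the Shockley diode equation; the saturation current, ideality factor, and series resistance are illustrative small-signal-diode values, not measurements:

# Sketch of why a silicon diode's drop isn't a fixed 0.7 V.
# I_S, N and R_S are illustrative small-signal-diode values, not measured ones.
import math

I_S = 4e-9      # saturation current (A)
N = 1.9         # ideality factor
V_T = 0.02585   # thermal voltage at ~25 C (V)
R_S = 0.8       # series resistance (ohm)

def forward_drop(current_a):
    # Invert I = I_S * (exp(V / (N * V_T)) - 1) and add the resistive drop.
    return N * V_T * math.log(current_a / I_S + 1.0) + current_a * R_S

for i in (1e-6, 1e-3, 0.02, 0.3):
    print("%8.3f mA -> %.2f V" % (i * 1000, forward_drop(i)))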

It’s a standard 5mm package in bluish transparent epoxy. I’ve been tinkering with them as an illumination source for the inside of a birdbox that I’m fitting with a webcam. I have two of them in series with three silicon diodes (I had already set this up before Fridgemagnet’s post), and they are adequately bright (they will go brighter, but I’ve backed them off) and they are not emitting any palpable warmth (which I have experienced in the past with LEDs that are being overloaded - tested these with my lip and they feel cold).
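For what it's worth, a quick accounting of the drops in that chain, using the guess values from earlier in the thread rather than measurements:

# Quick accounting of the chain described above: two IR LEDs plus three Si diodes on 5 V.
# The 1.3 V and 0.7 V figures are the thread's guesses, not measured drops.
SUPPLY_V = 5.0
IR_LED_DROP = 1.3
SI_DIODE_DROP = 0.7

total_drop = 2 * IR_LED_DROP + 3 * SI_DIODE_DROP
print("nominal drops: %.1f V, leaving ~%.1f V of headroom across the rest of the circuit"
      % (total_drop, SUPPLY_V - total_drop))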

I’ll have to see whether this works in the long run; I’ll do an overnight soak test before I install them for real.