One would think there would be a simple answer, but apparently not.
The maximum possible efficiency (for a monochromatic light source) is 683 lumens/watt (from Luminous efficacy - Wikipedia). When one starts talking about “white” light, things get murky (no pun intended, I guess). The maximum efficiency of a truncated blackbody radiator is 251 lumens/watt (from the same link referenced above). But, LEDs are not blackbody radiators, and so they can apparently far exceed that limit. Cree just announced a white LED with an efficiency of 303 lumens/watt. How high can it go? Clearly, the upper limit is 683 lumens/watt.
“White” light can be created by mixing enough colors that the photoreceptors in your eyes respond to the mixed light as if it were white.
How many colors this takes, exactly, depends on how close you want to get, but clearly you could mix monochromatic LEDs, with that theoretical max efficiency of 683 lumens/watt that you mentioned, in the same light-bulb to “beat” that blackbody case.
With the blackbody case, you are generating the light by heating an object up and letting it radiate. The LED works by a different mechanism. Conservation of energy means the blackbody is going to radiate all of the energy it is given; the “lumens/watt” numbers we are talking about count only lumens of visible light. The “lost” light is in the infrared spectrum.
As for how high the best possible is: think about it for a moment, and you realize that the junction in an LED is between differently doped regions of a semiconductor; it doesn’t work otherwise. Inconveniently, a semiconductor has significant electrical resistance. The energy lost to that resistance becomes heat, which is why that Cree bulb is only around 300 lumens/watt.
I don’t know how to determine what the practical limit is from this information.
Other sources mention that that 683 number only applies for monochromatic green light. A “lumen” is apparently a measure based on human sensitivity to light, not a measure of energy. Some googling revealed that the limit for “white” light is more like 400 lumens/watt, indicating the Cree bulb is about 75% efficient.
It sounds like this is approaching the limits it will ever reach. Keep in mind, energy efficiency is not the only part of the game. If you made the P-N junctions using nanotechnology and made the lightbulb gigantic, you might be able to reach better than 90% efficiency, but this would not be an efficient use of resources (because the manufacturing energy costs would exceed any energy saved running the bulb).
Thanks for a fascinating question and a great preliminary reply. This is why I love the Dope.
It depends on the quality of white light you want. If you want a blackbody spectrum (truncated to the visible range), the best you can do is around 250 lm/W. If you’re happy with a peaky spectrum that looks white under direct view, then 400 lm/W is the approximate limit (as Habeed mentioned).
It appears that Yoshi Ohno at NIST has done a bunch of this work. See this presentation for some info.
Narrowband sources tend to be the most efficient, but also have the worst color quality. However, several (3+) combined narrowband sources can look reasonable.
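To see where that ~250 lm/W truncated-blackbody figure comes from (and, as a bonus, how badly an incandescent-temperature blackbody does once you count the infrared it radiates, as mentioned earlier), here’s a rough numerical sketch in Python. The 10 nm table of the eye’s photopic sensitivity curve is coarse and approximate, so treat the outputs as ballpark figures, not gospel:

```python
import math

# Coarse CIE 1931 photopic sensitivity V(lambda), 400-700 nm, 10 nm steps.
# Approximate values; a finer table moves the answers around a little.
V = {
    400: 0.0004, 410: 0.0012, 420: 0.0040, 430: 0.0116, 440: 0.023,
    450: 0.038,  460: 0.060,  470: 0.091,  480: 0.139,  490: 0.208,
    500: 0.323,  510: 0.503,  520: 0.710,  530: 0.862,  540: 0.954,
    550: 0.995,  560: 0.995,  570: 0.952,  580: 0.870,  590: 0.757,
    600: 0.631,  610: 0.503,  620: 0.381,  630: 0.265,  640: 0.175,
    650: 0.107,  660: 0.061,  670: 0.032,  680: 0.017,  690: 0.0082,
    700: 0.0041,
}

H, C, K = 6.626e-34, 2.998e8, 1.381e-23   # Planck, speed of light, Boltzmann

def planck(lam_nm, temp):
    """Blackbody spectral radiance at wavelength lam_nm (nm) and temperature temp (K)."""
    lam = lam_nm * 1e-9
    return (2 * H * C**2 / lam**5) / math.expm1(H * C / (lam * K * temp))

def efficacy(temp, truncated):
    """Lumens per radiant watt of a blackbody, optionally truncated to 400-700 nm."""
    visible = sum(planck(lam, temp) * v for lam, v in V.items())
    if truncated:
        total = sum(planck(lam, temp) for lam in V)                    # 400-700 nm only
    else:
        total = sum(planck(lam, temp) for lam in range(100, 20000, 10))  # whole spectrum
    return 683 * visible / total

# Should land near the ~250 lm/W figure quoted above:
print(f"5800 K blackbody, truncated to visible: {efficacy(5800, True):.0f} lm/W")
# Should come out far lower, since most of the power is radiated in the IR:
print(f"2800 K blackbody, whole spectrum:       {efficacy(2800, False):.0f} lm/W")
```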
I was going to say some things, but it looks as if Habeed has covered it pretty well, and said everything that I would.
I’ll add that LEDs aren’t quite monochromatic (although they’re pretty narrow), and that I once built a device using banks of LEDs of three different colors along with a “mixing chamber” to even out the effects of all three. The current to each color could be varied independently, so that I could adjust them to get as close to white as I wanted. The light I got was a lot “whiter” than you get from those “white” LEDs that use a blue LED and a phosphor.

It’s clear from the Wikipedia write-up alone that the 683 figure is only true for a green monochromatic source at the peak of the eye response curve. Since luminous efficacy is meant to be a measure of how well light is channeled to the visible portion of the spectrum, with no “waste” in the UV or IR (incandescents put a huge amount there), the best results will come from sources that put more of their light out in the visible.

And there’s no reason to restrict yourself to only three LED colors – you can use more to give yourself extra tweaking control.
Also observe that, good as the luminous efficacy may be, it is distinct from the source’s energy efficiency. LEDs may be “cold”, since all their light output goes into the visible (unless you use UV or IR LEDs), but they still produce heat. I once saw an LED produce so much heat that it melted the solder on its leads and broke its own circuit. The most efficient light sources are still fluorescents.
Can you explain this? What’s the difference between luminous efficacy and efficiency? Wikipedia says they’re the same thing expressed in different units (% of max vs lm/W).
Isn’t an “efficient” light one that produces the most light, per unit of power, within the desired spectrum?
I was under the impression that LEDs give me a lot more light for my money than fluorescents (ignoring acquisition costs). Am I wrong?
This doesn’t answer my question.
The theoretical maximum of 683 lumens/watt for monochromatic green light is derived from the energy in the photons. If you have a stream of photons at that frequency, and its brightness is one lumen, then the stream has a power of about 1.46 mW (1/683 of a watt).
So, I guess that my question could be answered by proposing a three-color “white” light source and correcting for the differing sensitivity of the eye to the various colors. Then the power of the photons in the red and blue beams could be calculated. I just wanted to know what the aggregate number was.
As an aside, my understanding is that *in theory* electricity can be converted to light with 100% efficiency. This doesn’t take into account the wall-plug conversion inefficiencies, of course.
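Here’s a back-of-the-envelope sketch of that aggregate number for a hypothetical three-color white, weighting each assumed wavelength by the eye’s sensitivity. The wavelengths and the equal power split below are placeholders for illustration, not an optimized white:

```python
# Assumed wavelengths and an assumed equal power split, purely for illustration;
# a mix that actually looks white needs its proportions (and wavelengths) tuned.
V = {450: 0.038, 540: 0.954, 630: 0.265}   # approximate CIE photopic sensitivities
P = {450: 1.0,   540: 1.0,   630: 1.0}     # radiant watts in each band (assumed)

lumens = sum(683 * V[lam] * P[lam] for lam in P)
watts  = sum(P.values())
print(f"aggregate efficacy: {lumens / watts:.0f} lm per radiant watt")
# An equal-power mix of these three lands in the high 200s of lm/W; getting up
# toward the ~400 lm/W "white" limit mentioned above means optimizing the
# wavelengths and the ratios rather than splitting the power evenly.

# Sanity check on the 1.46 mW figure above: one lumen of 555 nm green light
# corresponds to 1/683 of a radiant watt.
print(f"1 lm at 555 nm = {1000 / 683:.2f} mW")
```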
Huh, and I thought diodes had zero resistance in forward bias. No wonder that when I measured a 1 W LED, the multimeter didn’t recognize it as conductive.
LEDs have an unusual resistance curve. At much less than the forward voltage drop, the resistance is essentially infinite (no current flows). As you reach and exceed that voltage, the effective resistance drops off sharply, following the diode’s exponential I-V curve. It is very hard to drive an LED from a voltage source because of the steepness of that exponential: even minor thermal variations can cause you to exceed the power limit. This is why an LED should always be driven with a current source, or at least be in series with a resistor.
Some multimeters have a diode test mode which basically acts as a small current supply. However, cheap meters sometimes don’t work on blue or white LEDs due to their relatively large voltage drop (3+ volts compared to ~2 for red).
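For anyone curious just how touchy voltage-driving an LED is, here’s a little sketch using the Shockley diode equation with made-up parameters (the saturation current and ideality factor below are illustrative, not from any datasheet), plus the usual series-resistor fix:

```python
import math

# Shockley diode equation with made-up parameters, just to show the shape of
# the problem; real LEDs vary a lot from part to part and with temperature.
I_S = 8e-20      # saturation current, amps (assumed)
N   = 3.0        # ideality factor (assumed)
V_T = 0.02585    # thermal voltage at ~25 C, volts

def led_current(v_forward):
    """Forward current for a given forward voltage, per the Shockley equation."""
    return I_S * math.expm1(v_forward / (N * V_T))

# A tenth of a volt of drift more than triples the current with these numbers,
# which is why driving from a stiff voltage source is asking for trouble:
for v in (3.00, 3.05, 3.10):
    print(f"{v:.2f} V -> {led_current(v) * 1000:.1f} mA")

# The usual fix: a higher supply voltage and a series resistor, so the resistor
# (not the diode's steep curve) sets the current. Values below are assumptions.
v_supply, v_forward, i_target = 5.0, 3.1, 0.020
print(f"series resistor for ~20 mA: {(v_supply - v_forward) / i_target:.0f} ohms")
```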
Please see the end of my post above. The Wikipedia article you cite refers to luminous efficacy, not efficiency. It’s a measure of how much of the input energy goes into providing visible light, and is clearly meant as a figure of merit for the amount of radiant energy going into useful visible light. It is not to be confused with the efficiency of conversion of electrical power to photons. As has been noted above, LEDs do have resistance, do lose power to heat (although not in quite the same way that incandescent lights put out most of their energy in the infrared), and can produce significant amounts of heat. If you’re looking for high conversion efficiency, fluorescent lights still beat LEDs, although you’d think it would be the other way around.
It says this:
That is a measure of how much of the electromagnetic radiation produced goes into visible light. NOT how much of the electricity that flows into the device goes into useful photons. It neglects resistive losses, so it’s not a measure of the overall efficiency of production from energy in to light energy out.
You could say it’s a measure of the efficiency of production of visible light relative to all photons out, but not a measure of the overall efficiency of the devices in converting electrical energy to light energy.
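Maybe it helps to put the bookkeeping in one place. The numbers below are invented, purely to keep the three quantities in this argument straight:

```python
# Made-up numbers, just to separate the three quantities being argued about.
electrical_in_W = 1.00    # power drawn from the supply
radiant_out_W   = 0.40    # optical power actually emitted; the rest is heat
lumens_out      = 120.0   # what a light meter reads

wall_plug_efficiency   = radiant_out_W / electrical_in_W   # 0.40, i.e. 40%
efficacy_of_radiation  = lumens_out / radiant_out_W        # 300 lm per *radiant* watt
efficacy_of_the_source = lumens_out / electrical_in_W      # 120 lm per *electrical* watt

# The 683 / ~400 / ~250 lm/W limits discussed above apply to the radiation
# (lumens per radiant watt); the spec-sheet lm/W figure for a real lamp is the
# last number, which folds the resistive and other electrical losses back in.
print(wall_plug_efficiency, efficacy_of_radiation, efficacy_of_the_source)
```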
Wikipedia does not disagree with me.
From Wikipedia:
white LED (raw, without power supply): 4.5–150 lumens/watt
T5 tube: 70–104.2 lumens/watt
I’d say 150 beats 104.2…
If you put your light bulb in a vacuum, and leave it turned on until it reaches thermal equilibrium, the two are the same. The energy that turns into waste heat has to go somewhere, and that somewhere will be incandescence (mostly not in the visible range).
What is the difference?
I agree (as a lighting engineer). Efficacy (empirical) will necessarily include secondary losses of the particular technology, be it filament resistance, circuit impedance, etc. Determining a theoretical maximum efficacy gets trickier, however. How efficient is your ballast? What is the ambient environment and how does that affect the component? What is the size and shape of the LED die and lens? What undiscovered “miracle” phosphor could be created? Is it “cheating” to count ballast efficiency in your spec?
Industry LED makers “bias” their efficacy numbers in their spec sheets to compete in the market by idealizing how their particular part was measured. “Doesn’t perform per spec sheet” is a common complaint. E.g. “I used an array of spec 60 lm/W LEDs in my flashlight, but our lab is only measuring 50 lm/W from the flashlight!”
I can’t contribute to what a theoretical maximum LED efficacy would be, though, or how close an LED could come to the ideal 683 lm/W; that’s for solid-state physicists and semiconductor manufacturers to figure out.