What are the energy efficiency factors for various types of light bulbs?

On the solar collection side, there are rankings for the fraction of light energy that is converted into electrical energy, stated as a percentage: this solar cell converts 20% of the light energy it receives, etc.

Do they do percent numbers for the reverse case of getting light from electricity?

I know fluorescent converts a higher percentage of electricity to visible light than incandescents do. What would be those numbers as stated above?

And what of halogen, mercury vapor, sulfur, LED . . .?

According to various sources, compact fluorescents yield about 60 lumens/watt. White LEDs, about 50 lumens/watt (as of 2000). Incandescents, about 15 lumens/watt.

The LED makers are shooting for 100 lumens/watt by 2010.

And, while writing this I found this site.

It’s not very common to use percentages, and it’s not very useful, because the human eye has different sensitivities to different wavelengths (colors). If lamp A converts 10% of its input energy into red light (and the rest is wasted), while lamp B converts 5% of its input into green light, lamp B would appear brighter.

The most common unit for measuring the brightness of a light source is the lumen. It applies a weighting function (the luminosity function) to the spectrum, so the resulting number represents how bright the light appears to the human eye.
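To see the weighting in action, here’s a minimal Python sketch using a couple of approximate CIE photopic V(λ) values; picking 650 nm to stand in for “red” is my own assumption for illustration, not a claim about any particular lamp.

```python
# Minimal sketch of how lumens weight a spectrum, using a few
# approximate CIE 1931 photopic luminosity values V(lambda).
V = {555: 1.0, 650: 0.107}  # approximate; V peaks at 1.0 at 555 nm

def luminous_flux(radiant_watts, wavelength_nm):
    """Lumens from monochromatic light: 683 lm/W at the 555 nm peak."""
    return 683.0 * V[wavelength_nm] * radiant_watts

# The lamp A / lamp B example above, assuming 100 W input for each
# lamp and 650 nm for "red":
lamp_a = luminous_flux(100 * 0.10, 650)  # ~731 lm
lamp_b = luminous_flux(100 * 0.05, 555)  # ~3415 lm
print(lamp_a, lamp_b)  # lamp B appears several times brighter
```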

Sure, it’s not the common thing to measure; that’s why I had to ask.
Seems a fair question though, since it’s the measure for solar power.

I guess what you’d really want to know would be the combination.
For example, there are outdoor Malibu lamps I’ve bought for the garden that collect energy from the sun and light your path at night. Sounds great, but they are too dim to actually light the walkway; they just look nice. Then there’s the more expensive improved model with different bulbs, also too dim but not so bad. Here I have a solar application and I have no way to measure or compare efficiency.

As a combination, it’s not apples and oranges; it’s light in and light out. So you have to wonder: how much of the light that went in finally came out? What’s the efficiency of the setup? That’s the idea.

A more concrete example: suppose you have a standard roof, and you think, “I will convert that to a solar roof, and it will light part of the basement just as though it had a skylight.” What you want to do is figure the area of roof needed to light a given area of living space to the same level. If the collector efficiency is 20% and the bulbs’ is 20%, you need collectors 25 times the size of your “skylight lamp.”
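A quick back-of-the-envelope check of that arithmetic (ignoring wiring and storage losses, which would only make the ratio worse):

```python
# Sizing the "skylight lamp" collector from the example above.
collector_efficiency = 0.20  # sunlight -> electricity
bulb_efficiency = 0.20       # electricity -> visible light
overall = collector_efficiency * bulb_efficiency  # 0.04, i.e. 4%

# To match a skylight of a given area, the collector must be larger
# by the reciprocal of the overall efficiency:
area_ratio = 1 / overall
print(area_ratio)  # 25.0 -> collectors 25x the skylight's area
```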

What I’m trying to say is, you are trying to compare light sources with different spectra. It is apples and oranges.

For a solar panel, you can measure or calculate the energy output when the panel is exposed to standard sunlight. But for a light bulb, you can’t say “if this bulb could emit the same spectrum as sunlight, how much would it emit?” Because it can’t.

If you still insist on some type of conversion factor, a 100% efficient lamp that emits 555 nm monochromatic light (pure green light) would be 683 lumens/watt. (But please, someone correct me if I’m wrong - I always get confused with lumens and candelas.)
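Taking that 683 lm/W maximum at face value, converting a lamp’s lumens-per-watt figure into a percentage is trivial; here’s a sketch (the example efficacies are the rough numbers quoted earlier in the thread):

```python
# Luminous efficiency as a fraction of the theoretical 683 lm/W
# maximum (an ideal monochromatic 555 nm source).
MAX_EFFICACY = 683.0  # lm/W

def luminous_efficiency(efficacy_lm_per_watt):
    """Fraction of input power emitted as eye-weighted visible light."""
    return efficacy_lm_per_watt / MAX_EFFICACY

print(f"{luminous_efficiency(60):.1%}")  # compact fluorescent: 8.8%
print(f"{luminous_efficiency(15):.1%}")  # incandescent: 2.2%
```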

Perhaps a more useful number would be the maximum efficiency of a lamp which produced a spectrum which was visibly white? People don’t usually want a pure green light.

The problem with comparison to solar panels is that the power output of a solar panel is compared to the bolometric luminosity of the Sun, that is, the total power at all wavelengths. This will include wavelengths outside the range of vision, and I suspect that solar panels are actually most efficient somewhere in the ultraviolet. But if we apply the same bolometric standard to incandescent light bulbs, they’re nearly 100% efficient: essentially all of the energy consumed by a light bulb gets turned into light. The problem is that, for a standard incandescent, most of the light output is in the infrared, and is thus unusable.

No, infrared is heat. I use that in my basement. Having a light on in the basement is an energy savings.

I suppose that nearly 100% is possible (some would be lost in chemical conversion of the filament, and other minuscule losses). I guess I never considered it that way.

But you are right, in general what counts is the visible spectrum.

So - since you don’t like the question as stated, add “in the visible spectrum”.

Oh, wait. It does say that.

So the question stands:
"I know fluorescent converts a higher percentage of electicity to visible light than incandescents do. What would be those numbers as stated above?

And what of halogen, mercury vapor, sulfur, LED . . .?"

Only in the same sense that every wavelength of light is heat. The process by which an incandescent light bulb produces visible light is exactly the same as the process by which it (or a human body, or anything else warm enough) produces infrared, and any 100 watt lamp will heat a room by the same amount, regardless of its spectrum.

The question is fine, and in fact a very reasonable question to ask. But solar panels aren’t restricted to visible light, so the comparison to them isn’t as apt as you may have thought.

You know, Chronos, one of the most frustrating things about this forum is that people want to argue about the wording of the question. David Simmons seems to have got it. The site he sent me to had comparisons of some different bulb types expressed in lumens.

And you were getting close until you started to go off onto non-visible light.
And solar panels, for that matter. If trying to compare solar panels makes no sense to you, just forget I mentioned them. They are just an example of expressing energy in terms of percent instead of watts or lumens. That’s all. I don’t really care if solar panels were never invented. The question is about bulbs.

But let’s take your statement that 100% of the wattage put into a bulb comes out as light. The question then is simple to state: what percent of that light is visible, for various bulb types? For an infrared bulb, very little. For a regular bulb the value is n%, where n is between 0 and 100.

Your question seemed straightforward to me.

According to a quick Wiki search, only 5% of the electricity going into an incandescent light bulb comes out as visible light. Halogen lights are about 9% efficient, and a fluorescent light bulb about 20%. And a blue LED is about 35%.

Unless you happen to be a plant.
There’s a nice chart of the sensitivity of the human eye vs. wavelength, and the photosynthetic response of plants vs. wavelength, here: Watts, Lumens, Photons and Lux.
Lumens per watt values are a convenient way to measure lamp efficiency for some applications; Lux, PAR, or microEinsteins work better for others.

And that’s probably the best way to compare the efficiency of various light sources.

Una’s Wikipedia link has some numbers. But the problem with this method is that it considers all colors to be equal: 1 W/nm at 680 nm is counted the same as 1 W/nm at 555 nm, even though the latter appears roughly 60 times brighter to the human eye.
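To put rough numbers on that, assuming approximate CIE photopic sensitivities:

```python
# Equal radiant power at two wavelengths, very unequal brightness.
V_555, V_680 = 1.0, 0.017  # approximate CIE photopic sensitivities

lumens_555 = 683 * V_555 * 1.0  # 1 W at 555 nm -> ~683 lm
lumens_680 = 683 * V_680 * 1.0  # 1 W at 680 nm -> ~11.6 lm
print(lumens_555 / lumens_680)  # ~59x brighter at 555 nm
```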

Chronos was attempting to correct a misapprehension, to wit, that radiation emitted in the infrared is not light. It’s true that it isn’t visible light, but then radiation in the strongly blue or red spectra, while technically visible, is hardly illumination in the casual sense of the word. You can’t compare light output at different spectra as equal, even if their radiant output is equivalent. White LEDs, for instance, give out a very pure, well-distributed light which gives a very crisp image, while a sodium arc lamp of much higher wattage will give off a dirty orange light that is inadequate for resolving fine detail.

Furthermore, the polarization and aspect flux or focus of the light is also a key factor in how much usable light you get; light generated by fluorescent bulbs, for instance, tends to be highly depolarized and unfocused, while light coming from an LED or arc lamp will tend to have a higher degree of polarization. It is, as scr4 notes, really apples and oranges to compare light of different spectra in terms of the usability of the light. So, while it’s relatively easy to measure the efficiency of a solar cell (by measuring the potential developed versus total light flux over the cell), there’s no simple answer to your question. Using lumens as a measure gets you into the ballpark, but the effectiveness of a light source is a complicated subject that can’t be quantitatively reduced to a single figure.

Stranger

Pliny, the spirit of this board is very much to provide full and highly accurate responses.

Appreciate the deeper level of understanding being offered by some of the posters; their answers are practically applicable to your sidewalk lighting project.

threemae - I understand that you didn’t come for the question but just to scold me, but this question is NOT about a specific lighting project. The examples given are just examples. To know what The Question is, look for the question marks. As a relative newbie :wink: you may be excused for not noticing that I’ve been around this board since the beginning. I know what to expect. Most people have a lot of different knowledge, want to answer, and will tell you what they know. The frustration is when they have to change your question first to work it into the conversation.

[QUOTE=Una Persson]
Your question seemed straightforward to me.

According to a quick Wiki search, only 5% of the electricity going into an incandescent light bulb comes out as visible light. Halogen lights are about 9% efficient, and a fluorescent light bulb about 20%. And a blue LED is about 35%.

http://en.wikipedia.org/wiki/Light_Bulb
[/QUOTE]

Thank you. You have been a great help.

I didn’t come to scold you; I thought you asked a very interesting question for which there were a number of very interesting replies with varying levels of complexity and connected explanations. Classic Dope.

If you aren’t asking the question about a specific lighting project, why not value the more in-depth but slightly harder to apply responses?

Because most tried to change the question, or raise some point on a different topic. Some of the responses were quite dismissive of the whole idea. And some apparently just came to argue without offering anything scientific.
But it turns out that Una’s link led to another Wikipedia article that said not only was my question valid, but there is a name for the percentage requested and a table of calculations computed by those who make 'em.

Wikipedia - Luminous efficacy
"The following table lists overall luminous efficacy and efficiency for various light sources:

Category
Type
Overall
luminous efficacy (lm/W) Overall
luminous efficiency[2]
Combustion candle 0.3 [6] 0.04%
Incandescent 5 W tungsten incandescent 5 0.7%
40 W tungsten incandescent 12.6 [7] 1.9%
100 W tungsten incandescent 17.5 [7] 2.6%
glass halogen 16 2.3%
quartz halogen 24 3.5%
high-temperature incandescent 35 [4] 5.1%
Fluorescent 5-24 W compact fluorescent 45-60 [8] 6.6%-8.8%
34 W fluorescent tube (T12) 50 7%
32 W fluorescent tube (T8) 60 9%
36 W fluoresecnt tube (T8) up to 93 up to 14%
28 W fluorescent tube (T5) 104 15.2%
Light-emitting diode white LED 26-70 [9][10] 3.8%-10.2%
white LED (prototypes) up to 131 [11][12][13] up to 19%
Arc lamp xenon arc lamp 30-150 [14] 4.4%-22%
mercury-xenon arc lamp 50-55 [14] 7.3%-8.0%
Gas discharge high pressure sodium lamp 150 [15] 22%
low pressure sodium lamp 183 [15] up to 200 [16] 27%
1400 W sulfur lamp 100 15% "
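And as a quick sanity check of my own (my arithmetic, not the article’s): the efficiency column is just the efficacy column divided by the 683 lm/W theoretical maximum mentioned earlier in the thread:

```python
# Verify a few table rows: efficiency = efficacy / 683 lm/W.
rows = {
    "100 W tungsten incandescent": 17.5,
    "28 W fluorescent tube (T5)": 104,
    "low pressure sodium lamp": 183,
}
for lamp, efficacy in rows.items():
    print(f"{lamp}: {efficacy / 683:.1%}")
# -> 2.6%, 15.2%, 26.8% (the table rounds sodium to 27%)
```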

No. You asked what percentage of input energy is radiated in the visible spectrum. That is not the definition of “overall luminous efficiency”, for the reasons I already tried to explain. Read the Wikipedia article more carefully.