Lamps always have a max wattage, and it’s always lower than I would like. With today’s modern “equivalent” energy-saver bulbs that use, for instance, 25 watts but put out light equivalent to 75 watts, is it safe to put in a bulb whose actual wattage is at or below the fixture’s rating but whose equivalent light output is higher? (Assuming I can find one… my kitchen floods are supposed to be 40 watts or lower, and I want brighter, so I’m hoping to find a 40-or-lower wattage bulb with an illumination equivalent of 60 or 70 or whatever. I just want them to be brighter while still being safe.)
The maximum wattage allowed for a fixture exists because of the heat the lamp produces, so the limit refers to the wattage the bulb itself actually uses.
The actual power consumption of the bulb (not the “equivalent”) is what matters. The two areas of concern are the current capacity of the wiring, and the heat tolerance of anything that gets warmed by the bulb. So if you want a brighter bulb, look on the packaging to find out the actual power consumption, and make sure that number is below the rating of the fixture in which you’ll install it.
what they said. the limits on bulb fixtures generally assume an incandescent.
so the only thing that really matters is the actual wattage… cool.
I wouldn’t put a 40-watt (actual power) LED or CFL lamp in a fixture designed for a 40-watt incandescent bulb. It won’t damage the fixture, but the heat may damage the lamp. Incandescent bulbs tolerate much higher temperatures than CFL or LED.
Absolutely right!
Most lamps are wired internally with #18 or #20 wire – that is 16 amps (1,920 watts) or 11 amps (1,320 watts). Both of those are far higher than any incandescent (or even CFL) bulb that you could fit in them. (The current capacity of the lamp switch is important too, but they are always sized to match the wire limit.)
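For anyone who wants to check those numbers, the wattage figures are just the current rating times the line voltage. A quick sketch (assuming 120 V US mains, and ignoring any code derating factors):

```python
# Rough sketch: convert a wire's current rating into a power limit.
# Assumes 120 V household mains; real ampacity limits depend on
# insulation type and electrical-code tables.

LINE_VOLTAGE = 120  # volts

def max_watts(amps):
    """Power limit implied by a current rating: P = I * V."""
    return amps * LINE_VOLTAGE

print(max_watts(16))  # 16 A rating -> 1920 W
print(max_watts(11))  # 11 A rating -> 1320 W
```

Either limit is an order of magnitude above what any screw-in bulb draws, which is why heat, not current, is the binding constraint.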
More critical is the heat limitation. Lamps often have plastic parts or fabric shades near to the bulb, and most of the wattage of an incandescent bulb is dissipated as heat. Nearby parts will absorb that heat, and can get hot enough to be discolored or damaged – or even to burst into flame, in the worst case. That’s the main reason you can’t put a light bulb of a larger wattage in a lamp.
But that applies to regular incandescent bulbs. CFLs run much cooler than incandescents and put out about 4 times as much light per watt. So while it’s not safe to put a 75 watt or 100 watt bulb in a lamp designed for a max of 60 watts, you can safely use a 19 watt CFL (equivalent light to a 75 watt incandescent), a 23 watt CFL (= 100 watt incandescent), or even a 42 watt CFL (= 150 watt incandescent). That should certainly get you as much light as you want! (I wouldn’t go much higher in CFLs, because 1) you probably couldn’t squeeze the CFL into the lamp, and 2) a tight lamp might get the CFL too warm, thus burning it out prematurely.)
LEDs run even cooler and put out even more light per watt than CFLs, so substituting a higher-equivalent-wattage one is even safer (but their high purchase price might make this unsafe for your wallet).
What? Why would a 40-watt LED or CFL produce more heat than a 40-watt incandescent?
And why wouldn’t a 40-watt LED/CFL be designed to withstand its own heat output?
I find your statement confusing.
No, I said an incandescent bulb can tolerate higher temperature than LED or CFL.
The manufacturer of the lamp cannot predict the operating temperature because it is affected by the fixture - some fixtures have better ventilation than others. And someone who designs a “40W (200W equivalent)” CFL lamp may not assume it will ever be used in a 40-watt fixture, and the bulb may overheat if you do.
They don’t produce more heat, but they are more heat sensitive. Keeping them in an enclosed fixture can cause them to overheat unless they’re specifically designed to be used in enclosed fixtures.
But does that account for efficiency? In theory LEDs and CFLs should generate much less waste heat to begin with. If this Wikipedia chart of luminous efficiency is accurate:
- A 40W, ~500 lumen incandescent is about 2% efficient
- A ~500 lumen CFL, about 12%
- A ~500 lumen LED, maybe around 10%
So, for a given brightness level, a CFL or LED lamp would have to use 5x-6x more wattage before it produces the same amount of waste heat as an incandescent.
Is heat failure of a CFL or LED something you’ve experienced personally? Is it a bigger deal in real life than I assume?
ETA: Just saw your responses. Do these enclosed fixtures often cause a problem? Maybe I’ve just never noticed them myself.
I think part of the confusion may be that he used the words “lamp” and “bulb” interchangeably in his statement.
To an electrician, a bulb and a lamp are the same thing (a bulb is actually a type of lamp). The thing you screw the lamp into is called a “luminaire.” To a non-electrician, a lamp is that thing sitting on your table that you turn on and off. A 40-watt CFL or LED would give off the light equivalent of a 150-watt incandescent.
The big problem with CFL and LED bulbs is that most of them are not designed to be used in fully-enclosed fixtures. Even those that are safety-rated for use in fully-enclosed fixtures suffer reduced lifespans if so used. They need air-circulation around them in order to dissipate heat.
Incandescents are far more tolerant of the heat they produce. A fixture designed for a relatively small 40 watt incandescent may not have enough air circulation to allow the heat from a relatively large LED or CFL to properly dissipate.
You may enjoy this video: CFL Light Burned My Home Plus Fire Destruction Photos.
Unless your CFL bulb says on it “Suitable for use in enclosed fixtures” – don’t do it.
It’s the biggest deal.
The electronics crammed into the base of CFL and LED lamps are, in general, working at the very ragged edge of their temperature range. In order to make the lamps as cheap as possible, the manufacturers often use lower-quality capacitors, and these will have a substantially shorter lifespan at higher temperatures. In my experience, the electronic ballasts almost always fail before the tubes (for CFLs) do. I’m sure LEDs have the same issue, but I don’t have enough data to comment on them (yet).
There are two reasons why LEDs in particular are more heat sensitive:
- They emit very little heat as radiation, so the vast majority of their heat needs to be dissipated via convection or conduction. Incandescent lamps emit much of their heat as radiation, so it doesn’t accumulate near the fixture.
- The electronics in LED lamps won’t operate above 150C or so, and would really prefer to stay below 100C. Incandescent lamps, on the other hand, can easily tolerate temperatures into the many hundreds of degrees.
I see. Thanks for the info.
So, to clarify:

- In certain enclosures, CFLs will be subjected to more heat than they’re designed to withstand, either shortening their lifespan or perhaps starting a fire.
- But in general, CFLs produce less waste heat than incandescents.
So my question is, how OFTEN is this a problem? I’m curious if this is a legitimate cause for concern or if it’s anti-CFL propaganda by those who hate the legislated shift away from home incandescents.
That video linked to a news story talking about CFL dangers. The story said there were 34 reports filed to Consumer Product Safety Commission of CFLs smoking or producing a burning odor, and 4 that caught fire. Out of 272 million sold. And out of those reported failures, how many distinguished between actual heat-related failure due to insufficient ventilation and some other sort of generic manufacturing defect?
ETA: To phrase the question better…
CFLs produce less heat to begin with, but in certain enclosures, much of that heat will be trapped and may break the CFL or cause a fire. In the real world, does this happen enough that it should be a source of concern for the average consumer?
Light sources are inherently dangerous.
There have been thousands of house fires started by halogen fixtures (notably torchieres).
Examples are all over the web: Halogen Hazards
In general, CFLs and LEDs are probably safer, but they have their own unique failure modes that may concern people (smoking ballasts).
The LED itself is only rated up to 150C or so (example), some are rated lower (120C for the Luxeon).
This is why high-power LEDs are always connected to large heat sinks. Also if you see a cluster of high-power LEDs, like traffic lights, they often have a few broken LEDs - and they’re almost always in the middle, because those get hotter than ones on the edge.
No, that’s not what we’re saying.
If an enclosure is designed for a 40-watt incandescent bulb, it (the enclosure) will handle the heat from a 40-watt CFL or LED. There is no risk of fire*. The danger is to the bulb - a CFL or LED may die prematurely because of overheating, while an incandescent will survive the same heat just fine.
(*In theory it’s possible for failing electronics to catch fire, but I suspect that failure mode is so rare as to be negligible.)
I understand that.
But given that a 40W CFL/LED usually produces less waste heat than a 40W incandescent, is it likely, in the real world, that they’ll overheat in the same enclosure? I understand that if they’re trapped in something like an integrating sphere with no heat sinks or airflow, they’ll overheat and break sooner than an incandescent. But in the real world, does this happen much?
Take a look at some typical LED replacements for the normal household lamps. Notice how they’ve all got fins or grooves or ceramics. These aren’t aesthetic choices. It shows how much trouble the manufacturers have to go to deal with the heat-dissipation issues. In effect, they build radiators into the lamps to try to get rid of the heat. They wouldn’t be going to this trouble and expense if it weren’t a real issue.
The Switch Lighting Company has concocted a new LED bulb that tries to address these issues by using some fins at the base and filling the bulb with a liquid that conducts the heat away from the LEDs. Here is a video with the president of the company having a chat with some geeks at a trade show about heat issues.
By the way, their lamps are the only consumer-grade LEDs approved for use in fully-enclosed fixtures.
Maybe I’m reading this wrong, but I read this as a CFL/LED lamp needs to use 5-6x more power than an incandescent before it dissipates the same amount of waste heat. In reality, you can’t use a CFL that draws more than 44.5 watts for the same heat dissipation as a 40 watt incandescent (39.2 watts). Maybe you meant that the CFL/LED can be 5-6x brighter for the same wattage (so not for a given brightness level).
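That break-even figure can be sketched directly: find the CFL wattage whose waste heat matches the incandescent’s, assuming the ~2% and ~12% efficiencies quoted earlier in the thread:

```python
# Rough sketch: what CFL wattage dissipates the same heat as a 40 W
# incandescent? Assumes 2% efficiency (incandescent) and 12% (CFL),
# the figures quoted earlier in the thread.

def heat(watts, efficiency):
    """Waste heat from a bulb drawing `watts` at the given efficiency."""
    return watts * (1.0 - efficiency)

incandescent_heat = heat(40, 0.02)              # 39.2 W of heat
breakeven_cfl = incandescent_heat / (1.0 - 0.12)

print(round(incandescent_heat, 1))              # 39.2
print(round(breakeven_cfl, 1))                  # 44.5 W CFL matches it
```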
Note also that a CFL drawing 40 watts will still dissipate nearly 90% of the power of a 40 watt incandescent - in other words, lighting technology still has a long way to go in terms of efficiency improvements! I’d suggest not going above 20 watts in this case to avoid premature failure from overheating, and operating temperatures definitely should be below 100C, much less 150C.

For example, electrolytic capacitors might have a rated lifetime of 5,000 hours at 105C (at least - they must still meet specs after that long, but there’s no guarantee afterwards). On that basis, a 10,000 hour CFL must be kept below 95C, and a 50,000 hour LED below 70C. This also gives credence to the argument that special fixtures be used for LED bulbs (as with tubular fluorescents): the ballast can then be separated from the heat-generating LEDs, which dissipate perhaps 90% of the total power lost, and run at a much cooler temperature. A ballast temperature of just 35C would lead to a lifetime of at least 640,000 hours - 73 years - although other failure modes would probably come into play before then.
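Those lifetime figures follow the usual rule of thumb that electrolytic capacitor life roughly doubles for every 10C drop below the rated temperature. A quick sketch of that rule, using the 5,000 hours at 105C rating quoted above:

```python
# Rough sketch: the "lifetime doubles per 10 C cooler" rule of thumb for
# electrolytic capacitors. Rated point from the thread: 5,000 h at 105 C.

RATED_HOURS = 5_000
RATED_TEMP_C = 105

def cap_lifetime(temp_c):
    """Estimated lifetime in hours at a given operating temperature."""
    return RATED_HOURS * 2 ** ((RATED_TEMP_C - temp_c) / 10)

print(round(cap_lifetime(95)))  # 10,000 h -> matches a 10,000 h CFL
print(round(cap_lifetime(70)))  # ~56,600 h -> comfortably over 50,000 h
print(round(cap_lifetime(35)))  # 640,000 h -> about 73 years continuous
```

This is only an approximation of Arrhenius-style aging, but it reproduces all the numbers in the post above.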