Why did they just now invent the LED bulb?

So, we’ve been using LEDs for a very, very long time. Did it just not occur to anyone before to make an LED light bulb? Or do these new bulbs contain some kind of special technology that wasn’t possible before?

Making LEDs that produce anything close to natural light has been a rather large challenge. I don’t think most people would want their rooms bathed in a harsh green, like living inside a gigantic “on” button.

Also, super-bright (1,000-lumen-class) LEDs are a relatively new development. A lot of research into thermal management, phosphors, and topcoats had to be done before they were practical. Before that, it would have taken an array of many hundreds of LEDs to equal the light of a 60W incandescent lamp. Now there are single LEDs (well, modules) that are brighter than a 250W lamp.
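For a rough sense of the numbers, here's the back-of-the-envelope arithmetic; the per-LED outputs below are illustrative assumptions, not measurements of any particular part:

```python
# Rough arithmetic behind "hundreds of LEDs per 60 W bulb".
# Per-LED outputs are illustrative assumptions, not data for a specific part.

INCANDESCENT_60W_LM = 800      # typical output of a 60 W incandescent lamp
INCANDESCENT_250W_LM = 4000    # rough output of a 250 W incandescent lamp
EARLY_LED_LM = 2               # assumed output of an early indicator-style LED
MODERN_MODULE_LM = 5000        # assumed output of a bright modern LED module

print(f"Early LEDs needed to match a 60 W bulb: ~{INCANDESCENT_60W_LM / EARLY_LED_LM:.0f}")
print(f"One modern module vs. a 250 W bulb: {MODERN_MODULE_LM / INCANDESCENT_250W_LM:.1f}x")
```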

  1. Blue LEDs, and by extension white ones (which are blue LEDs with a yellow phosphor), are fairly recent. I still remember how odd it was to see a blue LED at an electronics show, and paying $10 for a bag of three because it was such a novelty. Also, it’s only recently that Cree and others have been making LEDs that are efficient and powerful enough for general lighting. IIRC the first white LEDs were less efficient than incandescents.

  2. Most consumers were, and still are, happy with general-purpose incandescent bulbs; it’s taking a government ban to force them to change.

Actually, bright blue LEDs are almost a decade old now. It was probably the need to develop ways to produce and market them in bulk that kept LED bulbs off the market until now.

It’s also been very hard to figure out how to dissipate the heat in bulbs powerful enough to replace the wattages we’re used to.

Here’s an article from Wired about the race to get the bulbs perfected and on the market.

Is the heat from the LEDs themselves, or from the transformer that’s needed to run them? If the latter, wouldn’t it make sense to have a central transformer and run 5 V DC (or whatever) to wherever the lights need to be?

Watts are Watts.
If an LED “bulb” is rated at 17W, that means the LEDs themselves are dissipating 17W. Since the LEDs themselves are pretty small, it takes clever engineering to get rid of the heat. Unlike incandescent lamps, LEDs want to operate at temperatures below 150 °C (and better still, below 100 °C). The cooler the LEDs run, the longer their lifespan.
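To see why the thermal side is such a big deal, here’s a back-of-the-envelope junction-temperature estimate. The thermal-resistance figures are made-up assumptions for illustration, not specs for any real lamp:

```python
# Junction temperature estimate, illustrating why heat sinking matters.
# Thermal resistances below are assumed values for illustration only.

def junction_temp(ambient_c, heat_w, r_theta_c_per_w):
    """T_junction = T_ambient + P_dissipated * R_theta (junction to ambient)."""
    return ambient_c + heat_w * r_theta_c_per_w

AMBIENT_C = 35   # a warm enclosed ceiling fixture
HEAT_W = 12      # assume ~12 W of the lamp's input ends up as heat

# Poor heat sinking (high junction-to-ambient thermal resistance):
print(junction_temp(AMBIENT_C, HEAT_W, r_theta_c_per_w=15))   # ~215 C, far too hot

# A well-designed heat sink (much lower thermal resistance):
print(junction_temp(AMBIENT_C, HEAT_W, r_theta_c_per_w=5))    # ~95 C, acceptable
```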

Here’s a good quote from the article:

It looks like the consumers have no interest in anything except a replacement bulb that fits into standard light sockets and emits a glow similar to what an incandescent bulb emits. The answer to LED can’t involve re-wiring or new appliances.

An interesting note: Cree has achieved 231 lm/W in the lab. That’s well over half-way to the maximum theoretical luminous efficacy for a white light source. It’s pretty interesting, because in general transducers (devices which change one form of energy into another) have very poor efficiency. LEDs are getting close to perfection.
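As a sanity check on the “well over half-way” claim, here’s the arithmetic, assuming the commonly quoted ceiling of roughly 300-350 lm/W for a usable white spectrum (that range is my assumption, not a figure from Cree):

```python
# Rough check of "well over half-way" for Cree's 231 lm/W lab result.
# The ceiling for white light depends on the spectrum and color rendering
# assumed; 300-350 lm/W is used here as an assumed, commonly quoted range.

CREE_LAB_LM_PER_W = 231

for ceiling in (300, 350):
    fraction = CREE_LAB_LM_PER_W / ceiling
    print(f"Fraction of a {ceiling} lm/W ceiling: {fraction:.0%}")
# Prints ~77% and ~66%, comfortably past the halfway mark either way.
```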

No such government ban exists in the United States, no matter how many times people repeat this.

New York Times

Only a slight hijack, I hope… In the 1960s, we thought that electroluminescence was the lighting wave of the future. We envisioned entire walls and/or ceilings glowing with an even, soft light. Although the phenomenon was discovered in 1907, it hasn’t become commercially feasible yet. (Or perhaps no one wants glowing walls or ceilings.)

Minor nitpick, but the efficiency is high enough that if the total input power is 17 W, the heat dissipated will be less than that; the LED you posted about is over 50% efficient, so the power lost as heat is under 8.5 W.

Anyway, most of the power loss is in the LEDs themselves; the “transformer” (actually an electronic switch-mode power supply) can be better than 90% efficient in modern designs. Incidentally, the solution used in cheap LED bulbs, which relies on a capacitor for its capacitive reactance plus a few other parts, can be even more efficient, although it’s not the ideal way to drive LEDs and is prone to flicker.
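For the curious, here’s a minimal sketch of that capacitive-reactance trick (a “capacitive dropper”); the component values are illustrative assumptions, and the LED string voltage is ignored for simplicity:

```python
# Minimal sketch of a capacitive dropper: a series capacitor's reactance
# limits the current into the LED string.  Values are assumed for illustration.
import math

MAINS_V_RMS = 120.0   # assumed US mains voltage
MAINS_HZ = 60.0
C_SERIES = 1.0e-6     # assumed 1 uF series capacitor

# Capacitive reactance: Xc = 1 / (2 * pi * f * C)
xc = 1.0 / (2 * math.pi * MAINS_HZ * C_SERIES)

# Ignoring the (much smaller) LED string voltage, the reactance sets the
# current: I ~= V / Xc.  An ideal reactance dissipates no real power,
# which is why this crude approach can still be very efficient.
i_rms = MAINS_V_RMS / xc
print(f"Xc ~= {xc:.0f} ohm, LED string current ~= {i_rms * 1000:.0f} mA")
```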

Also, for kicks: Ultra-efficient LED puts out more power than is pumped in (it’s only 69 picowatts, though, but who knows what this means for practical LEDs; the more-than-100% efficiency comes from converting ambient heat into light, on top of the electricity it actually consumes).

The OP’s question is like asking why we didn’t have the iPhone a few years after the invention of the transistor, or why we paid several billion dollars to sequence the first human genome when we can now do it for less than ten thousand dollars.

Technology is hard and unpredictable. When it involves fundamental advances in materials and solid-state devices, it takes many thousands of people working for many years to make significant progress. The dream of solid-state illumination has been alive for decades, but the red LEDs of 1990 were just becoming bright and efficient enough to use for tail lights in cars. LEDs are a natural for colored light applications. If you want red light with an incandescent source, you make white light and filter out everything except the red, an inefficient process.

I remember going to a conference in the late 1980s at which someone proudly announced the development of a blue LED made with silicon carbide. After great applause, they turned out all the lights in the room and passed around a sample that you could just barely see when you held it in your hand in the dark. A breakthrough came in 1993, when Shuji Nakamura at Nichia made the first high-luminosity blue LED using gallium nitride. Suddenly the race was on at HP (then the world’s largest manufacturer of LEDs), Nichia, Cree, and other companies to develop white LEDs and to increase their efficiency and lower their cost to the point where they could compete in white-light applications.

Things have proceeded about as quickly as the optimists like Roland Haitz at HP predicted.

Sorry, but that’s not the case.
The efficiency of the LED might be 30% at turning electricity into light, but it’s still dissipating nearly 100% of the lamp’s power. The converter/drivers for the LEDs are somewhere in the neighborhood of 95% efficient. So for the 17W lamp, the LEDs might actually be dissipating 16W, and the drivers 1W.

You are saying the wattage listed on the packaging is the heat dissipated, not the total electrical power consumption? So a “17W” LED bulb that is 30% efficient actually emits about 6W of light and consumes 22W total?

I did not know that.

Every light source that I have seen is specified by the input power - a 60 W light bulb draws 60 W, producing several watts of light and the rest heat (power in = light out + heat loss). It even says that here:

One would expect that if their LED bulb (or just the LED itself) draws 17 W, that is the total power consumed, not the amount of heat dissipated; that is always the case when a power-consumption rating is used, whether it’s a light bulb, a TV, or a motor (the last may also be specified by output power, but that’s like rating a light bulb by how much light it produces).

No, it consumes 17W, of which 5W is emitted as light and 12W as heat (assuming the LEDs are 30% efficient at turning electricity into light). The 12W is divided between the LEDs and the power supply, such that the LEDs dissipate 11W or so, and the power supply 1W.
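Spelled out, using the ~30% LED efficiency and ~95% driver efficiency assumed in this thread (assumptions, not specs for any particular bulb):

```python
# Power budget for a 17 W lamp, using the efficiencies assumed in the thread:
# ~95% efficient driver, LEDs ~30% efficient at turning electricity into light.

LAMP_INPUT_W = 17.0
DRIVER_EFFICIENCY = 0.95
LED_EFFICIENCY = 0.30

driver_heat = LAMP_INPUT_W * (1 - DRIVER_EFFICIENCY)   # ~0.9 W lost in the driver
led_input = LAMP_INPUT_W - driver_heat                 # ~16.2 W into the LEDs
light_out = led_input * LED_EFFICIENCY                 # ~4.8 W emitted as light
led_heat = led_input - light_out                       # ~11.3 W of heat in the LEDs

print(f"Driver heat: {driver_heat:.1f} W")
print(f"LED heat:    {led_heat:.1f} W")
print(f"Light out:   {light_out:.1f} W")
print(f"Total:       {driver_heat + led_heat + light_out:.1f} W (= lamp input)")
```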

There’s something seriously wrong there. That article is quick to reassure that the bulb doesn’t violate the First Law of Thermodynamics, but neglects to mention that it does violate the Second.

Yeah, apparently it uses ambient thermal energy to increase the emission of photons that would otherwise be trapped. So, it’s kind of cheating…