Perhaps he was thinking of fluorescent lights, which use a lot more current to start than to run. They use about twenty minutes' worth of electricity to start.
Do you have any evidence for that?
Would you believe seven minutes?
:dubious: A common claim, and I’m calling bullshit on it. If you can point to a cite, I’ll have trouble trusting that source in the future.
Consider:
Twenty minutes is 1200 seconds. A slow-starting fluorescent lamp might take 3 seconds to start up. An 80-watt fluorescent lamp troffer draws about 0.66 amps during normal operation. Your claim means that during that three-second startup, an 80-watt fluorescent lamp troffer draws 0.66 × 1200 / 3 = 264 amps. I’m pretty sure that would blow a typical 15-amp breaker. Heck, you could replace the 15-amp breaker with a block of solid copper, and then you would blow the main breaker on a typical house (usually rated for about 150 amps). Even if it didn’t, it’s unclear whether the 14-gauge wire feeding the lamp would survive three seconds of delivering 264 amps. Given the rated resistance of 14-gauge wire, you’d be dissipating about 176 watts of heat into each foot of the line feeding the troffer (and another 176 watts per foot into the neutral). Things would get pretty hot.
Rapid-start fluorescent systems have an even shorter startup period, implying higher currents. Cut the start period down to 1 second, and this implies 3X the current, and 9X the ohmic heating in the house wiring. Is it getting warm in here? Do you smell smoke?
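For anyone who wants to check the arithmetic, here it is in runnable form. The 0.66 A running draw, the 3-second start, and the 20-minute claim are the figures used above; the wire resistance (about 2.525 Ω per 1000 feet for 14 AWG copper at room temperature) is my assumption for "the rated resistance of 14-gauge wire":

```python
# Sanity check of the "twenty minutes of power to start" claim.
RUN_CURRENT_A = 0.66            # steady-state draw of the 80 W troffer
CLAIMED_ENERGY_S = 20 * 60      # "twenty minutes" of running power, in seconds
START_TIME_S = 3                # slow-starting lamp
WIRE_OHMS_PER_FT = 0.002525     # approx. 14 AWG copper, 2.525 ohms / 1000 ft

# Current needed to cram 1200 seconds' worth of running power into 3 seconds
start_current = RUN_CURRENT_A * CLAIMED_ENERGY_S / START_TIME_S
print(f"implied startup current: {start_current:.0f} A")    # 264 A

# I^2 * R heating in each foot of the hot leg (and the same in the neutral)
watts_per_foot = start_current**2 * WIRE_OHMS_PER_FT
print(f"heat per foot of wire:   {watts_per_foot:.0f} W")   # ~176 W

# Rapid start: same energy in 1 second means 3x the current, 9x the heating
fast_current = RUN_CURRENT_A * CLAIMED_ENERGY_S / 1
print(f"1-second start current:  {fast_current:.0f} A")     # 792 A
```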
I’m pretty sure this is confabulation.
I suspect the original assertion was about 20 minutes of lamp life.
Perhaps so.
Your cite actually says seven minutes' worth of power consumed in the first five minutes, which, if the extra energy is drawn at startup, implies about two minutes (120 seconds) of power in the first few seconds. We’re still blowing breakers with this claim.
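Running the cite's own numbers through the same 0.66 A troffer figure from earlier in the thread (a rough sketch, ignoring the normal running draw during those seconds):

```python
# The cite: 7 minutes of power consumed in the first 5 minutes.
# The extra 2 minutes (120 s) of energy, attributed to a ~3-second
# startup, still implies a current no 15 A breaker would tolerate.
RUN_CURRENT_A = 0.66      # 80 W troffer, normal operation
EXTRA_ENERGY_S = 2 * 60   # 7 min consumed minus 5 min elapsed
START_TIME_S = 3

implied = RUN_CURRENT_A * EXTRA_ENERGY_S / START_TIME_S
print(f"implied startup current: {implied:.1f} A")  # 26.4 A
```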
And what about high pressure sodium / HID / parking lot lights?
It’s darker if you turn them off. And you have to wait before you turn them back on again. And then they have to warm up before they start lighting up. But they don’t use much more power when turned off and on again – they just give less light while they’re warming back up.
I design lighting control systems, primarily for parks and recreation. Most of the lights we control are HID. We have a feature built into our software that allows “stagger-on” starting, because some parks are billed based on maximum hourly demand, so they want to keep the instantaneous current low. The thinking is that there is a HUGE surge when HID fixtures start, so we offset the time when each pole starts to minimize this. I was always skeptical of this, so I measured a typical fixture, and found that there is an extremely short current spike when power is first applied, but within a few milliseconds the power falls to the lowest level, and then slowly climbs as the lamp warms up.
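The stagger-on feature described above could be sketched roughly like this. The function name, pole labels, and two-second offset are hypothetical illustrations, not the actual product’s code:

```python
from datetime import datetime, timedelta

def stagger_on(poles, start, offset_s=2.0):
    """Return (pole, switch-on time) pairs, each pole delayed
    offset_s seconds after the previous one, so inrush spikes
    never coincide."""
    return [(pole, start + timedelta(seconds=i * offset_s))
            for i, pole in enumerate(poles)]

schedule = stagger_on(["pole-1", "pole-2", "pole-3", "pole-4"],
                      datetime(2024, 6, 1, 20, 30))
for pole, t in schedule:
    print(pole, t.strftime("%H:%M:%S"))
```

Given the measurement above showing the actual spike lasts only milliseconds, offsets on this scale would presumably matter only to how the demand meter samples, not to the wiring itself.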
Not exactly what it says. It says it uses 7 minutes of power in the first 5 minutes. However, lamp life is shortened by 20 minutes every power-on cycle, so that is the reason for the 20-minute recommendation.
I am unable to find anything else to support the claim of 7 minutes of power in 5 minutes.
That’s only if you assume the extra draw is concentrated in the first few seconds. The claim could be spreading that energy over the whole 5-minute warm-up cycle of a fluorescent bulb.
I plugged a fluorescent reading lamp into a Kill A Watt. It pulled 4 amperes when it came on, but quickly went down to 2.9 A. That spike lasts an insignificant amount of time, but I can see why the urban legend began.
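Even being generous with that measurement, the extra startup energy is trivial. This sketch uses the 4 A and 2.9 A readings from the post, assumes 120 V, treats them as simple volt-amps (ignoring power factor), and generously assumes the spike lasts a full second:

```python
VOLTS = 120
INRUSH_A, RUNNING_A = 4.0, 2.9
SPIKE_DURATION_S = 1.0   # generous; the actual spike is far shorter

extra_energy_j = (INRUSH_A - RUNNING_A) * VOLTS * SPIKE_DURATION_S
running_power_w = RUNNING_A * VOLTS

print(f"extra startup energy:    {extra_energy_j:.0f} J")            # ~132 J
print(f"equivalent running time: {extra_energy_j / running_power_w:.2f} s")
```

So even a full second at the inrush current costs well under half a second of normal running power – nowhere near minutes.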
Sort of a topic derailment, but I’ve heard the same thing about PCs: that it’s safer and cheaper to leave them on. Is that a load of bunk also?
The Mythbusters test found the tube fluorescent used about 23 seconds' worth of its baseline running power to start, and that was the longest of all. The rest (incandescent, halogen, compact fluorescent, metal halide, and LED) all used less than two seconds' worth of power to start. The tube fluorescent fixture looked like an older-style T-12 with a magnetic ballast, so I’d bet more modern T-8 or T-5 fixtures with electronic ballasts are much more efficient at starting, like the CFL.
Their analysis of effects on longevity was totally useless though, cycling all the lights on and off every two minutes for a month. All this revealed was that the LED was the only one still functioning, but there was no data presented on when all the others failed, and thus how it would compare to leaving them on. Re-lamping is certainly a factor in commercial or industrial cost calculations, especially in harder to reach areas, but I don’t see it ever being worth leaving lights on all night when nobody’s around at the very least.
Also, as far as using lights for supplementary heat, I honestly can’t come up with a situation where that’s ever the most economical solution unless your only other heat source is electric radiant, which is basically the same thing. Any other heat source is going to be more efficient and less expensive, and then there’s summer air conditioning. Many commercial buildings are already what’s called internally dominated, meaning the people, equipment, and lighting are the primary load on the HVAC system, not the exterior envelope of walls, windows, and roof. Thus many of these buildings are air conditioning all winter, so adding more heat isn’t what you want. On the other hand, if the building is so poorly insulated that you do need the extra heat in winter, then you’re also stressing the air conditioning all summer unless you’re in a very cold climate, at which point what are you doing heating with electricity anyway?
During extreme cold spells, it can be best to leave the lights on for extra heat. The regular heating system is inadequate but the events are so rare that it’s not economical to put in a heating system to handle them.
Don’t know if it’s true, but I heard a few years ago that cities switched traffic lights from incandescent to LEDs, and cities up north had to keep sending trucks out during the winter to scrape the snow and ice away. The incandescent lights had been hot enough to keep the lenses clear.
Blown snow on traffic signals is an occasional problem. While LEDs do produce heat, that heat isn’t in the beam of light itself; it’s dissipated via a heat sink off the electronics. I would think there’s some way to route that heat around to the front to help melt any snow, but I guess it’s not a big enough factor to really matter. The point, though, is that sending out a few guys in trucks with air hoses to blow out the signals is still a lot cheaper than the cost of running older-style incandescent fixtures 24/7/365. A 12" LED signal head uses only 10 watts compared to 120 watts for an incandescent. Since they last so much longer, not as many crews are needed for re-lamping, which is another cost saving. They’re also quite a bit brighter (notably the reds, allowing red arrows to meet minimum lumen requirements that weren’t achievable before in many places, for instance), and they don’t burn up the plastic lenses that some jurisdictions used, which was especially problematic at actuated intersections where signals wouldn’t change often.
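Using the wattages quoted above (10 W LED vs. 120 W incandescent, lit year-round), the energy savings per signal head are easy to work out. The $0.12/kWh rate here is my assumption for illustration, not a figure from the thread:

```python
HOURS_PER_YEAR = 24 * 365   # signal heads run continuously
RATE_PER_KWH = 0.12         # assumed electricity rate, $/kWh

def annual_cost(watts):
    """Yearly energy cost of one signal head at the assumed rate."""
    return watts / 1000 * HOURS_PER_YEAR * RATE_PER_KWH

led = annual_cost(10)
incandescent = annual_cost(120)
print(f"LED:          ${led:.2f}/yr")
print(f"incandescent: ${incandescent:.2f}/yr")
print(f"savings:      ${incandescent - led:.2f}/yr per signal head")
```

Multiply that by the dozens of heads at a typical intersection and the occasional snow-blowing crew looks cheap by comparison.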
Notwithstanding all this, which I was aware of, I think the theory is that the lights will be turned on anyway during occupancy, and the waste heat needs to be factored in (and they do get hot - they are mostly HID or something like that). Since it’s factored into the heat balance for the day, it follows to use the heat at night (seasonally? I don’t know) rather than increase the size of the building heaters for peak load. Again, I don’t know why it was done that way, but it is what it is, and an engineer who was supposed to know what he or she was doing designed it that way. :rolleyes: