I just read a claim on a web site (silly me) suggesting that turning lights off at night does little to conserve energy. Is this the straight dope? :dubious:
"Dark-sky Would Adversely Affect the Economy. Proponents of Dark-sky legislation claim that flipping the switch to “off” during the nighttime hours would conserve energy.
Power plants, however, generate power at a consistent level. There is no quantity control, or volume knob, if you will, on the major power producing facilities in our country. These power plants do not operate solely on demand. Rather, they produce a consistent amount of power that is supplemented during daytime “peak demands” by quick-start standby generating plants.
In other words, **turning down the lights during nighttime hours** when communities are routinely far below peak consumption levels does not conserve energy, it only decreases consumption. "
This is complete bullshit. While it’s true that the generators turn at a constant rate (usually 3600 RPM for a 60 Hz system), the power production varies with load demand. If you’ve ever done the experiment of powering a standard incandescent light bulb with a hand-cranked generator, you know that when you unscrew the bulb, it’s very easy to turn the crank, as there is practically no load. When you screw the bulb back in, however, the handle becomes significantly harder to turn. The extra energy you’re putting into turning the handle goes into lighting the bulb. The higher the wattage of the bulb, the more energy is required to turn the handle. Some stationary bicycles use this principle to vary your workout level, by placing a variable load across a small generator that you turn by pumping the pedals.
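If you want to see the hand-crank example in numbers, here’s a quick Python sketch. The 70% generator efficiency is just a figure I picked for illustration:

```python
# Mechanical power you must supply at the crank scales with the bulb's
# electrical load. The efficiency figure is an assumed illustration value.

def crank_power_watts(bulb_watts, generator_efficiency=0.7):
    """Mechanical power (W) needed at the crank to light a bulb of the
    given wattage, assuming a fixed generator efficiency."""
    return bulb_watts / generator_efficiency

for watts in (0, 40, 100):  # unscrewed bulb, then two bulb sizes
    print(f"{watts:3d} W bulb -> {crank_power_watts(watts):5.1f} W at the crank")
```

With no bulb, the crank spins almost freely; a 100 W bulb demands over 140 W of muscle.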
There are ways of storing excess power at night for use during the day. For example, the Northfield Mountain pumped storage hydroelectric facility uses surplus production during offpeak hours to pump water to the top of a local mountain. Then during the day, it is released to drive hydroelectric generators. So even if the contention in the article were true, it doesn’t mean that the energy is just being tossed away.
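For a rough sense of scale, here’s a back-of-envelope calculation in Python. The water volume, head, and 75% round-trip efficiency are my own illustrative assumptions, not Northfield Mountain’s actual figures:

```python
# Energy stored by lifting water uphill, minus round-trip losses.
# All numbers here are illustrative assumptions, not the plant's specs.

G = 9.81  # gravitational acceleration, m/s^2

def stored_energy_mwh(volume_m3, head_m, round_trip_eff=0.75):
    """Recoverable energy (MWh) from pumping volume_m3 of water up head_m
    metres, at an assumed round-trip efficiency."""
    mass_kg = volume_m3 * 1000.0          # 1 m^3 of water is about 1000 kg
    joules = mass_kg * G * head_m * round_trip_eff
    return joules / 3.6e9                 # convert J to MWh

print(f"{stored_energy_mwh(1_000_000, 240):.0f} MWh recoverable")
```

With those made-up numbers, a million cubic metres lifted 240 m stores on the order of 500 MWh of recoverable energy, which is why pumped storage is worth the plumbing.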
Incidentally, nuclear, coal and other fuel-burning plants do have a “volume knob” of sorts. For nuclear plants, the control rods are used to regulate the amount of reaction. At times of low demand, the rods are pushed in farther, to slow down the reaction rate. For coal and other fuel-burning plants, the amount of fuel fed into the burners controls the amount of energy production. If this sort of control were not implemented, then yes, energy would be wasted in the form of excess heat, carried off by venting steam that wasn’t needed to run the turbines. There is, of course, a minimum amount of fuel the generating system must consume just to overcome losses due to friction, mechanical inefficiencies, no-load electrical losses in the generator and transformer iron cores, and the copper losses, primarily due to stray capacitance and leakage inductance, in distribution transformers.
Another thing not taken into consideration is that reduced demand by current consumers allows additional consumers to come on line without the need to increase net capacity (although this point is not so much tied to night vs day but goes to overall reduced net consumption).
I’m confused here. I thought that as the load increased it would tend to slow the generator’s rotation, dropping the output below 60 Hz. To maintain the 60 Hz frequency (and the same power output), the generator would have to be driven harder; i.e., more water from the dam, or an increased reaction rate in a nuke. Similarly, if demand drops, the generator will tend to speed up, so the reaction rate can be reduced, the dam gate closed a bit, or whatever, to counteract this and hold 60 Hz.
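For the curious, here’s a toy Python model of that governor action. The 5% droop is a common textbook value, used here purely as an assumption, not any particular machine’s constant:

```python
# Primary droop lets frequency sag when load rises; secondary control then
# raises the power setpoint (more water from the dam, higher reaction rate)
# to bring the system back to 60 Hz. The 5% droop is an assumed value.

NOMINAL_HZ = 60.0
DROOP = 0.05  # 5% droop: a full-range power mismatch moves frequency 5%

def frequency(load_pu, setpoint_pu):
    """Steady-state frequency under a proportional droop governor,
    with load and setpoint in per-unit."""
    return NOMINAL_HZ * (1.0 - DROOP * (load_pu - setpoint_pu))

# Load steps up 10% with the setpoint unchanged: frequency sags.
sagged = frequency(1.10, 1.00)
# Secondary control raises the setpoint to match the new load: 60 Hz again.
restored = frequency(1.10, 1.10)
print(sagged, restored)
```

So yes: more load pulls the shaft down, and the plant answers by feeding in more power until the frequency is back on the mark.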
It doesn’t affect the point of your post, Q.E.D., but actually for the pressurized water reactors with so-called “negative temperature coefficients of reactivity” common in the U.S., the reaction rate in the power range (after startup) is not controlled by the control rods. The reactor power is instead controlled by the steam demand of the turbines.
Electrical consumption drops, less power is required to drive the turbines, automatic valves cut steam input to the turbines to maintain a constant speed, resulting in less steam demand from the steam generators, so less heat is removed from the reactor core, which causes the core temperature to rise. That “negative temperature coefficient of reactivity” means that as the core temp rises, it tends to retard the nuclear reaction and slow the rate of reaction. By this feedback loop, reactor power drops until it matches the new lower steam demand.
Neat, eh? No operator action necessary.
Everything works in reverse when steam demand increases.
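A minimal Python sketch of that feedback loop, with made-up coefficients rather than real PWR constants, shows the reactor power chasing the steam demand on its own:

```python
# Steam demand drops -> less heat leaves the core -> core temperature rises
# -> the negative temperature coefficient pulls reactor power down until it
# matches demand. Coefficients are illustrative assumptions, not PWR data.

def settle_power(steam_demand, steps=2000):
    temp = 300.0       # core average temperature (arbitrary units)
    ref_temp = 300.0   # temperature at which power is exactly 1.0 per-unit
    alpha = 0.05       # assumed negative temperature coefficient magnitude
    heat_capacity = 10.0
    power = 1.0        # per-unit reactor power
    for _ in range(steps):
        # heat in from fission minus heat removed by the steam generators
        temp += (power - steam_demand) / heat_capacity
        # hotter core -> less reactivity -> lower power, and vice versa
        power = 1.0 - alpha * (temp - ref_temp)
    return power

print(round(settle_power(0.8), 3))  # prints 0.8
```

Cut steam demand to 80% and, with no operator action in the loop at all, power settles at 80%, just as described.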
So what are the control rods used for? They effectively control the average temperature of the core. Also, when dropped, they shut down the reaction. However, they do not generally have to be constantly moved as power demand changes.
BTW, the “negative temperature coefficients of reactivity” comes from the fact that as the liquid water moderator increases in temperature, it becomes a less effective moderator at producing the thermal neutrons necessary to maintain criticality.
Wow—all off of the top of my head. Some things you just can’t forget, no matter how hard you try.
Going back to the OP, it is true that for many households, the consumption of electricity for heating, cooling, water heater, refrigerator, washer, dryer, TV, stereo, computer(s) etc., makes up maybe 95% of the total consumption. Lights are only the last 5% or so.
So whether you turn them off religiously, or leave them on recklessly, the impact on your electric bill is only marginal. I must say I’ve gotten a lot more inclined to leave them on as I’ve realized I’ve been bumping around in the dark to save a nickel a day.
Bolding mine. I thought the lighting figure sounded low so I checked this site. We are talking about different societies here, but my view is that lighting use is increasing in modern houses. Where there once might have been a single central light pendant or ceiling fixture, there are now a multitude of downlights, uplights, wall-washers and so on. There may be switches for all individual lights, but unless you are already energy wise, do you bother finding the lighting combination that suits or just leave them all on?
An example is my mother’s house. Her living area has 4 downlights, but all controlled by one switch. No dimmer. Each globe was 100 W, and she had another light over the dining table which was normally left on too, for another 100 W. 500 Watts of lighting in a room that was maybe 30 sq.m. She was concerned at her power bills, so the first thing is to do all the no cost actions: turn down the water temperature, check for hot water leaks, turn off microwave oven at wall when not being used, turn off the heated towel rail in the bathroom. **Turn off lights** when not in room. The next step was to replace the 100 W incandescents with “energy efficient” lights in the downlights. 4 x 13 W instead of 4 x 100 W. All these actions together gave a noticeable reduction in the power bill, but I can’t itemise what saving each action achieved on the bill. I think it could be “nickels per day” on the lights, but what’s wrong with that?
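For what it’s worth, the nickels-per-day arithmetic on that globe swap is easy to check in Python. The four hours a day of use and the $0.15/kWh tariff are my own assumptions:

```python
# Annual running cost of a lighting load, assuming fixed daily hours of use
# and a flat tariff. Hours and tariff are illustrative assumptions.

def annual_cost(watts, hours_per_day=4.0, rate_per_kwh=0.15):
    """Yearly cost in dollars of running `watts` of lighting."""
    return watts / 1000.0 * hours_per_day * 365 * rate_per_kwh

before = annual_cost(4 * 100)  # four 100 W incandescent downlights
after = annual_cost(4 * 13)    # four 13 W replacements
print(f"${before:.2f}/yr -> ${after:.2f}/yr, saving ${before - after:.2f}")
```

Even with those assumed numbers, the swap saves on the order of $75 a year on the downlights alone. Nickels per day, but they add up.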
There have been some mis-statements in this thread.
Most combustion power plants vary their load fairly considerably according to demand, and many of them do drop their load at night, or even shut off entirely.
Coal power plants tend to run above 75% load for much of the time for two reasons - efficiency, and to avoid heat cycling stresses on the boiler tubes. What keeps coal power plants from operating well at low loads (below 40-50%) is not a matter of friction and minor inefficiencies but of maintaining stable, non-destructive, and economic operations. First off, the boiler box is sized for a certain range of heat input, and you can’t just shrink a boiler on demand to balance heat transfer to the tubes at 20% load. You have to keep the steam temperature up where you need it, so you don’t end up with a saturated turbine and possibly water droplets hitting the blades (unless that’s how you designed it, like some nuclear turbines are, I’m told).
Low-load operations can sometimes do unexpected things to slag buildup in the lower furnace - even though the temperatures are much reduced, you can get a lot of coal dropping out into the bottom ash and shooting the LOI (loss on ignition) way up, which can really screw up operations, causing secondary fires and plugging of the bottom ash system. High levels of unburned carbon can head off into the backpass and burn, or even go all the way into the electrostatic precipitator and burn - and ESP fires really don’t help anyone. Cyclone power plants can be even more twitchy and unstable, as balancing that cyclone “swirl” can be very difficult with changes in airflow and primary air/fuel ratio. If you go too far one way, the slag is too hot and too runny; too far the other way and the slag solidifies, and then you break out the dynamite…
Mills, fans, and so forth often do not like to operate at low speeds either, and lose a lot of efficiency. Milling operations have difficulty at low air velocities, and milling control is difficult and can become unstable. Burner flow, burner swirl, and flame pattern suffer as well, which can lead to high CO/UBC/NOx production.
Along the same vein, flue gas cleaning equipment (such as SCR systems) is typically sized and designed to operate best at certain temperatures. If the temperatures are too low, the SCR can’t operate well, or at all. Sometimes you even have to use auxiliary burners just to raise the temperature to the point where the SCR can run - essentially, wasting fuel to make the emissions better. Also, feedwater heater systems and the rest of the turbine cycle are sized to be most efficient at loads above 50% - most turbine heat rate kits show the lowest (best) net turbine heat rate at or near the maximum output of the turbine.
So the options are typically to 1) run the plant at above about 50% load, or 2) to shut the plant down completely. And operators are loath to shut off and turn on coal plants due to thermal stresses and the lengthy shutdown/startup cycle times. Typically, there are several large baseload units, with several smaller units that are on “economic furlough”, and are only brought online if the system looks like it’s going to take a serious hit, and power prices start to break 50 mills/kWh.
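That run-above-minimum-or-shut-down logic can be sketched as a toy dispatch rule in Python. The unit data and the 50 mills/kWh trigger are illustrative assumptions, not real fleet numbers:

```python
# Each unit either runs between its minimum stable load and full output or
# stays offline; the smaller "furlough" units are only started when the
# price clears the threshold. All figures are illustrative assumptions.

PRICE_TRIGGER = 50  # mills/kWh at which furloughed units are worth starting

def dispatch(demand_mw, price_mills_kwh):
    # (name, minimum stable load MW, maximum MW, baseload unit?)
    units = [
        ("base-1", 250, 500, True),
        ("base-2", 250, 500, True),
        ("peaker-1", 50, 100, False),
    ]
    output = {}
    remaining = demand_mw
    for name, lo, hi, baseload in units:
        if not baseload and price_mills_kwh <= PRICE_TRIGGER:
            continue  # economic furlough: not worth the startup cycle
        if remaining <= 0:
            continue  # option 2: unit stays shut down completely
        mw = min(max(remaining, lo), hi)  # option 1: run within [min, max]
        output[name] = mw
        remaining -= mw
    return output

print(dispatch(900, 40))  # cheap power: only the baseload units run
```

Notice there is no code path that runs a unit below its minimum stable load, which is exactly the constraint the "consistent level" quote in the OP garbles into "no volume knob."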
Oil and gas boilers are in the same boat, overall, except that they sometimes have a little more flexibility.
Gas turbines have great flexibility for low-load operation and startup/shutdown, although they do have some issues with thermal stresses too from cycling. And if they are combined cycle, then there is that whole steam section to deal with as well…
So really, the answer to the quote in the OP is that the author has no real knowledge of how power plants actually operate, and is incorrect.
It’s also impressive, in a sad way, how often you get reasonable-sounding statements of “fact” up near the top of a thread that are conclusively debunked when the real pros check in a few hours later. Oh well …
Of course the author is correct. In fact, it’s vital that we don’t turn off lights at night. The generator turns at a constant speed, so the power production is constant. If we use it, then the energy ends up as waste heat in our homes. If we don’t that energy is LOST!! And matter is just energy. Turning lights off mean the universe is decreasing constantly, as this energy is just deleted. In fact, it’s EVEN MORE SERIOUS! Think: you drive more during the day. Your computer is on during the day. You use SO much more power. And you’ve probably turned your car ignition off at night. SAVWE OUR PLANETT RUN LEAVE CAR IDLINGING in UR GARIDDGE OKOKOKOK??? ViTal!! now.