I thought of this from a reply. Imagine there’s an industrial process that requires a bath of molten metal at, say, 1000 K, and maintaining that temperature takes 1000 W. You don’t need this bath for a day. Would it use more energy to keep heating it, or to let it cool and heat it up again when you need it?
I think it actually takes more energy to keep heating it, since the heat loss is greater when it’s hotter. Similarly, if your home heater is thermostatically controlled, and you’re going out for dinner (2 hours), should you turn it off, or leave it on?
In general you are correct. More energy is required to maintain a temperature because the heat loss is greater at higher temperature. Even though the physics involved is relatively simple, I found this very surprising when I first learned it.
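The two options in the OP can be compared with a quick sketch under Newton’s law of cooling. The 300 K ambient temperature and the 5 MJ/K lumped heat capacity below are assumptions for illustration, not real furnace numbers:

```python
import math

T_bath, T_amb = 1000.0, 300.0      # K; ambient temperature is assumed
P_hold = 1000.0                    # W needed to hold the bath at 1000 K
k = P_hold / (T_bath - T_amb)      # W/K, linearised heat-loss coefficient
C = 5e6                            # J/K, assumed lumped heat capacity of the bath
t = 24 * 3600.0                    # idle for one day, in seconds

# Option 1: keep it hot, paying the full loss rate the whole time.
E_maintain = P_hold * t

# Option 2: let it cool freely, T(t) = T_amb + (T_bath - T_amb)*exp(-k*t/C),
# then pay back only the heat that actually leaked out while cooling.
T_end = T_amb + (T_bath - T_amb) * math.exp(-k * t / C)
E_reheat = C * (T_bath - T_end)

print(f"maintain: {E_maintain/1e6:.1f} MJ, cool and reheat: {E_reheat/1e6:.1f} MJ")
```

Because 1 − e⁻ˣ < x for any x > 0, the reheat energy always comes out below the maintain energy; better insulation or a shorter idle period only shrinks the gap.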
Maintaining a temperature can take much less energy than bringing something up to temperature in the first place.

It all depends on the amount of stuff you’re talking about (how much heat it can absorb), how long it sits, and how much insulation there is.

You can see this with wood heating of a cold cabin: it might take a stove full of wood (or more) and an hour (or more) before you feel any heat, but a few pieces of wood might then maintain the heat overnight.

Solar domestic hot-water collectors are used to preheat a tank of water that feeds a conventional water heater (gas or electric). Such a system can have a payback of maybe 4 to 6 years because of the large energy savings from bringing the water up closer to a usable temperature (the conventional heater can then maintain the heat at lower cost).
You have a bucket with a small hole in the bottom. When the bucket is full the pressure forces water out of the hole faster than when the bucket is less full. You don’t need the bucket of water for a while: does it use less water to let it drain and then fill it up again, or to keep a slow fill going balancing the water loss of a full bucket?
The answer is that you can never do better than simply shutting off the water and refilling later.

The best case for topping up is when the rate of loss is independent of the amount of water in the bucket: then, so long as the bucket never fully empties, you use the same amount of water with either tactic. But if the bucket stands idle long enough to empty, you need only exactly one bucket-full of water to refill it, whereas you can spend an arbitrary amount of water over time keeping it full. Once the rate of loss increases with how full the bucket is, you can’t even break even in the best case and will always lose. Heat loss increases with temperature difference. A well-insulated bath of molten metal is simply the same bucket with a smaller hole.
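That argument can be checked with a toy time-step simulation; the leak coefficient, bucket size, and idle time below are made up for illustration:

```python
a = 0.01       # per-second leak coefficient (assumed); leak rate = a * level
full = 100.0   # litres when full (assumed)
dt = 1.0       # s per step
idle = 3600    # seconds the bucket is not needed

# Tactic 1: keep topping up, replacing what leaks out of a full bucket.
water_keep_full = a * full * dt * idle

# Tactic 2: shut off the tap, let it drain, refill whatever is missing.
level = full
for _ in range(idle):
    level -= a * level * dt
water_refill = full - level

print(f"keep full: {water_keep_full:.0f} L, drain and refill: {water_refill:.0f} L")
```

Refilling can never cost more than one bucket-full, while topping up pays the full-bucket leak rate for the entire idle period.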
The answer depends. One factor to consider is the latent heat of fusion - the energy required (or released) during a phase change between liquid and solid. This can be considerable: for water, the energy required to melt ice at 0 °C into water at 0 °C could instead heat liquid water from 0 °C to about 80 °C. It is a lot of energy.
The rest of the calculation involves thermal loss rates and specific heat capacities, and there will be no general answer.
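The water figure can be checked from textbook values (latent heat of fusion about 334 J/g, specific heat of liquid water about 4.18 J/(g·K)):

```python
L_fusion = 334.0   # J/g, latent heat of fusion of ice (textbook value)
c_water = 4.18     # J/(g*K), specific heat of liquid water (textbook value)

# Temperature rise the same amount of energy would buy as sensible heat:
delta_T = L_fusion / c_water
print(f"Melting 1 g of ice costs as much as heating 1 g of water by {delta_T:.0f} K")
```

That works out to roughly 80 K: melting ice at 0 °C takes about as much energy as heating the resulting water from 0 °C to 80 °C.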
This was one of my first thoughts, but after thinking about it some more I don’t see how it’s relevant: the heat lost to the surrounding environment still depends on the temperature. There’s nothing that changes the rate of heat loss just because the material happens to be at the freezing point, is there?
(It is a concern for the time required to get things up and running again of course.)
The basic question regarding thermostats has been answered by me probably 20 times on this message board over 12 years, including one time where I did a computer simulation, and by Cecil as well. Yes, the OP’s suspicions are correct: it does take more energy to keep the house at a constant temperature than to use a setback.
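A minimal version of that kind of simulation is sketched below. Every number here (thermal mass, loss coefficient, furnace size, outdoor temperature) is assumed, and both runs end back at the setpoint so the comparison includes the cost of reheating:

```python
C = 1e7          # J/K, assumed thermal mass of the house
k = 200.0        # W/K, assumed heat-loss coefficient
T_out = 270.0    # K outdoors (about -3 C), assumed
T_set = 293.0    # K setpoint (about 20 C)
P_max = 15000.0  # W furnace capacity, assumed
dt = 60.0        # s per simulation step

def run(targets):
    """Bang-bang thermostat tracking one target per step; returns (J used, final K)."""
    T, energy = T_set, 0.0
    for target in targets:
        P = P_max if T < target else 0.0          # furnace on only below target
        T += (P - k * (T - T_out)) * dt / C       # lumped-capacity update
        energy += P * dt
    return energy, T

away = int(8 * 3600 / dt)   # 8 h away from home
back = int(2 * 3600 / dt)   # 2 h after returning, to recover the setpoint

hold, T_hold = run([T_set] * (away + back))
setback, T_sb = run([T_set - 5.0] * away + [T_set] * back)

print(f"hold: {hold/1e6:.0f} MJ, setback: {setback/1e6:.0f} MJ")
```

With these numbers the setback run uses noticeably less energy even after paying to reheat, because the house loses heat more slowly while it sits cooler.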
Interesting point, and a factor home thermostat settings don’t have. With a high heat of fusion and good insulation, it might possibly use less energy to keep it hot.
It has little to do with the energy, but some things you can’t just flip the switch on and off - a glass furnace, for instance. Even a company as large as Ball Hoover brings in an outside crew to shut down and start up a glass furnace. As it cools, you have guys with big wrenches tightening the bands that hold it together; as it heats back up, they need to be carefully loosened. Even with great care, the bottom could fall out on startup.
Yes, that’s what I was thinking: it’s probably more a matter of the difficulty of shutting down and starting up than of heat wasted. Things like liquids solidifying in pipes.
I suppose with air conditioning you should switch it off, but isn’t switching the compressor on and off too quickly bad for it?
Not really. If you have two identical systems, the hotter one always loses energy faster than the cooler one. For systems that are hotter than ambient temperature, of course.
That doesn’t make lowering the thermostat the “right” choice, just the one that uses marginally less heating energy.
Refrigeration compressors don’t care if you shut them off ten seconds after you start them - but they don’t like being started ten seconds after you’ve shut them off, as the residual pressure in the lines (from having recently run) means the electric motor that drives the compressor requires a lot of current to get the compressor spinning again. Many (most?) household refrigerators have a timer built into them so that they won’t begin running again for a considerable length of time after being unplugged (or experiencing a power outage). Air conditioners, I’m not sure about, but I would suspect they have a similar timer.