I’m not about to whip out the calculator to address this issue, since it’s more about common sense (IMO) than anything else.
Can we all agree that, if you leave the house for a few hours, it makes sense to allow the set-point temperature to drop? And that by doing so you’re saving money?
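For anyone who does feel like whipping out the calculator, here’s a rough Python sketch. The UA value and temperatures are invented for illustration, and it treats cooldown and recovery as instantaneous; that overstates the savings a bit, but it can’t flip the sign, since total furnace output has to equal total heat lost, and a cooler house loses less.

```python
# Back-of-the-envelope: does an 8-hour setback save energy?
# Assumes heat loss is proportional to the indoor/outdoor temperature
# differential (simple Newtonian model). UA and the temperatures below
# are made-up illustrative numbers, not measurements.

UA = 300.0        # whole-house loss coefficient, BTU/hr per deg F (assumed)
T_OUT = 30.0      # outdoor temperature, deg F (assumed)
T_HOME = 70.0     # normal set-point, deg F
T_SETBACK = 60.0  # setback set-point, deg F
HOURS_AWAY = 8.0

# Over the whole period, the furnace must replace exactly what leaks out.
loss_no_setback = UA * (T_HOME - T_OUT) * HOURS_AWAY
loss_setback = UA * (T_SETBACK - T_OUT) * HOURS_AWAY

print(f"No setback: {loss_no_setback:,.0f} BTU")
print(f"Setback:    {loss_setback:,.0f} BTU")
print(f"Saved:      {loss_no_setback - loss_setback:,.0f} BTU "
      f"({1 - loss_setback / loss_no_setback:.0%})")
```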
While I’m at it, I’d like to throw in a few second- and third-order variables that might come into play:
The longer a furnace runs, the hotter its combustion chamber gets, which increases its efficiency. Therefore, the wider the controller’s hysteresis (i.e. its deadband) at steady state, the longer each burn lasts and the more efficient the furnace becomes.
The R-value of insulation is often a function of temperature and relative humidity, so the heat loss may depend not only on the temperature differential but also on the absolute temperatures (see the sketch below).
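Here’s a toy sketch of that second point. The linear R(T) relation and its coefficients are invented purely for illustration, not taken from any insulation datasheet:

```python
# If R-value drifts with temperature, heat loss depends on the absolute
# temperatures, not just the differential. The R(T) relation below is
# an invented example, not real insulation data.

def r_value(mean_wall_temp_f: float) -> float:
    """Assumed: R falls slightly as the mean wall temperature rises."""
    return 19.0 - 0.02 * (mean_wall_temp_f - 50.0)

def loss_btu_per_hr_ft2(t_in: float, t_out: float) -> float:
    mean = (t_in + t_out) / 2.0
    return (t_in - t_out) / r_value(mean)

# Same 40 deg F differential at two different absolute temperatures
# gives two different loss rates:
print(loss_btu_per_hr_ft2(70.0, 30.0))  # mild winter day
print(loss_btu_per_hr_ft2(40.0, 0.0))   # bitter cold, same differential
```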
Sailor, this site provides at least some corroboration:
The fact that they point out two thermostat models that prevent the problem lends it some credibility.
I bought one of those from Honeywell about four years ago. Mine lies to us: if the reading is within a degree of the set temperature, the readout claims the set temperature is the actual room temperature. You can see this (if yours does the same thing) by changing the set temperature up or down a degree and watching whether the displayed room temperature changes right away. On top of this, it also seems to read a degree or two warmer than other thermometers in the same room.
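The logic seems to be something like this (just my guess at what the firmware does, obviously, not Honeywell’s actual code):

```python
# Guess at the display behavior: if the measured temperature is within
# a degree of the set-point, show the set-point instead of the reading.
# Purely speculative illustration.

def displayed_temp(measured: float, setpoint: int) -> int:
    if abs(measured - setpoint) < 1.0:
        return setpoint          # the "lie": snap to the set-point
    return round(measured)       # otherwise show the real reading

print(displayed_temp(68.4, 68))  # 68 (close enough to honest)
print(displayed_temp(68.4, 69))  # 69 -- bump the set-point a degree and
                                 # the "room temperature" jumps with it
print(displayed_temp(66.0, 69))  # 66 -- too far off to fudge
```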
ZenBeam, thanks for the info. There may be idiosyncrasies with certain energy sources (e.g. heat pumps) that require the homeowner to keep the temperature above a threshold to maintain high efficiency, but I believe Sailor’s comments are still valid in a general sense.
ZenBeam, yes, I understand the point, but a correctly installed system of that type would include the right thermostat in the first place, so the answer is still that you save energy by setting back the temperature.
Crafter_Man, your two points are irrelevant for all practical purposes.