We bought a new home last year. Before we moved in, I bought and installed a programmable electronic thermostat, the kind you can set to change to a certain temperature at a certain time of day.
The way I have it set now is that at 6:15AM I have it warm the house up to 20C.
7:30AM, shortly before the Mrs and I leave, it goes down to 16C (this is probably the one I’ll be questioning later on).
5:15PM, roughly when we get home, it goes up to 21C.
Lastly, 10PM, it sets to 18C for sleeping at night.
The question I have is this. When we are at work for the day, is 16C too “cold” to be economical? For the most part, the weather this winter hasn’t been too cold. So, during the daytime when we were at work the house wouldn’t even hit 16 (or at least, I don’t think it would). When I would get home it would usually show 17 as the current house temp. However, now that the days are cold again (-15 to -25) the house will indeed cool to that temperature after some hours. Then, when I get home the furnace is on for an hour straight (give or take) to get the temperature back up to 21.
The house is new, so the furnace is quite efficient (a new Trane), and the windows and insulation are all very good. We also have thick, thermal-lined curtains or roman blinds on all the windows.
Is there a way to figure out the most “economical” setting?
I don’t mean to sound facetious, but the most economical setting is “OFF.”
It’s pretty simple… regardless of anything else, when your furnace is “on” you’re using energy, and when your furnace is “off” you’re not using energy. The more it is off, the more money you’ll save. But this must (obviously) be weighed against comfort.
Fair enough, but since most homes have tropical plants and pets, that’s not a very viable option when it is -30 outside.
So, would it be fair to say “let it get as cold as you are comfortable letting it get when you are out?”
If that is 13C, so be it? Obviously, when the furnace is off, it’s not burning gas. However, if the furnace is going to have to be on for 2 hours straight to warm up the house, does that outweigh the benefit, or is it always better to keep it as low as you can?
>> I don’t mean to sound facetious, but the most economical setting is “OFF.”
Pretty much on target. Other considerations: You probably want to start the heating between 1/2 and one hour before you get home and turn it off the same amount of time before you leave. It depends on the thermal inertia of the system. Other than that, the lower the setting, the more you save.
IANAHISH (I am not a home improvement show host) but…
Furnaces/boilers usually take a little while to get up to their most efficient operating temperature. A general rule of thumb is that a slightly underpowered furnace that has to run longer is ultimately more efficient than an overpowered furnace that only runs for a short period (and never reaches its peak operational efficiency). So I would think that your strategy of letting the house cool down and then having the furnace on for a long time is more efficient than having the furnace cycle on and off to ‘hold’ a more moderate temperature.
>> Fair enough, but since most homes have tropical plants and pets, that’s not a very viable option when it is -30 outside.
>> So, would it be fair to say “let it get as cold as you are comfortable letting it get when you are out?”
I am not sure I understand. This is a question about what plants and pets need? I can’t answer that. But, at any time the lower the setting the more you save. How low you want to set it is up to you.
>> Obviously, when the furnace is off, it’s not burning gas. However, if the furnace is going to have to be on for 2 hours straight to warm up the house, does that outweigh the benefit or is it always better to keep it as low as you can?
The answer is YES, popular misconceptions notwithstanding. The furnace may be ON for two hours, but that is heat you are using. If it had been on while you were away, the heat would have been lost. You are always better off turning it down or off while you are away.
What I meant is this. If it is -35 outside, since I have plants and animals inside, I’m not comfortable just setting the furnace to “off” all day. While my house is pretty efficient, I’m not entirely comfortable letting it settle to whatever temperature it would reach with no furnace heat during a frigid winter day. However, I may be comfortable with, say, 15C, as I am sure the cat and plants aren’t going to freeze at that temp.
Thanks for the nice summation, Sailor. Makes sense to me. Spud the cat may have to get used to having a “chilly” home during the daytime.
Do the calculation and you will see I am right as has been established in many prior threads. To me it is quite obvious but I have an engineering background. Just do the calculation and, if you come up with anything else, I’ll show you where the mistake is.
The mathematical model is similar to the following experiment, which you can do if you dislike math: take a plastic cup and make a hole in the bottom so water can drip out. Now pour water in to keep a certain level. Water dripping out is heat lost. Water poured in is heat added. Now tell me if, by keeping the level higher for a while, you save any water overall. The answer is NO. But you can try it. I prefer doing the math.
Try an analogy. Imagine that you have a bathtub with a small hole in the bottom. If the tub is full, it’s leaking, and the more water you have in the bath, the faster it leaks.
Imagine further that you want the bath to be full at a certain time. It should be obvious that the best approach is to let the tub drain and then fill it up when you need it, instead of constantly topping it up.
On preview I see that sailor beat me with a similar explanation, but I’ll post anyway…
I understand quite well sailor’s and Popup’s explanations, but I think they may oversimplify the situation, and I would be most interested in any empirical data on the subject.
Let me rephrase your explanations back in terms of an idealized house thought experiment. A house where you turn the heat off is going to have a temperature drop due to heat loss to its environment. (Let’s assume that eventually the temp inside matches the temp outside and we have steady state with no further heat loss.) Then when we reheat, running the furnace to restore the temperature adds an amount of heat equal to that which was lost during the drop. Alternatively, if we heat the house during the day, the rate of heat loss will be greater because the temp difference between house and its environment is greater. So you would conclude that you’re better off turning the heat off entirely during the day.
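For anyone who would rather run the numbers than argue them, here is a minimal sketch of that idealized house as a single lumped thermal mass that loses heat in proportion to the inside-outside temperature difference (Newton’s law of cooling). Every parameter value here is invented for illustration; nothing is measured from any real furnace or house:

```python
# Toy model: heat leaks out at a rate proportional to (inside - outside),
# and the furnace supplies whatever deficit is needed to hold a setpoint.
# Energy is counted in "degree-units", which is proportional to real heat
# for a fixed house heat capacity. All parameter values are invented.

def energy_used(day_setpoint, hours_away=9.0, t_evening=21.0,
                t_out=-20.0, k=0.2, dt=0.01):
    """Furnace energy over the away period plus the evening reheat.

    k  -- loss coefficient per hour (a made-up value)
    dt -- simulation time step in hours
    """
    temp, used = t_evening, 0.0
    for _ in range(int(hours_away / dt)):
        temp -= k * (temp - t_out) * dt      # heat leaking out
        if temp < day_setpoint:              # thermostat holds the setpoint
            used += day_setpoint - temp
            temp = day_setpoint
    used += t_evening - temp                 # reheat to 21C on arriving home
    return used

for setpoint in (21.0, 18.0, 16.0, 13.0):
    print(setpoint, round(energy_used(setpoint), 1))
```

With these made-up numbers, total energy (reheat included) falls steadily as the daytime setpoint drops, which is exactly the “lower is always cheaper” claim above.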
But all this assumes a closed system.
Here’s another point that may be splitting hairs but it makes for interesting discussion. A house is not a closed system. As the air heats, it expands, and houses are not airtight. The increasing pressure pushes the warmed air out of cracks and crevices. As the air cools the opposite happens, and cold air is drawn into the house from the outside. So we cannot assume that heat loss is confined to convection and radiation on the outside of the house.
Therefore I would suggest that a house held at a steady temperature is losing less heat to pressure increase than a house that starts out cool and is then heated up.
How does the expansion and contraction of the air in the house affect this thought experiment? Doesn’t the loss of heat during a reheating offset some of the advantage of the reduced heat loss to convection and radiation during the day?
Not a mathematical formula, but from a document produced by the DOE: link
It basically says that the amount of energy required to reheat a home is equal to the amount of energy saved by not running the furnace while the house cools off. The amount of energy ‘saved’ only comes from the period of time when the temperature has leveled off at the low point.
So what this seems to imply is that you wouldn’t save any energy if the house never gets a chance to settle at its lowest temperature.
As an example, the house is at 70 degrees when we wake up. The daytime setback on the thermostat is 62 degrees, so the house gradually cools but only reaches a temp of 64 degrees at 5 pm, when the thermostat kicks it back up to 70 for the evening. In this scenario, the article would imply there is no fuel savings, because the temperature was always rising or falling and never leveled off at a low temp.
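Out of curiosity, I ran that exact scenario through the same crude proportional-loss model. The loss coefficient is a guess, picked so the house drifts only to about 64F after 9 hours, matching the scenario:

```python
# 70F house, 62F daytime setback, 20F assumed outside; heat loss per hour
# is proportional to (inside - outside). k is invented so that the house
# only reaches about 64F after 9 hours and never levels off at 62F.

def fuel(day_setpoint, t_start=70.0, t_out=20.0, hours=9.0, k=0.014, dt=0.01):
    temp, used = t_start, 0.0
    for _ in range(int(hours / dt)):
        temp -= k * (temp - t_out) * dt
        if temp < day_setpoint:
            used += day_setpoint - temp
            temp = day_setpoint
    used += t_start - temp        # thermostat kicks it back to 70 at 5 pm
    return used

print(fuel(62.0))   # setback case: temperature never reaches 62
print(fuel(70.0))   # no setback: furnace holds 70 all day
```

Even though the temperature never levels off at 62, the setback case still burns a little less fuel, because at every instant the cooler house is leaking less heat. That suggests the “no savings unless it levels off” reading of the article is an over-simplification, not that the article is right.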
So this raises the question, is it better to set the thermostat back only slightly, so the period of time where the temperature is held low (and where the saving supposedly occurs) is lengthened?
This doesn’t seem to make sense. Is the article wrong?
Look, this has been discussed several times in the past. We recently had a thread about leaving an electric water heater on.
I have a background in engineering and I have calculated plenty of similar things (heat sinks, heat exchangers, etc.). The math involved is fairly simple. Can you do it? Then do it and you’ll find out. If you can’t do it, then just take the word of those who can. The math model is similar to charging a capacitor across a resistor and many other similar things. If you look in the book, it is lesson 1 of differential calculus.
It’s over-simplified. Even if the temperature never reaches outside temperature, the lowered temperature will lead to reduced heat loss. Lower temperature is always more efficient. CookingWithGas’s theory about expanding/contracting air is true, but a temperature drop of 10 degrees C will only cause a 3% drop in volume. I’m willing to bet that the heat loss through this is negligible.
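The roughly 3% figure checks out with the ideal gas law: at constant pressure, volume is proportional to absolute temperature. Assuming a starting room temperature of 20C (my assumption, purely for the arithmetic):

```python
# Constant-pressure ideal gas: V is proportional to absolute temperature.
# A 10 C drop from an assumed 20 C room temperature:
t_warm = 293.15             # 20 C in kelvin
t_cool = 283.15             # 10 C in kelvin
shrink = 1 - t_cool / t_warm
print(f"volume drop: {shrink:.1%}")
```

That works out to roughly 3.4%, so the back-of-envelope “about 3%” above is in the right ballpark.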
(Central heating is one thing I really miss about living in the US. Even if my Tokyo apartment had it, I’d never be able to afford the fuel here. I use a kerosene space heater for a couple of hours a day.)
That’s a little awkward to do with central heating, Handy. Generally speaking, you have 1 furnace and ductwork connecting it to every room of the house. When the furnace is on, hot air travels in the ducting and hence heats every room. That is more or less Central Heating by definition.