Where should I set the day thermostat temp for max efficiency?

I’m trying to figure out the most efficient method of heating our house, and I’m stuck. On the one hand, I can set the thermostat at around 64 while we’re out for most of the day and have it turn up to, say, 72 degrees at around 4:00, so it will be comfortable when we come home. But when it’s very cold out, the heater runs for a long time to get the temperature back up. (We’re very poorly insulated, so the apartment gets very cold, including the walls, furniture, etc., and no, we can’t insulate the place any better.) On the other hand, I could set the temp at about 70 for the day, let the heat kick in periodically, and then when we come home, it wouldn’t be so hard to get the temp up to a comfortable level. How do I find the right balance? What information do I need to gather? Any help? Thanks, dopers.

We have to start with the basics. For the most part, a house loses heat to the outdoors at a rate proportional to the difference between the indoor and outdoor temperatures. This means that, in general, if it is, say, 40 deg F outside, holding 70 deg F inside takes about 50% more energy than holding 60 deg F, because the indoor-outdoor gap is 30 degrees instead of 20. That is just in general, and wind, building design, solar gains, etc. can make a big difference, but it is a good starting point, and it does explain why you would want to back off on the heat while you aren’t there.
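If you want to see the arithmetic spelled out, here’s a tiny Python sketch of that rule of thumb. The 40/60/70 numbers are the same made-up ones as above; a real house’s heat-loss coefficient would have to be measured, so treat this as an illustration of the proportionality, not a prediction for any particular building.

```python
# Rule of thumb from the post above: heat loss rate is proportional to the
# indoor-outdoor temperature difference. Numbers are illustrative only.

OUTSIDE = 40.0  # outdoor temperature, deg F (made up for the example)

def relative_heat_loss(inside, outside):
    """Heat loss rate in arbitrary 'loss units' per hour, proportional to the gap."""
    return inside - outside

loss_at_60 = relative_heat_loss(60.0, OUTSIDE)  # gap of 20 degrees
loss_at_70 = relative_heat_loss(70.0, OUTSIDE)  # gap of 30 degrees

print(f"Holding 70 costs {loss_at_70 / loss_at_60:.0%} of what holding 60 costs")
# -> "Holding 70 costs 150% of what holding 60 costs", i.e. 50% more fuel
```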

Basically, any decrease you make will save heating fuel, because you will decrease the rate of heat loss for that period of time. Comfort, plus wear and tear on furnishings, fixtures, and heating equipment, usually determines how much of a setback to use.
I have seen different rules of thumb for this. Personally, I wouldn’t drop the temp any lower than my furnace could recover from (on a cold day) in 45 minutes to an hour, and less than that most of the time.

no dragons - this is a good start. Where, exactly, WOULD one find one of those rules of thumb you’re referring to? Because I can see a reason in favor of heating during the day. Say my furnace goes on 15 times during the day for 2 or 3 minutes each in order to keep the temp at 72. When I get home, it keeps going at the same rate. But if I let the temp go down to 64 during the day, it might take 40-45 minutes to get back up to 72, and since everything is so cold, it might cycle on several more times just to hold that temp. So maybe I need to monitor the on and off cycles during a cold day and make the call from that info?

This seems to be your misunderstanding. Many people think that it will take more energy to bring the temperature back up to some point than it takes to keep the temperature at that point. This is simply not true.

If you lower the temp to 45 degrees during the day and then set it back up to 70 after a few hours, it may seem like the furnace will run more to heat things back up again, but overall it doesn’t. If you were to chart the on and off times while A) holding the temperature steady at 70 degrees, and B) letting the temp drop to 45 for several hours and then heating everything back up to 70, the furnace would run more overall in situation A (just keeping the temp more or less steady). I know it doesn’t seem like it, but it’s true.
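If it helps to see that play out, here’s a rough back-of-the-envelope simulation of those two schedules as a Python sketch. Every number in it (outdoor temperature, heat-loss rate, furnace output, an 8-hour workday) is invented for illustration, not a measurement of anyone’s actual house, but with heat loss proportional to the indoor-outdoor gap the comparison comes out the way described above: the hold-at-70 schedule racks up more furnace-on hours than the setback schedule.

```python
# Minimal bang-bang thermostat simulation of the two scenarios, assuming heat
# loss proportional to the indoor-outdoor difference. All constants are made up.

OUTSIDE = 20.0   # deg F outdoors (assumed)
K = 0.1          # fraction of the indoor-outdoor gap lost per hour (assumed)
HEATER = 8.0     # deg F per hour the furnace adds while running (assumed)
DT = 0.01        # simulation time step, hours

def furnace_hours(schedule, hours=24.0, start_temp=70.0):
    """Total furnace-on hours for a schedule of (start_hour, setpoint) pairs."""
    temp, on_time, t = start_temp, 0.0, 0.0
    while t < hours:
        # Current setpoint: the last schedule entry whose start time has passed.
        setpoint = [sp for (start, sp) in schedule if start <= t][-1]
        loss = K * (temp - OUTSIDE)          # deg F per hour leaking outdoors
        if temp < setpoint:                  # thermostat calls for heat
            temp += (HEATER - loss) * DT
            on_time += DT
        else:                                # furnace off, house coasts down
            temp -= loss * DT
        t += DT
    return on_time

hold    = furnace_hours([(0.0, 70.0)])                            # scenario A
setback = furnace_hours([(0.0, 70.0), (8.0, 45.0), (16.0, 70.0)])  # scenario B

# With these made-up constants, the setback schedule uses noticeably fewer
# furnace-hours over the day, even counting the long recovery run at 4:00.
print(f"hold at 70: {hold:.1f} furnace-hours, 8-hour setback: {setback:.1f}")
```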

That is what No Dragons Here was telling you. From a strictly fuel savings standpoint, any decrease for any length of time will help. The more you lower the temp, or the longer time you lower it for, the more fuel you save. The only “efficiency” to consider is your comfort, the effects of heat cycling on furniture, etc., just as he says.

Thank you, RJK - that’s exactly what I wanted to know. Just to make sure I understand - as you explain it, my hypothetical example would not occur, right? Is this another way to say it?: The number of heating minutes (time the furnace is running) required to keep the temperature steady is always greater than the number required to bring it back up to that temperature. This is sounding suspiciously like thermodynamics. hmmm

Yes, that should be correct.

I say “should” because I guess it’s possible that your furnace might have a variable output of some kind, and for some reason be significantly more efficient running “throttled down” than on full blast. In that case, it might be possible for the increased efficiency of fuel use in the furnace to overcome the higher rate of heat loss when the home is at the higher temperature. Mind you, I’ve never seen a residential heating unit like that, but I did get out of the business 15 years ago. I still think your statement is correct for virtually every normal case.

And you’ve got that right as well, I think. :slight_smile: