# turning down the air conditioner

When I leave for work in the morning, I turn down the air conditioner by 5-10 degrees. When I get home at night I turn it back up to a comfortable temperature. I do this because I have always been told that it saves electricity.

But, I’m an engineer and now that I am thinking about it, I would really like to see some proof. The AC will obviously be using less power during the day when it is turned off, but then it will have to work harder at cooling down the room when it is turned back on. It seems like there is a tradeoff here in how warm the room is allowed to get and how long you let it stay at that temperature. For instance, what if I turn off the AC and let the room go up to 100, and then the minute it gets there I turn the AC back on again and get it back down to 75? It hardly seems efficient to let the temperature oscillate so quickly like that.

I know many people have opinions about this (many have been expressed in previous GQ threads), but I am looking for a way to really compare how much power is used in various situations. Does anyone have an idea how to do this? Thank you!
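One way to actually compare the scenarios is a toy simulation using Newton's law of cooling, where heat leaks in proportionally to the indoor/outdoor temperature difference. Every constant below (the leak rate, the A/C's cooling power, the outdoor temperature, the schedule) is invented for illustration; the point is the method, not the numbers.

```python
# Toy comparison of "hold 75 all day" vs "set back to 85 while away",
# using Newton's law of cooling. Every constant here (leak rate k,
# cooling power, outdoor temperature) is invented for illustration.

def run_hours(setpoint_at, t_out=95.0, k=0.3, cool=10.0,
              hours=24, dt=1 / 60, t0=75.0):
    """Hours the compressor runs in a day under a thermostat schedule.

    setpoint_at(hour) -> desired indoor temperature (deg F)
    k    -> heat leaks in at k * (t_out - t_in) deg F per hour
    cool -> the A/C removes heat at `cool` deg F per hour while on
    """
    temp, on, total = t0, False, 0.0
    for i in range(int(hours / dt)):
        setpoint = setpoint_at(i * dt)
        if temp > setpoint + 0.5:      # simple 1-degree deadband
            on = True
        elif temp < setpoint - 0.5:
            on = False
        temp += (k * (t_out - temp) - (cool if on else 0.0)) * dt
        if on:
            total += dt
    return total

always_75 = run_hours(lambda h: 75.0)
setback = run_hours(lambda h: 85.0 if 8 <= h < 18 else 75.0)
print(f"hold 75: {always_75:.1f} h, setback: {setback:.1f} h")
```

With these particular made-up numbers the setback schedule runs the compressor noticeably fewer hours per day, even counting the long catch-up run at 6pm. To get a real answer for a real house you would have to measure your own leak rate and cooling power and plug those in.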

Heat flow is generally directly proportional to the temperature differential. As far as I know, there are no big hysteretic effects with normal materials.

Well, A/C manufacturers say: whole-house unit = leave it alone.

Small unit (room/window unit) = turn it off.

And when you say “turn down”, you mean “turn up”, right? (i.e. increase the temperature?)

It would save a lot more energy if you turn the A/C to a higher temperature during the day, so it runs less. The difference in cooling it back down to a comfortable temperature when you come home is negligible. Here’s a quote from my power company:

Shoeless : Yes. When I said “turn it down”, I am referring to the activity of the unit, which corresponds to moving the thermostat up. Thank you for the clarification.

Squink : I’m sorry, but I don’t have any idea what your post means. Can you translate for someone who does not have background in heat flow, etc.?

I think what Squink’s saying is that, regardless of how warm they seem on the surface, objects in your house don’t actually heat up that much as the air in the room heats up. Most of the things in your house are wood, plastic, or cloth, which are all insulators…

You can treat the heat flow as a steady-state problem. You know, heat flow in equals heat flow out. When you toss an air conditioner into the mix, you’re paying to increase the rate at which heat flows out of your house. Since a steady state is reached no matter where you set your thermostat, the rate at which heat flows into your house must increase as the internal temperature gets lower. A larger temperature differential implies a larger heat flow.

The system would be hysteretic if cooling an object from one temperature to another released more energy than would be needed to heat the object from the lower to the higher temperature. That doesn’t happen with reasonable objects like bricks and mortar. If it takes 20 kcal to heat an object up 10°, you have to remove 20 kcal to cool the object by 10°. There’s no energy penalty for heating vs cooling an object.

So, what do we have?
Heat in equals heat out.
The greater the temperature differential, the greater the heat flow.
You PAY for one of the heat flows.
There’s no energy penalty for cooling things down or heating things up.

Does that make more sense?
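To put rough numbers on that summary: at steady state the A/C must remove heat exactly as fast as it leaks in, and the leak rate scales with the temperature differential. The conductance figure below is invented purely for illustration.

```python
# Steady-state cooling load at two setpoints. k is a made-up whole-house
# conductance (BTU per hour per degree F of indoor/outdoor difference).
k, t_out = 500.0, 95.0

def steady_state_load(t_in):
    """Heat the A/C must remove per hour to hold t_in (BTU/hr)."""
    return k * (t_out - t_in)

load_75 = steady_state_load(75.0)   # 500 * 20 = 10000 BTU/hr
load_85 = steady_state_load(85.0)   # 500 * 10 = 5000 BTU/hr
print(load_85 / load_75)            # holding 85 is half the load: 0.5
```

So whatever the actual conductance of your house, halving the indoor/outdoor differential halves the heat the A/C has to pump out, and you pay for that pumping.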

Philster, with regard to the manufacturers’ recommendation to leave whole-house units turned on, I’ll bet that has to do with the kinetics of cooling rather than energy efficiency. If you let your house and its contents warm up to 95°F in the daytime, it can be a long, uncomfortable wait for the air conditioner to remove that heat when you get home.


I think you are indeed saving electricity here.

The AC is turned up, so it runs less during the afternoon, the hottest part of the day. Then you set it back to a cooler temperature when you come home, which is probably late afternoon or evening, when the day is already starting to cool down.

The AC would use a lot more electricity to keep the house cool during the hot part of the day than it will use to cool it down when you come home later in the day.

We leave it at 80 degrees for the cats, and then lower it to around 75-ish when we get home. If we’re lucky, it’s cool enough outside to open the windows.

Our cats have repeatedly asked that we turn it down further, but since they’re not paying rent, they have to learn to deal with it.

I have been told that a good estimate of the savings is the time the unit spends holding the setback temperature at steady state.

If you have it at 75F at night, and set it at 85F when you go to work at 8am, your savings really don’t begin till the temp of the house gets to 85+ and the compressor cuts in. Depending on your house and the outside temp, this might not happen till 11:30am.

Now you are saving some money by running your A/C at 85 as opposed to 75.

When you get home at 6pm and put the A/C back to 75, the extra work by the A/C is about equal to the resting it did from 8 to 11:30.

So, in this example, you will save about what it would cost to run the A/C from 11:30 till 6 at 85 as opposed to 75.
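A quick back-of-envelope check of that rule of thumb, assuming heat leaks in proportionally to the indoor/outdoor difference and electricity use tracks the heat removed (times and temperatures taken from the example above, with an assumed 95°F outside):

```python
t_out, t_low, t_high = 95.0, 75.0, 85.0

cost_75 = t_out - t_low    # relative cost per hour of holding 75 (20 units)
cost_85 = t_out - t_high   # relative cost per hour of holding 85 (10 units)

held = 6.5                 # 11:30 to 6:00, actually holding 85 instead of 75

# Per the rule of thumb, the catch-up work at 6pm roughly cancels the
# free coasting from 8:00 to 11:30, so the net savings is just the
# difference while holding the higher setpoint:
savings = (cost_75 - cost_85) * held     # 65 units
full_workday_at_75 = cost_75 * 10.0      # 8:00 to 6:00 held at 75: 200 units
print(savings / full_workday_at_75)      # about a third: 0.325
```

So under these assumptions the setback trims roughly a third off the workday cooling cost, which is in the same ballpark as the simulation approaches above.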

I must be the only one on the boards who keeps the house cold. I’ve gotten complaints for years about it. I’m not comfortable unless the house is between 65 and 67 degrees. I’m in Arkansas so this costs me a bunch during the summer…I seem to save a lot in the wintertime though.

-K

Of course, the equation is far more complex than originally posed. Air conditioning confers another key benefit that the OP misses: dehumidification.

During summer months in most locales (where A/C is needed), setting back your A/C to, say, 85 degrees allows humidity levels to soar inside one’s house. A combination of high heat and high humidity inside a “closed box” provides a fertile breeding ground for all sorts of nasties that are contraindicated for humans, especially for people with allergies or sinus problems.

A complete answer must address issues outside of physics/thermodynamics. Rarely is there a free lunch.

If the AC unit is powered off a standard 120v wall outlet you could use something like this meter to measure the electricity used.

Given two days of like weather, you could set up the AC one day as you normally do, and the other day leave the AC on. Then compare the numbers.

Like most others here, I suspect that your current practice uses less electricity.