# Can a Room Reach Candle Temp?

This may sound like a stupid question, but I would like to hear the input of the science-oriented dopers: (Maybe I’m missing something?)

If a candle burns in a closed room (assuming just enough leakage to keep an O2 supply so it keeps burning), will the room reach thermal equilibrium with the flame temp? (Or does enough heat leak through the walls to prevent this?)

Or, is the room like a heat sink for the candle flame’s heat? A tiny point at a high temp will never cause a great rise in a large volume of air at 68-72 F. (This would mean the surface area of the walls, being rather large, does indeed permit enough leakage for the room to act as a heat sink for the candle’s heat.)

And why does a flame itself have a temperature gradient? Shouldn’t each section of the flame (which we observe as the different colors within the flame) be at some average temp?

It’s late…maybe the answer will be obvious in the morning.
For now, I WAG the candle flame must lose a significant amount of heat in that short amount of space where the flame is still blue…

I’m blowing out the candle now to say goodnight!

• Jinx

The final room temperature can be computed as follows.

T = T[sub]0[/sub] + E/(S*m), where T[sub]0[/sub] is the original room temperature, E is the total heat energy contained in the candle, S is the specific heat of the material composing the room, and m is the mass of everything that is heated.
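To put rough numbers on that formula, here is a minimal sketch for a perfectly insulated room. Every figure below (candle mass, heat of combustion, room size) is an illustrative assumption, not a value from the thread, and it heats only the air:

```python
# Sketch of T = T0 + E/(S*m) for a perfectly insulated room.
# All numbers are assumed ballpark figures for illustration.

candle_mass_g = 100.0           # assumed paraffin mass of one candle
heat_of_combustion = 42_000.0   # J per gram, approximate for paraffin
E = candle_mass_g * heat_of_combustion   # total heat energy, J

air_volume_m3 = 50.0            # assumed ~4 m x 5 m x 2.5 m room
air_density = 1.2               # kg/m^3 near room temperature
m = air_volume_m3 * air_density # mass of air heated, kg
S = 1005.0                      # specific heat of air, J/(kg*K)

delta_T = E / (S * m)           # temperature rise, kelvins
print(f"Temperature rise from one candle: {delta_T:.1f} K")
```

In reality the walls and furniture would absorb most of that energy (they weigh far more than the air), so the actual rise would be much smaller than this air-only estimate.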

I think D Simmons’ equation is for an instantaneous addition of heat energy to the room. The situation with a burning candle is more complicated because heat is flowing out of the room while the candle is burning. Equilibrium calculations will get you nowhere here; you need a steady-state approach.
In a room at equilibrium with its surroundings, the flow of heat out of the room is exactly equal to the flow of heat into the room. As soon as you light a candle, heat is being added to the room faster than it flows out, so the system is no longer at equilibrium. Until the candle burns out, the system moves toward a steady state in which the net flux of heat out of the room (BTU/hour) is exactly equal to the amount of heat the candle is putting into the room. How much the room temperature rises to reach this steady state depends on how big the room is and how well it is insulated.
Heat doesn’t ever just get sucked up and dissipated, it always goes somewhere.
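The steady-state argument above can be sketched in two lines. Both numbers here are assumptions for illustration (a candle puts out something like 80 W; the overall wall conductance of an ordinary, imperfectly insulated room is taken as 100 W/K):

```python
# Steady state: the room warms until heat loss through the walls
# matches the candle's output. Both figures are assumed ballpark values.

candle_power = 80.0       # W, rough heat output of one candle (assumed)
wall_conductance = 100.0  # W/K, overall UA value for the room (assumed)

# At steady state: candle_power = wall_conductance * delta_T
delta_T = candle_power / wall_conductance
print(f"Steady-state rise above surroundings: {delta_T:.2f} K")
```

With those assumptions the room settles less than one degree above its surroundings, which matches the everyday observation that a single candle doesn’t noticeably warm a room.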

What Squink writes is correct. I took the question to mean an insulated room where there was negligible heat flow out of the room because of the statement that there was only “leakage to provide O[sub]2[/sub] for burning.”

If heat flows out of the room then the rate of heat flow must be computed. That is a difficult task but it need not even be done. That question has been experimentally answered many times in history by candles burning in actual rooms with hardly any noticeable rise in temperature of the room.

By the way, my answer assumed that none of the material in the room changed state, i.e., went from solid to liquid or liquid to gas during the experiment.

I don’t know about candles, but if you have about 20 bunsen burners going in a cold Micro lab, the room will become noticeably warmer in about half an hour.

If I read the OP correctly, and the following are taken as true for this thought experiment, I will put forward an answer:

1. The room leaks no heat at all (is perfectly insulated).
2. You have a device that releases heat into the room at the same rate as a candle and can do so indefinitely.

In this case, the temperature will slowly increase forever; there would be no limit. In a perfectly insulated room, even one as large as a sports arena, the heat of a single candle would eventually raise the temp inside to 1,000,000 degrees and beyond.

This of course assumes that the materials of the room and the heating device are not damaged by the temperature.
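Just for scale, here is a sketch of how long the “no limit” scenario would take. With constant power into a perfectly insulated space, temperature rises linearly without bound; the arena volume and candle power below are assumptions, and only the air is counted:

```python
# With constant power and perfect insulation, temperature rises
# linearly forever: dT/dt = P / (S*m). Figures are assumed.

candle_power = 80.0           # W (assumed)
arena_air_volume = 100_000.0  # m^3, rough sports-arena scale (assumed)
air_density = 1.2             # kg/m^3
specific_heat = 1005.0        # J/(kg*K), air

mass = arena_air_volume * air_density
rate = candle_power / (specific_heat * mass)  # K per second

target_rise = 1_000_000.0     # K
seconds = target_rise / rate
years = seconds / (3600 * 24 * 365)
print(f"Time to rise 1,000,000 K: about {years:,.0f} years")
```

Under these assumptions it takes on the order of tens of thousands of years, which is why the conclusion only holds as an idealized thought experiment.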

The heat released by a candle is not at a fixed temperature. When you see a quote saying the temp of a candle flame is 1500 degrees F or some such number, it carries the unstated assumption that the fuel and air started at roughly room temperature. That could lead to the mistaken conclusion that the candle’s heat could raise the surrounding temp to at most 1500 degrees. This is not the case. What the candle is really doing is raising the temp of the burnt wax and air in the flame 1400 or so degrees above the starting point. If the starting point were 1000 degrees, the temp of the flame would be 2400 or 2500 degrees. The simple and correct way to see the problem is that in any perfectly insulated space, adding any energy (or heat) raises the temperature, and over time the temperature will rise without limit if none of the energy can escape the system.
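The “fixed rise, not fixed temperature” point above can be shown in a couple of lines. The 1400-degree rise is the figure used in the post; the ambient temperatures are just examples:

```python
# A flame's temperature is roughly a fixed rise above its intake
# temperature, not a fixed absolute value (per the post above).

FLAME_RISE_F = 1400.0  # approximate rise above ambient, deg F

def flame_temp_f(ambient_f):
    """Estimated flame temperature for a given ambient temperature."""
    return ambient_f + FLAME_RISE_F

for ambient in (70.0, 1000.0):
    print(f"Ambient {ambient:.0f} F -> flame about {flame_temp_f(ambient):.0f} F")
```

At a 70 F ambient this gives roughly the familiar 1500 F figure, and at a 1000 F ambient it gives the 2400 F the post describes.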