My physics classes were at least 25 years ago and I slept through them. However, my hazy understanding of thermodynamics says that cold is the absence of heat, i.e. movement of atoms. The less they move, the colder something is.
So, I guess, heat radiates because the atoms that move interact with surrounding atoms. But since cold is things not moving, there can’t be interaction. Is this correct?

Sure, cold things radiate energy, as long as they're above absolute zero.

And energy will be transferred from something cold to something even colder.

Having lived in Rochester, N.Y., I can swear that it feels as if cold radiates, but I’ve always understood this to be heat radiating in the other direction. Sorta like the darkon theory of light.

Okay, first off, ‘atoms that move interact with surrounding atoms’ does not really describe the radiation of heat… that is conduction. If you put a pan of cold water on a hot element of an electric stove, for example, heat is efficiently conducted from the element to the pan, from the pan to the water, from the pan to the surrounding air, and so on. Air does conduct heat, but not terribly well.

Cold does not ‘conduct’ in the same way that heat does, but it can seem to because a cold object serves as such a good heat sink. Heat is conducted from everything nearby into the cold object, making it less cold and everything else cooler.

Radiation works differently, not atoms directly interacting with each other, but atoms sending off tiny little energy particles that make whatever they hit warmer. (And whatever has radiated will be cooler because it has lost that energy.)

Cool does not precisely ‘radiate’ either, but again there is a psychological perception that it does, because it is not radiating heat as intensely as nearby warmer objects, but is absorbing radiation from them. Everything at room temperature, for instance, is constantly radiating infrared light, bouncing back and forth, and the net effect of this low-level radiation generally cancels out. If you put a big block of cold metal into a room where everything else is at normal temperature, though, that block will be radiating much less than everything else, and this (along with the fact that air is conducting heat into it) will make you feel cooler when you stand a few yards away from it.

Hope that this helps.

And the surroundings are at a LOWER temperature.

The amount being proportional to the difference of the 4th powers of the absolute temperatures.

Do you have a cite for that?? I thought that even an object surrounded by higher temperatures would radiate, just that it would, on average, absorb more radiation than it emitted.

That’s right. What spingears meant, I believe, is that there’s a net loss of heat by radiation if the surrounding objects are colder.

Isn’t it true that you can think of the “cold” radiating, even though it’s really just the absence of heat radiating? If you were placed in a cold-walled room (let’s say with no air, so that you can ignore conduction and convection), it would suck the heat out of you, just as a warm room would add heat to you. That’s because you radiate heat to the walls, and they radiate heat back to you, but since the cold walls radiate less heat, there is a net heat transfer from you to the cold walls, exactly opposite to how warm walls would have a net radiation of heat to you.

I didn’t mean radiation, as in anything having to do with nuclear physics.

It’s just that if I take a popsicle out of the freezer, it’s 20C below freezing, and about 40C below room temperature. I have to almost put my hand on it to feel that it’s cold. Moving to something that’s 40C above room temperature, I’d feel that from further away.

And that room with no air - vacuum is used as insulation in thermos bottles. Why doesn’t that function as a heat sink (which vacuum does in space, no?).

Roughly speaking, there are three ways of heat transfer: conduction, convection and radiation.

Conduction means if two objects are in contact, there’s a net flow of heat from the hotter object to the colder one. As a result, the hotter object gets cooled down and the colder object is warmed up. So in a sense, “cold” can be transferred by conduction.

Convection means hot objects transfer heat to the surrounding air, and the air moves around and carries that heat to other objects. Again, this results in a hotter object getting colder, and vice versa.

Radiation is as described above. All objects radiate heat (usually in the form of infrared radiation). If a hot object is surrounded by colder surfaces, the hot object loses more heat by radiation than it gains, so there’s a net loss of heat.

Thermos bottles are designed to minimize all three methods of heat transfer. A thermos is a double-layered glass bottle, with the inner bottle supported only at its mouth; this minimizes the heat path for conduction. The space between the two layers is evacuated (pumped down to a vacuum), which eliminates convection (and conduction through the gap). And the glass layers are coated with metal, which emits very little infrared radiation and therefore minimizes radiative heating/cooling.

As for why you don’t feel “cold” when you put your hand near a cold object, part of the reason is that radiation goes as the 4th power of temperature, as already mentioned. The net heat transfer is proportional to the difference between the 4th powers of the objects’ temperatures. So radiative heat transfer between a 340 K object and a 300 K object is about 50% stronger than between a 300 K object and a 260 K object
((340[sup]4[/sup] - 300[sup]4[/sup]) / (300[sup]4[/sup] - 260[sup]4[/sup]) ≈ 1.5).
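If anyone wants to check that arithmetic, here's a quick sketch in Python (plain temperature ratios only; emissivity and geometry factors are ignored, since they cancel in the comparison):

```python
# Net radiative heat transfer between two surfaces scales with the
# difference of the 4th powers of their absolute temperatures
# (Stefan-Boltzmann law).

def net_radiation_ratio(t_hot1, t_cold1, t_hot2, t_cold2):
    """Ratio of net radiative transfer for two pairs of temperatures (kelvin)."""
    return (t_hot1**4 - t_cold1**4) / (t_hot2**4 - t_cold2**4)

# Hand near a hot object (340 K object vs. 300 K room/skin) compared with
# hand near a cold object (300 K room/skin vs. 260 K object):
ratio = net_radiation_ratio(340, 300, 300, 260)
print(round(ratio, 2))  # -> 1.49, i.e. roughly 50% stronger
```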

Also, hot air rises, so if you put your hand above a hot object you get a direct blast of hot air. Cold air sort of dissipates towards the ground. And many “hot” objects we encounter have lots of water (e.g. hot coffee, freshly baked pizza), and steam transfers heat extremely well - it releases heat as it condenses on your hand. And many other “hot” objects around us are significantly hotter than body temperature (toaster oven, frying pan, etc).

Nothing to do with nuclear physics… but more conventional physics, yes.

Any object will give off ‘thermal radiation’… the heat of the object determines how much it radiates and at what part of the spectrum. Think of a piece of metal glowing red-hot or a light bulb filament when the light bulb is turned on. Our furniture (and ourselves) don’t give off radiation in the visible part of the spectrum, but they still radiate. (Apparently it has to do with the fact that all matter is made up of charged particles in constant motion, and accelerating electric charges emit electromagnetic radiation.)

Hope this helps clarify what I meant by radiation.

Not really. There is no “absence of heat radiating” at a temperature above the minimum of 0 K. Everything above that temperature radiates to some extent. If the thing is a perfect radiator/absorber (black body) it radiates according to the Stefan-Boltzmann equation: I (radiation intensity) equals a constant multiplied by T[sup]4[/sup] (absolute temperature). The value of the constant depends upon the units used for I and T.
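For concreteness, here's that equation in a few lines of Python (assuming a perfect black body, i.e. emissivity 1, with the constant in SI units):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

def blackbody_intensity(temp_kelvin):
    """Radiated power per unit area of a perfect black body, in W/m^2."""
    return SIGMA * temp_kelvin**4

print(blackbody_intensity(300))  # room temperature: ~459 W/m^2
print(blackbody_intensity(260))  # a "cold" object: still radiating, ~259 W/m^2
print(blackbody_intensity(0))    # only at absolute zero does it reach 0.0
```

Note that even the cold object is pouring out hundreds of watts per square meter; it just receives more than it emits when warmer things surround it.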

For “absence of heat radiating”, try substituting “relative scarcity of radiated heat”?

>Isn’t it true that you can think of the “cold” radiating, even though it’s really just the absence of heat radiating?

Yes, this is true. I have worked this from different starting points and am sure of the conclusion. In fact it is fun to try to think of an experiment that would disprove the hypothesis that cold radiates.

There are limits as to how hot and how cold things can get, but our practical experiences are much closer to the cold limit. This is the closest thing I can find to a difference in the hot/cold symmetry. You can also do quantum experiments with thermal radiation photons, and the corresponding ‘cold photon’ things don’t work. So for instance putting a ‘cold radiator’ near a metal target wouldn’t create a photoelectric effect on it. I think these are pretty obscure details that might not be obvious to someone developing a reasonable theoretical system of radiant cold.

Light is a good metaphor for this (as well it should be, since the two are EMR with wavelengths about an order of magnitude apart). I have a thermographic camera that sees thermal radiation, and have played with it in a variety of settings, and I have the same difficulty with heat that you’d have with light in proving that darkness doesn’t radiate.

Electrons and the plus and minus of electricity are another good metaphor.

Can anybody here describe an experiment that would make you say cold doesn’t radiate? Theoretical explanations that start with the assumption that the thing that radiates is heat aren’t really fair, are they?

Point a thermographic camera at a cold object, and then start accelerating towards the object. If the object were radiating “cold,” you should get more “cold” flux as you move faster towards the object, and therefore the temperature reading should drop. In reality the temperature reading would increase, because the flux of “heat” from the object will increase (i.e. blueshift of infrared light).
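As a rough sketch of that thought experiment, here's the head-on case in Python. It leans on the rule that a black-body spectrum Doppler-shifts into another black-body spectrum with the temperature scaled by the relativistic Doppler factor (the function name and the 10%-of-c speed are just for illustration):

```python
import math

def doppler_shifted_temperature(temp_kelvin, beta):
    """Apparent temperature of a black body approached head-on at speed
    beta = v/c; the relativistic Doppler factor blueshifts the spectrum."""
    return temp_kelvin * math.sqrt((1 + beta) / (1 - beta))

# A 260 K "cold" object viewed from a camera rushing toward it at 10% of c
# reads *hotter*, not colder:
print(round(doppler_shifted_temperature(260, 0.10), 1))  # -> 287.4
```

So if the object were emitting "cold," approaching it faster should make it read colder; the blueshift prediction goes the other way, which is the point of the experiment.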

What do you mean by cite?
It’s fundamental physics/engineering.
There is no such thing as “COLD” - an object may be colder than another; it is a comparative term.
You and the OP need to review the fundamentals.
Heat always flows from one region to another of LOWER temperature, absent a “heat pump” type device.

If you were in an ice cave you might be misled to believe the cold feeling was being radiated by the surroundings. In reality you feel colder because you are radiating heat to the surroundings and sense the drop in body temperature.

Okay… wtf?? Where in my request for a cite, or anywhere else, did you see me arguing for a literal ‘radiation of coldness’?? That wasn’t even something you were denying directly before. The only line there that even comes close to my point is the line about heat always flowing to a region of lower temperature.

I was asking for a cite on the implied statement that ‘a [cold] object will not radiate heat energy unless it is surrounded by colder things.’ I was taking that literally, and it seemed inconsistent with what I knew of thermal radiation. scr4 suggested that you were speaking generically, saying that there’s no net loss of energy due to radiation if the surrounding objects are hotter.
Would you care to clarify your point without building straw men that have nothing to do with the statements I have made in this discussion??

My position is that literally anything above absolute zero will radiate thermal energy. Just about anything made of matter will also absorb radiant thermal energy, as long as there is something else in its universe. Whether there is a net gain or loss of energy through these two effects does not change the truth of the statements.

chrisk - don’t bother with spingears’ sense of superiority. It often shows up in his posts in GQ, and IIRC has resulted in at least one pitting. It’s just not worth the trouble though

Cecil’s classic column on heat transfer.

Ahh, okay. I see you have personal experience with one of those pittings.

As far as being worth it… well, I think it was worth it to defend my own position, especially since I didn’t realize he was considered to be something of a ____. (fill in appropriate term.)

and now…

(deliberately puts on pseudoscientist hat)

See, what you don’t realize is, cold already radiates so fast that you don’t get more of it hitting you as you travel towards the cold thing… in fact, then it’s coming at you so fast that most of it goes right through your camera… and it knocks some of the cold that was already there away as well. Yeah, that’s the ticket.

(/pseudoscientist)