Infrared, visible light, and heat

The conventional wisdom is that we sense infrared radiation as heat. (The radiation is not heat itself but causes our tissue to heat, creating the sensation of warmth.) But why is infrared so often singled out here? Microwaves obviously cause heating as well. It seems to me that radiation of any wavelength, once absorbed, could heat atoms and molecules and cause them to radiate infrared.

What about visible light? If you could filter sunlight so only the visible wavelengths got through, would it still feel warm? Which leads me to my final question: is it possible to create a window filter so you would still see all the visible light, but it wouldn't warm the room?

Visible light can definitely heat things; the lasers used in professional light shows can easily set objects on fire. Infrared is so strongly associated with heat because most materials strongly absorb longwave infrared, and absorbing radiation is what heats them. Also, ordinary hot objects radiate most strongly in the infrared; it's not until things reach roughly 900-1000 °F that they begin to radiate visible light.
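For the curious, here's a quick Wien's-law sketch (assuming ideal blackbodies, which real skin, filaments, and glowing metal only approximate) showing where thermal emission peaks at various temperatures:

```python
# Wien's displacement law: the wavelength at which an ideal blackbody
# emits most strongly is b / T, where b is Wien's displacement constant.
WIEN_B = 2.898e-3  # m*K

def peak_wavelength_nm(temp_kelvin):
    """Peak emission wavelength (in nm) for a blackbody at temp_kelvin."""
    return WIEN_B / temp_kelvin * 1e9

print(peak_wavelength_nm(310))   # human skin, ~310 K: ~9300 nm, longwave IR
print(peak_wavelength_nm(810))   # ~1000 F: peak still IR (~3600 nm), but the
                                 # short-wavelength tail just reaches the
                                 # visible, hence a faint red glow
print(peak_wavelength_nm(5800))  # solar surface: ~500 nm, squarely visible
```

Note that even at ~1000 °F the *peak* is still infrared; the visible glow comes from the short-wavelength tail of the curve.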

If you focus sunshine through a magnifying glass onto your hand, you’ll have your answer right quick.

All light, when absorbed, produces heat, and all objects above absolute zero produce light of some form or another. The mechanism by which the surface of the Sun or the filament of a light bulb produces visible light is basically the same as that by which an animal or the walls of your house produce infrared light.

Ditto: any electromagnetic radiation turns into heat energy as it is absorbed.

I think the reason infrared comes to mind so quickly in this context is that it is a common heat-transfer mechanism between objects, and the EMR associated with a hot object is typically infrared. So it's infrared's availability to carry heat away from one body, as well as to another, plus our familiarity with calculating both of these, that gives IR so much ink in this discussion.

Sunlight has plenty of energy in the visible portion. The energy density per unit log wavelength is highest in sunlight around visible yellow light. There is actually more power there to heat your skin than there is in the infrared.
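That per-log-wavelength claim is easy to check numerically with Planck's law (a sketch assuming a ~5800 K blackbody for the sun; the grid range is my choice): the peak of λ·B_λ lands in the visible band, around 630 nm.

```python
import math

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck(lam, temp):
    """Planck spectral radiance per unit wavelength, W / (m^2 sr m)."""
    x = H * C / (lam * KB * temp)
    return (2 * H * C**2 / lam**5) / math.expm1(x)

# Power per *logarithmic* wavelength interval is lam * B_lam; scan a
# 1 nm grid for its maximum at a solar-ish temperature of 5800 K.
lams = [i * 1e-9 for i in range(200, 3001)]
peak = max(lams, key=lambda lam: lam * planck(lam, 5800))
print(peak * 1e9)  # ~630 nm: in the visible band
```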

Sure, you can make windows that block all but the visible wavelengths, and it will lead to less heating indoors, but not much less. You can also knock down the visible some and get a bigger reduction.

I’ve always been annoyed at those who popularize things by saying that Infrared = heat. As noted above, any EM radiation, if absorbed, will lead to heating. I think it’s just a shorthand way of giving some idea that infrared light is not visible, but still carries energy and has physical effects. But it hopelessly confuses people.

Another, but normally unspoken, feature of most blackbody sources used for lighting (like incandescent bulbs and the sun*) is that, although they put out a lot of visible light, the bulk of their energy is in the infrared. This might lie behind the claim that IR = heat to some degree, since if you remove the visible light, you still have IR causing heat. But it's still misleading. Besides the above, it leads to the corollary that a visible source without IR might not produce heat, like your visible-transmitting window. Of course, the visible light does cause heating. This conception also caused me problems, because it made me think of LEDs (which can be made to emit only visible light, with NO IR, unlike an incandescent bulb) as "cool" light sources. They're not; LEDs can get pretty darned hot (one I hooked up last year got so hot it melted the solder I used on it), because they're not perfectly efficient. And you can certainly use one to heat items, if you have enough of them and drive them hard enough.

*Yes, I know neither is truly a blackbody. But they’re close enough for experimental purposes.
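As a rough check on the "bulk in the infrared" claim, one can numerically integrate Planck's law (a sketch assuming blackbody temperatures of ~2800 K for a filament and ~5800 K for the solar surface; the 380-750 nm visible band and the integration limits are my choices): the filament does radiate overwhelmingly outside the visible, while the sun turns out to be closer to an even split.

```python
import math

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # SI: J*s, m/s, J/K

def planck(lam, temp):
    """Planck spectral radiance per unit wavelength."""
    x = H * C / (lam * KB * temp)
    if x > 700:  # deep Wien tail; avoid overflow, radiance is ~0 anyway
        return 0.0
    return (2 * H * C**2 / lam**5) / math.expm1(x)

def band_power(temp, lo, hi, steps=20000):
    """Trapezoidal integral of B_lambda over [lo, hi] metres."""
    dl = (hi - lo) / steps
    total = 0.5 * (planck(lo, temp) + planck(hi, temp))
    for i in range(1, steps):
        total += planck(lo + i * dl, temp)
    return total * dl

def visible_fraction(temp):
    """Fraction of radiated power in the 380-750 nm visible band."""
    visible = band_power(temp, 380e-9, 750e-9)
    total = band_power(temp, 50e-9, 200e-6)  # wide enough to catch ~all power
    return visible / total

print(visible_fraction(2800))  # incandescent filament: under ~10% visible
print(visible_fraction(5800))  # sun: roughly 40-45% visible
```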

>blackbody sources used for lighting (like … the sun) … although they put out a lot of visible light, the bulk of their energy is in the infrared.

Is that really right? The sun's emission peaks well into the visible. The amount of blackbody radiation, integrated over all wavelengths longer than some cutoff well past the peak, is proportional only to the first power of the absolute temperature. (If you integrate over all wavelengths, period, it's the fourth power; but if you are only integrating over the long-wavelength regime, it's the first power.) So the sun's surface brightness at the longer wavelengths is only about ten times higher than that of a blackbody at just 230 °C (if I did that right). That doesn't seem like much. I didn't try calculating it, though.
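For what it's worth, a brute-force numerical integration (a sketch assuming 5778 K for the sun, 503 K for the 230 °C body, and a 100 μm cutoff, which is deep in the Rayleigh-Jeans tail for both; all of these are my choices) comes out close to the first-power-of-temperature estimate, 5778/503 ≈ 11.5:

```python
import math

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # SI: J*s, m/s, J/K

def planck(lam, temp):
    """Planck spectral radiance per unit wavelength."""
    x = H * C / (lam * KB * temp)
    return (2 * H * C**2 / lam**5) / math.expm1(x)

def longwave_power(temp, cutoff, upper=0.1, steps=100000):
    """Trapezoidal integral of B_lambda from cutoff out to `upper` metres
    (the B ~ 1/lam^4 tail makes contributions beyond 10 cm negligible)."""
    dl = (upper - cutoff) / steps
    total = 0.5 * (planck(cutoff, temp) + planck(upper, temp))
    for i in range(1, steps):
        total += planck(cutoff + i * dl, temp)
    return total * dl

cutoff = 100e-6  # 100 um: well past both spectral peaks
ratio = longwave_power(5778, cutoff) / longwave_power(503, cutoff)
print(ratio)  # a bit above the naive T1/T2 = 11.5, since the colder body
              # sits slightly farther from the pure Rayleigh-Jeans limit
```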

They might not be completely cool, but they are still considerably cooler than an incandescent bulb of equivalent visible brightness.

You’re quite right – my memory played me false. Here’s a spectrum: