(I have a long-standing bet with a friend on this subject, and I’m new to this board, so I’m going to test your patience . . . .)
My understanding is that the earth constantly radiates infrared heat energy into outer space. (If it didn’t, we would all cook from retained geothermal and solar heat.)
One artifact of radiational cooling is that on clear nights, dark surfaces (which radiate heat more efficiently), such as a car’s roof or a nice grass lawn, can reach temperatures below 32F and form frost, even when the ambient air temperature is well above freezing. This is a common occurrence in the southwest.
First, is my understanding of this effect correct?
Second, does anyone know of a “backyard” experiment to prove it? I think there was a description of such an experiment published in “The Amateur Scientist” column in Scientific American magazine, perhaps in the 1970s. But I have been unable to find it.
Your understanding is perfectly correct. The earth receives heat primarily from sunlight; this is balanced by the infrared emitted from the surface. The amount of heat input is constant, while the rate of radiative cooling depends on temperature - so at some temperature they balance.
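If you want to put rough numbers on that balance, here's a quick back-of-the-envelope sketch (my own numbers - standard textbook values assumed for the solar constant and albedo, not anything from this thread):

```python
SIGMA = 5.670e-8       # Stefan-Boltzmann constant, W/(m^2 K^4)
SOLAR_CONSTANT = 1361  # W/m^2 at the top of the atmosphere (assumed textbook value)
ALBEDO = 0.3           # fraction of sunlight reflected back to space (assumed)

absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4   # averaged over the whole sphere
t_eq = (absorbed / SIGMA) ** 0.25              # temperature where emission = absorption

print(f"absorbed flux: {absorbed:.0f} W/m^2")
print(f"equilibrium temperature: {t_eq:.0f} K ({t_eq - 273.15:.0f} C)")
# ~255 K; the real surface is warmer thanks to the greenhouse effect, but the point
# stands: emitted infrared rises with temperature until it matches the input.
```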
As for an experiment, what exactly are you trying to show? Showing frost on a car hood is a perfectly good experiment. Do you need to measure the actual rate of heat loss? Or are you trying to be more rigorous in showing that there is no other possible heat loss mechanism besides radiation?
I think he is trying to prove that the air temp can be above 32F (let’s say 38F) but on a clear night frost can form - and that this is due to heat loss through radiation.
But this example has a problem - you also have heat loss through evaporation and sublimation. Dew forms on the surface as the night cools and the air can’t hold that much water - this warms the surface slightly. When the air temp stabilizes, the amount of dew condensing equals the rate of evaporation and the surface reaches a steady temp. When it starts warming up, the water evaporates and cools the surface, and that could be enough to start freezing some of the water.
If there are at least 3 different factors contributing to frost formation, there isn’t a simple backyard experiment that can isolate ONLY ONE of them. You have to either 1. add hardware to control the experimental environment, or 2. make MANY observations under MANY conditions, organize the data into categories for analysis of the different factors, and discover the mathematical relationships within the categories.
Your experiment to prove that frost can form while the air temperature is above freezing? Sit in your lawn chair overnight next to your car with a thermometer, make periodic temperature notes, and note when you first see frost. Of course, I think a thermometer accurate to less than 1 degree might be pretty pricey, not to mention one that reacts quickly enough. If it takes a minute or more to respond to a small temperature change, it’s easy to conceive that there could be a brief dip in air temperature, from slightly above freezing to below and back, in less time than it would take to register. And during that time, perhaps frost forms.
I’m not suggesting this needs to happen for frost to form, but to point out the complexities you need to consider in order to “prove” something.
**This is incorrect.**
Radiative cooling cannot cause objects to cool below the ambient temperature. Heat will flow from the ground or the air to maintain the temperature of any radiating object. To cool an object below ambient you need to do work.
Grass, or the roof of a car, will cool to air temperature very quickly compared to, say, a sun-warmed rock. It is this rapid cooling that speeds frost formation on such surfaces. You may see frost on a car when the air temperature is above freezing, but, unless the car is hooked up to a refrigerator, that frost formed when the air temperature was below freezing.
In reply to “One artifact of radiational cooling is that… dark surfaces… can reach temperatures below 32F… when the ambient air temperature is well above freezing.”,
yojimboguy said “This is incorrect. Radiative cooling cannot cause objects to cool below the ambient temperature. Heat will flow from the ground or the air to maintain the temperature of any radiating object. To cool an object below ambient you need to do work.”
I think yojimboguy is the one who is incorrect in that “below ambient” is a more complicated qualifier than it first appears, and is misused in this context. An object whose exposure is mostly upward can see lots of sky and little ground, and the ambient temperature of the sky for radiative purposes is very very low. For a flat object exposed to the sky and well insulated from below, and specifically in the context of radiative transfer, “ambient” is some weighted combination of sky and ground that might be dominated by cold sky, and the object could easily get much colder than the ground and the air. Note also that still air is a pretty good insulator and does not convect naturally when transporting heat downward; moreover air is almost perfectly transparent through the infrared (depending on its water and CO2 content), so it does not participate in radiative transfer.
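To put rough numbers on that “weighted combination” idea, here's a little sketch; the sky and ground temperatures are assumptions of mine, and it ignores conduction and convection entirely:

```python
def radiative_equilibrium(t_sky_k, t_ground_k, sky_fraction):
    """Equilibrium temperature (K) of a surface exchanging radiation only with sky and ground."""
    t4 = sky_fraction * t_sky_k**4 + (1 - sky_fraction) * t_ground_k**4
    return t4 ** 0.25

t_sky = 230.0     # assumed effective clear-sky temperature, K (varies a lot with humidity)
t_ground = 278.0  # ground and air around 5 C

for f in (0.5, 0.9, 1.0):  # fraction of the surface's view that is open sky
    t = radiative_equilibrium(t_sky, t_ground, f)
    print(f"sky fraction {f:.1f}: {t - 273.15:6.1f} C")
# The more of its view that is cold sky, the colder the surface can settle -
# well below the air temperature, with no refrigerator involved.
```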
BTW when ImInvisible said “One artifact of radiational cooling is that on clear nights, dark surfaces (which radiate heat more efficiently)…” - this isn’t quite right either. When we call a surface “dark” we generally mean that it absorbs visible light, of wavelength around a third to two thirds of a micrometer. But what matters more when an object at ambient temperature is radiating is how well it absorbs “light”, or electromagnetic radiation, of around 10 micrometers. Snow, for example, is very absorptive of 10 micrometer radiation but very light as regards visible light; ditto for Caucasian skin. Being visibly dark, however, does make a surface absorb sunlight well, and heat by that radiative transfer - it’s just that in that case the radiator is several thousands of degrees hotter and radiates at short visible wavelengths.
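For what it's worth, Wien's displacement law puts numbers on those two wavelength regimes. A trivial sketch (standard constant, my own choice of temperatures):

```python
WIEN_B = 2898.0  # Wien's displacement constant, micrometer * kelvin

for label, t_k in [("sunlight (source ~5800 K)", 5800.0),
                   ("a room-temperature object (~290 K)", 290.0)]:
    peak_um = WIEN_B / t_k
    print(f"{label}: emission peaks near {peak_um:.1f} micrometers")
# ~0.5 micrometers (visible) for the sun versus ~10 micrometers (thermal infrared)
# for everyday objects - so looking "dark" in visible light says little about
# how well a surface emits or absorbs at 10 micrometers.
```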
Get a couple of identical metal plates of some kind, two identical* thermometers, and a can of flat black spray paint. Spray paint one side of each metal plate black, and let dry. Tape the thermometers to the metal side. Aluminum tape would be best, but any tape should work. On a clear night, put the two plates outside, horizontally a few feet above the ground, with open sky above them. One should have the black side up, and the other the metal side up. Read the temperature on each periodically over night. Repeat the next night, switching which plate has the metal side up. Try this on an overcast night also.
*To guard against the thermometers reading differently, tape the two thermometers together, hold them in water at different temperatures, and read both. If necessary, mark one thermometer, use it as the standard, and figure out how much you have to add to or subtract from the other one’s reading to get them to match. That amount may vary with temperature, but probably not much.
The plate with the black side up should measure a cooler temperature than the one with the metal side up, especially on clear nights.
Be sure to report back here with whatever results you get (and especially if you need help interpreting the results).
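If it helps to know roughly what to expect, here's a crude model of the two plates; every number in it (sky temperature, convection coefficient, emissivities) is an assumption of mine, not a prediction from anyone in this thread:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def plate_temperature(eps, t_air, t_sky, h=10.0):
    """Solve eps*sigma*(T^4 - T_sky^4) = h*(T_air - T) by bisection between T_sky and T_air."""
    lo, hi = t_sky, t_air
    for _ in range(60):
        mid = (lo + hi) / 2
        net = eps * SIGMA * (mid**4 - t_sky**4) - h * (t_air - mid)
        lo, hi = (lo, mid) if net > 0 else (mid, hi)
    return (lo + hi) / 2

t_air, t_sky = 276.0, 255.0  # ~3 C air; assumed effective clear-sky temperature
for label, eps in [("black-painted side up (emissivity ~0.95)", 0.95),
                   ("bare metal / foil side up (emissivity ~0.05)", 0.05)]:
    t = plate_temperature(eps, t_air, t_sky)
    print(f"{label}: roughly {t - 273.15:.1f} C")
# With these guesses the black side dips a few degrees below freezing while the shiny
# side stays close to the air temperature - the difference the experiment looks for.
```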
oops, i was gonna reply, and then decided i’d demonstrated my ignorance too many times already. But, apparently I needed at least one more. say la vee…
Napier, I did not make such a claim. I was quoting an earlier statement by Squink, hit a wrong button, and reposted the quote without my own response.
ZenBeam’s description is what I would have suggested as well. One thing I’d change is to put the plates on an insulator instead of in the air - a thick piece of styrofoam should do it, preferably larger than the metal plate. Also, real-world metal surfaces have a fairly high emissivity (how efficiently a surface can radiate infrared) because of oxidation. So I’d either polish the metal side of each plate, or cover it with aluminum foil.
Alternatively, have both plates black side up, but put one under some sort of covering so it can’t see the sky. I don’t mean put a blanket on it - a tall canopy that blocks much of the sky is enough.
We seem to be getting into a waffle over how well things are insulated from the ground here. There’s no question that radiative cooling occurs; but the OP’s claim of “frost, even when the ambient air temperature is well above freezing” would require that the grass or car roof be very well isolated from conductive heating. Last I checked, there was not 6 inches of styrofoam under my lawn. This site describes a mechanism whereby the ground radiates to the sky and the lowest few meters of air are cooled by the ground to the point where dew, fog or frost can form.
The problem is the measurement of ambient temperature. Weather readings are taken at a height of between 1.25 and 2 metres above the ground in shelters called Stevenson screens. This reading is called the “surface temperature”. Closer to the ground, the temperature can be lower than that surface temperature.
This is empirically wrong. Many have been the days when frost formed on my car, both the metal and glass, when the ambient temp was well above freezing in the morning, and the ambient temp never dropped to 32 F. I believe those surfaces lose heat readily (good conductors) and do get cooler than the surrounding air. The molecules in the warmer air move more rapidly than those in the cooler metal or glass, but metal or glass is not air; hence, the heat is not evenly distributed.
That is just a WAG, but I know that frost will form and does form when the air temp is above freezing. It even forms on roadways when the air temp is above freezing.
This sounds logical, but… If (for example) the ambient temperature is 35 degrees, something must cool below the current ambient temperature first. Otherwise, the temperature would never drop below 35 degrees. Or does everything lose heat to radiation simultaneously?
I thank you all for your interesting and informative responses. As you can see, this subject (simple as it is) can get quite controversial.
In my original post, I mentioned something about a “backyard experiment” that may have been published many years ago in Scientific American.
As I recall, the gist of it was: If you took a perfect parabolic mirror and pointed it at the sky on a very clear, dark night, and put a thermometer’s bulb or sensor at the mirror’s focal point, you could get astonishingly low readings. The article might have suggested insulating the back of the mirror to minimize heat radiated up from the ground.
Essentially, the article suggested that you would be measuring the “temperature” of outer space, which is essentially absolute zero (or “nothing”). Absent bothersome sources of radiation such as stars, galaxies, and the moon, the parabolic apparatus would radiate heat energy into space endlessly, while receiving very little back. Thus the thermometer’s reading would drop and drop and drop.
I think the “backyard” experiment involved two calibrated thermometers. One would be “hung in the air” and sheltered from direct exposure to the sky (I once placed one under a wooden picnic table to approximate this) to measure the ambient air temperature. The other would be placed at or near the “focal point” of a stainless steel kitchen bowl, which sat on top of a box filled with blankets (to insulate it from ground heat) and was stretched across with saran wrap (to keep warm air from circulating into it). As I recall, I was able to get a differential of a few degrees Fahrenheit trying this, but nothing too impressive. But finding the “focal point” of salad bowls is an inexact science, to put it mildly.
Anyway, thanks again for your input. Any more would be most welcome.
Just one snit: In an otherwise informative response, Napier wrote: “Note also that still air is a pretty good insulator and does not convect naturally when transporting heat downward; moreover air is almost perfectly transparent through the infrared . . . .” Aren’t “pretty good insulator” and “almost perfectly transparent through the infrared” opposites??? Or am I missing something?
It looks like you remember enough details to do the experiment. The mirror doesn’t have to be parabolic, though; it just has to be big and concave. Ideally you want polished or at least clean metal, like fresh aluminum foil. Don’t use a mirror covered by a transparent layer or coating - the stuff may be transparent in visible light but will be “black” in the infrared. And of course, you should hold the thermometer with an insulating material - stick it in some styrofoam or something.
Even if you do things perfectly, the thermometer will not reach the temperature of deep space, which is about minus 270 C (3 kelvin). The atmosphere emits some infrared, so the effective temperature of the night sky ends up being about a hundred degrees higher - still pretty cold. I once designed an instrument that was cooled by radiation alone, and managed to get it about 20 degrees colder than the surrounding air. Admittedly that was on a high-altitude balloon where the air is a lot thinner, so there is less convection. What I did was similar to the experiment you describe - put clean metal shields below and around the instrument, so the object cannot see the ground, only the sky. And insulate the supports of the instrument to minimize conduction. I also used a high-emissivity tape on the instrument.
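For anyone curious why the thin air matters so much, here's the same sort of radiative/convective balance as the plate sketch earlier in the thread, just with the convective coupling dialed down; these are my own illustrative numbers, not the actual instrument's:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def equilibrium(eps, t_air, t_sky, h):
    """Bisection solve of eps*sigma*(T^4 - T_sky^4) = h*(T_air - T)."""
    lo, hi = t_sky, t_air
    for _ in range(60):
        mid = (lo + hi) / 2
        net = eps * SIGMA * (mid**4 - t_sky**4) - h * (t_air - mid)
        lo, hi = (lo, mid) if net > 0 else (mid, hi)
    return (lo + hi) / 2

t_air, t_sky, eps = 250.0, 200.0, 0.9  # assumed high-altitude air and a dry, very cold sky
for h in (10.0, 5.0, 1.0):             # convective coupling, W/(m^2 K); smaller = thinner air
    dt = t_air - equilibrium(eps, t_air, t_sky, h)
    print(f"h = {h:4.1f} W/(m^2 K): settles about {dt:4.1f} K below the air")
# Strong coupling holds the depression to several degrees; weak coupling (thin, still air)
# lets it grow to a few tens of degrees, which is the regime the balloon numbers suggest.
```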
As for air being an insulator, “insulator” refers to material that has low heat conductivity. It only describes conductive transfer - radiation is treated separately.
Sorry, I didn’t read your message carefully - I missed the part where you described the experiment you already did.
I’d say get rid of the saran wrap. Most plastics have high emissivity so they radiate in the infrared. Also, stainless steel is hard to clean. Aluminum foil is more consistent.
Finding the focus isn’t important at all. Your goal is to make sure that the thermometer cannot see any “warm” objects like the ground, trees, your body, or your house. It should only see the sky or the reflection of the sky. Think in terms of shielding the thermometer from “warm” objects, not focusing.