Radiative cooler vs. blackbody

Apparently it’s possible to cool a surface to below ambient temperature with materials engineered to radiate in IR frequencies that the atmosphere is transparent to:

https://advances.sciencemag.org/content/5/10/eaat9480

https://www.nature.com/articles/s41467-018-07293-9

It was my understanding (probably wrong) that one of the characteristics of a blackbody was that it radiated more energy at any given frequency than any other material at the same temperature. If that’s true, it seems like the blackbody would radiate more in those critical IR wavelengths than any metamaterial, yes? Plus more at other wavelengths too?

So is the mechanism of these radiators that, while they don’t emit as effectively as a blackbody in the desired frequency range, they also don’t absorb as well in all other ranges? It seems like the same would be true for a reflective surface, or a white one. So what am I missing about how these radiators work, or how a blackbody works, such that these things can get below ambient temperature while a blackbody can’t?

In the IR range, those materials are probably very close to black, so the metamaterial and the blackbody will both radiate about the same amount. The difference is that both surfaces are also absorbing at the same time, and the blackbody absorbs more across the rest of the spectrum, especially in the solar band.
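Here’s a minimal sketch of that bookkeeping in Python. The α and ε values and the 1000 W/m² of sunlight are illustrative assumptions, not numbers from the papers:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

def net_loss(alpha_solar, eps_ir, t_surf=300.0, solar=1000.0):
    """Net radiative loss in W/m^2: thermal IR emitted minus sunlight
    absorbed (sky back-radiation ignored to keep the comparison bare)."""
    return eps_ir * SIGMA * t_surf**4 - alpha_solar * solar

# A blackbody absorbs and emits perfectly at every wavelength:
print(net_loss(alpha_solar=1.0, eps_ir=1.0))    # ~ -540 W/m^2: net heating
# A selective surface: strong IR emitter, weak solar absorber:
print(net_loss(alpha_solar=0.05, eps_ir=0.9))   # ~ +363 W/m^2: net cooling
```

Both surfaces emit a comparable ~410–460 W/m² of thermal IR; the blackbody loses the competition because of what it absorbs.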

One of your references is a proof-of-concept demonstration. It shows that if you very carefully keep an object out of the sunlight, and away from hot things, it can actually get colder. As they describe, we already know that the earth gets colder at night. The proof-of-concept provides a context and comparative numbers for some other experiments, including the one described in the other reference.

In the first reference, an insulator/reflector, a poor radiator/absorber, is exposed to sunlight and does not get hot. This insulator/reflector has the special property that at one narrow range of frequencies it does radiate/absorb, and that range of frequencies is one where there is no sunlight. So it radiates, but there is nothing to absorb – and it gets colder. This material doesn’t have to be as good as a black body at anything. It just has to be much better in-the-slot than it is out-of-the-slot. Black bodies don’t have that property – they absorb and radiate maximally at every frequency.
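To put rough numbers on in-the-slot vs. out-of-the-slot, here’s a quick Planck-spectrum calculation (8–13 µm is the usual atmospheric window; the grid and band edges are my assumptions):

```python
import numpy as np

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23   # Planck, speed of light, Boltzmann

def planck(lam, T):
    """Blackbody spectral radiance, W/(m^2 sr m)."""
    return (2 * H * C**2 / lam**5) / np.expm1(H * C / (lam * KB * T))

lam = np.linspace(0.15e-6, 100e-6, 200_000)   # uniform wavelength grid, m
window = (lam >= 8e-6) & (lam <= 13e-6)       # 8-13 um atmospheric window

for T, label in [(300.0, "300 K surface"), (5778.0, "sunlight (5778 K)")]:
    spec = planck(lam, T)
    # uniform grid, so plain sums stand in for the integrals
    frac = spec[window].sum() / spec.sum()
    print(f"{label}: {100 * frac:.1f}% of emission falls in 8-13 um")
```

Roughly a third of what a 300 K surface emits goes out through the slot; well under 1% of sunlight comes in through it. That asymmetry is the whole trick.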

A blackbody is an ideal emitter/absorber. So it would absorb all sunlight, and get fairly warm.

A material that is reflective (white or silver) in visible light will reflect much of the energy in sunlight. But if you choose a material that is white/silver in visible light AND highly absorbent/emissive in infrared, such a surface will radiate to the sky and can cool below ambient temperature.
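A back-of-the-envelope balance shows how that settles below ambient. All numbers are illustrative assumptions (1000 W/m² sun, 270 K effective sky temperature, 300 K ambient, convection ignored):

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

def equilibrium_T(alpha, eps, solar=1000.0, t_sky=270.0):
    """Solve alpha*solar + eps*SIGMA*t_sky**4 = eps*SIGMA*T**4 for T:
    absorbed sunlight plus absorbed sky IR equals emitted IR."""
    return (alpha * solar / (eps * SIGMA) + t_sky**4) ** 0.25

print(equilibrium_T(alpha=0.9, eps=0.9))  # dark surface: ~389 K (no convection, so this overshoots)
print(equilibrium_T(alpha=0.1, eps=0.9))  # white/silver but IR-emissive: ~292 K, below ambient
```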

This is well known in aerospace engineering. Silver-coated Teflon is the most commonly used material for spacecraft surfaces that are exposed to sunlight but need to stay cool.

Of course you can cool an object to below ambient temperature if it is outside at night with a good view of a clear night sky. That’s how dew and frost form. An object with reasonably high emissivity in these circumstances can get cooler than the immediately ambient air temperature, which is what people usually mean by “ambient temperature” in heat transfer. It will still not get as cool as the distant ambient temperature of the universe, which you could argue is also a relevant ambient temperature in the context of radiative energy transfer, and there are no thermodynamic surprises here.
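Here’s that nighttime balance as a sketch, with all inputs illustrative: emissivity 0.95, 285 K air, a 260 K effective clear-sky temperature, and a mild convective coefficient h = 5 W/m²·K:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

def night_plate_T(eps=0.95, t_air=285.0, t_sky=260.0, h=5.0):
    """Bisect for the T where convective gain h*(t_air - T) balances
    net IR loss eps*SIGMA*(T**4 - t_sky**4)."""
    def net_loss(t):        # positive means the plate is shedding heat
        return eps * SIGMA * (t**4 - t_sky**4) - h * (t_air - t)
    lo, hi = t_sky, t_air   # net_loss < 0 at t_sky, > 0 at t_air
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if net_loss(mid) < 0 else (lo, mid)
    return 0.5 * (lo + hi)

print(night_plate_T())  # ~274 K: about 11 K below the air, cold enough to frost
```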

A blackbody is an ideal radiator and receiver, but plenty of real, practical things are only a couple of percent less good at radiating. Us, for example – human skin has an emissivity of about 0.98, only about 2% shy of a true blackbody.

The OP is talking about cooling below ambient in daylight. That is a bit trickier, but still possible, either by physically blocking sunlight or by choosing a material that absorbs very little visible light (low-absorptivity, high-emissivity).

This is part of what the OP was asking about. There is no such thing as a low-absorptivity, high-emissivity surface.

The OP references experiments that use the special characteristics of solar irradiation to make physically realizable radiative coolers that work even in the sunlight. The radiative coolers described are technically innovative, but they don’t depend on a surface that defies physical reality.

(And, actually, the referenced experiments don’t deal with visible light).

From a physics perspective, you are right: at any given wavelength, absorptivity and emissivity are equal (Kirchhoff’s law of thermal radiation).

The key here is the “same wavelength” part. For an object exposed to sunlight, the wavelengths it emits are not the wavelengths it receives. Sunlight is mostly visible light, while the object emits mostly in the infrared. So it is possible to have low absorption and high emission, because they occur at different wavelengths. It’s basically a reverse greenhouse effect – sunlight is blocked, infrared emission is not.
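You can see how little the two bands overlap from Wien’s displacement law:

```python
WIEN_B = 2.898e-3   # Wien displacement constant, m*K

for T, label in [(5778.0, "sunlight (solar surface)"), (300.0, "object near 300 K")]:
    print(f"{label}: emission peaks near {WIEN_B / T * 1e6:.1f} um")
# Sunlight peaks near 0.5 um (visible); the object near 9.7 um (thermal IR).
# Kirchhoff's law ties absorptivity to emissivity only at the same wavelength,
# so low solar absorption plus high IR emission breaks no rules.
```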

p.s. In engineering, “absorptivity” is usually shorthand for “visible-light absorptivity,” and “emissivity” is shorthand for “infrared emissivity,” because the three most significant thermal transfers are: absorption of sunlight shining on the surface, absorption of infrared shining on the surface, and infrared emitted by the surface. Absorptivity is the parameter that describes absorption of sunlight. Emissivity is the parameter that describes both the emission and absorption of infrared.

So this is why people talk about low-absorptivity / high-emissivity surfaces. In this context, such surfaces do exist. If you don’t want to spend the money on silver-coated Teflon (it’s thousands of dollars per roll), the Mylar side of aluminized Mylar film works pretty well. White paint is not too bad.
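For the spacecraft case mentioned above, the standard sun-lit flat-plate balance αS = εσT⁴ shows why the α/ε ratio is the figure of merit. The coating values below are representative textbook-style numbers, not vendor data:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
SOLAR = 1361.0    # solar constant at Earth, W/m^2

def plate_T(alpha, eps):
    """Sun-facing plate in vacuum, radiating from the lit side only:
    alpha * SOLAR = eps * SIGMA * T**4."""
    return (alpha * SOLAR / (eps * SIGMA)) ** 0.25

for name, a, e in [("silver-coated Teflon", 0.08, 0.80),
                   ("white paint",          0.20, 0.90),
                   ("black paint",          0.95, 0.85)]:
    print(f"{name:20s} alpha/eps = {a/e:.2f}   T = {plate_T(a, e):.0f} K")
```

A blackbody plate (α/ε = 1) would sit near 394 K; the silver-Teflon numbers land around 220 K, which is why it earns its price.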