Using a lens to get hotter than the source's temperature - true for the Moon?

This week’s xkcd What If asked if a magnifying glass can be used to start a fire with only moonlight. Randall’s answer is no, the primary reason being that lenses and mirrors can’t make something hotter than the surface of the light source itself. It’s worth reading the explanation of why that’s the case; I had mistakenly thought it was possible to “concentrate” the heat into a smaller area, so ignorance fought there.

But I’m not following why that pertains to the Moon, which is reflecting light from a different source. Randall even calls this out:

But he doesn’t “get to it later” to my satisfaction. I’m still not seeing why the temperature of the Moon’s surface is a boundary on the possible temperature achievable with a lens. If the Moon were a perfect mirror, I’d expect no heating of its surface from the sun, and I’d be able to capture the perfectly reflected rays with a lens and heat something hotter than the Moon’s surface. Even as an imperfect reflector, the Moon is acting as part of the optical system, not a light source.

I’m not disagreeing with the final answer that moonlight can’t be used to start a fire, but is his explanation relating it to the surface temperature of the Moon correct? I think the reason is that too much light is lost due to absorption and diffuse reflection.

That is a good point. I think that, for any nonzero lunar albedo, it would in fact be possible (in principle, at least) to heat up something to some amount greater than the temperature of the Moon (provided that your target had a lower albedo than the Moon).

That said, though, the Moon’s albedo is pretty low (there is no dark side of the Moon: As a matter of fact it’s all dark), so even under ideal circumstances, you wouldn’t be able to get much hotter than the lunar surface.

He also says:

There’s gotta be something that has a flash point lower than 100 °C, right?

I agree; I think the albedo and diffuse reflection kills you.

As mentioned in our thread about the sun, no lens system can increase the surface brightness of an object. The best case lens is one that makes it look like the moon fills the entire sky.

The moon and sun have about the same angular size, but differ by about 14 in apparent magnitude, which means they differ ~400,000x in brightness. The energy emitted by a black body varies by the 4th power of temperature, so there should be a 400000[sup]1/4[/sup] =~ 25x difference in temperature. The sun’s surface is 5,778 K, so I’d expect the moon to warm something up to ~230 K with a perfect lens.
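The magnitude-to-temperature arithmetic above can be sketched in a few lines (assuming the ~14-magnitude difference and the 5,778 K solar surface temperature quoted in the post):

```python
# Sketch of the estimate: ~14 mag difference -> flux ratio -> temperature ratio.
delta_mag = 14.0
flux_ratio = 10 ** (delta_mag / 2.5)   # each magnitude is a factor of 10^0.4
temp_ratio = flux_ratio ** 0.25        # blackbody flux scales as T^4

t_sun = 5778.0                         # K, solar surface temperature
t_target = t_sun / temp_ratio          # best-case temperature from moonlight

print(round(flux_ratio))               # ~400,000x brightness difference
print(round(temp_ratio, 1))            # ~25x temperature ratio
print(round(t_target))                 # ~230 K
```

This just confirms the numbers in the post: a ~400,000x flux ratio gives a ~25x temperature ratio, i.e. ~230 K.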

That’s quite a bit colder than the surface of the moon in full daylight, but that’s not unexpected, since the moon has a low albedo. We can do a crude estimate of the temperature by taking the fraction of the sky occupied by the sun (0.2 square degrees out of the ~40,000 sq-deg of the full sky), and again taking the 4th root. At the equilibrium temperature, energy absorbed should equal energy emitted, and since the moon emits to (roughly) the entire sky and only absorbs from the small part occupied by the sun, this should be an OK estimate.

At any rate, I get 273 K from that, which is higher than the 230 K number, but still less than the peak numbers I’ve seen. I suspect the difference is that the bottom of a sun-lit crater will have a smaller portion of exposed sky to radiate to. The temperature at the bottom of a deep crater with the sun directly overhead will be higher than on flat ground.
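The solid-angle estimate above works out as follows (using the round figures from the post: the sun subtends ~0.2 sq deg, and the full sky is taken as ~40,000 sq deg, a round value for the true 41,253):

```python
# Crude radiative-equilibrium estimate for the sunlit lunar surface.
t_sun = 5778.0        # K, solar surface temperature
omega_sun = 0.2       # square degrees subtended by the sun
omega_sky = 40_000.0  # square degrees in the full sky (rounded)

# Absorbed flux scales with the sun's fraction of the sky; emitted flux
# goes as T^4 over the whole sky, so equilibrium gives the 4th root:
t_eq = t_sun * (omega_sun / omega_sky) ** 0.25
print(round(t_eq))    # -> 273
```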

I believe his “get to it later” was

Even though the Moon is reflecting light and not black-body radiating, being surrounded by the surface is a lot like being on the surface, and you’ll get to the temperature of the surface.

Flash point doesn’t equal ignition temperature. Here’s a few candidates. And note Randall did say ‘most’ things.

In a vacuum, being surrounded by a surface is very different than being on a surface. Surrounded by a surface, you’ll get only radiative heating. On the surface, you’ll also have conductive heating.

Good points. According to your links, it seems you could light a match made from white phosphorus. That seems like a common enough item to have been considered in his column. I could see ignoring some exotic rocket fuel though.

That only matters for how quickly the two bodies equalize in temperature. Regardless of whether the heat transfer mechanism is conduction or radiation, once you’re completely surrounded by an object, you’re going to equalize at the same temperature (eventually), barring any sort of external energy input. It doesn’t matter how much insulation or vacuum you put between the two objects.

That’s true. I was going off the picture in the column showing stickman partially surrounded, and the text is unclear - it says “surrounded”, but then talks about rocks that are “nearly surrounded” by the rest of the surface.

But I don’t think it’s an important point in the first place. Moonlight is not black body radiation from the Moon; it is reflected sunlight. I still assert that the theoretical temperature you could achieve with a lens focusing the moonlight has nothing to do with the temperature of the Moon’s surface. It is dependent on the efficiency of reflection (or lack thereof) due to albedo and diffuse reflection.

I agree it’s a simplification, but it’s still a pretty good estimate.

What matters is the effective temperature of moonlight, i.e. the temperature of a blackbody that would be just as bright. But what Randall is saying is, if you put a rock on the surface of the moon, it would reach that temperature. If you took that same rock, brought it to earth and put it behind the ideal focusing mirror/lens, [b]it would be in exactly the same radiation environment, and would reach that same temperature[/b].

I disagree with the bolded point. The rock will heat up only to the extent that it absorbs the radiation. If the radiation is reflected, the rock doesn’t heat up.

By “rock,” I was assuming an object whose emissivity is comparable to its absorptivity. With this assumption, the rock will warm up until it reaches the thermal temperature of the environment, where it emits as much thermal radiation as it absorbs. (Emitted flux is emissivity * sigma * T^4, absorbed flux is absorptivity * sigma * Te^4, where sigma is the Stefan-Boltzmann constant, and Te is the effective temperature of the incident radiation.)
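Setting those two fluxes equal gives T = Te * (absorptivity/emissivity)^(1/4); a minimal sketch (the helper function and the example alpha/epsilon values are mine, not from the post):

```python
def equilibrium_temp(te, absorptivity=1.0, emissivity=1.0):
    """Radiative equilibrium temperature of an object bathed in radiation
    of effective temperature te, from emissivity*T^4 = absorptivity*te^4."""
    return te * (absorptivity / emissivity) ** 0.25

# For a "rock" (absorptivity ~ emissivity), T is just te:
print(equilibrium_temp(230.0))                   # -> 230.0
# A silver-coated-Teflon-like surface (low alpha, high epsilon) runs colder:
print(round(equilibrium_temp(230.0, 0.1, 0.9)))  # -> 133
```

This is why the alpha/epsilon distinction matters for engineered surfaces but washes out for ordinary rock.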

It’s true that there are surfaces whose absorptivity is very different from emissivity - polished metal, silver-coated Teflon, etc. Those are not rocks.

I’m not trying to argue that the rocks don’t heat up. And at equilibrium, I agree that absorption will equal emission of thermal radiation. But there is also reflected light from the Moon, and reflected light is able to heat something at the focal point of the lens without heating the reflector (the Moon’s surface, in this case).