[QUOTE=scr4]
Infrared is difficult because any object near body temperature emits infrared. You can’t see anything if your eyeball is glowing.[/QUOTE]
I’ve often reflected on how useful infrared vision would be – e.g., being able to gauge something’s temperature by looking at it (no more burning your fingers on a hot surface you didn’t know was hot). I guess that’s why it wouldn’t work.
Depends on the wavelengths/frequencies/colours of infrared emitted, doesn’t it? What is the wavelength of the peak IR emitted by human skin? Is it anywhere near these near-IR wavelengths (880, 900, 950 nanometres) that we are talking about?
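For what it’s worth, here’s a quick back-of-the-envelope sketch in Python using Wien’s displacement law; the ~33 °C skin surface temperature and the helper function name are just my assumptions for illustration:

[code]
# Back-of-the-envelope check with Wien's displacement law (peak of blackbody emission).
# Assumption: skin surface temperature of roughly 33 degC (306 K); skin is close to an
# ideal blackbody in the thermal IR, so the approximation is reasonable.

WIEN_B = 2.898e-3  # Wien's displacement constant, metre-kelvins

def peak_wavelength_nm(temp_kelvin: float) -> float:
    """Wavelength of peak blackbody emission, in nanometres."""
    return WIEN_B / temp_kelvin * 1e9

skin_peak = peak_wavelength_nm(306.0)   # ~9500 nm, i.e. roughly 9.5 micrometres
near_ir = [880.0, 900.0, 950.0]         # the near-IR wavelengths mentioned above

print(f"Skin emission peaks near {skin_peak:.0f} nm")
for wl in near_ir:
    print(f"  {wl:.0f} nm near-IR is ~{skin_peak / wl:.0f}x shorter than the skin peak")
[/code]

The skin peak comes out around 9–10 µm, roughly ten times longer than the 880–950 nm near-IR we’re discussing, so the overlap between the two is essentially nil.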
When I was in architecture school back in the stone age, we had a demonstration of an infrared video camera (very expensive at the time, IIRC). It was used for viewing heat leakage from buildings (easiest on cold days).
We aimed it at people and put the result on a monitor. It was very odd: people would have different heat patterns on their skin depending on what they were doing; if they were embarrassed, they just lit up under the arms, on the face, on the breasts… very embarrassing for one poor girl.
Unfortunately, I don’t recall exactly what wavelengths the camera was sensitive to. Infrared cameras used for checking buildings are typically sensitive to wavelengths of 8 µm to 14 µm (8,000 nm to 14,000 nm), far longer than the near-IR wavelengths we are talking about here.
I started to really wonder about that. I can see it working for monochrome IR photographs at whatever wavelength, but it occurred to me that those pictures as displayed on the monitor cannot look like the infrared reality, because my monitor does not have infrared pixels! On the other hand, if the IR picture is using the tail end of my red sensitivity, maybe it would look the same.
I vaguely remember seeing those as well.
So, focus for visible, tweak it back a bit for IR, slap the filter on, then shoot?
I was wondering what would happen if, instead of blocking all the light above the infrared, we let through just a trickle, so that the eye was evenly sensitive across the whole spectrum from infrared to UV.
This would require filters whose blocking curve with respect to frequency was the inverse of the eyes’ sensitivity curve: the more easily the eye picked up the frequency, the more of it would be blocked. That way, the regular light wouldn’t swamp the infrared at the lower end of the sensitivity curve. And maybe we could see some new colour mixing…
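Here’s a rough sketch of that “inverse sensitivity” idea in Python. The Gaussian stand-in for the eye’s photopic curve and the residual near-IR sensitivity figure are crude assumptions on my part; a real design would need measured CIE data and real filter curves.

[code]
# Sketch of the "inverse sensitivity" filter idea: transmission at each wavelength is
# chosen so that (eye sensitivity) x (filter transmission) comes out roughly flat.
# Assumption: photopic sensitivity is crudely modelled as a Gaussian peaked at 555 nm.
import numpy as np

wavelengths = np.arange(380, 1001, 5)                 # nm, visible through near-IR

def eye_sensitivity(wl_nm):
    """Very rough stand-in for relative photopic sensitivity (peak at 555 nm)."""
    return np.exp(-0.5 * ((wl_nm - 555.0) / 80.0) ** 2)

sens = eye_sensitivity(wavelengths)
floor = 1e-4                                          # assumed residual near-IR sensitivity
sens = np.clip(sens, floor, None)

# Transmission inversely proportional to sensitivity, capped at 1 (a filter can't amplify).
transmission = np.clip(floor / sens, 0.0, 1.0)

effective = sens * transmission                       # what the eye would actually receive
for wl, t, e in zip(wavelengths[::20], transmission[::20], effective[::20]):
    print(f"{wl:4d} nm  filter passes {t:7.5f}  effective response {e:.1e}")
[/code]

The catch shows up in the “effective response” column: to flatten the curve you have to throw away almost all of the visible light, so the whole scene ends up extremely dim.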
And I’d like to set up some filters that would let us use some of that residual sensitivity in the UV near 300 nm as well. But that would require… precautions.
By the by, there’s been some suggestion that some of Monet’s later paintings featured images from the UV spectrum, which he may have been able to see after cataract surgery.
In a previous job, I had access to a 10 µm video camera as well and used to observe the same phenomenon. One thing I found particularly interesting was that when the camera was pointed at an old sheet of galvanized aluminum, it appeared as a perfect mirror. Mirror properties are a function of the smoothness of the surface as compared to the wavelength of the light. At a 10 µm wavelength, that surface can be pretty rough and still provide a perfect reflection.
Cool. That’s something that never would have occurred to me for infrared, even though I know about the same phenomenon with radio (satellite dishes have to be mirrors at radio wavelengths, but this means that the large old ones can be made of mesh).
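If anyone wants numbers, here’s a rough Python sketch using the usual Rayleigh roughness rule of thumb, h < λ/(8·cos θ). The example wavelengths (and the 2.5 cm figure for satellite TV) are just assumptions for illustration:

[code]
# Rayleigh's roughness criterion: a surface behaves as a specular mirror roughly when
# its surface-height irregularities stay below lambda / (8 * cos(theta)).
# Assumption: normal incidence (theta = 0); the 1/8 factor is the common rule of thumb.
import math

def max_roughness_nm(wavelength_nm: float, incidence_deg: float = 0.0) -> float:
    """Largest surface irregularity (nm) that still gives near-specular reflection."""
    return wavelength_nm / (8.0 * math.cos(math.radians(incidence_deg)))

cases = {
    "green light (550 nm)":            550.0,
    "thermal IR camera (10 um)":       10_000.0,
    "satellite TV microwave (2.5 cm)": 2.5e7,
}

for name, wl in cases.items():
    h = max_roughness_nm(wl)
    print(f"{name:34s} stays mirror-like up to ~{h / 1000:.2f} um roughness")
[/code]

At 10 µm, a micron or so of roughness still looks polished, while at satellite-TV wavelengths the limit is in the millimetre range, which is why a mesh dish can act as a solid mirror.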
Especially if she was wearing the wrong outfit. Some synthetic fibers (rayon is one of them, I think) are completely transparent in the near IR.