Thermal imaging

Is there any way to modify a regular camera (say, a common, CCD-based digital) to yield thermal images?

This morning, what with winter coming on, I was thinking about how to identify the poorly insulated places in the house and thought of thermal imaging. Then I realized that, although I don’t think it’s possible (at least, not easily or cheaply), I couldn’t actually explain (to my own satisfaction) why it wouldn’t be.

To actually use a regular camera would require shifting the IR wavelengths to the visible spectrum, right? Which would entail adding energy to a specific range of wavelengths, right? What are the practical problems with this?

Your local utility may have an IR camera & offer this service.

No, it is not practical to make such a conversion. CCD and CMOS image sensors are sensitive to NEAR infrared, but for thermal imaging you need FAR infrared. The glass lenses on the camera don’t even pass this very well (greenhouse effect!), so different materials are needed for those wavelengths. Back in the ’80s when I looked into this, germanium was preferred for thermal lenses; I’m not sure what modern practice is.
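To put a number on “far infrared”: Wien’s displacement law gives the wavelength where a blackbody’s emission peaks. A quick sketch in Python (the temperatures are just representative values):

[code]
# Wien's displacement law: lambda_peak = b / T
b = 2.898e-3  # Wien's displacement constant, m*K

for label, T in [("room-temperature wall", 293.0),
                 ("human skin", 305.0),
                 ("red-hot metal", 1000.0)]:
    lam_um = b / T * 1e6  # peak wavelength in micrometres
    print(f"{label} ({T:.0f} K): peak emission near {lam_um:.1f} um")
[/code]

Room-temperature objects peak around 10 um, an order of magnitude past the ~1 um where a silicon sensor gives up.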

Bill Beaty picked up a FLIR camera on eBay and has some cool videos he made with it on YouTube; search for that name and FLIR, and you’ll find them.

Too bad you don’t live closer. I have access to a Flexcam thermal imaging camera here at work. It’s a $9,000 camera, so I might have to sign over rights to my firstborn, plus a pair of limbs of their choice.

Which is kinda what I was thinking (that the physical properties of the camera are the limitation)… what would it take to design a device that shifted the IR spectrum into the visible light range? That is, the camera itself would not be modified; instead, the incoming light would be shifted before it enters the camera.

Again, if at all possible, I’d like to reach a technical understanding of the issues involved. The point about glass vs. germanium lenses helps some, but I still don’t fully get it. (If an explanation is too in-depth or complicated for a post, I’d be quite happy with a good reference of some sort to read up on it on my own.)

There are actually some non-linear optical crystals that can double the frequency of light. Naturally there has to be some energy loss, or we would be getting energy from nothing. At this point, these crystals are very inefficient, and it would probably be more practical to design a camera for the infrared than to use these crystals with a visible-spectrum camera.

Ultimately, this phenomenon is a result of a non-centrosymmetric crystal structure. Off the top of my head, the equation for the polarization response of a material to an incoming electromagnetic wave looks something like this:

P = E[sup]0[/sup] + E[sup]1[/sup]alpha + E[sup]2[/sup]beta + E[sup]3[/sup]gamma + … etc.

But since (+A)[sup]2[/sup] = (-A)[sup]2[/sup], the E[sup]2[/sup] term drops out unless the crystal is non-centrosymmetric. I can’t remember the specifics of the derivation, but ultimately frequency doubling results from the fact that sin[sup]2[/sup]theta = (1 - cos 2theta)/2. Of course frequency tripling does not require this symmetry, but as these terms go to higher powers they become less and less significant.
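To spell out where the doubled frequency comes from (a sketch along the same lines, assuming an input wave E = E[sub]0[/sub]cos(omega t) fed into the beta term above):

beta E[sup]2[/sup] = beta E[sub]0[/sub][sup]2[/sup] cos[sup]2[/sup](omega t) = (beta E[sub]0[/sub][sup]2[/sup]/2)(1 + cos(2 omega t))

The cos(2 omega t) piece is polarization oscillating, and hence radiating, at twice the input frequency; the constant piece is just a static offset.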

I should add the caveat that all of this is from memory that is very old, so some of the specifics may be off.

Actually that should probably be E[sub]0[/sub] representing the static field rather than E[sup]0[/sup], which obviously equals 1. Like I said, trust the concept, not the specifics.

An easier way to do this would be to focus a thermal infrared image onto a sheet of liquid crystal temperature sensitive sheeting, which turns different dull rainbow colors depending on its temperature. Then you’d take an ordinary visible light photograph of the sheet, or for that matter inspect it by eye if that’s good enough.

To make this work, you’d have to get a sheet that sits somewhere low in its sensing range under your ambient conditions before you focus the thermal radiation onto it. Then you’d use a lens that can focus thermal radiation. Germanium lenses would work, as would barium fluoride, zinc selenide, thallium bromoiodide, and many others. But by far the cheapest option is polyethylene Fresnel lenses, which are flat plates stamped out of the cheapest plastic made (and which happen to be pretty crystal clear in the thermal infrared).
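To get a feel for how much extra radiation a cold spot represents (a back-of-the-envelope sketch; the two wall temperatures are made-up but plausible values for the insulation hunt in the OP):

[code]
# Stefan-Boltzmann law: radiant exitance M = sigma * T^4
sigma = 5.670e-8  # W m^-2 K^-4

T_good = 287.0  # well-insulated wall patch, K (assumed)
T_bad = 292.0   # poorly insulated wall patch, K (assumed)

M_good = sigma * T_good**4
M_bad = sigma * T_bad**4
print(f"well insulated:   {M_good:.0f} W/m^2")
print(f"poorly insulated: {M_bad:.0f} W/m^2")
print(f"difference: {M_bad - M_good:.0f} W/m^2 "
      f"({100 * (M_bad / M_good - 1):.1f}%)")
[/code]

A five-kelvin difference only changes the radiated power by about seven percent, which is why the sheet has to start near the edge of its sensing range to register it.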

An easier way to do it would be to accelerate the camera toward the subject at a significant fraction of the speed of light - the infrared wavelengths will be Doppler-shifted up into the visible band.
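For the record, the numbers on that (a sketch; 10 um thermal IR and 500 nm green are just representative wavelengths):

[code]
# Relativistic Doppler shift for an approaching camera:
# lambda_obs = lambda_src * sqrt((1 - beta) / (1 + beta))
# Solving for beta = v/c given the desired wavelength ratio:
lam_src = 10e-6   # thermal IR, m
lam_obs = 500e-9  # green light, m

r2 = (lam_obs / lam_src) ** 2  # squared wavelength ratio
beta = (1 - r2) / (1 + r2)
print(f"required speed: {beta:.4f} c")  # about 0.995 c
[/code]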

Glad I could help. No need to thank me.

It’s clear you’re not an experimentalist, Mangetout. The truly easy way would be to just place a black hole near your camera, to gravitationally blueshift the light.

We have a FLIR at work. It is an entirely different setup from a conventional camera. For one thing, part of it is cooled to around 75 kelvin to give an effective “blank sheet” against which to read temperatures.

Q.E.D., does your thermal camera have something like that?

No, it’s not quite that sophisticated. It does have some sort of thermal stabilization, but I’m not entirely sure about the details. I can’t find the exact model online (because I’m not exactly sure what it is), but this is the closest I can find.

Ours is one of these, or something very similar, I’m not sure of the exact model number either.

I didn’t even know you could get something handheld. It would be a neat toy to have.

You’re right - that’s far more practical than my solution.

Thanks all. I realize now that I did a very poor job of phrasing my OP, putting too much emphasis on the application (i.e., how to) and not enough on the issues involved (i.e., why not). Nonetheless, it’s been instructive and I appreciate it.

That’s very interesting. I had no idea. I’m going to spend some time reading Wikipedia and a page from CASIX, Inc. to learn more. Thank you.

Just to elaborate on a few points:

As already mentioned, conventional CCD and CMOS sensors are not sensitive to far infrared because they are silicon semiconductor detectors. When a photon hits the detector, the energy of the photon is absorbed by an electron, allowing it to become a free electron, which is then detected as an electrical charge. The minimum energy needed to free an electron is called the band gap. A far-infrared photon has less energy than the band gap of silicon, so it cannot be detected by silicon-based detectors. You have to make the detector out of a semiconductor with a smaller band gap, like HgCdTe.
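To put numbers on that (a sketch; the 1.12 eV silicon gap is standard, while the HgCdTe figure is just a typical value, since its gap varies with composition):

[code]
# Photon energy E = h*c / lambda, compared with detector band gaps
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
eV = 1.602e-19  # joules per electron-volt

for label, lam in [("visible, 550 nm", 550e-9),
                   ("near IR, 1000 nm", 1000e-9),
                   ("thermal IR, 10 um", 10e-6)]:
    E = h * c / lam / eV
    print(f"{label}: {E:.2f} eV")

# Silicon's band gap is ~1.12 eV, so a ~0.12 eV thermal-IR photon
# can't free an electron; HgCdTe can be tuned down to ~0.1 eV.
[/code]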

But electrons have some thermal energy to begin with unless the detector is at absolute zero. In a narrow-bandgap semiconductor, electrons jump the band gap even at room temperature. All that spontaneous charge is unwanted signal, i.e., noise. So you have to cool the detector. I think thermoelectric coolers are enough, but it depends on how far into the IR you want to see, how much noise you can tolerate, etc. Some astronomical infrared detectors are cooled to a few degrees above absolute zero.
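The benefit of cooling can be sketched too (the exp(-Eg/2kT) scaling is the standard result for intrinsic carrier density; the 0.12 eV gap is an assumed narrow-gap value):

[code]
import math

# Thermally generated carriers scale roughly as exp(-Eg / (2*k*T)),
# so cooling suppresses the "dark" signal dramatically.
k = 8.617e-5  # Boltzmann constant, eV/K
Eg = 0.12     # assumed narrow band gap, eV

for T in (300.0, 200.0, 77.0):
    factor = math.exp(-Eg / (2 * k * T))
    print(f"T = {T:.0f} K: relative thermal-carrier factor {factor:.1e}")
[/code]

Going from room temperature down to 77 K knocks the thermal generation down by roughly three orders of magnitude with this gap.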

Also, you have to be careful that your camera housing and lens don’t emit much thermal radiation. I think this can be accomplished by choosing the materials carefully. (Shiny metal objects emit less IR than black objects at the same temperature, for example.)
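To illustrate the shiny-versus-black point (a sketch; the emissivity values are typical textbook figures, not measurements):

[code]
# Radiated power per unit area: M = emissivity * sigma * T^4
sigma = 5.670e-8  # W m^-2 K^-4
T = 300.0         # room temperature, K

for label, eps in [("polished metal, eps ~ 0.05", 0.05),
                   ("black paint,    eps ~ 0.95", 0.95)]:
    print(f"{label}: {eps * sigma * T**4:.0f} W/m^2")
[/code]

Same temperature, nearly a factor of twenty less emitted IR from the shiny surface.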

A couple of quasi-IR digital camera mods here and here.

As has been mentioned before, these mods are for capturing near infrared. Thermography refers to far infrared. For reference, visible light has wavelengths of about 380 to 780 nm. Standard camera sensors are only sensitive out to about 1,000 nm, and even then, the sensitivity drops significantly the further you get from the visible spectrum. The far infrared spectrum sits between 8,000 and 15,000 nm. That’s a very long way from visible light. Don’t let the use of the word “infrared” in both cases fool you.

We have a fairly good thermal camera at work also. It’s rather bulky and noisy, owing to the cooling equipment. I hear that the newer models are smaller and quieter. However, even the cheapest thermography cameras out there cost at least a few thousand dollars.

Our camera is so sensitive that you can “draw” on objects just by dragging your fingers across them. The friction, and probably the small amount of sweat/oil you leave behind, is enough to show up clearly in the thermal image.

Back in the ’90s, I was sure that microbolometers would be the Kodaks of the far-IR region. They need no cooling, relying on relative temperature differences rather than the absolute reference of LN2 or thermoelectric cooling schemes. In fact, I recall General Motors (?) was going to offer one as an option. Sad to say, the cheapest I’ve seen has been in the $10,000 range.