Broadband light sources are often blackbodies running around 3000 K, or have a similar spectrum plus some emission lines. Typical solid state imagers have a very roughly flat quantum efficiency that then drops off suddenly at wavelengths longer than, I dunno, 800 or 1000 nm or so. The signal you get by looking at the broadband source therefore has a strong peak and falls off steeply as wavelengths get shorter or longer.
This presents a problem in the design of instruments like transmission spectrometers. Scaling things so that the peak is within range makes it hard to do the measurements at the far ends of the spectrum.
So, the right filter could help with this. Is there a purplish filter that preferentially removes the wavelengths that are strongest and most efficiently detected, so that the filtered spectrum would be closer to having a constant energy distribution across all wavelengths?
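Just to put rough numbers on the shape I'm describing, here's a toy sketch (made-up QE model and constants, not my actual source or detector curves):

```python
import numpy as np

h, c, k = 6.626e-34, 2.998e8, 1.381e-23   # SI constants

def planck(wl_m, T=3000.0):
    """Blackbody spectral radiance at temperature T (overall scale doesn't matter here)."""
    return (2 * h * c**2 / wl_m**5) / (np.exp(h * c / (wl_m * k * T)) - 1)

def rough_si_qe(wl_nm):
    """Toy QE: roughly flat through the visible, ramping to zero by ~1100 nm."""
    qe = np.where(wl_nm < 900, 0.8, 0.8 * (1100 - wl_nm) / 200.0)
    return np.clip(qe, 0.0, None)

wl_nm = np.linspace(350, 1100, 200)
signal = planck(wl_nm * 1e-9) * rough_si_qe(wl_nm)
print(f"detected signal peaks near {wl_nm[np.argmax(signal)]:.0f} nm")
print(f"bright/faint ratio across the band: {signal.max() / signal[signal > 0].min():.0f}x")
```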
If you’re talking about a grating spectrometer, it may be easier to put a spatial gradient filter on the detector. You just need a filter that’s transparent at the edge and darker in the middle.
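A sketch of what that transmission profile would have to look like, assuming you already know the unfiltered signal across the detector (purely illustrative, no particular filter product in mind):

```python
import numpy as np

def gradient_filter_transmission(unfiltered_signal):
    """Transmission vs. pixel position that flattens the signal to its dimmest usable level."""
    s = np.asarray(unfiltered_signal, dtype=float)
    target = s[s > 0].min()                   # flatten everything down to the weakest level
    t = np.where(s > 0, target / s, 1.0)      # transparent where there's nothing to cut
    return np.clip(t, 0.0, 1.0)

# e.g. t = gradient_filter_transmission(measured_spectrum_across_pixels)
# -> close to 1 at the edges of the detector, darkest where the blackbody peak lands
```

Near the peak the filter has to be quite dark, which is the price of flattening everything down to the dimmest usable level.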
If you really need a filter that suppresses the middle of the visible range, I guess it would be a magenta filter, but I don’t know of one that’s tailored for the purpose you describe. Here’s a magenta dichroic filter, but it suppresses green too well for your purpose.
I don’t see why you couldn’t come up with a filter (or series of filters) to “flatten out” the spectrum of a particular light source, but that probably isn’t the simplest solution.
The people building the spectrophotometer would know exactly what the spectrum of their light source looked like. They’d also know how the efficiency of their detector varied over that spectrum. This would make it relatively simple for them to come up with some sort of correction factor for each wavelength (frequency). Or, they could just integrate several sensors into one instrument, each one with a different optimal range of detection.
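Something like this, assuming the lamp spectrum and QE are tabulated on the same wavelength grid (hypothetical array names):

```python
import numpy as np

def correction_factor(lamp_spectrum, detector_qe, floor=1e-12):
    """Multiply raw counts by this to divide out the lamp shape and detector response."""
    response = np.asarray(lamp_spectrum, dtype=float) * np.asarray(detector_qe, dtype=float)
    return 1.0 / np.maximum(response, floor)  # guard against dividing by ~zero at the band edges

# corrected = raw_counts * correction_factor(lamp_spectrum, qe_curve)
```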
If I understood Napier’s post, the problem is that the dynamic range of the spectrum is too large for the detector to handle. If the exposure is long enough to capture the red and blue ends of the spectrum, the central portion (peak of the blackbody) would saturate.
As I said above, I would handle it with a gradient filter on the detector. Or it may be enough to just take data at different exposure times and combine them in software.
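Something along these lines, as a minimal sketch (the full-well number is just a placeholder): mask out anything near saturation in each frame, scale by exposure time, and average what's left.

```python
import numpy as np

def combine_exposures(frames, exposure_times, full_well=60000.0):
    """frames: list of count arrays (one per exposure); returns counts per second per pixel."""
    total_counts = np.zeros_like(np.asarray(frames[0], dtype=float))
    total_time = np.zeros_like(total_counts)
    for f, t in zip(frames, exposure_times):
        f = np.asarray(f, dtype=float)
        good = f < 0.9 * full_well            # ignore pixels near saturation in this frame
        total_counts += np.where(good, f, 0.0)
        total_time += np.where(good, t, 0.0)
    return total_counts / np.maximum(total_time, 1e-12)
```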
You don’t say what the wavelength range covered by your system is, or how you plan on using it. This makes all the difference in the world.
Mired filters are specifically made to change the apparent color temperature of a source, by flattening the longer wavelengths (in many cases). But the ones I’ve seen only work over a very limited range.
Here’s a page on them:
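The arithmetic behind them is simple enough: a filter's mired shift is 10^6/T_target − 10^6/T_source, so (picking numbers out of the air) pulling a 3000 K lamp up to an apparent 5000 K takes roughly a −133 mired bluish filter.

```python
def mired_shift(t_source_K, t_target_K):
    """Filter mired shift needed to move a source from one color temperature to another."""
    return 1e6 / t_target_K - 1e6 / t_source_K

print(mired_shift(3000, 5000))   # ≈ -133 mired: a fairly strong bluish conversion filter
```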
Scr4, yes, you said what I meant much more neatly than I did. The problem is that the center region saturates before the tails, especially the long-wavelength tail, get enough exposure.
I hadn’t thought of a gradient filter over the image plane. That would certainly work but I don’t know how easily I could find a satisfactory gradient. I will look into it.
Wheresmymind, I am the people building the spectrometer. Yes, I know both the spectral curves you reference. But I also think the problem I describe is at least fairly general. Broadband sources so often have 3 kK black body spectra because it is easy to make a radiator get that hot and hard to make one get much hotter. Detector curves vary somewhat more than that, but still, if I could sketch the curve I want, it would extend the range in a great many other systems too. For that reason I hoped somebody else had already created a filter with a useful curve like this.
Seems to me you just need to reduce the exposure times so that the high intensity areas don’t saturate. Do multiple exposures and add/whatever to get decent data for the low intensity areas.
Now if the readout noise is comparable to the other sources of noise then yeah, lots of exposures are a bad idea for the “faint” areas…
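Rough bookkeeping for that trade-off (toy read-noise numbers): each extra readout adds its noise in quadrature, so stacking lots of short exposures mostly hurts the faint parts of the spectrum.

```python
import math

def snr(total_signal_e, n_reads, read_noise_e=10.0):
    """Shot noise plus read noise accumulated over n_reads readouts."""
    return total_signal_e / math.sqrt(total_signal_e + n_reads * read_noise_e**2)

print(snr(100, 1), snr(100, 20))       # faint pixel: 20 readouts cut the SNR badly
print(snr(50000, 1), snr(50000, 20))   # bright pixel: barely changes
```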
I thought the QE of silicon-based detectors dropped to zero around 1.2 um? As I recall, a photon must have more energy than the bandgap of the semiconductor to create a free electron.
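Back-of-the-envelope, using the textbook bandgap values, the cutoff is just λ = hc/E_g:

```python
h_eV_s = 4.136e-15   # Planck constant, eV*s
c_m_s = 2.998e8      # speed of light, m/s

def cutoff_nm(bandgap_eV):
    """Longest wavelength whose photons can still bridge the bandgap."""
    return h_eV_s * c_m_s / bandgap_eV * 1e9

print(cutoff_nm(1.12))   # silicon: ~1107 nm
print(cutoff_nm(0.67))   # germanium: ~1850 nm
```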
This is right. In fact, some silicon detectors aren’t even reliable that far out.
Even if you have a germanium detector, they give out at about 1.8 microns.
In any event, the mired filters I mention above don’t even work over the range of a silicon detector – they only cover part of the visible range. I doubt if you’ll find a shaped filter that will work across as broad a range as you want.
Yes. You caught me oversimplifying. I’m using Si out to 1.05 µm and InGaAs the rest of the way. And it gets messy. The InGaAs subsystem has much wider pixels, which has the effect of raising its effective curve relative to the Si curve. Also I can choose different integration times in the two subranges, and maybe different slit widths too. So maybe the two curves join up like flat lines whose heights I can adjust independently.
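For what it's worth, here's roughly how I picture putting the two subranges on a common scale (all the numbers below are placeholders): normalize each side's counts by pixel width and integration time, then just butt them together.

```python
import numpy as np

def to_common_scale(counts, pixel_width_um, integration_s):
    """Convert raw counts to counts per second per micron of pixel width."""
    return np.asarray(counts, dtype=float) / (pixel_width_um * integration_s)

si_counts     = np.full(512, 30000.0)    # placeholder raw counts, Si subsystem
ingaas_counts = np.full(256, 20000.0)    # placeholder raw counts, InGaAs subsystem

si_rate     = to_common_scale(si_counts,     pixel_width_um=7.0,  integration_s=0.05)
ingaas_rate = to_common_scale(ingaas_counts, pixel_width_um=25.0, integration_s=0.5)
spectrum = np.concatenate([si_rate, ingaas_rate])   # Si out to ~1.05 um, InGaAs beyond
```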
I think the overarching theme of flattening a 3000 K blackbody may dominate anyway, and I can imagine somebody trying to make this sort of filter.
I also speculated that somebody might already have made a dedicated filter for the narrower Si-only range; even if it didn’t knock down the longer wavelengths, it would still have helped.
Finally I figured a partial win would still help, and am just picking my way through it all. So, I hope you’ll forgive me this oversimplification and not put skeptical smilies all over me.
Wouldn’t something that turned a Tungsten or Xe arc source into a spectrally flat source find utility in the lab?
I really don’t think anybody makes such a filter. I’ve never seen one. And engineering a filter that covers a really broad range (so that you have some wavelengths that are 2X other wavelengths in the range) isn’t going to work.
I just don’t think there’s a market for the product you want. You could get filters that will cover small portions of the range, I suspect, but nothing for the entire broad extent.