Distinguishing radioactive source decay from contamination buildup on the source

This is a bit of a brain teaser about a radioactive source that decays and is also getting contaminated at a constant rate. Suppose the source is a neat little box with a window, and the radiation escapes only through the window. Suppose further that the window is getting contaminated with some substance that accumulates at a constant rate and absorbs the radiation. Both effects cause the radiation leaving the box to decrease exponentially with time: the decay because of the half-life, and the contamination because the layer thickness grows linearly, so its attenuation exp(-mu * x) falls exponentially in time. From outside the box, and without changing the contamination rate, is there an experimental way to distinguish the two effects? I can’t think of one.

Are there multiple forms of radiation being emitted? Does the contamination affect them all equally?

Can you measure the temperature of the box very accurately? Absorbed radiation will increase the temperature, while emitted radiation won’t. If the box is kept in a well-mixed environment of constant temperature, it will reach equilibrium at a higher temperature the more contamination there is.

Assuming a single source, not by just measuring the radiation. The baseline radiation will vary like R = exp(-alpha * t). The effect of contamination will vary like T = exp(-beta * t). What you measure will vary like R * T = exp(-(alpha + beta) * t), so you can only measure the sum of alpha and beta, not either separately.
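Here’s a quick numerical sketch of that degeneracy (alpha and beta are made-up rates): fit a single exponential to the measured product and you recover only the sum, never the split.

import numpy as np
from scipy.optimize import curve_fit

# Made-up decay and contamination rates (per day), purely for illustration
alpha, beta = 0.010, 0.004
t = np.linspace(0, 300, 60)                        # days
measured = np.exp(-alpha * t) * np.exp(-beta * t)  # what the detector sees

# Fit a single exponential to the measured intensity
def model(t, k):
    return np.exp(-k * t)

(k_fit,), _ = curve_fit(model, t, measured)
print(k_fit)    # ~0.014 = alpha + beta; the individual rates are unrecoverable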

You can tell the difference between sources based on geometry. If the source in the box is physically small relative to the window, its radiation will fall off in accordance with the inverse-square law, whereas radiation emitted from a plane source (the window) will not. http://web.mit.edu/nrl/Training/Point-Line-Plane/Point-Line-Plane.html
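To put numbers on the geometry, here’s a toy comparison (arbitrary units) of the on-axis flux from a point source against a uniform disc source of radius R, the disc flux being the standard integral of inverse-square contributions over its area:

import numpy as np

# On-axis flux versus distance d: point source vs. uniform disc of radius R.
# Both are normalized at the farthest distance so the shapes can be compared.
R = 1.0                                   # disc (window) radius, arbitrary units
d = np.logspace(-1, 2, 7)                 # detector distances

point = 1.0 / d**2                        # inverse-square law
disc = 0.25 * np.log(1.0 + (R / d)**2)    # integral of 1/(4*pi*rho^2) over the disc,
                                          # per unit areal source strength

for dist, p, q in zip(d, point / point[-1], disc / disc[-1]):
    print(f"d={dist:8.2f}  point={p:10.3e}  disc={q:10.3e}")

Far from the source the two agree; close in, the disc’s flux flattens out instead of blowing up as 1/d^2, which is the signature you’d look for.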

Chronos, nice catch! I think you have something. All radioactive sources I know of emit some distribution of particle or photon energies, and different energies have different mass absorption coefficients in the contaminant. So if you do spectroscopy on the emitted radiation, you’d see the distribution change shape and shrink due to the contamination, but only shrink uniformly due to the decay. I think that ought to work, at least in principle.
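A toy example of that spectral signature (all numbers invented): pure decay scales every line by the same factor, while a contamination layer hits the low-energy line harder.

import numpy as np

# Two gamma lines; the contaminant's attenuation coefficient falls with energy.
E = np.array([0.1, 1.0])       # line energies, MeV
I0 = np.array([1.0, 1.0])      # initial line intensities
mu = np.array([5.0, 0.5])      # attenuation coefficients in the contaminant, 1/cm (invented)

decayed = 0.5 * I0                        # pure decay: both lines shrink together
x = 0.2                                   # contamination thickness, cm
contaminated = I0 * np.exp(-mu * x)       # Beer-Lambert attenuation through the layer

print(decayed[0] / decayed[1])            # 1.0, line ratio preserved
print(contaminated[0] / contaminated[1])  # ~0.41, the spectrum has hardened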

Leahcim, that sounds intriguing, but it’s not clear to me how to use it. Assuming we can measure its temperature accurately and keep the heat transfer coefficient constant, I think any box will equilibrate at some temperature in the short term, and this temperature will rise in the long term. If we’re not comparing between multiple boxes, how do we exploit that? Develop semi-physical numerical models?

ZenBeam, that was my original predicament, yes.

Yoyodyne, I think that doesn’t work because the window isn’t a source and doesn’t emit its own radiation. What comes through it still traces back to the source inside. The falloff should be 1/r^2 for the distance r to the source inside.

Ah, ok. I thought you meant the window was being contaminated.

I don’t think the modelling is very hard, but suppose we keep the box in a vacuum, and then cover the window with an absorbing plate in thermal contact with the box so that the box/plate system is absorbing 100% of the power output of the source. The temperature of the system will rise until the thermal black-body radiation emitted by the system is equal to the energy output of the source. By measuring that (which is fairly easy to do), you can measure the total output of the source.

Then if you remove the absorbing plate, you’d expect the power output from the window to be that same number, adjusted for the solid angle subtended by the window. If the measured output from the window is smaller than that, the difference is the amount absorbed by the contamination buildup on the window.

This assumes that the time it takes for the system to come to thermal equilibrium is short compared to the half-lives involved, so you can treat the power output as constant for the duration of your experiments.
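Putting made-up numbers on that two-step procedure (emissivity taken as 1, all values invented):

import numpy as np

SIGMA = 5.670e-8                # Stefan-Boltzmann constant, W/(m^2 K^4)

# Step 1: window covered; box + plate absorbs everything and radiates to the
# surroundings. Read off the equilibrium temperature to get the total output.
A_box, T_env = 0.06, 293.0      # box surface area (m^2) and surroundings (K), assumed
T_eq = 296.5                    # measured equilibrium temperature (K), made up
P_total = SIGMA * A_box * (T_eq**4 - T_env**4)    # total source output, W

# Step 2: plate removed; compare the measured window power to the solid-angle share.
omega = 0.10                    # solid angle of the window seen from the source, sr
P_expected = P_total * omega / (4 * np.pi)
P_window = 0.8 * P_expected     # stand-in for the actual measurement
P_absorbed = P_expected - P_window    # power eaten by the contamination layer

print(f"total output {P_total:.3f} W, absorbed by contamination {P_absorbed:.4f} W")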

yoyodyne’s post does suggest, though, that if you measured the radiation at a range of angles, with enough data you could back out the angular variation. The path length through the contamination layer grows like 1/cos(theta), so its attenuation is angle-dependent, while the decay factor is not, and in principle you could separate the two.
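A sketch of that fit (the optical depth and decay factor are invented, and any other angular dependence is ignored): the angular shape pins down the contamination, leaving the overall scale as the decay.

import numpy as np
from scipy.optimize import curve_fit

# Synthetic angular scan at one instant. Decay contributes an angle-independent
# factor; the layer attenuates by exp(-tau / cos(theta)) along the slant path.
tau_true, decay_true = 0.30, 0.70
theta = np.radians(np.linspace(0, 60, 13))
I = decay_true * np.exp(-tau_true / np.cos(theta))

def model(theta, amp, tau):
    return amp * np.exp(-tau / np.cos(theta))

(amp_fit, tau_fit), _ = curve_fit(model, theta, I, p0=(1.0, 0.1))
print(amp_fit, tau_fit)     # recovers 0.70 and 0.30 as separate numbers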

Is this a thought experiment only, or is it based on a real-life system (which might lead to “lateral” solutions)? Do we know the isotope inside?

The temperature method should work. If we approximate the half-life as very long (i.e., much longer than the timescale for thermal equilibration), then the temperature of the box should fall off with the radioactive decay, but should actually increase as the window gets more contaminated.
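A toy time series making that point (all constants invented): the power heating the box, and hence its excess temperature at quasi-static equilibrium, rises at first as contamination traps more radiation, then falls once decay dominates.

import numpy as np

t = np.linspace(0, 600, 7)      # days
alpha = 0.001                   # decay constant, 1/day (invented)
f_window = 0.3                  # fraction of output escaping a clean window (invented)
tau = 0.01 * t                  # contamination optical depth, growing linearly

P_total = np.exp(-alpha * t)                          # normalized source output
P_heating = P_total * (1 - f_window * np.exp(-tau))   # power retained in the box

print(np.round(P_heating, 3))   # rises to a peak near t ~ 100, then decay wins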