Are short-lived radioactive materials more dangerous than long-lived ones?

I read an interesting book about physics a while back (“Physics for Future Presidents”) that stated radioactive materials with very long half-lives emit less radiation (or less-dangerous radiation) than materials with shorter half-lives.

Is this actually true? Or are there some key details being omitted? (the entire book had a definite anti-conservative, pro-industrial bias)

My understanding of it is that materials basically have a limited supply of radioactivity which can either be released slowly over a long time or quickly (i.e. in large dangerous doses) over a short time.

There was an interesting explanation of why even low levels are dangerous (create a nonzero risk of cancer or other illness) and that danger was determined not by the level of damage, but the likelihood of illness as a result of exposure.

Radioactive decay is like rolling down a hill: if you roll quickly, you’re going to get to the bottom sooner.

Similarly with isotopes. The least stable ones give off energy the fastest, and so are faster to achieve a stable state. The isotopes that have half-lives measured in millions of years are much cooler precisely because it takes millions of years for approximately half of any given sample of that isotope to decay into something else.

If you could sit next to a chunk of isotope A or a chunk of isotope B, and they gave off the same kind of radiation, then the more dangerous one would be the one with the shorter lifetime if the two chunks had the same number of nuclei present to start.
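This comparison can be made quantitative: the activity (decays per second) of a sample is A = λN, where the decay constant λ = ln(2)/T½. Here is a minimal sketch with made-up sample sizes and round-number half-lives chosen purely for illustration:

```python
import math

def activity(n_nuclei, half_life_s):
    """Decay rate A = lambda * N, with lambda = ln(2) / half-life."""
    return math.log(2) / half_life_s * n_nuclei

SECONDS_PER_YEAR = 365.25 * 86400
N = 1e20  # same number of nuclei in each chunk (illustrative)

a_short = activity(N, 5.0 * SECONDS_PER_YEAR)    # ~5-year half-life
a_long = activity(N, 7.0e8 * SECONDS_PER_YEAR)   # ~700-million-year half-life

# The short-lived chunk decays (and so irradiates you) far faster:
print(a_short / a_long)  # ratio of activities = 7.0e8 / 5.0 = 1.4e8
```

The ratio of activities is just the inverse ratio of the half-lives, which is why, all else equal, the shorter-lived chunk is the hotter one.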

In practice, though, this isn’t how things go, and lifetime doesn’t tell you as much as you’d think. The potential mechanism of exposure, the biological behavior of the element, the radiation type (alpha, beta, gamma), the behavior of the decay daughters, the typical enrichment levels, and (finally) the lifetime all have a say in the danger. If you pick two isotopes at random and compare how dangerous they are, one of the non-lifetime factors will likely dominate the story.

“More dangerous” has to take into account many more factors than just the life of the isotopes - things like the type of radiation (alpha, beta, neutron) and the exposed areas of the body (dust on your skin is less dangerous than dust in your lungs, for example). And many radioactive elements have decay products that are themselves radioactive, so you may have a chain of radioactive events with their own signatures and half-lives.

However, the statement made by the book is basically true. The half-life represents how long it takes half the sample to decay, and it’s the process of decay that releases radiation. If fewer atoms are decaying in a given period of time, then there’s less radiation and (all other factors being equal) that material is less dangerous.

To a certain extent, this is true. However, at some point, as the half-life of a given radionuclide gets shorter, it decays so fast that it does not persist in the environment, and so becomes a non-issue. Half-lives of such isotopes are measured in minutes, seconds, or even shorter. While the isotope may emit quite a lot of radioactivity as it decays, total exposure is limited if the isotope completely decays in a short period of time.
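The “it decays away and becomes a non-issue” point follows directly from the decay law: after time t, the fraction of the original nuclei that have decayed is 1 − 2^(−t/T½). A small sketch (the times and half-lives are illustrative choices, not from the answer above):

```python
def fraction_decayed(t, half_life):
    # Fraction of the original nuclei that have decayed after time t
    # (t and half_life in the same units).
    return 1.0 - 2.0 ** (-t / half_life)

# A short-lived isotope is essentially gone after a few half-lives,
# so the total exposure it can ever deliver is bounded:
f_short = fraction_decayed(t=10, half_life=1)        # 10 half-lives: ~99.9% gone
# A very long-lived isotope has barely decayed on human timescales:
f_long = fraction_decayed(t=100, half_life=7.04e8)   # ~1e-7 of it gone in a century
```

So a sample of a short-lived isotope front-loads all of its decays into a brief window, after which there is essentially nothing left to decay.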

On the other hand, radionuclides with very long half-lives, like U-235 (half-life of 704 million years) and U-238 (half-life of 4.47 billion years), are actually fairly stable, relatively speaking, and do not emit much radioactivity.

The most dangerous radionuclides are those with half-lives that are long enough to persist in the environment, but still relatively short (especially when compared to long-lived isotopes such as U-235 or U-238). One example is Co-60 (half-life of 5.27 years), an isotope of cobalt.
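The Co-60 vs. uranium contrast can be put in numbers via specific activity, the decay rate per gram: A/m = ln(2)·N_A / (T½·M), where N_A is Avogadro’s number and M the molar mass. A rough sketch (round-number constants, so the results are order-of-magnitude):

```python
import math

N_A = 6.022e23           # Avogadro's number, atoms per mole
SECONDS_PER_YEAR = 3.156e7

def specific_activity(half_life_yr, molar_mass_g):
    """Decays per second per gram: ln(2) * N_A / (T_half * M)."""
    lam = math.log(2) / (half_life_yr * SECONDS_PER_YEAR)  # decay constant, 1/s
    return lam * N_A / molar_mass_g                        # Bq per gram

a_co60 = specific_activity(5.27, 60)      # roughly 4e13 Bq/g
a_u238 = specific_activity(4.47e9, 238)   # roughly 1.2e4 Bq/g
ratio = a_co60 / a_u238                   # Co-60 is ~3e9 times "hotter" per gram
```

Gram for gram, Co-60 is billions of times more active than U-238, yet its 5.27-year half-life is long enough for it to persist in the environment for decades; that combination is what makes it one of the nastier contaminants.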