We’ve something of a major family dispute about this and I’ve been chosen to decide the matter. Unfortunately neither side is willing to pay the bribe I’m asking for, so that just leaves the StraightDope to answer.
Is the rate of nuclear decay affected by temperature?
One side insists that the decay will occur at exactly the same rate whether at 3 K or at 10,000 K … the other side says the 10,000 K sample will decay faster … this has already come close to a fist-fight … so please …
In general, no, nuclear decay is not affected by temperature.
However, there is some evidence that some decay modes have a slight temperature dependence:
For most elements, no, not at normal temperatures. If you get really hot, up to around 10^7 K IIRC, then things are energetic enough that nuclei are forced to interact and you start having fusion events, but that’s not really the same thing as affecting the decay rate. Also, if you get elements that decay by electron capture hot enough that they’re ionized, you’ll inhibit or block electron capture, because there will be fewer or no electrons in the atom for the nucleus to capture. That’s the opposite of what your people are betting on, though, and it isn’t directly a temperature effect.
Yes, it is affected, but only ever so slightly unless you’re at stupidly high temperatures (10,000 K doesn’t qualify), and it would actually decay slower at higher temperatures. So everyone’s wrong!
Suppose you have a sample of uranium. Each uranium atom has a particular chance of decaying per second; it acts like it has an internal clock attached, and for every millisecond that goes by on that clock, it has a particular chance of decaying. If you then heat the sample up, the uranium atoms move faster; and we know from relativity that a clock in motion runs more slowly than one at rest (more or less). This means that for every second that ticks by on the clocks in your lab, a little less time will tick by on the “internal clocks” of the uranium atoms. Thus, the decay rate that you observe in the lab will be slightly less than the decay rate you would observe if all the uranium atoms were at rest. The higher the temperature of the sample, the greater this effect, and the further the decay rate is reduced.
Now, this is a tiny, tiny effect. For a sample of uranium atoms at 10,000 K, the average speed of an atom is about 500 m/s. The factor by which the “internal clocks” of the uranium atoms will run slow, if they’re moving at this speed, is then about 1 - 10[sup]-12[/sup]; in other words, the decay rate would be reduced by about one part in 10[sup]12[/sup]. The correction is roughly proportional to the temperature, so if you get up to about 10[sup]14[/sup] kelvin you might start to notice the effect. But by that temperature your uranium atoms are basically going to be a plasma anyway.
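If anyone wants to check the arithmetic, here’s a quick back-of-the-envelope sketch in Python (my own, not from any reference): it estimates the thermal speed of a U-238 atom at 10,000 K and the corresponding relativistic slowdown. Depending on which average speed you pick you get anywhere from a few hundred to about a thousand m/s, but the slowdown comes out at order 10[sup]-12[/sup] either way.
[code]
import math

k_B = 1.380649e-23     # Boltzmann constant, J/K
amu = 1.66053907e-27   # atomic mass unit, kg
c   = 2.99792458e8     # speed of light, m/s

T = 1.0e4              # temperature, K
m = 238 * amu          # mass of a U-238 atom, kg

# Mean speed from the Maxwell-Boltzmann distribution; other common averages
# (most probable, rms) differ only by factors of order one.
v = math.sqrt(8 * k_B * T / (math.pi * m))

# For v << c, the exact Lorentz factor 1/sqrt(1 - v^2/c^2) exceeds 1 by
# roughly (1/2)(v/c)^2, which is the fractional slowdown of the "internal clock".
slowdown = 0.5 * (v / c) ** 2

print(f"thermal speed  ~ {v:.0f} m/s")
print(f"clock slowdown ~ {slowdown:.1e}  (order one part in 10^12)")
[/code]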
This effect, however, is observed in particle accelerators routinely. Unstable particles such as muons can be “stored” for relatively long periods of time by accelerating them up to near the speed of light. They decay with a much longer half-life when they’re moving around at relativistic speeds than when they’re at rest.
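Just to put a number on that (my own illustrative figures, not anything specific to a particular machine): a muon at rest lives about 2.2 microseconds on average, so at a Lorentz factor of, say, 30 the lab sees it survive for roughly 65 microseconds.
[code]
# Lab-frame mean lifetime of a muon moving with Lorentz factor gamma.
# The rest-frame lifetime is well measured; the gammas below are just examples.
MUON_LIFETIME_REST = 2.197e-6   # seconds

def lab_lifetime(gamma):
    return gamma * MUON_LIFETIME_REST

for gamma in (1, 10, 30, 100):
    print(f"gamma = {gamma:>3}: lab-frame lifetime ~ {lab_lifetime(gamma) * 1e6:6.1f} microseconds")
[/code]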
My background is in Biology, but I have worked with radioactive pharmaceuticals for 20 years. I always understood that half-life was constant, and indeed when we calculate these things we do not correct for temperature. So my answer would be that no, temperature does not affect half life.
However, there is some evidence that there could be a small impact of temperature on decay rates:
So I would argue that conventionally, the NOs are correct… But if I were looking to weasel I might claim unsure…
Does the theory that radioactive decay is temperature sensitive come from creationists (to explain away or introduce uncertainty in radio-isotope dating)?
No, chemists, cats and allied riff-raff … the creationists in the family rely on carbon-dating dinosaur fossils to make their point … which is another good one, but these folks won’t be picking the rest home where I’ll live out my final days …
Is there any written document from 8,000 years ago clearly stating that rocks existed … didn’t think so … [giggle]
Here is an article discussing several experiments that did purport to show temperature sensitivity (both speeding and slowing) for polonium-210 and gold-198. Another group showed small differences in the decay rate of beryllium-7, which depended on which metal it was in.
It concludes with an account of very-high-precision measurements of the decay of ruthenium-97 (by electron capture) and of ruthenium-103 and rhodium-105 (both by beta(-) emission), which failed to show any temperature-related difference at all.
They raise a question about the possibility of publication bias against null results.
BTW- my quick perusal of Google did not seem to imply a creationism connection to me.
However, we are talking about stripping all the electrons off a noble gas, so effectively a very, very high temperature. This is a result that some creationist literature uses to attempt a challenge to the nuclear decay timeline.
I’d say you’re well outside of ‘plasma’ range; the core of a neutron star right after a supernova ‘only’ has a temperature of about 10^11 K, and you’re talking about a thousand times hotter than that. That’s not a plasma you could contain in any kind of apparatus we could build today; it’s a super-high-energy chunk of matter that you’d need a compact solar mass of gravity to hold in one place without it scattering all around the neighborhood, and it’s energetic enough that nuclear collisions would be happening regularly. It’s definitely not a temperature that you could meaningfully bring a lump of matter to and observe (particle accelerators move single particles, not a quantity you could discern on a human-noticeable scale).
No, because the bosons are basically the ones with an even number of protons and an even number of neutrons, and therefore are the stable isotopes (the primordial ones…) of the elements, or the remarkable odd-odd nuclei, remarkable in that they are stable at all… and then more so because they are very, very stable where other odd nuclei are very unstable.
Basically the bosons are either very, very stable or very, very unstable.
Your radioactive isotopes with useful half-lives, or those artificial elements (americium and the like), are not bosons that would be in a Bose-Einstein condensate.
You importantly forgot electrons. Isotopes that lend themselves to BECs have an even number for the sum of all three particles, which for neutral atoms means an even number of neutrons but any number of protons (odd or even).
And even neutral atoms which are fermions can still sometimes form Bose-Einstein condensates, by pairing up the atoms.
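If it helps to see that counting rule concretely, here’s a tiny sketch (the example isotopes are my own picks, not anything from this thread): for a neutral atom, add up protons, neutrons, and electrons; an even total makes a composite boson, an odd total makes a fermion.
[code]
# A neutral atom is a composite boson if protons + neutrons + electrons is even.
# Since electrons = protons for a neutral atom, this reduces to "even number of neutrons".
def is_composite_boson(protons, neutrons):
    electrons = protons   # neutral atom
    return (protons + neutrons + electrons) % 2 == 0

examples = {
    "Rb-87 (used in the first atomic BEC)": (37, 50),
    "Li-6  (a fermionic isotope)": (3, 3),
    "K-40  (a fermionic isotope)": (19, 21),
}

for name, (z, n) in examples.items():
    kind = "boson" if is_composite_boson(z, n) else "fermion"
    print(f"{name}: {kind}")
[/code]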
Back to the OP, another way you could change decay rates via bulk properties like temperature would be if it’s a decay mode which can be induced, such as fission. The products (especially neutrons) of one fission event can trigger other nuclei to fiss, and so how closely the nuclei are packed and how they’re arranged could change the total event rate (quite notably so, in the case of a supercritical arrangement).
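To make that concrete, here’s a purely toy sketch (my own made-up numbers, not a model of any real assembly): if each fission induces, on average, k further fissions, the number of events per generation grows or dies off geometrically, which is why the packing and arrangement matter so much.
[code]
# Toy chain-reaction arithmetic: each fission induces k further fissions on average.
# k depends on how the material is arranged; k > 1 is the supercritical case above.
def events_per_generation(k, start=1000, generations=10):
    counts = [float(start)]
    for _ in range(generations):
        counts.append(counts[-1] * k)
    return counts

for k in (0.9, 1.0, 1.1):
    final = events_per_generation(k)[-1]
    print(f"k = {k}: 1000 events -> ~{final:.0f} after 10 generations")
[/code]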
Pronounced with an “ess” or “esh?” The latter is logical but sounds silly, to my ears, but who knows how physicists talk when they let their hair down…
Is it a thing, or does no one except you, being lazy in this one post, use or say “fiss” as a verb?