Hi,
I was wondering if it is possible for the rates of radioactive decay to vary… e.g. if the speed of light changed or something (like some creationists think, though I’m not a creationist)
Also, can radioactive decay be linear for some isotopes (rather than always exponential)?
I think radioactive decay is weird… individual atoms either decay or don’t, yet overall half of them decay during the half-life. Is it related to quantum physics? e.g. in the Many-Worlds Interpretation, would the alternate histories involve every possible combination of atoms decaying? Do the isotopes have a “counter” that you can measure to see when one is about to decay?
Thanks.
I’ll just try to answer these. A radioactive atom doesn’t have a “counter” or “age” that indicates whether it’s likely to decay soon. It’s like a coin toss - if you get 10 heads in a row, it doesn’t mean the next one is likely to be tails. The probability of the next flip being heads is still 1/2, the same as for a coin that has never been flipped. If you have two Am-241 atoms, one created in a supernova 10 billion years ago and another created in a nuclear reactor 10 minutes ago, they’re both equally likely to decay within the next minute.
This lack of memory necessarily leads to exponential decay. If a certain kind of atom has a half-life of 1 year, each atom has a 50% chance of decaying within the next year. So if you have a large number of such atoms, about half of them will decay in the first year. At the end of that year, each surviving atom still has a 50% chance of decaying in the following year, so half of those decay, and so on. Linear decay would imply that atoms have a “memory” or “lifespan” - that older atoms are more likely to decay than newer ones. But there is no evidence that atoms carry any such counter or clock to record their age.
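If you want to see the halving argument written out, here’s a minimal Python sketch (the 1-year half-life is just the example figure from above, not a real isotope):

```python
# A minimal sketch of the halving argument: with half-life T, the fraction
# of atoms surviving after time t is (1/2)**(t/T). Nothing depends on the
# atoms' age, only on the elapsed time, so the curve is exponential.
def surviving_fraction(t, half_life=1.0):
    return 0.5 ** (t / half_life)

for year in range(6):
    print(year, surviving_fraction(year))
# prints 1.0, 0.5, 0.25, 0.125, 0.0625, 0.03125 - halving every year
```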
Exponential behavior is extremely common in nature because the world is full of systems without memory. For example, consider light passing through a smoke-filled room, and say half of it is absorbed within the first foot. Of the light that makes it 1 foot into the room, half is absorbed in the second foot, half of the remainder in the third foot, and so on. So the brightness of the beam decays exponentially with distance.
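Same idea in code, if it helps - the “half absorbed per foot” figure is just the illustrative assumption from the paragraph above:

```python
# Light attenuation in the smoke-filled room: halving per foot is the
# same thing as exponential decay with rate ln(2) per foot.
import math

def brightness(distance_ft, initial=1.0):
    return initial * math.exp(-math.log(2) * distance_ft)

for ft in range(5):
    print(f"{ft} ft: {brightness(ft):.4f} of the original brightness")
```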
I don’t know about the first question, but I’ll try to explain the rest. Atoms decay completely at random; it is impossible to predict when any particular decay will happen, so an atom that is one second from decaying looks exactly the same as an atom of the same kind that won’t decay for a million years.
It is this randomness that means that, with a large enough number of atoms, the substance will always decay exponentially. The reason is this: suppose a particular kind of atom has a 50% chance of decaying within a year. If 1,000,000 of these atoms were observed over time, you would expect about 500,000 to decay in the first year. That leaves 500,000 atoms of the original isotope, and since radioactive decay is completely random, about half of those would decay over the next year, then half of the remainder the year after, until eventually, by chance, the last atom decays.
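You can run the million-atom thought experiment directly; here’s a rough Monte Carlo sketch of it:

```python
# Each surviving atom independently has a 50% chance of decaying each year.
import random

remaining = 1_000_000
year = 0
while remaining > 0:
    year += 1
    remaining = sum(1 for _ in range(remaining) if random.random() >= 0.5)
    print(f"year {year}: about {remaining} atoms left")
# counts land near 500000, 250000, 125000, ... until the last atom goes
```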
Hope this helps.
Relativistic speeds will slow down decay, just as they slow down everything else. Whether variations in the speed of light would change decay rates is probably unknown and unknowable, since such a question can be answered only in terms of (current) theory, and a change in the speed of light is inconsistent with that theory. And decay is definitely a quantum phenomenon. There are “proofs”, not entirely convincing, that there can be no “hidden variables” - unmeasurable quantities that, even if you knew them, would tell you which atom of uranium would be the next to decay and when. This is all part of a search for some deeper level of reality at which the universe becomes causal. Weird, isn’t it?
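For the relativistic part, the slowdown is ordinary time dilation; a quick sketch of the factor involved (the 0.9c speed is just an arbitrary example, not something from the thread):

```python
# A moving atom's half-life, seen from the lab frame, is stretched by
# gamma = 1/sqrt(1 - (v/c)^2).
import math

def dilated_half_life(half_life, v_over_c):
    gamma = 1.0 / math.sqrt(1.0 - v_over_c ** 2)
    return gamma * half_life

print(dilated_half_life(1.0, 0.9))  # ~2.29 years for a 1-year half-life
```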
Thanks for your replies everyone.
Exponential decay is easy to show for yourself. Take a box and put, say, 100 pennies in it. Shake it, then pull out all the pennies with heads. Count them, write the number down. Shake the box again, and again pull out the heads. Count them, write it down. Repeat until you get the last penny out (remember, if no pennies are heads, then that counts too!).
If you plot the results (number of heads removed, or pennies remaining, versus shake number) you get an exponential decay, and in fact this is just like half-lives. The more pennies you start with, the smoother the curve (and the longer it takes to decay them all!). There’s a quick simulation of this below if you don’t have a box of pennies handy.
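```python
# Simulation of the penny-box experiment: shake the box (flip every
# remaining penny), pull out the heads, record how many pennies are left.
import random

pennies = 100
counts = [pennies]
while pennies > 0:
    heads = sum(1 for _ in range(pennies) if random.random() < 0.5)
    pennies -= heads          # pull out all the heads
    counts.append(pennies)    # a shake with zero heads still counts
print(counts)  # e.g. [100, 47, 25, 14, 6, 2, 1, 0] - an exponential decay
```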
Wow. I love that. It’s always so hard to get people to really grok that much of chemistry/physics is just probability. I’m definitely going to use this example the next time I’m trying to explain nuclear decay.
Here’s a page with an applet that shows atoms of isotopes (represented by little yellow balls) decaying, and charts the result:
http://www.colorado.edu/physics/2000/isotopes/radioactive_decay3.html
To answer the first question in the OP: Not too long ago a group of astrophysicists found evidence that the fine structure constant (one of the fundamental constants of the universe) had changed by one part in 100 000 over the last ten billion years. Here’s a link that explains more about it:
http://www.astro.psu.edu/users/cwc/fsc.html
There are also a few links further down the page to information on what the fine structure constant actually is & why it matters.
There have been several experiments recently that appear to show that various constants - including the speed of light - may have been different in the very early universe. All of these are still controversial and not accepted by all scientists.
What this means, why it is important, and how it all fits together is explained very nicely in The Constants of Nature: From Alpha to Omega–The Numbers That Encode the Deepest Secrets of the Universe by John D. Barrow - a must-read if you are not a scientist and want to understand this.
Obligatory scientific vagueification: nobody found that the fine structure constant has changed. What that group found is that it may have changed. It’s not a definite result, by a long shot, and the observations are still consistent, within experimental error, with the possibility that it has not changed.