Well, let me explain. First, I found this article about how scientists at Purdue and Stanford used decay data from the Brookhaven lab to reach their conclusion. I also found a second article from Stanford U. (not a paper, mind you), featuring Professor Sturrock’s ideas.
They (Fischbach from Purdue and Sturrock from Stanford) think the Sun is emitting particles (maybe neutrinos, or maybe a new particle) that may be changing the half-lives of some radioactive isotopes. They say it happens with a regular period that coincides with the rotation of the Sun’s inner core. The implications could change a lot of things! (Think carbon-dating artifacts.) This sounds so contrary to what we were taught in basic physics that I wanted to get a scientist’s opinion on the subject. As hard as it is to believe, I am not a scientist!
(I see this as being part of a sci-fi-type setup in a world where traditional physics no longer works.) I am a writer. :o My primary genre is sci-fi.
Would any of you know about this new particle from the Sun affecting radioactive isotope half-lives?
Here are the links, followed by excerpts for each:
Here’s the paper on arXiv. You can see the correlation in period between the decay rate and the Earth-Sun distance for yourself in the plots. One problem is that the decay-rate variations lag the distance curve by a couple of months. Not sure how to explain that if it’s truly the Earth-Sun distance that matters.
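For what it’s worth, that kind of lag is something you can estimate yourself if you pull the two time series out of the plots. Here’s a minimal sketch with made-up sinusoidal stand-ins for the decay counts and the 1/R² curve – the real check would use the actual BNL data and an ephemeris:

```python
import numpy as np

# Synthetic stand-ins: one sample per day for ~8 years.
rng = np.random.default_rng(0)
t = np.arange(0, 8 * 365)                                    # days
inv_r2 = 1 + 0.03 * np.cos(2 * np.pi * t / 365.25)           # ~1/R^2, peaks at perihelion
decay = 1 + 0.003 * np.cos(2 * np.pi * (t - 60) / 365.25)    # decay rate, lagging ~2 months
decay += rng.normal(0, 0.001, t.size)                        # measurement noise

# Cross-correlate the mean-subtracted series and find the lag (in days)
# at which they line up best.
a = inv_r2 - inv_r2.mean()
b = decay - decay.mean()
xcorr = np.correlate(b, a, mode="full")
lags = np.arange(-t.size + 1, t.size)
print("best-fit lag:", lags[np.argmax(xcorr)], "days")       # ~60 days here
```

(For a nearly pure annual signal the cross-correlation also has peaks at ±1 year offsets, so you’d want to eyeball the result against the raw plots.)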
ETA: I actually already had that paper open on my laptop. There’s a completely different experiment looking to detect dark matter that shows a seasonal variation with the Earth’s speed around the galaxy*. I was looking to see whether the above variations correlated with the Earth’s speed as in the dark matter experiment (they didn’t).
*Actually, there are a couple that show variation, and a couple others that don’t. So it’s kind of up in the air right now.
Photons take millennia to get from the centre of the Sun to the surface, and then about 8 minutes to get to the Earth. Neutrinos pay almost no attention to matter and so only take the 8 minutes.
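The scale of that difference is easy to ballpark. A back-of-the-envelope in code, assuming a photon mean free path of about a centimetre in the solar interior (the real value varies enormously with depth, so treat this as order-of-magnitude only – published figures run from thousands to over a hundred thousand years):

```python
C = 3.0e8       # speed of light, m/s
R_SUN = 7.0e8   # solar radius, m
AU = 1.5e11     # Earth-Sun distance, m
MFP = 0.01      # assumed photon mean free path, m (order of magnitude)

# Random walk: diffusing a distance R takes ~(R/l)^2 steps of length l,
# i.e. a total path length of ~R^2/l.
photon_escape_s = R_SUN**2 / (MFP * C)
print(f"photon escape time ~ {photon_escape_s / 3.156e7:,.0f} years")   # ~5,000 years

# A neutrino barely interacts, so it just crosses 1 AU at ~c.
print(f"Sun-to-Earth transit ~ {AU / C / 60:.1f} minutes")              # ~8.3 minutes
```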
Accounting for a delay of a couple of months (whether by citing a particular region of emission or particular particle properties) takes some very special pleading, which I would instinctively distrust. Something else is going on here, and experimental error would be my #1 suspect. They’re not accounting for something in their environment.
If I remember correctly, Jenkins et al. only looked at other people’s data to come to their conclusion – and if you’re not directly involved with an experiment, it’s hard to quantify its systematic errors. It’s entirely possible that some environmental change affected measurement quality – could be something as simple as the AC running during summer introducing a source of noise – which accounts for the apparent effect. Another investigation (PDF) into long-term decay rates apparently hasn’t found any effect.
I’m going from third-hand discussion and (probably bad) memory here, but one possibility that was suggested was that perhaps only beta emitters are affected, and that the lag is due to beta-emitter buildup as decay products of a primarily alpha-emitting isotope – so you only begin to see a variation once a significant number of beta emitters is present. That would also account for the Cassini probe – which uses plutonium-238, an alpha emitter, as a fuel source – not seeing any change in decay rate with its distance from the sun.
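That buildup is easy to sanity-check with the standard two-member decay-chain (Bateman) solution. A toy sketch with made-up numbers – the parent half-life is Pu-238-like, but the short-lived beta daughter is purely hypothetical (Pu-238’s real daughter, U-234, is another alpha emitter):

```python
import numpy as np

YEAR = 365.25                       # days
lam1 = np.log(2) / (88 * YEAR)      # alpha-emitting parent, Pu-238-like 88 yr half-life
lam2 = np.log(2) / (0.5 * YEAR)     # hypothetical beta daughter, 6 month half-life

t = np.linspace(0, 3 * YEAR, 7)
# Bateman solution for a pure parent sample at t = 0:
N1 = np.exp(-lam1 * t)
N2 = lam1 / (lam2 - lam1) * (np.exp(-lam1 * t) - np.exp(-lam2 * t))

for ti, alpha, beta in zip(t, lam1 * N1, lam2 * N2):
    # Any solar modulation of the beta channel is weighted by the beta
    # share of the total counts, which starts at zero and builds up.
    print(f"t = {ti / YEAR:.1f} yr: beta share of counts = {beta / (alpha + beta):.2f}")
```

The beta share climbs from 0 toward 0.5 over a few daughter half-lives, so a modulation that lives only in the beta channel would indeed fade in rather than being there from day one.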
Another advantage of this idea is that we already know beta decay rates sometimes vary with the environment (quick googling brought up this, but I’m not sure it’s what I was thinking of).
Nice, I was thinking along similar lines, thanks for checking.
Talk of this first started at least a few years ago, and the verdict then was also that there weren’t enough data, or data of high enough quality, to justify any conclusion. Personally, my guess (though I haven’t really looked into it in any detail) is that something in the detectors is responding to environmental temperature.
According to the following paper, a re-analysis of the radioactive decay data shows a correlation not with the Earth-Sun distance but with the Sun’s rotation period:
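If you want to poke at that kind of claim yourself, the usual tool for pulling a period out of unevenly sampled measurements is a Lomb-Scargle periodogram. A minimal sketch on synthetic data with a ~33-day signal baked in (roughly the solar-rotation ballpark such re-analyses discuss – the real thing would of course use the published decay counts):

```python
import numpy as np
from scipy.signal import lombscargle

# Synthetic stand-in: irregularly sampled measurements with a weak
# 33-day periodicity plus noise.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 2000, 500))               # days, uneven sampling
y = 1 + 0.002 * np.sin(2 * np.pi * t / 33.0)
y += rng.normal(0, 0.001, t.size)

periods = np.linspace(5, 100, 2000)                  # days
freqs = 2 * np.pi / periods                          # lombscargle wants angular frequency
power = lombscargle(t, y - y.mean(), freqs)
print(f"strongest period: {periods[np.argmax(power)]:.1f} days")   # ~33
```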
As an aside, scientists doing research on gravitational waves just entirely throw out all data at certain frequencies, including 1/year, 1/day, and 60 Hz (or whatever AC frequency is used in their locale). There are just so many potential sources of noise at those frequencies that it’s not worth it to try to track them all down and account for them. And this is coming from scientists who literally do track experimental error from the gravitational fields of tumbleweeds blowing past.
I think they might even throw out 1/week as well, since so much human activity has that frequency.
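For flavor, here’s roughly what “throw that frequency out” looks like in practice for a mains line: a notch filter at the known nuisance frequency. A sketch on stand-in data with a made-up sample rate (the very-low-frequency lines like 1/day and 1/year are, as I understand it, usually just excluded as frequency bins rather than filtered):

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

FS = 1000.0                                              # assumed sample rate, Hz
t = np.arange(0, 10, 1 / FS)
data = np.random.default_rng(1).normal(0, 1, t.size)     # stand-in detector channel
data += 5 * np.sin(2 * np.pi * 60 * t)                   # mains pickup at 60 Hz

# Excise the known nuisance line rather than try to model its source.
b, a = iirnotch(w0=60, Q=30, fs=FS)
clean = filtfilt(b, a, data)                             # zero-phase filtering
print(f"rms before: {data.std():.2f}, after: {clean.std():.2f}")
```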