Yes, electromagnetism would indeed have to be revised in the case of a massive photon, but the change would have to be very small in order not to have been noticed up to now (and experimental bounds on photon mass are too tight to allow for a massive photon as the explanation of OPERA’s results).
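To put a rough number on “very small”: for a hypothetical photon of mass m_γ, the standard relativistic dispersion relation (nothing OPERA-specific, just E² = p²c² + m²c⁴) gives a speed that depends on energy:

v/c = \sqrt{1 - \left(\frac{m_\gamma c^2}{E}\right)^2} \approx 1 - \frac{1}{2}\left(\frac{m_\gamma c^2}{E}\right)^2

So for light itself to lag the true limiting speed by the ~2×10⁻⁵ that OPERA reports, an optical photon of a couple of eV would need m_γc² of order 10⁻² eV – something like sixteen orders of magnitude above the current laboratory bound of roughly 10⁻¹⁸ eV. Back-of-the-envelope only, but it shows how tight the constraint is.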
No love for the “fuzzy light cone” hypothesis, in which there’s a finite chance of measuring a speed slightly greater than c?
I don’t know, you’d have to explain what it is. From what little I can gather it seems to me that ‘fuzzy lightcones’ are just a way of explaining quantum phenomena which you would’ve expected to be accounted for anyway.
This article is one of the best I’ve seen on the most likely explanation:
A lot of people probably don’t like it, and “experimental/systematic error” as an explanation seems to many like a total cop-out. However, if you read the arXiv paper, you cannot help but be struck by the sheer number of measurements that need to be very accurate, before you even start running the experiment, for the experiment to have any validity.
One of the commenters on the Guardian article makes a very good point: the neutrinos’ speed is only a very small fraction above the speed of light. If what we’re witnessing is a wholesale violation of relativity, as opposed to an error, why is the speed difference so small? Why aren’t we measuring neutrinos travelling at, say, 2 or 3 times the speed of light? The difference in speed is big enough to be a deal-breaker for a lot of established physics, but on the other hand it is suspiciously small.
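For a sense of scale (I’m quoting OPERA’s headline numbers from memory, so treat them as approximate): the baseline is about 730 km, which light covers in

t \approx \frac{7.3 \times 10^{5}\,\mathrm{m}}{3.0 \times 10^{8}\,\mathrm{m/s}} \approx 2.4\,\mathrm{ms}

and the neutrinos were reported to arrive roughly 60 ns early. That works out to (v - c)/c \approx 60\,\mathrm{ns} / 2.4\,\mathrm{ms} \approx 2.5 \times 10^{-5}, i.e. about 1.00002c – or, put another way, an effect equivalent to getting the 730 km distance wrong by only about 18 m.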
IANA physicist, but I think you’ll find that physicists are quite open-minded when it comes to changes to well-established paradigms in the subject. However, I think most sensible physicists strongly suspect this is the result of experimental error.
The critical words here are “up to now”.
As posted above in post #121, by DSeid:
[QUOTE=F.K. Richtmeyer]
The whole history of physics proves that a new discovery is quite likely lurking at the next decimal place.
[/QUOTE]
Well, I think the critical words are rather ‘experimental bounds on photon mass are too tight to allow for a massive photon as the explanation of OPERA’s results’, at least wrt this particular issue.
The quote is a true and good one, but one has to be careful: there can’t be just anything lurking at that next decimal place; instead, it has to be consistent with the whole intricate network of mutual relations and dependencies that comprises modern science – tweaking things here by just a tiny little ‘innocent’ amount may well lead to unacceptable (as in, excluded by experiment, not morally abhorrent or philosophically displeasing) consequences way over there.
This – not dogmatism or stuffy unwillingness to accept new ideas – is the main reason why most scientists regard the OPERA results with some scepticism: there are very few ways one can include them into a framework consistent with other, very well established results.
Of course, there’s always the possibility that somebody may come up with something nobody’s thought of before, and people are already hot on the trail of such ideas (I think there are already something like a dozen papers out discussing OPERA’s results from various theoretical angles), but all of these ideas must fit within the tight constraints of established knowledge.
The critical issue is that a new theory of the nature of the photon must not only explain this current observation, but all previous observations as well. A new theory, such as a significant photon mass, that explains why these neutrinos were observed to go faster than light, but also predicts that electromagnetism should have a finite range, contradicting a large number of previous observations, is not more parsimonious than a theory of “the photon is massless, and the CERN people borked something up”.
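(For anyone wondering where the “finite range” comes in: in the standard massive-photon theory – Proca electrodynamics – the Coulomb potential picks up an exponential cut-off,

V(r) \propto \frac{e^{-r/\lambda}}{r}, \qquad \lambda = \frac{\hbar}{m_\gamma c}

so static electric and magnetic fields die off beyond a range λ set by the photon mass. Rough numbers only: a photon mass big enough to matter here, of order 10⁻² eV, would give λ of a few tens of microns, whereas we observe planetary-scale magnetic fields behaving exactly as massless electromagnetism says they should.)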
I understand your point – one of the hallmarks of pseudoscience is that results are barely detectable – but I don’t see why you would expect a 2 or 3 times c speed. Is there some reason why that is a logical expectation?
The point being is that the violation is big enough that, from the current theoretical framework, it might as well be the same as measuring 2 or 3c. Which begs the question: why is the violation only 1.00002c? Why is the violation big in some ways, but small in others? A big violation would be far easier to determine not to be the result of an error in the experiment. One very good explanation is simply that it’s the result of error.
What’s a point being? ;)
A point with no mass.
I completely agree – and that’s the major problem here – but if you accept the possibility that c can be exceeded without any other reservations, 1.00002c is as logical as 2.0000c.
Yep. And this situation is different from, say, how magic free energy devices always produce barely detectable amounts of energy rather than enough energy to vaporize the planet.
Firstly because a violation of c was not the desired or predicted result.
And secondly because IIRC there is some symmetry in the equations between tardyons and tachyons. I think by some models tachyons would experience a mirror image of relativistic effects, such that they would require infinite energy to be slowed to c (rough form sketched below). Perhaps the velocity with which particles are accelerated at CERN provides sufficient energy to decelerate the resulting tachyons to near c.
IANAPhysicist, but I am anticipating a Nobel prize for the preceding paragraph.
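In case it helps anyone, the “mirror image” being gestured at is usually written (standard textbook tachyon kinematics, with μ the magnitude of the tachyon’s proper mass):

E = \frac{\mu c^2}{\sqrt{v^2/c^2 - 1}}

which blows up as v → c from above and falls toward zero as v → ∞. So a tachyon would lose energy by speeding up, and slowing one all the way down to c would take infinite energy, just as accelerating an ordinary particle up to c would.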
But would a measurement in the range 1c < v < 1.00002c be as likely as a measurement in the range of, say, 1.5c < v < 10c, or v > 1.5c, if we accept that c can be exceeded without any reservations?
Now, there might be a theoretical framework which explains why, despite undermining our current framework in a major way, neutrinos would only travel at a small fraction above what we formerly believed to be the speed limit, or why perhaps they are harder to observe when travelling at larger multiples of c. On the other hand, if this were the result of a small systematic error, we would expect to see a small but possibly significant violation, just such as the one we observe.
Perhaps superluminal neutrinos with velocities very close to c are much more likely to interact with tardyons, and so were detected, but those with very high velocities were never detected because their characteristics make them much less likely to interact. (They might be afraid that Chronos is watching.)
Tris
To be clear, whatever this is, this is not pseudoscience. This is a large group of real, respected scientists, doing the best science they can, using all scientific tools available. Now, it may well be that the best they can do isn’t good enough, but they’re honestly applying the correct methods.
Agreed.
Possibly, but then again, even though the basic framework of relativity doesn’t have a problem with tachyons, an explanation in terms of tachyonic neutrinos causes so many problems that the whole relativistic framework (and thus the concept of tachyons and tardyons) might have to be thrown out. But in that case, how can we explain all the other experiments which confirm SR to such a high degree?
I’m not rejecting the result because I don’t like it, but I just think that an error of some sort in the measurement is still by far the prime candidate in terms of explanations.
[prepare-to-abort hijack]
I have trouble with this conclusion, based on synaptic conduction time from the optic nerve to the visual cortex. I will take this off-line, with Chronos’s permission, or start a new thread if stuff gets interesting.
[abort hijack]
If photons have mass, does that mean that the speed of light is not always constant? Isn’t that one of the rules of relativity?
Can we slow down light?
Yes. But ask someone else to explain it.
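The short version, as I understand it (someone correct me if I’m garbling this): the c in relativity is the invariant limiting speed, which light attains only in vacuum and only if the photon really is massless. In a transparent medium, light propagates at

v = \frac{c}{n}

where n is the refractive index (about 1.33 for water, about 1.5 for glass), and in exotic media like ultracold atomic gases light pulses have been slowed by many orders of magnitude in the lab. None of that violates relativity, because the underlying limit c is untouched.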