In the field of the reals, this is trivially true.
Anyway, c decay, if it were real, would be a fundamental change in the geometry of space-time. I say ‘space-time’ because c is, ultimately, a conversion factor: You can use it to convert between lengths of space and lengths of time, which are unified in the Minkowski space which gives shape to the physical Universe at every scale our theories currently contemplate.
This goes to the heart of the theory, which is extremely well-verified by experiment*: All motion involves a trade-off between moving through space and moving through time; the faster you do one, the slower you do the other, relative to a non-accelerating observer. This is true for galaxies and it is true for subatomic particles.
*(Our modern GPS designs wouldn’t work at all if SR weren’t an extremely accurate description of reality, for example.)
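To put rough numbers on that, here's a minimal back-of-the-envelope sketch in Python. The orbital radius and the inclusion of both the special- and general-relativistic terms are my own illustrative assumptions, not anything from the posts above; the point is just the size of the daily clock offsets GPS has to correct for.

```python
# Rough estimate of relativistic clock offsets for a GPS satellite.
# Orbit radius and Earth parameters are approximate assumptions.
G_M_EARTH = 3.986e14       # gravitational parameter of Earth, m^3/s^2
R_EARTH   = 6.371e6        # mean Earth radius, m
R_ORBIT   = 2.66e7         # approximate GPS orbit radius, m
C         = 299_792_458.0  # speed of light, m/s
SECONDS_PER_DAY = 86_400

v_orbit = (G_M_EARTH / R_ORBIT) ** 0.5   # circular-orbit speed, ~3.9 km/s

# Special relativity: the moving clock runs slow by ~v^2 / (2 c^2).
sr_shift = -(v_orbit**2) / (2 * C**2) * SECONDS_PER_DAY

# General relativity: the clock higher in the gravity well runs fast.
gr_shift = G_M_EARTH * (1 / R_EARTH - 1 / R_ORBIT) / C**2 * SECONDS_PER_DAY

print(f"SR:  {sr_shift * 1e6:+.1f} microseconds/day")   # roughly -7
print(f"GR:  {gr_shift * 1e6:+.1f} microseconds/day")   # roughly +46
print(f"Net: {(sr_shift + gr_shift) * 1e6:+.1f} microseconds/day")  # roughly +38
```

Left uncorrected, a ~38 microsecond/day offset corresponds to roughly c × 38 μs ≈ 11 km of accumulated ranging error per day, which is why working GPS is such a stringent everyday test.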
So c isn’t just a speed. It’s part of the fabric of reality itself, which explains why proposals that it is changing over time are so hard to swallow without lots and lots of evidence: c is part of how we define distances of time, in addition to every other kind of distance. The only reason Young-Earth Creationists can so casually toss around the creationist c-decay theories is that they don’t understand the implications of what they’re saying. Of course, when your whole worldview can be refuted by counting tree rings, you’re a nutball anyway and people should ignore you.
My understanding of this is that space is posited to be acting in exactly the same way as any matter with a refractive index greater than one. That is, the photons interact according to the laws of QED with the virtual particles. There are no new forces (photons are a force carrier of themselves) and no new physics. Half Man Half Wit’s point about light propagation between the Casimir plates is cute. It seems to be saying that we already accept the very slight difference between c and the speed of light even in a vacuum, and using the Casimir effect we can get light to travel a bit closer to c by eliminating at least some of the virtual particles. Given that refractive index is defined relative to the speed of light in a vacuum, we end up with a tiny bit of room for a refractive index ever so slightly less than one, whose corresponding speed would be the ‘true’ c.
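For concreteness, here's a trivial sketch of that refractive-index picture in Python; the index values are textbook-style illustrations rather than measurements, and the Casimir case only appears in a comment because the predicted shift is far too small to represent in a float.

```python
C = 299_792_458.0  # defined speed of light in vacuum, m/s

def phase_velocity(n: float) -> float:
    """Phase velocity of light in a medium of refractive index n."""
    return C / n

# Ordinary media have n > 1, so light propagates below c:
print(f"air   (n ~ 1.0003): {phase_velocity(1.0003):,.0f} m/s")
print(f"water (n ~ 1.33):   {phase_velocity(1.33):,.0f} m/s")

# The Scharnhorst argument treats the Casimir vacuum as having an
# effective index very slightly *below* one relative to the ordinary
# vacuum, i.e. a phase velocity very slightly above it.  The predicted
# shift is far too tiny to compute or measure here.
```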
Light travels at c between interactions, but the interactions slow down the overall speed.
This is what I’d expect would be true. Does anyone have any estimates of how different the measured and “ideal” speed of light would be? Parts per billion? More? Less?
It is meaningless to refer to c changing. c is exactly equal to 1. One is one. It’s just as absurd to say that the value of c is changing as it is to say that the value of 1 is changing.
Now, one can refer to the ratio of c to some other speed, and one could then say that that ratio is changing. But in this case, one ought to conclude that the observed effect is due to that other speed changing.
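A tiny illustration of that point of view, assuming nothing beyond the SI definition of the metre: measure every speed as a fraction of c and c itself is just the dimensionless number 1.

```python
C = 299_792_458.0  # m/s, exact by the definition of the metre

def to_natural(speed_m_per_s: float) -> float:
    """Express a speed as a dimensionless fraction of c (units where c = 1)."""
    return speed_m_per_s / C

print(to_natural(29_780))   # Earth's orbital speed, ~1e-4
print(to_natural(C))        # light itself: exactly 1.0

# In these units, asking whether c changes is asking whether 1 changes;
# any apparent change must show up as some *other* speed (or a
# dimensionless constant like alpha) changing instead.
```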
All of the actual scientific work on this topic hasn’t actually directly involved c at all. What scientists have found is some evidence that the value of alpha, the fine structure constant, is changing (though how this gets interpreted as c changing, I have no clue). It is at least meaningful to speak of a change in alpha, though nobody’s really sure what could cause it, or what it would mean for our knowledge of physics. But it should be noted that, while there is some evidence for this, it’s extremely weak, and even the physicists who proposed the changing-alpha hypothesis don’t themselves actually believe it’s correct.
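For reference, alpha is the dimensionless combination e²/(4πε₀ħc), which is exactly why a drift in alpha is a meaningful (if unconfirmed) claim while a drift in c by itself isn't. A quick check in Python, using standard SI values:

```python
import math

# SI values (e, h, c are exact by definition; epsilon_0 is measured).
e    = 1.602176634e-19     # elementary charge, C
h    = 6.62607015e-34      # Planck constant, J s
c    = 299_792_458.0       # speed of light, m/s
eps0 = 8.8541878128e-12    # vacuum permittivity, F/m

hbar = h / (2 * math.pi)
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)

print(alpha)      # ~0.0072973...
print(1 / alpha)  # ~137.036
```

Note that because alpha bundles e, ħ, ε₀ and c together, an observed drift in alpha couldn't by itself tell you which ingredient "changed"; only the dimensionless combination is observable.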
Just to throw another wrench into the works: I run across reports of serious experiments from time to time that try to prove that photons actually have mass, although vanishingly small.
Are there any serious physicists who believe that photons, under normal circumstances, have mass?
I’m not sure how much sense it makes to talk about the ‘ideal’ speed of light (which would presumably be the speed light would have without any intermediate quantum fluctuations). After all, in QED, the propagator for light takes into account all the higher-order corrections, originating, for example, from the photon spontaneously separating into a virtual electron/positron pair, etc. The case of the photon simply travelling uninterrupted from A to B is just the 0th approximation, but the full propagation of the photon is obtained by summing over all higher-order contributions, and it doesn’t really make sense to consider the photon travelling from A to B as being anything else than that sum, so the velocity obtained that way is what I would consider the ‘true’ speed of light to be. (Otherwise, I believe that Scharnhorst’s formula would yield an infinite speed in the limit of no quantum fluctuations, i.e. perfectly smooth planes infinitesimally close together; but that’s likely a result from an approximation breaking down somewhere.)
This would mean that if the light between the Casimir plates traveled faster than c, it would be genuinely travelling ‘faster than light’, which is a somewhat strange notion. It’s not trivially in conflict with special relativity, though, since the Casimir plates single out a preferred frame of reference.
Apart from all that, I don’t believe it’s been conclusively settled that the front velocity of light would actually exceed c in this situation.
Think about it this way. We do billions of experiments concerning the speed of light every day. They’re called communications. Every time a signal is bounced off a satellite it has to sync up with all the other measures and standards built into the system. If the speed of light were slowing in any noticeable way, every GPS location would be slightly off in a consistent way. You might not notice in your car, since the resolution is a few meters, but the military and scientists would be bouncing out of their bunkers. They aren’t. It’s a non-issue.
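A crude sketch of that sensitivity, with the drift fractions chosen arbitrarily for illustration (they are not claims about any real measurement): a time-of-flight ranging system that assumes the wrong propagation speed misplaces you in proportion to the error.

```python
C = 299_792_458.0      # signal speed assumed by the receiver, m/s
RANGE_TO_SAT = 2.02e7  # typical GPS pseudorange, m (approximate)

def range_error(fractional_c_drift: float) -> float:
    """Ranging error if the true propagation speed differs from the
    assumed C by the given fraction, for a fixed measured travel time."""
    travel_time = RANGE_TO_SAT / C
    true_speed = C * (1 + fractional_c_drift)
    return abs(true_speed * travel_time - RANGE_TO_SAT)

print(range_error(1e-9))  # ~2 cm per satellite, already detectable
print(range_error(1e-6))  # ~20 m, navigation visibly broken
```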
Bremidon, what we might call “C”, Einstein’s Constant, has always been a different thing than “c”, the speed light moves in a medium. Scientists understand this completely and always make the distinction. The problem is only when stuff gets simplified for the popular press. I’ve always advocated for the separate notations, but somehow nobody listens.
That’s a more subtle question than you might realize. It’s difficult to say something like “photons have no mass”, scientifically speaking, because we can never prove it observationally or experimentally. All anyone can ever do is find an upper bound for the photon’s mass, and then push that upper bound down further and further. So while most physicists believe that the photon is actually massless, it’s not really what you can consider a scientific truth.
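One way to see what those upper bounds mean physically: a nonzero photon mass m would give electromagnetism a finite characteristic range of order the reduced Compton wavelength ħ/(mc). A small sketch, taking ~10⁻¹⁸ eV as a representative order of magnitude for the bound (treat the exact figure as an assumption, not a citation):

```python
HBAR_C_EV_M = 197.3269804e-9   # hbar * c in eV·m (i.e. 197.3 MeV·fm)
AU_M = 1.495978707e11          # astronomical unit, m

def compton_wavelength_m(mass_ev: float) -> float:
    """Reduced Compton wavelength (hbar / m c) for a mass given in eV/c^2."""
    return HBAR_C_EV_M / mass_ev

bound = 1e-18                  # illustrative photon-mass upper bound, eV/c^2
lam = compton_wavelength_m(bound)
print(f"{lam:.2e} m  (~{lam / AU_M:.1f} astronomical units)")
```

Roughly speaking, pushing the bound down amounts to showing that electromagnetic fields behave as expected over ever larger distances.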
And regardless of their beliefs on the matter, I think you would find that the vast majority of physicists consider experiments of that nature to be a worthwhile endeavor, so long as one doesn’t devote one’s entire career to them. Even when we think we know something, we must always test it.
In technical contexts, c is just c, and is not referred to as “the speed of light”. Even if light really does travel at c (as seems quite likely), it’s not the only thing that does, and the significance of c is in most contexts unrelated to the fact that light, or anything in particular, happens to travel at that speed.
In lay contexts, there’s really no hope of a change, until and unless it’s actually proven that light doesn’t travel at c (and probably not even then). “The speed of light” has just gotten too firmly embedded into the popular lexicon. I’ve seen attempts to relabel it as “Einstein’s constant”, but I fear such attempts are doomed to failure.
EDIT: And Exapno goes and sneaks in such an attempt while I’m posting. I really do applaud the effort…
Would there be any reason to expect the measured speed of light in a vacuum, the measured speed of gravity in a vacuum (if/when such measurements could be made sufficiently accurately), and the value of c in special relativity to all be precisely equal? (For example, is there some kind of cancellation occurring that drops the perturbation to zero?) I guess I would think of the SR value to be the fundamental one, with the other two possibly slightly different, due to higher order contributions, entirely analogous to the speed of light being slightly smaller in air. But I wouldn’t take the measured value of the speed of light to be fundamental, over the constant of special relativity.
The value that shows up in SR is fundamental. One of the consequences of SR is that massless particles inherently must travel at that speed. The simplest descriptions of gravity and of electromagnetism which are consistent with observation call for their carrier particles to be massless, and so they travel at c.
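The relation behind that statement is the usual E² = (pc)² + (mc²)², with v = pc²/E. A quick sketch in units where c = 1 shows the speed pinning to c exactly when, and only when, the mass is zero:

```python
def speed_fraction_of_c(momentum: float, mass: float) -> float:
    """v/c for a particle of given momentum and mass, in units where c = 1.
    Uses E^2 = p^2 + m^2 and v = p / E."""
    energy = (momentum**2 + mass**2) ** 0.5
    return momentum / energy

print(speed_fraction_of_c(1.0, 0.0))  # massless: exactly 1.0
print(speed_fraction_of_c(1.0, 0.1))  # small mass: just under 1
print(speed_fraction_of_c(1.0, 1.0))  # m = p: about 0.707
```

A massive particle can get arbitrarily close to c with enough momentum, but only a genuinely massless one sits exactly at it, which is why the question of the photon's mass and the question of whether light travels at exactly c are really the same question.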
pesticide - noun. a chemical preparation for destroying plant, fungal, or animal pests.
I’m not stopping you from using your own definitions for yourself. But it’s hardly sporting to get annoyed at people for using dictionary definitions or not being mind-readers.
Assuming that light (photons) can interact with all charged particles, and assuming that the composition of the “quantum vacuum” is everywhere changing, dynamic, and random with virtual (charged) particles “popping up and disappearing” everywhere and everywhen, would we not, therefore, expect the speed of light to be inconstant and essentially random (albeit with most values clustered tightly around ‘c’)?