The coordinate speed of light can vary. For example, in Schwarzschild coordinates the coordinate speed of light travelling radially outwards goes to zero at the event horizon; the local speed, however, does not.
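To make this concrete, here is a minimal sketch of the standard result: setting ds² = 0 for a radial light ray in the Schwarzschild metric gives |dr/dt| = c(1 − r_s/r), which vanishes at the horizon r = r_s even though any local observer still measures c. The function name and the choice of units (c = 1) are mine, for illustration only.

```python
def coordinate_speed_of_light(r, r_s, c=1.0):
    """Radial coordinate speed |dr/dt| of light in Schwarzschild
    coordinates at radius r > r_s; from ds^2 = 0 with dtheta = dphi = 0
    one gets |dr/dt| = c * (1 - r_s / r)."""
    return c * (1.0 - r_s / r)

r_s = 1.0  # horizon radius in our units
for r in (1.001, 2.0, 10.0, 1000.0):
    # Approaches 0 near the horizon, approaches c far away.
    print(f"r = {r:8.3f}  |dr/dt| = {coordinate_speed_of_light(r, r_s):.6f}")
```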
Now we could ask: is it possible to interpret the local speed of light (i.e. c) as varying in GR? The big objection is that there isn’t a particularly natural way to do this, but we could describe the variation as a scalar field, and we could even link it to an invariant scalar derived from the curvature, like the Ricci scalar (though, for example, the Ricci scalar in Schwarzschild spacetime is everywhere zero, except at the singularity where it becomes undefined, just like any quantity derived from the curvature). The problem with this idea is that it doesn’t alter the physics one jot: to still be GR, our scalar field must disappear from the Einstein field equations, and really what the scalar field would describe is the local variation in the units we used to measure c.
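The fact that the Ricci scalar vanishes throughout Schwarzschild spacetime (away from the singularity) is a one-line consequence of the vacuum field equations; as a sketch:

```latex
% Vacuum Einstein equations (no matter, \Lambda = 0):
R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} = 0
% Contract with g^{\mu\nu}, using g^{\mu\nu} g_{\mu\nu} = 4 in four dimensions:
R - \tfrac{1}{2} R \cdot 4 = 0 \quad\Longrightarrow\quad R = 0
```

So any vacuum solution, Schwarzschild included, has identically zero Ricci scalar, which is why it makes a poor candidate for driving a varying c.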
We could go outside GR and couple the scalar field to the EFEs so that it does become physically meaningful, but even then, reading the scalar field as describing a variation in c is arguably not the most natural interpretation.
Newtonian physics describes celestial trajectories in our solar system to very high precision (e.g. spacecraft orbiting a comet or Pluto), but seems to diverge suddenly at relativistic speeds. Science does this sometimes.
But that’s not enough reason to posit that it happens. There are all sorts of departures from GR we can imagine, and many more that we can’t. Why favor this one particular departure (which you still haven’t actually described, incidentally) without any evidence?
I know that in some creationist circles, theories of c-decay are bandied about to try to solve the “if the universe is only 6000 years old, how come I can see stars 100000+ light-years away” problem, which is otherwise more easily solved with “the universe is more than 6000 years old”.
Hypothesizing that it only happens in really curved space-time is designed to make sure it only happens near the big bang, and not now.
Scientists always push the established theories. If it can’t be proven wrong, then it’s not science. So let’s hear your departures from GR. It seems to me you can’t get past 1 = 1.
Here’s a whole bunch of them. Ten different parameters, any or all of which can be varied. All of them have boring values (either 0 or 1) in GR, such that the corresponding bits never show up in the equations, and all of them have experimental bounds that put them at least fairly close to the GR values, but in any situation that’s more extreme than any of the experiments we’ve performed, they just might show up.
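For reference, the ten parameters in question are those of the parameterized post-Newtonian (PPN) formalism. A sketch of their GR values, with the standard one-line glosses (the dictionary structure is mine, for illustration):

```python
# GR values of the ten PPN parameters: gamma = beta = 1 and the other
# eight vanish, so the corresponding terms drop out of the equations.
PPN_GR = {
    "gamma":  1,  # spatial curvature produced by unit rest mass
    "beta":   1,  # nonlinearity in the superposition law for gravity
    "xi":     0,  # preferred-location effects
    "alpha1": 0,  # preferred-frame effects
    "alpha2": 0,  # preferred-frame effects
    "alpha3": 0,  # preferred-frame / momentum non-conservation
    "zeta1":  0,  # violation of total momentum conservation
    "zeta2":  0,  # violation of total momentum conservation
    "zeta3":  0,  # violation of total momentum conservation
    "zeta4":  0,  # violation of total momentum conservation
}

# In GR only gamma and beta are nonzero (both equal to 1).
print(sorted(name for name, value in PPN_GR.items() if value != 0))
```

Experiments (light deflection, Shapiro delay, lunar laser ranging, and so on) bound each parameter near its GR value, but a regime more extreme than any tested so far could in principle reveal a deviation.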
And I’m really curious how one could ever get past 1 = 1.
You misunderstand. A theory isn’t a scientific theory if it isn’t falsifiable. A theory that cannot be tested is non-science. So, working back, a scientific theory must make predictions, and predictions that, should they fail to hold, will invalidate the theory.
GR predicts gravitational lensing, frame dragging, and gravitational waves. All three have been found. In particular, frame dragging has been explicitly measured and, one, found to be real, and two, found to match the predicted value. GR also predicts lots of time effects; for instance, GPS satellites need GR corrections to give the right locations. And again, the locations turn out to be correct when you use the corrections predicted by GR.
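The GPS correction is easy to estimate at first order. A rough sketch, using approximate textbook values for Earth’s gravitational parameter and the GPS orbit radius (this is the standard leading-order estimate, not a full GR calculation):

```python
# First-order relativistic clock corrections for a GPS satellite.
GM = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
c = 299_792_458.0      # speed of light, m/s
R_earth = 6.371e6      # mean Earth radius, m
r_gps = 2.6561e7       # GPS orbit radius (semi-major axis), m
day = 86_400.0         # seconds per day

# Gravitational blueshift: the satellite clock runs fast relative
# to a clock on the ground.
grav = GM / c**2 * (1 / R_earth - 1 / r_gps) * day

# Special-relativistic time dilation: orbital motion makes the
# satellite clock run slow.  For a circular orbit, v^2 = GM / r.
kinematic = -(GM / r_gps) / (2 * c**2) * day

print(f"gravitational: {grav * 1e6:+.1f} us/day")       # about +45.7
print(f"kinematic:     {kinematic * 1e6:+.1f} us/day")  # about  -7.2
print(f"net:           {(grav + kinematic) * 1e6:+.1f} us/day")
```

The net drift of roughly +38 microseconds per day would translate into kilometres of position error within a day if uncorrected, which is why the satellite clocks are deliberately offset before launch.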
These are falsifiable predictions made by GR. GR has withstood every test made against falsifiable predictions thus far.
If you make a suggestion about a change to GR, you need to, one, ensure that it matches the predictive and measured success of GR so far, and two, make at least one prediction that is meaningful, measurable, and thus falsifiable. If you go out and do the measurement, and the experiment doesn’t falsify the hypothesis, the hypothesis will garner some credibility. Indeed, any such experiment that modifies GR would get you on the short list for a free trip to Stockholm. But just navel-gazing and coming up with one of an infinitude of possible falsifiable ideas, with dubious actual meaning, let alone predictive capability, isn’t science.
To use an analogy: we could argue about whether it is possible to install an en suite bathroom in my car, but that’s skipping the most important question, i.e. why would I want to install an en suite bathroom in my car? Now there may well be good reasons I haven’t contemplated for having such a convenience in my car, but until we have a good reason there’s no point starting construction.
Now, you’ve specifically talked about c varying in curved spacetime, and spacetime, whether curved or not, has a tangent space at each point equipped with a Minkowski metric. The Minkowski metric in the tangent space defines what is travelling locally at c at that point, and axiomatically we assign the same value of c to each point. As I said above, we could drop this axiom and instead specify the value of c at each point using a scalar field. If we stick within GR, this turns out to be equivalent to just using different units to measure c at each point, so it is a thoroughly pointless exercise. If we keep the same mathematical structure (i.e. spacetime) but go outside GR, we find that we really can’t tell whether c is varying, G is varying, or we’ve introduced a new physical field. In fact, the usual way to interpret this would be as the introduction of a new physical field, and there are extensions of GR which do this.
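The statement that the tangent-space Minkowski metric defines the local light speed can be written out explicitly; as a sketch, in a signature and coordinate convention chosen for illustration:

```latex
% Tangent-space Minkowski metric in coordinates (t, x, y, z),
% signature (-, +, +, +):
\eta_{\mu\nu} = \mathrm{diag}(-c^2,\; 1,\; 1,\; 1)
% A tangent direction u^\mu = (1, \mathbf{v}) is lightlike (null) when
\eta_{\mu\nu}\, u^\mu u^\nu = -c^2 + |\mathbf{v}|^2 = 0
\quad\Longrightarrow\quad |\mathbf{v}| = c
```

The null cone of this metric is what "moving at c" means locally; rescaling c point by point just rescales the metric components, which is the units argument above.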
We could construct an entirely new mathematical structure so that the idea of a varying c arises naturally, but this new spacetime must be something radically different from the spacetime we use at the moment and there’s no guarantee there’s even a remotely sensible way we could do it. But this brings me back to my first point, what is our motivation for doing this?
The Wikipedia page on parameterized post-Newtonian formalism describes methods to minimize computational error for spacecraft trajectories without having to use the full GR field equations. But letting c increase in extremely bent spacetime allows for the possibility of massive particles exceeding the locally measured c, without science fiction. The behavior of these c+ particles would be interesting… And yep, I agree there’s no getting past 1 = 1.
Doesn’t “If it can’t be proven wrong, then it’s not science” mean the same thing as “A theory isn’t a scientific theory if it isn’t falsifiable”? Not sure what you mean by “You misunderstand” … at least in this context.
Their motivation was to try to understand the world. What aspects of the world are you attempting to understand? What do you think will make more sense in the context of assuming that the value of 1 can change?
But that’s the problem: the math is easy. If the math were hard, then sure, there’s room for things that we don’t understand. But it’s not. It’s simple enough that we do understand it, and so we know that what you’re saying doesn’t make sense.
Not so much aspects of the world, but aspects of the Universe. Some examples from wikipedia:
Why does the zero-point energy of the vacuum not cause a large cosmological constant? What cancels it out?
What is the identity of dark matter? Is it a particle? Is it the lightest superpartner (LSP)? Do the phenomena attributed to dark matter point not to some form of matter but actually to an extension of gravity?
What is the cause of the observed accelerated expansion (de Sitter phase) of the Universe? Why is the energy density of the dark energy component of the same magnitude as the density of matter at present when the two evolve quite differently over time; could it be simply that we are observing at exactly the right time? Is dark energy a pure cosmological constant or are models of quintessence such as phantom energy applicable?
It seems obvious that we don’t know everything yet, and our telescopes (now we can detect gravitational waves!) are showing us some very interesting and strange things. What do you think would happen if a massive particle found itself traveling faster than local c?
Jump out of the 1 = 1 box for a second. If c increases in extremely bent spacetime, then a particle with a real, non-zero rest mass in orbit around a black hole could be ejected into less-bent spacetime with a local speed that exceeds the speed of a photon. No causality issues, so what would happen to this particle? Infinite mass for an infinitesimally short time? What do you think would happen?