Is there convincing evidence for the speed of light not varying with time?
Here is why I raise the question. If you read Steven Weinberg’s book The First Three Minutes, it almost seems as though time is logarithmic: as much happens in the interval between 10^{-33} and 10^{-32} seconds as in the interval between … 1 and 10 seconds. The book is written as though time were logarithmic, in which case there is no point in asking what preceded the big bang, because it actually goes back to -infinity. But if light traveled much faster in those days, there would be no need for Guthian expansion, which was dragged in to explain how the big bang was (nearly) homogeneous.
At the other end of time, we speak of the heat death of the universe. But entropy will only ever tend toward a maximum without reaching it, so there will always be some usable energy around. Thermodynamicists have claimed that while some energy is always needed for computing, there is no minimum amount if the computation is done slowly enough. This suggests that computers (not to mention organisms) can go on forever; they just have to slow down (something like a sloth, which hardly ever moves and needs very little energy). It is not clear that any such organism would ever notice that, from our vantage, it moves at a glacial speed. Again, it looks like logarithmic time. The only exception would be light, assuming the speed of light is constant.
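The claim I have in mind is, as I understand it, Landauer's bound: erasing one bit of information costs at least k_B T ln 2 of free energy, a floor that shrinks as the surroundings cool, and reversible computation can in principle avoid even that per-step cost. A rough sketch of how small that floor gets (Boltzmann constant hardcoded, numbers just for illustration):

[code]
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact under the 2019 SI)

def landauer_bound(temperature_kelvin):
    """Minimum free energy (joules) to erase one bit at the given temperature."""
    return K_B * temperature_kelvin * math.log(2)

# Landauer cost per erased bit at a few illustrative temperatures:
# room temperature, the CMB today, and a much colder far-future bath.
for T in (300.0, 2.7, 1e-3):
    print(f"T = {T:g} K  ->  {landauer_bound(T):.3e} J per erased bit")
[/code]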
I am not a physicist and this is just idle–and probably wrong–speculation, but I am curious how a real physicist responds.
Interesting, but the fine structure constant depends on three other measured quantities besides the speed of light, any or all of which could also vary with time. Admittedly, this would leave the fine structure constant more fundamental than the others, but so be it.
Not many scientists appear to buy into this, though.
And it’s just not needed to explain your observation, which is no more than an artifact of the way Weinberg tells the story as well as our system of base ten units. I can’t imagine that Weinberg would approve of that interpretation of his writing.
Inflation solves problems other than homogeneity: for example, spatial (near-)flatness and the nonobservation of magnetic monopoles (in theories which predict their existence). Varying the speed of light may help you with flatness, depending on how you do it; I don’t think it will help with the monopoles (if those are a problem).
Most of the time when someone asks a physicist about varying c, the reply (as with Squink’s) talks about the fine structure constant alpha (or some other dimensionless constant). The problem with varying a dimensioned constant is that it’s not clear what it means, if anything: is it a change in nature, or just in our definitions? Eventually, any physical measurement comes down to a dimensionless numerical value, so it’s not a physical limitation to talk only about varying dimensionless values. (There’s a nice discussion at the Wiki page.)
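As a concrete example of "any physical measurement comes down to a dimensionless value": the fine structure constant alpha = e^2 / (4 pi epsilon_0 hbar c) is a pure number, roughly 1/137, whatever system of units you compute it in. A quick sanity check with hardcoded SI (CODATA) values, purely for illustration:

[code]
import math

# SI (CODATA) values, hardcoded here for illustration
e         = 1.602176634e-19    # elementary charge, C
hbar      = 1.054571817e-34    # reduced Planck constant, J*s
c         = 299792458.0        # speed of light, m/s
epsilon_0 = 8.8541878128e-12   # vacuum permittivity, F/m

alpha = e**2 / (4 * math.pi * epsilon_0 * hbar * c)
print(f"alpha     = {alpha:.10f}")   # ~0.0072973526, a pure number (no units)
print(f"1 / alpha = {1/alpha:.6f}")  # ~137.036
[/code]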
Of course, if there are other unknown physical mechanisms involved (such as the inflationary mechanism or whatever you propose to replace it with) then there are probably other unknown constants; if the inflation mechanism defines a second fundamental speed, then the ratio of this speed to c may change meaningfully.
Dyson has talked about an “eternal intelligence” something like this, just slowing down its computation as the available free energy decreased. Such an intelligence presumably wouldn’t necessarily “notice” itself slowing down, even if it eventually came to a stop after a finite number of computational steps. (Even if there is always some free energy available asymptotically, the integrated number of computational cycles may not be infinite.)
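To make that caveat concrete with a toy model (my own illustration, not Dyson's actual argument): if the intelligence executes computational steps at a rate r(t), the total number it ever completes is

N = \int_{t_0}^{\infty} r(t) dt.

With a mild slowdown like r(t) ~ 1/t the integral diverges, so infinitely many steps get done even though each one takes longer; with a steeper slowdown like r(t) ~ 1/t^{2} the integral converges, and only finitely many steps ever happen, even over infinite time and with some free energy always available. "You can always do one more step" is compatible with both cases.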
I agree with some of this; there is certainly a lot of editor’s choice involved in what events to mention, probably leading to less-important events being mentioned in what would otherwise have been (e.g.) a rather uneventful microsecond. But the choice of base is not really relevant; if something looks logarithmic in base 10, it will look logarithmic in any base.
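To spell that out: by the change-of-base identity,

log_b(x) = ln(x) / ln(b),

so switching from base 10 to any other base only rescales a logarithmic curve by the constant factor 1/ln(b); it cannot make something logarithmic stop looking logarithmic (or vice versa).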
More relevant is a separation in energy scales over a rather surprising number of orders of magnitude: down from the Planck and GUT energies (~10[sup]19[/sup] and ~10[sup]16[/sup] GeV), through various energy ranges where some symmetries get spontaneously broken (electroweak, strong, etc…; the quark/lepton generations). The differences in strengths of the fundamental forces, and in the masses of the fundamental particles, at lower energies then mean that different effects become important at different times; for example, the neutrinos decouple before the photons and the top quarks disappear before the charmed ones. That is, the very broken symmetry of the Standard Model keeps everything from happening at the same energy scale and gives Weinberg a more interesting chart.
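To see roughly how a hierarchy of energy scales turns into a hierarchy of times: during radiation domination the temperature falls roughly as t^{-1/2}, and a convenient order-of-magnitude anchor is t ~ 1 second at T ~ 1 MeV. A quick sketch under those assumptions (ignoring the effective number of relativistic species, so all numbers are order-of-magnitude only):

[code]
# Order-of-magnitude map from temperature (energy scale) to cosmic time,
# assuming radiation domination: t ~ 1 s * (1 MeV / T)^2.
# The g* (relativistic species) factor is ignored, so treat the outputs
# as rough illustrations only.

SCALES_MEV = {
    "GUT scale (~1e16 GeV)":        1e19,
    "electroweak scale (~100 GeV)": 1e5,
    "QCD scale (~150 MeV)":         150.0,
    "neutrino decoupling (~1 MeV)": 1.0,
}

for name, T_mev in SCALES_MEV.items():
    t_seconds = (1.0 / T_mev) ** 2   # t ~ (1 MeV / T)^2 seconds
    print(f"{name:32s} T = {T_mev:9.3g} MeV  ->  t ~ {t_seconds:9.3g} s")
[/code]

Equal ratios in temperature map to equal ratios in time, which is part of why a chart of the early universe naturally reads as logarithmic in time.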
Interesting. But the last point is wrong. There is no minimum energy required to do a computation. Since there is always some free energy around, there is always enough to do one more, albeit slowly. If you can always do one more, you can always do infinitely more. However, if the speed of everything but light slows down, then the speed of light will appear to tend to infinity (from the point of view of the ultra-slow-living creatures of the far future). This does not kill the idea, but it made me wonder whether the speed of light is really constant.