I was wondering if the speed of light is an intrinsic part of the structure of the universe? Could the universe still exist as it is if the speed of light were a different value?
Is the speed of light a measured value only or has it been calculated theoretically?
Thanks
I don’t think we know enough about how to build a universe to answer your questions. As far as we can see, c is a built-in property of our universe, but we don’t know what makes it take the value it has. I don’t think it can be calculated from basic principles, at least not at our current level of understanding.
Science fiction sometimes has some interesting ideas to think about. The final chapters of Asimov’s Fantastic Voyage II: Destination Brain have an interesting speculation about universal constants. The idea is somewhat tangential to the story, but it is sort of the MacGuffin of the book, so I’ll hide it for anyone who might decide to read it. I really wish he’d written a sequel to expand on it.
The book is about a technology to shrink a submarine full of people so they can travel through the body of a comatose patient. The gadget operates by reducing the fine-structure constant in a localized area, but this requires a lot of power and is unstable. At the end, they learn that the constant can be coupled to the speed of light, and that easy, stable miniaturization can lower the fine-structure constant while at the same time increasing the speed of light. So a spaceship could travel very fast if you made it the size of a bacterium first.
It seems a lot more complicated than it really is, because humans have chosen to use different units to measure space and time. By way of analogy, imagine a world where every vertical distance is measured in centimeters, and every horizontal distance in inches. But everything about this world is still otherwise like ours: You can still take a stick that was lying on the floor and turn it so it’s standing up straight, for instance. As you do so, you’ll find that the inch-part of its length decreases while the centimeter-part increases. You could develop formulas for exactly how these two vary, and for many other things in this world, but if you did so, you’d see that there’s a constant that shows up all over the place in your formulas, equal to 2.54 cm per inch. It’s got the units of a slope, so you might call it the Slope of Something-or-Other, but that masks what the constant is really all about: Really, distances are the same sort of thing whether they’re vertical or horizontal, and that special slope is really just a fancy way of writing “1”, because 2.54 cm is an inch.
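If it helps to see the analogy with numbers, here’s a quick Python sketch (the names and values are just mine, purely for illustration): tilt a stick in that centimeters-and-inches world, and the inch-part and the cm-part trade off against each other, but the combination that folds in the 2.54 cm/inch “slope” never changes.

```python
import math

SLOPE = 2.54  # cm per inch -- the constant that is "really just 1"

def stick_components(length_cm, angle_deg):
    """Horizontal extent in inches and vertical extent in cm for a stick
    of the given true length, tilted at the given angle from the floor."""
    angle = math.radians(angle_deg)
    return length_cm * math.cos(angle) / SLOPE, length_cm * math.sin(angle)

for angle in (0, 30, 60, 90):
    h_in, v_cm = stick_components(100.0, angle)
    invariant = math.hypot(SLOPE * h_in, v_cm)  # always 100 cm
    print(f"{angle:2d} deg: {h_in:6.2f} in across, {v_cm:6.2f} cm up, "
          f"total = {invariant:.2f} cm")
```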
The Speed of Light is like that. We measure space and time using different units, but we don’t have to, because fundamentally, they’re the same sort of thing. We could measure both in meters, or both in seconds, and fundamentally, 299,792,458 meters is one second. This “special speed” is built into the fabric of the Universe, because it’s just 1. Light, so far as we can tell, happens to travel at the special-speed-that-is-1, but even if it didn’t, that speed would still be special.
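And once you treat it as a unit conversion, translating between meters and seconds is as trivial as translating between inches and centimeters. Another tiny sketch (again just illustrative, using the defined value of c):

```python
C = 299_792_458  # meters per second, the defined value

def seconds_to_meters(t):
    """Express a duration as a length: how far light goes in that time."""
    return t * C

def meters_to_seconds(d):
    """Express a length as a duration: how long light takes to cross it."""
    return d / C

print(seconds_to_meters(1))            # 299792458 -- "one second of distance"
print(meters_to_seconds(384_400_000))  # ~1.28 -- the Moon is about 1.28 seconds away
```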
That depends on what you mean by “measured” and “calculated”. If you interpret it as 1, well, there are plenty of calculations which come up with 1. If you mean that you have some measuring-stick and some clock and want to know how many stick-lengths light travels for every time the clock ticks, you have to do some sort of measurement(s), but those measurements don’t necessarily have to be of a speed per se: You can instead measure certain physical constants that relate to how strong the electric and magnetic forces are, and calculate it from those. If you mean specifically how many meters light will travel in a second, that’s actually how the meter is defined nowadays: A bunch of scientists got together and agreed that what they would call a “meter” is the distance that light travels in a certain fraction of a second, and so that numerical value exists because it’s what people agreed on.
From Maxwell’s equations of electromagnetism, the speed of light in a vacuum is a universal constant dependent only on the electric permittivity (ε₀) and magnetic permeability (μ₀) of free space, being the reciprocal of the square root of their product: c = 1/√(ε₀μ₀). So it can be calculated from basic principles of electrodynamics, but of course ε₀ and μ₀ are themselves fundamental constants and we don’t have any particular reason why they take the values they do, so c is really just a composite constant that is seemingly arbitrary. Note that c isn’t just “the speed of light” but is, as far as we can tell, the fastest speed at which any information or massless gauge boson (the photon, the gluon, and, hypothetically, the graviton) can travel through the plenum of spacetime. (Spacetime itself can expand faster than c because it isn’t made of mass or gauge bosons, which is why the far reaches of the universe are receding so fast that they are redshifted and will eventually disappear from view.)
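For anyone who wants to check the arithmetic, here is a quick back-of-the-envelope in Python using the approximate published (CODATA 2018) values of the two constants; the exact digits aren’t the point, just that the combination lands on c:

```python
import math

epsilon_0 = 8.8541878128e-12  # F/m, vacuum electric permittivity (CODATA 2018)
mu_0 = 1.25663706212e-6       # N/A^2, vacuum magnetic permeability (CODATA 2018)

c = 1.0 / math.sqrt(epsilon_0 * mu_0)
print(f"c = {c:,.0f} m/s")    # comes out to ~299,792,458 m/s
```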
If ε₀ and μ₀, and thus c, had different values, it would change all of electrodynamics, and therefore all of chemistry, potentially making it impossible for atoms to form and for matter as we know it to exist. There is a rather fine balance of fundamental constants that allows the laws of physics and chemistry to operate as we know them, and this is sometimes taken as evidence of some kind of deliberate fine tuning (the so-called anthropic principle), but it may just as well be that we are one of an infinitude of universes that came into existence, and ours just happened to have the properties needed for the physics and self-organizing systems of life that we know, while others are barren of structure, or perhaps allow some analogue of chemistry with different scales and effects than anything we experience or can even conceive of.
Stranger
c!
Very interesting. Thank you.
I have heard that the speed of light may have been different in the past. Is this possible? Is there any evidence for this?
This was most likely a creationist argument to explain why the universe appears to be older than creationists claim: if light used to travel much faster, then that distant stuff wouldn’t actually be as old as it looks. The argument takes the values of the speed of light measured over the span of time we’ve been able to measure it and points to a slight downward trend, up until the point at which we started using the speed of light as a definition, after which any measurement would always give exactly the value we were looking for. The more likely explanation is that the speed was very hard to measure for a long time, and people were more likely to accept their results if they were close to previous measurements and more likely to try again if they weren’t, as was the case with the mass of the electron, I think. An old measured value that was higher than reality slowly came down on average as more accurate experiments were made.
What’s even more likely is that they cherry-picked the data which supported their conclusion and ignored the data that didn’t. Pretty much a standard tactic for those supporting faith-based arguments.
It’s only really meaningful to ask “is this physical constant changing?” if you’re asking about a constant without units. For instance, if you combine the speed of light, the electrostatic constant, the charge of the electron, and Planck’s constant in the right way, you can get something called the fine structure constant, approximately equal to 1/137 (no units). Now, we don’t actually know whether the fine structure constant has changed since the early days of the Universe, but it’s at least meaningful to say that it could. And people have actually done observations to test that, and some of those observations suggest that, maybe, it in fact has changed. A lot of the popular press latched onto those results and interpreted them as “The Speed of Light has changed”, because c is one of the pieces that go into the Fine Structure Constant. But it would be just as valid, and arguably make more sense, to interpret it instead as the charge of the electron changing, or the electrostatic constant. And in any event, such results have always been quite weak, weak enough that even the authors themselves admit that it’s probably just random error.
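If anyone wants to see that combination spelled out, here it is in Python with rounded values of the constants mentioned above (just a sketch; the point is only that the units cancel and you land near 1/137):

```python
import math

e = 1.602176634e-19           # C, elementary charge
epsilon_0 = 8.8541878128e-12  # F/m, vacuum permittivity (electrostatic constant)
hbar = 1.054571817e-34        # J*s, reduced Planck constant
c = 299792458.0               # m/s, speed of light

# Dimensionless fine structure constant: alpha = e^2 / (4*pi*epsilon_0*hbar*c)
alpha = e**2 / (4 * math.pi * epsilon_0 * hbar * c)
print(f"alpha = {alpha:.6f}, 1/alpha = {1/alpha:.2f}")  # ~0.007297, ~137.04
```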
Can you say briefly (heh) what obtained (as I believe that’s the correct verb here) back when the has-got-to-be speed of light was a different got-to-be speed of light?