Bye Bye, Speed of Light

We know that the speed of light does vary in different materials. The vacuum is not, of course, actually a vacuum. It’s a quantum foam. This is, in effect, a type of material made up of virtual particles. It would make sense, therefore, for the speed of light to be related to the density of virtual particles in the vacuum.

“Ah,” you say, “but the density of virtual particles is driven by Planck’s constant, so even if you’re correct, the speed of light shouldn’t change.” Au contraire! The early universe was smaller. Therefore, it couldn’t hold some lower-frequency photon modes, so there was actually a lower virtual-particle density than there is today. The math is left as an exercise for the reader. :smiley:
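
For the curious, here is a minimal sketch of that exercise, under two big assumptions of mine (not established physics): that the observable universe behaves like a cubical box of side L, and that standing-wave photon modes are cut off at wavelengths longer than about 2L. It uses the standard cavity mode count dN = 8πV dλ/λ⁴.

```python
import math

ly = 9.46e15                       # one light-year in meters

def modes_with_wavelength_between(lam_min, lam_max, L):
    """Standing-wave photon modes (two polarizations) with wavelengths in
    [lam_min, lam_max] for a cubical box of side L, from the standard
    cavity mode count dN = 8*pi*V/lambda^4 dlambda."""
    V = L**3
    return (8 * math.pi * V / 3) * (1 / lam_min**3 - 1 / lam_max**3)

# Hypothetical sizes, just to put a scale on the argument:
L_then = 1e9 * ly                  # the "1 billion light year" universe from the post
L_now = 46e9 * ly                  # a rough radius for today's observable universe

# Modes the smaller box cannot hold: wavelengths from 2*L_then up to 2*L_now
missing = modes_with_wavelength_between(2 * L_then, 2 * L_now, L_then)
print(f"long-wavelength modes excluded by the smaller box: about {missing:.2f}")
```

Under those assumptions the infrared cutoff only excludes a handful of the very longest-wavelength modes out of an enormous total, which is more or less the objection raised a couple of posts down.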

Fascinating. I must say that, though my comprehension at this point is meager, this is one of the most fascinating topics I’ve seen. Would anyone be willing to express in broader philosophical terms this elasticity of constants to help me understand? Does it mean, for example, that the universe is an analytic architectonic form?

Truth Seeker, I believe that generally the particle-in-a-box type solutions for virtual quantum particles depend only upon the energy density of the vacuum and not the “size” of the container. To wit, there’s no reason that you have to have a node at the edge of your observable universe for your wavefunction.

Yes, but the energy density of the vacuum depends on how many virtual photons there are. Virtual photons can’t exist at fractional wavelengths – that’s what causes the Casimir effect. Thirteen billion or so years ago, the universe couldn’t hold virtual photons with wavelengths greater than 1 billion light years.
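
For scale, the textbook result for the attractive pressure between two ideal parallel plates (the cleanest version of the Casimir effect mentioned here) is P = π²ħc/(240 d⁴). A quick sketch using standard SI constants, not anything from the posts themselves:

```python
import math

hbar = 1.054571817e-34    # reduced Planck constant, J*s
c = 2.99792458e8          # speed of light, m/s

def casimir_pressure(d):
    """Attractive pressure in pascals between ideal parallel plates a
    distance d meters apart: P = pi^2 * hbar * c / (240 * d^4)."""
    return math.pi**2 * hbar * c / (240 * d**4)

for d in (1e-6, 100e-9, 10e-9):
    print(f"gap {d*1e9:7.1f} nm -> Casimir pressure {casimir_pressure(d):.3g} Pa")
```

At a one-micron gap that works out to roughly a millipascal, which gives a feel for how small the excluded-mode effect is at everyday separations.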

Is this all utter poppycock? Almost certainly. However, as theories go, I don’t believe it falls into the tin-foil hat category.

We are taught that the speed of light is a constant, and assuming it to be constant certainly makes the math easier, just as ignoring the subtleties of quantum mechanics in favor of Newtonian mechanics makes the math easier. However, there is no reason that I know of that the speed of light couldn’t be replaced by a function, especially one that depends on something like the energy density of the universe.

Perhaps the best way to put this is that, so far as we observe (putting aside the recent observations), the speed of light is a constant. That does not, however, mean that the speed of light is not actually a function of something. It means merely that the function has a constant value. Nor does it mean that the speed of light must be a constant in all possible universes. Maybe it is and maybe it isn’t, but you can’t get conclusive results by generalizing from one data point.

Darn, the bit gremlins ate my last attempt at this.

Lib

There is a thing called the Standard Model. It is a description of the characteristics of the universe, including particles, forces, space, time, and a very complex set of interactions among those things. To describe that model, we use mathematics and define certain things to have certain values with respect to other things. Some things have intrinsic mathematical values themselves. They are pure numbers. Like pi.

We choose values for some things. Distance exists as an aspect of space. We quantify that aspect in meters, which we now define as the distance light travels in a specific fraction of a second. Time is another, and we currently express the second as the duration of a specific number of periods of the radiation from a particular transition in a cesium atom. These definitions are chosen so that anyone can determine the precise value without referring to a single physical object. (We still use a lump of metal for the kilogram, because we can’t find a substitute that satisfies everyone.)

So, we come to constants. Light travels at the same speed every place we have measured it. It is constant. Moving toward something or away from it, it remains constant, no matter how fast you or the light source is moving, or in what direction. The charge of an electron is the same, no matter where it came from. All of the electrons have the same charge. It is a constant. There have been a lot of experimental results to back up that claim, and (not considering the one mentioned here for a moment) exactly zero experimental results showing anything else. From these values, other things can be calculated that are constant as well. The fine structure constant is one of them.
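
To make that last sentence concrete, here is the usual SI combination that builds the fine structure constant out of the electron charge, Planck’s constant, and the speed of light. The constant values are the standard CODATA ones quoted from memory, so treat the trailing digits loosely:

```python
import math

e = 1.602176634e-19        # electron charge, C
hbar = 1.054571817e-34     # reduced Planck constant, J*s
c = 2.99792458e8           # speed of light, m/s
eps0 = 8.8541878128e-12    # vacuum permittivity, F/m

alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(f"alpha = {alpha:.9f}, 1/alpha = {1/alpha:.3f}")   # ~1/137.036
```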

Now, the Standard Model is an intellectual construct. It seeks to map the real universe with an accurate and complete mathematical description that accounts for every single reliable experimental result and that includes results predicted, but not yet tested. So far that model has been extremely robust. But there are some tiny little errors, in some very specific phenomena, where the numbers don’t come out right. One of these is what is reported in that article. Light that struck a cloud of gas a very long time ago and very far away was absorbed and re-emitted by the atoms it encountered. The light frequency emitted in such an event is predicted by the fine structure constant. It must be just so much for each type of matter. The nature of both the electron and the energy of the photon involved are quite specific. But in this case, it seems to be wrong.

That means the values for that electron-photon interaction are different. That implies that the fine structure constant was different when it happened. If the fine structure constant was different, then either the electron charge, or the speed of light, or the way space transmits the forces of particles was different. It is hard to imagine how that can happen under the Standard Model. I am sure a lot of people will try, though.
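
A toy illustration of the sensitivity being described here: fine-structure splittings between closely spaced levels scale roughly as alpha squared, so a small fractional change in alpha shows up doubled in the splitting. The change of one part in 10^5 and the sodium D doublet below are stand-ins I picked for illustration, not the lines or numbers from the actual measurement:

```python
# Hypothetical fractional change in alpha, for illustration only:
delta = -1e-5

# Sodium D doublet as a stand-in for the lines actually used:
lam1, lam2 = 588.995e-9, 589.592e-9     # wavelengths in meters
splitting = lam2 - lam1

# splitting scales roughly as alpha^2, so its fractional change is ~2*delta
shift = 2 * delta * splitting
print(f"doublet splitting: {splitting*1e12:.1f} pm")
print(f"shift in splitting for delta = {delta:+.0e}: {shift*1e12:+.4f} pm")
```

The point of the sketch is just that the signal lives in sub-picometer shifts of line spacings, which is why it takes careful quasar spectroscopy to see it at all.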

But the Standard Model still predicts all the things it used to predict, with all the same accuracy. And the nearby volume of space, which we can examine more or less directly, does not seem to have any such characteristic. So we may have an experimental error, or we might have a characteristic of the early universe that differs from the current characteristic of the universe. But it does not mean that the universe itself has some flaw or change. It means our model may not accurately describe the universe. We already knew it didn’t, by the way; we just didn’t know exactly how it didn’t. Building and tearing down cosmological models is what you do. Each one becomes more finely and more extensively accurate than the last. This is no different.

The press always reports this sort of stuff as “the end of Science as we know it.” After they move on to the next Britney Spears album, though, science as very few know it moves on and begins finding answers to questions about the new data. Perhaps it means a new Extended Standard Model. Perhaps it will be an entirely new way of looking at phenomena we now call particles, waves, space, and time. Or perhaps not.

The butcher shop won’t need new weights, nor will the global positioning system need an upgrade. A pint will still weigh a pound, or thereabouts.

Tris

“If we are going to stick to this damned quantum-jumping, then I regret that I ever had anything to do with quantum theory.” ~ Erwin Schrödinger ~

Truth, this theory of yours depends quite heavily on basic assumptions about the topology and time-coherence of the universe.

If photons are really limited, Casimir-effect-like, to having nodes at the observational limit (defined by how long the universe has been around), then there has to be some way of understanding this relativistically. Granted, we don’t have a quantum theory of general relativity quite yet, but if you will let us suppose for a moment that there is some limit on the wavefunction of the virtual photons that is defined by the limits of the universe, then you have to explain how such information is available. Basically, the size of the universe for any given photon is whatever the size of the universe is during its lifetime (which is now, and never then or in the future). Therefore I cannot accept that there is a fundamental difference between the unconfined vacuum of the universe today and that of yesterday, for how does the photon know what epoch it was created in, if it’s a photon? It will never reach the edge, after all! The informational “size” of the universe is simply not available to these photons’ wavefunctions. Basically, there is nothing to say that the edge of the universe has to act like the side of a box.

Tris

Thanks for that insightful and prodigious reply. Is the philosophical description of the universe as “an electromagnetic field suspended in gravity” scientifically accurate?

Isn’t it possible (and probable) that their conclusion is wrong? Possibly based on a mis-observation or a miscalculation?

I showed this article to a science-conscious friend, who told me that a few months ago scientists claimed to have discovered the “color of the universe” based on the Doppler shift, until someone found an error in their program.

If relativity is wrong, then how has it worked out so well for the last 70 years or so?

And below that “much more profound level” there will be an even more profound level. You know, that old lady could have been right. Maybe it is “turtles all the way down.”

[hijack] Isn’t one dm^3 of water at 4 degrees C one kg? [/hijack]

According to my physics book Fundamentals of Physics (4th Ed.), “The SI unit of mass is a platinum-iridium cylinder kept at the International Bureau of Weights and Measures near Paris and assigned, by international agreement, a mass of 1 kilogram.”

Click here for a thread-relevant musical statement: Albert the Genius, by County Farm.

http://artists2.iuma.com/IUMA/Bands/County_Farm/

or here, for a more direct link to the mp3 stream.

http://artists2.iuma.com/site-bin/streammp3.m3u?189897

No. It is imprecise, incomplete, and not specifically accurate. It does have a pleasing euphony, though. :wink:

If you want accurate, you have to say the universe is the union of all sets. The universe must include imaginary things, if the beings that imagine them are a part of the universe.

Now, if you want to limit the universe to non-imaginary things, it gets a bit more complicated. You have to include forces, and those entities upon which those forces act, the space in which the forces and entities exist, and the time during which the forces act upon the entities. I am not sure you are going to get a concise declarative definition of the universe that is going to be scientifically accurate in under a page or so. But even then, you may or may not have to include superstrings, branes, and lots of other possibly real things presently being imagined by some of the more esoteric thinkers.

It seems to me that, with our current forces acting on particles which act like waves, in a space which gets distorted by their mass and expands over time, we are approaching the inherent complexity that good sense says calls for a different description entirely. If you add virtual particles that transit the barrier of existence in both directions, I have to admit you are losing me. The dreaded new paradigm looms before us. Now, eleven-dimensional “strings” vibrating through “unexpanded” dimensions doesn’t sound all that much better, but I guess it doesn’t matter what it feels like to me.

The coming century will have at least as many great breakthroughs as the last, I think. A new model will develop, even if it is just a band-aid patched version of the Standard Model. I suspect we will have an entirely new model, though, when someone finally has the sudden insight that makes quantum mechanics entirely consistent with relativity.

Tris

“Here Kitty, Kitty, Kitty.” ~ Erwin Schrödinger ~

Just a few bullet-points.

  1. Clearly, what is at issue is not whether “the speed of light is changing,” but whether some formulae involving the constant “C” are yielding results that depart from what is predicted. It’s the ratios that (may) be changing; the fact that C remains C by definition is irrelevant, unless every constant that there is is fully defined in terms of all the rest.

  2. Relativity has always included certain assumptions that are not known to be true by logical necessity, but are accepted because they (seem to us to) make more sense than the alternative. Lawrence Sklar has written extensively on the role of “conventionalism” in the development of the modern view.

  3. What does “absolute constancy” mean in this context? Let us suppose that a certain phenomenon has been measured to an accuracy of X digits, where X is some suitably impressive number. There is nothing to be found at the Xth digit that announces, “end of the road.” No matter how many zeroes ensue, it is NEVER the case that one can conclude that the precise value has been hit upon. Indeed, when one considers that there are infinitely many more nonterminating numbers than terminating ones, it “feels” unlikely that any measure even HAS a perfectly precise value. (Unless all “precise” means is “it is what it is.”) What does it mean to say that something is measured by a value unknowable even in principle (because its completed expression is never realizable)? And if all such values are thus only approximations in the first place, did we ever have a constant at all?

  4. A little artsy, but… doesn’t it seem that a degree of vagueness and permissiveness is built into nature in general? That would make it not absurd for every so-called constant to really be a range swinging around a nonpunctile central tendency (think: a curve with a flattened top). Perhaps C varies in both the longest of long runs AND the shortest of short runs; and also with respect to spatial lengths.

Maybe all constants are ratios.

Lib, to some extent you’re right. All constants are ratios. More exactly, they allow for convenient conversions from one set of measurements to another. The relationships between all the constants have not yet been worked out, as we still have fundamentally two separate formulations of physics… one within quantum mechanics and one within general relativity. Both have to be right, but it’s hard to see how they can precisely be viewed in the context of each other.
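
As a small, hedged illustration of that “convenient conversions” point (my example, not anything from the posts): the product h·c is the conversion factor between a photon’s energy and its wavelength, the familiar 1240 eV·nm rule of thumb.

```python
h_c_eV_nm = 1239.84          # h*c in electron-volt nanometers (the "1240 rule")

def photon_wavelength_nm(energy_eV):
    """Wavelength in nanometers of a photon with the given energy in eV."""
    return h_c_eV_nm / energy_eV

print(photon_wavelength_nm(2.0))     # ~620 nm: the same photon, measured two ways
```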

By the by, with regard to the “color of the universe” nonsense… measuring a color is something that is not well defined in general. The colors you see in pictures of galaxies and nebulae are always doctored up to some extent. Your eyes looking at that area of the sky would never be able to see such structure, as the objects are so faint. Basically it’s a cones/rods problem as well as a threshold problem. For instance, the color “white” isn’t a color at all, but merely a RELATIVE oversaturation of photons, just as black is a RELATIVE undersaturation of photons. (I yell “RELATIVE” because it depends on how bright the environment is, i.e. night or day, to determine what we, as humans, observe colors to be.) Basically, the two guys got the right NUMBERS, but used the wrong color table. It’s all a matter of aesthetics anyway. There’s really not much that’s useful about being able to average all the visible light in the sky. The “science” was more of a publicity stunt than something that had real substance.

These measurements of the differences in spectroscopic lines in quasars have scientific substance (they are not, theoretically at least, arbitrarily subjective measurements like color is).

Sorry for returning to this thread so late. However, while looking for something else, I ran across something I hadn’t heard of before, the “Scharnhorst effect.” In sum, QED predicts that when the vacuum energy goes down, the speed of light goes up!

K. Scharnhorst, Physics Letters B236, 354 (1990)

Poor Dr. Scharnhorst had to do reams and reams of complicated mathematics to reach this conclusion, whereas all I had to do was reach back and pull it out of my . . . Well, anyway, maybe it is better to be lucky than smart!

I stumbled across an interesting paper on the Scharnhorst effect last year and commented on it here, but the post seems to be lost to the great purge.

Faster-than-c signals, special relativity, and causality

Here’s a good non-technical writeup: Exceeding light speed needs mirrors, no smoke
Note that the increase in speed doesn’t amount to much.

Maybe not necessarily. The effect is unmeasurably small at the separations at which we can experimentally measure the Casimir effect. However, as I understand it, the speed would continue to increase if you could increasingly dampen quantum fluctuations. In other words, if your reflectors were smooth enough and close enough together, you could increase c to an arbitrarily high number.

This isn’t (currently) experimentally possible. However, the fact that it may be theoretically possible is astounding enough.
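
To put a rough number on “unmeasurably small”: as I recall the published result, the fractional speed-up for light crossing perpendicular to the plates scales as α²(λ_C/d)⁴ with a small numerical coefficient (about 11π²/8100). The coefficient is my recollection rather than something stated in this thread, so treat it as approximate; the scaling is the robust part. A quick sketch:

```python
import math

alpha = 7.297e-3                  # fine structure constant
lambda_C = 3.862e-13              # reduced Compton wavelength of the electron, m
coeff = 11 * math.pi**2 / 8100    # coefficient as I recall it; treat as approximate

def scharnhorst_delta_c_over_c(d):
    """Fractional increase in c for light crossing perpendicular to ideal
    Casimir plates a distance d meters apart (a sketch, not gospel)."""
    return coeff * alpha**2 * (lambda_C / d)**4

for d in (1e-6, 1e-9):
    print(f"plate gap {d:.0e} m -> delta c / c ~ {scharnhorst_delta_c_over_c(d):.1e}")
```

Even at a one-nanometer gap the change comes out around one part in 10^20, which backs up the “doesn’t amount to much” caveat above.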

Back to the subject of the OP, the Scharnhorst effect plus an observation that light was going faster in the early universe equals at least a WAG that there were fewer quantum fluctuations in the early universe. I smell a (remotely) possible huge triumph for string theory in this and my own sentimental favorite, the brane hypothesis.

By the way, if all this should actually pan out, it would probably make inflation unnecessary, wouldn’t it?

Well, we’re getting rather speculative now, but it could very well be that this IS inflation, or more likely, quintessence. We may simply be looking at two different sides of the same coin.

Inflation is such a weird concept (changing the vacuum energy density, excuse me?) that we may very well be able to look at it through the lens that SPACE didn’t change at all. Perhaps it could be looked at as though the fundamental constants were changing… i.e. the speed of light. I haven’t sat down and done the modelling, but it could be that there is a direct symmetry between the two interpretations.

However, as far as I’m concerned, this is all mumbo-jumbo until we really are sure we’re talking about something.

Also, Truth, inflation explains the flatness of the universe (omega close to equaling one) and the characteristic size of large-scale structure (mass-mass correlation functions and power spectra). For these reasons, it’s probably best not to dismiss it entirely.