# Effect of Motion on Speed of Light

So, how is travel of light (or radio waves) affected by relative motion?

Right or wrong, I’m assuming a Doppler effect relative to the speed of light… What happens when two spaceships move in opposite directions as they each near ½ the speed of light? Do radio signals become delayed/slower, ceasing completely as the objects’ relative rate of separation approaches the speed of light? Would transmissions be slowed relative to the speed at which they move apart? Would this affect the speed of movement within the broadcast received… i.e., slower motion when moving apart at higher speeds?
What about light and the speed of light? How does motion affect the appearance of a shining object? If bright enough to be visible light years away, would its light diminish as it moved further away, then disappear completely as it reached light speed (regardless of its relative direction of movement)? Would its light change colors (effect upon color spectrum)?

There are two different issues. One is that light travels at the same speed no matter how you move relative to it; this surprising result is accommodated by the nature of the space and time it moves through. The other is the straightforward issue that the time difference between successive waves does what you’d expect due to relative motion, which causes the Doppler effect.

Radio or light signals passed between rocket ships as they change their relative velocity will always hit the receiving ship at the same velocity relative to the ship (imagine a photon speedometer on the ship - it will always catch the signal and register the same speed, never faster or slower). It will also be at the same velocity relative to anything else (which is very much the point of special relativity).

The frequency (or wavelength) of the received signal will depend on the different speeds involved. Intriguingly, it is not necessarily what you’d calculate based on the present speeds of the two rocket ships, nor what you’d calculate based on the speed of the emitting ship at the time of emission and the speed of the receiving ship at the time of reception. It is complicated.

Viewing an object which is fleeing you, you would see its appearance modified by redshifting. That is, the wavelengths are all increased, relative to what you’d see if your relative velocity were zero.

It is somewhat harder to picture because we have a mental concept of “now” that extends everywhere. We might think, if someone here on earth and someone else on the moon set off a camera flash at the same instant, etc etc. But things that happen at different locations don’t have just one version of relative timing. After you account for how long it takes to get a signal from here to there, you will still have a difference between the two different “nows” that depends on how you are moving. It isn’t an artifact of how you are doing the observing - the time and space components are intermingled in a way that depends on velocity.

The ships could not travel at the speed of light relative to each other. If they both start off at the same speed and accelerate to 1/2 c in opposite directions, you might expect them to end up travelling at c relative to each other, but according to the theory of relativity velocities combine as w = (u + v)/(1 + uv/c^2), so they would end up travelling at 0.8 c relative to each other. So yes, the signals would be redshifted like light from other galaxies is, but the signal would always be there.
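The velocity-addition arithmetic above is easy to check numerically. Here is a minimal sketch (the function name is mine, not from the thread), assuming units where c = 1:

```python
def add_velocities(u, v, c=1.0):
    """Relativistic velocity addition: the relative speed of two
    objects moving at u and v in opposite directions, as judged
    by an observer who sees them depart symmetrically."""
    return (u + v) / (1 + u * v / c**2)

# Two ships each receding at 0.5c in opposite directions:
print(add_velocities(0.5, 0.5))  # 0.8, not 1.0
```

Note that the result can never reach c for any u, v below c, which is why the signal between the ships is always redshifted but never lost.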

Spaceship ET travels away at .5c for two years, transmitting a (very long) song with a rhythm of 100 beats/sec. A new broadcast of the same song (still playing) takes an entire year to reach me (right?); the original broadcast, though, continues at 100 bps? Why weren’t we losing time with our original song?

Loss of time (& rhythm) proportionate to rate & time of travel, that is.

I’m not sure I understand what you’re getting at, but the rhythm is only 100 bps in ET (and anything travelling with it). As soon as it is moving away from earth, the beat will be slower as received on earth; you don’t need to wait the two years.

Thanks, that’s exactly the confirmation I was looking for: a Doppler effect exists for light travel. Radio transmission, for instance, should logically be calculated using the rate of departure. I have heard otherwise from so many sources, most of them applying Einstein’s Special Theory of Relativity. It doesn’t make sense, though, to think there is no delay of signals received from a source moving away at a continuous high speed. Logically, departure at precisely 1/2 the speed of light should cut in half the pace of any signal at light speed from that source.

No, the signal’s speed would not be reduced by any amount. It’s still traveling at the speed of light. Instead, you would see the frequency of the signal shifted. You would have to calculate the new frequency using the Lorentz equations. Calling the old frequency F, the new frequency N, velocity v and speed of light c, the transformation would be:

N = F * (1 - v/c)/sqrt(1 - v^2/c^2)

So, for v = 0.5c, you’d get the equation:

N = F * (1 - 0.5)/sqrt(1 - 0.25) = 0.5 * F * sqrt(4/3) = F * sqrt(1/3)

The square root of 1/3 is about 0.58, so the frequency would be reduced to 58% of the original frequency.
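As a sanity check on that arithmetic, the formula can be applied directly to the 100 beats/sec song from the earlier post. A minimal sketch (the function name is mine), again using units where c = 1:

```python
import math

def doppler_shift(f, v, c=1.0):
    """Observed frequency of a signal from a source receding at
    speed v, using the relativistic Doppler formula
    N = F * (1 - v/c) / sqrt(1 - v^2/c^2)."""
    beta = v / c
    return f * (1 - beta) / math.sqrt(1 - beta**2)

# ET's 100 beats/sec song, received on Earth while ET recedes at 0.5c:
print(doppler_shift(100.0, 0.5))  # about 57.7 beats/sec
```

So the song is heard at roughly 58 beats per second the whole time ET is receding, which matches the 58% figure above; the signal is slowed in tempo, never in travel speed.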