Doppler Shift in Moving FM Receivers

While in the car, trying not to think about school and physics class, something occurred to me. Because the car was moving either toward or away from the radio station, there should be a Doppler shift: an apparent change in the frequency of waves when an observer moves toward or away from the source. Because FM transmitters and receivers rely on tiny changes in frequency to carry a signal, it seemed to me that this should make an FM signal difficult or impossible to understand if the receiver were moving. Clearly this is not the case. Can anyone explain it to me?

The FM broadcast band is at about 100 megahertz, which is a wavelength of 3 meters. At 65 mph the velocity is about 30 meters per second. The frequency shift is the velocity divided by the wavelength, which works out to a shift of 10 Hz, and the automatic frequency control will easily handle that. Even without AFC, that much detuning is insignificant.
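The arithmetic above can be sketched in a few lines (using the standard non-relativistic approximation, shift = v / wavelength = f · v / c):

```python
# Back-of-the-envelope Doppler shift for a car-mounted FM receiver.
# Non-relativistic approximation: delta_f = v / wavelength = f * v / c.

C = 3.0e8          # speed of light, m/s
f_carrier = 100e6  # FM broadcast carrier, Hz (middle of the band)
v_car = 30.0       # about 65 mph, in m/s

wavelength = C / f_carrier    # 3 m
delta_f = v_car / wavelength  # 10 Hz

print(f"Wavelength: {wavelength} m")
print(f"Doppler shift: {delta_f} Hz")
```

Ten hertz out of a hundred million is a fractional shift of one part in ten million, which is far below anything the tuner cares about.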

And besides, you're rarely driving directly toward or away from the antenna, so your Doppler shift would be even less than that.
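To put a number on that: only the component of your velocity along the line of sight to the transmitter matters, so the shift scales with cos(θ), where θ is the angle between your heading and the direction to the antenna. A small hypothetical illustration:

```python
import math

def doppler_shift(v, wavelength, theta_deg):
    """Doppler shift in Hz for speed v (m/s), wavelength (m), and the
    angle theta_deg between the heading and the direction to the antenna.
    Only the line-of-sight velocity component v*cos(theta) contributes."""
    return (v / wavelength) * math.cos(math.radians(theta_deg))

print(doppler_shift(30, 3.0, 0))   # head-on: the full 10 Hz
print(doppler_shift(30, 3.0, 60))  # oblique approach: only 5 Hz
print(doppler_shift(30, 3.0, 90))  # driving broadside: essentially zero
```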

Didn’t some clown once try to argue (against a summons for running a red light) that Doppler shift had made it look green?

Hehe, even if he argued it (blue, green, being colorblind, what’s the diff?), he would have been going way over the speed limit!! :slight_smile:

To add to what David Simmons said, yes, there will be a Doppler shift of about 10 Hz if you’re going straight toward or away from the tower at highway speed. 10 Hz is way less than the tuning error of your receiver anyway.

But even if you were going much faster, FM works by moving the frequency back and forth very fast (to track the amplitude of the sound at each instant), and your Doppler shift is just a roughly constant offset on top of that, so the FM will work anyway.
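A rough comparison makes the point (assuming the standard US broadcast parameters): the audio is encoded as a frequency deviation of up to ±75 kHz around the carrier, while highway-speed Doppler adds a roughly constant ~10 Hz offset to everything. The receiver just sees a very slightly detuned carrier.

```python
# Compare highway-speed Doppler offset to the frequency swing that
# actually carries the audio in US broadcast FM (+/- 75 kHz peak deviation).

max_deviation_hz = 75_000  # peak FM deviation, US broadcast standard
doppler_offset_hz = 10     # highway-speed Doppler shift at ~100 MHz

ratio = doppler_offset_hz / max_deviation_hz
print(f"Doppler offset is {ratio:.4%} of the peak deviation")
```

Since the offset is a tiny fraction of a percent of the deviation, it shifts the whole signal imperceptibly rather than distorting the modulation.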

I’ve noticed that the radius of a radio signal seems larger when driving away from the signal than it does driving into the signal.

For example, a radio signal may fade 60 miles from downtown when driving away from downtown.

However, when driving back into town, toward downtown, I don’t pick up the same radio signal until I’m about 45 miles from downtown.

What’s up?

[apologies if the question is a hijack]

Once your radio is tuned to a particular frequency, it will amplify whatever signal it picks up, even if it’s only white noise. However, the “scan” function on many car radios will ignore any signal below a certain strength. This threshold can seem pretty arbitrary at times, skipping past stations that sound fine when I tune to them manually, or locking in on frequencies with no intelligible signal at all.

If you set your car radio to scan the FM band, you will almost certainly miss faraway stations. But if you tune manually one step at a time, you’ll probably get some intelligible (though staticky) signal that Detroit decided you could live without, probably because weak stations sound a little rough and they didn’t want you blaming the radio.

Bearflag70, the effect you’re observing is presumably due to the shape of your car/antenna, rather than your motion. It may be that your antenna picks up signals from the rear more efficiently than those from the front. If you ever have the time and inclination, try parking somewhere in the “in-between zone”, facing different directions, and see whether that makes a difference.

Another possibility that occurs to me: You’re mostly driving downtown in the morning, and back home in the evening, right? The radio station’s license might allow it to broadcast at different powers at different times of day.

Coincidentally, I was reading a BBC research paper on Friday assessing the current state of commercial digital radio receivers. That many of them fail to account for Doppler effects was one of their complaints.

ITU and 3GPP reference models for mobile phone air channels actually include the Doppler effect even for pedestrians. Whether this makes a significant difference in practice, I don’t know offhand.