Not sure what the problem is that they’re talking about here. Doppler effects on RF transmissions for Earth-based vehicles don’t shift frequencies enough to be problematic as the emitting source falls through the atmosphere, so why would this be a concern on Titan?
It depends on the frequency of the telemetry transmission. And I guess they had to work that one out to provide the best performance given the probe’s size, the available space for an antenna, and the propagation loss in the atmosphere.
As an example, the ‘X’-band microwave speed detectors used as the old speed-trap radar in NZ operate at 10525 MHz. At that frequency, a speed of 50 km/h gives about a 1 kHz frequency shift (radar is a two-way path, so the shift is doubled). If I still had any of my training notes I’d look up the formula for frequency shift.
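Since the training notes are gone, here is the standard two-way radar Doppler formula as a quick check of the figure above (the shift a radar sees is f_d = 2·v·f/c, since the path is out and back):

```python
# Rough check of the speed-trap figure: two-way (radar) Doppler shift.
# The reflected signal is shifted by f_d = 2 * v / c * f_tx.
C = 299_792_458.0  # speed of light, m/s

def radar_doppler_shift(v_mps: float, f_tx_hz: float) -> float:
    """Two-way Doppler shift seen by a radar for a target closing at v_mps."""
    return 2.0 * v_mps * f_tx_hz / C

v = 50 / 3.6   # 50 km/h in m/s
f = 10.525e9   # X-band speed radar, 10525 MHz
print(round(radar_doppler_shift(v, f)))  # ~975 Hz, i.e. about 1 kHz
```

which agrees with the “about 1 kHz” remembered from training.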
The article doesn’t say what speed Huygens will reach, but add that to Cassini’s orbital speed and you can guess how fast the relative motion will be.
The Shuttle in low Earth orbit does about 30 000 km/h. I have no idea what the orbit around Saturn is like, but I imagine the speeds are much higher on account of Saturn’s much stronger gravitational attraction.
So: a high speed difference between Huygens and Cassini, combined with even a moderate telemetry frequency of a few hundred MHz, gives a reasonable Doppler shift.
The Doppler shift is going to depend on the relative velocity of the spacecraft, which may be higher than you think.
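To put a number on that, here is a hedged back-of-envelope sketch of the one-way shift on the probe-to-orbiter link. The relative speed and carrier frequency below are illustrative guesses, not the mission’s actual numbers:

```python
# Back-of-envelope: one-way Doppler shift between a probe and an orbiter.
# The 5 km/s relative speed and 400 MHz carrier are illustrative guesses only.
C = 299_792_458.0  # speed of light, m/s

def one_way_doppler_shift(v_rel_mps: float, f_carrier_hz: float) -> float:
    """One-way Doppler shift for a transmitter closing at v_rel_mps."""
    return v_rel_mps * f_carrier_hz / C

print(round(one_way_doppler_shift(5_000, 400e6)))  # several kHz of shift
```

Even at a few-hundred-MHz carrier, kilometre-per-second relative speeds give shifts of several kHz, which matters if the receiver’s tracking bandwidth is narrow.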
Many telemetry receivers use phase-locked-loop designs in which a tracking loop is phase locked to the carrier of the incoming signal. The Doppler shift can then be measured by measuring the frequency of the tracking loop. This is important information for determining the spacecraft’s current position/velocity and for generating predictions of its future position/velocity.

The loop bandwidth of the tracking loop is often very small, which can cause problems during signal acquisition or after interruptions in the signal. To reacquire the signal, the frequency of the tracking loop can either be reset to the predicted frequency of the signal, or swept across a frequency band that is guaranteed to include the signal. To predict the frequency of the signal, you need to know the actual (measured) frequency of the transmitter and the current Doppler shift.
Measured Doppler shift can tell you useful things such as atmospheric density (drag) as a probe descends through the atmosphere, or that a parachute has deployed and slowed the probe.
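Inverting that: from a series of measured shifts you can recover the relative speed, and a sudden deceleration stands out clearly. A toy sketch with made-up data (the carrier and the shift samples are invented for illustration):

```python
# Sketch: invert measured Doppler shifts back to relative speed, and flag a
# sudden deceleration (e.g. parachute deployment). All data here are made up.
C = 299_792_458.0  # speed of light, m/s
F_TX = 400e6       # hypothetical carrier frequency

def speed_from_shift(f_shift_hz: float) -> float:
    """Relative closing speed implied by a one-way Doppler shift."""
    return f_shift_hz * C / F_TX

shifts_hz = [8000, 7900, 7800, 4200, 4100]  # sudden drop at sample index 3
speeds = [speed_from_shift(s) for s in shifts_hz]

# Flag samples where the speed drops by more than 1 km/s between samples.
events = [i for i in range(1, len(speeds)) if speeds[i - 1] - speeds[i] > 1000]
print(events)  # [3] -- the deceleration shows up at sample 3
```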
OK, I understand now (I think): it’s not the RF integrity of the signal that was of concern, but rather that the predicted Doppler shift, which would be used to measure speed, atmospheric density, etc., was correctly calibrated for the atmosphere.