Are GPS signals scrambled for military bases?

Oh blimey, are you sure? :smiley:

Well… as it’s a huge topic, I shall have to choose an area to concentrate this post on, and you asked about ionospheric distortion, so… (any GPS professionals reading this, please just point out any mistakes I make, not the fact that I am simplifying a bit :slight_smile: )

GPS position fixing relies upon receiving signals from four or more satellites (four rather than three, because the receiver's own clock error is an extra unknown alongside its three position coordinates). These signals contain two primary pieces of information: 1) the position of the satellite (the ephemeris data) and 2) the time at which the signal was sent. The very clever bit is how the receiver on the ground uses the timing signals to determine what the precise time is on the ground. Once it has done this, it can determine how long each signal took to reach it, and therefore (using trilateration in conjunction with the ephemeris data) where the receiver is.
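If it helps to see the shape of that calculation, here's a rough Python sketch of the combined position-and-clock solve (the iterative least-squares version of the trilateration I mentioned). The satellite positions and pseudoranges are completely made up, just numbers of roughly the right magnitude; a real receiver does all this in firmware with far more care.

```python
# Rough sketch of the position/clock solution a GPS receiver performs.
# Satellite positions and pseudoranges below are invented for illustration;
# a real receiver gets them from the broadcast ephemeris and code tracking.
import numpy as np

C = 299_792_458.0  # speed of light, m/s

# Hypothetical satellite positions (ECEF, metres) and measured pseudoranges (metres)
sats = np.array([
    [15_600e3,  7_540e3, 20_140e3],
    [18_760e3,  2_750e3, 18_610e3],
    [17_610e3, 14_630e3, 13_480e3],
    [19_170e3,    610e3, 18_390e3],
])
pseudoranges = np.array([21_110_018.0, 21_116_829.0, 21_590_005.0, 21_402_718.0])

# Unknowns: receiver x, y, z plus clock bias (kept in metres, i.e. c * dt)
x = np.zeros(4)

for _ in range(10):  # Gauss-Newton iterations
    ranges = np.linalg.norm(sats - x[:3], axis=1)   # geometric distances
    predicted = ranges + x[3]                       # add the clock-bias term
    residuals = pseudoranges - predicted
    # Jacobian: unit vectors from satellites towards the receiver, plus a 1 for the clock term
    H = np.hstack([-(sats - x[:3]) / ranges[:, None], np.ones((len(sats), 1))])
    dx, *_ = np.linalg.lstsq(H, residuals, rcond=None)
    x += dx

print("Estimated position (ECEF, m):", x[:3])
print("Estimated receiver clock error:", x[3] / C * 1e3, "ms")
```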

High precision GPS work is always carried out by relating one receiver on the ground to another at a known point. This is known as differential position fixing. By this means, errors in the calculations are minimised and a ‘known’ position of the ‘unknown’ receiver is calculated. It is not possible to achieve really high position fixing precision without differential techniques, and this is where ionospheric distortion and the L2 frequency come into play.
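A toy illustration of the differential idea, with every position and error value invented: because the base receiver sits on a surveyed point, whatever error it sees on each satellite's range can be broadcast as a correction for a nearby rover to subtract.

```python
# Toy simulation of differential GPS. The base receiver is on a surveyed
# point, so the difference between its measured pseudorange and the true
# geometric range is (mostly) error common to nearby receivers; broadcasting
# that difference lets the rover cancel the same error. All values invented.
import numpy as np

sat_positions = np.array([
    [15_600e3,  7_540e3, 20_140e3],
    [18_760e3,  2_750e3, 18_610e3],
    [17_610e3, 14_630e3, 13_480e3],
    [19_170e3,    610e3, 18_390e3],
])
base_pos  = np.array([1_917_032.0, 6_029_782.0, -801_376.0])  # surveyed base point (ECEF, m)
rover_pos = np.array([1_917_530.0, 6_029_650.0, -800_940.0])  # ~700 m away; unknown in real life

common_error = np.array([4.1, -2.3, 3.6, 1.9])  # metres: atmosphere, ephemeris, satellite clock

def measured_pseudoranges(pos):
    """Simulated measurement: true geometric range plus the shared errors."""
    return np.linalg.norm(sat_positions - pos, axis=1) + common_error

# The base knows where it is, so it can isolate the error on each satellite:
corrections = measured_pseudoranges(base_pos) - np.linalg.norm(sat_positions - base_pos, axis=1)

# The rover applies those corrections before computing its own fix:
rover_raw       = measured_pseudoranges(rover_pos)
rover_corrected = rover_raw - corrections

print("Error left in raw rover ranges (m):      ",
      rover_raw - np.linalg.norm(sat_positions - rover_pos, axis=1))
print("Error left in corrected rover ranges (m):",
      rover_corrected - np.linalg.norm(sat_positions - rover_pos, axis=1))
```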

Now… the distances from the satellites to the receivers are calculated using the time it took each signal to reach the ground… but the signals don’t always travel at a constant speed. Free electrons in the ionosphere (the band approx 100-1000km from the surface of the earth) slow down the GPS signals. Because the ionosphere varies from place to place, and over time, the signals may be slowed by different amounts at different locations. Using both the L1 and L2 frequencies, the modelling software can approximate the effects of the ionospheric distortion. The slowing effect on the signals is frequency dependent, so comparison of the simultaneously transmitted L1 and L2 signals gives a good indication of the ionospheric delay.
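That frequency dependence (the delay scales roughly as 1/f²) is what makes the trick work. Here's a quick sketch, with an invented electron-content value, of the standard ionosphere-free combination of the L1 and L2 pseudoranges:

```python
# Sketch of the dual-frequency trick: the ionospheric group delay scales (to
# first order) as 1/f^2, so simultaneously measured L1 and L2 pseudoranges
# can be combined to estimate -- and cancel -- that delay. Values simulated.
F_L1 = 1575.42e6   # L1 carrier frequency, Hz
F_L2 = 1227.60e6   # L2 carrier frequency, Hz

# Simulated 'truth', for illustration only:
true_range = 21_500_000.0   # geometric range, metres
tec = 3.0e17                # total electron content along the path, electrons/m^2

def iono_delay(freq_hz, tec):
    """First-order ionospheric group delay in metres (40.3 * TEC / f^2)."""
    return 40.3 * tec / freq_hz**2

# What the receiver measures on each frequency:
p1 = true_range + iono_delay(F_L1, tec)
p2 = true_range + iono_delay(F_L2, tec)

# Ionosphere-free combination: the 1/f^2 terms cancel out.
p_iono_free = (F_L1**2 * p1 - F_L2**2 * p2) / (F_L1**2 - F_L2**2)

# The delay itself can also be estimated from the L1/L2 difference:
est_delay_l1 = (p2 - p1) * F_L2**2 / (F_L1**2 - F_L2**2)

print(f"L1 pseudorange:        {p1:,.2f} m  (iono delay {p1 - true_range:.2f} m)")
print(f"L2 pseudorange:        {p2:,.2f} m  (iono delay {p2 - true_range:.2f} m)")
print(f"Iono-free combination: {p_iono_free:,.2f} m  (error {p_iono_free - true_range:.4f} m)")
print(f"Estimated L1 delay:    {est_delay_l1:.2f} m")
```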

In the absence of an L2-capable receiver, the software has to rely on the broadcast ionospheric model. This limits the position-fixing accuracy to the few metres you typically see in single-frequency receivers using autonomous (non-differential) techniques.

Hmm… as you can see, I don’t teach this stuff, and I am finding it hard to make much sense … is any of this actually helpful? :rolleyes:

Oh, and I really should have addressed the OP … I have conducted numerous GPS surveys and exercises in and around military facilities, and the only trouble I have had is radio interference on voice and radio-modem data channels… never had any trouble with the GPS signals.

Doing a great job, very interesting stuff.

Glad it’s of interest to someone :slight_smile:

Just as an interesting (?) fact… a clock error in the receiver of one millisecond would result in a distance-from-satellite error of around 300,000 metres… (radio signals and speed of light and all that). So you can see why timing of the signals is so critical!
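In code, for anyone who wants to play with the numbers:

```python
# The "300,000 metres per millisecond" figure is just distance = speed of light x time.
C = 299_792_458.0                            # speed of light, m/s
for clock_error_s in (1e-3, 1e-6, 1e-9):     # 1 ms, 1 us, 1 ns
    print(f"clock error {clock_error_s:.0e} s  ->  range error {C * clock_error_s:,.1f} m")
```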

Anyway, I’ll shut up on the subject now… unless someone wants to ask any questions… a new thread perhaps.

Regarding the lack of cell phone service on a military base, one factor is the lack of an available tower. The other would be signal blocking (NOT JAMMING!) by the buildings. Especially on a base with old, permanent structures, the buildings would have lots of concrete and steel to survive bombardment; the signal attenuation is present in newer buildings, too, but to a smaller extent because the buildings aren’t built as ruggedly.

There’s been enough discussion of GPS for me to say that the explanations are basically right.