I bought a cheaper mapping GPS receiver the other day and am amazed by its abilities. It has already shown me some shortcuts I had not considered in the city, and I have used it on a hike in the Rocky Mountains.
I have a “dummies” understanding of how the GPS system works, but I was looking for an article or explanation of how the GPS receiver is able to detect the extremely weak signals from the satellites. I know it has something to do with the fact that the receiver knows the sequence of signals it is supposed to receive, and is able to fill in a lot of holes.
Still, consider that most satellite dishes have to be very carefully aligned with a single satellite and use a parabolic dish to concentrate the signal, compared with this unit bouncing on my passenger seat while simultaneously tracking 10 satellites. How does it do it?
It’s a more or less omnidirectional antenna on the receiver, and the GPS satellites themselves orbit at a much lower altitude than the fixed geostationary satellites used for TV. This means they can deliver a stronger signal at ground level using the same amount of power as a geostationary satellite. The stronger signal is the major reason an omni antenna can work for GPS where it cannot (currently) with satellite TV.
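A back-of-the-envelope comparison of those altitudes, assuming a receiver directly beneath each satellite and ignoring antenna gains entirely (the altitudes are the standard figures; everything else here is simplified):

```python
import math

GPS_ALT = 20_200e3   # GPS orbital altitude, m (medium Earth orbit)
GEO_ALT = 35_786e3   # geostationary orbital altitude, m

# Free-space power density falls off with the square of distance, so the
# same transmit power arrives this much stronger from GPS altitude:
ratio = (GEO_ALT / GPS_ALT) ** 2
ratio_db = 10 * math.log10(ratio)
```

That works out to a roughly threefold-to-fourfold (about 5 dB) power advantage from altitude alone, before antenna gains are considered.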
It is a question of how much data needs to be received. There is a basic theorem in communication theory, called Shannon’s limit, that describes how many bits per second can be sent over a communications channel. It depends on signal power, noise, and bandwidth. GPS sends very little data compared to satellite TV, so it can get by with a much weaker signal.
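A back-of-the-envelope sketch of Shannon’s limit, C = B · log2(1 + S/N). The bandwidth figure is roughly that of the GPS C/A signal, and the SNR is illustrative, but it shows how a signal buried 30 dB below the noise floor can still carry the navigation message’s roughly 50 bits per second:

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon's limit: the maximum error-free bit rate of a noisy channel."""
    return bandwidth_hz * math.log2(1 + snr_linear)

bandwidth = 2e6           # ~2 MHz, roughly the GPS C/A signal bandwidth
snr = 10 ** (-30 / 10)    # signal 30 dB *below* the noise floor (illustrative)

capacity = shannon_capacity(bandwidth, snr)  # a few thousand bits/s
```

Even at that dismal SNR, the theoretical capacity is a few kilobits per second, comfortably above what GPS needs; a TV signal at megabits per second would be hopeless on the same channel.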
GPS signals are transmitted with a technique called direct sequence spread spectrum (DSSS). A low-rate data stream is mixed with a high-rate spreading code and then used to modulate a carrier. The key to recovering the low-rate data stream at the receiver is to despread the received signal by mixing it with the high-rate spreading code. The hard part is synchronizing the spreading code generator in the receiver with the spreading code generator in the transmitter. This is done with digital correlators. By using large numbers of digital correlators operating in parallel, sensitivity is improved and acquisition time is reduced. The antenna in the receiver is omnidirectional, unlike a parabolic dish antenna, which is very directional.
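A toy despreading sketch of that idea (the code length matches GPS’s 1023-chip C/A code, but the code itself is random here and the noise level is arbitrary; a real receiver also has to search code phase and Doppler to achieve the synchronization described above):

```python
import random

random.seed(1)
CHIPS = 1023  # same length as the GPS C/A code (the code here is random)
pn = [random.choice((-1, 1)) for _ in range(CHIPS)]  # known to both ends

data = [1, -1, -1, 1]  # low-rate data bits

# Transmitter: each data bit is multiplied by the whole spreading code
tx = [bit * chip for bit in data for chip in pn]

# Channel: bury the chips in noise several times stronger than the signal
rx = [s + random.gauss(0.0, 4.0) for s in tx]

# Receiver (already synchronized): correlate each bit-length block against
# the spreading code. Aligned chips add coherently while the noise averages
# toward zero -- the processing gain of DSSS.
recovered = []
for i in range(len(data)):
    block = rx[i * CHIPS:(i + 1) * CHIPS]
    corr = sum(r * c for r, c in zip(block, pn))
    recovered.append(1 if corr > 0 else -1)
```

The per-chip SNR here is well below 0 dB, yet the 1023-chip correlation recovers every data bit cleanly.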
The GPS signals are also channelized (by code, as it happens, rather than in the frequency or time domain) so that a highly directional antenna is not needed to avoid interference from other satellites… unlike TV signals, which need a lot of bandwidth, so the same “channel” may be used by several birds.
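Every GPS satellite broadcasts on the same carrier, each with its own spreading code, and correlation separates them. A toy sketch with random stand-in codes (real GPS uses carefully chosen Gold codes with guaranteed low cross-correlation):

```python
import random

random.seed(2)
CHIPS = 1023
# Each satellite gets its own spreading code (random stand-ins here)
code_a = [random.choice((-1, 1)) for _ in range(CHIPS)]
code_b = [random.choice((-1, 1)) for _ in range(CHIPS)]

bit_a, bit_b = 1, -1  # one data bit from each satellite

# Both signals overlap on the same carrier at the antenna
rx = [bit_a * ca + bit_b * cb for ca, cb in zip(code_a, code_b)]

# Correlating with one satellite's code recovers its bit; the other
# satellite's signal contributes only low-level noise to the sum.
corr_a = sum(r * c for r, c in zip(rx, code_a))
corr_b = sum(r * c for r, c in zip(rx, code_b))
```

The correlation against each code peaks near ±1023 for the wanted satellite while the unwanted one contributes only a residue on the order of √1023, which is why no directional antenna is needed to keep the birds apart.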
My first response was going to be “not very well”, as a GPSr will lose satellite contact under heavy tree cover. Most geocachers are aware of this (painfully), and the truly evil ones will hide their caches almost counting on it.
However, upon re-reading I see that he’s really asking about sorting out the signals. I’ll toss in the fact that the better GPSrs will include the WAAS (Wide Area Augmentation System) feature, which corrects for things like orbit error, drift, etc.
My point was that a fourfold signal improvement is not, by itself, enough to explain the difference between a largish directional dish TV antenna and the typical small patch antenna used by a GPS receiver.
That’s not true if the satellite has a high-gain antenna optimized for its orbit, which I believe most communications satellites do. If the antenna’s beam width is matched to the apparent size of the Earth, then almost all the transmitter’s power lands on the Earth (well, half of it) with very little spillover. Power density is simply the transmitter’s power divided by the coverage area (half the planet), and independent of distance.
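The arithmetic behind that claim, with a made-up transmitter power: if the beam just covers the visible hemisphere, the flux is power over coverage area, and orbital altitude never enters the calculation:

```python
import math

R_EARTH = 6_371e3   # Earth radius, m
P_TX = 100.0        # transmitter power in W (illustrative)

# Beam matched to the visible hemisphere: essentially all the power lands
# on half the planet's surface, so the flux at the ground is simply
# power / area -- no distance term anywhere.
coverage = 2 * math.pi * R_EARTH ** 2   # m^2, roughly half the Earth's surface
flux = P_TX / coverage                  # W/m^2
```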
In fact some satellites have even higher gain. IIRC, the Japanese TV broadcast satellites have asymmetric antennas with jellybean-shaped beam patterns, matched to the shape and size of their country.
And by the way, XM satellite radio is broadcast from geosynchronous orbit, and the receivers have omnidirectional antennas.
Very brief and high-level: the GPS receiver picks up a “timing signal” from each satellite, uses the delay to calculate distance, and then uses that information for triangulation… Part of the receiver’s computation is taking in as much information as possible and averaging out the most likely position, accounting for weak/bounced/inaccurate signals from some satellites. This is very much a “line of sight” transmission, and certain things reflect the signal more than others (walk close to a metal bridge or a sheer rock wall, and watch your position “jump” 30 feet away). In general, under fairly ideal conditions, the GPS will filter out the noise and provide good tracking with good accuracy.
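A toy 2-D version of the delay-to-distance step, with made-up positions (a real receiver works in 3-D and must also solve for its own clock error, which is why it needs four satellites rather than three):

```python
import math

C = 299_792_458.0  # speed of light, m/s

# Made-up 2-D positions, in metres
sats = [(0.0, 0.0), (10_000.0, 0.0), (0.0, 10_000.0)]
true_pos = (3_000.0, 4_000.0)

# Each timing signal arrives range/C seconds after it was sent;
# the receiver turns the measured delays back into ranges.
delays = [math.dist(true_pos, s) / C for s in sats]
ranges = [d * C for d in delays]

# Subtracting the range equation for satellite 0 from the others
# linearizes the problem:
#   2(xi-x0)x + 2(yi-y0)y = (xi^2+yi^2) - (x0^2+y0^2) - (ri^2 - r0^2)
(x0, y0), r0 = sats[0], ranges[0]
rows = []
for (xi, yi), ri in zip(sats[1:], ranges[1:]):
    rows.append((
        2 * (xi - x0),
        2 * (yi - y0),
        (xi**2 + yi**2) - (x0**2 + y0**2) - (ri**2 - r0**2),
    ))

# Two equations, two unknowns: solve by Cramer's rule
(a1, b1, c1), (a2, b2, c2) = rows
det = a1 * b2 - a2 * b1
x = (c1 * b2 - c2 * b1) / det
y = (a1 * c2 - a2 * c1) / det
```

With noise-free delays the solution lands exactly on the true position; a real receiver feeds many noisy measurements into a least-squares or Kalman-filter version of the same calculation.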
On another note, the more signals “bounce” and reflect, the more strength they lose. The newer SiRF chipsets can read signals down to something like 4 dB, where the older receiver chipsets weren’t sensitive below 11 dB (I probably have the dB scale wrong; I don’t remember which decibel scale we’re measuring here). This allows those chipsets to gather much, much more information for use in averaging. Depending on your usage and your location, this could be very good for you… although there’s still some question about how useful it is to pull in less-than-accurate signals for averaging…
It may be a nitpick, but there’s no triangulation involved. Measuring the difference in signal transit time (and thus distance) places the receiver on a hyperbola (or, in three dimensions, a hyperboloid of revolution) defined by a pair of satellites. The receiver then calculates where the hyperbolae from various satellite pairs intersect, and thus its position.
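A quick 2-D numeric check of that picture, with made-up foci: every point on the hyperbola branch through the receiver yields the same range difference between the two satellites:

```python
import math

# Two "satellites" as foci on the x-axis (made-up 2-D positions)
s1, s2 = (-5.0, 0.0), (5.0, 0.0)
receiver = (4.0, 3.0)

# The measured time difference fixes the range difference to the two foci
diff = math.dist(receiver, s1) - math.dist(receiver, s2)

# Parametrize the hyperbola branch with that range difference
# (semi-major axis a = diff/2, focal distance c = 5):
a = diff / 2
c = 5.0
b = math.sqrt(c * c - a * a)

# Every point on the branch shows the same range difference
diffs = []
for t in (-1.0, 0.0, 0.5, 2.0):
    p = (a * math.cosh(t), b * math.sinh(t))
    diffs.append(math.dist(p, s1) - math.dist(p, s2))
```

One time-difference measurement pins the receiver to that curve; intersecting the curves from other satellite pairs pins down the point.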