A few months ago the US military announced it would stop its GPS satellites from intentionally degrading the positioning signal for everybody but the US military, improving civilian accuracy to within 10 metres or so. However, some time before this my lecturer at uni assured us that GPS was the most accurate way of surveying ground, accurate to within millimetres. How could that be if the signals are only accurate to within 10 metres on the ground?
The raw GPS signals from orbit only allow the limited accuracy you mention. By reading several different channels at once, this accuracy can be increased tremendously - my Garmin GPS 3+ will now sometimes give a Position Error of “6 feet”, where before it was never better than 48 feet.
I think there are other systems that use a ground-based signal correction to the GPS signal, which allows for far greater accuracy than I mentioned. But to millimeters? I have serious doubts, but admit that I do not know.
I’ve seen ads for 10-centimeter accuracy. These were backpack models, and required significant computer power to think about the answer (which might be done later, after the raw data is stored).
I always found it interesting that the US military (Army?) intentionally scrambled the signal for reduced accuracy, while the US Coast Guard transmitted differential GPS signals to undo the scrambling (the ground-based correction Anthracite mentioned). The Coast Guard signals were only available near the coastline; the rationale was that the enemy wouldn’t be helped much by it (presumably they would stop sending it in case of attack anyway :)).
- I ran across a site that said before the signals were unscrambled, the best you could do with a single sample from a consumer-grade GPS unit was a 95% confidence radius of about 48.5 yards. With the scrambling turned off, a single sample now falls 95% of the time within about 5 yards. By staying in the same place and taking repeated samples (5-10 samples, once per second), most units can narrow that to less than one yard. - MC
As I understand it, even with the scrambling, you can get theoretically arbitrary precision by collecting data over a sufficiently long time. This is fine for geologists and cartographers, but an invading enemy probably can’t afford to wait for a few days to get their position.
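A quick sketch of why averaging helps: if each fix carried an independent random error, the error of the mean of N fixes would shrink roughly as 1/sqrt(N). (In practice GPS errors are correlated over seconds to minutes, which is why useful averaging can take hours or days. The coordinates and the 5 m scatter below are invented purely for illustration.)

```python
import math
import random

def average_fixes(fixes):
    """Average a list of (lat, lon) fixes to reduce random error."""
    n = len(fixes)
    lat = sum(f[0] for f in fixes) / n
    lon = sum(f[1] for f in fixes) / n
    return lat, lon

# Simulate 100 fixes scattered around a made-up true position with
# ~5 m of independent Gaussian noise (expressed in degrees, using
# ~111,320 m per degree of latitude as a rough conversion).
random.seed(42)
true_lat, true_lon = 40.0, -75.0
sigma_deg = 5.0 / 111_320
fixes = [(random.gauss(true_lat, sigma_deg), random.gauss(true_lon, sigma_deg))
         for _ in range(100)]

avg = average_fixes(fixes)
err_m = math.hypot((avg[0] - true_lat) * 111_320,
                   (avg[1] - true_lon) * 111_320)
print(f"error after averaging 100 fixes: {err_m:.2f} m")
```

With independent errors, 100 samples should cut the ~5 m scatter down to well under a metre; real receivers do worse than this because consecutive fixes share most of their error.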
Ironically, the only time the GPS system has been used in warfare (Desert Storm), the signals were descrambled: there weren’t enough military-grade receivers for all the U.S. troops, and civilian units had to be issued instead.
Even my little hand-held Magellan 315 has an averaging mode. Before the scrambling was turned off, if you stood in one place and let it start averaging, it quit oscillating its last significant digit pretty quickly. The display mode I generally use shows hundredths of minutes. That’s about 60 feet of latitude, and significantly less of longitude unless you’re at the equator. Now the averaging doesn’t do anything, as far as the display tells you.
Using differential GPS you can get very accurate measurements.
Differential GPS works by placing a GPS receiver at a precisely known location. That receiver broadcasts the error it is currently seeing from the satellites, and your differential-capable receiver uses this information to cancel most of its own error. You can get very accurate results this way.
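A toy sketch of the idea (all coordinates invented; real DGPS corrects per-satellite pseudoranges rather than the final lat/lon, but the principle is the same): the base station knows its surveyed position, so whatever offset it measures is assumed to afflict a nearby rover equally and is simply subtracted out.

```python
def dgps_correct(rover_fix, base_fix, base_known):
    """Subtract the base station's measured error (its GPS fix minus
    its surveyed position) from the rover's raw fix."""
    err_lat = base_fix[0] - base_known[0]
    err_lon = base_fix[1] - base_known[1]
    return rover_fix[0] - err_lat, rover_fix[1] - err_lon

# Toy numbers: both receivers see the same ~0.0001-degree bias.
base_known = (40.000000, -75.000000)   # surveyed base location
base_fix   = (40.000100, -74.999900)   # what the base receiver reads
rover_fix  = (40.123100, -74.876900)   # raw rover reading

lat, lon = dgps_correct(rover_fix, base_fix, base_known)
print(round(lat, 6), round(lon, 6))    # → 40.123 -74.877
```

The correction only works while the rover is close enough to the base station to share the same atmospheric and satellite errors, which is why the Coast Guard beacons mentioned above only cover the coastline.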
What gazpacho said. And even then, the best you can say about your resolution is “sub-meter”. My Magellan 4000XL will give me an error estimate of ~1 m, but then, while I stay in the same spot for half an hour, the reading will swing ~5 m. (Using, of course, Universal Transverse Mercator [UTM] coordinates.) From experience using a non-differential GPS with the “new” satellite situation, I’ve found that +/- 5 m is about as good as it gets.
Of course, if you have it set to Deg/Min/Sec, one second of latitude or longitude (except close to the poles) is a lot more than 5 m, so you shouldn’t see any swing at all, resulting in a false sense of precision!
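For the record, here’s what one unit in the last displayed digit works out to on the ground, using a spherical-Earth approximation (R = 6371 km; the 45° latitude below is just an example). A second of latitude is about 31 m, and a hundredth of a minute is about 18.5 m, which matches the “60 feet” figure mentioned earlier in the thread.

```python
import math

def display_resolution_m(lat_deg, mode):
    """Ground distance covered by one unit of the last displayed digit.
    mode: 'second' (Deg/Min/Sec display) or 'hundredth_min'
    (Deg/Min.MM display). Spherical Earth, R = 6371 km."""
    R = 6_371_000.0
    m_per_deg_lat = math.pi * R / 180.0                       # ~111 km/degree
    m_per_deg_lon = m_per_deg_lat * math.cos(math.radians(lat_deg))
    step = {'second': 1 / 3600, 'hundredth_min': 1 / 6000}[mode]  # in degrees
    return m_per_deg_lat * step, m_per_deg_lon * step

lat_m, lon_m = display_resolution_m(45.0, 'second')
print(f"1 second at 45N: {lat_m:.1f} m of latitude, {lon_m:.1f} m of longitude")
```

So a display that only resolves whole seconds hides any swing smaller than ~30 m, and longitude resolution keeps improving (in metres) as you move toward the poles.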
This Trimble product claims 1 centimeter +5ppm accuracy with 45 minutes of averaging: http://www.trimble.com/products/catalog/gis/pro_xrs.htm
Here are the specs: http://www.trimble.com/products/specs/gi20.htm
This is a backpack/handheld combo unit, with differential beacon GPS (the method others have mentioned, using fixed land-based beacons to improve accuracy) along with satellite differential processing. At this level of accuracy, ionospheric variations can affect the signals enough to degrade performance!
I think accuracy of 10 centimeters is fairly common with professional GIS devices. A handheld unit like the Magellan 4000XL is probably no better than 1 meter at best, as Pantellerite said.
Geophysicists have been regularly using GPS for over ten years to measure the movement of the plates. They’d set up monuments–in concrete–around the world, and monitor them for weeks. My office mate, a graduate student, volunteered for a study in Europe, and the lucky dog drew a week’s assignment on Thera, a volcanic island remnant that may have been the inspiration for the sinking of Atlantis. They set up his GPS unit and he watched it for a week–from a small cafe, a couple hundred feet away.
They measured a 1000km baseline in the Pacific nine years ago, and did it again a year later, and reported an 18cm increase, plus or minus a cm I believe.
Most consumer-grade GPS units measure the time difference between signals from different satellites. Survey-grade units must sit on-station for several minutes and measure the phase difference between the carrier signals. With a carrier wavelength of about 20 centimeters, one-centimeter accuracy is available through post-processing the data with a computer. All of this was available before 01-May-00, when Selective Availability was dropped.
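For scale: the GPS L1 carrier sits at 1575.42 MHz, which works out to a wavelength of about 19 cm. If a receiver can resolve the carrier phase to a hundredth of a cycle (an assumed figure for illustration), that corresponds to roughly 2 mm of range, which is where the millimetre-level surveying claims in the original question come from. A back-of-envelope check:

```python
c = 299_792_458.0            # speed of light, m/s
f_l1 = 1_575.42e6            # GPS L1 carrier frequency, Hz

wavelength = c / f_l1        # metres per carrier cycle
phase_resolution = 0.01      # assume phase resolved to 1% of a cycle

print(f"L1 wavelength:   {wavelength * 100:.1f} cm")
print(f"range precision: {wavelength * phase_resolution * 1000:.1f} mm")
```

Getting from that raw phase precision to real millimetre positions still requires resolving the whole-cycle ambiguity and long averaging, which is why survey units sit on-station and post-process.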
This is how it’s done: one GPS is set up on the known point and you go out and take the measurements. Any error is then taken out after all the work is done, since you can use the known point as your datum. The only people who really need anything below a few feet are surveyors anyway; for a large ship, or your car, a few feet or even meters don’t matter. Besides, to even see a difference of a few feet you’d need a really large-scale map. When I took Computer Mapping a few years ago we were told that they resurveyed the Mason-Dixon line and found it to be off by only something like 4 mm over the whole line! :eek: