Microseconds by which December’s Indian Ocean earthquake permanently shortened the length of a day: 3
So that got me thinking: given that the engineers who built the GPS satellites had to deal with relativistic time-dilation effects on the onboard atomic clocks, did the now-shortened day screw up the GPS system? A few microseconds is a long time for an atomic clock.
If I did my calculations correctly, 3 microseconds would be 0.000003 seconds. This would mean loss of a second roughly every 913 years and 3 months.
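That rate can be double-checked with a quick bit of arithmetic (a sketch, using a 365-day year as the post apparently does):

```python
loss_per_day = 3e-6                          # seconds of accumulated error per day
days_to_lose_one_second = 1 / loss_per_day   # ≈ 333,333 days
years = days_to_lose_one_second / 365        # ≈ 913.2 years, i.e. roughly 913 years 3 months
print(round(years, 1))
```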
They already insert leap seconds periodically to keep the atomic clocks in line, always in 1-second increments. The first leap second was in 1972, and by 1999 there had been 22 in all. Every one was positive.
The 3 microseconds lost would mean that every 913 years they’d have to do a negative leap second to keep the time right. I don’t think it’s going to make that much of a difference.
Ah, but you need tremendous precision when you’re trying to triangulate something to within a few feet on a place the size of the Earth. A second every 913 years might not seem like much, but when high-precision time is fed into a formula to locate a point in space by Doppler shifts, you need it to be extra precise.
p.s. I didn’t know they updated the time regularly. I wonder if they will (or did) update the firmware to take into account the new day length?
The rotational velocity of the earth at the equator is roughly 464 micrometers per microsecond.
An error of 3 microseconds per day thus corresponds to a placement error of 464 × 3 ≈ 1,400 micrometers, or about 1.4 mm. With the error accumulating each day, it’d take roughly 714 days for the satellites to be off by 1 meter.
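A sketch of that arithmetic, taking the equatorial circumference as 40,075 km:

```python
circumference_m = 40_075_000                           # Earth's equatorial circumference, metres
seconds_per_day = 86_400
speed_um_per_us = circumference_m / seconds_per_day    # ≈ 464 µm/µs (same number as m/s)
error_mm_per_day = speed_um_per_us * 3 / 1000          # 3 µs/day → ≈ 1.4 mm/day
days_to_one_meter = 1000 / error_mm_per_day            # ≈ 718 days (≈ 714 using the rounded 1.4)
print(round(speed_um_per_us), round(days_to_one_meter))
```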
Unless I did that wrong, it seems GPS will have to take the quake’s effect into account.
The Earth’s rotation is ordinarily a bit fickle anyway, mostly because of the variability of the tides. The tsunami has added to that fickleness, but other factors will likely be more of an issue in the next GPS correction.
The tsunami effect is more of a new constant than an additional fickleness, but yeah, they’ll toss it in with the rest and adjust the timing of the leap seconds to keep everything fixed within a few meters.
I would think the problems would be in the GPS receivers. Don’t the satellites essentially just send out a constant stream of code saying how much time has passed since an agreed-upon start time (so that the receiver can measure the lag in the various signals and convert that into a distance, allowing triangulation)? That wouldn’t be affected by changes in the speed of rotation, to my thinking.
All the receiver does is say: if I’m X miles from Satellite 1, Y miles from Satellite 2, and Z miles from Satellite 3, then I must be in this spot on planet Earth. How fast the Earth is spinning wouldn’t affect that, would it?
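That sphere-intersection picture can be sketched in two dimensions (made-up coordinates, and a receiver that somehow knows the ranges exactly, which isn’t quite true of a real receiver):

```python
import math

# Made-up 2-D "satellite" positions and a true receiver position (arbitrary units).
s1, s2, s3 = (0.0, 0.0), (10.0, 0.0), (0.0, 10.0)
true_pos = (3.0, 4.0)
d1, d2, d3 = (math.dist(s, true_pos) for s in (s1, s2, s3))

# Subtracting the circle equations (x-xi)^2 + (y-yi)^2 = di^2 pairwise
# linearizes the problem; with s1 at the origin it solves directly:
x = (d1**2 - d2**2 + s2[0]**2) / (2 * s2[0])
y = (d1**2 - d3**2 + s3[1]**2) / (2 * s3[1])
print(x, y)  # ≈ (3.0, 4.0)
```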
What would be affected is how the GPS receiver then translates that into a specific real location on its maps, and I wouldn’t think this would be a cumulative effect over time. After the earthquake (after all earthquakes and other large events) the earth essentially shifted underneath the satellites. Point A is now 1.4 mm to the left, but it isn’t going to keep shifting.
That’s how I think it through. Am I misunderstanding something (probably)?
GPS doesn’t really depend upon keeping accurate solar time at all. What is required, in addition to accurate knowledge of each satellite’s position, is near absolute agreement between satellites as to the current time based on some arbitrary reference clock. The fact that they happen to use UTC as a reference is not really relevant as long as they all agree.
GPS receivers are synchronized to the signals they receive from the satellites, so any deviation in the GPS time standard and solar or UTC time would only affect their usefulness as clocks, not navigation devices.
But if the earthbound reference clock (I assume it is earthbound) is needed to properly geolocate on Earth, then wouldn’t the slowdown in rotation need to be taken into account in the regular updates? Say the Earth suddenly stopped spinning (and that’s a whole other topic…) and no update came from the earth-based clock (which, again, I’m assuming exists, possibly incorrectly). Nobody with a receiver would get the correct location until that update occurred, right?
I think it’s a matter of semantics, to a point. When you get three (or more) different time measurements and plug them into formulas tied to a spherical grid in order to determine a location, I call that triangulation. I’m guessing there’s a sin(), cos(), or tan() somewhere in those formulas (and probably an arctan() or two thrown in for good measure).
There are a number of different time scales, each optimized for a set of applications.
Atomic time, or TAI, is a pure atomic time scale based on a worldwide set of atomic clocks. It does not use leap seconds.
Coordinated Universal Time, or UTC, is what we think of as civil time. Leap seconds are used to keep it in sync with the Earth’s rotation.
GPS time is unique to the Global Positioning System. It is similar to UTC except that leap seconds are not applied. Currently, GPS time is 13 seconds ahead of UTC. GPS time started (the epoch) on January 6, 1980. See http://tycho.usno.navy.mil/gpstt.html.
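Since the offset between GPS time and UTC is just the accumulated leap seconds, converting is trivial; a sketch (the 13-second figure is the one quoted above, and it grows with every new leap second, so this is only valid for dates after the most recent one):

```python
from datetime import datetime, timedelta

GPS_EPOCH = datetime(1980, 1, 6)   # GPS time began 1980-01-06 00:00:00 UTC
LEAP_OFFSET = 13                   # seconds GPS time is ahead of UTC, per the post above

def gps_seconds_to_utc(gps_seconds: float) -> datetime:
    # GPS time ignores leap seconds, so subtract the accumulated offset.
    return GPS_EPOCH + timedelta(seconds=gps_seconds - LEAP_OFFSET)

print(gps_seconds_to_utc(LEAP_OFFSET))
```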
I’ll agree that by your definition GPS receivers use triangulation. But the term is more usually held to mean the calculation of position by measurement of the angles of a triangle, and this is not how a GPS receiver finds its position.
The (much simplified) correct explanation is that, based on the difference in time between receipt of signals, the receiver calculates the difference in its distance to each of a pair of satellites. This defines a hyperbola (in three dimensions, a “hyperboloid of revolution”) on which it must lie. Doing the same with other satellite pairs defines other hyperbolae, and the correct position is the place where all of these intersect.
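A numeric illustration of that idea (made-up satellite and receiver coordinates; a real solver would then find where the hyperbolae intersect):

```python
import math

C = 299_792_458.0  # speed of light, m/s

# Made-up 2-D satellite and receiver positions, in metres.
sats = [(0.0, 20_000_000.0), (15_000_000.0, 20_000_000.0), (-15_000_000.0, 18_000_000.0)]
rx = (1_000_000.0, 0.0)

# The receiver only sees arrival-time *differences*, not absolute travel times.
arrival = [math.dist(s, rx) / C for s in sats]
time_diffs = [t - arrival[0] for t in arrival[1:]]

# Each time difference pins down a range difference, i.e. one hyperbola branch.
range_diffs = [dt * C for dt in time_diffs]
for s, rd in zip(sats[1:], range_diffs):
    true_diff = math.dist(s, rx) - math.dist(sats[0], rx)
    assert abs(true_diff - rd) < 1e-6   # the receiver lies on that hyperbola
```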
I’m not really willing to concede that it counts as triangulation just because some of the calculations involve trigonometric functions.