I have this thing on my computers, both at home and at work: click on it and it pings an atomic clock somewhere and synchronizes the computer’s clock with it. I set my computer at home and my wristwatch EXACTLY, then I go to work and set that computer to the same atomic clock, but now my watch, which is a very good watch and doesn’t lose much time at all, is 14 seconds fast. Hmm. I drive home and check my computer, and son of a gun if I’m not another 14 seconds fast. But all day at work my watch and my work computer keep perfect time, and if I stay home all day my watch and my home computer keep exact time together too. Strange. All three clocks should be synchronized to the same atomic clock somewhere, but between home and work every day I lose 14 seconds (or pick up 14 seconds, I don’t know). All I can figure is that I must be driving VERY FAST, somewhere in the vicinity of the speed of light, I think.
Obviously you didn’t realize that you have a GPS watch that resets itself to GPS time every time it gets a clear view of the satellites. It must also have a software bug: it doesn’t apply the GPS-to-UTC offset (currently 13 seconds; the other second must be your fault).
I’m sure you’re dying to know just how fast you must be driving! To first order, the answer is:
V = 3497874050275800 / D
where D is the distance you drive to work in miles, and V is your speed in mph. (That big constant is just 7c²/900 with c expressed in mph: set the first-order time dilation Δt ≈ t·V²/2c² equal to 14 seconds, with a trip time of t = 3600·D/V seconds, and solve for V.) So if your distance to work is 59142827 miles, you’ll be going 59142827 mph (8.8% of the speed of light). To the outside world, the trip will take one hour. But to you, the trip will only take 59 min, 46 sec.
Now, this is only good to first order. If you’re going much faster than 8.8% of the speed of light, the above formula will not be correct.
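As a sanity check, here’s a short Python sketch (the function names are my own) that recomputes the first-order speed from the 14-second discrepancy and then evaluates the exact special-relativistic time loss for that commute, which comes out slightly over 14 seconds, illustrating the first-order caveat:

```python
import math

C_MPH = 670_616_629  # speed of light, 299,792,458 m/s converted to mph
DT = 14              # clock discrepancy in seconds per one-way trip

def first_order_speed(distance_miles):
    """First-order estimate V = 7*c**2 / (900*D), from setting
    dt ~ t * v**2 / (2*c**2) equal to 14 s with t = 3600*D/V seconds."""
    return 2 * DT * C_MPH**2 / (3600 * distance_miles)  # = 7*c**2/(900*D)

def dilation_seconds(distance_miles, speed_mph):
    """Exact special-relativistic time lost over the trip, in seconds."""
    t = 3600 * distance_miles / speed_mph            # coordinate trip time, s
    gamma_inv = math.sqrt(1 - (speed_mph / C_MPH)**2)
    return t * (1 - gamma_inv)

D = 59_142_827                    # miles to work, from the example above
V = first_order_speed(D)          # ~59,142,827 mph, about 8.8% of c
print(f"V = {V:,.0f} mph ({V / C_MPH:.1%} of c)")
print(f"exact time lost: {dilation_seconds(D, V):.2f} s")  # a bit over 14 s
```

Running it confirms the numbers in the example; the exact dilation exceeds 14 seconds by a few hundredths, which is the higher-order error the first-order formula ignores.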