The duration of the earth's rotation (the true rotation against the fixed stars, not rotation measured with the sun as the reference point) is described as a function of time at http://www.ucolick.org/~sla/leapsecs/dutc.html, because the rotation is slowing, and the slowing is variable and somewhat unpredictable. Look at the graphs to get an idea of how long the day is on whatever date you're interested in, or use their regression formulae if, for example, you want a computer program to manipulate it.
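If you just want a ballpark number to play with, here's a quick Python sketch - not the regression from that page, just a toy model assuming the often-quoted long-term average slowdown of about 1.7 ms per century, with the mean solar day pegged at 86400 SI seconds around 1820 (the actual day length wanders by milliseconds around any such trend):

[code]
# Toy estimate of the mean solar day length for a given year.
# NOT the regression from the linked page -- assumes a long-term
# average slowdown of ~1.7 ms/century and that the mean solar day
# equaled 86400 SI seconds around 1820.

SLOWDOWN_MS_PER_CENTURY = 1.7   # assumed long-term average rate
EPOCH = 1820.0                  # assumed year when day ~= 86400 s

def mean_day_length(year):
    """Approximate mean solar day length in SI seconds for `year`."""
    centuries = (year - EPOCH) / 100.0
    excess_ms = SLOWDOWN_MS_PER_CENTURY * centuries
    return 86400.0 + excess_ms / 1000.0

if __name__ == "__main__":
    for y in (1900, 2000, 2024):
        print(y, f"{mean_day_length(y):.6f} s")
[/code]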
Didn’t turn up anything for the length of the year.
Thanks for the site. It has a lot of great info about earth's slowing rotation, but I don't see the precise figures I'm after for the duration of earth's rotation and its revolution around the sun.
Not that the information is inaccurate, as in incorrect; it just isn't detailed to the precision I need.
Can anybody else give it a try? What I want is a very precise number of seconds for the earth's rotation and for its revolution around the sun.
You do understand that there IS no precise answer at the level you're asking for? Not just that humans don't know the answer, but that it truly does NOT exist.
If you understood the reference Napier gave, the Earth's rotation period changes by about 2 microseconds per year, plus or minus some unpredictable jitter. So the answer to that question changes continuously.
The Earth’s revolution period also changes continuously as the other planets move in their orbits. The effect is small, but it’s a lot more than a millionth of a second over a year.
Finally, defining the length of a year to the nearest microsecond implies a precision of 1 part in 10^14. Measuring physical processes to that degree of accuracy simply doesn’t work; there are always random perturbations larger than that affecting either whatever you’re measuring or your measurement tools.
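To spell out that arithmetic (using a Julian year of 365.25 days just to keep the numbers round):

[code]
# The arithmetic behind "1 part in 10^14":
year_s = 365.25 * 86400        # ~3.156e7 seconds in a year
one_us = 1e-6                  # one microsecond, in seconds
print(f"{year_s:.4e} s per year")   # ~3.1558e+07
print(f"{one_us / year_s:.1e}")     # ~3.2e-14, i.e. a few parts in 1e14
[/code]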
Even if you could, in principle, determine a value that precisely for some particular measurement at some particular moment in time, the situation in the past or future would be different enough that a repetition of the measurement would yield a different answer.
And for a measurement taken over time, such as the one you want, the experimental setup shifting during the experiment ensures non-repeatability.
[nitpick]The measurement errors are presumably random and symmetrically distributed, so if you take measurements for a long enough time and average them, the measurement errors should be filtered out. Of course, you might have to wait 1000 years to get an accurate number. By that time, as you point out, the perturbations in the earth's period of revolution around the sun will have made your number useless. But at least you had a steady job. [/nitpick]
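To put toy numbers on that nitpick: here's a quick simulation (all figures made up - 1 ms of purely random noise per measurement) showing the averaged error shrinking like sigma/sqrt(N):

[code]
# Toy demonstration that averaging N unbiased measurements shrinks the
# random error roughly as sigma / sqrt(N).
import random
import statistics

TRUE_VALUE = 86400.0   # hypothetical quantity being measured (seconds)
SIGMA = 0.001          # assumed per-measurement noise: 1 ms

def averaged_error(n, trials=200):
    """Spread of the N-sample mean around the true value."""
    errs = []
    for _ in range(trials):
        mean = statistics.fmean(
            random.gauss(TRUE_VALUE, SIGMA) for _ in range(n))
        errs.append(mean - TRUE_VALUE)
    return statistics.pstdev(errs)

for n in (1, 100, 10_000):
    print(n, f"{averaged_error(n):.2e}")   # scales like SIGMA / n**0.5
[/code]

With 1 ms of noise per reading, you'd need on the order of a million readings to average down to the microsecond level - hence the "wait 1000 years" part of the joke.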
Which year do you want? The tropical year, the sidereal year, the anomalistic year, the legal year, the lunar year, or the lunisolar year? That right there will make a lot more difference than a millionth of a second.
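For a sense of scale, here are approximate mean lengths of a few of those years (standard textbook values, good to about a second, which is plenty for the point):

[code]
# Rough mean lengths of a few kinds of "year", in days.
DAYS = {
    "tropical":    365.24219,   # equinox to equinox
    "sidereal":    365.25636,   # against the fixed stars
    "anomalistic": 365.25964,   # perihelion to perihelion
}
tropical_s = DAYS["tropical"] * 86400
for name, d in DAYS.items():
    secs = d * 86400
    print(f"{name:12s} {secs:,.0f} s ({secs - tropical_s:+,.0f} s vs. tropical)")
[/code]

The tropical and sidereal years alone differ by about 20 minutes, some thirteen orders of magnitude more than the microsecond the OP is asking about.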
>Finally, defining the length of a year to the nearest microsecond implies a precision of 1 part in 10^14. Measuring physical processes to that degree of accuracy simply doesn’t work
Actually, NIST is measuring time to a precision of 5E-16. I am not sure what "accuracy" means in judging the most precise measurement device of any kind, but I suppose that other clocks' accuracies might be measured by comparing them to this one.
They use the average of a bunch of cesium clocks to determine the time. I suppose what is meant by 1 in 10[sup]16[/sup] accuracy is that the maximum variation of any of the clocks from the average is not more than that figure.
Isn’t anyone even remotely interested in what kind of doomsday machine the OP is going to plug the numbers into, once provided? It’s always nice to know you can come here when all resources have been exhausted to get the final piece of info you are looking for!
Those things never work. They always stop at around the three-second mark, despite the fact that our intrepid hero has been foiled at every previous opportunity to stop it. I suspect the manufacturer rigs the Main Fusion Ignition Safe & Arm device to fail deliberately, fearing that a successful operation will result in a lack of repeat business.
>They use the average of a bunch of cesium clocks to determine the time. I suppose what is meant by 1 in 10[sup]16[/sup] accuracy is that the maximum variation of any of the clocks from the average is not more than that figure
In my understanding, a number of clocks are averaged to determine what is broadcast as UTC, and several clocks are also averaged as part of the GPS system to broadcast and use GPS Time; there is also some plan - it might even be complete by now - to integrate the two.
But there's also the cesium fountain clock, which is far more stable than the clocks I just described, though it sounds like it might be intended as research equipment or as a prototype next-generation standard clock. Since it's much better than the others, and since averaging increases accuracy only very slowly as n increases (as n^-1/2), I wonder what they compare the cesium fountain clock to in order to get its accuracy. But there must be something - NIST isn't going to get the 5E-16 claim wrong!