Metrology question - calibrating the world's most accurate clock

I make a lot of measurements.
Voltage.
Current.
Frequency.
Temperature.
Pressure.
etc.

The instruments that make these measurements are calibrated against a more accurate meter, or against some type of standard. For quantities whose units are fixed by definition (like current, where the ampere is defined in terms of a force), I can see how one can make a measurement with arbitrary accuracy. However, what about time?
This clock: Most Accurate Clock Ever: 'Crystal Of Light' Clock Surpasses Accuracy Of NIST-F1 Fountain Clock | ScienceDaily
is supposed to be the most accurate time base ever developed. How does one go about determining its accuracy and precision, if it is supposed to be 10x more accurate than any existing clock? I can see comparing it to 10 less-accurate clocks, but that assumes that the inaccuracies in those clocks are random and will average out to zero.
So, how do you calibrate the ultimate time standard?

There is a scientific definition of a second. Presumably your clock’s accuracy is based upon how closely it will agree with the standard.

That’s not a trivial question. I’ve seen it asked in the form of a cartoon (which shows an endless loop as the methods of checking keep coming back upon themselves). Tony Rothman has some of the characters asking it in his science fiction novel The World is Round.
Basically, our reverence for a new clock seems to derive from theoretical considerations (for various reasons, this *ought* to be more regular and stable than previous time-keeping methods) combined with the assumption that each “tick” is identical, so that by looking at large numbers of them you can see if there is deviation from what you expect. Inevitably, there’s some “circularity”, but it’s not undirected circularity – we use our best previously-known timekeepers to check our newer ones, and see if the discrepancies meet our expectations. And measure a long line of ticks.
http://whyfiles.org/078time/

If you make 2 clocks of the same design then you can see how much they drift apart over an extended period. That should tell you how accurate that type of clock is.
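A toy simulation of that idea (all the noise numbers here are made up, just to show the scaling): two clocks of identical design, each tick carrying a small random error, slowly walk apart, and the size of that divergence tells you something about the clocks’ random instability.

```python
import random

def simulate_clock(n_ticks, tick_noise):
    """Accumulate n_ticks nominal 1-second ticks, each with a small random error."""
    t = 0.0
    for _ in range(n_ticks):
        t += 1.0 + random.gauss(0.0, tick_noise)
    return t

random.seed(1)
a = simulate_clock(100_000, 1e-6)  # hypothetical clock, 1 microsecond jitter per tick
b = simulate_clock(100_000, 1e-6)  # second clock of the same design

# The gap |a - b| grows roughly like tick_noise * sqrt(n_ticks), so comparing
# two like clocks bounds their random instability -- though, as noted below,
# it can't catch a systematic error shared by both designs.
print(abs(a - b))
```

The catch (raised a few posts down) is that this only measures *random* divergence; a bias common to both clocks is invisible to this test.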

How about ~300 less accurate clocks instead of 10?
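For what it’s worth, the OP’s worry about averaging is exactly the right one. A quick sketch (purely illustrative numbers) shows the averaging only buys you the famous 1/sqrt(N) improvement if the individual clock errors really are independent and unbiased:

```python
import random
import statistics

random.seed(42)

def ensemble_error(n_clocks, offset_sigma, trials=2000):
    """Std dev of the mean of n_clocks independent, unbiased random clock offsets."""
    means = [
        statistics.fmean(random.gauss(0.0, offset_sigma) for _ in range(n_clocks))
        for _ in range(trials)
    ]
    return statistics.stdev(means)

one = ensemble_error(1, 1.0)     # a single clock's error
many = ensemble_error(300, 1.0)  # an ensemble of ~300 clocks

# Averaging ~300 independent clocks shrinks the random error by about
# sqrt(300) ~ 17x -- but a shared systematic bias would not shrink at all.
print(one / many)
```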

I recently read a much more detailed article about the calculation of UTC, but I don’t recall where I read it. One of the more interesting observations was from one of the fellows in charge of one of the clocks: He pointed out that he knows the time as accurately as anyone in the world – but not until weeks after the time has passed, once the International Bureau of Weights and Measures has processed all of the clock data and distributed the results.

At least the second is defined in absolute terms: “the time that elapses during 9,192,631,770 (9.192631770 × 10^9) cycles of the radiation …” It’s the kilogram that gives problems. As I remember it, the kilogram is still defined as the mass of a lump of platinum-iridium stored in a vault in Paris!

I don’t see how one standard is any less arbitrary than the other.

A reproducible physical process of repetition such as the atomic transition defining the second is pretty absolute. The lump of platinum-iridium (well, the cylinder) is problematic, at least in part because it appears to be changing its weight relative to the 10 or so secondary standards that were made to imitate it, compared to it, sent to various governments around the world, and periodically returned for recomparison.

The standards per se are not very arbitrary, they are chosen for reasons. Today the reasons supporting the definition of the second look more impressive than the ones for the kilogram, but then creating precise spheres of pure silicon whose atomic crystalline spacing is known has recently become a big priority. You see, that would let them define the kilogram in terms of a number of silicon atoms. The values of the units are arbitrary.

Note that MarcusF doesn’t say anything about how arbitrary the different standards are.

Not if they drift from true accuracy at exactly the same rate. Think about it: I have two clocks that are stopped. They will never drift apart; ten million years from now, those two clocks will still be EXACTLY in agreement. But I sure wouldn’t use either one to time anything.

We need someone to have a discussion on precision vs. accuracy here but I’m too tired to do it.

One standard isn’t any less arbitrary than the other, but one is perfectly accurate as long as the laws of physics don’t change; the other depends on a lump of matter that will lose and gain weight because of chemical reactions, handling, and cleaning.

Analogy:

Precision - a precise rifle will shoot a very tiny group of holes into a target (i.e., it hits more or less the exact same spot, every time).

Accuracy - an accurate rifle will punch the “average” of a group of shots into the center of the bullseye but if it’s not precise, the group will be large.
Edited to fix typo.
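The rifle analogy translates directly into two numbers: bias (how far the average lands from the truth) and spread (how tightly the shots cluster). A small sketch with made-up readings of a hypothetical reference whose true value is 10.000:

```python
import statistics

# Hypothetical meter readings of a reference that is truly 10.000 (assumed).
true_value = 10.000
precise_but_biased = [10.21, 10.22, 10.20, 10.21, 10.22]  # tight group, off-center
accurate_but_noisy = [9.80, 10.25, 9.95, 10.20, 9.85]     # centered, wide spread

def bias(readings):
    """Accuracy: how far the average of the readings sits from the true value."""
    return statistics.fmean(readings) - true_value

def spread(readings):
    """Precision: how tightly the readings cluster (sample standard deviation)."""
    return statistics.stdev(readings)

print(bias(precise_but_biased), spread(precise_but_biased))  # big bias, tiny spread
print(bias(accurate_but_noisy), spread(accurate_but_noisy))  # tiny bias, big spread
```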

How does that help? Can we count out exactly 9,192,631,770 cycles of radiation?

-FrL-

ticker, you win the thread for the perfect balance of subject matter and user name.

You can take advantage of the strengths and weaknesses of existing clock designs. For example, a hydrogen maser has superior short-term stability and mediocre long-term stability. A cesium beam has mediocre short-term stability and superior long-term stability. Most of the factors that affect the accuracy and stability of the clock are known and an analysis of the clock’s design can help quantify them.
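A crude sketch of that complementary pairing (all constants invented for illustration; real clock steering is far more sophisticated): let a quiet-but-drifting “maser” free-run, and separately steer a copy of it gently toward a noisy-but-unbiased “cesium” reference.

```python
import random

random.seed(7)

maser_drift = 1e-7  # assumed per-step frequency drift of the toy maser
steer_gain = 0.01   # how strongly we nudge the steered clock toward cesium

maser_err, steered_err = 0.0, 0.0
for step in range(10_000):
    # Free-running maser: very quiet short-term, but it drifts away.
    maser_err += maser_drift + random.gauss(0.0, 1e-8)

    # Steered clock: same maser physics, plus a gentle correction toward a
    # cesium comparison that is noisy step-to-step but unbiased long-term.
    cesium_read = random.gauss(0.0, 1e-5)
    steered_err += maser_drift + random.gauss(0.0, 1e-8)
    steered_err += steer_gain * (cesium_read - steered_err)

print(abs(maser_err), abs(steered_err))  # steered clock stays far closer to true time
```

The design choice is the gain: too high and the output inherits cesium’s short-term noise; too low and the maser’s drift leaks through.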

Is this ticking thread done? Don’t think so. We need to tock about it some more.

My WAG would be “yes”.

My WAG would be “no.”

-FrL-

We can; that is basically what an atomic clock is.

The problem with the kilogram is that it is evaporating. Since its mass is by definition 1 kg, the kilogram itself is changing. This is not a good thing for the people who care about such things.
No one knows why.