How To Determine a Kilogram

The main point isn't really that it's more precise; it's that the reference mass can be reproduced by anyone, anywhere, with the right equipment. You don't have to fly to Paris, unlock the vault, and touch the one and only Standard Kilogram. The meter was likewise originally defined by an artifact, the Standard Meter, with the same difficulties of ensuring that its length didn't change, that it didn't bend, that it wasn't unnecessarily disturbed by measurements, and so on. When the meter was redefined in terms of the duration of a second and the speed of light, anyone could construct their own perfect meter stick without recourse to a specific artifact that might change over time.

–Mark

It will serve as the national standard for the kilogram, and its performance will periodically be compared to other nations' standards to ensure agreement. Domestically, it will be used to calibrate working kilogram standards, which in turn calibrate primary labs' mass standards, which calibrate secondary labs' mass standards, and so on, eventually calibrating the scales used for defense, science, commerce, and anything else that requires legal calibration traceability.

Well thanks for trying to kill our nerd tourism, ass! :mad:

:slight_smile:

So if I’m some company that wants to manufacture high-precision scales, I can apply to some branch of the federal government to get some testing time on some calibration equipment somewhere, and that calibration equipment was itself measured against something that was measured against The Official One, or something along those lines?

Well, if you have a lot of money, you can use NIST's calibration services and they will compare your standard kg (or whatever mass you have) to their standards and give you a report that states the measured value of your standard and its measurement uncertainty. But typically you would send your calibration equipment or standard to a commercial lab with less precise equipment but much lower cost, depending on the accuracy you need. The more accuracy you require, the higher up the calibration lab chain you must go, and the prices get really expensive. Often the cost of the calibration is more than the cost of the item being measured, especially at the higher-level labs.
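To make "measured value and its measurement uncertainty" concrete: the report boils down to a number plus an expanded uncertainty that then travels with your standard. Here's a minimal sketch with invented figures (none of these values come from NIST):

```python
# Hypothetical calibration report for a nominal 1 kg customer standard.
nominal_g = 1000.0
measured_g = 1000.00012   # value reported by the calibration lab (invented)
expanded_u_g = 0.00005    # expanded uncertainty, k=2 (invented)

# Whenever this standard is used later, the correction and its uncertainty
# travel with it.
correction_mg = (measured_g - nominal_g) * 1000
print(f"Apply a correction of {correction_mg:+.2f} mg, "
      f"+/- {expanded_u_g * 1000:.2f} mg (k=2)")
```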

Interesting. So, suppose that you’re Lockheed, and you’re going to start measuring things that will be components for parts of a satellite, or something like that. How “high up the chain” would a company of that sort be?

And what would they get calibrated/verified? A super-precise scale that they had purchased from some company that makes super-precise scales? Or does Lockheed have their own block of metal that weighs within a millionth of a percent of one kilo? If so, where would they have gotten it? Is there a company that makes and sells such things?

In the US, and probably most of the industrialized world, the chain of standards (or the traceability path) starts at the NMI (national metrology institute), which in the US is NIST. They create and maintain the national standards, do a ton of research and development, win Nobel Prizes, and usually hire the best of the best physicists with top-notch credentials.

Under the NMI are the primary labs: the DoD has one for each big branch (Army, Navy, AF), the DoE has one at Sandia National Lab, and there are several commercial labs, the better ones operated by test equipment manufacturers (Fluke, Keysight/Agilent/HP). These labs usually use custom-made measurement systems or copies of systems in use at NIST, and they typically use standards calibrated at NIST, so due to the propagation of measurement uncertainty they can't achieve the same degree of accuracy that NIST can.
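To sketch why that propagation is unavoidable (with made-up numbers, not anything from NIST or these labs): the lab's own comparison noise adds in quadrature to the uncertainty of the standard it received, so each tier can only be worse than the one above it.

```python
import math

def combined_uncertainty(u_reference, u_process):
    """Root-sum-square combination of independent standard uncertainties."""
    return math.sqrt(u_reference**2 + u_process**2)

# Invented example: a 1 kg standard calibrated by NIST to 0.02 mg (1-sigma),
# then used by a primary lab whose own comparison process adds 0.05 mg.
u_nist = 0.02      # mg
u_process = 0.05   # mg
u_primary = combined_uncertainty(u_nist, u_process)
print(f"Primary lab standard uncertainty: {u_primary:.3f} mg")  # ~0.054 mg, always > u_nist
```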

The primary labs' customers are secondary/reference labs that take the artifacts measured at primary labs and calibrate test equipment with them. Lockheed has this type of lab. They may send a standard resistor or capacitor to a primary lab for measurement/certification and then use it as a standard to calibrate their own test equipment.

There are manufacturers of test equipment, and NIST actually sells some systems ($$$$) through their Standard Reference Instrument (SRI) website, so anyone can buy these really accurate standards. You could buy a good standard resistor for, say, $2,000, but getting NIST to measure it may cost $4,000 or so. Or you could just use the manufacturer's specs, which probably aren't quite as accurate.

And at the local level there are authorized metrologists (some of whom work out of their homes, I think) who can validate certain tool sets or devices to the level required by local contracting.

To expand a bit on what nate said…

Your measurement instrument must be traceable to NIST. This means there must be an unbroken chain of calibrations between your measurement instrument and NIST.

I’ll use my lab at work as an example.

Most measurement instruments in my lab must be calibrated every six to twelve months. I work at an Air Force base, so when an instrument is due for calibration - say, a voltmeter - I send it to the PMEL lab here on base. PMEL calibrates my voltmeter by comparing it to their in-house voltage standards. Every six to twelve months, our PMEL lab sends their voltage standards to the Air Force's Primary Standards Lab (AFMETCAL) in Heath, Ohio, for calibration. And then AFMETCAL sends their voltage standards to NIST every six to twelve months for calibration. In the metrology world, NIST is God, so that is where the chain (usually) ends. :stuck_out_tongue:

So… my voltmeter is calibrated by PMEL. PMEL’s voltage standards are calibrated by AFMETCAL. AFMETCAL’s voltage standards are calibrated by NIST. Hence the unbroken chain of traceability to NIST. If I cannot prove to an auditor that my voltmeter is traceable to NIST through an unbroken chain of calibrations, then I will get in trouble and an investigation will occur.
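A toy sketch of what that audit trail amounts to: each instrument's calibration record points to the standard used, and you should be able to walk the records all the way up to NIST without hitting a gap or a lapsed calibration. The names, dates, and intervals below are invented.

```python
from datetime import date

# Toy calibration records: instrument -> (standard used, calibration date, interval in months)
records = {
    "my_voltmeter":      ("pmel_volt_std", date(2016, 3, 1), 12),
    "pmel_volt_std":     ("afmetcal_volt_std", date(2015, 11, 15), 12),
    "afmetcal_volt_std": ("NIST", date(2015, 9, 30), 12),
}

def is_traceable(instrument, today):
    """Walk records upward; True only if the chain reaches NIST unbroken and in date."""
    while instrument != "NIST":
        if instrument not in records:
            return False                      # broken chain
        standard, cal_date, months = records[instrument]
        if (today - cal_date).days > months * 30:   # rough month length
            return False                      # calibration has lapsed
        instrument = standard
    return True

print(is_traceable("my_voltmeter", date(2016, 6, 1)))
```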

Every industry does something similar.

The uncertainty of a measurement instrument gets worse the "farther" it is from NIST. This is because, at each calibration, the uncertainty of the thing being calibrated is usually somewhere between 4X and 10X greater than that of the standard.
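Crunching that rule of thumb for the chain above, with an invented starting uncertainty and a flat 4:1 ratio at every step (real labs would not apply it so uniformly):

```python
# Compound a 4:1 test uncertainty ratio down the calibration chain.
tiers = ["NIST", "AFMETCAL", "PMEL", "my voltmeter"]
uncertainty_ppm = 0.1   # invented NIST-level uncertainty, parts per million
ratio = 4

for tier in tiers:
    print(f"{tier:12s} ~{uncertainty_ppm:g} ppm")
    uncertainty_ppm *= ratio
```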

How do the professional societies (the Institute of Electrical and Electronics Engineers (IEEE) or the American Society of Mechanical Engineers, for example) interact with these specifying chains as to their own standards?

No one needs to interact.

Is your instrument calibrated and traceable to NIST? Is the uncertainty of the instrument known and properly determined? If you answer “yes” to both, then you’re good to go, regardless of what anyone else is doing.

So is it always the instrument that is sent? I would have imagined that a high-precision scale is a big, bulky thing, and that putting it on a truck and shipping it across the country (or whatever) would be very unhealthy. Or in the case of scales do you have a reference kilogram that gets wrapped up in the high-tech equivalent of bubble wrap and shipped off to get weighed?

Balances are not moved. They are calibrated with standard weights. The mass of each weight has an uncertainty and is traceable to NIST. (And the standard weights at NIST are traceable to the IPK in Sèvres, France.)

Crafter_Man, thanks. It occurred to me after I posted that it was a stupid question: IEEE says "not to exceed 3 volts, and we mean 3 volts. Up to you to prove it."

As to Max's question, like the whole shebang, it comes down to a "who guards the guardians" question (quis custodiet ipsos custodes, for classiness). One might ask, and I do ask, in fact: what about the readings from the metrological tools themselves, which are subject to the same questions?

I would assume that, in most cases, the accuracy of your high-precision bulky scale is assessed by some tools ("tools" being weights, or masses if you need to be extra technical, in an easy enough example); it's the tools, and their numeric results, which are assessed up the chain of authority.

Ultimately, as we have seen, there is no single custodial tool even at NIST for some things, including metrological instruments. (If the watt balance is the standard, then the kilogram is; interesting philosophical issues here.)

But take calipers, for example: as long as the readout is available, it's the readout plus the test material (the accuracy of the test material gets you into another chain) that stands in for the original instrument in this situation. So you end up taking readings on the accuracy of your big bulky scale, or whatever else you can't or don't want to send packing for certification.

So as long as you can prove the accuracy of your caliper you’re gold. (What’s that standard for distance again? Oh crap…) Enter the super-caliper. It’s the commutative law of proof.

A cousin of mine was chief metrologist for the NYC subways, and blew my mind showing me his bizarro caliper-like thingies, which he uses to measure calipers to check his torque meter, with which he checks those used by the safety inspectors (in other departments) who check the tightness of the nuts on the bolts of the undercarriage of every train car in NY.

And those bizarro caliper-like thingies get tossed periodically and the city buys him new ones.

Not sure what happened in post #53. Meant to quote MaxTheVool. :o

Anyway, standard calipers are calibrated using gauge blocks. IIRC, gauge blocks are calibrated by comparing them to (more accurate) standard gauge blocks using an optical comparator. It's ultimately traceable to NIST and, in the end, to a number of wavelengths of a certain light beam. (Am too lazy to look up the specifics.)
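For the "wavelengths of a certain light beam" part, here's a very rough sketch of the fringe arithmetic used at the top of that chain, where gauge blocks are measured interferometrically: the block's length is expressed as a whole number of half-wavelengths plus the fringe fraction actually observed. Real systems use the "method of exact fractions" across several wavelengths; every number here is invented.

```python
# Toy interferometric length: whole half-wavelengths plus an observed fringe fraction.
wavelength_nm = 632.99           # roughly a helium-neon laser line
half_wave_nm = wavelength_nm / 2

nominal_length_nm = 1_000_000    # ~1 mm gauge block, known approximately beforehand
observed_fraction = 0.31         # fringe fraction read off the interferogram (invented)

# Pick the integer fringe order closest to the nominal length, then let the
# observed fraction supply the fine part of the measurement.
order = round(nominal_length_nm / half_wave_nm - observed_fraction)
length_nm = (order + observed_fraction) * half_wave_nm
print(f"Calibrated length: {length_nm:.1f} nm")
```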

To be clear about my point (if not universally applicable) about the commutative law of metrology: the authority of the calipers (down to the optical level at NIST mentioned by Crafter_Man) may be used to authorize a tool for a different measurement standard, along with the tool's data.

The notional "bizarro super calipers" can be checked or thrown out more often to vet another instrument for a different primary measurement, whose authority will be established just as often (to some level).

You’re talking about the new proposed kilogram standard?

The whole point is to get away from having a physical chunk of material that tells you what a kilogram is. That’s what we have right now - a physical chunk that gets pulled out occasionally (and very carefully) to check other, downstream standards.

If (when) they change the standard, then any competent lab can make its own. “Competent” here would mean “really sophisticated.”

Yea, metrologists do not like it when the “top” primary standard is based on a single, unique artifact. It is better when the “top” primary standard can be reproduced by anyone.

This was once the case for the meter. From 1960 to 1983 the meter was defined as "1,650,763.73 wavelengths of the orange-red emission line in the electromagnetic spectrum of the krypton-86 atom in a vacuum" (link); since 1983 it has been defined by the distance light travels in a vacuum in 1/299,792,458 of a second. They want to do something similar with mass.
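Just to make the "anyone can reproduce it" point concrete, both that 1960 definition and the current speed-of-light one boil down to published constants and trivial arithmetic (the Kr-86 figure comes from the quoted definition; the speed of light is exact by definition):

```python
# 1960-1983: 1 m = 1,650,763.73 wavelengths of the Kr-86 orange-red line,
# so the definition fixes that wavelength exactly:
kr86_wavelength_m = 1 / 1_650_763.73
print(f"Kr-86 line: {kr86_wavelength_m * 1e9:.4f} nm")   # ~605.7802 nm

# Since 1983: 1 m = distance light travels in vacuum in 1/299,792,458 s.
c = 299_792_458                  # m/s, exact by definition
meter_m = c * (1 / 299_792_458)
print(meter_m)                   # 1.0, by construction
```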

Here is an article published a week ago that describes a lot of the stuff discussed in this thread.

A funny thought popped into my head while reading this.

A Von Neumann probe travels for ages between stars. Upon arrival, whatever other missions it has, it is supposed to build a set of copies of itself, which would basically entail building an aerospace industry from scratch.

After reading these posts, I now realize that it would have to build a metrology institute as well. I'd imagined ore extractors and smelters and whatnot, but not that. Building a new metrology institute is fine (hard, but doable) if you have fundamentally defined units, but it might face real problems if it felt the need to zip back to Paris to check against the standard kilogram.

I guess in a pinch, it could define its own local kilogram standard (based on a volume of liquid, or something), which may be good enough.