I believe that the Bureau of Weights and Measures has an iridium rod that is a standardized length. Maybe also a standardized mass. The Naval Observatory has a time of day that everyone can calibrate to. There's a formula or standard for time, I think, based on the vibrations of a cesium atom or some such. All these - even if I have them somewhat bollixed up - are standards against which other instruments might be compared. Is there such a standard of any sort for temperature?
The standard is absolute zero (aka zero kelvins) but there’s nothing handy nearby that’s actually that cold to measure against.
So what is typically done to calibrate sensitive thermometers is to dunk them into very, very pure ice water at standard pressure. That should be about as close to zero degrees Celsius as you can get.
More precisely, the standard unit for temperature is the kelvin, which is defined relative to absolute zero and the triple point of water (set at exactly 273.16 K).
There are standards for temperature, but they are reproducible conditions instead of something like a bucket of ice water that everyone else could reference.
The Fahrenheit scale uses a mix of ice, water, and ammonium chloride salt to define its zero point. Daniel Fahrenheit wanted 0 to 100 to roughly correspond to the coldest and warmest temperatures in Germany (where he lived), but “coldest temperature in Germany” isn’t any sort of standard that can be reproduced, so he experimented until he came up with the above combination to create the standard.
The Celsius scale uses the freezing and boiling points of water at 1 atm pressure to create the 0 and 100 points on the scale.
You can create your own copy of the “standard” by simply reproducing the exact conditions defined.
In fact, if you want to calibrate your temperature measuring device, this is how you have to do it. You either calibrate it against the exact conditions the standard defines, or you calibrate it against a device that has already been calibrated to those conditions.
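For example, here's a minimal sketch (Python, with hypothetical reading values, and assuming the thermometer's error is roughly linear between the two fixed points) of how you might apply a two-point correction from an ice bath and boiling water:

```python
# A minimal sketch of two-point calibration against the ice point (0 C)
# and the boiling point of water (100 C at 1 atm). The readings below
# are hypothetical examples, not from any real thermometer.

def make_correction(reading_at_ice, reading_at_boil,
                    true_ice=0.0, true_boil=100.0):
    """Return a function that maps raw readings to corrected temperatures,
    assuming the error is linear between the two fixed points."""
    slope = (true_boil - true_ice) / (reading_at_boil - reading_at_ice)
    return lambda raw: true_ice + slope * (raw - reading_at_ice)

# Suppose the thermometer reads 0.8 C in the ice bath and 99.1 C in boiling water:
correct = make_correction(0.8, 99.1)
print(round(correct(37.5), 2))   # corrected value for a raw reading of 37.5 C
```

The same idea works with any pair of reproducible fixed points; a standards lab would use many more points and a better fit, but for a household thermometer two is plenty.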
This still seems abstruse in that I can't figure out how anyone would know the exact moment that some sample of water had frozen or boiled or condensed. I think my question still would be: is there a "thermometer" of sorts that is considered to present the perfect reflection of the ambient temperature, as a standard, so that other measuring devices could be calibrated against it - in the same way that a ruler could conceivably be calibrated against the iridium rod.
Water doesn’t hit 100C and then suddenly explode in a cloud of vapor, nor does it reach 0C and instantly become a solid lump of ice.
Liquid water boils at 100C, but it doesn’t happen instantly. What happens is that there’s a molecule of water that has slightly higher kinetic energy than the others, and it goes zipping off as steam. But now the boiling water has slightly lower energy than before it lost that one higher energy molecule. So it takes constant input of heat to keep the water at the boiling point. The same thing with freezing, but in reverse. This concept is known as the “heat of fusion” and the “heat of vaporization”.
So boiling/condensing water and freezing/melting water doesn't happen at an exact moment. You can have 0C ice and 0C water sitting together in a glass at equilibrium. You can have 100C water and 100C steam sitting together in a pot at equilibrium. And it takes quite a lot of heat transfer for all the 0C ice to convert to 0C water or all the 0C water to convert to 0C ice, or all the 100C water to convert to 100C steam or all the 100C steam to convert to 100C water.
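To put rough numbers on "quite a lot of heat transfer", here's a back-of-the-envelope comparison using the commonly quoted round values for water (approximate, at 1 atm):

```python
# Rough comparison of the heat needed for phase changes vs. ordinary warming,
# using approximate textbook values for water at 1 atm.
heat_of_fusion = 334         # kJ per kg: 0 C ice -> 0 C water
heat_of_vaporization = 2260  # kJ per kg: 100 C water -> 100 C steam
specific_heat = 4.19         # kJ per kg per degree C, liquid water

mass = 1.0  # kg of water
print(mass * heat_of_fusion)        # ~334 kJ just to melt the ice at 0 C
print(mass * specific_heat * 100)   # ~419 kJ to warm the water from 0 C to 100 C
print(mass * heat_of_vaporization)  # ~2260 kJ to boil it all away at 100 C
```

Boiling a kilogram of water away takes several times more energy than heating it from 0C to 100C in the first place, which is why the temperature sits so stubbornly at the fixed point while the phase change is going on.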
It’s not abstruse at all since the temperature does not change when water is boiling, no matter how hard it boils. The same for freezing. As long as there is a container of water and ice (suitably stirred) the temperature will remain the same until all the ice melts. You do have to wait for the ice/water temperature to stabilize.
Temperature is one of the standards you can replicate by yourself to a reasonable degree. Along with level (use a pool of liquid), plumb (weight on a string) and pi.
The other ‘standards’ you mentioned - length, weight, mass - I don’t give a crap about some platinum bar; it’s all completely made up. We cut the bar to suit us and made up some dimension. However, that is one of the reasons the English length measurement system is so superior to the metric: it is based on human observation and use.
Dennis
The beauty of freezing and boiling points as a standard is that you don’t need to find an exact moment. I just heated up some water for tea. When the kettle started whistling, it was boiling. While I walked over to the stove, it was still boiling. When I got my cup and spoon and such out, it was still boiling. During that time, it was continually at, neither above nor below, the boiling-point temperature. Materials don’t change their temperature while they’re changing state.
It’s also worth noting that the kilogram is the only unit whose standard is still defined by a physical artifact (a lump of precious metal in Paris), and they’re working to eliminate that, too. All of the other units are defined in terms of physical constants and reproducible experiments. For instance, the second is officially defined in terms of the frequency of light emitted by a cesium atom of a particular isotope under certain conditions, so any lab in the world with sufficiently-precise equipment can calibrate their clocks to a “standard second”, to within the precision of their instruments, by putting a cesium atom in those conditions and measuring the frequency of light it produces, without needing to ship a super-precise clock halfway around the world. The meter, in turn, is defined in terms of the standard second and the speed of light, so again, any lab with the appropriate equipment can measure the standard meter. There are still platinum bars with scratches in them, which are often more convenient than doing the experiments, but they’re not the official standard any more.
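For concreteness, the two defining constants involved are exact numbers, so the "experiment" boils down to counting cycles and timing light; a small sketch:

```python
# The defining constants behind the second and the meter (exact by definition).
cesium_frequency = 9_192_631_770   # Hz: hyperfine transition of cesium-133
speed_of_light = 299_792_458       # m/s: exact by definition

# One second is the time for 9,192,631,770 cycles of that cesium radiation;
# one meter is the distance light travels in 1/299,792,458 of a second.
one_cycle = 1 / cesium_frequency           # seconds per cycle of the radiation
meter_in_light_time = 1 / speed_of_light   # seconds for light to cross one meter
print(one_cycle, meter_in_light_time)
```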
Maybe at one time. But when was the last time you needed to know how long a standard furrow was, or a thousand paces of a Roman legionaire? Why is the length of the last joint of your little finger any more fundamental than the width of your little fingernail? What human use is the pint based on, and is the American pint or the British one better for that purpose, or maybe something in between? Of what use is a unit of power that’s about half to a third of the power of a decent workhorse?
Here is a list of handy temperature standards you can reproduce in your laboratory, including what substance to use (water, mercury, neon, aluminum, etc.) and what type of thermometer you should be using.
Don’t you know that the yard is an elegant measure for a more civilized age, but the meter is just an abomination?
And a pound is just natural. The other day I bought a pound of bananas, which is the amount I wanted. It would be way too complicated to buy 0.453592 kilograms of bananas, that’s just insane! And I set the thermostat to a comfortable 70 degrees, imagine trying to set your thermostat to 21.1111 Celsius. Imagine trying to buy liquids in metric, like if you wanted two quarts of soda, but couldn’t find that amount for sale anywhere and had to settle for 2.11338 quarts? I just want some bananas for crying out loud!
The British pint is clearly superior because you get more beer.
That was yesterday during the Preakness Stakes … they were calling the distance in furlongs = length of an English furrow … [giggle] …
Since the OP talked about standards, I am nitpicking your post. While what you say is generally true, this is the thermodynamics view of boiling and totally neglects heat transfer. Heat transfer rates, or specifically heat flux rates, affect boiling, which gets a lot of attention when doing boiler design for steam power plants.
There is always some degree of superheating when you are boiling water in a pan. See this graph for different superheats as a function of heat flux: at one end is nucleate boiling (smooth and nice) and at the other end is film boiling (the Leidenfrost effect, boiler explosions, and walking on hot coals are all related to film boiling).
To eliminate superheat and other such effects, the boiling point of a liquid (or melting point of a solid) is measured using a Thiele tube in a laboratory. The heating of the sample is entirely by convection, which avoids a lot of heat flux effects. It also gives a clear indication of the phase change.
See page 6 of 8 of this document http://www.chem.ucalgary.ca/courses/351/laboratory/boilingpoint.pdf to see how this tube is used.
In general, I favor U.S. standards because I am in the U.S., but in this case I think I can make an exception.
The list I linked to explicitly specifies triple point, melting point, or freezing point, as appropriate. Naturally, there is a lot of extra documentation on exactly how to measure all these.
You can also pay to have your thermometer calibrated.
Don’t forget about the 60 cl “Australian pint” !
True, if you’re a standards laboratory, there are a lot of fine details you need to worry about. But if you’re just a layman calibrating your home thermometer, to the precision you’ll actually be able to read it, a teakettle and a champagne bucket will be perfectly good temperature standards.
I love this website. I have gained more from it, in terms of factual knowledge as well as the ability to think clearly and form a good question, than from almost any other single source.
If you don’t need certification, but want more exact results with less work, consider a Pt-100 RTD. One that is Class A under the international IEC 60751 or US ASTM E1137 standard will be accurate to about ±0.06 °C at 100 °C.
Even a Class B will be about ±0.12 °C at 100 °C.
Pt-100 RTDs make it trivial to convert resistance to temperature, and they are stable, flexible, and fairly inexpensive for most needs that don’t require traceable calibration but do need precision.
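If it helps, here's a minimal sketch of that resistance-to-temperature calculation using the standard Callendar-Van Dusen coefficients from IEC 60751 (this simple quadratic form only applies at 0 °C and above; below zero an extra cubic term is needed):

```python
import math

# Converting a Pt-100 resistance reading to temperature with the standard
# IEC 60751 Callendar-Van Dusen coefficients (valid for 0 C and above).
R0 = 100.0       # ohms at 0 C for a Pt-100
A = 3.9083e-3    # per degree C
B = -5.775e-7    # per degree C squared

def pt100_resistance(t_c):
    """Resistance in ohms at temperature t_c (degrees C, t_c >= 0)."""
    return R0 * (1 + A * t_c + B * t_c**2)

def pt100_temperature(r_ohms):
    """Invert the quadratic to get temperature (degrees C, >= 0) from resistance."""
    return (-A + math.sqrt(A**2 - 4 * B * (1 - r_ohms / R0))) / (2 * B)

print(round(pt100_resistance(100.0), 2))    # ~138.51 ohms at 100 C
print(round(pt100_temperature(138.51), 2))  # ~100 C back again
```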