It is easy to calibrate thermometers once you learn that water always freezes and boils at the same temperature (at the same altitude) - but to know this, I assume, you need a thermometer first.
So how did “they” find out about this fact? It could be (without prior knowledge) that at one point on earth water freezes at 0 degrees and at another at -5 degrees. Or did they just assume so?
Sure, but how did they know these temperatures were equal without the use of a thermometer? You cannot “tell” that water always boils at the same temperature by looking at it (or by sticking your finger inside).
I expect observation would lead one to believe, or at least suspect, that freezing temperature and boiling temperature were consistent. After calibrating a bunch of thermometers based on this assumption, one would then observe that those temperatures were indeed consistent. The same process, with additional equipment to measure altitude and/or barometric pressure, could then lead to accurately adjusting for those factors.
They didn’t know at first, but reasonably it was [del]suspected[/del] hypothesized, then verified through experimentation. The scientific method at work.
“In about 1654 Ferdinando II de’ Medici, Grand Duke of Tuscany, made sealed tubes part filled with alcohol, with a bulb and stem, the first modern-style thermometer…
“However, each inventor and each thermometer was unique—there was no standard scale. In 1665 Christiaan Huygens suggested using the melting and boiling points of water as standards, and in 1694 Carlo Renaldini proposed using them as fixed points on a universal scale. In 1701 Isaac Newton proposed a scale of 12 degrees between the melting point of ice and body temperature. Finally in 1724 Daniel Gabriel Fahrenheit produced a temperature scale which now (slightly adjusted) bears his name. He could do this because he manufactured thermometers, using mercury (which has a high coefficient of expansion) for the first time and the quality of his production could provide a finer scale and greater reproducibility, leading to its general adoption.”
By comparing to a different reference point. Some form of container, of which billions can be found around the globe, which generally holds the same internal temperature (because it generates its own heat and can vent that heat) regardless of atmospheric pressure. Fahrenheit chose to set 100° (or close to it) as the average heat of this vessel.
If you want to skip that step though, construct a simple barometer (I suggest a piston system) and perform observations at different altitudes using your uncalibrated thermometer. This should give you enough data to calculate how pressure affects the boiling temperature of water.
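The pressure/boiling-point relationship the barometer experiment would reveal can be sketched numerically. This is only an illustration, not anyone's historical method: the Antoine coefficients for water come from standard tables, and the fixed scale height in the barometric formula is a rough assumption.

```python
import math

# Antoine equation coefficients for water (pressure in mmHg, temperature in
# degrees C, valid roughly over the 1-100 C range) -- from standard tables.
A, B, C = 8.07131, 1730.63, 233.426

def boiling_point_c(pressure_mmhg):
    """Invert the Antoine equation: the boiling point is the temperature
    at which the vapor pressure equals the ambient pressure."""
    return B / (A - math.log10(pressure_mmhg)) - C

def pressure_at_altitude_mmhg(altitude_m, scale_height_m=8400.0):
    """Crude isothermal barometric formula; the scale height is an
    assumed round number, good enough for a sketch."""
    return 760.0 * math.exp(-altitude_m / scale_height_m)

# Sea level (760 mmHg) gives ~100 C; at ~2000 m the boiling point
# drops by several degrees, matching what the uncalibrated-thermometer
# observations at different altitudes would show.
print(boiling_point_c(760.0))
print(boiling_point_c(pressure_at_altitude_mmhg(2000.0)))
```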
I don’t think this contradicts Leaffan’s statement. There could be disagreement about which references to use and still broad recognition that certain temperatures were potential references. Unless you mean that the numbers assigned to the temperatures, as opposed to the temperatures per se, did not come first.
I don’t quite see your problem. You could easily (well, easily if you’re willing to travel) use the thermometer to test whether this was true - that’s what thermometers do. Once you have your first thermometer, just immerse it in boiling water and mark a line at the relevant point. Go to the other side of the world at the same altitude (say sea level, to make it easy). Repeat. The level of mercury should be at the same line, give or take a little wiggle room for atmospheric pressure variations.
Now write “212ºF” on your thermometer (or “100ºC” if you prefer), then repeat with iced water.
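The mark-two-points-and-label-them procedure above is just a linear two-point calibration, which is easy to sketch. The heights and function names here are my own invention; the only assumption doing real work is that the mercury column rises linearly between the two marks.

```python
def calibrate(h_ice, h_boil):
    """Build temperature readouts from two marks on the tube: the mercury
    height at the ice point and at the boiling point (both in mm).
    Assumes the mercury expands linearly between the two marks."""
    def to_celsius(h):
        return 100.0 * (h - h_ice) / (h_boil - h_ice)

    def to_fahrenheit(h):
        return 32.0 + 180.0 * (h - h_ice) / (h_boil - h_ice)

    return to_celsius, to_fahrenheit

# Example: marks at 20 mm (iced water) and 220 mm (boiling water).
to_c, to_f = calibrate(20.0, 220.0)
print(to_c(120.0))   # halfway up the tube -> 50.0
print(to_f(120.0))   # same height on the other scale -> 122.0
```

The same two physical marks yield either scale; only the numbers painted next to them differ.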
There’s a long history when it comes to standard fixed point temperatures. As mentioned, Fahrenheit was trying to create “the coldest thing in the world,” and defined it as 0 degrees. Yes, he was trying to create an absolute temperature scale. Of course, we now know that his brine solution is nowhere near the coldest achievable temperature. I believe it was he who also used the core temperature of the human body as a fixed temperature.
For a long period of time the freezing point of water was used as a defined fixed point temperature. It was defined to be exactly 0 °C. In 1948 the freezing point of water was abandoned as a fixed point and replaced with the triple point of VSMOW water, which is defined to be exactly 0.01 °C (273.16 K). The only other defined temperature is absolute zero (0 K). All other temperatures - including the freezing points of pure substances - are simply known to a greater or lesser degree of accuracy; all carry some uncertainty. Only absolute zero and the triple point of VSMOW water are exact by definition.
As a result of the above, the freezing point of water is no longer exactly 0 °C. The latest measurements suggest pure water freezes somewhere around 0.000089 °C. This is so close to 0 °C that, for virtually all applications, the freezing point temperature of pure water may be assumed to be 0 °C. But it is not *exactly* 0 °C. Few people know this.
And of course, there’s also the question of what you do if you find that, in different circumstances, boiling water makes the mercury expand to different markings on your tube. Is the water acting differently, or the mercury?
Unless you’ve got a complete theory of thermodynamics (which nobody did until at least the 19th century) who is to say whether freezing/boiling water or expanding mercury are better gauges of ‘temperature’?
That is essentially correct… make a bunch of LiG (liquid-in-glass) thermometers and insert them into a substance that is in equilibrium at its freezing/melting point. Put a scribe mark on the glass at the meniscus. Repeat the process of creating the freezing point. If the meniscus always rises to the same scribe mark, you can assume the freezing point of your substance always occurs at the same temperature. And then you can assign it an arbitrary number if it is to be a *defining* fixed point on your scale. But it must not only be repeatable, but reproducible: if someone on the other side of the world makes his own freezing point bath using the same substance, your thermometers should respond the same in his bath.
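The repeatable-versus-reproducible test described above can be expressed as a simple check on where the scribe marks land. A toy sketch - the readings and the tolerance are invented numbers, chosen only to show the shape of the test:

```python
def is_repeatable(marks_mm, tolerance_mm=0.5):
    """Same thermometer, same bath, many freezes: does the meniscus
    land on (nearly) the same scribe mark every time?"""
    return max(marks_mm) - min(marks_mm) <= tolerance_mm

def is_reproducible(my_marks_mm, their_marks_mm, tolerance_mm=0.5):
    """Same substance, bath independently prepared elsewhere: do the
    thermometers settle at the same mark in both labs?"""
    all_marks = my_marks_mm + their_marks_mm
    return max(all_marks) - min(all_marks) <= tolerance_mm

# Invented meniscus heights (mm up the stem) from repeated freezes:
mine = [20.1, 19.9, 20.0, 20.2]
theirs = [20.0, 20.1, 19.8]   # same substance, other side of the world
print(is_repeatable(mine))            # True
print(is_reproducible(mine, theirs))  # True
```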
The joke goes that Fahrenheit decided to make a thermometer that any amateur scientist could make - first he mixed salt with ice to create the lowest possible temperature you can make that way; then he measured body temperature using the same thermometer. He scaled these from 0 to 100 - but while fiddling with ice below zero, he got a bit of a fever - so body temperature is not really 100.
Of course, in the world of amateur 1600s science, part of the expectation was that a lab would rarely have to deal with temperatures below 0. As for the question about consistency - the obvious view is that temperatures were pretty consistent: pure water froze at about a certain point, body temperature sat around a certain value, etc. Years of consistent observation basically showed that no matter which way you sliced and diced it, certain temperatures seemed to be consistent. The finer you measured, the more consistent it appeared. Observation bears out hypothesis.