Units for Temperature Measurement

The Système International d’Unités (SI) uses the kelvin as the unit for temperature. Ultimately, kelvins are derived from degrees Celsius and the concept of absolute zero. The Celsius scale arose from originally assigning 0 for the freezing point of water and 100 as the boiling point of water. The current definition is just a little more complicated but is pretty close to that. The scale is applicable to the surface of this planet but the observable universe is a rather large place. Is there a unit for temperature that is more “natural” or “fundamental”?

Absolute zero is definitely the most natural reference point on the low side. So Kelvin, Rankine, etc. win over the others in that regard.

But aside from that, not really. Any upper reference point is going to be somewhat provincial, assuming low-energy chemistry instead of plasma physics or nuclear chemistry or the like. Perhaps one could argue that the Planck temperature is a useful upper reference point, but that’s 1.417 × 10^32 kelvin. We don’t even have a metric prefix to scale that down to anything useful. We’d have to work in milli-quecto-degrees.
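To put a number on that joke (my arithmetic, using the CODATA value for the Planck temperature):

```python
# With the Planck temperature as the unit, everyday temperatures
# land in milli-quecto territory.
T_PLANCK_K = 1.416784e32  # Planck temperature in kelvin (CODATA)

# One kelvin, expressed in Planck temperatures:
one_kelvin = 1.0 / T_PLANCK_K        # ~7.06e-33 T_P

# quecto- is 1e-30, so milli-quecto- is 1e-33:
print(one_kelvin / 1e-33)            # ~7.06 "milli-quecto" T_P
```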

Never mind. Brain fart. But it was fun while it lasted. :zany_face:

Side note, you don’t know “customary units joy” until you study naval nuclear power using things like degrees Rankine and British Thermal Units in all of your steam cycle work.
Salt-and-peppering the “proportionality constant” g_c = 32.2 \frac{\mathrm{ft}\cdot\mathrm{lbm}}{\mathrm{lbf}\cdot\mathrm{s}^2} all over the place (or maybe it was the reciprocal…almost 40 years ago).

A more natural unit for temperature is whatever energy unit you are using. Joules or electron-volts (eV) or whatever best fits your context.

Temperature is fundamentally tied to the notion of thermal equilibrium, which in turn is fundamentally tied to how changes in entropy and changes in energy relate to one another. Entropy is fundamentally unitless, so temperature most naturally has units of energy. If you maintain separate units for temperature, there isn’t any physics that makes these separate units more natural. It’s only historical practice that governs the choice.

The Boltzmann constant k_B provides the relationship between temperature and energy units, but you can just set k_B to exactly 1 and skip the trouble.

In statistical mechanics, one often uses the energy version of temperature from the get-go, usually denoted “tau” (\tau), without ever needing to introduce the Boltzmann constant at all.

This choice is more than mathematical cleanliness. Every time you hear about some extreme temperature, it would often be more useful to hear the energy-equivalent version, so that the relevant physical processes can be immediately ascertained. Examples:

  • Some astronomical system is at T=50,000 K. What’s that hot enough to do? Who knows. Better: The system is at \tau=4.3 eV. Ah! So, hydrogen will not be fully ionized since the ionization energy of hydrogen is 13.6 eV.

  • A fusion reactor needs to be hot enough for the fuel’s atoms to overcome the Coulomb barrier between them. How hot does it need to be, in kelvin? It’s anybody’s guess. In energy units, though, \tau at or above 20 keV should start providing substantive fusion given the energy scale of the Coulomb barrier.

  • Superconductivity can arise when electron-electron pairs (“Cooper pairs”) can stably form, which requires low temperatures. How low? Well, the Cooper pair bond strength is about 0.001 eV, so – that’s the temperature \tau of interest. That happens to be around 10 K.

  • Say you are interested in an early stage of the universe when the temperature was 10^12 K. Were Z bosons being created easily at that point? Dunno. If you had said the temperature was around 100 MeV, then the answer would be an immediate ‘no’ since Z bosons have a mass around a thousand times higher (91 GeV). Separately, one could immediately say that nuclei cannot readily form around temperature \tau=100 MeV since nuclear binding energies are only on the scale of a few to tens of MeV.

  • Room temperature is \frac{1}{40}~\rm{eV}. This is immediately applicable if you know the energy scales of physical processes you either want – or don’t want – to happen in your experiment.
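The kelvin-to-eV conversions in those examples are just multiplication by the Boltzmann constant expressed in eV/K. A quick check (my sketch, CODATA value for k_B):

```python
# Energy-equivalent temperature: tau = k_B * T.
K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K (CODATA)

def kelvin_to_ev(t_kelvin):
    """Convert a temperature in kelvin to its energy equivalent in eV."""
    return K_B_EV * t_kelvin

print(kelvin_to_ev(50_000))      # ~4.3 eV: hydrogen not fully ionized
print(kelvin_to_ev(1e12) / 1e6)  # ~86 MeV: no Z bosons, no nuclei
print(kelvin_to_ev(293))         # ~1/40 eV: room temperature
print(0.001 / K_B_EV)            # ~11.6 K: Cooper-pair energy scale
```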

Separating my reply in two since this part is a silly bonus.

Last year there was this thread and post. To summarize the post, you could get a step more fundamental by using the quantity \beta=\frac{1}{\tau} (or its negative). This has two advantages. First, equations tend to be a lot cleaner in terms of \beta. Second, it smooths out a very niche but real wrinkle of temperature in which negative temperatures are possible and actually hotter than positive temperatures. So, something at -1 kelvin will give up heat to something at 1 billion kelvin. Negative temperatures are not possible in “normal” systems, though, and require some sort of contrivance to realize.
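With \beta the ordering claim becomes mechanical: heat flows from the system with the smaller \beta to the one with the larger \beta, no special case needed for negative temperatures. A sketch (temperatures in kelvin; the k_B factor cancels out of the comparison):

```python
def beta(t_kelvin):
    """Coldness: beta = 1 / T. Smaller beta means hotter."""
    return 1.0 / t_kelvin

# -1 K is hotter than 1 billion K, which is hotter than 1 K:
assert beta(-1.0) < beta(1e9) < beta(1.0)
```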

All the same, \beta gets lots of play given the mathematical cleanliness it provides.

As the need for more extreme prefixes continues, I hope to see groucho-, harpo-, and chico- (along with their counterparts grouchi-, harpi-, and chici-) adopted someday.

And Zeppo; everyone forgets about poor Zeppo.

Larry-, Moe-, and Curly- FTW! :grin:

Instead of the boiling point of water, you could use the triple point of water, the unique temperature and pressure at which ice, liquid water and vapor exist in equilibrium. The temperature is very close to 0 °C. It is not truly universal since it depends on the arbitrary choice of water, but it is independent of conditions on earth.

Zepto is already a metric prefix (Honk! Honk!)

I suppose if you want a really natural energy unit for temperature, you’d use rydbergs, since volts are human units. Rydberg constant - Wikipedia
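For scale (my numbers, CODATA values): one rydberg is about 13.606 eV, so room temperature comes out near 1/540 Ry:

```python
RY_EV = 13.605693          # 1 rydberg in eV (CODATA)
K_B_EV = 8.617333262e-5    # Boltzmann constant in eV/K (CODATA)

room_tau_ev = 293 * K_B_EV   # ~1/40 eV
print(room_tau_ev / RY_EV)   # ~0.00186 Ry, roughly 1/540
```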

What a coincidence… just yesterday I was helping someone form the ice mantle in a water triple point cell. :slight_smile: We use the cell when we need to use our SPRTs for temperature cals.

Between 1948 and 2019, the triple point of water was defined as 273.16 K (0.01 °C). Interesting bit of trivia: because it was defined, the temperature of the triple point of water couldn’t be measured. Because… you can’t measure something that’s defined. Sadly, IMO, it is no longer defined.

It’s not arbitrary. Only VSMOW may be used when the triple point of water serves as a temperature reference point.

Not quite the same, but close enough that confusion might reign if they did add a zeppo- prefix along with the existing zepto-.

This article suggests they’ve just about exhausted the sensible prefixes, both as to magnitude and as to ease of writing unambiguously with just one letter. But we can always hope for the need to add more :wink:

IIRC the way the ITS-90 specifies the water, it must represent an average of “the seven seas”, and quantities of water are gathered in various oceans and carried around the world, purified, and mixed, to fill triple point cells. It’s the balance of various included isotopes that they’re going for, here.

All units are human units at the end of the day. The unnecessary choice of having both temperature and energy units is fixable, but having energy units at all still requires a choice whose “naturalness” will be context dependent. A rydberg of energy could be a handy unit for atomic and molecular systems, and those are pretty darn common systems, but for other contexts, a rydberg would still feel quite arbitrary.

I understood Hari_Seldon to mean that it is arbitrary to choose water instead of any other chemical compound.

I strongly suspect very few people would have your understanding that one cannot just arbitrarily pick any water from somewhere in the universe…

Funny how professional goggles influence your understanding of what other people are saying.

The gram was also based (originally) on water so it’s actually sort of “natural” that the temperature gauge is also based on it, within the pantheon of the metric system.

Fahrenheit was determined with an eye towards measuring weather. I’m not sure, but I’d expect that most organic chemistry happens within or near the 0–100 °F range. (E.g., freezers are calibrated to 0 °F, even in Metric-land, because bacteria continue to multiply at 0 °C. This activity bottoms out around 0 on the Fahrenheit scale. 100 °F is, effectively, the midpoint between a healthy human body temperature and a dangerous fever. The fever is undertaken by the body to destroy bacteria.)

But, conceivably, the range of organic chemistry could be much wider if we start including silicon-based life, or other, galactic life forms. There’s no knowing until we know.

The most satisfying response, for me anyway, that I’ve seen so far is the idea of redefining the Boltzmann constant as 1.

You’re correct; I misinterpreted his comment.

There are 14 “fixed” temperatures for ITS-90. Of these, one is a melting point, six are triple points, and seven are freezing points. In addition, one is a defined temperature (triple point of water), and the other 13 are simply known to a very high degree of accuracy.

As mentioned earlier, though, the temperature for the triple point of water is no longer defined as of 2019, and is now considered to be known to a very high degree of accuracy, just like the other 13.

My understanding is that Fahrenheit chose the coldest temperature he could easily create (by mixing ice with salt) to be zero, and his own body temperature that day as 100.

It is obviously an arbitrary, human-centric scale, but it works very well for “temperatures people routinely encounter”. Certainly, anything above 100F is too damn hot and anything below 0F is bitterly cold.

The smaller degrees work well for making temperature-sensitive stuff like jelly and candy, too.

Starting at absolute zero is more fundamental, but those numbers are awkward for most human purposes. Which is why we use Celsius and Fahrenheit for everything except physics.

My European-made freezer defaulted to -4F, which seems to be a fine temperature to store food at.

So if the triple point temperature of water is no longer defined, what is the definition?