Degree as a unit of temperature?

When and how did the degree become the unit used to measure temperature?

Does it have anything to do with the fact that degree is just a generic linear progression?

How (if at all) is it related to the degrees of measuring angles or longitude/latitude?

In that case is a degree less of a defined item than just a single “step” on the Fahrenheit or Celsius temperature scale?

I looked around and didn’t find much to help out. The definitions of “degree” I found support the step-in-the-scale theory. However, if we accept that a degree is a step in the scale, how was it determined what a full step was? How is the exact “degree of temperature” decided?

As for when: I found one source that said it was first used by the French in 1727, with no supporting documentation (and this same source states that this use precedes the use of degree as a unit of measurement for a circle).

Any insight would be appreciated.

Well, a degree doesn’t have any specific meaning with regard to temperature unless you specify what scale you’re talking about – Fahrenheit, Celsius, Kelvin, etc. The word “degree” basically just means “step.” It comes originally from Latin – the preposition de combined with the word gradus, meaning step. (My Latin is very rusty, but I think gradus is the noun for step, and gradi is the verb to step – possibly to walk step-by-step. Gressus is the past participle “having stepped.”) Thus, it makes sense that this word has come to be used in English when dividing a continuum into discrete steps, whether we’re talking about temperature or the angles of a circle.

As for why we call the temperature scales by the names we do:
Fahrenheit is named for the German physicist who invented the scale, as well as the mercury thermometer.

Celsius is named for the Swedish astronomer who invented the centigrade thermometer. (Centigrade, another name for the Celsius temperature scale, is just a word of Latin origin meaning 100 steps, since there are one hundred degrees between the freezing and boiling points of water on this scale.)

Kelvin is named for the British physicist who developed this scale.

The word ‘degree’ is associated with measurement of temperature because the earliest temperature measurement devices used glass columns of water, mercury or alcohol that were marked, or “graduated.”

And, as tim314 has pointed out, these words originate from the same Latin root. Someone more proficient with Italian can probably better describe the intermediate forms of the word, and the likely sequence of formation, since this page credits Santorio Santorio with the invention of the graduated “thermoscope.”

Kelvin is not measured in degrees. This is an important concept.

When you say the temperature using Kelvin you’d say 450 Kelvins, NOT 450° (degrees) Kelvin. That’s because Kelvins are an actual, quantifiable unit of energy.

Just a nitpick, tim314, but Kelvins are themselves a unit; there’s no such thing as a “degree Kelvin.”

Damn, somehow I missed Hail Ants’ post on preview. :smack:

But a Kelvin equals a degree Celsius, right?

An increase of one Kelvin is exactly proportional to an increase in temperature of one degree Celsius.

Actually, I thought “degrees” came at least partly from the invention of an early type of thermometer, which used a needle attached to a metal coil that would increase or decrease the torsion on a shaft connecting the needle to the coil. The thermometer was designed such that the shaft would be twisted a full 180 degrees from the freezing to the boiling point of water, and hence the 180-degree difference in temperature on the Fahrenheit scale between freezing and boiling (32 to 212 degrees).

I can’t remember where zero on the Fahrenheit scale came from. I think that’s the lowest temperature you can get water mixed with salt to attain, and that may have had something to do with thermometers that used water, and hence that was the lowest measurable temperature. Or something like that.

According to Merriam-Webster Collegiate, “degree” ultimately descends from the Latin gradus, meaning “step” or “stage (in a process).” A “degree” in temperature is one step on a graduated (i.e., divided into grades or intervals) scale.

All of those words (degree, graduated, grade) are derived from gradus.

Man I worded that poorly. It’s the lowest temperature you can get saltwater to attain before it solidifies.

Again, I can’t remember if that’s correct, though.

At least it was the lowest temperature that Fahrenheit could get in his laboratory.

Maybe that’s it.

Anyhoo, as I said above, the fact that there are 180 degrees between freezing and boiling water is not an accident, I think. It’s a half-circle rotation of a dial on a thermometer, meant to tick off one degree for every, er, degree. Unless I’ve lost my mind (which could happen).

To expand on the basis of the Fahrenheit degree, 0 degrees was arrived at by Fahrenheit as reported here, and Fahrenheit arbitrarily chose body temperature as 100 degrees. Someone else can explain why he was off by 1.4 degrees. That’s the way I remember it.

Here is a site that describes the origins of the points on the Fahrenheit scale that tracks well with the descriptions I’ve read elsewhere.

There is no dial on a Fahrenheit-style thermometer; he used a simple mercury thermometer, which had recently been developed. The 180-degree thing is a deliberate (though slight) departure from the original scale, done because it was noticed that there were almost exactly 180 degrees between the freezing and boiling points of water on that scale.

I’ve been trying to get some kind of consensus about the 180-degree difference between boiling and freezing, and I’ve found that to be quite difficult. It seems like there are at least four or five “authoritative” versions of the story.

Anyway, I can’t find any reference to my dial thermometer, so I declare myself effing wrong on that one.

However, it does appear that Fahrenheit found the 180-degree thing aesthetically pleasing, due to the half-circle nature of the figure, and deliberately tweaked his scale to arrive at that interval. Also, 180 is a nice, round number that’s divisible by 2, 3, 4, 5, 6, 10 and so on, making it convenient from a mathematical standpoint. He initially chose (somewhat arbitrarily) 30 for freezing and 100 for human body temperature, and then had to adjust these numbers to A) fit experimental data, and B) fit his own ideas about what the temperature scale should look like.
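A quick check of that divisibility claim, in Python (the full divisor list and the comparison with 100 aren’t from the thread; this just enumerates them):

```python
# List every whole number that divides n evenly, to check the
# "divisible by 2, 3, 4, 5, 6, 10 and so on" claim above.
def divisors(n):
    return [d for d in range(1, n + 1) if n % d == 0]

print(divisors(180))
# → [1, 2, 3, 4, 5, 6, 9, 10, 12, 15, 18, 20, 30, 36, 45, 60, 90, 180]
print(len(divisors(180)), "divisors vs.", len(divisors(100)), "for 100")
# → 18 divisors vs. 9 for 100
```

So 180 really is unusually easy to subdivide, which fits the “convenient from a mathematical standpoint” story.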

I think that’s the right version of the history. It may not be, though. It’s the best I can come up with.

Here is Isaac Asimov’s take on the 180[sup]o[/sup] difference and the Fahrenheit temperature scale in general. It’s from his essay The Height of Up. By this account, Fahrenheit originally set his high mark, body temperature, at 12 but changed it to 96 because his thermometer was capable of greater accuracy than had been the case until then. On this scale freezing water was a little under 32 and boiling water was a little under 212, a difference of not quite 180[sup]o[/sup]. Fahrenheit possibly liked 180 exactly, and since it was so close to that anyway, he changed his reference points to two physical phenomena of water. He set the freezing point at exactly 32 and the boiling point exactly 180 away, at 212. This made body temperature 98.6, but that’s not a constant and is only approximate anyway, so it doesn’t matter.

I can’t find any reference to Fahrenheit setting body temperature at 100.

*"But then, in 1714, a German physicist named Gabriel Daniel Fahrenheit made a major step forward. The liquid that had been used in the early thermometers was either water or alcohol. Water, however, froze and became useless at temperatures that were not very cold, while alcohol boiled and became useless at temperatures that were not very hot. What Fahrenheit did was to substitute mercury. Mercury stayed liquid well below the freezing point of water and well above the boiling point of alcohol. Furthermore, mercury expanded and contracted more uniformly with temperature than did either water or alcohol. Using mercury, Fahrenheit constructed the best thermometers the world had yet seen.

With his mercury thermometer, Fahrenheit was now ready to use Newton’s suggestion[sup]*[/sup]; but in doing so, he made a number of modifications. He didn’t use the freezing point of water for his zero (perhaps because winter temperatures below that point were common enough in Germany and Fahrenheit wanted to avoid the complication of negative temperatures). Instead, he set zero at the very lowest temperature he could get in his laboratory, and that he attained by mixing salt and melting ice. Then he set human body temperature at 12, following Newton, but that didn’t last either. Fahrenheit’s thermometer was so good that a division into twelve degrees was unnecessarily coarse. Fahrenheit could do eight times as well, so he set body temperature at 96.

On this scale, the freezing point of water stood at a little under 32, and the boiling point at a little under 212. It must have struck him as fortunate that the difference between the two should be about 180 degrees, since 180 was a number that could be divided evenly by a large variety of integers including 2, 3, 4, 5, 6, 9, 10, 12, 15, 18, 20, 30, 36, 45, 60 and 90. Therefore, keeping the zero point as was, Fahrenheit set the freezing point of water at exactly 32 and the boiling point at exactly 212. That made body temperature come out (on the average) at 98.6°, which was an uneven value, but this was a minor point.
Thus was born the Fahrenheit scale, which we, in the United States, use for ordinary purposes to this day. We speak of “degrees Fahrenheit” and symbolize it as “°F,” so that the normal body temperature is written 98.6° F."*

  • In 1701 Isaac Newton suggested that the temperature scale start with the temperature of melting ice as the zero point and the body temperature at 12.

Sounds good to me. What’s wild is this doesn’t agree with some other accounts I’ve read. Nobody ever provides references to primary sources (like Fahrenheit’s notebooks, for instance). Maybe there aren’t any, and we’re getting the story second-hand.

At any rate, I see no reason to favor any version of the story over Asimov’s, so great, thanks for the excerpt!

>An increase of one Kelvin is exactly proportional to an increase in temperature of one degree Celsius.
It’s also proportional to an increase of one degree Fahrenheit. But it is EQUAL to one degree Celsius.
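The proportional-vs.-equal distinction can be shown in a few lines of Python (a minimal sketch; the 273.15 offset and the 9/5 factor are the standard scale definitions, not something from the thread):

```python
import math

def celsius_to_kelvin(c):
    # Kelvin and Celsius share the same step size; only the zero point differs.
    return c + 273.15

def celsius_to_fahrenheit(c):
    # A Fahrenheit degree is 5/9 the size of a Celsius degree.
    return c * 9 / 5 + 32

# A 1 °C increase is exactly a 1 K increase (equal step sizes)...
assert math.isclose(celsius_to_kelvin(21.0) - celsius_to_kelvin(20.0), 1.0)
# ...but a 1.8 °F increase (proportional, not equal).
assert math.isclose(celsius_to_fahrenheit(21.0) - celsius_to_fahrenheit(20.0), 1.8)
# And the fixed points discussed upthread: 0 °C → 32 °F, 100 °C → 212 °F.
assert celsius_to_fahrenheit(0) == 32 and celsius_to_fahrenheit(100) == 212
```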

>Kelvin is not measured in degrees. This is an important concept.
But it was until not all that many years ago.

>Kelvins are an actual, quantifiable unit of energy.
No, an actual quantifiable unit of thermodynamic temperature. More specifically, 1/273.16 of the thermodynamic temperature of the triple point of pure water having an isotopic composition like seawater’s. “Thermodynamic temperature” has been defined in terms of an ideal gas, then of the Carnot cycle, and most recently of a quantum mechanical treatment of entropy. However, the Dopers could get a lot of mileage out of thinking of it as proportional to the mean kinetic energy of a molecule of an ideal gas held at constant density.
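That last picture can be made concrete. A minimal sketch, assuming the standard equipartition result E = (3/2)kT for the mean translational kinetic energy of a monatomic ideal-gas molecule (the 3/2 factor and the exact Boltzmann constant are textbook values, not from this thread):

```python
K_B = 1.380649e-23  # J/K, Boltzmann constant (made exact by definition in the 2019 SI)

def mean_kinetic_energy(temp_kelvin):
    # Mean translational kinetic energy of one monatomic ideal-gas
    # molecule, in joules: E = (3/2) * k_B * T.
    return 1.5 * K_B * temp_kelvin

# At the triple point of water, 273.16 K by definition:
print(f"{mean_kinetic_energy(273.16):.2e} J")  # → 5.66e-21 J per molecule
```

The energy per molecule scales linearly with T, which is exactly the “kelvins track molecular kinetic energy” intuition from the post above.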