How (If They Did So At All) Did The Ancients Measure Temperature?

Did any ancient culture have any means of measuring/describing the temperature? Could Socrates, Cicero, Ramanedes, etc. have had any means of describing today’s temp other than “slightly warmer than it was yesterday, I think”?

Without thermometers, it’s still fairly easy to compare relative temperatures. Various substances soften, melt, and burn at different temperatures. One can make observations such as “Today it’s hot enough to soften sheep fat, but not beeswax; yesterday it was hot enough to soften beeswax,” or “Olive oil heated over a wood fire will burst into flame, but olive oil heated over a beeswax flame will not.”*

One could easily maintain a set of jars of various substances and gauge the relative temperature by observing which ones are hard, which have softened, and which have liquefied.
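The jars-of-reference-substances idea amounts to bracketing the temperature between melting points. A minimal sketch of that logic, with rough, assumed softening/melting figures (ice 0 °C, sheep tallow ~40 °C, beeswax ~63 °C):

```python
# Hypothetical reference substances with approximate melting points (Celsius),
# sorted from lowest to highest. The figures are illustrative assumptions.
REFERENCES = [
    ("water ice", 0),
    ("sheep tallow", 40),
    ("beeswax", 63),
]

def temperature_bounds(melted):
    """Return (low, high) bounds on the ambient temperature in Celsius.

    `melted` is the set of reference substances observed to be liquid.
    Anything liquid puts a lower bound at its melting point; anything
    still solid puts an upper bound there.
    """
    low, high = float("-inf"), float("inf")
    for name, melting_point in REFERENCES:
        if name in melted:
            low = max(low, melting_point)
        else:
            high = min(high, melting_point)
    return low, high

# The ice has melted but the tallow jar is still firm:
print(temperature_bounds({"water ice"}))  # -> (0, 40)
```

More jars with more closely spaced melting points narrow the bracket; the precision is set entirely by the spacing of the reference substances.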

Metals also expand and contract as the temperature changes. Again, without a thermometer, you can’t know the absolute temperature, but you can certainly make an observation like “When it’s cold, this iron rod will fit into this wooden cavity; when it’s hot enough that the iron rod doesn’t fit into the cavity, it’s too hot to do (something you shouldn’t do when it’s hot).”
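A back-of-the-envelope check on how much expansion the rod-and-cavity gauge would actually have to detect, assuming a linear expansion coefficient for iron of roughly 12 × 10⁻⁶ per kelvin:

```python
# Approximate coefficient of linear expansion for iron, 1/K (an assumption;
# real values vary a bit with the alloy).
ALPHA_IRON = 12e-6

def expansion_mm(length_m, delta_t_k):
    """Change in length, in millimetres, of an iron rod of the given length
    for the given temperature swing."""
    return length_m * ALPHA_IRON * delta_t_k * 1000

# A 1-metre rod going from a 0-degree winter day to a 35-degree summer day:
print(expansion_mm(1.0, 35))  # -> 0.42 (mm)
```

Less than half a millimetre on a metre-long rod, so the wooden cavity would need to be a very snug fit for the test to work, which may be why the long-arm amplification idea discussed further down is attractive.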

*These are hypotheticals. Don’t try to warm olive oil over a beeswax candle and then come running to me because you set fire to the curtains.

I don’t know about colloquial descriptions of, say, how hot the day was, but one practical need for measuring temperature was knowing how hot an oven was for baking. As I was told this weekend (something I’d heard before), in Colonial days you gauged the temperature of your brick oven by how long you could stand to hold your arm in it. So you could argue that people measured temperature indirectly through time.

You also had to know the temperature in kilns for firing clay pottery. I don’t know how they did that – the how-long-can-you-hold-your-hand-in-there gom jabbar test won’t work if it immolates your hand. Josiah Wedgwood is supposed to have invented pyrometric beads circa 1782, and kiln cones came after that – but that was after the invention of the thermometer.

Regarding thermometer principles, they were known to the ancients, although they didn’t use them to make practical thermometers:

It was 1593 or so before you get any sort of “device,” and 1641 for the first “modern” thermometer.

THERMOSCOPE

Just a couple of things come to mind. I recall BBQ guides telling you to test the temp of your grill by how long you can keep your hand over it.

Gotta figure there have long been cooking guides with analogues for temperature, such as in candy making.

And metal working has long relied on color to gauge the temp of metal.

Ambient air temperature? Well, I’d figure frost/ice would have been an obvious signifier in temperate zones.

When all you can measure is “a little warmer than yesterday” or the like, that’s all that you need to be able to measure. In fact, this principle applies to measurements of all sorts: If there is a reason why a measurement needs to be made to a certain precision, then at least one technology exists to make that measurement to that precision.

For example: Suppose I need to know the temperature precisely enough to know whether it’s above or below the freezing point of water. I can measure that by putting out some water, and seeing if it freezes. Suppose I need to know if a forge is hot enough to melt bronze. I can measure that by putting in some bronze, and seeing if it melts. Suppose I need to know if it’s hot enough out to dry a pile of hay in twelve hours. I can measure that by putting out a pile of hay for twelve hours, and seeing if it dries out.

Now, often, there are other measurement methods which are more convenient. But there’s always at least that one.

Toss in some flammable material and see how long it takes to char/catch fire.

I’m guessing that, like blacksmiths with iron, they did it by looking at the colour of the glow of the kiln.

We still use such measures, colloquially - “red hot”, “white hot”, etc.

Naturally it would depend on the individual characteristics of a particular kiln and what it is made of. The potters would work this out by trial and error, basically (“the bricks inside must glow a dull red; fire for six hours” — that sort of thing).

That sounds interesting, and I thought a little about it. Is it a generally accepted theory (with a name), or something you formulated yourself? I’m asking because I was looking for counterexamples. The best thing I came up with was the problem of calculating longitude for ships at sea. Determining the latitude of a ship is comparatively trivial, but for longitude it had been known since at least the 16th century that it would be useful to have, on board the ship, a clock that kept the time at a fixed point of reference with a reasonable degree of accuracy. Nonetheless, it took until the 18th century for such a clock to become available (which was immediately acknowledged as a major breakthrough). That can be seen as an example of a measurement technology that was needed even though no technology existed that could deliver that measurement with the required precision.

I have cookbooks that tell me the correct temperature for making pancakes is when you drop water on the griddle and it skitters and sizzles before evaporating.

I have other cookbooks that tell me the correct temperature for making pancakes is 325F.

Remarkably (and conveniently), it mostly doesn’t. As long as the material in question is a blackbody (and most materials are fairly close to being blackbodies), the same temperature will always give you the same color. A red-hot stove burner is the same temperature as a red-hot forge is the same temperature as a red-hot star.
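The color–temperature link is Wien’s displacement law: a blackbody’s emission peaks at wavelength λ = b/T, with b ≈ 2.898 × 10⁻³ m·K. A small sketch of the numbers (worth noting: for a “red hot” object around 1000 K the peak is actually in the infrared, and the visible red glow is the short-wavelength tail of the curve):

```python
# Wien's displacement constant, in metre-kelvins.
WIEN_B = 2.898e-3

def peak_wavelength_nm(temp_k):
    """Peak blackbody emission wavelength in nanometres, by Wien's law."""
    return WIEN_B / temp_k * 1e9

# A dull-red kiln brick, an incandescent filament, a sun-like star:
for t in (1000, 3000, 6000):
    print(t, "K ->", round(peak_wavelength_nm(t)), "nm")
# 1000 K peaks around 2898 nm (infrared); 6000 K peaks around 483 nm (visible).
```

Either way, the same temperature gives the same spectrum, so the same glow color, regardless of the material, as long as it behaves roughly like a blackbody.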

If the principle has a name, I don’t know what it is, but I’d be surprised if I were the first to come up with it. But longitude isn’t actually a counterexample. You need to know longitude to answer questions like “If I sail due north from here, will I eventually hit Maine?”. And you can answer that question by sailing due north. It’s a really inconvenient way to measure longitude (as I said, there’s often a more convenient method), but it works.

There is the Galileo thermometer (though not actually invented by him). We have one and it works pretty well, though I’m not sure it could have been made as well back then.

I collect old cookbooks, and that indeed is a method in several for testing oven temperature (“quickness”): throw in a handful of flour and see how long it takes to scorch.

I was actually just thinking of this earlier today. Hope this anecdote makes sense. Yesterday I baked some biscuits from a can (Pillsbury, you know :o) at 350F, and everything went fine. I had noticed a bit of grease at the bottom of the oven when I put my pan in, but all went well. But tonight (having forgotten again about the bottom of the oven), I put in a frozen lasagna at 375F, and there was smoke all over the place. So whatever was at the bottom of the oven must have had a smoke point between 350 and 375. I thought back to what might have been the “whatever,” and remembered that I had made calzones a few days ago on a perforated pizza pan, with olive oil basted on the crust, and that was probably what dripped down and made the oven smoke a couple days later.

The main thing is, clean your oven before you heat it the next night. The secondary thing is, I figured this was essentially one way that pre-scientific cooks figured out appropriate temps for things.

Incidentally, I just want to mention that in pre-thermometer eras, people did have an idea of what sorts of foods required very hot temps, versus “could go for a variable amount of time at a medium temp,” versus “could sit quite a while in a cooling-off oven.” And they would plan their cooking/baking accordingly.

I don’t see your principle as meaningful, because anything which would be impossible to measure would simply be seen as unnecessary. Hence the only things which are “needed” are the ones which can be measured. It becomes a circular argument.

Don’t know if they did it, but you can amplify the expansion and contraction of, say, a metal rod by resting a long indicator arm with a pivot point upon it. Locate the metal close to the pivot point; the other end of the long arm then gives a wider indication of a small expansion or contraction. A concept that most ancients could have easily stumbled upon, or something like it.

Actually, the change might still be too small to be of use with the long-pointer scheme without further scheming. I looked up some expansion figures: amber has a high coefficient of expansion.

If you really want to use expansion of metals, then you use a bimetallic strip. The long arm still helps, but you can save a lot of space by coiling up that arm. And then you have the type of thermometer used in most pre-digital thermostats.
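The lever idea is just geometric magnification: if the rod pushes the arm at distance d from the pivot, the far tip at distance L moves L/d times as far. A quick sketch with illustrative figures (the 0.4 mm expansion and the arm dimensions are assumptions, not historical data):

```python
def tip_travel_mm(expansion_mm, pivot_to_rod_mm, arm_length_mm):
    """Tip displacement of a simple lever indicator.

    The rod's expansion is applied at `pivot_to_rod_mm` from the pivot;
    the pointer tip sits at `arm_length_mm` from the pivot, so the motion
    is magnified by the ratio of the two distances.
    """
    return expansion_mm * (arm_length_mm / pivot_to_rod_mm)

# 0.4 mm of rod expansion, applied 5 mm from the pivot, read off a 250 mm arm:
print(tip_travel_mm(0.4, 5, 250))  # -> 20.0 (mm of pointer travel)
```

A 50× magnification turns a barely visible expansion into an easily read pointer swing, which is essentially what the coiled bimetallic strip in an old thermostat does in a more compact package.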

I was thinking more the individual characteristics for firing. The bricks you see may be of a particular temperature judging by their color of glow, but the distribution of heat throughout the kiln will vary depending on how it is made - the center may not be as hot as the walls, so you will have to work out how long to fire your pots by trial.

Though come to think of it, that will be true if you are using cones as well …

In any event, before the invention of cones, I’m pretty sure they used this method.