Are there any at home ways to test if my body thermometers are registering the correct temperature?

When you describe it as a pointless hypothetical project it gets a lot more interesting :slight_smile:

What comes to mind right now is that you are just making another thermometer that will need calibration. Electronic thermometers use the varying resistance of materials with temperature to get a very good reading, and can be calibrated for consistent and accurate readings across a wide temperature span. A thermometer based on the phase change of a substance might work well on the same basis: a tiny tube filled with palm oil or gallium might show a distinct change in resistance at the melting point of its contents, providing a very consistent reading at particular temperatures. Any thermometer like that still needs refinement to deal with unsteady temperatures and the time it takes the sensor to change temperature.
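To make the resistance-to-temperature idea concrete, here is a minimal Python sketch of the Steinhart-Hart equation, one common model for thermistor-based electronic thermometers. The coefficients are typical-looking placeholders, not values for any real sensor:

```python
import math

# Steinhart-Hart model for a thermistor: 1/T = A + B*ln(R) + C*ln(R)^3
# The coefficients below are placeholders for illustration; real values
# come from the sensor's datasheet or its calibration.
A, B, C = 1.129e-3, 2.341e-4, 8.775e-8

def thermistor_temp_c(resistance_ohms):
    """Convert a measured thermistor resistance (ohms) to Celsius."""
    ln_r = math.log(resistance_ohms)
    kelvin = 1.0 / (A + B * ln_r + C * ln_r ** 3)
    return kelvin - 273.15

print(round(thermistor_temp_c(10_000), 1))  # about 25.0 with these coefficients
```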

Gallium melts at 85.58 °F. And it’s pretty cheap. If you can figure out a way to make a solid-liquid slurry of gallium (like ice cubes in water), you can use it to calibrate the thermometer.
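(For reference, that figure is just the Fahrenheit conversion of gallium's melting point of 29.7646 °C, a defined fixed point on the ITS-90 temperature scale: 29.7646 × 9/5 + 32 ≈ 85.58 °F.)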

On edit: looks like Napier beat me to it.

Yea, the “proper” way to calibrate one of these thermometers is to compare it to a calibrated PRT or SPRT in a stirred liquid bath. We have some at work. A standard thermistor would also work.

Found with a quick Google search (and unabashedly plagiarized):

I’m an analytical chemist, here’s how I would do it. Take a cooking thermometer, calibrate it and check the medical thermometer against it. This is why chemists (or students taking college chemistry) need to get good at algebra. It’s not too difficult. But it’s non-trivial. I’m going to work in Celsius so set your thermometers to Celsius mode. (It makes the calculations a tiny bit simpler)

Boil some water and check with the cooking thermometer. Record the temp. Now take some ice and stir it with the cooking thermometer until the temp stabilizes. Now take the linear equation:

At + B = T

where A and B are unknown constants you're solving for, t is the actual temperature of the water bath, and T is the reading on the thermometer.

You will end up with a system of two equations in the unknowns A and B: one for 100 degrees Celsius, the boiling water, and one for 0 degrees Celsius, the ice bath. Let's call Th the reading in the hot water and Tc the reading in the ice bath. So, 100A + B = Th

0A + B = Tc -> B = Tc. We have now pinned down B.

100A + Tc = Th -> A = (Th - Tc)/100. Now we know A.

(Th - Tc)t/100 + Tc = T. Now we substitute A and B into the original equation, and we have an equation for the measured temp in terms of the actual temp. But we need the actual temp in terms of the measured temp, so we have to rearrange things a bit.

T - Tc = (Th - Tc)t/100. We just subtract Tc, moving it to the other side.

100(T - Tc)/(Th - Tc) = t. Now we have the actual temp t in terms of the measured temp T.

So the final equation for calibrating the cooking thermometer is:

100(T - Tc)/(Th - Tc) = t

Where

T is the temperature you measure, i.e., read off the cooking thermometer.

Th is the temperature you read off the cooking thermometer in the boiling water.

Tc is the temperature you read off the cooking thermometer when you are stirring the ice bath (make sure you stir it with the thermometer).

t is the actual temperature of whatever the cooking thermometer is measuring, corrected by the calibration you just did.
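Here is that formula as a minimal Python sketch (the function name is mine, and it assumes the boiling and ice-bath references really are 100 and 0 C, i.e., roughly sea-level pressure):

```python
def true_temp(reading, th, tc):
    """Convert a cooking-thermometer reading (Celsius) to the actual
    temperature, given the two calibration readings:
      th = reading taken in boiling water (assumed to be truly 100 C)
      tc = reading taken in a stirred ice bath (assumed to be truly 0 C)
    Implements t = 100*(T - Tc)/(Th - Tc) from the derivation above."""
    return 100.0 * (reading - tc) / (th - tc)
```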

Now we put it all together.

Take a pot of water and bring it to a good boil.

Measure the temp with the cooking thermometer. This is Th. Write it down, even if it's exactly 100.

Now take a cup of crushed ice with a little water.

Stir the ice and water with the cooking thermometer.

The temp you measure is Tc. Write it down, even if it’s exactly 0.

Now take a cup and put some tap water in it. Styrofoam cups work best, as the temp will stay more stable. Heat the cup in the microwave in 30-second bursts until it's between 37 and 38 degrees C. If you get it too hot, you can stir in some cold water until it cools down enough.

Now, measure the temp with the cooking thermometer. This is T, write it down.

Now measure the water with the medical thermometer. Write that down too.

Now, plug those numbers into the formula we derived:

100(T - Tc)/(Th - Tc) = t

Compare that to what the medical thermometer says.
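As a worked example with made-up numbers (hypothetical, just to show the arithmetic):

```python
# Hypothetical readings, purely to illustrate the formula.
th, tc = 101.5, 0.5   # cooking-thermometer readings in boiling water and the ice bath
T = 38.0              # cooking-thermometer reading in the warm cup

t = 100.0 * (T - tc) / (th - tc)
print(round(t, 2))  # 37.13 -> the corrected temp to compare to the medical thermometer
```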

How badly do you want this ? :wink:

I recently had some problems with medical types being dismissive of home thermometers. If I had been more confident of my home thermometers' reliability and accuracy, I could have pushed the medical people more. As it was, I assumed they were right, because I did not trust the accuracy and reliability of my thermometers.

Three home thermometers all report different things. A child with a wildly swinging fever. What I see on my end is a child with a temperature between 99 and 107, while the “control” adults have temperatures between 97 and 99. What the medical people see is a child who is clearly ill, but with a temperature of 99-100. So: make sure the kid gets plenty of fluids, and check back in a few days if she’s not better.

Repeat the previous thing for two weeks. Finally the doctors see a temperature of 106, which prompts all sorts of excitement involving lots of tests and a hospital admission. (A kidney infection, which was treated by antibiotics, and everybody is fine.)

So, accuracy and reliability are important. Did the 106 I was seeing really mean 102, and no big deal, some acetaminophen and plenty of liquids, or was it 108 and get to the ER? Did the changing temperatures mean the thermometer was unreliable, that the temperature was changing hourly, or both? How am I supposed to be anything other than a helicopter parent repeatedly going to the doctor with a kid with a 100 degree temperature, if I don’t trust my own measurements? If I had trusted that the 106 I’d measured was real, then I could have confidently pushed the doctors to do some tests, instead of thinking I’d just been overreacting.

Thermocouples are not necessarily linear. I used to calibrate Type N, K, R and S probes for exacting temperatures (one example: stay within a 50 deg F window at 2600 F) and there was always a curve between the certified source and the reading device. I could nail the end temps (2100 and 2800, fer instance) and be 5-7 degrees off at 2540.

I also build and calibrate thermocouples, and I don’t understand what you’re saying here.

Yes, the raw voltage vs. temperature curve for a thermocouple is not linear. But the readout device knows this, and uses a high-order polynomial to convert voltage to temperature. When performing a comparison calibration - e.g. between a thermocouple readout and SPRT standard - a temperature metrologist is only concerned about the error at each point.
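For a sense of what the readout is doing internally, here is a rough Python sketch. The coefficients are placeholders (an assumption for illustration); a real instrument uses the published NIST ITS-90 inverse-polynomial tables for its specific thermocouple type and range:

```python
# Sketch of a readout converting thermocouple EMF to temperature.
# COEFFS are hypothetical placeholders, not real table values.
COEFFS = [0.0, 25.0, 0.08, -0.25]  # t = c0 + c1*E + c2*E^2 + c3*E^3

def emf_to_temp_c(emf_mv):
    """Evaluate the inverse polynomial t(E) for an EMF reading in millivolts."""
    return sum(c * emf_mv ** n for n, c in enumerate(COEFFS))

print(round(emf_to_temp_c(1.0), 2))  # 24.83 with these placeholder coefficients
```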

This is simply writing out in an incredibly convoluted way what I said above at post #2 - check the error on an ordinary thermometer at boiling and freezing temperatures, and assume any error is linear for the intermediate temperatures. It’s nothing more complicated than saying if the thermometer reads 4 degrees too high at boiling, and 2 degrees too high at freezing, it’s probably going to read about 3 degrees too high halfway between. The arithmetic is trivial and does not need to be done to any great precision, and if you don’t already have the basic numeracy to be able to do that calculation in your head in a few seconds, this verbose overly-complicated account isn’t going to help you.
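Using the numbers from that example, the interpolation looks like this (a sketch; the 4-degree and 2-degree errors are the ones named above):

```python
def estimated_error(reading, err_freeze=2.0, err_boil=4.0, freeze=0.0, boil=100.0):
    """Linearly interpolate a thermometer's error between its
    ice-bath and boiling-water calibration errors (Celsius assumed)."""
    frac = (reading - freeze) / (boil - freeze)
    return err_freeze + frac * (err_boil - err_freeze)

print(estimated_error(50.0))  # 3.0 -> about 3 degrees too high at the halfway point
```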

The greatest potential source of error in this approach is not the difficulty of the algebra; it is the failure to account for pressure deviation from sea level at the boiling-water reference point.
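To put a rough number on that effect, a commonly quoted rule of thumb (an approximation; the true value also moves with the day's barometric pressure) is that water's boiling point drops about 1 °F for every 500 ft of elevation:

```python
def boiling_point_f(elevation_ft):
    """Approximate boiling point of water at a given elevation (ft).
    Rule of thumb only: about 1 F lower per 500 ft above sea level;
    barometric pressure shifts this from day to day."""
    return 212.0 - elevation_ft / 500.0

print(boiling_point_f(1000))  # 210.0
```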

I ran some tests using water and my cooking thermometers. I have three kitchen thermometers, but only the Thermapro seemed to really be accurate. My three oral thermometers all seem to be pretty consistent with each other when using the water bath, but have some variation when actually taking an oral temperature. This leads me to believe that the variations I’m seeing in the oral thermometers are due to variations in where it is placed under the tongue.

Boiling point based on my pressure/elevation: 211.44 °F
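(Plugging that into the rule-of-thumb sketch above, 212 − 211.44 = 0.56 °F corresponds to very roughly 0.56 × 500 ≈ 280 ft of elevation, or an ordinary low-pressure day.)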

Cooking Thermometers:

Thermometer Boiling Freezing
Thermapro 212.2 32.4
Reita 210 35
Oneita 209 Lo

The Reita and Oneita are regular probe thermometers and they didn’t seem very accurate with all the other tests, so I excluded their values.


Water immersion test: I used a Yeti mug filled with water at different temperatures and put all the thermometers in at the same time.

ThermaPro 105.3 103.8 102.6 101.5 98.8 98.1
Oral1 104.7 102.7 101.8 100.7 98.3 97.2
Oral2 104.7 102.7 101.9 100.7 98.3 97.2
Oral3 104.7 102.7 101.9 100.7 98.3 97.2

The oral thermometers are all pretty consistent with each other.
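If you want a single correction number out of that table, here is a quick sketch. Treating the ThermaPro as the reference is an assumption; Oral2 and Oral3 differ from Oral1 by at most 0.1, so Oral1 stands in for all three:

```python
# Readings from the water immersion test above (degrees F).
thermapro = [105.3, 103.8, 102.6, 101.5, 98.8, 98.1]
oral1     = [104.7, 102.7, 101.8, 100.7, 98.3, 97.2]

offsets = [o - t for o, t in zip(oral1, thermapro)]
mean_offset = sum(offsets) / len(offsets)
print(round(mean_offset, 2))  # -0.78 -> the oral units read about 0.8 F low here
```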


Oral thermometers all in my mouth at once

Oral1 97.4 97.4
Oral2 97.2 97.5
Oral3 97.5 97.4
Touchless 97.3 97.3

A little more variation. Seems to depend on where the thermometer is placed.


Oral thermometers one at a time. In this test I used one oral thermometer at a time and compared it to the touchless.

Oral1 97.2 97.4
Oral2 97.2 97.5
Oral3 97.3 97.6
Touchless 97.3 97.3 97.3 97.3 97.3 97.3

I wasn’t able to test the touchless thermometer with the water since it did not get a reading. I guess something about the way it uses IR for the temperature doesn’t work with water. I’ll have to wait until someone has an actual fever to test the touchless thermometer to see how it compares to the oral thermometers at different temps.

When I did a similar test with my two oral thermometers, I had the Thermapen at 101.9, one oral at 102.2, and the other at 104.2. The difference between those two oral thermometers is the difference between “contact doctor within 24 hours” and “seek care now” on the fever guide I was referred to.

Just my experience of 20 years in industry doing the weekly calibration verification and adjustments on the dozen or so units we used. Never had one completely linear.

How sure can we be that making something accurate at the freezing and boiling points of water means the range in between will also be accurate?

Personally, since this is about how doctors see it, I would calibrate against their thermometers. I’ve brought my blood pressure cuff there to make sure it aligned with their measurements, so I see no reason not to do so with thermometers. Measure yourself at the same time they measure you, and compare. (Though, admittedly, this might be harder in current times, since you’d be wearing a mask, and underarm temps are less precise. I guess it depends on how they check your temp at the office.)

Though, personally, my home thermometers tend to be consistent within a couple tenths of a degree. If one is wrong, it seems to be an outlier, and due to it being rusted inside. So I had assumed that thermometers would be tested and regulated to be accurate, unless they were malfunctioning.

I agree it does matter when what you should do depends on it. Especially now that we don’t want to unnecessarily go to the doctor.

There will always be error bars associated with measuring devices. The question is whether the error bars are small enough, in the range you need, for the measurement to be useful.

PoppaSan mentioned above that temps could be nailed at the end points but be off in the middle. I admit my guess would have been the reverse: that temps in the middle would be pretty accurate and the error bars would grow as you got to either end. Goes to show the importance of expert advice.