But that’s again an arbitrary distinction based on familiarity. Sure, 100 is about as hot as you’d want. But 0 is colder than you’d really want. Celsius is the reverse: 0 degrees as freezing makes sense, and that puts 40 degrees at about as hot a day as you’d expect. “40” is a bit arbitrary, but a range from 0 to 40, or even -10 to 40, as “reasonable” is totally “natural”.
The range of comfortable metric temperature is no less arbitrary but a degree Celsius is more ‘natural’ in terms of detectable difference to a human being than a degree Fahrenheit. Again, the only reason to really prefer Fahrenheit gradations is familiarity, not any natural distinctions.
Likewise, a centimeter is a pretty good unit. It’s smaller than an inch but still large enough to produce a reasonable range of heights. 150 to 200 fits the vast majority of human beings. A 50 centimeter range makes plenty of sense. Perhaps it’s because I have family in other countries, but the centimeter scale comes naturally to them. The foot/inch thing doesn’t.
It’s vastly easier for me to convert to centimeters than for them to convert to foot/inch.
And that’s probably the best way to tell. It’s easier to convert from the units I’m used to into metric than vice versa. I’ll admit a metric height doesn’t make intuitive sense to me. Nor does temperature in Celsius or weight in kilograms. But that’s on me. It doesn’t mean the system I’m used to is more ‘natural’. It just means I haven’t spent decades using it to the point it feels more intuitive.
I don’t think your thermometer reading 32 when mine reads 0 makes it more precise.
In fact, the thermometer on my wall shows both scales. It’s a cold day here and I’ve just aired the apartment: the temperature is almost 20 degrees on the Celsius scale and almost 72 on the Fahrenheit scale. In what respect does the Fahrenheit scale offer me a more precise reading?
I think it’s just a matter of resolution: the resolution of the Fahrenheit scale is indeed higher, but that doesn’t make it much more useful in everyday life, because one will barely sense the difference between 32 and 33 on the Fahrenheit scale, whereas the difference between 0 and 1 on the Celsius scale is noticeable.
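The resolution point above can be sketched with the standard conversion formulas (the specific temperatures below are just illustrative):

```python
# Fahrenheit <-> Celsius conversion, and the relative size of one degree.
def c_to_f(c):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 9 / 5 + 32

def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

print(c_to_f(0))    # -> 32.0  (freezing point on both scales)
print(c_to_f(20))   # -> 68.0

# A 1 degF step is only 5/9 of a degC step (~0.56 degC),
# which is why the Fahrenheit scale has the finer gradations.
print(f_to_c(33) - f_to_c(32))  # -> ~0.556
```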
I for one don’t mind using both scales at the same time.
An inch is a meaningful unit of difference. A foot is a meaningful unit of difference. A centimeter is too small, and a meter too large, to mark a meaningful difference in people’s height.
Like I said earlier, it isn’t just my familiarity, because distances in kilometers, temperature in Celsius, and weight in kilograms make approximately as much sense as their Imperial counterparts.
As the guy who apparently started this argument (then sensibly ducked out for a while), my point was that, even though traditional American measurement systems are the only ones I’m familiar with, trying to explain them to my son when he was young really brought me up against the unnecessary complexity and arbitrariness of our systems of measurement.
I’ll be thinking in feet and inches, in ounces and pounds, in quarts and gallons, until the day I die. But I couldn’t help but think it would be a hell of a lot easier explaining metric measurement to a four or five year old kid. (And of course, if he’d grown up thinking in terms of liters and centimeters and kilograms, they’d be just as natural to him as pounds and quarts and feet are to me.)
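The complexity point can be made concrete with the standard US customary conversion factors (these factors are common knowledge, not anything stated in this thread): every step up the customary chain needs a different memorized multiplier, while every metric step is a power of ten.

```python
# US customary length: a different multiplier at every step.
inches_per_foot = 12
feet_per_yard = 3
yards_per_mile = 1760

inches_per_mile = inches_per_foot * feet_per_yard * yards_per_mile
print(inches_per_mile)  # -> 63360

# Metric length: every step is a power of ten, so conversion
# is just moving a decimal point.
mm_per_m = 1000
m_per_km = 1000

mm_per_km = mm_per_m * m_per_km
print(mm_per_km)  # -> 1000000
```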
An inch is small enough that I never notice that small a difference in people’s height.
What is too small a difference for one person, isn’t necessarily for another. People were talking earlier about 1°F being too small a difference to notice. I may not notice it right away, but the difference between 72° and 73° at my desk is something I will notice over the course of a couple of hours. (I’d love to find a home thermostat that I could adjust at increments of half a degree Fahrenheit. Often a whole degree is a bigger adjustment than I want.)
This isn’t an argument that Fahrenheit is superior; it’s just that different gradations of measurement are meaningful or not to different people. Just because inches work better for you doesn’t mean they work better for everyone.
My wife keeps trying to squeeze our gas bill as low as she can by shaving Fahrenheit degrees off the thermostat setpoint. One day I noticed she had got it down to 60°F!
I thought I’d outsmart her, so I switched the controls to Celsius, hoping the larger increments would be more noticeable… Nope.
I was reading an article yesterday which made me think of this thread. It talked about how the standard of time was changed in 1967. For over fifty years now, our time system has not been based on astronomical events like the rotation of the Earth, the phases of the moon, or the revolution of the Earth around the Sun. Instead our time system is based on the vibrating speed of a caesium atom.
One second is no longer defined as 1/86,400 of the amount of time it takes for Earth to make one rotation, or 1/31,556,925.9747 of the amount of time it takes the Earth to make one revolution. A second is now defined as the amount of time it takes a caesium-133 atom to oscillate between two hyperfine states 9,192,631,770 times. All of our other time measurements, like minutes, hours, days, weeks, and years, are based on multiples of this amount of time.
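The arithmetic behind those definitions checks out directly (the constants below are the ones quoted above; the caesium frequency is the SI-defined value):

```python
# SI definition: one second is 9,192,631,770 periods of the radiation
# from the caesium-133 hyperfine transition.
CS_PERIODS_PER_SECOND = 9_192_631_770

# One mean solar day is 86,400 seconds (24 h * 60 min * 60 s).
seconds_per_day = 24 * 60 * 60
print(seconds_per_day)  # -> 86400

# Caesium periods elapsing in one day:
print(CS_PERIODS_PER_SECOND * seconds_per_day)

# The older ephemeris second was tied to the length of the tropical
# year 1900: 31,556,925.9747 seconds, i.e. about 365.2422 days.
seconds_per_year = 31_556_925.9747
print(seconds_per_year / seconds_per_day)  # -> ~365.2422
```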