The metric system: Make your case for it and why we should switch to it. (As if we haven't already)

It is also really easy to convert from Fahrenheit to Celsius in your head. 77 ℉ is 25 ℃, and each ∆5 ℃ is ∆9 ℉, e.g. 30 ℃ is 86 ℉, 35 ℃ is 95 ℉, 10 ℃ is 50 ℉, et cetera. Just memorizing the conversion at ∆5 ℃ increments gives you an intuitive feel for the Celsius scale, and after using it for a few days you can intuit it without doing conversions, just as when you go to a foreign country you get an idea of what standard things cost in the local currency without thinking about currency conversions, e.g. anything 25 ℃ and up is “short sleeve weather”, above 35 ℃ is really hot, below 15 ℃ is cool, et cetera. The only reason there is any ‘intuition’ about the Fahrenheit scale is because it is what we are used to.
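If you want to check yourself, here is a minimal sketch of that arithmetic in Python (the function names are just mine for illustration):

```python
def c_to_f(c: float) -> float:
    # Each 5 C step is a 9 F step, anchored at 0 C = 32 F
    return c * 9 / 5 + 32

def f_to_c(f: float) -> float:
    # Inverse of the above
    return (f - 32) * 5 / 9

# The delta-5 C anchor points from the post:
for c in (10, 25, 30, 35):
    print(f"{c} C = {c_to_f(c):.0f} F")
# 10 C = 50 F, 25 C = 77 F, 30 C = 86 F, 35 C = 95 F
```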

And no, nobody can tell air temperature down to single-degree increments of the Fahrenheit scale; the heat transfer between air and skin (which is what you actually ‘feel’) changes with humidity, and even a modest change in humidity will alter convective heat exchange in still air by more than a perceived change of a single degree Fahrenheit, not to mention @FinsToTheLeft’s point that the temperature of your home is not constant throughout unless you have massive heat sinks (e.g. rammed earth, or you live underground) to mediate it.

Stranger

Exactly! :slight_smile:

What do you mean? An African or a European ox?

But you are mixing two slightly different senses of “definition” here. The original definition of the metre (the sense of definition that corresponds to a foot being vaguely the length of a human foot) was that the distance from the equator to the pole is 10,000 km, i.e. the metre is one ten-millionth of that distance.

Whereas the definitions you cite are references to accessible and stable benchmarks, and similar benchmarks are required for feet and inches - or would be required, if anyone used them for any serious scientific purpose.

I grew up in the U.K. in a period when my parents’ generation would definitely always talk about the weather in Fahrenheit, but the official transition had taken place, and for all my childhood the weather forecast was given in both scales. And I live in the U.S., one of the few places where you’re still exposed to Fahrenheit, but I travel a lot, so I’m exposed to both scales regularly. So I don’t think I was ever strongly biased by habituation, and perhaps I’m an interesting anecdotal data point on whether Fahrenheit or Celsius is the more natural scale for everyday purposes.

The result is a little odd. Although I can interconvert fairly easily, I still have much better intuition for high temperatures in Fahrenheit. 60s is mild, 70s is pleasantly warm, 90s is a scorcher. But for low temperatures, I find Celsius much more intuitive, because zero is a good intuitive benchmark - I know how cold the weather is when ice forms.

I suppose the reason is that having 100 be the boiling point of water is not especially useful or intuitive in everyday life, because it’s outside the range of environmental temperatures that we experience. But the freezing point of water is a prominent natural environmental benchmark.

Likewise, I have no problems running laps on a 400 meter track, in preparation for a 10K, while wondering if my knees could stand up to another 26.2 mile marathon.

I agree that the granularity issue is bogus.

But the “decades” of temperature in Fahrenheit are a useful approximation, and perhaps that’s partly why I’m somewhat attached to them. I think 10 degrees Fahrenheit happens to be a very usefully sized range for casual conversations about the weather. “It will be in the 80s tomorrow”. 10 degrees C is too wide a range to use in the same way. Of course, people who use C all the time just use slightly different expressions that ultimately work fine, but I think there’s a genuine albeit minor point here.

Not to dispute your observation, which is true enough, but that is really just a colloquialism that comes from being used to the Fahrenheit scale rather than any absolute utility. It is easy enough to say that the temperature will be in the mid-20s or lower 30s on the Celsius scale, with the ∆10 ℉ divisions being roughly equivalent to ∆5 ℃ ranges. And in terms of dress, activities, et cetera, the most applicable divisions are actually in 20 ℉ frames, e.g. 60 to 80 ℉ is t-shirt weather, 40 to 60 ℉ is jacket weather, 20 to 40 ℉ is heavy sweater or layering weather, and <20 ℉ is insulated coat/parka weather (for most people). These roughly correspond to 15 to 25 ℃, 5 to 15 ℃, -5 to 5 ℃, and <-5 ℃, which are nicely centered on ∆10 ℃ increments. Where finer divisions actually matter is 35 ℃ (95 ℉) and up, as temperatures are then starting to approach the limits of human tolerance depending on relative humidity and the corresponding heat index.
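A quick sketch of those frame conversions, if you want to check them (the exact values; rounding to tidy ∆10 ℃-centered ranges is the approximation above):

```python
def f_to_c(f: float) -> float:
    return (f - 32) * 5 / 9

# The 20 F "dress code" frames from the post, converted exactly:
for lo, hi in [(60, 80), (40, 60), (20, 40)]:
    print(f"{lo}-{hi} F = {f_to_c(lo):5.1f} to {f_to_c(hi):5.1f} C")
# 60-80 F =  15.6 to  26.7 C  (roughly 15 to 25 C, centered on 20)
# 40-60 F =   4.4 to  15.6 C  (roughly  5 to 15 C, centered on 10)
# 20-40 F =  -6.7 to   4.4 C  (roughly -5 to  5 C, centered on  0)
```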

Stranger

I doubt anyone would use a span of 10 degrees C as a casual range. A commonly-encountered range like “room temperature” would be 20–25 °C, and you could expect to see just that printed on a medicine label. As for a random weather report, I see phrases like “temperature in the shade will be 26 degrees around 20h… close to 20 degrees around 2 am… Friday morning, 23 degrees around 8 a.m.”

ETA what @Stranger said

Also,

Fine divisions or not, whether it is 30 or 35 or even a lot more (Dubai’s average summer high is more like 40), nobody is going to reach for layers or insulation to wear.

Sure, but if it is 35 ℃ you still go for a 5 km run, whereas if it is verging on 38 ℃ (and certainly if it is 40 ℃) you’ll probably forgo it, again depending on the heat index value. At those temperatures, a degree or two increase in Celsius makes a significant difference.
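For what it’s worth, the usual way to put a number on “depending on the heat index value” is the NWS Rothfusz regression; here is a minimal sketch using the published coefficients (an approximation only, valid above roughly 80 ℉, and the NWS applies small correction terms near the edges that I’ve omitted):

```python
def heat_index_f(t: float, rh: float) -> float:
    # NWS Rothfusz regression: t in deg F, rh in percent.
    # Approximation; intended for t >= ~80 F and rh >= ~40%.
    return (-42.379 + 2.04901523 * t + 10.14333127 * rh
            - 0.22475541 * t * rh - 6.83783e-3 * t * t
            - 5.481717e-2 * rh * rh + 1.22874e-3 * t * t * rh
            + 8.5282e-4 * t * rh * rh - 1.99e-6 * t * t * rh * rh)

# 35 C (95 F) vs 38 C (100.4 F) at 50% relative humidity:
print(round(heat_index_f(95.0, 50)))    # ~105 F: oppressive
print(round(heat_index_f(100.4, 50)))   # ~119 F: dangerous
```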

Stranger

Are required. The inch is defined as 2.54 cm exactly, which makes it as precisely and concretely defined as the meter itself. And so on for derived units such as the foot, mile, and gallon.

As a general note, I never understood the lumping of common divisions of units with the system of units itself. The fact that inches are commonly subdivided by powers of 2 is unrelated to US customary units and their definition. Indeed, in any serious work, inches are not used that way: decimal inches are used, usually to a precision of 1/1000. What makes the metric system decimal is that there is only one base unit for each quantity, and a prefix system for creating variations. US customary units fail that due to the numerous base units for the same measure, not because most (but not all) US rulers have fractional inches.
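To make the “defined exactly in metric” point concrete, here is a small sketch with the values fixed by the 1959 international yard and pound agreement:

```python
# Exact metric definitions of US customary length units (1959 agreement):
INCH_M = 0.0254          # 1 inch = 2.54 cm exactly
FOOT_M = 12 * INCH_M     # 0.3048 m exactly
YARD_M = 3 * FOOT_M      # 0.9144 m exactly
MILE_M = 5280 * FOOT_M   # 1609.344 m exactly

# The US gallon is 231 cubic inches exactly:
GALLON_L = 231 * (2.54 ** 3) / 1000  # 3.785411784 litres exactly

# "Decimal inches to a precision of 1/1000" means working in thous:
THOU_MM = 25.4 / 1000    # one thousandth of an inch = 0.0254 mm
print(FOOT_M, MILE_M, GALLON_L, THOU_MM)
```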

I agree with the OP. The US has gone metric in a large number of ways, but has only done so when that conversion has made sense (a very few things like 2 liter bottles of soda notwithstanding).

Science, math, engineering, auto repair, etc. It makes sense to standardize these things, and it has largely, if not completely, been done. But whether to measure a speed limit in mph or km/h? It doesn’t benefit the public to make that conversion. To be like the EU and ban small farmers from selling tomatoes by the pound, mandating that they sell them by the kilogram instead? Why? What is the point other than to pander?

I don’t see how the US or any country would suffer by keeping purely internal measurements like speed limits, even if that country used furlongs per hour.

Centigrade is not actually part of the Metric system. F is just as “metric” as C. The issue with C is that temps can easily dip below 0 but not so much with F. How often do you really need to know the boiling point or freezing point of water? You just cook until it boils, freeze until it freezes.

The issue to me with the metric system is that the French deliberately designed it to piss off the British.

And, for what it is worth, the USA is on the metric system, since food, etc. is labeled in both. Two-liter bottles, e.g.

And Celsius specifically asked that the Centigrade system not be named after him, and he put freezing at 100 and boiling at 0.

What the hell is a meter and why is it 39 inches or 100 cm?

I’m laughing out loud at the pro-Celsius arguments to the effect that “you don’t need the precision of Fahrenheit.” Really. LOL.

Yes, there is a difference between setting my AC at 74 degrees or 72. Quite a bit.

Where I live, we have winter and summer. 100 degrees, that’s the hottest day of the year. 0 degrees, for a high, that’s pretty well the coldest. 32 is freezing, but in January, that’s mild. 100 to 0, hottest to coldest in common use, seems to work well for me.

Boiling point of water is arbitrary scientifically and useless practically. It’s 100 Celsius outside, I’m dead. My stove goes far higher than that.

I completely get why science and industry adopted metric. In cooking, halves and quarters are more useful than tenths. Otherwise, metric rules. Except for Celsius, which is inferior. But it got sold politically with metric, so metric adherents feel compelled to defend it.

I agree with you that Fahrenheit is better for human measurement than Celsius. But back to my argument, even if we assume that the other posters are correct, what you get is that both scales, at best, have equal value.

So why spend money, time, and energy to change hearts and minds in the US to convert to Celsius when all it affects is how the domestic populace expresses temperature? When there is no good reason, that is when people start saying it is a global communist conspiracy to destroy America.

The argument for metric has actually gotten weaker since the 1970s, as its number one argument (ease of conversion) is defeated by the fact that almost everyone has a smartphone in their pocket to easily convert feet to miles to kilometers to centimeters to inches.

It did not really get “sold politically” [do you have references for such a thing?], nor is it an official SI unit. The point is that, the way temperature used to be defined, it was a practical laboratory standard, at least in the early 18th century, with round numbers which one could realize by melting and boiling pure water. Hardly “inferior” to the half-dozen similar scales in use around the same time.

The proposed good reason is, that way everybody uses the same scale. Including Communists, I suppose.

ETA: changing “hearts and minds” does not cost money, time, and energy; what does is changing the readout on all the thermometers and updating all the documentation (the latter may be cheap if posters are correct that American manufacturing already uses metric units anyway).

For consistent baked goods, it’s metric all the way. There is just no substitute, as cup measurements have too much human error involved. If you doubt that, spoon flour into a cup and scrape off the excess; weigh it. Now dip the cup into the bag, scrape, and weigh it. Finally, dip the cup into the bag, pushing the flour down slightly, then scrape and weigh. Compare that to weighing out 120 g of flour each time. Even small differences in weight can mean the difference between a nice biscuit and a hockey puck.
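As an illustration only (the gram weights below are hypothetical, not measurements), here is the kind of spread you can expect from those three methods versus a scale:

```python
# Hypothetical cup-of-flour weights for the three methods described above;
# the numbers are illustrative, not measured data.
cup_weights_g = {"spooned and scraped": 120,
                 "dipped and scraped": 145,
                 "dipped and packed": 160}

TARGET_G = 120  # the recipe weight suggested in the post
for method, grams in cup_weights_g.items():
    error_pct = (grams - TARGET_G) / TARGET_G * 100
    print(f"{method}: {grams} g ({error_pct:+.0f}% vs {TARGET_G} g)")
# A dipped or packed "cup" can carry 20-30% more flour than intended;
# weighing removes that variable entirely.
```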

I don’t think those words mean what you think they mean.

Stranger

Fun trivia:

In a more up-to-date temperature scale like ITS-90, the boiling point of water is not used as a fixed point. The scale goes from the triple point of water (0.01 °C) to the melting point of gallium (29.7646 °C) and then jumps all the way to the freezing point of indium (156.5985 °C). Pure metals are easier to work with than boiling water, which would require a difficult pressure measurement (not that pressure and thermometer immersion depth aren’t also taken into account with the metals).
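For reference, those fixed points as data (values as given above):

```python
# ITS-90 fixed points mentioned above, in deg C:
ITS90_FIXED_POINTS_C = {
    "triple point of water": 0.01,
    "melting point of gallium": 29.7646,
    "freezing point of indium": 156.5985,
}

# Note the gap: no fixed point anywhere near water's 100 C boiling point;
# ITS-90 interpolates across it with platinum resistance thermometers.
for name, temp_c in ITS90_FIXED_POINTS_C.items():
    print(f"{name}: {temp_c} °C")
```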

My gal is red hot. Celsius or Fahrenheit?