Time is so we can have a three-minute egg.
That’s so not what I’m saying. I understand the rest of the world uses metric. I understand its applications in math and science. I understand it has tremendous uses.
But for me, inches, feet, and yards work. Pounds and ounces work. It’s something I don’t have to think about. Of course we should use metric. But traditional measurements work too.
Any arbitrary system of measurement can work. But some work better than others for reasons stated here.
Right - and by doing so you erase any inherent advantage to metric. Metric is only “better” if you’re using it for its mighty decimal conversion advantages. If you’re just using a metric unit as a size on its own, then it becomes no better than any other arbitrary unit - or more specifically, its quality as an arbitrary unit becomes based entirely on how useful the unit is for what you’re using it for.
The fact you have 1/2, 1/3, and 1/4 liter measures in use is reality telling you that deciliters, centiliters, and milliliters are not useful for the task at hand, and that having an official commonly-used unit sized at a 1/2, 1/3, or 1/4 liter would not necessarily be an inconvenient thing - until you started using metric as metric again and the nondecimal sizes failed to convert, anyway. (A problem that persists when you use the fraction-of-liter measure - what’s a third of a liter in milliliters? Be precise.)
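The “be precise” jab above is checkable: a third of a liter has no exact milliliter value, because 1000 isn’t divisible by 3. A minimal Python sketch using the standard-library `fractions` module makes the point:

```python
from fractions import Fraction

# A third of a liter, expressed in milliliters.
third_liter_ml = Fraction(1, 3) * 1000

print(third_liter_ml)         # 1000/3 -- exact, but not a whole number of ml
print(float(third_liter_ml))  # ~333.333..., a repeating decimal that never terminates
```

Any decimal answer (333 ml, 333.3 ml, 333.33 ml) is an approximation; only the fraction is exact.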
Personally I have a hard time manually dividing something into three pieces, but I can see the value in having an official way to declare you want/have a third of something (which metric totally lacks).
As for why we don’t break down marathons into yards, it’s the same reason you don’t express height in nanometers: clarity and convenience.
It is when somebody’s claiming that the only reason standard people think metric has too many weird decimals is because they’re using it wrong, which was the terrible argument I was responding to.
Why on earth would I want to decimalize? What advantage would that give?
Using centimeters is “perfectly cromulent” if by that you mean “less cromulent”. There’s a simplicity to a system that implicitly categorizes people into normal, tall, short, really tall, and really short with a single digit, while providing a quick one or two digit gradation to apply a touch more precision at will.
Meters and centimeters are great for some things, I’m sure - expressing human height isn’t one of them.
I agree. One could make a case for kilograms, kilometers, and Celsius for their own uses – they’re not too much worse or better than Imperial – but feet and inches are perfectly graduated for human height.
The proper measure of a man?
And our Imperial measures are not all the same as American ones. American pints and gallons are smaller than British ones.
Wait, what? How is comparing someone who is 5’ 7" to someone 6’ 3" better than comparing 170 cm to someone 190.5 cm? True, I don’t visualize in metric now, but with use it would become second nature and I could picture those heights easily. How do you compare relative heights using English measurements?
What single digit are you talking about? I don’t see one in use anywhere, unless you’re talking about inches and are assuming everyone is in the 5 foot range.
I grew up with English measurements and have used them all my life, but saying that it’s implicitly better is really a weak argument. It’s familiar, and it’s tradition; those are the only real arguments I’ve seen presented.
Let’s be fair and assume that nobody’s ever going to claim to be 190.5 centimeters; they’d round that off the same way we do inches.
Which leads directly into what makes centimeters a poor unit here - they’re unnecessarily precise. You’re comparing 170 to 190 - you could knock a syllable off each just by using decimeters and get closer to reasonable, but nobody uses decimeters because they’re satanic or something.
The other advantage to the feet/inches thing regarding human heights is that it actively encourages rounding, in a situation where rounding is generally advantageous. Me, I’m probably around 6’ 3/4" tall. I call myself 6’1" if I’m feeling precise, and I call myself 6’ if I’m not, because that’s close enough for virtually all purposes.
It’s hard to get more efficient than “six foot” - that’s two syllables, including the unit. And the vast majority of heights can be expressed in two or three syllables - or you can fudge it to a height that does. “Six foot.” “Five five.” “Five two.” “Four foot ten.” (People in the five foot range can always leave off the unit; outside the range it varies based on context.)
Now, I don’t know how Brits refer to their height, but if they so much as mention the units they’re pretty much guaranteed to be using more syllables right there.
Good grief, the amount of effort you’re putting into a complicated rationalization of the logical superiority of the familiar arbitrary system is just astonishing. Do you imagine that Europeans are all out of breath because they sometimes need one or two extra syllables to communicate their height? Perhaps we should change “U.S.A.” to a one-syllable name to remain competitive with France?
:dubious: We’re discussing whether one system of units is marginally more suited to measuring a specific thing. Is anybody going to say you can’t measure a human in centimeters? No! Just that one system is marginally more suited.
And when talking about marginals, we’re obviously going to be dealing in minor details here. Goes with the territory.
Unnecessarily precise? That’s the argument?
Might as well switch to Celsius then. Temperature in Fahrenheit is unnecessarily precise.
Since you brought it up, Celsius always struck me as being poorly scaled for temperatures humans live in. It’s great if you’re, say, boiling water. If you’re a human living in boiling water you’re probably having a pretty bad day. Fahrenheit spreads a reasonable number of units across the temperatures people are likely to be setting their thermostats to, and has 90 degrees at just about the point where things are going to start to suck, and 100 degrees at just about the right point to mark where things start to really suck.
All this by complete coincidence, of course, and the lower 50 degrees of the Fahrenheit scale don’t really line up with anything sensible. And in this case I’ll cheerfully agree that sure, maybe having a bit over half as many gradations on the thermostat also works awesome. I haven’t tried it. It’s not nearly so obvious a difference in utility as the difference between feet and centimeters, but then the size difference between the units isn’t nearly so big either.
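The “a bit over half as many gradations” claim above is easy to quantify: one Celsius degree equals 1.8 Fahrenheit degrees, so a typical thermostat band covers proportionally fewer whole Celsius steps. A quick Python sketch (the 60–80 °F band is my own illustrative choice, not from the thread):

```python
def f_to_c(f):
    """Convert a Fahrenheit temperature to Celsius."""
    return (f - 32) * 5 / 9

# A common thermostat range: 60-80 degrees F spans 20 Fahrenheit degrees...
lo, hi = f_to_c(60), f_to_c(80)
print(round(lo, 1), round(hi, 1))  # about 15.6 to 26.7 degrees C
print(round(hi - lo, 1))           # ...but only about 11.1 Celsius degrees
```

So Fahrenheit offers roughly 20 integer settings over that band where Celsius offers roughly 11 - which is the whole granularity argument in two numbers.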
Yeah, it’s sad the effort being put into defending feet/inches for measuring human height.
Sad that the only refutations provided are to resort to a rather weak ad-hominem, you mean.
Or perhaps it’s sad that people think that basic logic, analysis, and debate count as any significant effort? That posting on a message board does? I suppose it’s possible that there might be people out there who do find such things difficult. Perhaps they don’t have but one finger and have to hunt and peck, nothing but hunt and peck, forever…
That is sad.
I’m not sure I see the ad-hominem? Your argument so far has been that somehow using feet and inches (two numbers with different units) is simpler than a single number that typically ranges between 150 and 200. You’ve yet to explain what’s simpler about English other than things that we’re familiar with seem intuitive.
It’s not as bad as feet vs meters though. With C vs F I can’t tell in any given circumstance if the temperature is 1F too hot or cold, or if I just need more air or time to cool off. But otherwise yes, 1C is pretty much the maximum threshold for detectable temperature difference: sometimes it’s smaller, like Fahrenheit. But on the other hand if it is 1C higher than what I am used to then I know it isn’t just me.
Well, “simpler” in terms of its raw numerical representation no, but “simpler” as in “more intuitive” arguably yes.
And that’s not because “foot” and “inch” are somehow more “natural” units in themselves, but rather because the human brain tends to process quantities better by breaking them into a small number of uniform largish chunks, with some easy fraction left over, than by parceling them out into a large number of uniform small units.
Which is why we have typically gravitated in our traditional metrologies to counting discrete objects out by, say, scores or dozens and a-few-left-over, and breaking down weights into pounds and ounces, or lengths into cubits and fingerbreadths, and so on.
The “Few Bigs Plus a Few Littles Left Over” type of metrology may be more cumbersome in terms of computation than the “Large Integer Number of Littles” type, but we seem to process it better when it comes to estimation.
Please explain how.
5′11″ = 180. Which is better and why?
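For what it’s worth, the equation above is itself already rounded: an inch is exactly 2.54 cm by definition, so 5′11″ works out to 180.34 cm, not 180. A minimal Python check:

```python
INCHES_PER_FOOT = 12
CM_PER_INCH = 2.54  # exact, by international definition of the inch

def height_to_cm(feet, inches):
    """Convert a feet-and-inches height to centimeters."""
    return (feet * INCHES_PER_FOOT + inches) * CM_PER_INCH

print(round(height_to_cm(5, 11), 2))  # 180.34 -- so "180" is itself a rounding
```

Which arguably supports the point made earlier in the thread: in practice, both sides round their heights to convenient values.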
A week is about how often you have to go to the market.