Mandarin Chinese is the craziest language ever! I can’t understand a word of it.
OTOH, English is perfectly natural. I’ve been using English since I could talk, and the ease with which I converse in this language is a fair indication of its inherent superiority.
Don’t even get me started on how arbitrary Mandarin is. Of course you’d call a tree a “tree”; why make things more complex by calling a tree “mù” instead? How does that even make sense? It’s like the Chinese went out of their way to make Mandarin confusing.
Right, but which quirks you are willing to forgive is largely a matter of personal, subjective experience.
Obviously people like myself in Celsius-land have no difficulty living day-to-day thinking in, communicating in and comprehending Celsius temperatures–this demonstrates that it is a natural scale to work with for those of us used to it.
Similarly, to me Fahrenheit seems arbitrary and unnatural. But at least I can recognise that my impression is rooted in a lack of familiarity with the scale–funnily enough, there are many people for whom Fahrenheit seems A-OK.
The argument that one scale is superior because of some completely subjective set of criteria based on nothing more than a parochial outlook is, frankly, inane.
As an example: Living west of Sydney, Australia, temperatures here typically range between 5 and 30 degrees Celsius, going as high as the low 40s on very hot days. However, where I am, below freezing–below 0–is very unusual. So 0–40 is really the scale I work with, and for the climate I live in it feels very, very natural.
0: frost outside
10: cold
20: reasonable
25: pleasant
30: hot
40: summer scorcher
0 = about as cold as it gets for the places most people live.
30s = freezing to cold.
40s–60s = cold to cool.
70s = comfortable.
80s–90s = warm to hot.
100+ = damn hot.
The only difference is the digits.
If one system were intrinsically better than the other I’d expect more posts of people saying “I didn’t start using the Fahrenheit scale until recently, but I definitely find it superior” or whatever.
But the statement “The system I’ve always used is intuitive; yours is weird” is empty.
I think both Celsius and Fahrenheit are bizarre. Why do they both use that stupid counting method that only has 10 numbers between one and ten? How dumb is that!
I grew up in and live in the US. I’m used to the Fahrenheit scale. When I travel to Celsius countries, it’s confusing to me, but that’s only because my brain is calibrated to °F.
Logically, however, °C makes more sense. 0°C is the freezing point and 100°C is the boiling point for water. Perfectly logical. If we were to convert, it’d be an adjustment for a few weeks or so but we’d eventually get used to it. After a few years, especially after some very cold winters or very hot summers, our brains would be fully calibrated to °C.
If people want smaller increments, well, that’s what tenths and hundredths are for. And that applies to both °F and °C. Need half a degree? Okay, that’s a .5 in either °F or °C. No big deal.
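The conversion being discussed is just linear arithmetic, and fractional degrees pass straight through it. A quick sketch in Python (function names are mine, not from the thread):

```python
def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

def f_to_c(fahrenheit):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (fahrenheit - 32) * 5 / 9

print(c_to_f(0))              # freezing point of water -> 32.0
print(c_to_f(100))            # boiling point of water -> 212.0
print(round(f_to_c(98.6), 1)) # body temperature -> 37.0
print(c_to_f(0.5))            # half a degree C -> 32.9
```

Tenths work the same either way: there is nothing special about whole degrees in either scale.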
Thanks, Qagdop, this is helpful. How hot is it on Mercota now?
I’m used to Fahrenheit. But I’ve been abroad where Celsius was used. It doesn’t take long to get used to. Celsius temperatures for the weather were often reported with a decimal place, making them more precise than Fahrenheit reports, and more precise than I needed.
I think I can tell a difference in the temperature of my Dad’s house when he changes the thermostat by just a degree or two, because he typically has his house set at that temperature that is just a bit too cold for my taste, so I’m especially sensitive to it.
But in any other temperature range, I’d never know the difference of a single degree (97 or 98F is just fucking hot; 33, 34, 35F is pretty cold). But for some reason, right around 62 or 63F, I can feel even a single-degree change in temperature. Or maybe it’s all in my head?
I’m not sure ‘twice as hot’ is so terribly wrong. ‘Hot’ itself isn’t really a scientific term anyway.
If someone describes, say, molten sugar, at 185 Celsius as ‘nearly twice as hot as boiling water’, for example - I don’t think that need necessarily be considered invalid or stupid.
I realise it’s not twice the absolute temperature, and I realise the comparison doesn’t work in Fahrenheit.
But in the context of comparison to water (which the reader will regard as ‘not hot at all’ at freezing point), and in the context of the specified Celsius units, the statement has meaning.
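The point about ratios can be made concrete: the 185°C-vs-boiling comparison gives "nearly twice" only because the Celsius scale puts its zero at water's freezing point. On the absolute (Kelvin) scale the ratio is much smaller. A small sketch:

```python
def c_to_k(celsius):
    """Convert Celsius to Kelvin, the absolute temperature scale."""
    return celsius + 273.15

boiling = 100.0  # degrees C, boiling water
sugar = 185.0    # degrees C, the molten-sugar example from the thread

# Ratio on the Celsius scale (zero anchored at water's freezing point):
print(sugar / boiling)                   # 1.85 -> "nearly twice as hot"

# Ratio on the absolute scale (zero at absolute zero):
print(c_to_k(sugar) / c_to_k(boiling))   # ~1.23 -> nowhere near twice
```

So "twice as hot" is meaningful only relative to a chosen zero, which is exactly why the comparison falls apart when the same two temperatures are expressed in Fahrenheit.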
Ooh, that gives me an idea. You could have a temperature scale with “0” being the average temperature of human skin. An object of temperature 0 would not feel hot or cold. Objects with negative temperature would feel cold, and those with positive temperature would feel hot. Then, you could actually say with some degree of accuracy that an object with temperature 20 degrees at least feels “twice as hot” as one that is 10 degrees.
Of course, skin temperature probably varies between people and at different times of day, so you’d have the inevitable crowd of whining people all complaining that they touched an object at -1 degrees and it still felt hot to them.
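The proposed scale is just Celsius shifted so that zero sits at skin temperature. A minimal sketch, assuming an average skin temperature of about 33°C (that figure is my assumption, and as the post notes, it varies from person to person):

```python
SKIN_TEMP_C = 33.0  # assumed average skin temperature in Celsius

def c_to_skin_scale(celsius):
    """Map Celsius onto the proposed scale: 0 = feels neither hot nor cold."""
    return celsius - SKIN_TEMP_C

print(c_to_skin_scale(33.0))  # 0.0  -> neutral to the touch
print(c_to_skin_scale(43.0))  # 10.0 -> feels hot
print(c_to_skin_scale(23.0))  # -10.0 -> feels cool
```

On this scale, +20 is twice as far from "neutral" as +10, which is the sense in which "twice as hot" could be made roughly meaningful.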
I like this idea, but I don’t think our perception of temperature scales with its numerical value. Maybe in small regimes (that is, close to body temperature) this is the case, but +100 and +300 will probably feel equally hot. Actually, +300 may feel colder than +100 due to complete obliteration of the nerves.