I posted this 10 years ago.
Neither is sexagesimal (base-60) time to begin with… how can anyone not be confused by a system that breaks an orbit into 365/24/60/60/1000? It’s just bizarre any way you look at it.
Swatch once tried to make it all better. If they succeeded, it’d be @838 right now instead of 12:07pm.
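For reference, Swatch’s .beats just divide the day into 1000 equal parts counted from midnight Biel Mean Time (UTC+1), so one beat is 86.4 seconds. A minimal sketch of the conversion (the function name is my own, not anything official):

```python
from datetime import datetime, timezone, timedelta

def swatch_beats(now_utc=None):
    """Convert a UTC time to Swatch .beats: 1000 beats per day,
    counted from midnight Biel Mean Time (UTC+1, no DST),
    so one beat = 86400 s / 1000 = 86.4 s."""
    if now_utc is None:
        now_utc = datetime.now(timezone.utc)
    bmt = now_utc + timedelta(hours=1)  # shift to Biel Mean Time
    seconds = bmt.hour * 3600 + bmt.minute * 60 + bmt.second
    return int(seconds / 86.4) % 1000

print(f"@{swatch_beats():03d}")  # e.g. @838 at about 19:07 UTC
```

Which is how @838 lines up with 12:07pm for someone seven hours behind UTC.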
Those are absolutely terrible – so much so that I think any car that still has them also offers an analog display (at least optionally selectable). An analog speed display instantly and intuitively conveys where you are on the overall scale of speed, whereas a display of just a number requires more mental processing, even if that isn’t intuitively obvious – i.e., it’s easier to make mistakes when you’re distracted.
Oh, those young people! … No, I have actually not noticed anything of the sort.
It was not Swatch originally; it was the French and Chinese. Though there is nothing unnatural about base 60; the point is that decimal time is revolutionary (cf. the metric system vs. Roman units of measurement).
Have to hard disagree there, but it could just be a matter of what you’re used to? Those gauges are absolutely useless for me, whereas the numbers can be instantly read and don’t require any thought…
I know many people who are not like that, though, and insist on the analog dial.
No, Swatch tried to invent (as a marketing gimmick) a different timekeeping system. They didn’t invent the base-60 measurements. (I thought the Sumerians did?)
I was referring to this
https://mathshistory.st-andrews.ac.uk/HistTopics/Decimal_time/
and this
Some facts … (OK, shamelessly cribbed from Google AI, but a good summary of what I believe about this issue):
- The human brain processes information about objects’ spatial relationships very quickly and uses this for intuition. With an analog gauge, a person can process the needle’s position with just a glance, even using their peripheral vision.
- Analog displays are better for detecting a dynamic change or the rate of change. For a driver, a rapidly sweeping speedometer needle provides an immediate, intuitive sense of increasing speed that a changing number can’t match. In contrast, a digital display requires the brain to read and compare a series of changing numbers to understand the rate of change.
- An analog gauge displays the current value in context with the entire possible range, including normal operating ranges and danger zones. Seeing a needle positioned in the middle of a marked green zone immediately tells a person that conditions are normal. A digital display, which only shows the current number, requires the user to memorize and recall what constitutes a normal value.
- The ability to process analog gauges using peripheral vision means less time with eyes off the road for a driver, which enhances safety. While a digital number requires direct focus to read, an analog needle’s position can be picked up more casually.
All that may be true, but that’s mostly extraneous information. All I need to know is “what number am I at right now, and how much higher is it than the number on the speed limit sign”. My car simplifies that even further, just showing a white icon for under the speed limit and red when you’re over.
The rate of acceleration, % of max speed, etc… that’s not useful information, just excessive detail.
If the current speed limit were somehow integrated into the analog gauge and you could easily see how close you were to that limit, like an overheat meter, that’d make more sense. But as a disjointed gauge it’s not very useful, since what matters is your speed relative to the limit, not your absolute speed.
I think it’s fascinating that you prefer numbers to pictures. I once worked for an actuary who couldn’t read graphs. Maybe he was like you. He was an excellent actuary, and could process lists of numbers quickly. But he just didn’t get much from graphs.
I’m pretty sure you and he are in the minority. Lots of studies have found that people (on average) understand pictures faster than numbers. (And a speedometer is a well-designed picture of your speed.)
I’m sure there are individual preferences/differences involved, but even then, I wouldn’t say it’s generally true that I prefer numbers over graphics. I usually don’t, in fact. Given a table of numbers or data stream, I will absolutely try to pivot, summarize, and then graph it, especially when looking for time-series trends or correlations or cyclical patterns. Part of my job used to be just that, transforming large datasets into useful visualizations and summaries.
But for the specific cases of clocks and speedometers, where most of my needs are “are we at this time yet” or “am I at the speed limit”, a graph doesn’t do me any good when the target number isn’t itself part of the graph.
On Google Calendar, I can at least see “oh hey, the line marking the current time is almost to the next meeting, I better get ready”. And the big rectangle means it’s going to be a long meeting (sigh), relative to the rest of my day. There, the graphical representations are helpful.
In a speedometer, what is the graph of? 0 to some hypothetical, barely-reachable and certainly illegal max speed? Most of the time it hovers around the first quarter or third of the graph, and I have to squint to make out where the needle is exactly. Then I’d still have to read the actual number on the speed sign and calculate if I’m close enough to the limit to (probably) not get pulled over. Having a digital speedometer or overspeed indicator just makes that first part a bit quicker.
In other words, graphs are great where patterns matter, or where you can see the current position as part of some target whole. Numbers are more useful when you’re just comparing two numbers next to each other.
I can see how the tachometer gauge can be useful, because there the target RPM is an inherent property of your engine gearing, not an external limit.
If someone made a speedometer bar, like a progress bar, that went from 0 to current speed limit, I’d totally use that instead of the numbers!
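Just to make the idea concrete, here’s a rough sketch of that kind of limit-relative readout (purely hypothetical, not any real instrument cluster):

```python
def speed_bar(speed: float, limit: float, width: int = 20) -> str:
    """Render current speed as a progress bar running from 0 to the
    posted speed limit; anything past the limit gets flagged.
    (A hypothetical sketch of the idea, not a real car display.)"""
    fraction = min(speed / limit, 1.0)
    filled = round(fraction * width)
    bar = "#" * filled + "-" * (width - filled)
    over = f" OVER by {speed - limit:.0f}" if speed > limit else ""
    return f"[{bar}] {speed:.0f}/{limit:.0f}{over}"

print(speed_bar(52, 55))  # a little under the limit
print(speed_bar(63, 55))  # over: bar is full and flagged
```

The thing you’d actually glance at is how full the bar is, not the digits.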
If I were going to reinvent analog clocks, I’d have done it the other way: start with a long, fat hand for hours, pointing at 12 numbers arranged around the circumference as now. When chronography advanced to the point where minutes became relevant, add a shorter, thinner minute hand. Ditto when seconds eventually became relevant: add another, even shorter, even thinner hand for that.
This makes much more intuitive sense. However, the benefit of the short, fat hour hand vs. a long, fat one is that you can still see the other hands behind it or in front of it. You’d have to put the hour hand in the back and make the other hands a different color.
And if you grew up with analog clocks with overlapping hands, you see the invisible 5/10/15/… 50/55/0 marks.
If any user interface designer came up with a new interface that required the user to “see invisible marks”, they’d be fired, or at least ridiculed. Yes, we old timers can easily read analog clocks because we grew up with them and spent literally decades reading them, but let’s not pretend there’s anything intuitive about the interface. It’s objectively terrible.
Hebrew has invisible vowels. Invisible numbers (that make sense, because you can see the fraction of the hour) are really not problematic.
If someone made a speedometer bar, like a progress bar, that went from 0 to current speed limit, I’d totally use that instead of the numbers!
That’s really the punchline.
For most uses, the speedometer is really about answering “What’s my delta to the speed limit?” And a bit of “what’s my speed trend?”
Now that cars can know the speed limit, there’s no good reason not to display that somehow, and that somehow would be more intuitive as something graphical (and color-coded) than as just some digits.
Invisible vowels are what makes Hebrew a terrible writing system. It’s why niqqud was invented.
If the invisible numbers were not problematic, young people wouldn’t have trouble reading analog clocks. They are quite obviously problematic.
If any user interface designer came up with a new interface that required the user to “see invisible marks”, they’d be fired
Tell that to my “swipe down and then swipe down again with more fingers and then swipe back up and left right left right” smartphone… I’m frankly surprised I haven’t accidentally summoned an ancient god when trying to set an alarm…
I agree completely. And for sensors with noisy output, a slightly vibrating needle is way better than a flickering number.
(P.S. Back to clocks - back in the 1990s I spent a few weeks at an out-of-town field office for work, and someone had built a clock for that office with the numbers and the motion of the hands reversed. After a few days I could read that clock with ease, just like any other.)
If the invisible numbers were not problematic, young people wouldn’t have trouble reading analog clocks. They are quite obviously problematic.
- It takes a little familiarity to read any graph.
- Many people seem to believe that the information they want is “2:58”, rather than “almost 3”. There’s no question that it’s easier to read off “2:58” from numbers. But in almost all cases, what you actually need to know is “it’s almost 3”.
There’s an “uncertainty principle” going on here. The digital clock tells you exactly what you want to know, but not what you need to know. The more precisely you know what you want, the less you know what you need, and vice versa. The Planck constant is probably involved.
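To make “almost 3” concrete: it’s just coarse rounding of the minutes, the kind of fuzzy readout some desktop clocks offer. A small sketch (the phrase boundaries are my own guesses):

```python
def fuzzy_time(hour: int, minute: int) -> str:
    """Turn an exact time like 2:58 into the qualitative reading
    people actually use ("almost 3", "a bit after 3", ...).
    The cut-off minutes are my own arbitrary choices."""
    next_hour = hour % 12 + 1
    if minute == 0:
        return f"{hour} o'clock"
    if minute <= 20:
        return f"a bit after {hour}"
    if minute <= 40:
        return f"around half past {hour}"
    if minute <= 54:
        return f"getting on toward {next_hour}"
    return f"almost {next_hour}"

print(fuzzy_time(2, 58))  # almost 3
print(fuzzy_time(3, 10))  # a bit after 3
```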
Many people seem to believe that the information they want is “2:58”, rather than “almost 3”. There’s no question that it’s easier to read off “2:58” from numbers. But in almost all cases, what you actually need to know is “it’s almost 3”.
I certainly agree with this. At least for folks like you and me who were raised on analog clocks that tell “qualitative time”, rather than digital clocks that tell “quantitative time”.
We inherently think of “almost 3”, or “a bit after 3” or “about a third of a clock face = 20 minutes before I need to leave.” That’s all automatic. No math, no shifting of gears; it just is.
But I wonder how much that’s a result of how we were raised, always viewing, comparing, and computing time on analog clocks.
For a youngun’ in the purely digital world, does the concept “almost 3” resonate with them at all? Or do they need to say to themselves “2:58. Hmm, that’s [subtracting] 2 minutes before 3. So close to 3, but not quite.”
I know I don’t know. But how we’re exposed to information sure affects how we think naturally about it.