I’ve noticed this several times before, but a segment in a recent game show (The Floor) really highlighted it.
For those with too much taste to watch low-brow game shows, the main action is that two players face off, each with a bank of 45 seconds. An image in a given category (like Fruits or Things in a Junk Drawer) is shown on a screen, and one player must identify it while their clock counts down; once they’re right, the other player gets a new image (and their clock starts running). A player can guess multiple times (a beaver! an otter! a capybara!) until they either get it right or give up and say Pass. When you pass, there’s a five-second ‘penalty’ before a fresh image is shown for you. Whoever’s clock hits zero first loses and is eliminated from the game.
Anyway, I forget how the category was labeled, but the ‘challenge’ was merely saying what time was shown on the image of a clock face! You know, just say “3:35”, “10:00”, and so on. Maybe one in ten images was a digital clock. The times on all of those were later than 12:59, but the players announced them converted down, like 16:30 becoming 4:30. We didn’t hear that stated as a ‘rule’, but it was so consistent it must have been included in the instructions they were given. So the “challenge” was to see if you could subtract 12 from what you saw?
But the overwhelming majority of the clocks shown were the traditional round face with two hands. And the players frequently had trouble with them! If the time was close to the hour, they’d often say “1:55” for “2:05”. And if the minute hand didn’t point exactly at a number, they’d make a similar +2 minutes versus the correct -2 minutes mistake. They even screwed up when the hands pointed exactly at a digit, like something:45.
It was like they were having to dig up and apply arcane-to-them rules and work it out mathematically. “Uh, it’s three quarters of the way around the clock. Hmm. An hour has sixty minutes so I guess that means…X:45!”
Have digital clocks taken over so fully that ‘at a glance’ time reading off a traditional clock isn’t wired into their minds the way it is in mine? Have you seen this in young people who otherwise seem perfectly normal intellectually?
Is it even limited to young folks…? I’m 41 and still struggle with analog clocks. I replace them with digital ones whenever I can. To me, the analog ones are more decorative elements (like grandfather clocks) than actually useful timekeeping devices.
I still know how to read them — the same way I can kinda sorta still read cursive — but it takes me 5-10x longer, with a much higher chance of error.
It’s not just clocks, either. I also struggle with cars that only have analog speedometer gauges (instead of the simple digital MPH display).
I literally go years at a time without needing to try and read an analog clock face. Been more than 20 years since I’ve had an analog clock in use at home and have literally never worn a non-digital watch (at almost 53 years old). I probably take a couple of seconds to figure it out, too.
I recently watched a TV episode where someone was taking the clock test for dementia (he failed) and I tried visualizing drawing one myself, and had to think for a while to remember if the hour hand was the big one or the small one.
(And that brings up a related point: how long before the clock test is too antiquated to be useful?)
First time I’d heard of this, and I already failed. Tried to draw the 1 where the 12 should go. Then I immediately thought, wait, is there even a 12 or is it 0? Would have to trial and error the other numbers and degrees. Something about right angles and 45s, I think, but can’t remember for sure… it’s one thing to be able to read them, but damned if I could draw one from memory.
Side story: I once failed a job interview because I couldn’t write a computer program to animate an analog clock. I had no idea how the hands work together (are they linked somehow?) and the motion was all wrong. They’re antiques…
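For the record, the trick I was missing is embarrassingly simple: the hands aren’t mechanically mysterious at all, each one’s angle is just total elapsed time scaled differently. A minimal sketch of what I should have written (Python, my own variable names):

```python
def hand_angles(h, m, s):
    """Clockwise degrees from 12 o'clock for each hand.

    The only 'linkage' is arithmetic: the minute hand runs at 1/60
    the second hand's rate, and the hour hand at 1/12 the minute
    hand's rate.
    """
    second = 6 * s                            # 360 deg / 60 s
    minute = 6 * m + s / 10                   # 360 deg / 60 min, plus creep from seconds
    hour   = 30 * (h % 12) + m / 2 + s / 120  # 360 deg / 12 h, plus creep
    return hour, minute, second

print(hand_angles(3, 58, 30))  # (119.25, 351.0, 180.0)
```

Animating it is then just redrawing three lines at those angles every tick.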
Hah. I just realized I have a similar problem but in reverse! Most of the time when I look at a clock I don’t really want to know what time it is (and in fact, won’t be able to answer “What time is it?” right afterwards), because what I want to know is “Is there enough time left to do X before I must start Y in order to be on time?” For example: I need to leave for my doctor appointment at 10:15, do I have enough time to vacuum the floor first? The answer I get from the clock is yes, or no, or it’d be risky, not an actual time reading.
And for me in those situations, a digital clock reading doesn’t work well. I must be out the door by 3:15, and it says 2:50, and it’ll take me about 12 minutes to get the vac out and do the kitchen floor… and then I have to work out mathematically whether those 12 minutes are indeed available, while the answer would be obvious on an analog clock.
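Ironically, the computation I’m grumbling about is trivial to write down; it’s doing the subtraction in my head against the clock that I hate. A throwaway sketch with my example numbers (the date is just a placeholder):

```python
from datetime import datetime

# My example: out the door by 3:15, the clock says 2:50,
# and the kitchen floor takes about 12 minutes.
now      = datetime(2024, 1, 1, 14, 50)  # date is a throwaway placeholder
deadline = datetime(2024, 1, 1, 15, 15)
chore_minutes = 12

slack = (deadline - now).total_seconds() / 60
print(f"{slack:.0f} minutes until departure")                  # 25 minutes until departure
print("vacuum away" if slack >= chore_minutes else "skip it")  # vacuum away
```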
Interesting! I think some cultures/languages default to telling time by chunks of 15 or 30 (quarter or half past/before the hour). I can see how that’d be useful when combined with an analog graphical clock.
In my life there’s never really enough time to finish anything… the precise numeric time just lets me know when I must interrupt whatever I’m currently doing in order to move onto the next thing, which will soon be interrupted by a third thing. The day is just a series of precisely-timed interruptions and Google Calendar reminders. How times have changed…
Of course they are - anybody would take longer using an unfamiliar method. I might be able to read an analog clock as quickly as a digital one, but I once had a job where if the time clock punched “1450” it meant 2:30 pm. It always took me longer to interpret that clock, between the 24-hour time and the hour divided into 100 segments.
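To spell out the conversion that kept tripping me up (the last two digits are hundredths of an hour, not minutes), here’s a quick sketch; the function name is just mine:

```python
def punch_to_clock(punch):
    """Convert a decimal-hours punch like 1450 to ordinary clock time.

    The first two digits are the 24-hour hour; the last two are
    hundredths of an hour, not minutes.
    """
    hours, hundredths = divmod(punch, 100)
    minutes = round(hundredths * 60 / 100)
    ampm = "am" if hours < 12 else "pm"
    return f"{(hours - 1) % 12 + 1}:{minutes:02d} {ampm}"

print(punch_to_clock(1450))  # 2:30 pm
```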
In the Pros and Cons in your link there is a link to an Israeli study of the test using digital clocks, meant to address the younger generations. It’s something being studied. I can see the analog clock test aging out eventually.
And I think people who struggle to read analog clocks do so because they’re trying to convert the hand positions to the “digital time”. It’s a lot easier to just think of the minute hand as a sort of progress bar for the hour (or a progress bar for whatever you’re doing if that doesn’t end on the hour). So for a high school student, for example, it would be more useful to think “I know it’s 1:something because I’m in math class, and class is over when the minute hand gets to the 10”, and just quickly glance at the analog clock to see how much time is left.
(Credit where credit is due, I got the “progress bar” analogy from the host of the Technology Connections YouTube channel)
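If you want the analogy made literal, here’s a toy sketch of the minute hand as a progress bar (the width and glyphs are arbitrary choices of mine):

```python
def hour_progress_bar(minutes, width=20):
    """Render the minute hand's position as a progress bar through the hour."""
    filled = round(minutes / 60 * width)
    return "[" + "#" * filled + "-" * (width - filled) + "]"

# It's 1:something; class ends when the minute hand hits the 10 (i.e. :50).
print(hour_progress_bar(35))  # [############--------]
print(hour_progress_bar(50))  # [#################---]
```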
This is how I read clocks, too, and it’s valuable enough to me that my phone displays an analog clock on the home screen. I also made my kids use analog clocks when they were little, so they would learn that skill.
But sure, I know young adults who don’t know what direction “clockwise” is.
I will say that if you were designing a clock for the first time, without any restrictions, I doubt you would design the analog clock. It was not designed with the human user interface in mind.
Just think about it: for a three-handed clock at 3:58:30 we have this situation:
- The second hand points at something labeled “6”
- The closest thing the minute hand points to is “12”
- The closest thing the hour hand points to is “4”
None of these values are correct! What a user interface disaster!
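You can check that with arithmetic rather than squinting at a drawing. A self-contained sketch (standard hand-angle formulas; the printed numerals sit every 30 degrees):

```python
# Quick arithmetic check of the 3:58:30 example.
h, m, s = 3, 58, 30
hour   = 30 * (h % 12) + m / 2 + s / 120  # 119.25 degrees
minute = 6 * m + s / 10                   # 351.0 degrees
second = 6 * s                            # 180.0 degrees

# Find the nearest printed numeral for each hand.
for name, angle in [("hour", hour), ("minute", minute), ("second", second)]:
    print(name, "hand is nearest the printed", round(angle / 30) % 12 or 12)
# hour hand is nearest the printed 4
# minute hand is nearest the printed 12
# second hand is nearest the printed 6
```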
Huh? Layering three arms on top of each other isn’t ideal, but each arm points very close to the right answer, and easily shows you “how far” it is from one or two nearby round numbers. (Or unround numbers, for that matter.)
If your goal is to say some numbers out loud, the digital interface is better. But if you think of time as like distance, and want to know “how close” (which is what I usually want to do with time), the analog face is much better.
There are, of course, clock and watch faces that have both hours and minutes shown on the face, and prior to the digital era the only way was to either combine these together or have separate dials, as some chronographs with a lot of complications do.
But, if what you want is a really consistent way of measuring time, I recommend to you the Hexadecimal clock.
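In case anyone wants to play with it, here’s a rough converter, assuming the common hexadecimal-time scheme where a day is 16 hex hours of 256 hex minutes of 16 hex seconds (65,536 hex ‘seconds’ in all); the underscore notation is just mine:

```python
def to_hex_time(h, m, s):
    """Convert conventional h:m:s to hexadecimal time, assuming the
    common scheme: 16 hex hours x 256 hex minutes x 16 hex seconds
    per day (65,536 hex seconds in all)."""
    day_fraction = (h * 3600 + m * 60 + s) / 86400
    ticks = round(day_fraction * 0x10000) % 0x10000
    hex_h, rest = divmod(ticks, 0x1000)
    hex_m, hex_s = divmod(rest, 0x10)
    return f"{hex_h:X}_{hex_m:02X}_{hex_s:X}"

print(to_hex_time(12, 0, 0))  # 8_00_0  (noon is exactly halfway through the day)
print(to_hex_time(6, 0, 0))   # 4_00_0  (6 am is a quarter of the way)
```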
To sort of rephrase what I said in my last post, sure, if you’re trying to determine “the current time is precisely 3:58:30” then an analog clock isn’t great. But it’s pretty easy to look at an analog clock and say “It’s almost 4:00”. Maybe you need the former if you’re taking precise scientific measurements, but in most everyday situations the latter is all you need.
Said another way, what’s wrong in @Hermitian’s example is really the numbers “1” through “12” printed around the perimeter of the analog clock. If instead there were just 12 radial tick marks or dots, a la the icon in @Stranger_On_A_Train’s cite’s preview box, much of the confusion goes away. Much of his complaint is that, e.g., straight down points at a printed “6” that can mean either 6 or 30. So fix that.
Another thought unrelated to the above …
When I was a kid first learning to tell time I thought it was stupid that the hour hand, the one representing big amounts, was the little hand, while the minute hand (and second hand if present) that represented little amounts was big(ger).
If I was going to reinvent analog clocks, I’d have done it the other way: start with a long fat hand for hours, pointing at 12 numbers arranged around the circumference as now. When chronography advanced to the point that minutes became relevant, add a shorter, thinner minute hand. Ditto when seconds eventually became relevant: add another, even shorter, even thinner hand for that.
As well, once you have a minute hand, also have an inner circle of numbers 0, 5, 10, 15 … 55 in a smaller font size, which the end of the minute hand points directly to.
Much more ergonomic and at no point in the development of horology is this idea any more conceptually or mechanically difficult than what was actually done.