Intentionally missetting clock: Cunning or stupid idea?

I’m not quite sure if this question is a mundane triviality of everyday life, or if there’s a legitimate mathematical puzzle behind it. I guess the former is more likely (although the latter would be cooler), hence I’m placing it in IMHO.

I have an el cheapo digital alarm clock by my bedside. It’s a quartz, not radio-controlled or otherwise online, and it has a tendency to go slow at a rate of about a minute a month. I reset it manually roughly every two months - two minutes is about the maximum inaccuracy I’m willing to tolerate, and I’m too lazy to set it more frequently than that. I find this manual resetting business to be quite Sisyphean - the day after the procedure, the clock is already two seconds slow (it displays only hours and minutes, not seconds).

So I’ve been wondering if next time I should set the clock intentionally two minutes ahead of the actual time. That way it’d be four months until the inaccuracy reaches two minutes again. The clock would be fast half the time and slow half the time, but the average absolute inaccuracy would be the same as if I reset it to the correct time every two months, and I’d only have to do half the resetting work.

So, assuming that the clock goes slow at a constant rate of two seconds a day, is this thinking mathematically correct? Is it, in common-sense terms, a good or a terrible idea?
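Here’s a rough back-of-the-envelope check in Python (just a sketch, assuming a perfectly constant drift of 2 seconds a day; the function and numbers are mine):

# Compare the two resetting strategies under a constant 2-seconds-a-day loss.
DRIFT_PER_DAY = 2 / 60  # minutes lost per day

def average_abs_error(minutes_fast_at_reset, reset_interval_days):
    # Mean absolute error (in minutes) over one reset cycle, sampled once per day.
    errors = [abs(minutes_fast_at_reset - DRIFT_PER_DAY * day)
              for day in range(reset_interval_days)]
    return sum(errors) / len(errors)

# Strategy A: set it exactly right, reset every 60 days (when it is ~2 min slow).
# Strategy B: set it 2 minutes fast, reset every 120 days.
print(average_abs_error(0, 60))   # ~1 minute
print(average_abs_error(2, 120))  # ~1 minute

Both come out to about a minute of average absolute error, so the only things that change are how far off the clock can get at the extremes and how often you have to reset it.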

I don’t see anything wrong with that reasoning. As long as you’re okay with the clock being off two minutes in either direction, it makes sense.

Technically, though, there are now two more possibilities for what time it actually is (at the minute level of precision). If you looked at the clock before and it said 1:21, you would know the actual time was somewhere between 1:21 and 1:23, since the clock only ever runs slow. After your proposed change, it could be anywhere between 1:19 and 1:23.

But your variance would be 1.4142 times as much

I have a watch that is a real pain to reset (but is otherwise very nice). It gains almost exactly 4 seconds a week. When DST ends I set it about 40 seconds slow, and when DST restarts it will be about 40 seconds fast. Then I set it about 35 seconds slow again and reset it once during the summer. Right now it is almost exactly right.

So yes, what you are proposing is entirely reasonable.

Whether your idea is cunning or stupid depends entirely on whether it works for you.

Personally, I’d rather have a clock tell me a time that was a few minutes later than the correct time (so that I had a little more time than I thought to do something or go somewhere) than have it tell me a time that was a few minutes earlier (so that if I relied on that clock, I might be late).

Which reminds me of how, when you’re making measurements or estimates, sometimes what you want is to come as close as possible to the actual value, but other times it’s better to err in one direction (e.g. too high) than in the other (e.g. too low).

Instead of tolerating the current maximum error and increasing your laziness, you could maintain your current level of laziness yet decrease the maximum error. In other words, every two months set it one minute fast. Then you’d always be within a minute of the correct time.

You could set it 3 minutes ahead when you change to and from daylight saving time. Your average error increases, but your additional effort becomes zero.

Personally I prefer a fast clock so I would set it 6 minutes ahead instead.

Just set it right once a year on Dec 1, and starting in Jan, add the month number to the minutes.
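Something like this, in Python (a sketch only; corrected_time is a made-up helper, and it assumes the clock loses about a minute a month and was set right on Dec 1):

from datetime import datetime, timedelta

def corrected_time(displayed):
    # Add the month number to the displayed time (Dec counts as 0,
    # since that is right after the yearly reset).
    correction_minutes = displayed.month % 12
    return displayed + timedelta(minutes=correction_minutes)

# In March the clock reads 07:29, so the real time is roughly 07:32.
print(corrected_time(datetime(2024, 3, 15, 7, 29)).strftime("%H:%M"))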

Your idea is cunningly stupid.

My bedroom clock is about 18 minutes ahead now. I leave it. It makes me feel good in the morning when the clock at the office shows an earlier time than the one I woke up to in the bedroom. Otherwise I just remember to subtract 18 mins to know the time.

I don’t have the patience to do any sort of regular reset or calculation. I just set it right when the power goes out every year or so.

But it’s accurate to within a minute for the whole year. The modular arithmetic might be a hindrance to some.

“There’s a fine line…”

That’s the kind of mathematical take on the problem that I was looking for. I realised that, after cutting my resetting work in half, there must be some price I’m paying in terms of inaccuracy, but I wasn’t sure where it was, since the average inaccuracy remains the same.

But it seems like irrelevant mathematics to me. The variance of the deviation from the correct time is relevant only under BOTH of two conditions:

(1) It’s relevant only if someone doesn’t know when the clock was last reset. You obviously know that. So it would be relevant only to (say) a visitor who knows that you reset the clock every 4 months, but doesn’t know your resetting schedule. For them, deriving the correct time from your clock would involve greater uncertainty than it would under a 2-month schedule.

(2) It’s relevant to the hypothetical visitor only if it’s important to them to know if they are slightly early or slightly late. For many tasks, all they may care about is that they are within two minutes of the correct time.

True. My point is that there’s no free lunch: although the average deviation stays the same, other measures of deviation will change.

(Also, fwiw, the variance is 4 times greater when the width of the distribution has doubled.)
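To spell that out, treating the error at a random moment as uniformly spread over an interval of width w:

Var = w^2 / 12
(2w)^2 / 12 = 4 × (w^2 / 12)

so doubling the width quadruples the variance, while the standard deviation only doubles.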

Goddammit. I took the square root instead of squaring. Now you know why I don’t (often) teach stats.

Numerical values don’t matter. You have an assistant to do the arithmetic. But I do think the most important part of the job of teaching statistics is to know what statistics are relevant when.

Any decision can only be meaningfully made after you’ve settled on your measure of merit.

Max laziness vs max accuracy vs whatever … is the tradeoff. Settle that and we can discuss how to get there.

How are you validating that the cheapo digital alarm clock is running slow? If you are using your smart phone for that, why not just get rid of the cheapo clock, put your smart phone in its place, and use the alarm built into your phone? No need to overthink this.