Continuity of our current time series

So, it’s 0803 GMT as I type this. 12 hours ago, an accurate timekeeping device would have said 2003 GMT. 9 weeks and 5 hours ago, it would have said 0303 GMT.

How far back can one go before this is no longer true?

Assume we are consulting an official timekeeping device in the US or UK and that formal adjustments such as DST are accounted for. Note, I’m not trying to correlate accuracy of timekeeping devices with physical phenomena (like, say, the actual duration of a year i.e. earth’s revolution) or with stability and precision of the timekeeping mechanism. I’m only inquiring about the ‘smooth continuity’ of our present time-series (with DST et al. accounted for).

Thanks.

I don’t understand what you’re asking.

According to the latest physics, which is very well outlined in PBS’s recent “The Fabric of the Cosmos” series, there is no constant or standard time. Time is relative; it is not a constant but a quantity that depends on, and is defined by, other physical parameters of nature.

I don’t have an answer to the question, but the question itself is fairly straightforward. Different “official” organizations, such as the USNO in the US, maintain a standard time, which countries around the world sync up to. At some point in the not-too-distant past, each town had its own “local” time, determined by when the Sun crossed the local meridian at noon.

The OP appears to be trying to determine when the standard-setting organizations of the world set a “Master Clock” which all international clocks would agree with.

It is my WAG that this most likely happened in the mid-nineteenth century as train schedules became important, and the desire to run on time and avoid collisions overruled the local time of the communities through which the rail lines passed.

Wikipedia says “The local time at the Royal Greenwich Observatory in Greenwich, England was chosen as standard at the 1884 International Meridian Conference, leading to the widespread use of Greenwich Mean Time to set local clocks.”

So you’re not far off in your mid-19th century guess.

Of course, all that strictly means is that GMT was adopted as the reference time in 1884; it does not tell us the origin of the time series that was already in use at Greenwich in 1884.

The current definition of UTC also includes leap seconds, which are added more or less periodically; the most recent was at the end of 2008. I’d be curious to know whether the OP would consider the addition of a leap second to break the “smooth continuity” of time measurement. After all, these are instances where, e.g., 4 seconds ago the correct UTC display read only 3 seconds fewer than it does now.
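Here’s a quick Python sketch of that arithmetic, illustrative only - plain datetime knows nothing about leap seconds, so the extra second has to be added by hand:

```python
from datetime import datetime

# UTC labels straddling the leap second inserted as 2008-12-31T23:59:60.
before = datetime(2008, 12, 31, 23, 59, 58)  # label just before the leap second
after = datetime(2009, 1, 1, 0, 0, 1)        # label just after it

nominal = (after - before).total_seconds()   # naive label arithmetic: 3 seconds
elapsed = nominal + 1                        # one extra SI second really passed: 4 seconds

print(f"labels differ by {nominal:.0f} s, but {elapsed:.0f} s actually elapsed")
```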

And the rules behind the time standard have changed more than once since the 1884 convention; leap seconds themselves were only introduced in 1972.

This Coordinated Universal Time - Wikipedia, particularly the history section, introduces GMT, TAI, UT, UT2, and some of the other standards in use both now & over the last 150 years. This Time standard - Wikipedia is pretty good as well.
This Gregorian calendar - Wikipedia explains a bit about how calendars (which are just clock time writ large) have shifted over the years. Even now there are many different ones in use besides the one our OP uses every day. And each has its own history of adjustments.

Heck, the Gregorian calendar in use now in the US & Europe has had different adjustments applied at different times in different countries.

I think the question the OP means to ask is probably close to meaningless once we look closely enough at the details. Speaking just to the UTC standard, I’d vote for the smooth continuity extending back only to 0000 on 1 Jan 2009 of the US/European Gregorian calendar. For any other standard the answer will be different.

Let me elaborate - I consider the time series continuous as long as the progression is smooth, or assumed to be so in good faith (i.e. 59:58 -> 59:59 -> 00:00 -> 00:01 and so on), and as long as any deviations are specifically known and documented. So the leap second added at the end of 2008 wouldn’t affect the “continuity”, since it can be specifically factored into one’s regressive enumeration of the time series. With that said, how far back can we go for the present time series?

Good clarification, but you skipped the part about calendars. We can certainly track our current calendars back to Pope Gregory’s declaration on 24 Feb 1582 with all adjustments since then being well-documented.

My impression (and I’m not an expert) is that we can reckon accurately back to the Julian reform of 46BC. Again, there have been adjustments to that calendar between then & now, but we know what they are/were & can reckon backwards across those adjustments accurately at least to the resolution of a single day.

Farther back than 46BC is beyond my knowledge.
Now time of day is another matter. I have no idea how far back from the 1884 convention on time we can go with reliable clocks to determine time to the minute or second, much less with good records of each adjustment made to whatever passed for a master clock in those days.

Here are some signposts along the way though …

In 1847 GMT became the official time standard throughout Britain. Good bet any adjustments between then & now are well-documented.

Pendulum clocks were invented in 1656. That’s IMO the earliest era it makes sense to even speak of timekeeping in seconds. A lot of progress, and chaos, happened in the 200 years from the 1650s to the 1850s. The “right” answer, for reasonable values of “right”, lies somewhere in there & (IMO) much closer to 1850 than 1650.

As early as the 13th Century there *were* mechanical clocks, but with uncontrolled and large daily errors. Prior to then it really doesn’t make sense to speak of time as something measured or monitored systematically by mankind.

Yes, there were earlier clock technologies, but certainly not any systematic time tracking.

You could read this Time - Wikipedia & follow the links you find relevant and let us know what you find. It *is* an interesting question, although the answer you settle on depends a lot on what you want to assume or accept as well-documented.

My father told me that when he was growing up (he was born in 1906) every locale still had its own time; only railroads set their timetables according to GMT (I don’t know who legislated time zones) and they called that “railroad time”. He mentioned that Media, PA was three minutes behind Philadelphia. That’s actually far too much. At 40 deg, a time zone is just under 800 miles wide and so 3 minutes, which is 1/20 of an hour, should represent a distance of 40 miles, much farther than the distance from downtown Philly to Media.
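For what it’s worth, that arithmetic is easy to check in a few lines of Python; the longitudes below are rough, illustrative values, not authoritative ones:

```python
# Solar-time offset between two towns, using the fact that the Earth turns
# 360 degrees in 24 hours, i.e. 4 minutes of clock time per degree of longitude.
PHILLY_LON = -75.16   # downtown Philadelphia, degrees (approximate)
MEDIA_LON = -75.39    # Media, PA, degrees (approximate)

minutes_per_degree = 24 * 60 / 360   # = 4 minutes of time per degree
offset = abs(PHILLY_LON - MEDIA_LON) * minutes_per_degree

print(f"true solar-time difference: about {offset:.1f} minutes")  # well under a minute
```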

So, no one’s got the time?

Actual time was set by astronomical phenomena, as others have mentioned. Noon at the summer or winter solstice, IIRC, was the reference noon.

To be able to specifically point to seconds, you have to measure seconds. I’m not sure the technology or instrument accuracy mattered before the 1700s. (Heck, even in the 1960s a typical cheap watch would lose a minute a month. Getting down to seconds a month would require serious fine-tuning.) Galileo, among others, observed that the period of a swinging weight was determined by its length, regardless of the weight - the principle of the pendulum clock. You fine-tune it by adjusting the length of the pendulum.
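For the curious, the relationship is simple enough to sketch in a few lines of Python; this is a rough illustration assuming the standard small-angle formula and g = 9.81 m/s²:

```python
import math

G = 9.81  # m/s^2, approximate local gravity

def period(length_m):
    """Small-angle period of a simple pendulum: T = 2*pi*sqrt(L/g)."""
    return 2 * math.pi * math.sqrt(length_m / G)

def length_for_period(period_s):
    """Length needed for a desired period - the 'fine-tuning' knob."""
    return G * (period_s / (2 * math.pi)) ** 2

# A "seconds pendulum" (one-second beat, two-second full period) comes out near 0.994 m.
L = length_for_period(2.0)
print(f"length {L:.3f} m -> period {period(L):.3f} s")
```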

The issues became most relevant in the Age of Discovery; the overarching puzzle was how to determine longitude. Latitude was easy - measure the angle above the horizon of the Big Dipper or other stars. Longitude meant taking the angle, rise, or set of known stars and comparing that to tables that had been painstakingly created. Once portable but reliable clocks were possible, that solved a lot of problems. If star A rose at such-and-such a time GMT at Greenwich, and an hour later where you were, then you were 360/24 = 15 degrees west of Greenwich. Knowing your location meant the difference between a fast voyage and a short one onto the rocks.
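A toy version of that reckoning, just to make the 15-degrees-per-hour arithmetic concrete (the event times below are made up for illustration):

```python
def degrees_west_of_greenwich(observed_gmt_hours, predicted_gmt_hours):
    """Each hour an event is observed late (by a Greenwich-set chronometer)
    puts the observer 360/24 = 15 degrees further west."""
    return (observed_gmt_hours - predicted_gmt_hours) * 360 / 24

# Star predicted to rise at 20:00 GMT at Greenwich is seen rising at 21:00 GMT here:
print(degrees_west_of_greenwich(21.0, 20.0))  # 15.0 degrees west
```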

If you are less concerned about seconds, then the specific timeline probably goes back to Constantine and the ascendancy of Christianity; or more likely, to Julius Caesar and his calendar. There are plenty of books and articles on calendars and how they evolved. Before Caesar, IIRC, the Romans simply adjusted their calendar as needed: if feast days drifted from their appointed times, the government decreed a few extra days, which also patched up the fact that the months and the year did not sync up properly. Caesar came up with the 12 months, the different month lengths, and leap years.

Christians took this over because they needed to calculate holy days, especially Easter and Christmas. 1500 years later, it was obvious that 365.25 days was not entirely accurate; Pope Gregory’s modifications were to eliminate the 11 days of drift accumulated over those 1500 years and to make 3 of every 4 century years non-leap-years. (Century years whose century number is not divisible by 4 - i.e. 17 for 1700, 18, 19, 21, etc. - are not leap years; 1600 and 2000 were.) Dropping the 11 days got Easter back on track.
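The century rule is easy to write down explicitly; here’s a minimal Python sketch of it:

```python
def is_gregorian_leap_year(year):
    """Divisible by 4, except century years whose century is not divisible by 4
    (equivalently: a century year must be divisible by 400 to be a leap year)."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print([y for y in (1600, 1700, 1800, 1900, 2000, 2100) if is_gregorian_leap_year(y)])
# -> [1600, 2000]
```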

Part of your discussion of discontinuity is that, due to Protestantism, many non-Catholic countries were not happy adopting a Pope’s decree about calendars, much like our attitude to the French metric system or maybe a decree from a communist country; so, IIRC, England for example did not drop those days until the 1700s. Russia did not do so until after the revolution, so the October Revolution falls on November 7th on our calendar. Orthodox churches still use the old calendar, so Russian Orthodox Christmas now happens 13 days after Dec. 25th.

Other cultures had their own calendars, and it is usually possible to “sync up” those times; but that is not the same as a continuous number system. As for Chinese or Japanese numbering, or similar, they adopted the very common convention of counting what year of a monarch’s reign it was. I have no idea how reliable the continuity would be, given times of troubles, but it’s a common system for many kingdoms. I have no idea how reliable or continuous the Jewish calendar is either, given that there would be years where one group or another was too preoccupied to keep track of time; but, as with Europe in the Dark Ages, it’s unlikely that everyone at once could drop a whole year from the count.

The ability to physically synchronize clocks across large areas of territory came with the telegraph system (give or take the speed of electricity). This arrived at about the same time as the railroad, so it was a convenient co-development.

I guess the question is when “AD” numbering was adopted - it was devised by Dionysius Exiguus in the 6th century AD, though it took longer to come into common use. So somewhere about there would be when numbering continuity started. Before that, we can match our AD system to Roman and other year numbering. (There’s the joke about the time traveller who asks the Roman what the year is, and he’s told, “oh, it’s 23BC.” :slight_smile: )

Tiny nitpick: At the time of the Gregorian reform, the adjustment was 10 days, although by the time it was adopted in Great Britain, it had increased to 11 days, because the British had treated 1700 as a leap-year.

The Japanese system has good continuity since 700, but that still makes it younger than our year-numbering system.