There’s a proposal to adopt a more uniform calendar that replaces the leap day with an extra leap week every 5 or 6 years.
While looking up something completely different today I stumbled on this article which details a bunch of the many, many calendars used over the centuries in different parts of the world. It’s a large topic, larger than most of us might expect.
It’ll be a cold day on the Ides of Thermidor before the world agrees to alter the calendar they’re using already.
To add to this:
“Dyos bo’otik!”
(that’s Maya for “thank you” — one culture whose events in Mexico and Guatemala one to two thousand years ago are anchored firmly in time for us today, thanks to their own record-keeping then and to the work of scholars and archaeologists in recent decades).
So I think this thread conflates two things that make up chronology…
- How the solar or tropical year is broken down into months and days. This is complicated because there is not a whole number of days in a “tropical” year; there are about 365.24219. So there have been numerous attempts to account for this with leap years and such. While complicated, it’s just a maths problem that can easily be solved (especially if you have a modern calculator) to convert from one form to another (e.g. Gregorian to Julian); see the sketch after this list.
- Then there is the question of what your “year zero” is. Typically in ancient times this was the start of the reign of the ruler who was in power when the event happened. In the case of Julius Caesar’s assassination, it took place in the year of Caesar’s fifth consulship, with Mark Antony as his fellow consul (the Romans also had a system counting from the founding of Rome, but consular dating was more common). For Ancient Rome we can convert this to a modern date pretty accurately, but for other civilizations the problem is harder (and requires tying dates to particular astronomical events whose exact dates we can work out).
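To illustrate the first point, here’s a rough sketch in Python of the “it’s just maths” conversion, going through a plain day count (a Julian Day Number). The function names are mine and the formulas are the standard integer-arithmetic JDN conversions, so treat it as an illustration rather than anything authoritative:

```python
# Convert dates in either calendar to a Julian Day Number (a plain running
# count of days); then differences and conversions are simple arithmetic.
# Uses astronomical year numbering (1 BC = year 0, 44 BC = -43).

def gregorian_to_jdn(year, month, day):
    a = (14 - month) // 12
    y = year + 4800 - a
    m = month + 12 * a - 3
    return day + (153 * m + 2) // 5 + 365 * y + y // 4 - y // 100 + y // 400 - 32045

def julian_to_jdn(year, month, day):
    a = (14 - month) // 12
    y = year + 4800 - a
    m = month + 12 * a - 3
    return day + (153 * m + 2) // 5 + 365 * y + y // 4 - 32083

# How far the Julian calendar lags the Gregorian on the same nominal date:
for year in (200, 1000, 1582, 2000):
    lag = julian_to_jdn(year, 3, 1) - gregorian_to_jdn(year, 3, 1)
    print(f"{year}-03-01: Julian calendar lags Gregorian by {lag} days")
# Prints 0, 6, 10 and 13 days respectively.
```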
That paper is amazing — thank you! At first I thought “That can’t be right — the earth’s rotation is slowing just a millisecond per century. The ancient Chinese were good — but they weren’t THAT good!”
But then I realized a millisecond of ROTATIONAL change means a much bigger difference in the timing of eclipses (more like ten minutes per century) — because the earth in its ORBIT moves so fast. (Correct me if I’m misunderstanding something.)
Here is my summary of the Science article:
They found writings from ancient China and Babylon, and others from more recent centuries, that recorded partial or total solar eclipses (or lunar eclipses), typically with an error/precision range of half an hour or so, sometimes more like five minutes (e.g. sand clocks were used and tied to that day’s sunrise or sunset). The authors only used cultures whose calendars we know to the exact day (the Chinese have been using a 60-day cycle continuously since 300 BC, for example).
They also used some documented solar eclipses where the time of day wasn’t recorded – but the location was, and so we can time it (within the half-hour-ish band of totality, e.g.).
Check out Figure 9. The gray line is what we’d expect based on how the moon-Earth system has been slowing the earth’s rotation for billions of years: only on the order of milliseconds each century (I think around that), but that is enough to shift the timing of eclipses by something more like ten minutes each century (again, I think it’s around that), which 2000 years ago amounts to a difference of several hours.
But… the ancient observations don’t fall along this line! They consistently cluster around a different line, and the difference between the two grows as you go further into the past, so that by (say) 500 BC it’s something like 40 or 50 minutes.
What explains this departure from the tidal-friction slowing we already knew about (the observed deceleration is actually a bit less than tidal friction alone would predict)? The authors offer two suggestions: core–mantle coupling (deep exchanges of angular momentum between the core and the mantle), and isostatic rebound since the most recent glaciation ended.
Why did the Mayan calendar use the strange period of 819 days? Turns out that twenty 819-day periods, about 45 years, sync up nicely with the synodic periods of several of the planets:
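For what it’s worth, the arithmetic behind that claim is easy to check. A rough sketch, using round “canonical” synodic periods in whole days (these specific values are my assumption; the modern mean periods are each within about a day of them):

```python
from math import gcd

# Canonical whole-day synodic periods (my round-number assumption; modern
# mean values are roughly: Mercury 115.9, Venus 583.9, Mars 779.9,
# Jupiter 398.9, Saturn 378.1 days).
CANONICAL = {"Mercury": 117, "Venus": 585, "Mars": 780, "Jupiter": 399, "Saturn": 378}

for planet, period in CANONICAL.items():
    # Smallest number of 819-day counts that equals a whole number of
    # this planet's synodic periods.
    k = period // gcd(819, period)
    print(f"{planet:7s}: {k:2d} x 819 = {k * 819:5d} days = {k * 819 // period:3d} x {period} days")

# Every k above is <= 20, so within twenty 819-day counts (16,380 days,
# roughly 45 years) each of these commensurations occurs at least once.
```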
I am not absolutely sure, but I think you got that one wrong: it is not about the Earth moving along its orbit around the sun, it is about the Earth spinning around its axis:
It is found that the rate of rotation departs from uniformity, such that the change in the length of the mean solar day (lod) increases at an average rate of +1.8 ms per century. This is significantly less than the rate predicted on the basis of tidal friction, which is +2.3 ms per century.
That means the days get ~1.8 ms longer every century, and we’re talking about 27 centuries. So after one century, every day is 1.8 ms longer; over a year that is 365 x 1.8 ms = 0.657 s. Not quite a second, but after 27 centuries that would add up to almost 18 seconds. Then in the next century another 1.8 ms are added, which as of today is 2 x 0.657 s times 26 centuries: 34 seconds, plus 18 makes 52 seconds. Then comes another century (and you would really have to integrate this continuously, so it is even slightly more): 1.8 ms x 3 (centuries) x 25 (centuries left) = 135 seconds. That already makes 187 seconds, more than three minutes. Keep doing this 24 more times and you get about six hours, one quarter of a day, 90° less rotation (as the earth is slowing down): an eclipse that would have happened over the middle of the Atlantic Ocean if the Earth had not slowed down actually happened over Greece, which shows the Earth has slowed down by those 1.8 ms per century. The Earth is where it should be relative to the sun; it is just not midday where you would expect.
Now the thing is that calculations based solely on tidal friction arrive at a value of 2.3 ms per century, that is: more slowing was expected! This slower-than-expected deceleration of the Earth’s rotation (not of its revolution around the sun, I believe), the authors postulate, may be due to post-glacial rebound and core–mantle coupling. They don’t explain why those mechanisms would slow down the slowing down (I know what I mean, I hope you can follow), but I just like that they can show how much 1.8 ms adds up to, and that 2.3 ms would have given a significantly different result after 27 centuries.
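A rough way to run the whole sum in one go (assuming a strictly linear increase in the length of day; the 1.8 and 2.3 ms/century rates are from the paper, the rest is just my arithmetic):

```python
# If the length of day grows linearly by `rate` ms per century, a day that
# was t centuries in the past was about rate * t ms shorter than today, so
# the accumulated clock offset is roughly the integral of that deficit:
# 0.5 * rate * t^2 * (days per century).
DAYS_PER_CENTURY = 36525  # 100 Julian years of 365.25 days

def accumulated_offset_hours(rate_ms_per_century, centuries):
    total_ms = 0.5 * rate_ms_per_century * centuries ** 2 * DAYS_PER_CENTURY
    return total_ms / 1000 / 3600  # ms -> hours

for rate in (1.8, 2.3):  # observed average vs. pure tidal-friction prediction
    hours = accumulated_offset_hours(rate, 27)
    print(f"{rate} ms/century over 27 centuries: ~{hours:.1f} hours")
# Roughly 6.7 hours vs 8.5 hours: about a two-hour difference in where on
# the globe an ancient eclipse would appear to fall.
```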
I may be wrong, but that is how I read the Royal Society article. Glad you liked it, and I hope I understood you right. It would not be the first time I refuted something the other person had not said at all! In which case, of course: sorry!
ETA: I see I made a mistake in my calculations, but correcting it would take longer than the edit window allows. Still, the thrust of my argument is about right, I believe, so I will not even try to be more precise.
Thanks. I’m no expert, but I think I got this one basically right. The y axis of Figure 9 shows seconds, and the range is (as I recall) on the order of a couple thousand — in other words, half an hour to an hour, give or take (with the data points scattered across a cone that’s about five to eight minutes wide, as of 2,500 years ago).
This makes much more sense. Yes, a millisecond a century means the day 1 to 2 thousand years ago was only a hundredth or a fiftieth of a second shorter than today, but there’s no way the ancient Chinese (and others) could measure THAT precisely, nor record it in a way we could be sure of understanding now with such precision.
The authors explain how the MOST precise ancient measurement was done with a sand clock, observed until the sun set later that day. I’d expect the error bar for this sort of thing to be on the order of a few minutes, and that’s what the article implies.
Any single, isolated data point wouldn’t tell us much, but the power of this research is that they collected a bunch, and they’re all consistently different from the expected values, in the same direction and by a similar order of magnitude.
No. It’s still officially Washington’s Birthday.