I feel kind of stupid asking this, but I want to make sure I get it right.
I want to calculate the length of the average year. Any particular year needn’t be spot on, but an accumulation of years must approach total accuracy.
A year has 365 days. If the year is divisible by 4, we add a day. If it is divisible by 100, we don’t. But if it’s divisible by 400, we do again. So over the course of a handful of years, the average length is 365 plus .25. Over the course of millennia, would it be 365 plus .25 minus .01 plus .0025? That is, 365.2425?
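In code, the rule I mean looks roughly like this (just a sketch; is_leap is my own name for it):

```python
def is_leap(year: int) -> bool:
    # Divisible by 4, except centuries, unless the century is divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# The three correction terms, one per clause of the rule above:
avg = 365 + 1/4 - 1/100 + 1/400
print(f"{avg:.4f}")  # 365.2425
```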
Depends on what you mean by “year”. If you’re talking about a Julian year, it’s 365.25 days per year on average. If by “year” you mean a sidereal year (how long it takes the earth to make one orbit around the sun), it’s 365.256363051 days.
Yes, I believe that’s correct, and interestingly, it’s nearly a complete reversal of the way the Gregorian calendar was devised, as I remember reading it. That is, some bright astronomer figured out that the number of solar days in a seasonal year was approximately 365.2425, and the Gregorian calendar that the Western world now observes (leap year every 4 years, except for century years where the century is NOT a multiple of 4) was derived to match that figure.
Another method: take the interval of 400 years. Each Gregorian year has 365 days, but 97 of them have a leap day (100 years are divisible by 4; throw 3 of those out, since they end in 00 but are not divisible by 400). So the 365 days of a standard year are augmented by 97/400 days on average, or .2425, and the average length of a Gregorian year over millennia is 365.2425.
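If you want to count them directly, here’s a quick sanity check in Python using the standard library’s calendar.isleap (any 400-year window works; 2000–2399 is picked arbitrarily):

```python
from calendar import isleap

# Count the leap days in one full 400-year Gregorian cycle.
leap_days = sum(isleap(y) for y in range(2000, 2400))
print(leap_days)              # 97
print(365 + leap_days / 400)  # 365.2425
```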
What exactly do you mean by a “sidereal year”? A seasonal year, unless I’m very confused, is the same as sidereal… the axis of the earth doesn’t precess enough to cause a meaningful difference.
On the other hand, the sidereal day is definitely different from a solar day, and so the year can be a different number of ‘days’ depending on which kind of day you measure it with.
When calculating an average, under what circumstances does it matter whether you report it accurate to 5 or 50 decimal places? Averages by their very nature are “fuzzy” things.
Actually, I’m not; I’m interested in the calendar year. What I need to do is approximate the difference between two dates and slap a label on the later one. If a patient has an operation, then comes in years later for a followup, I want to know whether to label it as a “4 year followup” or a “5 year followup”, for example. For that, I need to figure out which whole number of years the followup date is closer to, and so I need a constant defining a year in days. Since most patients end their relationship with us after the 5 year followup, 365.25 is more than accurate enough; in fact, 365 is probably close enough. But I’m a geek, so I want to use the best number I can reasonably get my hands on.
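For the curious, what I have in mind is roughly this (the dates, the function name, and the round-to-nearest-year step are just illustrative, not our actual code):

```python
from datetime import date

DAYS_PER_YEAR = 365.2425  # average Gregorian year, per the discussion above

def followup_label(operation: date, followup: date) -> str:
    # Round the elapsed time to the nearest whole year and build the label.
    years = round((followup - operation).days / DAYS_PER_YEAR)
    return f"{years} year followup"

print(followup_label(date(2018, 3, 10), date(2023, 2, 20)))  # "5 year followup"
```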
However, while the basic answer appeals to the programmer in me, the other stuff appeals to the astronomer in me, so it’s all good.
A bit more discussion of the various years (there are at least three definitions useful to astronomers) can be found here.
Fair enough — though the Gregorian calendar we use is designed to track the tropical year, on average. Obviously it doesn’t quite hit it, leaving a residual slippage of about one day for every 3300 years.