Don't ask Cecil for directions.

Then please tell me what 1 minus 1 equals. Since you don’t seem to believe in the concept of zero, I’m really curious.

Still working on it… I think this is better!
…752 AUC…753 AUC…754 AUC…755 AUC…
…2 BC…1 BC…1 AD…2 AD…3 AD…
…Year −1…Year 0…Year 1…Year 2…

…2752 AUC…2753 AUC…2754 AUC…
…1999 AD…2000 AD…2001 AD…2002 AD…
…Year 1999…Year 2000…Year 2001…

The pennies analogy is somewhat flawed. It is true that, outside of computers, we don’t use zero when counting. A more fitting analogy is your age.

When you were born (which could be called your first birthday, but usually isn’t, because it would be misleading), you were 0 years old. After one year, you have a birthday and are now 1 year old. At the instant of your 10th birthday, you enter your second decade. You enter your 20s (AKA your third decade) on your 20th birthday (not counting your actual day of birth). You would start your second century the day you turn 100, not 101.
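
If it helps to see that arithmetic spelled out, here’s a little Python sketch (my own illustration of the age analogy, nothing official):

```python
def decade_of_life(age: int) -> int:
    """Ordinal decade of life for a 0-based age: ages 0-9 are the 1st decade."""
    return age // 10 + 1

def century_of_life(age: int) -> int:
    """Ordinal century of life for a 0-based age: ages 0-99 are the 1st century."""
    return age // 100 + 1

assert decade_of_life(0) == 1     # a newborn is 0 years old, in its first decade
assert decade_of_life(10) == 2    # your 10th birthday starts your second decade
assert decade_of_life(20) == 3    # your 20s are your third decade
assert century_of_life(100) == 2  # your second century starts the day you turn 100
```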

The difference here is that we don’t foolishly call a newborn 1 year old, the way the creators of the Gregorian calendar presumably would have.

The calendar we have left out a year. If it had not, we wouldn’t be discussing this. Everybody would agree that this (3rd) millennium, this 21st century, started Jan 1, 2000.

But because of that left-out year, people are taking a year from each decade, century, etc… and giving it to the previous one. I am simply saying that a more logical way to go about things is to admit that the first decade (and century, and millennium) got shafted a year. That seems more logical than perpetuating Aloysius Lilius’ mistake.

You’re all trying to make this much too complicated, and there are more straw men floating around this thread than you can throw a match at.

First of all, the nineties are 1990 through 1999. It has nothing whatsoever to do with the count of centuries, decades, or millennia.

The “first century” was the first 100 years, the “21st century” is the 21st 100 years. The third millennium is the 3rd 1,000 years. Since the first year was year one, the 21st century and 3rd millennium started on 1/1/2001.
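
Just to make the ordinal arithmetic explicit, here’s a minimal sketch (my own, assuming only the 1-based numbering the calendar actually uses):

```python
def century_of(year: int) -> int:
    """Ordinal century for an AD year numbered from 1: years 1-100 are the 1st."""
    return (year - 1) // 100 + 1

def millennium_of(year: int) -> int:
    """Ordinal millennium for an AD year numbered from 1: years 1-1000 are the 1st."""
    return (year - 1) // 1000 + 1

assert century_of(100) == 1       # the first century runs through year 100
assert century_of(101) == 2       # the 101st year begins the 2nd century
assert century_of(2000) == 20     # 2000 is the last year of the 20th century
assert century_of(2001) == 21     # the 21st century starts on 1/1/2001
assert millennium_of(2001) == 3   # as does the 3rd millennium
```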

Every single argument I’ve seen in this thread claiming that the 21st century somehow started in 2000 is reverse justification because people like round numbers. There’s no mathematical or historical logic to back it up, and making the 1st century only 99 years is just silly.

Yes, the 1900’s started in 1900. But the 20th century started in 1901.

I dispute this. Mathematical logic states that -1 plus 1 = 0, and that the difference between -1 and 1 is 2. Mathematically, there SHOULD have been a year 0.

That I’ll have to concede.

I’d say that it’s slightly less silly than perpetuating a mathematical error until the end of time.

Let’s say, hypothetically, that Pope Gregory had decided that Jesus was actually born in AD 4, and declared that AD 4 was the first year; the previous year was 1 BC. Would you then say this millennium started on Jan 1, 2004? Or would you just say that the first century got slighted by a few years?

Rockinstar, that’s right: pennies are not a winning route. Your argument is much stronger as “should we mess with our definition of 20 centuries, or just let one century take a hit?” With money, it’s exact groupings (dollars) that matter above all, but with time, alignment takes precedence over mere regularity (which is why in 1949, 1919, and 1582, countries saw a week or more of days disappear as they switched to the Gregorian calendar… with hardly a blink).

By the way, counting from “zero” does have obvious precedent in our measurement of time. The 24-hour clock starts at 00:00:00, not the nonsensical 24:00:00. (In fact, not even the American clock starts counting at 1.) Not just programmers but mathematicians are pointedly aware of the rationality of counting from 0. If we had started from year 0, the total time that has passed since the inception of the lord at the start of year x would simply be x years. Instead, the formula is the less divine (x − 1) years. This off-by-one business is a minor example, but nearly every fundamental insight into the mathematical universe (such as simple infinite series that awe-inspiringly sum to fundamental quantities like pi) has convinced mathematicians that in fact god counts from 0. And as with Gregory’s insight, the correction in the calendar is an easy one to make that virtually everyone has, if unbeknownst to them, embraced.
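
If you want that off-by-one spelled out, here’s a tiny sketch (my own framing of the elapsed-time point; the choice of first year is the only assumption):

```python
# With 0-based numbering (like the 24-hour clock), the label tells you how much
# time has already elapsed; with 1-based numbering you carry a "- 1" around.

def years_elapsed_at_start_of(year: int, first_year: int) -> int:
    """Whole years elapsed since the calendar's start at the moment `year` begins."""
    return year - first_year

assert years_elapsed_at_start_of(2000, first_year=1) == 1999  # the (x - 1) formula
assert years_elapsed_at_start_of(2000, first_year=0) == 2000  # the simpler x formula
```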

And every argument that the millennium started in 2001 is misplaced rationalism because people like to prove ‘everyone’ wrong. In fact, haha, how is demanding 100-year centuries not a love for round numbers?

I just came up with a solution: Since people feel the need to take a year from somewhere to make the first century 100 years, why don’t we take the year 1 BC? That makes as much sense as anything else argued here.

It’s the people who want 20 centuries and two millennia to start on a round number, versus the people who want one century and one millennium to be a round number. There’s nothing else of reason or consequence in this debate.

FWIW, from Wikipedia:

For dates before the year 1, unlike the proleptic Gregorian calendar used in the international standard ISO 8601, the traditional proleptic Gregorian calendar (like the Julian calendar) does not have a year 0 and instead uses the ordinal numbers 1, 2, … both for years AD and BC. Thus the traditional timeline is 2 BC, 1 BC, AD 1, and AD 2. ISO 8601 uses astronomical year numbering which includes a year 0 and negative numbers before it. Thus the ISO 8601 timeline is -0001, 0000, 0001, and 0002.
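
The conversion the quote describes is mechanical; here’s a quick sketch (mine, just following the mapping given above):

```python
def to_astronomical(ordinal: int, era: str) -> int:
    """Traditional BC/AD year -> astronomical (ISO 8601) year number.
    AD years keep their number; n BC becomes 1 - n, so 1 BC is year 0."""
    if era == "AD":
        return ordinal
    if era == "BC":
        return 1 - ordinal
    raise ValueError("era must be 'AD' or 'BC'")

assert to_astronomical(2, "BC") == -1  # 2 BC  -> -0001
assert to_astronomical(1, "BC") == 0   # 1 BC  ->  0000
assert to_astronomical(1, "AD") == 1   # AD 1  ->  0001
assert to_astronomical(2, "AD") == 2   # AD 2  ->  0002
```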

When you begin to argue in good faith, I will respond. As long as you are putting words in my mouth you may as well argue for both of us.

There wasn’t. And why should there have been? If there are 50 pennies in a roll or 12 eggs in a dozen or 100 years in a century, why go looking for some 0th year (or 0th egg or 0th penny) just so you can start your counts on an even number instead of ending on one?

There was no mathematical error. There is still no mathematical error. Centuries have 100 years in them.

Yep. You can’t start the 3rd millennium until 2,000 years have gone by. Starting it a year early is just wrong. Logically wrong; mathematically wrong; conceptually wrong; historically wrong.

You’re not even trying to be serious about this, are you? One hundred years is the definition of the word “century.” And it’s not proving “everyone” wrong. I believe that most people do understand that centuries have 100 years and that year 101 began the 2nd century.

And besides, how is it “misplaced rationalism” to help someone learn to count?

Because the number that comes after -1 and before 1 is 0.

Why do you continue with the counting objects analogy? (Look inside the computer that you’re on now, and if you have 4 SATA slots, you’ll most likely see that they are numbered SATA0 through SATA3.) How about the more relevant analogy of human age that I mentioned?

Let’s try an exercise:

a.) What is -5 plus 10?
b.) In what year would a person born in 5 BC have celebrated their 10th birthday?
c.) Why the discrepancy?

THAT is the mathematical error. There is a missing year.
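
Worked out, the exercise looks like this (a sketch of my own; the helper function just encodes the “no year 0” rule):

```python
def add_years_without_year_zero(start_year: int, years: int) -> int:
    """Move `years` years forward on the traditional calendar, written with
    negative numbers for BC (so 5 BC is -5). There is no year 0: the year
    after -1 (1 BC) is +1 (AD 1), so any crossing skips one number."""
    result = start_year + years
    if start_year < 0 <= result:
        result += 1  # hop over the year 0 that doesn't exist
    return result

# a.) Plain arithmetic: -5 + 10 = 5.
assert -5 + 10 == 5
# b.) But someone born in 5 BC celebrates their 10th birthday in AD 6.
assert add_years_without_year_zero(-5, 10) == 6
# c.) The discrepancy is exactly one year -- the missing year 0.
```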

Thank you. At this point, I’m happy to get an answer that doesn’t completely dance around the issue.

So you’re saying that you would celebrate new decades in xxx4 if the first century lacked the first 4 years? (I don’t want to be accused of putting anything in your mouth and have you run off.)

I’ve really been trying to figure out if the 01’ers think that way because they’ve always started counting with 1, or if it’s because the first century started with 1. Getting an answer thus far has been like pulling teeth.

Look, Rucksinator, all of your negative number math is completely irrelevant. It works like this:

We have an arbitrary starting point to our calendar. It’s called “1 AD” (or “1 CE”–I still use AD because it’s what I learned). It really doesn’t matter what the year before that was called. We’re not debating whether John the Baptist’s age is off by a year as calculated by Biblical scholars.

The only issue here is that there are 100 years in a century and 1,000 in a millennium. It doesn’t matter if the years are called 1, 2, 3… or 0, 1, 2… or A, B, C… or Ethel, Benny, George… All that matters is that the 101st year begins the 2nd century and the 2001st year begins the 3rd millennium. It just so happens (what a happy coincidence) that the 2001st year is called 2001, so the 3rd millennium begins with 2001.

It’s clear; it’s easy; it has nothing to do with negative numbers; it’s unambiguous; it’s obvious; it requires only minimal math and logic. I fail to understand why you still have a problem with it.

You seem to have forgotten about the BC side of the calendar.

InvisibleWombat, I know it’s 10x easier to argue with Rucksinator because of the dumb points he’s making (dude… repeating that integers go -1,0,1 doesn’t prove anything, and the reasons that mathematicians count from 0 are way deeper than that), but I’d like to see you try to battle my points (and not just address one sentence out of 10).

Btw, what I meant about ‘misplaced rationalism’ is that while it’s rather rational to say a century is 100 years and every century should start in a year ending in ‘1’, it’s actually more rational (in a holistic, practical, non-smartass way) to resize time periods by 0.002%, 0.3%, 1.0%, even 2.7%* just to keep alignment going, and to have ‘the 1900s’ and ‘the 20th century’ refer to the same period of time. This, in short, is why your rationalism is shortsighted and misplaced.

*This is in reference to: the leap seconds that get added to (or removed from) Dec 31st every few years, the extra day in a leap year, the missing year of the first century, and the missing days of October 1582.
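
For anyone who wants to check those figures, here’s a rough back-of-the-envelope sketch (my own reading of what each percentage refers to; the leap-second figure assumes one second against one day, which comes out nearer 0.001%):

```python
fractions = {
    "leap second (vs. one day)":               1 / 86_400,  # ~0.001%
    "leap day (vs. one year)":                 1 / 365,     # ~0.3%
    "missing year (vs. one century)":          1 / 100,     # 1.0%
    "days dropped in Oct 1582 (vs. one year)": 10 / 365,    # ~2.7%
}

for label, frac in fractions.items():
    print(f"{label}: {frac:.3%}")
```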

And you’ve forgotten about pickled pig’s feet.

The BC side of the calendar is completely irrelevant. It has nothing whatsoever to do with the topic at hand. It’s a distraction that seems to be confusing you. Start with the first year on our calendar and count. It’s really that easy.

Just to be sure we’re on the same page, are you sure you mean rationalism and not rationalization? They’re different, you know.

If you mean rationalism, then it’s a good thing, not a bad thing. It means I’m looking at the situation rationally. Why do you want “the 1900s” and “the 20th century” to refer to the same period of time? Rationally, it doesn’t have to. They are two different phrases meaning two different things. Why do you have such difficulty with that?

If you mean I’m rationalizing, then that’s not true. The term usually refers to making a decision and then trying to come up with some rationale to justify it. That’s not what I’m doing. The whole thing is quite straightforward. There are 100 years in a century. You’re the one trying to rationalize making it more complex by resizing time periods. I’m just rationally explaining the way it actually is, not rationalizing why we should change the meaning of “century”–but only sometimes.

Gems from Wikipedia:

Apparently the international standard for writing dates, ISO 8601, now uses a year numbering that includes a year 0.

Astronomers use the Julian Day Number system, a calendar that does not count years at all, only days. This lets them easily calculate the time difference between two events (which is annoying enough even when the two events happened in the same year but not the same month! Do you notice how no one besides astronomers cares about exact time distances?)
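
For illustration, Python’s built-in day count does the same trick (it’s not the astronomers’ Julian Day Number, just the same idea with a different starting epoch):

```python
from datetime import date

# date.toordinal() is a plain day count (Jan 1 of year 1 = day 1) -- not the
# astronomers' Julian Day Number, but the same idea with a different epoch.
a = date(1999, 12, 31)
b = date(2000, 3, 1)

print(b.toordinal() - a.toordinal())  # 61: no month lengths or leap years to juggle
```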

Until 1751, Great Britain thought the year started on March 25th. I’ll leave it as an exercise to the reader to think of analogies involving that.

But we do say it’s in its first year.

The Gregorian calendar has nothing to do with it.

Lilius had nothing to do with it.

Pope Gregory had nothing do do with it.

Yes. It’s called knowing how to count.