What would it prove if the hypothetical future people you’re talking about were dumb?
Decade The First, actually.
And really, people, it’s the SDMB of course we’re going to be pedantic about how the masses are being asses.
But yes, that is the core of the counting dispute: the calendar was fashioned around ordinal numbers, hence no year zero. To go by how Exiguus set it up, it’s Year Of The Lord the two thousand and seventeenth, not coordinate 2017 on some sort of X axis.
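With ordinals and no year zero, the Nth century is years 100*(N-1)+1 through 100*N. A throwaway Python sketch of the arithmetic, just to make it concrete (century_of is my own name, obviously not anything Exiguus wrote):

```python
def century_of(year):
    # Ordinal counting with no year zero: years 1-100 are the 1st century,
    # 1901-2000 the 20th, so the 21st can't start until 2001.
    return (year + 99) // 100

print(century_of(2000))  # 20
print(century_of(2001))  # 21
```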
:smack:
You’re absolutely correct.
:smack::smack::smack:
The rest of my post I stand by as written.
Be so kind as to point out the “dumb” thing they hypothetically do in my thought experiment.
Not having a zero, obviously.
Hell, when the Khmer Rouge took over Cambodia, they started their own calendar at Year Zero.
Sure, not the nicest people to use as an example, but as murderous as they were, at least they knew their math.
Ah, we’re all so shortsighted… what in the world will humans do in the year 10,000 when every computer thinks it’s Year 0000 / 1 BCE?
We need to get started on addressing the Y10K bug ASAP!
…well, maybe sometime after the UNIX 2038 bug is addressed, anyway.
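(For the uninitiated: a signed 32-bit time_t counts seconds from the Unix epoch and runs out in January 2038. A quick Python check of the arithmetic, nothing more:)

```python
from datetime import datetime, timedelta, timezone

# Largest value a signed 32-bit seconds counter can hold: 2**31 - 1.
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
print(epoch + timedelta(seconds=2**31 - 1))
# -> 2038-01-19 03:14:07+00:00; one second later the counter wraps negative
```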
You are mistaken.
No evidence has been presented to suggest that they were wholly without mathematical knowledge. However, you yourself have just provided rock-solid proof that they didn’t know jack shit about assigning a numeric label to the discrete items that they were counting.
Personally, I think introducing actual science to this debate is an insult to science.
Humbug. You can’t start counting a year until all 365 days have passed, just like you wouldn’t count a car being built in a factory as a full car until it’s actually assembled…
Sent from my 5051X using Tapatalk
This is why you are 0 years old in your first year of life.
The 60s is a cardinal measure. The 7th decade of the 20th century is an ordinal measure.
The arguments of the “2000 was the start of the 3rd millennium” crowd are incredibly pathetic: Majority of people? Round numbers? Not knowing the difference between cardinals and ordinals? Who cares?
All on a message board with a motto for fighting ignorance. Sheesh.
<nitpick> Pope Gregory was born a thousand years after Dionysius Exiguus … I swear Lil’ Ed is the last copy-editor on the planet … c’mon, SciAm, clean up your act … </nitpick>
Speak to Wolfpup; they will disagree with you. Apparently “the sixties” didn’t start until around 1963 or ’64 and lasted into the seventies (but they have yet to supply a definition for that period).
So it appears that periods of time do have a social and cultural element to them. Which is why “the 20th century” and “the third millennium” are most commonly understood as referring to 1st Jan 1900 onwards and 1st Jan 2000 onwards. In conversation with the average person in the street you’d have to be very specific if you wanted your preferred definition to be used.
There is a difference between a colloquial “the 20th century” as commonly accepted and “the 20th century as counted from year 1”.
If I sent you to the supermarket to get me some vegetables because I want to make a salad, and you came back without tomatoes, cucumbers or olives, you would be technically correct in what you did, but in a very real-world sense you’d be an idiot and I would have a sub-standard salad.
I guess it’s tempting to try to pull religion (and proud non belief therein) into every topic, but it’s not relevant here. There is no year 0 on the Western calendar (now used everywhere, at least for some purposes). Putting one in would change the date of every event BC/BCE. Which of those you call it or what you believe religion-wise has nothing to do with it.
What year the 21st century or current millennium started in is not a fact; it’s a convention. The more logical convention would be 1/1/2001. The overwhelming popular convention is 1/1/2000. In today’s society the latter wins, no contest. Back in, say, 1900 there was still more of a sense that ‘ordinary people should know their place’, and such a convention was chosen by the more knowledgeable, hence more references to the 20th century starting in 1901*. It’s a small example, but of a not so small phenomenon. And it’s a potentially important one even when the choice is one of convention, which again I would distinguish from cases where most people are factually wrong about something, and definitely distinguish from matters of belief or opinion with no correct answer.
*admittedly the issue of computer programs designed to represent years with just two digits was also part of the significance of 2000, but I think social/cultural change was part of it too.
True, but isn’t an infant’s age usually counted in months? Instead of saying he’s 0 years old, you’d say he’s three months old, or six months old. Even past a year old, people often say an infant is 14 months old or 18 months old.
True, but we’re stuck with it, right? I mean, it’s what we use.
“Alexa, on what day was the start of the 21st century?”
“Sorry, I don’t know that”
“Alexa, on what date was the start of the new millennium?”
“Sorry, I don’t know that”
Fat lot of help she is!
You should switch to Google Assistant.
“Hey, Google, when did the 21st century start?”
“Start date, January 1, 2001.”
“Hey, Google, when did the new millennium start?”
“Third millennium started on January 1, 2001, and ends on December 31, 3000.”
It’s what many of us use, but it is hardly the only calendar that people use today. If you have a Mac, try changing the date and time settings to see a multitude of different names for what we call today.
I think you are my spirit animal.
Interestingly, and sort of related - I am one of the few who had to deal with an actual Millennium Bug - that is, an issue that manifested itself on 1st January 2001.
In the dBase / xBase standard, the second byte in the file header records the number of years since the most recent turn of the century.
Microsoft’s dBase driver just counts it as the number of years since 1900. Lazy programming, I guess.
Oddly, dBase could cope with anything between 0 and 100 (it really should have been 0 to 99 or 1 to 100), but just breaks when it hits 101 - which is what the Microsoft driver started producing on 1st Jan 2001.
I was maintaining an Access DB that exported data to dBase III format (which in itself was bad enough) - I had to write an awful hack that rewrote the second byte back to the standard value inside the dBase file after Access had finished writing or updating it.
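Something along these lines, anyway - a from-memory sketch in Python rather than the original Access-era code (fix_dbf_year_byte and the filename are mine):

```python
def fix_dbf_year_byte(path):
    # Byte 1 of a .dbf header should hold years since the most recent
    # turn of the century (0-99); Microsoft's driver writes years since
    # 1900, which overflows to 101 on 1st Jan 2001.
    with open(path, "r+b") as f:
        f.seek(1)
        year = f.read(1)[0]
        if year > 100:
            f.seek(1)
            f.write(bytes([year % 100]))

fix_dbf_year_byte("export.dbf")  # run after Access finishes writing the file
```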