Clearly it’s not in common use though. Please, I’d ask you to refrain from being deliberately obtuse here.
I met my great-grandmother, who died nearly a centenarian in 1980. She was bedridden, but otherwise healthy and of sound mind. She taught me how to play a number of games; dominoes were her favorite.
Well, the oldest living Holocaust survivor was born in 1903 (so was well into adulthood at the time of the Holocaust), so it’ll be a while before we lose them all.
I don’t know about that. There’s always been plenty of popular confusion of “the XXth century” with “the YY00s” where XX = YY+1, and the distinction has survived. I don’t see anything special about now that should change that.
Ask a random person at your local pub or cafe if January 1, 2000 was the first day of the 21st Century or not and see how you go.
We certainly are sharing the planet with a LOT of people who are wrong, then.
What an odd way to use the word “definition”…
Um, that is EXACTLY what a definition is. A description of how people use (i.e., intend and understand the meaning of) a word when they communicate.
The “bar/pub/cafe test” is perfect. Primary definition: 19th century = 1800s = years that start with “18.”
(The first time I encountered the pedantic secondary definition was in a *Straight Dope* book, circa 1989. You know, the last year of the Eighties).
It seems you just claimed that “definition” has a precise meaning when used about words.
Meta! I was exaggerating, true, but only to counter the hardline prescriptivist presumption in the post I quoted.
There are various ways to define “definition” (i.e., various ways that word is used – intended and understood). That’s why I referred to “primary” and “secondary” definitions (which is just one way to classify them).
I can easily persuade any flexible person that the centuries start with the xx01 years.
Just start with: How long is a century? The first century ran from what year to what year?
Of course inflexible people exist who refuse to change their minds given facts and logic. I don’t want to be aligned with such people for a second, let alone a century.
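To make the arithmetic concrete, here’s a quick Python sketch (the helper names are my own, purely for illustration) contrasting the two readings; note that both spans are exactly 100 years long and differ only in their endpoints:

```python
# Sketch only: the "years that start with 18" reading vs. the ordinal reading.

def named_century_years(prefix: int) -> range:
    """Years that start with the given prefix, e.g. 18 -> 1800..1899."""
    return range(prefix * 100, prefix * 100 + 100)

def nth_century_years(n: int) -> range:
    """The nth century under the no-year-zero convention: 100*(n-1)+1 .. 100*n."""
    return range(100 * (n - 1) + 1, 100 * n + 1)

print(min(named_century_years(18)), max(named_century_years(18)))  # 1800 1899
print(min(nth_century_years(19)), max(nth_century_years(19)))      # 1801 1900
print(len(named_century_years(18)), len(nth_century_years(19)))    # 100 100
```

Either way a century is 100 years; the argument is only over where the first one started.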
Touché, I guess.
EXCEPT when language has arithmetic intertwined with it. Then the arithmetic HAS to prevail, regardless of the innumeracy of lazy thinkers who insist that there be extra significance to the calendar’s odometer ticking over to an extra “0.”
Otherwise, we’re doomed to a society where the claim above eventually, inevitably evolves into: “The ‘bar/pub/cafe test’ is perfect. Primary definition: 18th century = 1800s = years that start with ‘18.’”
ISTM that “pedantic” is often just shorthand that allows one to be dismissive of a statement that is “technically correct.” I’ll note that “technically correct” is widely recognized as the best kind of correct.
Irish. And yeah, it occurred to me right after I posted that, from the description, it could be either.
Not really. There’s a big leap from “1900 was the first year of the 20th Century” to “1800 was the first year of the 18th century because it starts with 18 lololol”.
And if we’re going down that road: Technically, “decimate” means to kill every tenth member of a group, but the word long ago changed its meaning to “nearly wipe out; inflict huge losses on.” Despite being mathematically based, the term’s definition has shifted. Similarly, a “century” of years in common parlance means the years xx00-xx99.
Try convincing one of your work colleagues that 1980 was the last year of the '70s and see how far you get. Despite the mathematical “correctness” of that statement, windmills do not work that way in everyday use.
Alright, Hermes. We know how badly you want that promotion to the next level of Bureaucrat.
Wait… You mean pi isn’t exactly 3.14? :eek:
“Common parlance” is mistaken.
I would never attempt that, because the statement is absolutely NOT correct. The '70s comprise those years, which, when named aloud, have the speaker saying the word “seventy.” The '70s did not all take place in the same decade of the twentieth century. 1971-1979 were the first nine years of the eighth decade of the twentieth century, but 1970 was the final year of the seventh decade.
So sorry if it’s challenging to keep track of this, but people need to suck it up. Or be wrong. It’s okay though; there’s probably enough room on the planet to accommodate them and those of us who are right.
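If it helps, here’s that decade distinction spelled out in a few lines of Python (a sketch; the function names are just mine for illustration):

```python
# "The '70s" groups years by their tens digit; the ordinal decade counts
# complete ten-year blocks starting from year xx01.

def named_decade(year: int) -> int:
    """E.g. 1970-1979 all map to 1970, i.e. 'the seventies'."""
    return (year // 10) * 10

def ordinal_decade_of_century(year: int) -> int:
    """Decade n of a century runs from year 10*(n-1)+1 through year 10*n."""
    return ((year - 1) % 100) // 10 + 1

assert named_decade(1970) == 1970 and named_decade(1979) == 1970
assert ordinal_decade_of_century(1970) == 7   # 1970 closes the seventh decade
assert ordinal_decade_of_century(1971) == 8   # 1971 opens the eighth decade
assert ordinal_decade_of_century(1980) == 8   # ...which ends with 1980
```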
That WAS a Futurama gag, wasn’t it? But I first read it here.
ETA: Make you a deal though: Let’s say global calendar reform takes place and everyone decides to start counting from the beginning. IFF it is decided that the first century of this new era is to be numbered 0, I will concede the point to the innumerate among us.
I don’t think it will be decided that way, though.
Only for bakers when using the formula pi R round.
So logically, mathematically zero would equal one hundred.
As I’ve already pointed out, we’re not speaking to the general public or to a random person at the pub. You’re not posting “outside the SDMB,” but on that very board, where pedantry is a way of life. So pointing out that the general public is mistaken about this fact is pointless. I might equally well say that you are being obtuse in failing to recognize the audience you are posting for. You don’t need to point out that the general public is ignorant; we know that already.
Not sure why anyone would argue about what a century is; it is 100 years.
AD does not start at a year 0, because we humans did not choose to do it that way, so years 1 to 99 are only 99 years; hence the first century runs through year 100, the new one starts at year 101 and runs through year 200, and so on.
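For what it’s worth, that convention is easy to write down. A minimal Python sketch (the function name is my own, just to illustrate the arithmetic described above):

```python
def ordinal_century(year: int) -> int:
    """Century of a CE year with no year 0: century n is years 100*(n-1)+1 .. 100*n."""
    return (year - 1) // 100 + 1

assert ordinal_century(1) == 1       # the 1st century starts at year 1...
assert ordinal_century(100) == 1     # ...and needs year 100 to reach 100 years
assert ordinal_century(101) == 2     # the 2nd century starts at year 101
assert ordinal_century(2000) == 20   # 2000 closes the 20th century
assert ordinal_century(2001) == 21   # and the 21st begins with 2001
```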