Let it go, please. You’re not convincing ANYONE. It is illogical, faulty, wrong, impaired, unreasonable, nonsensical BULLSHIT. You’re right about one (1!) thing only: our calendar is indeed based on Roman inclusive numbering.
Not that THAT has anything to do with the argument at hand. If there IS an argument, since 99% of the participants seem to agree.
And yes, I USED the word stupid, only to say I thought you were not. Now you’re using my words only to assign them to other people? Maybe I need to revise my opinion regarding your stupidity after all.
bj0rn, if you wish to continue this debate, which on your part is mathematically impaired, I suggest you first provide us with some academic records in the field of mathematics. If it is a high school degree or more, I will lose all faith in Icelandic education whatsoever.
Uh, not hardly, doof. The years 1 through 1001, inclusive, add up to 1001 years no matter what figure you use to represent “1” and what figure you use to represent “1001.” To say it’s “the same as saying 0-1000” is silly, and imagining that it equals 1000 is preposterous. The item being represented by “0” is an actual, physical year; therefore the years numbered 0-1000, inclusive, are STILL 1001 years.
“It’s my considered opinion you’re all a bunch of sissies!”–Paul’s Grandfather
Bjorn –
Say it is 1 through 11 instead of 1 through 1001, okay? They’re the same, right? Right. Okay, now, 1 through 11 = 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11.
Count how many numbers are there. Are there ten numbers, or eleven? There are eleven. So, that means that 1 AD through 11 AD is eleven years, and 1 AD through 1001 AD is 1001 years. Am I wrong?
Cessandra
It’s frightening how many crazies think the world is going to end in a few days. All of us smart people know that it’s not ending until next year.
Cessandra: i never said THROUGH.
this is how our calendar counts:
1,1 - 1,2 - 1,3 — 2 - 3 - 4 - 5 - 6 - 7 - 8 - 9 - 10 - 10,1 - 10,2…
this is the way we count:
0,1 - 0,2 - 0,3 — 1 - 2 - 3 - 4 - 5 - 6 - 7 - 8 - 9 - 9,1 - 9,2…
so 10 in roman inclusive is 1 - 11 (that is TO 11, not including it)
and 10 in arabic numeric system is 0 - 10 (that is TO 10, not including it)
got it?
coldfire:
and considering how you used the word stupid, i would have been paying you a compliment, right?
so the majority is always right? how come the grades of the majority of students in college aren’t all 10’s (A, for those who use that system)? (i know that was a cheap one)
so you have to have a degree to add and subtract? this is also a historical question (about the calendar), so don’t you want a history degree as well? when kids start counting, do you ask them for a degree to see if they are right or wrong? as for the numeric systems we are talking about (roman and arabic), it is sufficient to have knowledge of programming with arrays, for example in visual basic and c, where you reserve an array by indicating a certain number [10]. this would be considered numbers 1 - 10 (through) in visual basic but 0 - 9 (through) in c.
about that decade thingy:
we are counting the decades like this: the first year of a decade is the ‘0’ year, thus the 90’s end when 99 ends. but considering that the calendar started on ‘1’, the first decade must have been 9 years. so either we are counting wrong now, or the calendar is counting wrong.
i suggest the latter, and suggest that we “fix” it. it really isn’t that much of a “fix”, since we are already counting that way (like we have already fixed it).
The excitement’s over, folks
Please carry on with what you were doing.
There’s nothing more to see here.
Take my word for it - No good can come from further explanations.
Please disperse…
(Oh, well it’s too late to stop this. I guess the damage has been done.)
OK, I think I see where the confusion in bj0rn’s brain is.
No. The first year of “a decade” is whatever year you start counting from. The first year of any segment is the “0” year. So you’re half right. You’re trying to make the calendar define the word, rather than the other way around. A decade is ten years, no matter what.
Well, first of all, the calendar didn’t start on ‘1’, but in any case, it was ten years. All decades are ten years. Given that the first year is 1, then after ten years, we start again at 11. So decade 1 is the years 1-10, decade 2 is 11-20, decade 3 is 21-30.
You seem to be the only person on earth who has difficulty understanding that “the nineties” encompasses the years beginning with a “9,” but that they don’t constitute “the decade”; that runs from 1991-2000, given that the first year was 1.
Do you have any trouble telling time? When we get to “12” we start again at “1.”
“It’s my considered opinion you’re all a bunch of sissies!”–Paul’s Grandfather
From what I have read of bj0rn, he is much like a friend of mine. An idiot, in the sense that once he says something that is proven to be wrong, he changes the situation until he is “right”, only to be proven wrong again. The cycle continues until everyone has a headache, the offender is labeled the town idiot, and people just ignore everything he/she has to say. This has yet to pass…but probably isn’t too far off.
Bj0rn, if it’s any consolation to you, I think you’re right and the calendar should have started in year 0 instead of 1, so everything would have been less confusing. Still, this whole debate is pointless, since historians are saying that the actual birth of Christ was around 5 or 6 B.C. Mostly what people are celebrating is all of the zeros and the y2k bug, not the 2000th birthday of JC.
ok, how stupid (cheers coldfire ;)) can people get here?
duh!!!
first of all, the calendar DID start on a ‘1’! the year is called 1 AD.
second of all, yes, you are right: the first decade would have been 1 - 10 (that’s through 10), and adding 1990 years to that, the 90’s decade should be 1991 - 2000 (that’s, again, THROUGH 2000). but it’s strange, that’s not how we count.
what did you just say? rephrase that statement, will you?
do we now? isn’t that a bit narrow-minded? here, after we get to ‘12’ we count to ‘13’ and all the way up to ‘23’; then when the clock reaches ‘24’ we do not spell out ‘24’ but start counting from ‘0’ again and add another day.
nothing i have said in this thread has been proven wrong, so that would make you, not your friend, the idiot.
thank you kindly, david, god of frogs. and yes, this debate is pointless up to a point. pointless because historians will not tell me anything about the birth of jesus, and not because of the 1999th birthday of jesus, but because the point is: the way we count, and the way our calendar counts.
these are two different things. our calendar counts using roman inclusive, but we count using arabic numerals (as previously pointed out in this thread). the question is: “can we possibly, without very much confusion, use both counting systems at once?”. obviously, reading this thread, NOT.
why?
although the systems both work separately, they don’t work together. proof being that silly talk about decades that none of you seems to understand (excluding those who do understand, but they are fewer than the ones who don’t).
point taken from the op of this thread: the year 1999 can not be the end of the 90’s, because if you were to subtract 1990 years (199 decades) from the current one, that would result in 9, which clearly isn’t the end of the first decade (1 through 10).
the same goes for the second millennium and the nineteen-hundreds.
we have two facts here, please take notice:
roman inclusive:
2000 years have passed when the year 2000 ends and 2001 begins
this is the way our calendar counts
arabic numeral:
2000 years have passed when the year 1999 ends and 2000 begins
this is the way we count today.
obviously these two systems have difficulties “communicating”, so we should only be using one of them. and i ask you, which one should we be using?
Bj0rn, I will try to rephrase something said above: the “90’s” consist of the years with the numeral “9” in the second column from the right. That would be 1990 to 1999. The year 2000 may technically be the end of the “90’s”, but it sounds stupid, and we as a society reject that premise. We recognize what you are saying, BUT, like someone trying to correct the phrase “watching the sun rise,” you are coming off like an anal know-it-all.
When you are done with this crusade, why don’t you go change all the signs celebrating the Battle of Bunker Hill, the Landing at Plymouth Rock, etc.?
SOUNDS STUPID?!? are you telling me you understand what i’m saying but reject it because it SOUNDS stupid? try bloody well reading what i said. i did not make any statement based on my personal beliefs, just simple and plain facts of life, and if you can’t agree with that, then what are you lacking?
i posted the simple mathematical statement that the 90’s couldn’t be ’90 - ‘99, because that would mean the first decade should have been 0 - 9, but since there was no year ‘0’ that is not possible.
your interpretation…
i don’t know how often i have repeated myself here. difficulties in communication or not, you should have figured out by now what i am talking about.
Stick with me here, bj0rn–in the year we consider “AD 1” or “1 CE,” they didn’t start counting. It was a back-formation from hundreds of years later.
That’s correct. And, in fact, it was. 1 through 10, inclusive.
Bingo.
Yes, it is. If I have a classroom full of children, and ask them to count off in groups of ten, they don’t start counting “0,1,2,3 . . .” They start with “1.”
“The nineties” are the years 1990, 1991, 1992, 1993, 1994, 1995, 1996, 1997, 1998 and 1999.
“The nineteen-hundredth decade” comprises the years 1991, 1992, 1993, 1994, 1995, 1996, 1997, 1998, 1999 and 2000.
It really isn’t confusing at all.
“It’s my considered opinion you’re all a bunch of sissies!”–Paul’s Grandfather
the difference being that ‘1’ in arabic numerals means 0 - 1, but in roman inclusive it means 1 - 2.
obviously not to you. although there is one fault in your argument, one and the same as i have previously pointed out. if you were to define the decades of the first century of our calendar, what would the years of the first decade (the 10’s) be? it would be 1 through 10, would it not? like it has been so brilliantly pointed out before, a decade is 10 years, no matter what. so the 20’s would, using the same logic, be 11 through 20.
30’s = 31 through 40.
40’s = 41 through 50.
50’s = 51 through 60.
60’s = 61 through 70.
70’s = 71 through 80.
80’s = 81 through 90.
90’s = 91 through 100.
is this confusing?
no matter if the year 1 AD was only named a few hundred years after it actually occurred.