9 comments:
From dictionary.com:
dec⋅ade
/ˈdɛkeɪd; Brit. also dəˈkeɪd/
–noun
1. a period of ten years: the three decades from 1776 to 1806.
2. a period of ten years beginning with a year whose last digit is zero: the decade of the 1980s.
3. a group, set, or series of ten.
------------
Certainly, a decade ends next December 31 as well, but a decade by all three definitions ends tonight. Your position may be firm, but mine is firmer.
Bret, I think that because no year was ever designated zero, we must count decades by adding ten to the year one, with the result that all subsequent decades end with the numeral one. Similarly, the second millennium didn't start until 2001.
Still, I like my decades to end in zeros, so even if it isn't strictly correct, at midnight, I'll say hello to a new decade.
Interestingly, similar confusion with this kind of calculation came up in our family because the custom in Albania, where my parents came from, was (and may still be) that a child is one when it's born and two on its first birthday.
erp,
I understand the calendarist's logic but find it lacking because the start of the calendar is arbitrary anyway, so we can call any year we like the start of a decade.
I threw a new millennium party for 8 years in a row starting with 2000. A millennium is also just 1000 years in a row, so tomorrow starts a new millennium running from Jan 1, 2010, to Dec 31, 3009.
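A quick sketch in C of the inclusive counting behind that claim (the variable names are my own illustration, not anything from the comment): counting both endpoints, 2010 through 3009 is exactly 1,000 calendar years.

#include <stdio.h>

int main(void) {
    /* Inclusive year count: both the first and the last year belong to the span. */
    int first = 2010, last = 3009;
    printf("%d years\n", last - first + 1);   /* prints: 1000 years */
    return 0;
}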
Bret, of course you're correct, but we humans have a compulsion to put things into neat categories, so we've designated decades to mean ten-year periods ending in zero.
No one counts to ten from zero to nine. Wouldn't that make 2011 the first year of the decade? That having been said, everyone seems O.K. with referring to the first decade in the past tense: the '80s from '80 till '89 and the '70s from '70 till '79. I guess you can "divvy" up time any way ya want. It's good we can recognize the technicalities when need be.
Time is continuous, not discrete. We don't count years, we live them.
Would we naturally group 0.0000 to 9.9999... as a length of 10, or 1.0000 to 10.9999... as a group of 10?
I choose the former.
It's also a little bit like C versus FORTRAN. In C, a[0] is the first element. In FORTRAN, a(1) is the first element. I prefer programming in C.
It's also a little bit like floors in a building. In Europe, the ground floor is 0. In the U.S. the ground floor is 1. I prefer the European approach.
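A minimal sketch in C of the two conventions in the comment above (the array and variable names are my own illustration, not from the post): zero-based indexing labels the ten years 2000-2009 with 0-9, while one-based counting labels 2001-2010 with 1-10.

#include <stdio.h>

int main(void) {
    /* Zero-based, C-style: the ten years 2000..2009 live at indices 0..9. */
    int decade[10];
    for (int i = 0; i < 10; i++)
        decade[i] = 2000 + i;
    printf("C-style decade:     %d-%d\n", decade[0], decade[9]);   /* 2000-2009 */

    /* One-based, FORTRAN-style counting starts at 1, giving 2001..2010. */
    int first = 2001;
    int last = first + 10 - 1;
    printf("Calendarist decade: %d-%d\n", first, last);            /* 2001-2010 */

    return 0;
}

Either way the slot count is ten; the disagreement is only over which year gets the first label.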
You're free to prefer it (although how the first floor is not the first floor is beyond me). But you still have to conform in order to be understood.
As for the decades, I have no problem with my father's functionalist explanation, given to excuse ending the millennium on December 31, 1999, that "we're in it for the zeros." Just don't pretend that your preferred way is also objectively correct. My preferred way is objectively correct.
So, apparently when you're 50, you're still in your 40s?
Kevin;
As Bret notes, we code slingers count from 0 to 9.
As for being in my 40s when I'm 50, I actually plan on being in my teens when I am 50.
Personally, I think we should shift everything up a year and create a year 0. Then everyone would be happy!