Slouching Toward The 21st Century

2014

* * *

2012


Has the 21st Century begun yet?

That’s a serious question, man.

For me, the jury’s still out. There’s no smoking gun to indicate that we’ve moved forward into a new epoch, but as we kick off the year 2014, the evidence is starting to build toward critical mass.

Historians say that the 20th Century didn’t really begin until 1914 and the first shot of World War I. Of course, what they’re really saying is that the 19th Century didn’t end until 1914. Life is lived forward, while history is lived backward.

Since we don’t have any world wars looming on the horizon at the moment and the 2000s aren’t getting any younger, we need to make an executive decision on when this new century begins.

I’m still not sure we’re there yet, but I nominate the year 2012 as the earliest possible starting date, and not a moment before.

* * *


Let me spell out my case:


1. The first ten years of a new century, the so-called “aughts,” don’t really count as a decade at all, and never have. This makes sense for a multitude of reasons. For starters, no one even knows how to say them right: the aughts, the zeroes, the ohs, or what? They had the same problem in the first decade of the 1900s, and a hundred years later there’s still no decent name for them. The 1920s were the first decade of the 20th Century to have their own identity while they were still going on.


2. Despite the sharp numerical delineation between one century and the next, and especially between one millennium and the next, the first day of the new century is just another day like all the other ones before it. Nothing is magically transformed overnight. Time is a perfectly flat line that stretches on without variation for eternity in both directions. Only when intelligent life forms distinguish one section of time from another does it take on any unique characteristics.

The world of Jan. 1, 2000 looked no different from the world of Dec. 31, 1999. When you went outside on the first day of 2000, the street was still covered with the same oil spots from the leaky engines of parked cars that it had been in the 1900s. Similarly, on Jan. 1, 1900 the street would have been sullied with horse dung, just like the streets of the 1800s.

Do you see where I’m going with this?


3. On paper, we slice up our centuries into tenths and call them decades. But holistically, we should break them down into halves, quarters, and eighths (furlongs, if you prefer). It’s more natural and cyclical that way. It also means we passed the 1/8 mark on July 1, 2012, placing us a furlong into the new century (the quick arithmetic is sketched just after this list). A decade feels arbitrary, whereas half of a half of a half feels organic. If 2006 is too soon to start the new century, 2012 feels about right.


4. I don’t know about you, but 2012 is when I gave up on calling the year “two-thousand and twelve” and started calling it “twenty-twelve.” 2010 was still “two-thousand ten” and 2011 was “two-thousand eleven,” but for whatever reason, 2012 became “twenty-twelve,” and for the rest of my natural life the years will begin with “twenty-” rather than “two-thousand-.”

The year 2000 was one of the most anticipated, unfathomable futuristic unknowns of the last third of the 20th Century. It was far more than just a number. As the big day approached, we actually convinced ourselves that all the computers would grind to a halt and bring civilization to a standstill. What was that really all about? I’d say it was the irrational fear of a new millennium rather than rational concern about a two-digit date bug.

In homage to the millennial juice that the year 2000 had, we continued to address the subsequent years by their formal names as well. 2001 was “two thousand and one,” 2002 was “two thousand and two,” and so on. At some point we dropped the “and” and just started saying “two-thousand five,” “two-thousand six,” etc.

But it would have been inconceivable to call 2007 “twenty-seven” or even “twenty-oh-seven” in real time. We have no problem pronouncing the zero in historical hindsight (think of how we say 1906, 1908, etc.), but for some reason it didn’t seem natural to enunciate the zero while we were actually living through the Aughts. Maybe it’s because in real time we have such an enhanced sense of the new century’s majesty that we don’t want to clip it from the stately “nineteen hundred” or “two-thousand” to the plain “nineteen” or “twenty.” As noted, life is lived forward, but history is lived backward.


5. 2012 was an election year and an Olympic year, which meant that “2012” itself was going to be spoken out loud more than the names of the three previous years combined. Every news broadcast mentioned the 2012 Olympics this and the 2012 Elections that. When you start saying a word out loud that much, you become very aware of how it sounds. That year, it seemed, the wisdom of the crowd determined that “twenty-twelve” sounded more appropriate than “two-thousand twelve.”
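
If you want to check the furlong math from point 3, here is the back-of-the-envelope version. Python is just my scratch pad here, and the 365.25-day average year is an approximation, nothing more:

```python
from datetime import date, timedelta

# One eighth of a 100-year century is 12.5 years. Using the average
# Gregorian year of 365.25 days keeps the arithmetic simple.
century_start = date(2000, 1, 1)
furlong = timedelta(days=int(100 * 365.25 / 8))  # roughly 12.5 years of days

print(century_start + furlong)  # 2012-07-01
```

Half of a half of a half of a hundred years is 12.5, and twelve and a half years after New Year’s Day 2000 drops you on July 1, 2012.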


2013


2012 was a landmark year. 2013 was nothing of the sort. 13 is an oddball number to begin with. The numerals 1 to 12 each have a certain niche: they represent the hours on the clock and the months of the year. A dozen is a common unit of sale, and there are twelve inches in a foot. Despite the fact that we have ten fingers and ten toes, we’re very comfortable thinking in twelves.


Twelve, in fact, is the last number in the English language made of a single morpheme. “Thirteen” is a compound word built from three and ten. Fourteen is four and ten, and so on. But twelve, like eleven, ten, nine, and all the numerals preceding it, is not a compound word; it’s an individual unit, a single morpheme.


So what do you do with 13? In the family of whole numbers, 13 is the redheaded stepchild. Where hours of the day and months of the year are concerned, twelve’s company, and thirteen’s a crowd. It’s the unlucky number for a reason.


Unless it was a wedding or graduation year for you, 2013 is already being forgotten.


2013 is the devil’s interval, a chronological instability that wants to resolve either down to 2012 or up to 2014. Resolving down to 2012 takes us back into the first furlong of the 21st Century, which as we’ve discussed is really just an extra appendage of the 20th Century. Resolving up to 2014 propels us forward into the great unknown: the unseeable future, the uncharted 21st Century.


2014


2014 marks 100 years since the start of the First World War, the event that kick-started the 20th Century. We’re reaching some other milestones as well. It’s been 50 years since Kennedy was assassinated and 40 years since Patty Hearst was kidnapped. Pretty much all of what we call the Sixties happened between those two milestones, and now all of it is officially a long time ago. When our grandchildren grow up, the 1960s will be as ancient to them as the Civil War was to us.


A quick rule of thumb: If you have 20 years of memories accumulated in the previous century, it will take you 20 years in the new century to move past them. If you have 30 years of 20th Century memories, it will take you 30 years to get past them, and if you have 40 years of memory from the 1900s, you probably won’t live long enough to ever get out of the 20th Century.


There are precisely five people alive today who were born in the 1800s. Eight months ago, there were eleven. By the end of 2014, there might very well be none. Another milestone will have been reached.


One more litmus test: When I say, “the last century,” do you think of the 1900s or the 1800s? The answer to that probably depends on how much 20th Century consciousness you have under your belt.

* * *

The most exciting thing about the first furlong of the new century is the novelty dates peppered throughout the calendar:

Dates like 01/01/01, 02/02/02, and 03/03/03

Or 04/05/06, 07/08/09, and even 09/08/07

That first kind of novelty date ended on 12/12/12. According to my calculations, the second kind of novelty date will end later this year, on 12/13/14. Then it’s all business for the rest of the century. The novelty is over. The century begins.
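
And if you’d like to audit those calculations, a throwaway sketch (Python again, purely as a scratch pad) enumerates both kinds and shows why each run ends where it does:

```python
# Print both kinds of novelty date in MM/DD/YY style.

# Repeated dates, 01/01/01 through 12/12/12: the month field,
# which can never exceed 12, is what ends the run.
for n in range(1, 13):
    print(f"{n:02d}/{n:02d}/{n:02d}")

# Ascending consecutive dates, 01/02/03 through 12/13/14: the month
# caps this run too, so the very last one lands on 12/13/14.
for mm in range(1, 13):
    print(f"{mm:02d}/{mm + 1:02d}/{mm + 2:02d}")
```

(By the same logic, the descending variety like 09/08/07 ran out even earlier, on 12/11/10.)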

* * *

Here in 2014, the people old enough to run their own lives, not to mention run the world, are still products of the 20th Century. The new century belongs to people too young to remember the old one, which means that the first 15 to 20 years of the 2000s belong to us, the people of the 1900s.

These first couple of decades of the 21st Century are a holding pattern until the current crop of kids gets old enough to start filling the ranks of society. And when they do, watch out, because your 20th Century consciousness will be as outmoded as the horse-and-buggy, pre-lightbulb people of the 19th Century were once the 20th Century got going.

You’ll be a curious relic that the young will flock to for stories about the days before the Internet and cell phones, but slink away from once you start carrying on too long about those times like they still matter.

* * *

Whenever I have unresolved questions about modernity, I turn to the Old Testament for comfort. For instance, the story of the Exodus explains that the Israelites had to wander the desert for 40 years, not because it took that long to get where they were going, but because that’s how long it took them to forget where they came from.

That bit of Bronze Age folk wisdom is something to consider in 2014 as we kill the time between eras, waiting for the 21st Century to begin.
