Are you prepared to ring in the new millennium tonight? Or did you do that last year? Will the "real" millennial New Year please stand up?
Once every hundred years people dust off the controversy about when the centuries really begin and end. Times change, but adding an extra "zero" to the situation has only intensified the issue.
Delving into the quirks of the calendar quickly reveals a bizarre mix of astronomy and history.
The ancients measured time by the rising and setting of the sun, the phases of the moon and the seasons of the year. As lives became more complicated, they wanted to make sure they remembered to move uphill before the big rains or could get back from the long hunt in time to plant crops.
Thousands of years ago, the Chinese and the Mesopotamians both started tallying seven days in a week (perhaps reflecting the quartered phases of the moon) and 12 hours each of day and night. The ancient Egyptians figured out the solar year to predict the annual flood of the Nile.
An inconvenience emerged: the periods of the day, the lunar month and the solar year did not divide evenly into each other. In other words, although all the celestial bodies dance together, choreographed by gravity, each follows its own rhythm.
This situation irritated the Romans, who preferred order in their empire, grammar and natural phenomena. They set up a splendid calendar to reconcile the sun and the moon, but over the centuries its months drifted away from their appointed seasons.
In 46 B.C., Julius Caesar sat down with an astronomer and worked out a system very similar to what we have now, with 12 months of assorted lengths and leap years. To put the months back in step with their seasons, they threw in extra months for the year of adjustment. It ended up with 445 days and was henceforth called "the year of confusion."
Six centuries later, the empire had become Christian, and the Church wanted a better way to date the history of the world than from the founding of pagan Rome. The pope set a monk named Dionysius Exiguus (Dennis the short, in English) to figure out a Bible-based chronology.
Little Dennis got out the vellum volumes, chewed on the end of his quill and set to work. He concluded that years should count from the birth of Jesus, and he placed that pivotal event at Dec. 25 in the year 753 "from the founding of the city" (Rome). He decided to restart history eight days later, noting the convenience that the baby's "feast of circumcision" would fall on Jan. 1, the Roman New Year.
And he christened that year as the year of our Lord -- one.
Unfortunately, Little Dennis made two mistakes.
First, he goofed on placing the Nativity. Among other technical difficulties, Roman records show that Herod, the heavy in the Christmas story, died in 4 B.C. That means that little Jesus spent an indeterminate chunk of his childhood "before Christ."
Second, Exiguus forgot to put in a year "zero." It was a simple mistake -- after all, zero had not been invented yet -- but it came back to haunt timekeepers later and plays a major role in this millennial confusion.
By the 16th century, the calendar was getting out of whack again. The years were slightly too long, messing up the placement of Easter. Pope Gregory XIII had a team work on the problem and issued revisions in 1582. To realign the days, he ordered everyone in Europe to skip from Oct. 4 to Oct. 15 that year and henceforth to drop leap days in years divisible by 100 but not divisible by 400.
In other words, 2000 was a leap year, but not 1900 or 2100.
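Gregory's rule is simple enough to write down in a few lines. Here is a minimal sketch in Python (the function name is ours, not part of any calendar standard):

```python
def is_leap(year):
    """Gregorian leap-year rule: every 4th year, except centuries
    not divisible by 400 (so 2000 qualifies, but 1900 and 2100 do not)."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Spot checks against the article's examples:
print(is_leap(2000))  # True
print(is_leap(1900))  # False
print(is_leap(2100))  # False
```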
Devout Jews and Muslims use lunar calendars, and the Eastern Orthodox churches retain the Julian calendar, but most of the modern world has adopted the Gregorian calendar. In its English version, it holds onto a weird pagan blend of Roman months and Norse days.
But it is not altogether accurate. It will lose two days, 14 hours and 24 minutes over 10,000 years.
Meanwhile, logical people count each century as ending, not beginning, in the year with the double zeros. After all, when we count our fingers, we count one through 10, not zero through nine. And if you do not count the 100th year as part of the century, then that means the first "century" was only 99 years long.
At the same time, no one can ignore the simple milestone of rolling over the historical odometer from 1999 to 2000. People may claim 2000 belongs to the 20th century, but they certainly cannot call it part of "the 1900s."
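Because Dionysius began counting at year 1 rather than year 0, the arithmetic above has a small offset built in. A short sketch (the function names are illustrative) shows how centuries and millennia fall out under the "logical" counting:

```python
def century_of(year):
    """With no year zero, century N runs from (N-1)*100 + 1 through N*100,
    so the year with the double zeros closes its century."""
    return (year - 1) // 100 + 1

def millennium_of(year):
    # Same offset at the thousands: 2000 ends the second millennium;
    # 2001 opens the third.
    return (year - 1) // 1000 + 1

print(century_of(1900))     # 19
print(century_of(2000))     # 20
print(millennium_of(2000))  # 2
print(millennium_of(2001))  # 3
```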
Stephen Jay Gould, Harvard professor and well-known contemporary science writer, speculated about the situation in his 1997 book, "Questioning the Millennium," and blamed the paradox on Dionysius Exiguus.
"If our shortsighted monk had only begun with a year zero, then logic and sensibility would coincide, and the wild millennial bells could ring forth but once and resoundingly at the beginning of Jan. 1, 2000," he wrote. "But he didn't."
Times have changed.
Long ago, sages lined up stones with solstice sunrises and lay awake at night tracking wandering stars.
Now, their descendants talk of the theoretical physics of time, evoking forces such as hyperdimensional strings, exotic subatomic particles and the birth of the universe. The sages of our time have recast the measurements of Babylonian priests into a modern idiom.
The second, a relic of the awkward 12 and 60 subdivisions of ancient Mesopotamia, is now defined as the length of time it takes a particular frequency of microwave emission from an atom of cesium-133 to oscillate 9,192,631,770 times.
The atomic clocks of the year 2000 are now accurate to within one second in 6 million years.
We still cannot say with certainty if tonight marks the "real" beginning of a new millennium, but we can recognize a good excuse to party. After all, time, however we measure it, is indisputably fleeting.