
Clash of the Time Lords: who will own the measure of our days?

It is New Year's Eve, 2005, and the night sky above the U.S. Naval Observatory in Washington, D.C., is crisp and clear. On this moonless evening--the second new moon of the month--all that is white stands out against the blackness: the stars, the stark white dome of the observatory itself, the huge discs of the satellite dishes nearby, the muted shadow of the vice president's residence (built in 1893 to house the observatory's superintendent, and now reportedly one of the few intentionally obscured spots in Google Earth's view of Washington). On a rooftop adjacent to the observatory stands a pole holding a metal Time Ball, ancestor of the glittering Waterford Crystal ball that in a few hours will ceremoniously descend in New York City's Times Square. The Naval Observatory's Time Ball was the first such device in the United States, and beginning in 1845 it dropped every day precisely at noon. Its signal informed all of Washington, and all the ships on the Potomac, and soon afterward the entire country, of the precise time by which to set their clocks and chronometers, in order to accurately determine their position by the stars. The Naval Observatory has been telling the nation--and now the world--precisely what time it is ever since.

Inside one of the observatory's more humble spaces, a rabbit's warren of 1960s-vintage fluorescent-lit hallways and offices called the Time Service Building, time is being tracked more exactly than ever before in human history. What must have seemed like astonishing accuracy in the day of the Time Ball (to a few tenths of a second) has given way to the pinpointing of a billionth of a second, an unimaginable unit called the nanosecond. Amidst that exactitude, however, on this New Year's evening a glitch is about to occur--a carefully planned, elaborately executed glitch that nonetheless has time directors, or time managers, as they're sometimes called, from Tokyo to Paris to Cairo sitting on the edge of their seats.

Here, in an unremarkable-looking hallway, a small crowd is beginning to gather--an incongruous mixture of full-dress Navy officers, khaki- and polo-shirt-clad technicians, and my guide for the evening, Dennis McCarthy, looking natty in a tuxedo (his ultimate destination tonight is a New Year's Eve party). Earlier this year, McCarthy officially retired from his USNO post, which carried the Orwellian title of director of the Directorate of Time, but he has lingered on as a consultant, apparently unable to tear himself away from watching the clocks. Now a half-dozen or so of us are standing in front of a wall of windows, looking into a chamber full of atomic clocks, the most precise clocks on the planet. They don't resemble clocks at all, however; they're a surprisingly mundane-looking set of metal cabinets with a few dangling cords reminiscent of old-fashioned telephone switchboards. Embedded in the cabinets are several digital readouts counting off the seconds. The chamber must be kept environmentally stable, which is why we are banished to the hallway, peering in like a group of anxious new fathers clustered at the maternity-ward window. Behind us is a smaller window and a smaller chamber, housing the queen bee: the Master Clock, which averages out the readings of the other dozens of near-perfect clocks and keeps the most perfect time. It is 6:30 P.M., and we are all waiting for the glitch, waiting to see if, in one astronomer's words, there will be any "obvious Airbuses plummeting from the sky" when the clocks tick over to 7:00 P.M.

At 7:00 P.M. in Washington it will be midnight in England, and hence midnight Greenwich Mean Time (although for most of the world except England, GMT is now officially UTC, Temps Universel Coordonné, or, in English, Coordinated Universal Time). Every super-precise clock in the world must experience the glitch at the exact moment of midnight UTC--that is, at 9:00 A.M. local time the next day in Tokyo, 1:00 A.M. in Paris, 2:00 A.M. in Cairo. The glitch we're waiting for is a leap second, an extra second of time that no clock really wants to count, since it goes against the essential logic of timekeeping that we, and all clocks, understand. The hour before this particular midnight will contain 3,601 seconds rather than 3,600, and thus the clocks must register 23:59:60 before any of us can enter 2006. For a clock, that's a very unnatural act indeed.
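The mechanics of that unnatural minute can be sketched in a few lines of code. This is a toy illustration, not a real timekeeping API: it generates the second-by-second labels of the final UTC minute of a leap-second day, and counts the seconds in the final hour.

```python
# Toy sketch of a leap-second-aware clock display (not a real API).
# On a leap night the final UTC minute holds 61 seconds, so the labels
# run ...:58, ...:59, ...:60 before rolling over to 00:00:00.
def final_minute_labels(leap=True):
    n = 61 if leap else 60                    # the leap minute has an extra second
    labels = [f"23:59:{s:02d}" for s in range(n)]
    labels.append("00:00:00")                 # rollover into the new day
    return labels

def final_hour_seconds(leap=True):
    # 59 ordinary minutes plus the (possibly lengthened) final minute.
    return 59 * 60 + (61 if leap else 60)

print(final_minute_labels()[-2])              # 23:59:60
print(final_hour_seconds())                   # 3601
```

On an ordinary night the same functions yield 23:59:59 as the last labeled second and 3,600 seconds in the hour.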

"This is happening all around the world," muses McCarthy, a distinguished-looking man with a salt-and-pepper mustache, hazel eyes, thinning white hair, and a slightly ironic smile. "People are standing around rooms like this, watching, waiting, worrying about whether this will work." As we approach zero hour Universal Time, the ambient murmuring dies down. The CNN cameraman who is filming through the glass checks his equipment. Soon there is little sound except the recorded voice announcer, intoning the time every fifteen seconds. "U.S. Naval Observatory Master Clock. At the tone, Coordinated Universal Time will be twenty-three hours, fifty-seven minutes, fifty seconds." All the atomic clocks have been programmed earlier in the week to count the unnatural sixtieth second; the voice announcer is the only element of the leap-second glitch that must be adjusted by hand. USNO engineer Blair Fonville begins to fret. His responsibility is making sure the voice announcer--the official voice of the USNO, which in turn disseminates the official U.S. time--registers the leap and hits the zero second right on the dot, using technology that seems very last century. For the last six hours or so he has been hovering over two tapes, one containing a pre-leap count and one that will start at the second after the leap. At zero hour he will turn one tape off and one on--a little wrist-flick maneuver that he confesses to having practiced earlier in the week. As is true for many of the technicians in the cramped little hallway, Fonville is about to witness his first leap second at the USNO. There hasn't been one in seven years, the longest period without a leap since the system began in 1972, and anxious eyes are watching here and abroad.

In the midst of the last minute my muscles tighten instinctively, and there seems to be an unconscious intake of breath across the room. Fifty-nine, sixty--a silent click on the digital readout in the sealed-off clock room, and an almost imperceptible instant of dead air in the voice-announcer tape, seeming strangely to echo our shared instinctual gasp, then: "Coordinated Universal Time is zero hours, zero minutes, zero seconds." Time has jumped, tripped a bit, taken an extra step on the way into the New Year, and everything seems exactly the same. We breathe again. Engineers and technicians run in from other rooms, issuing reports: this system is up, that system is working, everything seems operational, no airplanes fell from the sky, nothing exploded. For the next two minutes or so, the number of "hits" to the Network Time Protocol--the USNO's Internet service that provides precise time to any computer that requests it--swings from the 4,000 range up to 16,000, then settles back down. In a charmingly anachronistic touch, the three USNO servers are named Tick, Tock, and NTP2, even though USNO clocks haven't actually ticked in quite some time.

Everything about this little drama of the leap second feels like an anticlimax except this: the punch line about planes falling from the sky is not entirely a joke. The theories Einstein proposed a century ago about the relationship between time and space have since been put into operation on a grand and sophisticated scale, and the seemingly tiny leap second is, at least for the moment, an essential cog in that operation. But the leap second is also a controversial cog, the workings of which have recently become fodder for scientific power struggles, diplomatic end runs, and conspiracy theories. The ripples from those struggles and maneuverings extend from the sealed-off atomic-clock labs of the USNO and other national observatories out into the reaches of space, and back into the shadows of history. If there is a joke here, it is on us clever humans, and it goes like this: In the past half-century we have discovered so much about time that it has become increasingly difficult to know how best to keep it.

Timekeeping has always been one of mankind's greatest challenges, less vital than the need for food and shelter but more fundamental than most other quotidian concerns. Without an understanding of time, we are lost, wanderers in a murk of experience where yesterday and tomorrow are seen only in broad strokes, banded by sunrise and sunset. For the vast majority of human history, those strokes had sparse embellishment, supplied by the occasional sundial and by a human eye far more attuned to the subtle nuances of sunlight than most people possess today.

The sun was always the fulcrum, nature's universal timepiece, but it has been up to mankind to divvy up the day in some logical way. There has never been an obvious answer: the Chinese used twelve subunits to our twenty-four; the Hindus chose sixty. One maverick technology geek recently floated a proposal for a new, much more orderly system based on decimals--a system of Dime rather than Time, containing ten Dours, each with 100 Dinutes divided into 100 Deconds.

The first mechanical clocks didn't come along until sometime in the fourteenth century (the actual moment and inventor are lost to history), and for many centuries afterward time remained approximate, and entirely local. Each town had its central square and clock tower, and lived in a world of time unto itself. Nineteenth-century technology changed that system forever, not through timekeeping innovations but through the mechanics of movement and communication--that is, trains and telegraph wires. How could you run a train line with anything like on-time arrivals and departures if each town kept its own time? The telegraph provided the answer: an instantaneous way of coordinating time from town to town, country to country, even continent to continent. In 1884, President Chester Arthur invited the world to an International Meridian Conference in Washington, D.C., at which representatives from twenty-five countries wrangled and discussed and finally defined the outlines of global time as we know it today: twenty-four time zones, each adhering to a "mean [or average] solar time," with the Prime Meridian, or zero hour, located at the Observatory at Greenwich--the origins of Greenwich Mean Time.

The new system of mean solar time, however, invented as a practical solution to a practical problem, ended up subtly--but drastically--changing our conception of time. "Noon," as defined by clocks, telegraph wires, train schedules, and entire countries, became something mechanical and abstract, an agreed-upon average rather than a natural phenomenon. It could no longer be determined by looking at the sky, because the sky at "noon" at one edge of a time zone looked different from the sky at "noon" at the other edge of the same zone. Time has been diverging further and further from what could be called solar truth ever since.

Therein lies the cosmic joke--the joke that has led to the little leap-second scene played out at New Year's (a scene that has occurred twenty-two times before), and the much bigger battle over whether that scene should ever take place again. In the past century, and particularly in the past fifty years, extraordinary technological advances have made timekeeping so precise that it has become, in essence, more precise than time itself. That sounds paradoxical only because we tend to think of "time" as one entity, when in fact the invention of the first atomic clock in 1955 ended up teaching us otherwise: that we live with two different kinds of time (or "flavors of time," in one astronomer's description). Both flavors are, in a sense, true, but one is more perfect--or at least more regular--than the other.

The first flavor of time, the only one we had tasted for all of human history until the mid-twentieth century, is earth time: defined by the rotation of the earth on its axis (twenty-four hours) and by its orbit around the sun (365 and one quarter days). We broke the day down into hours, minutes, and seconds, and defined each of those as a fraction of the twenty-four-hour rotation. That system worked very well for a very long time. The second flavor, atomic time, appeared at first to be simply a more exact method of counting earth time. Physicists discovered that the electrons within a cesium atom oscillate at an astonishingly unvarying rate, regular to within nanoseconds, never changing, never slowing or speeding up, more accurate by far than any clock ever invented. By 1967 atomic clocks had prompted an official redefinition of the second: it was declared, at the thirteenth General Conference on Weights and Measures, to be not 1/86,400th of a day but 9,192,631,770 oscillations of a cesium atom.

Atomic time seemed a perfect wedding of nature and science, a harnessing of the unimaginable precision of the universe. The honeymoon, however, was short-lived. Atomic time is indeed close to perfection, but earth time, though equally "natural," is not. This rock careening through space, spinning on its axis as a result of a force physicists call angular momentum, is a flawed clock when compared with an oscillating atom, because the rate of the planet's spin is not steady. Not only does the earth wobble on its axis, affected by everything from space dust to tides to weather at the core of the planet, but over eons the earth's rotation has been gradually slowing down. As a result, the period from sunrise to sunrise has become incrementally longer; scientists estimate that the length of a day in the Devonian era, about 400 million years ago, was about twenty-two hours. Periodically, therefore, earth or solar time falls behind the implacable march of atomic time, which takes no notice of the rest of the universe.

Before perfect clocks, the unpredictability of the earth's movements was not a problem; a variation of a second here or there in the length of a day over years or decades could not even be tracked. But just in the past half-century, time and space have become linked in ways beyond perhaps even Einstein's imagination. The Global Positioning System (GPS), which is based on the travel time of electromagnetic signals, factors time into distance to send missiles to precise targets; space telescopes are programmed to a billionth of a second; even airplanes navigate by GPS rather than radar. Within these systems, one second has come to contain an eternity. If one second is off, maritime and flight navigation could be thrown into disarray, satellites could lose data, the space shuttle could come down in the wrong place. Within the tiniest fragment of a second, a bomb could end up in the wrong country: in GPS, an accuracy of 10 nanoseconds (ten one-billionths of a second) corresponds to a positional accuracy of about 10 feet.
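That final figure is simple arithmetic on the speed of light: GPS derives position from signal travel times, so a timing error translates directly into a ranging error. A back-of-the-envelope check of the 10-nanosecond claim (a sketch, not how GPS receivers actually compute position):

```python
# A timing error maps to a distance error via the speed of light.
C = 299_792_458            # speed of light in a vacuum, meters per second
timing_error = 10e-9       # a 10-nanosecond error
error_m = C * timing_error           # ranging error in meters
error_ft = error_m / 0.3048          # the same error in feet
print(f"{error_m:.2f} m = {error_ft:.1f} ft")   # roughly 3 meters, i.e. about 10 feet
```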

In this tightly balanced world of nanoseconds, there is no room for competing flavors of time. Two decades after the invention of the atomic clock, a solution was hailed: the leap second. Whenever solar time is about to fall nine-tenths of a second behind the atomic count, global timekeepers force all the atomic clocks in the world to count an extra second, as they did last New Year's Eve, giving the earth a second to catch up. Space and time remain tethered together, the stars realign, telescopes still tell the truth about the heavens, missile systems still know where to send their bombs--all because of that fragile, almost imperceptible, extra tick. So why is it that Dennis McCarthy, and many other scientists and time managers in the United States and around the world, are now trying to do away with the leap second forever--or at least for the next 600 years or so?

"If you want to know why, you have to be one of them," says Steve Allen portentously. "Or you have to have a spy." Allen is in many ways McCarthy's opposite number--or, perhaps more accurately, the leader of the opposing army. McCarthy was one of the first to publicly propose banishing the leap second in 1999, and Allen began rallying the troops to save the leap second four years later. More fundamentally, they represent a struggle that many on both sides describe as "the astronomers versus the physicists" (and that one astronomer characterizes as "natural time versus technology time"). Steve Allen, an astronomer at the University of California's Lick Observatory, in Santa Cruz, California, is matter-of-fact about who is currently winning. "The major change over the last fifty years," he says, "is that the authority over time has shifted from astronomers to physicists."

When our only flavor of time came from understanding the sun, stars, and planets, astronomers ruled the clock. Now not only do the physicists run the clocks that officially define the second but they are pressing their advantage: if they succeed in doing away with the leap second, they will own time itself. "There's a reason they have been given the appellation 'Time Lords,'" Allen intones. "They don't talk to people outside. They just decide what is right and they tell us." It seems fitting, in a milieu defined by perennial nerdiness, that the term "Time Lords" apparently derives from the 1960s cult TV show Doctor Who.

Allen is a tall, almost cadaverous-looking man with muttonchop sideburns and oversize, 1970s-style aviator glasses. He is telling me these things in a low voice as we sit having coffee (or, in Allen's case, a large glass of whole milk) early in the morning, in a hotel lobby behind enemy lines. We are in Boulder, Colorado, a little less than two miles from the National Institute of Standards and Technology, which is the western counterpart to the USNO. NIST, like the Naval Observatory, is home to dozens of atomic clocks, and is the other official source of time for the nation. NIST also, like the USNO, harbors its share of leap-second foes.

As perhaps befits a guerrilla fighter in a covert war, Allen is entranced by secrecy and conspiracy theories. His conversation is laced with phrases like "a network of plausible deniability" and "I can't tell you how I know that." He freely acknowledges his "inner geek"--his Leap Second website is an astonishing collection of minutiae about the issue--and its geek vocabulary: blogland, lurkers, flame-bait. What he calls his "evil analogy" of the core issue behind the leap-second debate refers to an imaginary war fought in a fictional fairyland: "In C. S. Lewis's Narnia," says Allen, "the evil queen casts one of her delusional spells and says, 'What do you mean, "the sun"? The true light is this lamp, but you've imagined this greater thing in the sky called the sun, that gives off a greater light. The sun is a fantasy.' It reminds me of how the clock people are saying, 'Never mind that sun thing in the sky, it's just an illusion. We have the cesium atomic clock here, and it tells you the time far better.'"

Allen's analogy is an exaggeration, but it does contain a kernel of truth. Even the time managers and physicists admit that if the leap-second system is dropped and our universal civil time becomes simple atomic time, it will spell the end of mean solar time as we know it. That is, the sun, along with the stars and our measurements of the earth's rotation, will become irrelevant to telling time. "Time" will become an abstraction, numbers on a display, unbound from the outside physical world. If time were left to march on, perfectly atomically regular, second by second, with no reference to the earth's movements, the result would inevitably be darkness at noon. Granted, that darkness would take between 3,000 and 4,000 years to occur, but it's an image that astronomers like to evoke when the leap second comes up for discussion.

The shock value of that image, with its biblical allusion to the hour of the Crucifixion, hints at the emotional power that time--its counting, its passing, its structuring of our world--holds over our inner lives. The opposing armies in the leap-second struggle may frame their arguments in practicalities like computer systems and air-traffic glitches, but what we decide to do about time also resonates on a more intuitive and philosophical level. Rob Seaman, an astronomer at the National Optical Astronomy Observatory in Tucson, and, with Allen, a key voice in defense of the leap second, argues that the sun's patterns are inherent to our instinctive sense of time.

"Our clocks have always been synchronized to the spinning earth," says Seaman, who, like Allen, was passing through Colorado last fall after attending an astronomy seminar in Aspen. "In fact, the first high-precision clocks and chronometers were invented precisely for that purpose--for navigation, so that the British Navy could figure out where they were on the surface of the earth. To calculate longitude, you need a good clock." The Englishman John Harrison's invention of the time-based method of determining longitude in the eighteenth century confirmed a principle that now orders our technological existence: that space and time not only are the two central means through which humans experience reality but are intimately connected.

Seaman confronts that reality every day, because the space telescopes he programs depend on precise time for accuracy--time, that is, as it relates to the earth and the stars. Exact time allows us to determine distance and position. When looking at the sky through a telescope, explains Seaman, one second of time corresponds to as many as fifteen "arc seconds," a measurement of distance in the sky. "That's a gargantuan distance by the telescopes we use today," says Seaman, "which operate on the level of milliseconds of arc." So if there is a one-second difference between the official (UTC) time an astronomer is using to program his telescope and actual solar time based on where the earth is in relation to the sun and stars, the object he's looking for might be entirely outside his field of vision.
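Seaman's conversion factor falls out of the earth's rotation rate. Treating the day as a mean solar day of 86,400 seconds (a simplification; the precise figure uses the slightly shorter sidereal day, and the rate shrinks toward the celestial poles):

```python
# The sky appears to turn a full circle once per day, so each second of
# clock time sweeps a fixed slice of sky past a stationary telescope.
ARCSEC_PER_TURN = 360 * 3600        # arc seconds in a full 360-degree rotation
SECONDS_PER_DAY = 86_400            # seconds in a mean solar day
sweep = ARCSEC_PER_TURN / SECONDS_PER_DAY
print(sweep)                        # 15.0 arc seconds per second of time
```

Against instruments that resolve milliseconds of arc, a 15-arcsecond error is, as Seaman says, gargantuan.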

It's not as if professional astronomers couldn't make the necessary corrections, Seaman adds, but abandoning leap seconds would be a complicated and expensive proposition. "It could easily cost hundreds of thousands of dollars per telescope to reprogram everything," he says. "And we'd have to do it quickly, because within five to ten years the discrepancies would be too large to ignore." On the other hand, he jokes, dropping leap seconds could offer some personal economic opportunity. "I think a lot of corporations would be looking for retreaded astronomers to help their software deal with it," he says. "It would be easy to sell that idea to anybody who depends on time--cell-phone companies, railways, shipping companies ..."

Seaman is a tall man with a deep, resonant voice and long, slightly graying hair pulled back into a ponytail. His style is deliberate and moderate, but his persona in print is considerably more heated. If Steve Allen is the conspiracy theorist and unofficial historian of leap seconds, Rob Seaman is the resident hothead of the Leap Seconds listserv established by the USNO. For six years the listserv has functioned as a time-geek chat room populated by a few voluble astronomers, software engineers, physicists, and an invisible, uncounted legion of lurkers. Seaman freely admits that his self-selected role on the listserv is as a needler. Few arguments put forth by the anti-leap-second crowd escape his rebuttal, and occasionally his ridicule. Seaman's conversational and passionate listserv emails often bristle with provocative words and phrases: "intellectually dishonest," "assassination of UTC," "dumb," "naive," "ill-conceived political machinations," "if you don't give a rat's ass," "we can incessantly debate trivialities," "political shenanigans," "balderdash," "craven," "inane," "daft," "bollocks." What lies behind his acerbic and challenging style is a fervent belief in the sanctity of mean solar time.

The intensity of the listserv itself has been kicked up a notch recently by a fresh challenge to solar time: a formal proposal that a U.S. delegation introduced in September 2004 to the International Telecommunication Union (ITU). The U.S. group (called the U.S. Working Party-7A, or USWP-7A) suggested that leap seconds be dropped and replaced with a leap hour when the differential between atomic and solar times grows to that length--effectively putting off the problem for 600 years or so and allowing the two time systems to slowly diverge in the meantime.

One prominent European astronomer blasted the U.S. proposal as not only a "dirty trick" but a "disaster for classical astronomy." A computer-security expert at Cambridge University used remarkably similar language in calling it a "political trick" and a "recipe for catastrophes." Esoteric and sometimes outlandish balloons were floated: If we're not going to worry about civil time matching solar time, for instance, why not simplify global timekeeping even more by reducing the number of global time zones from twenty-four to five? Or, another speculation: How will we decide, when the need arises, what time it is on Mars or Jupiter? The latter question was prompted by a listserv discussion of Mars Local Solar Time and "Martian jet lag," an actual interplanetary phenomenon. When technicians on Earth operate Mars Rovers, they must live on Mars time, just as the two Rovers do--and the Rovers are situated on opposite sides of the planet. So space engineers might operate one Rover during the Mars day (which might be the middle of the night on Earth), and then have to operate the other Rover when the sun reaches it hours later. Martian jet lag, and sleep deprivation, ensue.

Among all the listserv chatter, however, certain names do not appear--the names of those intimately involved in working for the demise of the leap second. Those include Judah Levine, a physicist and éminence grise of atomic-clock technology, and Wayne Hanson, chairman of the U.S. Working Party-7A, which proposed the leap-hour idea. Both are headquartered at NIST in Boulder, which is why, after meeting with rebel leaders Allen and Seaman, I head into the foothills of the Rockies' Front Range, to the lair of the country's other Master Clock.

The National Institute of Standards and Technology is a near-perfect example of Eisenhower-era, institutional-modern construction: a low-to-the-ground concrete-and-glass box sporting the occasional border of stonework as a grudging concession to style. Judah Levine meets me in the lobby and leads me back to the chambers that house NIST's dozens of atomic clocks, which turn out to look a little more like refrigerators than filing cabinets.

Levine became a key figure in the debate in 2001, when he coauthored an often-referenced article for the journal Metrologia titled "The Leap Second: Its History and Possible Future" (another contributor to the article was Dennis McCarthy). It is an exhaustive primer on leap seconds. And although its tone and premise seem straightforward, by the end of the article the point of view becomes clear: at each stage in the evolution of timekeeping we have taken another step away from the sun as a measure of time, toward "a more uniform, accessible, or convenient standard." The next step might be simply moving to atomic time, "a disassociation of civil time from solar time altogether.... [W]e should perhaps not be too hesitant in adapting to modern technology and modern needs." In other words, face it: the sun's day is over.

Levine, like McCarthy, has been in the business of administering time long enough to have been present at the birth of the leap second. And although Levine is a physicist, right down to the plastic pocket protector in his short-sleeved shirt, and McCarthy trained as an astronomer, they have landed on the same side. To wit: "We've clanked along, and we've built this system," says Levine, "and it's a hassle." Since 1972, he points out, the world of technology that depends on precise time has changed drastically. GPS was inaugurated in 1978; now the Russians have their own satellite system (GLONASS), and Europe recently launched a trial of its own system (Galileo). If the three aren't perfectly synchronized, the disasters that could follow would be physical, not theoretical. Also, since 1972 the world of the stock market has been transformed, in several ways. Computerization has made fractions of a second crucial in stock values and trades, and the Asian markets have become much bigger players. Leap seconds occur during working hours in, say, Japan--potentially troublesome when the leap occurs on a non-holiday.

The deeper problem that haunts Levine is what might be considered the underlying gap between the two ways of counting time, a gap that cannot be remedied by inserting leap seconds. He explains this dilemma with a real-world analogy: our actual physical age, counted by seconds and days lived, and our age counted by birthdays. Someone who is, say, twenty years old could be considered, on his birthday, to have been alive 7,300 days (365 times 20). But this self-evident fact is not actually true. Because of the every-four-year leap year, he has in real-life (what Levine calls "heartbeat") terms been alive 7,305 days (including the five leap days that have occurred since his birth). On his birthday he is twenty years old, but depending on how you count time--by a calendar or by the actual moments that have passed--he is either twenty, or he is twenty years and five days. It's a mathematical conundrum with no true answer that results from our ability (or our decision) to count time in two different ways.
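Levine's arithmetic is easy to verify against the calendar itself. The birth date below is hypothetical, chosen only so the subject turns exactly twenty at the start of 2006; the point is the gap between the "birthday" count of 365 × 20 days and the "heartbeat" count of days actually lived:

```python
from datetime import date

# "Birthday" count: twenty years at a nominal 365 days each.
naive_days = 365 * 20

# "Heartbeat" count: actual days elapsed, leap days included
# (1988, 1992, 1996, 2000, and 2004 each contribute one).
actual_days = (date(2006, 1, 1) - date(1986, 1, 1)).days

print(naive_days, actual_days, actual_days - naive_days)   # 7300 7305 5
```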

"Every four years you live a day that is essentially not counted," explains Levine, just as the leap second is not counted. We consider a day with a leap second to be a "day," period--even though it contains 86,401 seconds rather than 86,400. "There's a fundamental problem whenever you want to compare a dynamic process with a physical clock," he says. "And every system that has something dynamic going on has to deal with this problem of leap seconds. In 1972 that wasn't a big problem, but now we have more and more systems of that type."

The leap-second plan, says Levine, originally made sense as a compromise between two groups: astronomers, who needed civil time to match the heavens, and physicists, who needed a defining standard of time based on frequency--a steady, unchanging beat. Now, Levine and other time managers feel, that compromise needs to be reconsidered. "It's likely that any future compromise will leave someone unhappy," says Levine. "That's the nature of a compromise."

Neither Levine nor Wayne Hanson, who has been responsible for presenting to the rest of the world the American case for deep-sixing leap seconds, gets terribly specific about the practical problems or dangers of continuing them. Levine's term "hassle" is phrased by Hanson as "troublesome"; science, says Hanson, "is more comfortable with a uniform time scale." Hanson has an office at NIST on the floor above the atomic clocks, although he, like Dennis McCarthy, is officially retired from the Time Service and remains on as a consultant. Hanson has been the focus of some ire on the part of astronomers, since he not only wrote the U.S. proposal for discontinuing leap seconds but also prefers to keep a very low profile. The latter propensity has led to rampant suspicion and paranoia, even accusations that the Time Lords in the United States are trying to slip through the change before anyone has a chance to object.

Hanson placidly denies the charges, pointing out that the U.S. Working Party he chairs is open to "any U.S. citizen"--although all the participants in the most recent meeting, he concedes, were members of the U.S. government. As to why the astronomical world should be up in arms about the proposed change, he appears mystified. "Maybe it's nostalgia, tradition," he muses. "Leap seconds are really an artifact of the sextant era."

But to frame the issue purely in practicalities is to vastly underestimate the personal and proprietary significance of time. That the question of who "owns" time has recurred frequently among leap-second combatants hints at this visceral element--and, of course, once something can be owned, it also can be stolen. Long ago there was universal agreement about who owned time: God, or some supreme being, was both creator and administrator of time. When that notion first began to change hundreds of years ago--when mankind seized control of time in the most basic way--there was hell to pay.

The starkest example of that power shift, at least in the lives of English-speaking peoples, was in many ways a cartoonish prequel to the leap-second battle. It certainly was a more clamorous affair, played out on a public stage rather than in a physics lab. And although the leap second is a nano-adjustment, a small tug at the hem of time, the change in time that occurred two and a half centuries ago was practically a remaking of the whole garment, a matter of cutting out a section and stitching the remnants of days and hours back together. In 1752 time leapt forward not by one second but by a week and a half. And the populace was not pleased.

In that year, English and Colonial American subjects went to bed on Wednesday, September 2, and awoke the next morning on Thursday, September 14. The intervening eleven days, which would have been September 3-13, never came into being. To one way of thinking, this was simply a semantic change; "the next day" by any other name is still the next day, attach to it whatever arbitrary number you will. And yet this unprecedented deviation from the steady march of the calendar loomed large in the eighteenth-century mind. Time, after all, had never been imagined as something random or alterable--and certainly not as something subject to casual manipulation by humans. Time belonged to God, or at least to the mysterious and perfect order imposed by the cosmos. But that manipulation was in fact a necessity if time were to continue to make any sense at all.

The underlying conundrum of two and a half centuries ago continues to dog today's timekeepers: How do we make our human calculations accommodate the endless complexity and quirkiness of the earth's movements? In 1752 the central problem was posed not by the earth's twenty-four-hour rotation but by its 365-day orbit around the sun. And the difficulty lay not in the unsteadiness or unpredictability of that cycle but in the fact that it is extremely tricky to break that orbit down into a logical and accurate calendar. Consider: the length of a year is 365 days, 5 hours, 48 minutes, and 45 seconds--not an easily divisible number. Then consider that the lunar month--which consists of 29 days, 12 hours, 44 minutes, and 2.9 seconds--does not match up neatly to the solar year. If you were to put twelve lunar months together, you'd end up with a 354-day year--not a very useful concept when the earth takes eleven days longer than that to complete an orbit of the sun. (If you've ever wondered why the calendar months differ in length, or why we sometimes have a "blue moon" year that contains thirteen full moons, there's your answer.)
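For the numerically inclined, the mismatch described above can be checked in a few lines of Python, using exactly the figures cited in the paragraph:

```python
# Verify the calendar arithmetic: twelve lunar months fall roughly
# eleven days short of one solar year.
SOLAR_YEAR_DAYS = 365 + 5/24 + 48/(24*60) + 45/(24*3600)    # 365 d 5 h 48 m 45 s
LUNAR_MONTH_DAYS = 29 + 12/24 + 44/(24*60) + 2.9/(24*3600)  # 29 d 12 h 44 m 2.9 s

twelve_months = 12 * LUNAR_MONTH_DAYS
shortfall = SOLAR_YEAR_DAYS - twelve_months

print(f"Twelve lunar months: {twelve_months:.2f} days")   # about 354.37
print(f"Shortfall vs. the solar year: {shortfall:.2f} days")  # about 10.88
```

Twelve lunar months come to about 354.37 days, leaving the roughly eleven-day gap that lunar and solar calendars have struggled over ever since.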

Humankind came up with a reasonably workable way to cope with these inconsistencies a surprisingly long time ago, under the aegis of the Roman Emperor Julius Caesar. The Julian Calendar served most of the world for more than 1,600 years, from 45 B.C. until 1582. It had only one small glitch: its calculation of a year was longer than the solar year--the time it takes the earth to orbit the sun--by about eleven minutes.

Eleven minutes may not sound like much when you're waiting for a table at your favorite restaurant, but in the course of centuries, eleven minutes and change become a formidable chunk of time. By the 1300s, those superfluous minutes had added up to hours, then days, then more than a week. The calendar was losing time, irrevocably, to the "real" year, slipping further and further behind in its measurement of the earth's orbit. Anomalies began to creep into what had been the certainties of life. The spring equinox--one of two moments in the year when day and night are of equal length all over the earth, and which occurs on or about March 21--began to fall on March 16, then 15, then 14. Humans had always tethered their calendar, and their understanding of time itself, to the physical world--to the seasons dictated by the earth's orbit. Now Easter, the symbol of rebirth, might drift from spring to midwinter; Christmas, originally based on a pagan celebration of passing the longest night of the year, would come in the fall.
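A back-of-envelope calculation shows how those minutes became days. (Gregory's commission measured the error not from Caesar's reform but from the Council of Nicaea in A.D. 325, when the equinox fell on March 21--which is why the correction came to ten days rather than twelve.)

```python
# Accumulated Julian drift, using the year length cited earlier.
JULIAN_YEAR = 365.25                                  # days
SOLAR_YEAR = 365 + 5/24 + 48/(24*60) + 45/(24*3600)   # days

excess_minutes = (JULIAN_YEAR - SOLAR_YEAR) * 24 * 60  # gained per year
drift_days = (1582 - 325) * (JULIAN_YEAR - SOLAR_YEAR)

print(f"Julian excess: {excess_minutes:.2f} minutes per year")   # about 11.25
print(f"Drift from 325 to 1582: {drift_days:.1f} days")          # about 9.8
```

Roughly eleven and a quarter minutes a year, compounded over twelve and a half centuries, yields almost exactly the ten days Gregory excised.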

It took two centuries of wrangling among European mathematicians and astronomers to find a solution. And it took the unchallenged power of a world leader, Pope Gregory XIII, to make it legal. The pope appointed an official calendar commission to revise the Julian system and presented the result to the world in 1582. His Gregorian Calendar, which we use to this day, was not a major rewriting of the rules but rather a subtle tweaking of the Julian system. The key difference was the strategic elimination of three leap days every four hundred years: century years would be leap years only if divisible by 400.

Correcting the system for the future was only step one, however. Step two was much less subtle: The new calendar couldn't work until the world made up for all those centuries of drifting minutes and days in one ruthless stroke, by making ten days disappear forever. In many, mainly Catholic, parts of the world, the days October 5-14, 1582, never existed. By the end of the seventeenth century, almost all of the Western world had embraced the Gregorian system.

In England the response was less than enthusiastic. In the throes of a brutal struggle between Catholics and the newly formed Protestant Anglican Church, the English government rejected the pope's authority. So began a strange, off-kilter period of European life--quite possibly the only moment in history when humans could experience something akin to time travel. In those years a traveler who went from England to France immediately leapt forward by ten days, and then fell ten days back upon his return.

Finally, that considerable inconvenience--not to mention the embarrassment of being (along with Sweden and Russia) the last Western country backward enough to still live by the "Old Stile"--forced England into step with the rest of the world. This time the instigator was not an earnest scientist or mathematician or even a pious pope but in every way the opposite: an aristocratic fop, bon vivant, and self-confessed dilettante named Philip Dormer Stanhope, fourth earl of Chesterfield. Chesterfield was neither a radical nor an ideologue; his incentive for pushing through legislation to correct England's calendar, which by then was off by eleven days rather than ten, appears instead to have been largely social and practical. The project also gave him, for a time, something amusing to pursue when he wasn't gaming at his club in St. James's or tossing off witticisms at fashionable salons.

To his mistress in France he wrote that the confusion of sending their titillating correspondence across the eleven-day dateline convinced him to promote the reform. To his beloved illegitimate son, living as a courtier-in-training in Paris, he wrote, "I have of late been a sort of an astronome malgre moi, by bringing last Monday, into the House of Lords, a bill for reforming our present calendar.... Upon which occasion I was obliged to talk some astronomical jargon, of which I did not understand one word, but got it by heart, and spoke it by rote from a master."

There is no transcript of the speech Chesterfield gave to Parliament on February 25, 1751, when he introduced his Calendar Bill for consideration, but it was apparently persuasive. Chesterfield's close friend Lord Macclesfield, a leading astronomer and later president of the Royal Society, gave a seconding speech on March 18, and the deal was closed with little argument. The key to his success, Chesterfield recounted in another letter to his son, was not any particular brilliance of mind but simply a gloss of style and entertainment. "Between you and me, Lord Macclesfield's speech was, in truth, worth a thousand of mine," Chesterfield wrote after the bill passed. But although Macclesfield added the necessary expertise and gravitas to the discussion, Chesterfield, in all his charming and enthusiastic ignorance, was ultimately given credit for the bill's acceptance.

The ease of that political triumph, however, showed itself by the following year to have been deceptive. Although Parliament painstakingly made provisions for all kinds of legal transactions that might be confused by the missing days, and the Church of England contributed a catchy slogan worthy of twenty-first-century politicking ("The New Style the True Style"), the people were having none of it. According to many accounts in the decades that followed, the 1752 calendar change provoked widespread mayhem in Britain. Not only did many countrymen simply refuse to accept the new system; the so-called Calendar Riots erupted in various cities, resulting in several deaths in Bristol. Crowds of angry protesters gathered in towns across the country to chant slogans like, "In seventeen hundred and fifty-three,/The style it was changed to Popery," and "Give us back our eleven days!" The latter slogan appears, scrawled on a poster, in a 1754 political cartoon by William Hogarth. The more violent tales of calendar unrest, however, have provoked some scholarly skepticism in the last few decades, and the written record from the time is sketchy enough that we will probably never know the exact level of hysteria prompted by the eleven days' incident. Reluctance and confusion abounded, though, especially among country people. Many villages continued for years after 1752 to observe the old calendar for public events like Christmas, New Year's, and seasonal market days. One contemporary observer wrote that although there was pragmatic opposition to the change among businesspeople concerned about rents, leases, and debts, "greater difficulty was ... found in appeasing the clamour of the people against the supposed profaneness, of changing the saints' days in the Calendar, and altering the time of all the immoveable feasts."

In the American English Colonies, the reaction to the Calendar Act was muted. But some historians have since pointed out an oddity, a little-known bit of Colonial trivia resulting from the calendar change: the date officially presented as George Washington's birthday is, in physical terms, wrong. Every source cites the day as February 22, 1732, but in fact Washington was born on February 11 and celebrated his birthday on that day throughout his life. When the calendar reform was imposed in September 1752, Washington was twenty years old, awaiting his coming of age the next February, and the act made it very clear what was to be done about the legal aspect of birthdays: "No Person or Persons whatsoever shall be deemed or taken to have attained the said Age of one and twenty Years ... until the full Number of Years and Days shall be elapsed on which such Person or Persons respectively would have attained such Age ... in case this Act had not been made."

In other words, George had to count eleven days past the day on which he had, in reality, come into the world before he legally attained his majority. From then on, he was officially considered to have become twenty-one years old on February 22, and twenty-two years old on the following February 22, and so on. The banished eleven days did not disappear, after all--they existed, they were lived through, perhaps impatiently, by young George and all his countrymen. But their "disappearance" nonetheless altered facts--like dates of birth--that had theretofore been thought of as unalterable.

The paradoxical fact of Washington's two birthdays brings to mind Judah Levine's leap-year-and-true-age analogy. In both scenarios we are confronted with two different systems of time, both of which are legitimate methods of determining the passage of days and years. One is physical, "heartbeat" time, as it marches steadily on; the other is clock-and-calendar time, representing the order we humans have imposed on that heedless march. Consequently, it's not surprising that the leap-second tussle has resurrected the eleven days' incident from the backwaters of history, though its most germane lessons seem to be not calendrical but political.

"Who will be the equivalent of Pope Gregory XIII at about 2600?" asks computer-software expert Marcus Kuhn of the University of Cambridge in a leap-second listserv entry, referring to the approximate era in which all the missed leap seconds will have added up to one hour. "And where would this person get the authority from to break thoroughly what was meant to be an interrupt-free computer time scale? Even the, at-the-time, almighty Catholic Church wasn't able to implement the Gregorian transition smoothly by simply decreeing it. Do we rely on some dictator vastly more powerful than a sixteenth-century pope to be around near the years 2600, 3100, 3500, 3800, 4100, 4300, etc., to get the then necessary UTC leap hour implemented?" Many in both armies have suggested that, whichever way the leap-second decision finally goes, the U.S. leap-hour proposal is a dud, a barely disguised means of pushing the problem onto our descendants, and into a future that is distant enough to be completely unpredictable.

If time first became a global affair in 1582, it is now immeasurably more so--which means that time is, of necessity, an affair of politics and diplomacy as well as of science. For instance, the international consortium that governs time, the ITU, is part of the United Nations; and the ITU committee run by Wayne Hanson, the USWP-7A, which proposed doing away with the leap second, operates under the purview of the U.S. State Department.

And just as in both 1582 and 1752, national pride and competitiveness have become major players. The Naval Observatory's managers consistently describe its role in timekeeping as "the largest single contributor to the international time scale." That is due, Dennis McCarthy explains, to the sheer number of USNO atomic clocks, each of which submits data to the international bureau that distributes the official global time. If the U.S. proposal for a redefinition of time were to be accepted by the rest of the world, that would cement USNO's status as top dog of time.

The U.S. victory would in turn officially and finally spell the end of the last vestige of empire for the United Kingdom, which still retains its beloved GMT as the official time standard even as most of the world has anointed UTC. Until now the distinction between the two has been merely semantic, because the two standards have remained linked by the leap second. But GMT is by definition based on earth time. If the leap second disappears and official time diverges from earth time, GMT will be a relic, an historical curiosity--like the Greenwich Observatory itself, which is now a privately owned museum rather than a working national scientific center. Not surprisingly, then, the U.K. is reprising the role it last played in 1582, as the chief foot-dragger in the campaign to change timekeeping. The English have emphasized the legal hassle involved in redefining the time standard from GMT to UTC, but it's impossible to ignore the psychological implications: England has had its name on time ever since there was a global time standard in existence.

In the end, the Empire ruled again--at least for a brief moment. The world's Time Lords (a.k.a. the ITU) convened in Geneva last November to consider the U.S. leap-hour proposal. After four days of discussion, the Time Lords spoke: "More time is required to build consensus." That simple statement guarantees a significant stall in the momentum to drop leap seconds. For while the nanoseconds rush by in our closely clocked world, the bureaucracy of time creeps at a glacial speed.

That leaden pace frustrates Dennis McCarthy. "Making no decision either for or against leaves the future in doubt as far as people who are crafting computer code," he says, "code that will be put into satellites and be around twenty years from now." One theory floated by the save-the-leap-second crowd, in fact, holds that the rush to dump leaps is tied to the launching of Galileo, the European version of GPS, and the question of what time system Galileo will use. GPS uses its own time standard, which is currently UTC plus fourteen seconds. The satellites don't count leap seconds, but because the system was launched after nine leap seconds had already been added to UTC, GPS time lingers in a purgatory between pure atomic time and UTC. Every time a leap second is added, the GPS program adds one second to the offset between GPS time and civil time.
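The GPS bookkeeping described above is simple to model. The dates below are the actual leap seconds inserted between the GPS epoch (January 6, 1980) and the end of 2005; the function itself is a sketch of the arithmetic, not of how GPS receivers really work:

```python
# GPS time ticks steadily and ignores leap seconds, so its offset from
# civil time (UTC) grows by one second at every leap.
LEAPS_AFTER_GPS_EPOCH = [
    "1981-06-30", "1982-06-30", "1983-06-30", "1985-06-30",
    "1987-12-31", "1989-12-31", "1990-12-31", "1992-06-30",
    "1993-06-30", "1994-06-30", "1995-12-31", "1997-06-30",
    "1998-12-31", "2005-12-31",
]

def gps_minus_utc(date: str) -> int:
    """Whole seconds by which GPS time leads UTC on a given date."""
    return sum(1 for leap in LEAPS_AFTER_GPS_EPOCH if leap < date)

print(gps_minus_utc("1980-01-06"))  # 0 at the GPS epoch
print(gps_minus_utc("2006-01-01"))  # 14, the offset cited in the article
```

Fourteen leaps after the 1980 epoch, GPS time stood fourteen seconds ahead of UTC--exactly the purgatory the article describes.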

That's exactly the scenario that haunts McCarthy if leap seconds continue: a proliferation of private time scales, instituted by various entities that don't want to cope with leaps--communications systems, even large companies--leading to serious confusion. A few leap seconds ago, rumor has it ("We'll never get to the bottom of this one," says McCarthy), the Russians just turned off their GLONASS system rather than try to program the leap, and then had trouble rebooting. "The story I got was that it was off for twenty hours," says McCarthy. When your navigation and communications depend on precise time, crashing a satellite system is not a good thing.

McCarthy's other bugaboo is not disputed by anyone on either side of the question: as time goes on we will need more and more leap seconds, more frequently. This is not because the slowing of the earth's spin will accelerate (it won't) but because, similar to the way interest compounds, we will get further and further from the starting offset (the difference between the length of the atomic second and the length of an earth second), and the difference will build on itself. Here's the really odd part--one indication, perhaps, of how befuddling this atomic-clock business was right from the beginning: when the atomic second was first defined, it was decided that its length should match the existing earth second--as it had been measured in 1820. Why 1820? Reasonable question.

It seemed least disruptive to match the current value of a second, explains Judah Levine, which dated back to astronomical data from 1900. "But there was immediate trouble," he says, "because the length of the 1900 second was actually based on length-of-day data gathered in 1820." And since the day was a tiny bit shorter in 1820 than it was in 1967, "by the time the second was defined in atomic time, it was already outdated--it was off by about ten seconds." The gap has been growing ever since.
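A toy model shows how an imperceptible daily excess piles up into whole seconds. The two-millisecond figure below is an illustrative round number of my choosing, not an official value (it roughly matches the observed cadence of one leap second every year or so):

```python
# If each earth day runs a couple of milliseconds longer than 86,400
# atomic seconds, the daily excess accumulates into whole seconds.
def drift_after(years: float, excess_ms_per_day: float = 2.0) -> float:
    """Seconds by which earth time falls behind atomic time, assuming a
    constant average daily excess over 86,400 atomic seconds."""
    return years * 365.25 * excess_ms_per_day / 1000

for years in (1, 5, 10):
    print(f"After {years:2d} year(s): {drift_after(years):.1f} s behind atomic time")
```

Two milliseconds a day is invisible on any wristwatch, yet it demands a leap second roughly every eighteen months--and since the real excess wanders and slowly grows, the leaps only come faster.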

But while physicists see the prospect of more and more leap seconds as a hassle to the power of ten, some astronomers see that very eventuality as something of a solution. Call it the practice-makes-perfect theory. "If every month you might have a leap second," says Rob Seaman, "which is actually what the current standard allows for, you could build software that takes that into account and basically has a Yes and No toggle for it."
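One way to read Seaman's "Yes and No toggle" is software that treats a possible leap at the end of every month as routine, driven by a table announced in advance. The sketch below is hypothetical (the flag table and function are mine); only the December 2005 leap is real:

```python
# A table-driven leap-second flag, one entry per month, as Seaman's
# remark suggests the current standard already permits.
LEAP_FLAG = {            # month -> does it end with a 61-second minute?
    "2005-12": True,     # the New Year's Eve leap second
    "2006-01": False,
    "2006-02": False,
}

def seconds_in_final_minute(month: str) -> int:
    """Length of the last UTC minute of the month: 61 if a leap is scheduled."""
    return 61 if LEAP_FLAG.get(month, False) else 60

print(seconds_in_final_minute("2005-12"))  # 61
print(seconds_in_final_minute("2006-01"))  # 60
```

The point of the design is that a leap stops being a once-in-seven-years surprise and becomes an ordinary, regularly exercised code path.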

Last December's leap second tested the converse of that idea, coming as it did after the longest break in the leap-second program. Not only were time managers instructed by the ITU to closely observe any hitches in the proceedings but, as Dennis McCarthy said before the event, "There are probably some people out there who have forgotten totally how to deal with it--maybe some equipment hasn't been tested through a leap second--and some people who remember how to do it have retired or taken new jobs." One could imagine the potential for anxiety in 2600 if time had to leap by an hour--that is, if one could imagine life in 2600 at all.

The fact that no planes fell from the sky and no major systems crashed last December 31 may slow even further the bureaucratic shuffle toward eliminating leap seconds. Even McCarthy admits that he doesn't see an overwhelming consensus around the world for a change, and given the global nature of timekeeping, he says, "It's never good in international conventions to have an agreement pass by a two-vote plurality or something." That raises the possibility of a reluctant participant (think England, 1582) simply refusing to go along with the rest of the world.

After the New Year's leap second, the USNO listserv teemed with dispatches and tirades. For a brief time, what Steve Allen calls the "dysfunctional family of astronomers, navigators, radio scientists, physicists, and software engineers" came closer together. News clips were swapped, including a satire by Steve Martin, written in the voice of Bill O'Reilly, claiming hysterically that "a leap second is a denial of everything American, of everything good, of everything moral." The particularities of "hyperfine wibbles of caesium-133" and "the DUT1 correction in binary-coded decimal form" were discussed with zeal. Amid the white noise of jargon and insult, however, Rob Seaman's voice--often the most fiery but also the most lyrical--occasionally brought the squabble back down, literally, to earth. He wrote of "gracefully accommodating the charming quirks of Earth and Moon," of "blaming poor Mother Earth for her middle-aged unsteadiness," of "seeking a grand vision of the shared meaning of time in human concerns." And his readers were reminded, however briefly, of what is really going on here: that beyond the aggressive perfection of machinery there is an opposing and gorgeous imperfection that somehow manages also to be true.

Michelle Stacey's most recent book, The Fasting Girl: A True Victorian Medical Mystery, was published by Tarcher/Putnam.
COPYRIGHT 2006 Harper's Magazine Foundation

Article Details
Title Annotation:precise standard time
Author:Stacey, Michelle
Publication:Harper's Magazine
Geographic Code:1USA
Date:Dec 1, 2006