The Centuries-Long Quest to Measure One Second
There is an elegant, simple, and entirely accurate way to define the second. A second is 1/86,400 of the time that it takes the Earth to rotate once on its axis. With 24 hours in a day, 60 minutes per hour, and 60 seconds per minute, there are 86,400 seconds in a day. There has never been a more accurate definition for the second, and there never will be.
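That arithmetic is the whole definition, and it is trivial to verify (a minimal sketch; the variable names are mine):

```python
# The classical definition: a second is 1/86,400 of one mean solar day.
hours_per_day = 24
minutes_per_hour = 60
seconds_per_minute = 60

seconds_per_day = hours_per_day * minutes_per_hour * seconds_per_minute
print(seconds_per_day)  # 86400
```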
If only the real world were so simple.
That clean, mathematical definition has a major problem: The length of a day changes ever so slightly. It can vary year to year based on a number of factors, from the amount of snowfall at the poles to space weather particles hitting our planet. In addition to the random variations, the rotation of the Earth is gradually slowing due to tidal forces from the moon, lengthening our days. The International Earth Rotation and Reference Systems Service (IERS) corrects for this by deciding when a year will have a leap second. We had one last year.
Over the past few decades, scientists have been trying out ever-more-elaborate ways to define the second as accurately and consistently as possible. The people who do this job for the United States work at the National Institute of Standards and Technology, or NIST, out in Boulder, Colorado. I went there to see how they keep the world running on time.
A BRIEF HISTORY OF A BRIEF AMOUNT OF TIME
The Persian scholar Al-Biruni first used the term “second” around 1000. He defined it—as well as the day, hour, and minute—as fractions according to the lunar cycle. The first mechanical clocks to mark the second appeared in the 1500s, and in 1644 French mathematician Marin Mersenne used a pendulum to define the second for the first time, leading to the international adoption of grandfather clocks by the end of the 17th century. In the 19th century, scientific institutions worked to define the second in astronomical terms, and in the 1940s an international agreement defined the second as 1⁄86,400 of a mean solar day.
It was in the 1950s, however, that researchers recognized the Earth’s rotation is not consistent enough to provide a standard unit of time. Instead, the second was redefined according to the length of a year, officially becoming the fraction 1⁄31,556,925.9747 of the tropical year 1900. The definition would not endure.
Around the same time, the first accurate atomic clocks were being developed that used cesium. Finally, here was a natural phenomenon precise and consistent enough to define a second. In 1967, the 13th General Conference on Weights and Measures officially defined the second as “the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom.” And that has remained the official definition ever since.
With the new official definition of a second in place, the problem became one of engineering, chemistry, and physics, as scientists around the world collaborated to build the most consistent clocks ever conceived. In the United States, NIST measures the second, and its measurement is averaged with those of other institutions around the world to create Coordinated Universal Time, or UTC.
NIST uses two primary clocks to measure the second, called the F1 and the F2. These devices are not really clocks at all in the traditional sense, but rather particle physics experiments that are designed to find the specific frequency of electromagnetic radiation that causes a cesium atom’s outer electron to transition. That frequency is exactly 9,192,631,770 hertz, and it falls into the microwave range of the electromagnetic spectrum.
“It’s about four times the frequency your microwave oven runs at,” says NIST physicist Steve Jefferts, lead designer of the F1 and F2, and the clock guru who makes the calibrations to define time in America.
The only major difference between the clocks is that F2 is newer and its chamber is encased in a liquid nitrogen container to cool the system, which decreases background interference and makes the clock more accurate. Interestingly enough, NIST had already created a model to correct for interference in the F1 clock, and after building F2, they were able to confirm that the calculations they used were correct, effectively making F1 more accurate.
When I visited the NIST labs in Boulder, teams had just finished disassembling and moving the F1 cesium fountain clock to a new building. F2 will be moved next, once they get F1 back up and running. I found Jefferts tinkering with F1, “in the middle of a huge tune-up.”
He had taken the entire fountain chamber, which weighs around 600 pounds, off the optics table and disassembled it to replace internal components. When he put it back on, all the precisely aligned mirrors and glass instruments bolted to the optics table needed to be readjusted, which he had been working on for the past three days. Wrench sets, pliers, wire cutters and optical instruments were scattered over the tables.
“Give me a minute to safe up the room,” Jefferts said after he popped out from behind a blacked-out doorway to the lab. “There are lasers on in here, dangerous ones.”
Large red lamps line the hallways of NIST outside each lab door, and when they are on, it means that lasers are on inside the rooms and eye protection is required. Almost all of the red lamps were switched on when I visited the facility. Horologists, those who study and measure time, were busy tinkering with atomic clocks and taking measurements.
A CLOCK MADE OF MICROWAVE LASERS
Ask Jefferts to explain how the clock works, and it’s not long before the general theory of relativity and quantum mechanics become part of the conversation.
The F1 and F2 clocks don’t count seconds, he says. They measure the output of a maser, a microwave laser, to find a signal that is exactly 9,192,631,770 Hz. At roughly 9.2 billion cycles per second, the frequency of that maser can then be used to measure the second according to its official definition. The problem is that even though you can get exactly 9,192,631,770 Hz dialed in, the signal will “drift” over time, so the clocks need to be calibrated constantly.
The calibration is done with the alkali metal cesium. A cesium atom has one electron orbiting its nucleus in the highest energy level, all by itself. This lone outer electron is either “spin up” or “spin down,” which refers to a quantum measurement of the electron’s angular momentum. This spin produces a magnetic field, and the magnetic field is either aligned with the magnetic field of the atom’s nucleus, or it’s not. A maser that is exactly 9,192,631,770 Hz will force that outer electron to transition from spin up to spin down, or vice versa.
So here’s what’s going on. A gaseous ball of about 10 million cesium atoms is released into the bottom of a cylindrical vacuum chamber, which is the “fountain.” The chamber has four layers of magnetic shielding, and on the F2, the liquid nitrogen casing keeps temperatures around the entire system steady at about 80 Kelvin, or -193 degrees Celsius.
Lasers are used to slow and cool the atoms to near absolute zero, and then more lasers are used to toss the ball of cesium up the chamber. Along the way, the atoms’ outer electrons are prepared so that they are all aligned the same way relative to their nuclei. The ball of cesium passes through the maser at the top of the chamber. If the maser is calibrated to exactly 9,192,631,770 Hz, it will force every single one of the atoms’ outer electrons to transition. The ball of atoms then settles back down to the bottom of the chamber, where additional lasers are used to measure how many of the atoms transitioned.
“Okay, I just lied to you,” says Jefferts after I finally start to grasp the concept.
The problem is that if you measure 10 million cesium atoms and all but three transition, you know the maser frequency is ever-so-slightly off, but you don’t know if it is too high or too low. So Jefferts intentionally calibrates the maser at a frequency that’s too high and one that’s too low. When he has the two measurements equal, he can calculate the average frequency to land on 9,192,631,770 hertz.
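A toy servo loop captures the idea. Everything here, including the Lorentzian response curve, the linewidth, and the gain, is my own simplification for illustration, not NIST’s actual control code:

```python
# Toy model of the two-sided probe Jefferts describes: probe the atoms at a
# frequency slightly above and slightly below the current guess, and steer
# the guess until both probes flip an equal fraction of atoms.

TRUE_RESONANCE_HZ = 9_192_631_770.0  # cesium hyperfine transition

def transition_fraction(probe_hz, linewidth_hz=2.0):
    """Fraction of atoms whose outer electron flips at this probe frequency."""
    detuning = (probe_hz - TRUE_RESONANCE_HZ) / linewidth_hz
    return 1.0 / (1.0 + detuning ** 2)

def lock_to_resonance(guess_hz, offset_hz=1.0, gain=2.0, steps=200):
    """Steer the guess until the high-side and low-side probes agree."""
    for _ in range(steps):
        high = transition_fraction(guess_hz + offset_hz)
        low = transition_fraction(guess_hz - offset_hz)
        # If the high-side probe flips more atoms, we are below resonance,
        # so nudge the guess upward (and vice versa).
        guess_hz += gain * (high - low)
    return guess_hz

print(round(lock_to_resonance(9_192_631_765.0)))  # 9192631770
```

Because the response curve is symmetric about the resonance, equal signals on both sides mean the midpoint of the two probe frequencies sits exactly on the transition.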
That maser signal, carried through a cable, is then preserved in a high-tech temperature- and pressure-controlled vessel: a converted egg incubation chamber. The measurement is then used to evaluate a suite of commercial atomic clocks, also stored in egg incubation chambers, and give each a weighted grade. The times of the commercial clocks are averaged according to their weighted grades, and that sets the official civilian time for the United States. Simple, right?
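That last step, a weighted average over the clock ensemble, can be sketched like this. The readings and weights are invented for illustration; NIST’s real time-scale algorithm is far more elaborate:

```python
# Weighted average over an ensemble of clocks: better-graded clocks
# pull the official time more strongly toward their own reading.

def ensemble_time(readings, weights):
    """Weighted average of clock readings (in seconds)."""
    total_weight = sum(weights)
    return sum(r * w for r, w in zip(readings, weights)) / total_weight

# Three clocks disagreeing at the nanosecond level; the highest-weighted
# clock dominates the ensemble.
readings = [1000.000000001, 1000.000000004, 999.999999998]
weights = [0.7, 0.2, 0.1]
print(ensemble_time(readings, weights))
```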
THE UNCERTAIN FUTURE OF TIME
So 2016 had a leap second. But unlike leap years, leap seconds cannot be predicted far in advance. Climate change, which will melt ice at the poles and send that mass toward the equator as water, will make the inconsistencies in Earth’s rotation even more pronounced.
Continuing to correct for these changes with leap seconds is controversial. The leap second is expensive for computing and financial institutions, which must account for it in their calculations and software, so many people think we should simply let leap seconds accumulate until we have a full leap minute, then make the adjustment. Jefferts, however, agrees with most astronomers and thinks we should keep the leap second.
“I’m certainly in the minority around here from that point of view. I’m a sailor, and many years ago I taught celestial navigation, and so I sort of have this very personal linkage idea that, dammit, the sun should be overhead on the 21st of March or the 20th of March [the spring equinox] at noon at Greenwich… We should not be accumulating leap seconds so that that is no longer true.”
Leap seconds are far from the only ongoing uncertainty about time. The current definition of a second, 9,192,631,770 periods of a maser that will cause cesium to transition, isn’t perfect. For one thing, the duration of a second, as currently defined, is slightly different at altitude compared to sea level due to general relativity, so corrections need to be made. There’s also the more fundamental problem that our current measurement is not exactly 1⁄86,400 of a day. It could be more accurate, and as a matter of fact, we have clocks that are more accurate.
These optical clocks, as they are called, work similarly to cesium fountain clocks. However, instead of tossing a ball of atoms, they trap the atoms in place in a chamber with a system of lasers and then use one specific laser to cause them to transition. The difference is in the frequency. A laser in the visible spectrum has a frequency tens of thousands of times higher than the 9,192,631,770 Hz maser. A definition of the second based on an optical clock would therefore count hundreds of trillions of periods of radiation, sometimes called “ticks,” rather than roughly 9 billion, as it does now.
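The numbers make the appeal concrete. As a back-of-the-envelope comparison (the strontium figure is the published Sr-87 optical lattice clock transition frequency; using it here is my illustration, not an official redefinition):

```python
# Compare how finely each standard slices one second into "ticks."
CESIUM_HZ = 9_192_631_770            # current SI definition (microwave)
STRONTIUM_HZ = 429_228_004_229_873   # Sr-87 optical lattice clock transition

ratio = STRONTIUM_HZ / CESIUM_HZ
print(f"cesium ticks per second:    {CESIUM_HZ:,}")
print(f"strontium ticks per second: {STRONTIUM_HZ:,}")
print(f"an optical second is sliced about {ratio:,.0f} times more finely")
```

A finer tick means any given amount of noise or miscounting is a smaller fraction of a second, which is a large part of why optical clocks can outperform cesium fountains.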
The problem is no one can agree on which atoms are the best ones to use.
“Here in Boulder, if you walk over to JILA, Jun Ye will tell you that strontium is obviously the right atom to replace cesium with,” says Jefferts. “And if you walk down to the end of the hall here, those guys will tell you ytterbium is absolutely the right atom. And then if you walk to the hall over there, those guys will be like, no, it’s aluminum ions. Unless of course you go to the room next door, at which point it’s mercury ions, I’ll tell you it’s mercury ions!”
Additionally, lawmakers, who Jefferts says often have trouble grasping the definition of the second in the first place, are unlikely to change the standard unless they have a practical reason. If there is a financial incentive, or a need to make computers more accurate, for example, the official definition of a second could be changed to an optical clock measurement. But whether that will happen in our lifetimes is unknown.
Until then, like Jefferts, you might just want to memorize the number 9,192,631,770, lest you lose track of the time.