Atomic clock


An atomic clock is a clock that measures time by monitoring the resonant frequency of atoms. It is based on the fact that atoms have quantised energy levels, and transitions between such levels are driven by very specific frequencies of electromagnetic radiation. This phenomenon serves as the basis for the SI definition of the second:

The second, symbol s, is the SI unit of time. It is defined by taking the fixed numerical value of the caesium frequency ΔνCs, the unperturbed ground-state hyperfine transition frequency of the caesium-133 atom, to be 9 192 631 770 when expressed in the unit Hz, which is equal to s−1.

This definition underpins International Atomic Time (TAI), which is maintained by an ensemble of atomic clocks around the world. Coordinated Universal Time (UTC) — the basis of civil time — implements leap seconds to allow clock time to stay within one second of Earth's rotation.
The accurate time-keeping capabilities of atomic clocks are also used for navigation by satellite networks such as the EU’s Galileo Programme and the United States’ GPS. The timing accuracy of the atomic clocks matters because even a timing error of 1 nanosecond corresponds to a positional error of roughly 30 cm when multiplied by the speed of light.
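The 1 nanosecond → 30 cm relation is simply distance = speed of light × timing error; a minimal sketch:

```python
# The speed of light is exact by definition of the metre.
C = 299_792_458  # m/s

def position_error_m(timing_error_s: float) -> float:
    """Ranging error caused by a clock timing error: distance = c * dt."""
    return C * timing_error_s

# A 1 nanosecond timing error corresponds to roughly 0.3 m (30 cm).
print(position_error_m(1e-9))
```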
The main variety of atomic clock in use today employs caesium atoms cooled to near absolute zero. For example, the United States’ primary standard, the NIST caesium fountain clock named NIST-F2, operates with a relative uncertainty around 10−16.
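As a back-of-the-envelope illustration of what a relative uncertainty around 10−16 means, one can compute how long such a clock could run before drifting by a full second (the figures below are illustrative, not NIST-F2 specifications):

```python
# How long until a clock with a given relative uncertainty drifts by 1 s?
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def years_to_drift_one_second(relative_uncertainty: float) -> float:
    return (1.0 / relative_uncertainty) / SECONDS_PER_YEAR

# At 10^-16, the clock would take on the order of 300 million years
# to gain or lose a single second.
print(years_to_drift_one_second(1e-16))
```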

Recent advances

In July 2025, researchers at the National Institute of Standards and Technology in the United States reported a record-setting optical atomic clock based on a trapped aluminium ion. This “quantum logic” clock achieves a systematic uncertainty corresponding to around 19 decimal places of accuracy, a 41% improvement over the previous record, and it is 2.6 times more stable than any other ion clock.

Redefinition of the second

The rapid improvement in optical atomic clock performance has prompted the global time-and-frequency community to prepare for a possible redefinition of the SI second. In June 2025, a coordinated international comparison of optical clocks across six countries was reported — marking a major step towards establishing a global optical-time standard.

Technological impact

Optical atomic clocks are enabling new applications: ultra-precise time and frequency dissemination, improved global navigation satellite systems, relativistic geodesy, and tests of fundamental constants and general relativity.

History

The Scottish physicist James Clerk Maxwell proposed measuring time with the vibrations of light waves in his 1873 Treatise on Electricity and Magnetism: 'A more universal unit of time might be found by taking the periodic time of vibration of the particular kind of light whose wave length is the unit of length.' Maxwell argued this would be more accurate than the Earth's rotation, which defines the mean solar second for timekeeping.
During the 1930s, the American physicist Isidor Isaac Rabi developed atomic beam magnetic resonance, the technique on which the first atomic clocks were based.
The accuracy of mechanical, electromechanical and quartz clocks is reduced by temperature fluctuations. This led to the idea of measuring the frequency of an atom's vibrations to keep time more accurately, as proposed by James Clerk Maxwell, Lord Kelvin, and Isidor Rabi. A prototype based on the inversion transition of the ammonia molecule was developed in 1949. The first practical atomic clock using caesium atoms was built at the National Physical Laboratory in the United Kingdom in 1955 by Louis Essen in collaboration with Jack Parry.
In 1949, Alfred Kastler and Jean Brossel developed a technique called optical pumping, which uses light to drive electron energy-level transitions in atoms. The technique produces much stronger magnetic resonance and microwave absorption signals, but it has the side effect of shifting the resonant frequency (a light shift). Claude Cohen-Tannoudji and others managed to reduce these light shifts to acceptable levels.
Norman Ramsey developed a method of separated oscillating fields, now commonly known as Ramsey interferometry, that yields narrower resonances and works at higher frequencies. Kolsky, Phipps, Ramsey, and Silsbee used this technique for molecular beam spectroscopy in 1950.
After 1956, atomic clocks were studied by many groups, such as the National Institute of Standards and Technology in the USA, the Physikalisch-Technische Bundesanstalt in Germany, the National Research Council in Canada, the National Physical Laboratory in the United Kingdom, the International Time Bureau at the Paris Observatory, the National Radio Company, Bomac, Varian, Hewlett–Packard and Frequency & Time Systems.
During the 1950s, the National Radio Company sold more than 50 units of the first atomic clock, the Atomichron. In 1964, engineers at Hewlett-Packard released the 5060 rack-mounted model of caesium clocks.

Definition of the second

In 1968, the SI defined the duration of the second to be 9,192,631,770 vibrations of the unperturbed ground-state hyperfine transition frequency of the caesium-133 atom. Prior to that it was defined by there being 31,556,925.9747 seconds in the tropical year 1900. In 1997, the International Committee for Weights and Measures added that the preceding definition refers to a caesium atom at rest at a temperature of absolute zero. Following the 2019 revision of the SI, the definition of every base unit except the mole and almost every derived unit relies on the definition of the second.
Timekeeping researchers seek an even more stable atomic reference for the second, with a plan to adopt a more precise definition, based on optical clocks or the Rydberg constant, around 2030 as atomic clocks improve.

Metrology advancements and optical clocks

Technological developments such as lasers and optical frequency combs in the 1990s led to increasing accuracy of atomic clocks. Lasers make it possible to drive atomic transitions at optical frequencies, which are much higher than microwave frequencies, while optical frequency combs allow these fast optical oscillations to be measured with very high accuracy.
The first advance beyond the precision of caesium clocks occurred at NIST in 2010 with the demonstration of a "quantum logic" optical clock that used aluminium ions to achieve a precision of 8.6×10−18. Optical clocks are a very active area of research in the field of metrology as scientists work to develop clocks based on the elements ytterbium, mercury, aluminium, and strontium. Scientists at JILA demonstrated a strontium clock with a frequency precision of 2.1×10−18 in 2015. In 2019, scientists at NIST developed a quantum logic clock that measured a single aluminium ion with a frequency uncertainty of 9.4×10−19.
At JILA in September 2021, scientists demonstrated an optical strontium clock with a differential frequency precision of 7.6×10−21 between atomic ensembles separated by 1 mm. The second is expected to be redefined when the field of optical clocks matures, sometime around the year 2030 or 2034. In order for this to occur, optical clocks must be consistently capable of measuring frequency with accuracy at or better than 2×10−18. In addition, methods for reliably comparing different optical clocks around the world in national metrology labs must be demonstrated, and the comparison must show relative clock frequency accuracies at or better than 5×10−18.

Chip-scale atomic clocks

Reducing the size and power consumption of optical clocks is necessary to enable their use in geodesy and GPS navigation. In August 2004, NIST scientists demonstrated a chip-scale atomic clock that was 100 times smaller than an ordinary atomic clock and consumed far less power. The atomic clock was about the size of a grain of rice and operated at a frequency of about 9 GHz. This technology became available commercially in 2011.

Measuring time with atomic clocks

Clock mechanism

An atomic clock is based on a system of atoms which may be in one of two possible energy states. A group of atoms in one state is prepared, then subjected to microwave radiation. If the radiation is of the correct frequency, a number of atoms will transition to the other energy state. The closer the frequency is to the inherent oscillation frequency of the atoms, the more atoms will switch states. This correlation allows very accurate tuning of the frequency of the microwave radiation. Once the microwave radiation is adjusted to the frequency at which the maximum number of atoms switch states, the atoms, and thus their associated transition frequency, can be used as a timekeeping oscillator to measure elapsed time.
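The tuning loop described above can be sketched in code. The Lorentzian line shape, 1 Hz linewidth, and scan steps below are illustrative assumptions, not the actual servo of any particular clock:

```python
# Hypothetical servo sketch: scan the microwave synthesizer across the
# caesium hyperfine resonance and lock to the frequency that flips the
# most atoms.
NU0 = 9_192_631_770.0  # caesium-133 hyperfine frequency, Hz
LINEWIDTH = 1.0        # assumed resonance linewidth, Hz (illustrative)

def transition_fraction(freq_hz: float) -> float:
    """Modelled (Lorentzian) fraction of atoms switching states at this drive frequency."""
    detuning = freq_hz - NU0
    return 1.0 / (1.0 + (2.0 * detuning / LINEWIDTH) ** 2)

# Coarse scan around the resonance; keep the frequency with maximum response.
scan = [NU0 + step for step in (-0.4, -0.2, 0.0, 0.2, 0.4)]
locked_freq = max(scan, key=transition_fraction)
```

A real clock steers its local oscillator continuously rather than picking from a fixed scan, but the principle of maximizing the transition signal is the same.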
All timekeeping devices use oscillatory phenomena to measure time accurately, whether it is the rotation of the Earth for a sundial, the swinging of a pendulum in a grandfather clock, the vibrations of springs and gears in a watch, or voltage changes in a quartz crystal watch. However, all of these are easily affected by temperature changes and are not very accurate. The most accurate clocks use atomic vibrations to keep track of time. Clock transition states in atoms are insensitive to temperature and other environmental factors, and the oscillation frequency is much higher than that of any of the other clock types.
One of the most important factors in a clock's performance is the atomic line quality factor Q, which is defined as the ratio of the absolute frequency ν of the resonance to the linewidth Δν of the resonance itself, Q = ν/Δν. Atomic resonances have a much higher Q than mechanical devices. Atomic clocks can also be isolated from environmental effects to a much higher degree. Atomic clocks have the benefit that atoms are universal, which means that the oscillation frequency is also universal. This is different from quartz and mechanical time measurement devices that do not have a universal frequency.
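The quality factor Q = ν/Δν makes the microwave-versus-optical comparison concrete. The sketch below assumes a 1 Hz linewidth for both transitions purely for comparison; real linewidths differ:

```python
# Quality factor Q = resonance frequency / linewidth.
def quality_factor(resonance_hz: float, linewidth_hz: float) -> float:
    return resonance_hz / linewidth_hz

# Assumed 1 Hz linewidth in both cases, for illustration only.
q_caesium = quality_factor(9.192631770e9, 1.0)     # microwave transition
q_strontium = quality_factor(4.292280042e14, 1.0)  # strontium optical transition
print(q_strontium / q_caesium)  # the optical Q is tens of thousands of times higher
```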
A clock's quality can be specified by two parameters: accuracy and stability. Accuracy is a measurement of the degree to which the clock's ticking rate can be counted on to match some absolute standard such as the inherent hyperfine frequency of an isolated atom or ion. Stability describes how the clock performs when averaged over time to reduce the impact of noise and other short-term fluctuations.
The instability of an atomic clock is specified by its Allan deviation σy(τ). The limiting instability due to atom or ion counting statistics is given by

σy(τ) ≈ (Δν/ν) √(Tc/(N τ)),

where Δν is the spectroscopic linewidth of the clock system, ν is the clock's resonance frequency, N is the number of atoms or ions used in a single measurement, Tc is the time required for one cycle, and τ is the averaging period. This means instability is smaller when the linewidth Δν is smaller and when N and τ are larger. The stability improves as the time over which the measurements are averaged increases from seconds to hours to days. The stability is most heavily affected by the oscillator frequency ν. This is why optical clocks such as strontium clocks are much more stable than caesium clocks.
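The counting-statistics limit, Allan deviation ≈ (Δν/ν)·√(Tc/(N·τ)), can be evaluated numerically. The parameter values below are illustrative, loosely inspired by a caesium fountain, not the specification of any real clock:

```python
import math

def allan_deviation(linewidth_hz, freq_hz, n_atoms, cycle_time_s, tau_s):
    """Counting-statistics (quantum projection noise) limit:
    sigma_y(tau) ~ (linewidth / frequency) * sqrt(T_c / (N * tau))."""
    return (linewidth_hz / freq_hz) * math.sqrt(cycle_time_s / (n_atoms * tau_s))

# Illustrative parameters: 1 Hz linewidth, 9.19 GHz transition,
# 10^6 atoms, 1 s cycle time.
sigma_1s = allan_deviation(1.0, 9.192631770e9, 1_000_000, 1.0, 1.0)
sigma_1d = allan_deviation(1.0, 9.192631770e9, 1_000_000, 1.0, 86_400.0)
# Averaging for a day instead of a second improves stability by sqrt(86400).
```

Note how the higher transition frequency ν of an optical clock directly shrinks the Δν/ν prefactor, which is the point made in the text.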
Modern clocks such as atomic fountains or optical lattices that use sequential interrogation are found to generate a type of noise that mimics and adds to the instability inherent in atom or ion counting. This effect is called the Dick effect and is typically the primary stability limitation for the newer atomic clocks. It is an aliasing effect: high-frequency noise components in the local oscillator (LO) are heterodyned to near zero frequency by harmonics of the repeating variation in feedback sensitivity to the LO frequency. The effect places new and stringent requirements on the LO, which must now have low phase noise in addition to high stability, thereby increasing the cost and complexity of the system. For the case of an LO with flicker frequency noise, where the LO instability σyLO is independent of τ, the interrogation time is Ti, and the duty factor d = Ti/Tc has typical values of roughly 0.4 to 0.7, the Allan deviation can be approximated as

σy(τ) ≈ χ σyLO √(Tc/τ),

where χ is a numerical factor of order unity that depends on the duty factor d.
This expression shows the same dependence on τ as the counting-statistics limit does, and, for many of the newer clocks, it is significantly larger. Analysis of the effect and its consequences as applied to optical standards has been treated in a major review that lamented "the pernicious influence of the Dick effect", and in several other papers.