Age of Earth


The age of Earth is estimated to be 4.54 ± 0.05 billion years. This age represents the final stages of Earth's accretion and planetary differentiation. Age estimates are based on evidence from radiometric age-dating of meteoritic material—consistent with the radiometric ages of the oldest-known terrestrial material and lunar samples—and astrophysical accretion models consistent with observations of planet formation in protoplanetary disks.
Following the development of radiometric dating in the early 20th century, measurements of lead in uranium-rich minerals showed that some were in excess of a billion years old. The oldest such minerals analyzed to date—small crystals of zircon from the Jack Hills of Western Australia—are at least 4.404 billion years old. Calcium–aluminium-rich inclusions—the oldest known solid constituents within meteorites that formed within the Solar System—are 4.5673 ± 0.00016 billion years old, giving a lower limit for the age of the Solar System.
It is hypothesized that the accretion of Earth began soon after the formation of the calcium-aluminium-rich inclusions. Because the duration of this accretion process is not yet adequately constrained—predictions from different accretion models range from around 30 million to 100 million years—the difference between the age of Earth and of the oldest rocks is difficult to determine. It can also be difficult to determine the exact age of the oldest rocks on Earth, exposed at the surface, as they are aggregates of minerals of possibly different ages.

Development of modern geologic concepts

Studies of strata—the layering of rocks and soil—gave naturalists an appreciation that Earth may have been through many changes during its existence. These layers often contained fossilized remains of unknown creatures, leading some to interpret a progression of organisms from layer to layer.
Nicolas Steno in the 17th century was one of the first naturalists to appreciate the connection between fossil remains and strata. His observations led him to formulate important stratigraphic concepts. In the 1790s, William Smith hypothesized that if two layers of rock at widely differing locations contained similar fossils, then it was very plausible that the layers were the same age. Smith's nephew and student, John Phillips, later calculated by such means that Earth was about 96 million years old.
In the mid-18th century, the naturalist Mikhail Lomonosov suggested that Earth had been created separately from, and several hundred thousand years before, the rest of the universe. Lomonosov's ideas were mostly speculative. In 1779 the Comte de Buffon tried to obtain a value for the age of Earth using an experiment: he created a small globe that resembled Earth in composition and then measured its rate of cooling. This led him to estimate that Earth was about 75,000 years old. Even earlier, in 1687, the mathematician and physicist Isaac Newton had estimated in his Principia, by modeling the cooling of a globe of red-hot iron the same size as Earth, that such a body would take around 50,000 years to cool—a line of reasoning that Lord Kelvin would later follow in his own attempts to calculate the age of Earth.
Other naturalists used these hypotheses to construct a history of Earth, though their timelines were inexact as they did not know how long it took to lay down stratigraphic layers. In 1830, geologist Charles Lyell, developing ideas found in James Hutton's works, popularized the concept that the features of Earth were in perpetual change, eroding and reforming continuously, and the rate of this change was roughly constant. This was a challenge to the traditional view, which saw the history of Earth as dominated by intermittent catastrophes. Many naturalists were influenced by Lyell to become "uniformitarians" who believed that changes were constant and uniform.

Early calculations

In 1862, the physicist William Thomson, 1st Baron Kelvin published calculations that fixed the age of Earth at between 20 million and 400 million years. He assumed that Earth had formed as a completely molten object, and determined the amount of time it would take for the near-surface temperature gradient to decrease to its present value. His calculations did not account for heat produced via radioactive decay or, more significantly, convection inside Earth, which allows the temperature in the upper mantle to remain high far longer, maintaining a steep thermal gradient in the crust. Even more constraining were Thomson's estimates of the age of the Sun, which were based on estimates of its thermal output and a theory that the Sun obtains its energy from gravitational collapse; Thomson estimated that the Sun is about 20 million years old.
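Thomson's argument can be sketched in a few lines. The numbers below are illustrative assumptions chosen near the figures Kelvin is reported to have used, not his actual working:

```python
import math

# Kelvin's conductive-cooling estimate, sketched. For a half-space cooling
# by conduction alone, the surface gradient obeys
#     gradient = T0 / sqrt(pi * kappa * t),
# so the cooling time is t = (T0 / gradient)^2 / (pi * kappa).
T0 = 3900.0         # assumed initial (molten) temperature, deg C
gradient = 0.0365   # assumed near-surface gradient, deg C per metre (~1 degF per 50 ft)
kappa = 1.2e-6      # assumed thermal diffusivity of rock, m^2/s

t_seconds = (T0 / gradient) ** 2 / (math.pi * kappa)
t_years = t_seconds / 3.156e7   # seconds per year
# roughly 1e8 years, inside Kelvin's 20-400 million year range
```

Because the result scales with the square of the assumed gradient and ignores both convection and radiogenic heating, small changes in assumptions swing the answer widely, which is why Kelvin's published range was so broad.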
Geologists such as Lyell had difficulty accepting such a short age for Earth. For biologists, even 100 million years seemed much too short to be plausible. In Charles Darwin's theory of evolution, the process of random heritable variation with cumulative selection requires great durations of time, and Darwin stated that Thomson's estimates did not appear to provide enough time. According to modern biology, the total evolutionary history from the beginning of life to today has taken place over the past 3.5 to 3.8 billion years, the amount of time that has passed since the last universal ancestor of all living organisms, as shown by geological dating.
In a lecture in 1869, Darwin's great advocate, Thomas Henry Huxley, attacked Thomson's calculations, suggesting they appeared precise in themselves but were based on faulty assumptions. The physicist Hermann von Helmholtz and astronomer Simon Newcomb contributed their own calculations of 22 and 18 million years, respectively, to the debate: they independently calculated the amount of time it would take for the Sun to condense down to its current diameter and brightness from the nebula of gas and dust from which it was born. Their values were consistent with Thomson's calculations. However, they assumed that the Sun was only glowing from the heat of its gravitational contraction. The process of solar nuclear fusion was not yet known to science.
In 1892, Thomson was ennobled as Lord Kelvin in appreciation of his many scientific accomplishments. In 1895 John Perry challenged Kelvin's figure on the basis of Kelvin's assumptions about conductivity, and Oliver Heaviside entered the dialogue, considering it "a vehicle to display the ability of his operator method to solve problems of astonishing complexity." Other scientists backed up Kelvin's figures. Darwin's son, the astronomer George H. Darwin, proposed that Earth and Moon had broken apart in their early days when they were both molten. He calculated the amount of time it would have taken for tidal friction to give Earth its current 24-hour day. His value of 56 million years was additional evidence that Thomson was on the right track. The last estimate Kelvin gave, in 1897, was: "that it was more than 20 and less than 40 million years old, and probably much nearer 20 than 40". In 1899 and 1900, John Joly calculated the rate at which the oceans should have accumulated salt from erosion processes and determined that the oceans were about 80 to 100 million years old.
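Joly's "sodium clock" is simple division: the sodium now dissolved in the oceans over the annual amount delivered by rivers. The figures below are illustrative round numbers, not Joly's own data:

```python
# Joly-style ocean-salt age estimate (illustrative values, assumed here).
ocean_sodium_tonnes = 1.5e16          # assumed total sodium dissolved in the oceans
river_input_tonnes_per_year = 1.6e8   # assumed annual sodium delivered by rivers

age_years = ocean_sodium_tonnes / river_input_tonnes_per_year
# on the order of 90 million years, near Joly's 80-100 million year result
```

The method assumes sodium only accumulates; because sodium is in fact also removed from seawater, the quotient measures a residence time rather than the age of the oceans, which is why Joly's figure fell so far short.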

Radiometric dating

Overview

By their chemical nature, rock minerals contain certain elements and not others; but in rocks containing radioactive isotopes, the process of radioactive decay generates exotic elements over time. By measuring the concentration of the stable end product of the decay, coupled with knowledge of the half-life and initial concentration of the decaying element, the age of the rock can be calculated. Typical radioactive end products are argon from decay of potassium-40, and lead from decay of uranium and thorium. If the rock becomes molten, as happens in Earth's mantle, such nonradioactive end products typically escape or are redistributed. Thus the age of the oldest terrestrial rock gives a minimum for the age of Earth, assuming that no rock has been intact for longer than Earth itself.
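The principle above reduces to one formula. If the parent decays as N(t) = N0·e^(−λt) and the daughter accumulates as D = N0 − N, then t = (1/λ)·ln(1 + D/N). A minimal sketch, assuming a closed system with no initial daughter present:

```python
import math

def age_from_ratio(daughter_over_parent, half_life_years):
    """Age in years from the measured daughter-to-parent atom ratio.

    Assumes a closed system and zero initial daughter concentration.
    """
    lam = math.log(2.0) / half_life_years   # decay constant
    return math.log(1.0 + daughter_over_parent) / lam

# Sanity check: when daughter equals parent (ratio 1), exactly one
# half-life has elapsed, e.g. 4.468 billion years for uranium-238.
age_from_ratio(1.0, 4.468e9)
```

Real systems need corrections (initial daughter, branching decays, open-system behavior), but this relation is the core of every method named above.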

Convective mantle and radioactivity

The discovery of radioactivity introduced another factor in the calculation. After Henri Becquerel's initial discovery in 1896, Marie and Pierre Curie discovered the radioactive elements polonium and radium in 1898; and in 1903, Pierre Curie and Albert Laborde announced that radium produces enough heat to melt its own weight in ice in less than an hour. Geologists quickly realized that this upset the assumptions underlying most calculations of the age of Earth. These had assumed that the original heat of Earth and the Sun had dissipated steadily into space, but radioactive decay meant that this heat had been continually replenished. George Darwin and John Joly were the first to point this out, in 1903.

Invention of radiometric dating

Radioactivity, which had overthrown the old calculations, yielded a bonus by providing a basis for new calculations, in the form of radiometric dating.
Ernest Rutherford and Frederick Soddy had jointly continued their work on radioactive materials and concluded that radioactivity was caused by a spontaneous transmutation of atomic elements. In radioactive decay, an element breaks down into another, lighter element, releasing alpha, beta, or gamma radiation in the process. They also determined that a particular isotope of a radioactive element decays into another element at a distinctive rate. This rate is given in terms of a "half-life", or the amount of time it takes half of a mass of that radioactive material to break down into its "decay product".
Some radioactive materials have short half-lives; some have long half-lives. Uranium and thorium have long half-lives and so persist in Earth's crust, but radioactive elements with short half-lives have generally disappeared. This suggested that it might be possible to measure the age of Earth by determining the relative proportions of radioactive materials in geological samples. In reality, radioactive elements do not always decay into nonradioactive elements directly; instead, they decay into other radioactive elements that have their own half-lives, and so on, until they reach a stable element. These "decay chains", such as the uranium–radium and thorium series, were known within a few years of the discovery of radioactivity and provided a basis for constructing techniques of radiometric dating.
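Why short-lived isotopes have vanished is a direct consequence of exponential decay: the surviving fraction after time t is (1/2)^(t/half-life). A short illustration over the age of Earth:

```python
def fraction_remaining(age_years, half_life_years):
    """Fraction of an isotope surviving after age_years of decay."""
    return 0.5 ** (age_years / half_life_years)

age_earth = 4.54e9  # years

# Uranium-238 (half-life ~4.468 billion years): about half still survives.
u238_left = fraction_remaining(age_earth, 4.468e9)

# A hypothetical isotope with a 1-million-year half-life has gone through
# thousands of half-lives: effectively nothing primordial remains.
short_lived_left = fraction_remaining(age_earth, 1.0e6)
```

This is exactly the pattern described above: long-lived uranium and thorium persist in the crust, while primordial short-lived isotopes are gone.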
The pioneers of radiometric dating were the chemist Bertram B. Boltwood and the physicist Rutherford. Boltwood had conducted studies of radioactive materials as a consultant, and when Rutherford lectured at Yale in 1904, Boltwood was inspired to describe the relationships between elements in various decay series. Late in 1904, Rutherford took the first step toward radiometric dating by suggesting that the alpha particles released by radioactive decay could be trapped in a rocky material as helium atoms. At the time, Rutherford was only guessing at the relationship between alpha particles and helium atoms, but he would prove the connection four years later.
Soddy and Sir William Ramsay had just determined the rate at which radium produces alpha particles, and Rutherford proposed that he could determine the age of a rock sample by measuring its concentration of helium. He dated a rock in his possession to an age of 40 million years by this technique, and described the work in an address to the Royal Institution in 1904.
Rutherford assumed that the rate of decay of radium as determined by Ramsay and Soddy was accurate and that helium did not escape from the sample over time. Rutherford's scheme was inaccurate, but it was a useful first step. Boltwood focused on the end products of decay series. In 1905, he suggested that lead was the final stable product of the decay of radium. It was already known that radium was an intermediate product of the decay of uranium. Rutherford joined in, outlining a decay process in which radium emitted five alpha particles through various intermediate products to end up with lead, and speculated that the radium–lead decay chain could be used to date rock samples. Boltwood did the legwork and by the end of 1905 had provided dates for 26 separate rock samples, ranging from 92 to 570 million years. He did not publish these results, which was fortunate because they were flawed by measurement errors and poor estimates of the half-life of radium. Boltwood refined his work and finally published the results in 1907.
Boltwood's paper pointed out that samples taken from comparable layers of strata had similar lead-to-uranium ratios, and that samples from older layers had a higher proportion of lead, except where there was evidence that lead had leached out of the sample. His studies were flawed by the fact that the decay series of thorium was not understood, which led to incorrect results for samples that contained both uranium and thorium. However, his calculations were far more accurate than any that had been performed to that time. Refinements in the technique would later give ages for Boltwood's 26 samples of 410 million to 2.2 billion years.
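Boltwood's chemical lead-to-uranium ages can be reproduced in modern terms with the standard decay relation. This is an illustrative sketch, not Boltwood's own procedure: it uses the modern uranium-238 decay constant and ignores the lead-207 produced by uranium-235 and any initial lead, complications that modern U–Pb dating handles explicitly. The sample ratios below are hypothetical:

```python
import math

LAMBDA_U238 = 1.55125e-10  # modern uranium-238 decay constant, per year

def pb_u_age(pb_u_ratio, lam=LAMBDA_U238):
    """Age in years from a bulk lead-to-uranium atom ratio.

    Simplified: assumes all lead is radiogenic Pb-206 from U-238 decay.
    """
    return math.log(1.0 + pb_u_ratio) / lam

# Hypothetical ratios spanning the revised range for Boltwood's samples:
young = pb_u_age(0.065)  # roughly 0.4 billion years
old = pb_u_age(0.40)     # roughly 2.2 billion years
```

Because the age depends on the assumed decay constant, any error in the half-life shifts every computed date by nearly the same proportion, which is how refined constants moved Boltwood's 92–570 million year dates to 410 million–2.2 billion years.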