Atomic Age


The Atomic Age, also known as the Atomic Era, is the period of history following the detonation of the first nuclear weapon, the Gadget, at the Trinity test in New Mexico on 16 July 1945 during World War II. Although nuclear chain reactions had been hypothesized in 1933 and the first artificial self-sustaining nuclear chain reaction had taken place in December 1942, the Trinity test and the ensuing bombings of Hiroshima and Nagasaki that ended World War II represented the first large-scale use of nuclear technology and ushered in profound changes in sociopolitical thinking and the course of technological development.
While atomic power was promoted for a time as the epitome of progress and modernity, entering the nuclear era also carried frightful implications: nuclear warfare, the Cold War, mutual assured destruction, nuclear proliferation, and the risk of nuclear disaster, alongside beneficial civilian applications such as nuclear medicine. Peaceful uses of nuclear technology are not easily separated from military or terrorist uses, a difficulty that complicated the development of a global nuclear-power export industry from the outset.
In 1973, with the nuclear power industry flourishing, the United States Atomic Energy Commission predicted that by the turn of the 21st century, 1,000 reactors would be producing electricity for homes and businesses across the U.S. However, the "nuclear dream" fell far short of what was promised: nuclear technology produced a range of social problems, from the nuclear arms race to nuclear meltdowns, along with the unresolved difficulties of bomb plant cleanup and of civilian plant waste disposal and decommissioning. After 1973, reactor orders declined sharply as electricity demand fell and construction costs rose, and many orders and partially completed plants were cancelled.
By the late 1970s, nuclear power had suffered a remarkable international destabilization as it faced economic difficulties and widespread public opposition, culminating in the Three Mile Island accident in 1979 and the Chernobyl disaster in 1986, both of which adversely affected the nuclear power industry for decades afterward.

Early years

In 1901, Frederick Soddy and Ernest Rutherford discovered that radioactivity was part of the process by which atoms changed from one kind to another, involving the release of energy. Soddy wrote in popular magazines that radioactivity was a potentially "inexhaustible" source of energy and offered a vision of an atomic future where it would be possible to "transform a desert continent, thaw the frozen poles, and make the whole earth one smiling Garden of Eden." The promise of an "atomic age", with nuclear energy as the global, utopian technology for the satisfaction of human needs, has been a recurring theme ever since. But Soddy "also saw that atomic energy could possibly be used to create terrible new weapons".
The concept of a nuclear chain reaction was hypothesized in 1933, shortly after James Chadwick's discovery of the neutron. Only a few years later, in December 1938, nuclear fission was discovered by Otto Hahn and his assistant Fritz Strassmann. Hahn understood that a "burst" of the atomic nuclei had occurred. The first artificial self-sustaining nuclear chain reaction took place at Chicago Pile-1 in December 1942, under the leadership of Enrico Fermi.
In 1945, the pocketbook The Atomic Age heralded the untapped atomic power in everyday objects and depicted a future in which fossil fuels would go unused. The science writer David Dietz wrote that instead of filling the gas tank of a car two or three times a week, a driver would travel for a year on a pellet of atomic energy the size of a vitamin pill. Glenn T. Seaborg, who chaired the Atomic Energy Commission, wrote that "there will be nuclear powered earth-to-moon shuttles, nuclear powered artificial hearts, plutonium heated swimming pools for SCUBA divers, and much more".

World War II

The phrase Atomic Age was coined by William L. Laurence, a journalist with The New York Times, who became the official journalist for the Manhattan Project, which developed the first nuclear weapons. He witnessed both the Trinity test and the bombing of Nagasaki and went on to write a series of articles extolling the virtues of the new weapon. His reporting before and after the bombings helped to spur public awareness of the potential of nuclear technology and in part motivated development of the technology in the U.S. and in the Soviet Union, which went on to test its first nuclear weapon in 1949.
In 1949, U.S. Atomic Energy Commission chairman David Lilienthal stated that "atomic energy is not simply a search for new energy, but more significantly a beginning of human history in which faith in knowledge can vitalize man's whole life".

1950s

The phrase gained popularity as a feeling of nuclear optimism emerged in the 1950s, in which it was believed that all power generators in the future would be atomic in nature. The atomic bomb would render all conventional explosives obsolete, and nuclear power plants would do the same for power sources such as coal and oil. There was a general feeling that everything would use a nuclear power source of some sort in a positive and productive way, from irradiating food to preserve it to the development of nuclear medicine. There would be an age of peace and plenty in which atomic energy would "provide the power needed to desalinate water for the thirsty, irrigate the deserts for the hungry, and fuel interstellar travel deep into outer space". This would make the Atomic Age as significant a step in technological progress as the first smelting of bronze or iron, or the commencement of the Industrial Revolution.
This optimism extended even to cars, leading Ford Motor Company to display the Ford Nucleon concept car to the public in 1958. There were also promises of golf balls that could always be found and of nuclear-powered aircraft, on which the U.S. federal government spent US$1.5 billion in research. Nuclear policymaking became almost a collective technocratic fantasy, or at least was driven by fantasy:

The very idea of splitting the atom had an almost magical grip on the imaginations of inventors and policymakers. As soon as someone said—in an even mildly credible way—that these things could be done, then people quickly convinced themselves... that they would be done.

In the US, military planners "believed that demonstrating the civilian applications of the atom would also affirm the American system of private enterprise, showcase the expertise of scientists, increase personal living standards, and defend the democratic lifestyle against communism". Some media reports predicted that, thanks to the giant nuclear power stations of the near future, electricity would soon become much cheaper and that electricity meters would be removed, because power would be "too cheap to meter".
When the Shippingport reactor went online in 1957, it produced electricity at roughly ten times the cost of coal-fired generation. Scientists at the AEC's own Brookhaven Laboratory "wrote a 1958 report describing accident scenarios in which 3,000 people would die immediately, with another 40,000 injured". However, Shippingport was an experimental reactor using highly enriched uranium and had originally been intended for a nuclear-powered aircraft carrier.
Kenneth Nichols, a consultant for the Connecticut Yankee and Yankee Rowe nuclear power stations, wrote that although the plants were considered "experimental" and not expected to be competitive with coal and oil, they "became competitive because of inflation... and the large increase in price of coal and oil." He wrote that for nuclear power stations the capital cost is the major cost factor over the life of the plant, hence "antinukes" try to increase costs and building time through changing regulations and lengthy hearings, so that "it takes almost twice as long to build an atomic power plant in the United States as in France, Japan, Taiwan or South Korea." France's pressurised-water nuclear plants produce 60% of the country's electric power and have proven to be much cheaper than oil or coal.

Atomic City

During the 1950s, Las Vegas earned the nickname "Atomic City" as it became a hotspot where tourists gathered to watch above-ground nuclear weapons tests at the Nevada Test Site. Following the detonation of Able, one of the first atomic bombs dropped at the site, the Las Vegas Chamber of Commerce began advertising the tests to tourists as an entertainment spectacle.
The detonations proved popular, and casinos throughout the city capitalised on the tests by advertising hotel rooms or rooftops offering views of the test site, or by planning "Dawn Bomb Parties" where people came together to celebrate the detonations. Most parties started at midnight, and musicians would perform until 4:00 a.m., when the party would briefly stop so guests could silently watch the detonation. Some casinos capitalised further by creating so-called "atomic cocktails", a mixture of vodka, cognac, sherry and champagne. Meanwhile, tourists would drive out into the desert with family or friends to watch the detonations.
Despite the health risks associated with nuclear fallout, tourists and viewers were told simply to "shower". In later years, however, many who had worked at the test site or lived in areas exposed to fallout fell ill, with higher rates of cancer and premature death.

1960s

The term "atomic age" was initially used in a positive, futuristic sense, but by the 1960s the threats posed by nuclear weapons had begun to edge out nuclear power as the dominant motif of the atom. In the Thunderbirds TV series, a set of vehicles was presented that were imagined to be completely nuclear, as shown in cutaways presented in their comic-books.