Second Industrial Revolution


The Second Industrial Revolution, also known as the Technological Revolution, was a phase of rapid scientific discovery, standardization, mass production, and industrialization from the late 19th century into the early 20th century. The First Industrial Revolution, which ended in the middle of the 19th century, was followed by a slowdown in important inventions before the Second Industrial Revolution began around 1870. Though a number of its events can be traced to earlier innovations in manufacturing, such as the establishment of a machine tool industry, the development of methods for manufacturing interchangeable parts, and the invention of the Bessemer process and open hearth furnace to produce steel, later developments heralded the Second Industrial Revolution, which is generally dated between 1870 and 1914, when World War I commenced.
Advancements in manufacturing and production technology enabled the widespread adoption of technological systems such as telegraph and railroad networks, gas and water supply, and sewage systems, which had earlier been limited to a few select cities. The enormous expansion of rail and telegraph lines after 1870 allowed unprecedented movement of people and ideas, which culminated in a new wave of colonialism and globalization. In the same time period, new technological systems were introduced, most significantly electrical power and telephones. The Second Industrial Revolution continued into the 20th century with early factory electrification and the production line; it ended at the beginning of World War I.
The Information Age, which began around 1947, is sometimes also called the Third Industrial Revolution.

Overview

The Second Industrial Revolution was a period of rapid industrial development, primarily in the United Kingdom, Germany, and the United States, but also in France, Italy, Japan, and the Low Countries. It followed on from the First Industrial Revolution, which flourished in Britain in the later 18th century and then spread throughout Western Europe, and it came to an end with the start of World War I. While the First Revolution was driven by limited use of steam engines, interchangeable parts, and mass production, and was largely water-powered, especially in the United States, the Second featured the build-out of railroads, large-scale iron and steel production, widespread use of machinery in manufacturing, greatly increased use of steam power, widespread use of the telegraph, use of petroleum, and the beginning of electrification. Education played a role, and modern organizational methods for operating large-scale businesses over vast areas came into use.
The concept of a second industrial revolution was introduced by Patrick Geddes in Cities in Evolution (1915) and was used by economists such as Erich Zimmermann, but it was David Landes' use of the term in a 1966 essay and in The Unbound Prometheus that standardized scholarly definitions; Alfred Chandler promoted the term most intensely. However, some continue to express reservations about its use. In 2003, Landes stressed the importance of new technologies, especially petroleum, new materials and substances, electricity, and communication technologies such as the telegraph, telephone, and radio.
Vaclav Smil has called the period from 1867 to 1914, during which most of the great innovations emerged, "The Age of Synergy", since the inventions and innovations were engineering- and science-based.

Industry and technology

A synergy between iron and steel, railroads, and coal developed at the beginning of the Second Industrial Revolution. Railroads allowed cheap transportation of materials and products, which in turn lowered the cost of rails used to build more railroads. Railroads also benefited from cheap coal for their steam locomotives. This synergy led to the laying of 75,000 miles of track in the U.S. in the 1880s, the largest amount anywhere in world history.

Iron

The hot blast technique, in which hot flue gas from a blast furnace is used to preheat the combustion air blown into the furnace, was invented and patented by James Beaumont Neilson in 1828 at Wilsontown Ironworks in Scotland. Hot blast was the single most important advance in the fuel efficiency of the blast furnace, as it greatly reduced the fuel consumed in making pig iron, and it was one of the most important technologies developed during the Industrial Revolution. Falling costs for producing wrought iron coincided with the emergence of the railway in the 1830s.
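A rough energy balance illustrates why hot blast saves fuel (a sketch with hypothetical round numbers, not figures from Neilson's practice). The sensible heat carried into the furnace by each kilogram of preheated blast is

\[ Q = c_p \,(T_{\text{blast}} - T_{\text{ambient}}) \approx 1\,\tfrac{\text{kJ}}{\text{kg K}} \times (300 - 25)\,\text{K} \approx 275\ \text{kJ per kg of air}, \]

heat that would otherwise have to be supplied by burning extra coke inside the furnace; and since the flue gas providing the preheat was previously wasted, the saving comes at almost no cost.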
The early technique of hot blast used iron for the regenerative heating medium, but repeated expansion and contraction stressed the iron and caused failures. Edward Alfred Cowper developed the Cowper stove in 1857. This stove used firebrick as the storage medium, solving the expansion and cracking problem. The Cowper stove was also capable of delivering very high blast temperatures, which resulted in very high throughput of blast furnaces. The Cowper stove is still used in today's blast furnaces.
With the greatly reduced cost of producing pig iron with coke using hot blast, demand grew dramatically and so did the size of blast furnaces.

Steel

The Bessemer process, invented by Sir Henry Bessemer, allowed the mass production of steel, increasing the scale and speed of production of this vital material and decreasing the labor requirements. The key principle was the removal of excess carbon and other impurities from pig iron by oxidation with air blown through the molten iron. The oxidation also raises the temperature of the iron mass and keeps it molten.
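In simplified form, the main converter reactions are the well-known oxidations

\[ \mathrm{Si} + \mathrm{O_2} \rightarrow \mathrm{SiO_2}, \qquad 2\,\mathrm{C} + \mathrm{O_2} \rightarrow 2\,\mathrm{CO}, \qquad 2\,\mathrm{Mn} + \mathrm{O_2} \rightarrow 2\,\mathrm{MnO}, \]

all strongly exothermic; the heat they release is what keeps the charge molten without any external fuel, which is why a converter "blow" could refine tons of iron in minutes at very low cost.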
The "acid" Bessemer process had a serious limitation in that it required relatively scarce hematite ore which is low in phosphorus. Sidney Gilchrist Thomas developed a more sophisticated process to eliminate the phosphorus from iron. Collaborating with his cousin, Percy Gilchrist a chemist at the Blaenavon Ironworks, Wales, he patented his process in 1878; Bolckow Vaughan & Co. in Yorkshire was the first company to use his patented process. His process was especially valuable on the continent of Europe, where the proportion of phosphoric iron was much greater than in England, and both in Belgium and in Germany the name of the inventor became more widely known than in his own country. In America, although non-phosphoric iron largely predominated, an immense interest was taken in the invention.
The next great advance in steel making was the Siemens–Martin process. Sir Charles William Siemens developed his regenerative furnace in the 1850s, claiming in 1857 to be able to recover enough heat to save 70–80% of the fuel. The furnace operated at a high temperature by using regenerative preheating of fuel and air for combustion. Through this method, an open-hearth furnace can reach temperatures high enough to melt steel, but Siemens did not initially use it in that manner.
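A simple steady-state balance suggests how regeneration can yield savings of this order (an illustrative model, not Siemens' own calculation). If the fuel supplies heat $Q_f$, the charge absorbs $Q_{\text{net}}$, the exhaust carries away $Q_{\text{exh}}$, and the regenerative checkerwork returns a fraction $r$ of the exhaust heat to the incoming air and fuel gas, then

\[ Q_f + r\,Q_{\text{exh}} = Q_{\text{net}} + Q_{\text{exh}} \quad\Longrightarrow\quad Q_f = Q_{\text{net}} + (1-r)\,Q_{\text{exh}}. \]

At the very high working temperatures of an open-hearth furnace the exhaust term dominates, so a recovery fraction $r$ of around 0.75 translates into fuel savings approaching the 70–80% Siemens claimed.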
French engineer Pierre-Émile Martin was the first to take out a license for the Siemens furnace and apply it to the production of steel in 1865. The Siemens–Martin process complemented rather than replaced the Bessemer process. Its main advantages were that it did not expose the steel to excessive nitrogen, it was easier to control, and that it permitted the melting and refining of large amounts of scrap steel, lowering steel production costs and recycling an otherwise troublesome waste material. It became the leading steel making process by the early 20th century.
The availability of cheap steel allowed the building of larger bridges, railroads, skyscrapers, and ships. Other important steel products, also made using the open hearth process, were steel cable, steel rod, and sheet steel, which enabled large, high-pressure boilers, while high-tensile-strength steel for machinery made possible much more powerful engines, gears, and axles than before. With large amounts of steel it also became possible to build much more powerful guns and carriages, as well as tanks, armored fighting vehicles, and naval ships.

Rail

The increase in steel production from the 1860s meant that railways could finally be made from steel at a competitive cost. Being a much more durable material, steel steadily replaced iron as the standard for railway rail, and due to its greater strength, longer lengths of rail could now be rolled. Wrought iron was soft and contained flaws caused by included dross; iron rails could not support heavy locomotives and were damaged by hammer blow. The first to make durable rails of steel rather than wrought iron was Robert Forester Mushet at the Darkhill Ironworks, Gloucestershire, in 1857.
The first of Mushet's steel rails was sent to Derby Midland railway station. The rails were laid on a part of the station approach where the iron rails had to be renewed at least every six months, and occasionally every three. Six years later, in 1863, the rails seemed as perfect as ever, although some 700 trains had passed over them daily. This provided the basis for the accelerated construction of railways throughout the world in the late nineteenth century.
The first commercially available steel rails in the US were manufactured in 1867 at the Cambria Iron Works in Johnstown, Pennsylvania.
Steel rails lasted over ten times longer than did iron, and with the falling cost of steel, heavier weight rails were used. This allowed the use of more powerful locomotives, which could pull longer trains, and longer rail cars, all of which greatly increased the productivity of railroads. Rail became the dominant form of transport infrastructure throughout the industrialized world, producing a steady decrease in the cost of shipping seen for the rest of the century.

Electrification

The theoretical and practical basis for the harnessing of electric power was laid by the scientist and experimentalist Michael Faraday. Through his research on the magnetic field around a conductor carrying a direct current, Faraday established the basis for the concept of the electromagnetic field in physics. His inventions of electromagnetic rotary devices were the foundation of the practical use of electricity in technology.
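In modern notation, the induction principle Faraday established is usually written

\[ \mathcal{E} = -\frac{d\Phi_B}{dt}, \]

where $\mathcal{E}$ is the electromotive force induced in a circuit and $\Phi_B$ the magnetic flux through it: a changing magnetic flux drives a current. This single relation underlies the generator, which converts mechanical motion into current, and the transformer, which later proved essential to AC distribution.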
In 1881, Sir Joseph Swan, inventor of the first feasible incandescent light bulb, supplied about 1,200 Swan incandescent lamps to the Savoy Theatre in the City of Westminster, London, which was the first theater, and the first public building in the world, to be lit entirely by electricity. Swan's light bulb had already been used in 1879 to light Mosley Street in Newcastle upon Tyne, the first electric street lighting installation in the world. This set the stage for the electrification of industry and the home. The first large-scale central distribution supply plant was opened at Holborn Viaduct in London in 1882, and later that year at Pearl Street Station in New York City.
[Figure: Three-phase rotating magnetic field of an AC motor. The three poles are each connected to a separate wire, and each wire carries current 120 degrees apart in phase; arrows show the resulting magnetic force vectors. Three-phase current is used in commerce and industry.]
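A short calculation shows why three such currents produce a rotating field. Take three windings whose magnetic axes point along the angles $\theta_k = 2\pi k/3$ for $k = 0, 1, 2$, each carrying a current lagging the previous one by 120°; summing the three pulsating fields gives

\[ \mathbf{B}(t) = \sum_{k=0}^{2} B_m \cos\!\left(\omega t - \tfrac{2\pi k}{3}\right) \begin{pmatrix} \cos \tfrac{2\pi k}{3} \\ \sin \tfrac{2\pi k}{3} \end{pmatrix} = \tfrac{3}{2}\, B_m \begin{pmatrix} \cos \omega t \\ \sin \omega t \end{pmatrix}, \]

a field of constant magnitude rotating at the supply frequency $\omega$. A conducting rotor placed in this field is dragged around with it, which is the operating principle of the AC induction motor.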
The first modern power station in the world was built by the English electrical engineer Sebastian de Ferranti at Deptford. Built on an unprecedented scale and pioneering the use of high voltage alternating current, it generated 800 kilowatts and supplied central London. On its completion in 1891 it supplied high-voltage AC power that was then "stepped down" with transformers for consumer use on each street. Electrification allowed the final major developments in manufacturing methods of the Second Industrial Revolution, namely the assembly line and mass production.
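The "stepping down" follows from the transformer turns ratio; for an ideal transformer with $N_p$ primary and $N_s$ secondary turns,

\[ \frac{V_s}{V_p} = \frac{N_s}{N_p}, \]

so power could be transmitted from Deptford at high voltage and low current, keeping resistive line losses (which scale as $I^2 R$) small, and then delivered to each street at a safe low voltage.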
Electrification was called "the most important engineering achievement of the 20th century" by the National Academy of Engineering. Electric lighting in factories greatly improved working conditions, eliminating the heat and pollution caused by gas lighting and reducing the fire hazard, to the extent that the cost of electricity for lighting was often offset by the reduction in fire insurance premiums. Frank J. Sprague developed the first successful DC motor in 1886. By 1889, 110 electric street railways were either using his equipment or in planning. The electric street railway became a major piece of infrastructure before 1920. The AC motor was developed in the 1890s and soon began to be used in the electrification of industry. Household electrification did not become common until the 1920s, and then only in cities. Fluorescent lighting was commercially introduced at the 1939 World's Fair.
Electrification also allowed the inexpensive production of electro-chemicals, such as aluminium, chlorine, sodium hydroxide, and magnesium.