Information Age


The Information Age is a historical period that began in the mid-20th century. It is characterized by a rapid shift from traditional industries, as established during the Industrial Revolution, to an economy centered on information technology. The onset of the Information Age has been linked to the development of the transistor in 1947. This technological advance has had a significant impact on the way information is processed and transmitted.
According to the United Nations Public Administration Network, the Information Age was formed by capitalizing on advances in computer miniaturization, which modernized information systems and made internet communications the driving force of social evolution.
There is ongoing debate concerning whether the Third Industrial Revolution has already ended, and whether the Fourth Industrial Revolution has already begun due to recent breakthroughs in areas such as artificial intelligence and biotechnology. This next transition has been theorized to herald the advent of the Imagination Age, the Internet of Things, and rapid advances in machine learning.

History

The digital revolution converted technology from analog to digital formats. By doing this, it became possible to make copies that were identical to the original. In digital communications, for example, repeating hardware was able to regenerate the digital signal and pass it on with no loss of information. Of equal importance to the revolution was the ability to easily move the digital information between media and to access or distribute it remotely. One turning point of the revolution was the change from analog to digitally recorded music. During the 1980s, the digital format of optical compact discs gradually replaced analog formats, such as vinyl records and cassette tapes, as the popular medium of choice.
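The difference can be illustrated with a minimal sketch in Python (hypothetical signal values, not drawn from any particular system): an analog copy adds a little noise at every generation, while a digital copy reproduces the symbols exactly.

    import random

    def analog_copy(signal, noise=0.01):
        # Each analog generation adds a small random error to every sample.
        return [s + random.uniform(-noise, noise) for s in signal]

    def digital_copy(bits):
        # A digital copy reproduces the symbols exactly, so no error accumulates.
        return list(bits)

    analog_original = [0.0, 0.5, 1.0, 0.5]
    digital_original = [0, 1, 1, 0]

    # Make ten successive copies of each signal.
    a, d = analog_original, digital_original
    for _ in range(10):
        a, d = analog_copy(a), digital_copy(d)

    print(a == analog_original)   # almost certainly False: noise has accumulated
    print(d == digital_original)  # True: every copy is identical to the original

The same regeneration principle allows digital repeaters to pass a signal over long distances without accumulating noise.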

Previous inventions

Humans have manufactured tools for counting and calculating since ancient times, such as the abacus, astrolabe, equatorium, and mechanical timekeeping devices. More complicated devices started appearing in the 1600s, including the slide rule and mechanical calculators. By the early 1800s, the Industrial Revolution had produced mass-market calculators like the arithmometer and the enabling technology of the punch card. Charles Babbage proposed a mechanical general-purpose computer called the Analytical Engine, but it was never successfully built and was largely forgotten by the 20th century, unknown to most of the inventors of modern computers.
The Second Industrial Revolution, in the last quarter of the 19th century, developed useful electrical circuits and the telegraph. In the 1880s, Herman Hollerith developed electromechanical tabulating and calculating devices using punch cards and unit record equipment, which became widespread in business and government.
Meanwhile, various analog computer systems used electrical, mechanical, or hydraulic systems to model problems and calculate answers. These included an 1872 tide-predicting machine, differential analysers, perpetual calendar machines, the Deltar for water management in the Netherlands, network analyzers for electrical systems, and various machines for aiming military guns and bombs. The construction of problem-specific analog computers continued into the late 1940s and beyond, with FERMIAC for neutron transport, Project Cyclone for various military applications, and the Phillips Machine for economic modeling.
Building on the Z1 and Z2, German inventor Konrad Zuse used electromechanical systems to complete the Z3 in 1941, the world's first working programmable, fully automatic digital computer. Also during World War II, Allied engineers constructed electromechanical bombes to break the encryption of the German Enigma machine. The base-10 electromechanical Harvard Mark I was completed in 1944, and was influenced to some degree by Charles Babbage's designs.

1947–1969: Origins

In 1947, the first working transistor, the germanium-based point-contact transistor, was invented by John Bardeen and Walter Houser Brattain while working under William Shockley at Bell Labs. This paved the way for more advanced digital computers. From the late 1940s, universities, the military, and businesses developed computer systems to digitally replicate and automate previously manually performed mathematical calculations, with the LEO being the first commercially available general-purpose computer.
Digital communication became economical for widespread adoption after the invention of the personal computer in the 1970s. Claude Shannon, a Bell Labs mathematician, is generally credited with laying the foundations of digitalization in his pioneering 1948 article, A Mathematical Theory of Communication.
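That paper quantifies information in bits; as a brief illustration of its framework, the entropy of a discrete source X is

    H(X) = -\sum_{x} p(x) \log_2 p(x)

where p(x) is the probability of symbol x, and H(X) gives the minimum average number of bits needed to encode the source without loss.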
In 1948, Bardeen and Brattain patented an insulated-gate transistor with an inversion layer; their concept forms the basis of CMOS and DRAM technology today. In 1957, at Bell Labs, Frosch and Derick were able to manufacture planar silicon dioxide transistors; later, a team at Bell Labs demonstrated a working MOSFET. The first integrated circuit milestone was achieved by Jack Kilby in 1958.
Other important technological developments included the invention of the monolithic integrated circuit chip by Robert Noyce at Fairchild Semiconductor in 1959, made possible by the planar process developed by Jean Hoerni. In 1963, complementary MOS was developed by Chih-Tang Sah and Frank Wanlass at Fairchild Semiconductor. The self-aligned gate transistor, which further facilitated mass production, was invented in 1966 by Robert Bower at Hughes Aircraft and independently by Robert Kerwin, Donald Klein, and John Sarace at Bell Labs.
In 1962, AT&T deployed the T-carrier for long-haul pulse-code modulation digital voice transmission. The T1 format carried 24 pulse-code modulated, time-division multiplexed speech signals, each encoded as a 64 kbit/s stream, plus 8 kbit/s of framing information that facilitated synchronization and demultiplexing at the receiver. Over the subsequent decades, the digitisation of voice became the norm for all but the last mile.
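Summing these figures gives the T1 line rate:

    24 \times 64\ \text{kbit/s} + 8\ \text{kbit/s} = 1544\ \text{kbit/s} = 1.544\ \text{Mbit/s}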
Following the development of MOS integrated circuit chips in the early 1960s, MOS chips reached higher transistor density and lower manufacturing costs than bipolar integrated circuits by 1964. MOS chips further increased in complexity at a rate predicted by Moore's law, leading to large-scale integration with hundreds of transistors on a single MOS chip by the late 1960s. The application of MOS LSI chips to computing was the basis for the first microprocessors, as engineers began recognizing that a complete computer processor could be contained on a single MOS LSI chip. In 1968, Fairchild engineer Federico Faggin improved MOS technology with his development of the silicon-gate MOS chip, which he later used to develop the Intel 4004, the first single-chip microprocessor. It was released by Intel in 1971 and laid the foundations for the microcomputer revolution that began in the 1970s.
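Moore's law, in its commonly cited 1975 form, predicts that transistor counts double roughly every two years; as an idealized approximation,

    N(t) \approx N_0 \cdot 2^{t/2}

where N_0 is the transistor count in a reference year and t is the number of years elapsed.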
MOS technology also led to the development of semiconductor image sensors suitable for digital cameras. The first such image sensor was the charge-coupled device, developed by Willard S. Boyle and George E. Smith at Bell Labs in 1969, based on MOS capacitor technology.

1969–1989: Invention of the internet, rise of home computers

The public was first introduced to the concepts that led to the Internet when a message was sent over the ARPANET in 1969. Packet-switched networks such as ARPANET, Mark I, CYCLADES, Merit Network, Tymnet, and Telenet were developed in the late 1960s and early 1970s using a variety of protocols. The ARPANET in particular led to the development of protocols for internetworking, in which multiple separate networks could be joined into a network of networks.
The Whole Earth movement of the 1960s advocated the use of new technology.
The 1970s saw the introduction of the home computer, time-sharing computers, the video game console, and the first coin-operated video games; the golden age of arcade video games began with Space Invaders in 1978. As digital technology proliferated and the switch from analog to digital record keeping became the new standard in business, a relatively new job description was popularized: the data entry clerk. Culled from the ranks of secretaries and typists of earlier decades, the data entry clerk's job was to convert analog data into digital data.
In developed nations, computers achieved semi-ubiquity during the 1980s as they made their way into schools, homes, businesses, and industry. Automated teller machines, industrial robots, CGI in film and television, electronic music, bulletin board systems, and video games all fueled what became the zeitgeist of the 1980s. Millions of people purchased home computers, making household names of early personal computer manufacturers such as Apple, Commodore, and Tandy. To this day, the Commodore 64 is often cited as the best-selling computer of all time, having sold 17 million units between 1982 and 1994.
In 1984, the U.S. Census Bureau began collecting data on computer and Internet use in the United States; their first survey showed that 8.2% of all U.S. households owned a personal computer in 1984, and that households with children under the age of 18 were nearly twice as likely to own one at 15.3%. By 1989, 15% of all U.S. households owned a computer, and nearly 30% of households with children under the age of 18 owned one. By the late 1980s, many businesses were dependent on computers and digital technology.
Motorola created the first mobile phone, the Motorola DynaTAC, in 1983. However, this device used analog communication; digital cell phones were not sold commercially until 1991, when the first 2G network was launched in Finland to accommodate the unexpected demand for cell phones that had become apparent in the late 1980s.
Compute! magazine predicted that CD-ROM would be the centerpiece of the revolution, with multiple household devices reading the discs.
The first true digital camera was created in 1988, and digital cameras were first marketed in December 1989 in Japan and in 1990 in the United States. By the early 2000s, digital cameras had eclipsed traditional film in popularity.
Digital ink and paint were also developed in the late 1980s. Disney's CAPS system was used for a scene in 1989's The Little Mermaid and for all of its animated feature films between 1990's The Rescuers Down Under and 2004's Home on the Range.