History of computing


The history of computing is longer than the history of computing hardware and modern computing technology; it also includes earlier methods that relied on pen and paper or chalk and slate, with or without the aid of tables.

Concrete devices

Digital computing is intimately tied to the representation of numbers. But long before abstractions like numbers arose, there were mathematical concepts to serve the purposes of civilization. These concepts are implicit in concrete practices such as:
  • One-to-one correspondence, a rule to count how many items, e.g. on a tally stick, eventually abstracted into numbers.
  • Comparison to a standard, a method for assuring reproducibility in a measurement, for example, the number of coins.
  • The 3-4-5 right triangle was a device for assuring a right angle, using ropes with 12 evenly spaced knots, for example; the arithmetic behind this trick is shown below.
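
The rope device works because of the converse of the Pythagorean theorem: a triangle whose sides span 3, 4, and 5 knot-intervals satisfies

    3² + 4² = 9 + 16 = 25 = 5²

so the angle opposite the longest side must be a right angle.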

Numbers

Eventually, the concept of numbers became concrete and familiar enough for counting to arise, at times with sing-song mnemonics to teach sequences to others. All known human languages, except the Pirahã language, have words for at least the numerals "one" and "two", and even some animals like the blackbird can distinguish a surprising number of items.
Advances in the numeral system and mathematical notation eventually led to the discovery of mathematical operations such as addition, subtraction, multiplication, division, squaring, square root, and so forth. Eventually, the operations were formalized, and concepts about the operations became understood well enough to be stated formally, and even proven. See, for example, Euclid's algorithm for finding the greatest common divisor of two numbers.
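A minimal modern sketch of Euclid's algorithm in Python (the function name is ours; the method itself is Euclid's): the greatest common divisor is found by repeatedly replacing the larger number with the remainder of dividing it by the smaller, until the remainder is zero.

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)
    until the remainder b is zero; the surviving value is the GCD."""
    while b:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # 1071 -> 462 -> 147 -> 21, so prints 21
```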
By the High Middle Ages, the positional Hindu–Arabic numeral system had reached Europe, allowing the systematic computation of numbers. During this period, the representation of a calculation on paper allowed the calculation of mathematical expressions, and the tabulation of mathematical functions such as the square root, the common logarithm, and the trigonometric functions. By the time of Isaac Newton's research, paper or vellum was an important computing resource, and even in our present time, researchers like Enrico Fermi would cover random scraps of paper with calculation, to satisfy their curiosity about an equation. Even into the period of programmable calculators, Richard Feynman would unhesitatingly compute by hand any steps that overflowed the memory of the calculators, just to learn the answer; by 1976 Feynman had purchased an HP-25 calculator with a 49-program-step capacity, and if a differential equation required more than 49 steps to solve, he would simply continue the computation by hand.

Early computation

Mathematical statements need not be abstract only; when a statement can be illustrated with actual numbers, the numbers can be communicated and a community can arise. This allows the repeatable, verifiable statements which are the hallmark of mathematics and science. These kinds of statements have existed for thousands of years, and across multiple civilizations, as shown below:
The earliest known tool used for computation is the Sumerian abacus, believed to have been invented in Babylon circa 2700–2300 BC. Its original style of usage was by lines drawn in sand with pebbles.
In c. 1050–771 BC, the south-pointing chariot was invented in ancient China. It was the first known geared mechanism to use a differential gear, which was later used in analog computers. The Chinese also invented a more sophisticated abacus from around the 2nd century BC, known as the Chinese abacus.
In the 3rd century BC, Archimedes used the mechanical principle of balance to calculate mathematical problems, such as the number of grains of sand in the universe, which also required a recursive notation for numbers.
The Antikythera mechanism is believed to be the earliest known geared computing device. It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to circa 100 BC.
According to Simon Singh, Muslim mathematicians also made important advances in cryptography, such as the development of cryptanalysis and frequency analysis by Alkindus. Programmable machines were also invented by Muslim engineers, such as the automatic flute player by the Banū Mūsā brothers.
During the Middle Ages, several European philosophers made attempts to produce analog computer devices. Influenced by the Arabs and Scholasticism, Majorcan philosopher Ramon Llull devoted a great part of his life to defining and designing several logical machines that, by combining simple and undeniable philosophical truths, could produce all possible knowledge. These machines were never actually built, as they were more of a thought experiment to produce new knowledge in systematic ways; although they could perform simple logical operations, they still needed a human being for the interpretation of results. Moreover, they lacked a versatile architecture, with each machine serving only a very concrete purpose. Despite this, Llull's work had a strong influence on Gottfried Leibniz, who developed his ideas further and built several calculating tools using them.
The apex of this early era of mechanical computing can be seen in the Difference Engine and its successor the Analytical Engine, both by Charles Babbage. Babbage never completed constructing either engine, but in 2002 Doron Swade and a group of other engineers at the Science Museum in London completed Babbage's Difference Engine using only materials that would have been available in the 1840s. By following Babbage's detailed design they were able to build a functioning engine, allowing historians to say, with some confidence, that if Babbage had been able to complete his Difference Engine it would have worked.

The more advanced Analytical Engine combined concepts from his previous work and that of others to create a device that, if constructed as designed, would have possessed many properties of a modern electronic computer: an internal "scratch memory" equivalent to RAM; multiple forms of output, including a bell, a graph-plotter, and a simple printer; and a programmable input-output "hard" memory of punch cards which it could modify as well as read. The key advancement of Babbage's devices beyond those created before them was that each component of the device was independent of the rest of the machine, much like the components of a modern electronic computer. This was a fundamental shift in thought: previous computational devices served only a single purpose and had to be, at best, disassembled and reconfigured to solve a new problem. Babbage's devices could be reprogrammed to solve new problems by the entry of new data, and could act upon previous calculations within the same series of instructions.

Ada Lovelace took this concept one step further, by creating a program for the Analytical Engine to calculate Bernoulli numbers, a complex calculation requiring a recursive algorithm. This is considered to be the first example of a true computer program, a series of instructions that act upon data not known in full until the program is run.
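Lovelace's actual program was expressed in the Analytical Engine's punched-card operations; purely as a modern illustration of the kind of recursive computation involved, the standard Bernoulli-number recurrence can be sketched in a few lines of Python (function name and convention B_1 = -1/2 are ours):

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Bernoulli numbers B_0..B_n from the recurrence
    sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1, with B_0 = 1."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        B.append(-sum(comb(m + 1, j) * B[j] for j in range(m)) / (m + 1))
    return B

print(bernoulli(8))  # 1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0, -1/30
```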
Following Babbage, although unaware of his earlier work, Percy Ludgate in 1909 published the second of the only two designs for mechanical analytical engines in history. Two other inventors, Leonardo Torres Quevedo and Vannevar Bush, also did follow-on research based on Babbage's work. In his 1914 Essays on Automatics, Torres presented the design of an electromechanical calculating machine and introduced the idea of floating-point arithmetic. In 1920, to celebrate the 100th anniversary of the invention of the arithmometer, Torres presented in Paris the Electromechanical Arithmometer, an arithmetic unit connected to a remote typewriter, on which commands could be typed and the results printed automatically. Bush's 1936 paper Instrumental Analysis discussed using existing IBM punch card machines to implement Babbage's design. In the same year, he started the Rapid Arithmetical Machine project to investigate the problems of constructing an electronic digital computer.
Several examples of analog computation survived into recent times. A planimeter is a device that does integrals, using distance as the analog quantity. Until the 1980s, HVAC systems used air both as the analog quantity and the controlling element. Unlike modern digital computers, analog computers are not very flexible and need to be reconfigured manually to switch them from working on one problem to another. Analog computers had an advantage over early digital computers in that they could be used to solve complex problems using behavioral analogues while the earliest attempts at digital computers were quite limited.
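A planimeter recovers an area by tracing the region's boundary; by Green's theorem, the area integral equals a line integral around the boundary. The discrete analogue of this idea is the shoelace formula for polygons, sketched below in Python purely as a modern illustration (the function name is ours):

```python
def shoelace_area(vertices):
    """Area of a simple polygon from its boundary alone: the shoelace
    formula, a discrete analogue of the boundary integral a planimeter
    accumulates while tracing a region's outline."""
    total = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]  # wrap around to close the loop
        total += x1 * y2 - x2 * y1
    return abs(total) / 2.0

print(shoelace_area([(0, 0), (4, 0), (4, 3), (0, 3)]))  # 4x3 rectangle -> 12.0
```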
[Image: A Smith chart is a well-known nomogram.]
Since computers were rare in this era, the solutions were often hard-coded into paper forms such as nomograms, which could then produce analog solutions to these problems, such as the distribution of pressures and temperatures in a heating system.

Digital electronic computers

In an 1886 letter, Charles Sanders Peirce described how logical operations could be carried out by electrical switching circuits. During 1880–81 he showed that NOR gates alone can be used to reproduce the functions of all the other logic gates, but this work remained unpublished until 1933. The first published proof was by Henry M. Sheffer in 1913, so the NAND logical operation is sometimes called the Sheffer stroke; the logical NOR is sometimes called Peirce's arrow. Consequently, these gates are sometimes called universal logic gates.
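Peirce's observation is easy to verify in modern terms: negation, disjunction, and conjunction can all be composed from NOR alone. A minimal sketch in Python:

```python
def NOR(a, b):
    return not (a or b)

# Every other gate, built from NOR alone:
def NOT(a):      return NOR(a, a)
def OR(a, b):    return NOT(NOR(a, b))
def AND(a, b):   return NOR(NOT(a), NOT(b))
def NAND(a, b):  return NOT(AND(a, b))

# Exhaustively check all four input combinations.
for a in (False, True):
    for b in (False, True):
        assert OR(a, b) == (a or b)
        assert AND(a, b) == (a and b)
        assert NAND(a, b) == (not (a and b))
```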
Eventually, vacuum tubes replaced relays for logic operations. Lee De Forest's 1907 modification of the Fleming valve could be used as a logic gate. Ludwig Wittgenstein introduced a version of the 16-row truth table as proposition 5.101 of the Tractatus Logico-Philosophicus. Walther Bothe, inventor of the coincidence circuit, received part of the 1954 Nobel Prize in Physics for creating the first modern electronic AND gate in 1924. Konrad Zuse designed and built electromechanical logic gates for his computer, the Z1.
The first recorded idea of using digital electronics for computing was the 1931 paper "The Use of Thyratrons for High Speed Automatic Counting of Physical Phenomena" by C. E. Wynn-Williams. From 1934 to 1936, NEC engineer Akira Nakashima, Claude Shannon, and Victor Shestakov published papers introducing switching circuit theory, using digital electronics for Boolean algebraic operations.
In 1936 Alan Turing published his seminal paper On Computable Numbers, with an Application to the Entscheidungsproblem in which he modeled computation in terms of a one-dimensional storage tape, leading to the idea of the Universal Turing machine and Turing-complete systems.
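Turing's tape model is easy to sketch in code. The following minimal simulator is an illustrative sketch only, not Turing's original formalism; the example machine simply flips every bit on the tape and halts at the first blank:

```python
def run(tape, transitions, state="start", blank="_"):
    """Run a transition table over a one-dimensional tape until 'halt'.
    Each rule maps (state, symbol) -> (new state, symbol to write, move)."""
    cells = dict(enumerate(tape))  # sparse tape, indexed by position
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

flip = {  # complement every bit, then halt on blank
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run("1011", flip))  # -> 0100
```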
The first digital electronic computer was developed between April 1936 and June 1939 at the IBM Patent Department in Endicott, New York, by Arthur Halsey Dickinson. With this computer, IBM introduced a calculating device with a keyboard, processor, and electronic output. IBM's competitor was the digital electronic computer NCR3566, developed at NCR in Dayton, Ohio, by Joseph Desch and Robert Mumma between April and August 1939. The IBM and NCR machines were decimal, executing addition and subtraction in binary position code.
In December 1939 John Atanasoff and Clifford Berry completed their experimental model to prove the concept of the Atanasoff–Berry computer, which had begun development in 1937. The experimental model was binary, executed addition and subtraction in octal binary code, and was the first binary digital electronic computing device. The Atanasoff–Berry computer was intended to solve systems of linear equations, though it was not programmable. The computer was never truly completed, due to Atanasoff's departure from Iowa State University in 1942 to work for the United States Navy. Many credit the ABC with many of the ideas used in later developments during the age of early electronic computing.
The Z3 computer, built by German inventor Konrad Zuse in 1941, was the first programmable, fully automatic computing machine, but it was not electronic.
During World War II, ballistics computing was done by women, who were hired as "computers." The term "computer" remained one that referred mostly to women until 1945, after which it took on the modern definition of machinery that it presently holds.
The ENIAC was the first electronic general-purpose computer, announced to the public in 1946. It was Turing-complete, digital, and capable of being reprogrammed to solve a full range of computing problems. Women implemented the programming for machines like the ENIAC, and men created the hardware.
The Manchester Baby was the first electronic stored-program computer. It was built at the Victoria University of Manchester by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948.
William Shockley, John Bardeen and Walter Brattain at Bell Labs invented the first working transistor, the point-contact transistor, in 1947, followed by the bipolar junction transistor in 1948. At the University of Manchester in 1953, a team under the leadership of Tom Kilburn designed and built the first transistorized computer, called the Transistor Computer, a machine using the newly developed transistors instead of valves. The first stored-program transistor computer was the ETL Mark III, developed by Japan's Electrotechnical Laboratory from 1954 to 1956. However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialized applications.
In 1954, 95% of computers in service were being used for engineering and scientific purposes.