Integrated circuit
An integrated circuit, also known as a microchip or simply chip, is a compact assembly of electronic circuits formed from various electronic components — such as transistors, resistors, and capacitors — and their interconnections.
These components are fabricated onto a thin, flat piece of semiconductor material, most commonly silicon. Integrated circuits are integral to a wide variety of electronic devices — including computers, smartphones, and televisions — performing functions such as data processing, control, and storage. They have transformed the field of electronics by enabling device miniaturization, improving performance, and reducing cost.
Chips can also be connected to other chips and to other materials, for example within a package.
Compared to assemblies built from discrete components, integrated circuits are orders of magnitude smaller, faster, more energy-efficient, and less expensive, allowing for a very high transistor count.
The IC's suitability for mass production, its high reliability, and the standardized, modular approach of integrated circuit design facilitated the rapid replacement of designs using discrete transistors. Today, ICs are present in virtually all electronic devices and have revolutionized modern technology. Products such as computer processors, microcontrollers, digital signal processors, and embedded processing chips in home appliances are foundational to contemporary society due to their small size, low cost, and versatility.
Very-large-scale integration was made practical by technological advancements in semiconductor device fabrication. Since their origins in the 1960s, the size, speed, and capacity of chips have progressed enormously, driven by technical advances that fit more and more transistors on chips of the same size – a modern chip may have many billions of transistors in an area the size of a human fingernail. These advances, roughly following Moore's law, mean that the computer chips of today possess millions of times the capacity and thousands of times the speed of the computer chips of the early 1970s.
ICs have three main advantages over circuits constructed out of discrete components: size, cost and performance. Size and cost are low because the chips, with all their components, are printed as a unit by photolithography rather than being constructed one transistor at a time. Furthermore, packaged ICs use much less material than discrete circuits. Performance is high because the IC's components switch quickly and consume comparatively little power because of their small size and proximity. The main disadvantage of ICs is the high initial cost of designing them and the enormous capital cost of factory construction. This high initial cost means ICs are only commercially viable when high production volumes are anticipated.
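The economics above can be sketched with a simple amortization model: the fixed design and fabrication-setup costs are spread over the production volume, on top of a small marginal cost per chip. All figures below are hypothetical, chosen only to illustrate the shape of the curve, not actual industry numbers.

```python
# Hypothetical cost model: per-unit IC cost = fixed (design + factory setup)
# costs amortized over volume, plus a marginal per-chip processing cost.
def unit_cost(fixed_cost, marginal_cost, volume):
    return fixed_cost / volume + marginal_cost

# Illustrative figures only (not actual industry data):
FIXED = 50_000_000   # design + mask set + setup, in dollars
MARGINAL = 2.0       # per-chip wafer processing and packaging, in dollars

for volume in (10_000, 1_000_000, 100_000_000):
    print(f"{volume:>11,} units -> ${unit_cost(FIXED, MARGINAL, volume):,.2f} per chip")
```

At low volumes the fixed cost dominates and each chip is prohibitively expensive; at high volumes the per-unit cost approaches the small marginal cost, which is why ICs only make commercial sense when large production runs are anticipated.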
Terminology
An integrated circuit is formally defined as: "A circuit in which all or some of the circuit elements are inseparably associated and electrically interconnected so that it is considered to be indivisible for the purposes of construction and commerce." In its strict sense, the term refers to a single-piece circuit construction — originally called a monolithic integrated circuit — consisting of an entire circuit built on a single piece of silicon. In general usage, the designation "integrated circuit" can also apply to circuits that do not meet this strict definition, and which may be constructed using various technologies such as 3D IC, 2.5D IC, MCM, thin-film transistors, thick-film technology, or hybrid integrated circuits. This distinction in terminology is often relevant in debates on whether Moore's law remains applicable.

[Image: Jack Kilby's original integrated circuit — the first in the world — made from germanium with gold-wire interconnects.]
History
The first integrated circuits
In 1952, British electronics engineer Geoffrey Dummer conceived the concept of the integrated circuit, also known as a microchip. A precursor to the modern IC was the development of small ceramic substrates, known as micromodules, each containing a single miniaturized electronic component. These modules could then be assembled and interconnected into a two- or three-dimensional compact grid. The idea, considered highly promising in 1957, was proposed to the U.S. Army by Jack Kilby, leading to the short-lived Micromodule Program. However, as the project gained traction, Kilby devised a fundamentally new approach: the integrated circuit itself.

Newly employed by Texas Instruments, Kilby recorded his initial ideas concerning the integrated circuit in July 1958 and successfully demonstrated the first working example of an integrated circuit on 12 September 1958. In his patent application of 6 February 1959, Kilby described his new device as "a body of semiconductor material … wherein all the components of the electronic circuit are completely integrated". The first customer for the new invention was the US Air Force. Kilby won the 2000 Nobel Prize in Physics for his part in the invention of the integrated circuit.
However, Kilby's invention was not a true monolithic integrated circuit chip, as it relied on external gold-wire connections, making large-scale production impractical. About six months later, Robert Noyce at Fairchild Semiconductor developed the first practical monolithic IC chip. The monolithic integrated circuit chip was enabled by the inventions of the planar process by Jean Hoerni and of p–n junction isolation by Kurt Lehovec. Hoerni's invention built on Carl Frosch and Lincoln Derick's work on surface protection and passivation by silicon dioxide masking and predeposition, as well as the work of Fuller, Ditzenberger, and others on the diffusion of impurities into silicon.
Unlike Kilby's germanium-based design, Noyce's version was fabricated from silicon using the planar process developed by his colleague Jean Hoerni, which allowed reliable on-chip aluminum interconnections. Modern IC chips are based on Noyce's monolithic design, rather than Kilby's early prototype.
NASA's Apollo Program was the largest single consumer of integrated circuits between 1961 and 1965.
TTL integrated circuits
Transistor–transistor logic (TTL) was developed by James L. Buie in the early 1960s at TRW Inc. TTL became the dominant integrated circuit technology during the 1970s to early 1980s.

[Image: Dov Frohman, an Israeli electrical engineer who developed the EPROM in 1969–1971.]

Use of dozens of TTL integrated circuits was the standard method of construction for the processors of minicomputers and mainframe computers. Computers such as IBM System/360 mainframes, PDP-11 minicomputers and the desktop Datapoint 2200 were built from bipolar integrated circuits, either TTL or the faster emitter-coupled logic.
MOS integrated circuits
Modern integrated circuits are based on the metal–oxide–semiconductor field-effect transistor, forming MOS ICs. The MOSFET was developed at Bell Labs between 1955 and 1960, enabling the creation of high-density ICs. Unlike bipolar transistors, which required additional steps for p–n junction isolation, MOSFETs could be easily isolated from one another without such measures. This advantage for integrated circuits was first highlighted by Dawon Kahng in 1961. The list of IEEE Milestones includes Kilby's first IC in 1958, Hoerni's planar process and Noyce's planar IC in 1959.

The earliest experimental MOS IC to be fabricated was a 16-transistor chip built by Fred Heiman and Steven Hofstein at RCA in 1962. General Microelectronics later introduced the first commercial MOS integrated circuit in 1964, a 120-transistor shift register developed by Robert Norman. By 1964, MOS chips had reached higher transistor density and lower manufacturing costs than bipolar chips. MOS chips further increased in complexity at a rate predicted by Moore's law, leading to large-scale integration with hundreds of transistors on a single MOS chip by the late 1960s.
Following the development of the self-aligned gate MOSFET by Robert Kerwin, Donald Klein and John Sarace at Bell Labs in 1967, the first silicon-gate MOS IC technology with self-aligned gates, the basis of all modern CMOS integrated circuits, was developed at Fairchild Semiconductor by Federico Faggin in 1968. The application of MOS LSI chips to computing was the basis for the first microprocessors, as engineers began recognizing that a complete computer processor could be contained on a single MOS LSI chip. This led to the inventions of the microprocessor and the microcontroller by the early 1970s. During the early 1970s, MOS integrated circuit technology enabled the very large-scale integration of more than 10,000 transistors on a single chip.
At first, MOS-based computers only made sense when high density was required, such as in aerospace systems and pocket calculators. Computers built entirely from TTL, such as the 1970 Datapoint 2200, were much faster and more powerful than single-chip MOS microprocessors, such as the 1972 Intel 8008, until the early 1980s.
Advances in IC technology, primarily smaller features and larger chips, have allowed the number of MOS transistors in an integrated circuit to double every two years, a trend known as Moore's law. Moore originally stated that the count would double every year, but revised the claim to every two years in 1975. This increased capacity has been used to decrease cost and increase functionality. In general, as the feature size shrinks, almost every aspect of an IC's operation improves. The cost per transistor and the switching power consumption per transistor go down, while the memory capacity and speed go up, through the relationships defined by Dennard scaling. Because speed, capacity, and power consumption gains are apparent to the end user, there is fierce competition among the manufacturers to use finer geometries. Over the years, transistor sizes have decreased from tens of microns in the early 1970s to 10 nanometers in 2017, with a corresponding million-fold increase in transistors per unit area. As of 2016, typical chip areas range from a few square millimeters to around 600 mm2, with up to 25 million transistors per mm2.
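The doubling trend above is a simple exponential, which makes the million-fold growth figures easy to verify. A minimal sketch, using the 1971 Intel 4004's roughly 2,300 transistors as a starting point and treating the two-year doubling period as exact (real progress was, of course, less regular):

```python
# Moore's law as an exponential: count doubles every `doubling_period` years.
def transistors(initial_count, start_year, year, doubling_period=2):
    return initial_count * 2 ** ((year - start_year) / doubling_period)

# From 1971 to 2017 is 46 years, i.e. 23 doublings under the two-year rule,
# giving roughly an 8-million-fold increase in transistor count.
growth = transistors(2300, 1971, 2017) / 2300
print(f"growth factor: {growth:,.0f}x")  # prints: growth factor: 8,388,608x
```

The result (about 8.4 million) is consistent with the article's "millions of times the capacity" and "million-fold increase" figures; the exact factor depends on which chips and years are compared.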
The expected shrinking of feature sizes and the needed progress in related areas was forecast for many years by the International Technology Roadmap for Semiconductors. The final ITRS was issued in 2016, and it is being replaced by the International Roadmap for Devices and Systems.
Initially, ICs were strictly electronic devices. The success of ICs has led to the integration of other technologies, in an attempt to obtain the same advantages of small size and low cost. These technologies include mechanical devices, optics, and sensors.
- Charge-coupled devices, and the closely related active-pixel sensors, are chips that are sensitive to light. They have largely replaced photographic film in scientific, medical, and consumer applications. Billions of these devices are now produced each year for applications such as cellphones, tablets, and digital cameras. Work in this sub-field of ICs was recognized with the Nobel Prize in Physics in 2009.
- Very small mechanical devices driven by electricity can be integrated onto chips, a technology known as microelectromechanical systems. These devices were developed in the late 1980s and are used in a variety of commercial and military applications. Examples include DLP projectors, inkjet printers, and accelerometers and MEMS gyroscopes used to deploy automobile airbags.
- Since the early 2000s, the integration of optical functionality into silicon chips has been actively pursued in both academic research and industry, resulting in the successful commercialization of silicon-based integrated optical transceivers combining optical devices with CMOS-based electronics. Photonic integrated circuits, which process signals using light, such as Lightelligence's PACE, are also being developed, drawing on the emerging field of photonics.
- Integrated circuits are also being developed for sensor applications in medical implants or other bioelectronic devices. Special sealing techniques have to be applied in such biogenic environments to avoid corrosion or biodegradation of the exposed semiconductor materials.
- Various approaches to stacking several layers of transistors to make a three-dimensional integrated circuit, such as through-silicon vias, "monolithic 3D", stacked wire bonding, and other methodologies.
- Transistors built from other materials: graphene transistors, molybdenite transistors, carbon nanotube field-effect transistors, gallium nitride transistors, transistor-like nanowire electronic devices, organic field-effect transistors, etc.
- Fabricating transistors over the entire surface of a small sphere of silicon.
- Modifications to the substrate, typically to make "flexible transistors" for a flexible display or other flexible electronics, possibly leading to a roll-away computer.