Integrated circuit
An integrated circuit or monolithic integrated circuit (also referred to as an IC, a chip, or a microchip) is a set of electronic circuits on one small flat piece (or "chip") of semiconductor material, normally silicon. The integration of large numbers of tiny MOS transistors into a small chip results in circuits that are orders of magnitude smaller, faster, and less expensive than those constructed of discrete electronic components. The IC's mass production capability, reliability, and building-block approach to circuit design have ensured the rapid adoption of standardized ICs in place of designs using discrete transistors. ICs are now used in virtually all electronic equipment and have revolutionized the world of electronics. Computers, mobile phones, and other digital home appliances are now inextricable parts of the structure of modern societies, made possible by the small size and low cost of ICs.
Integrated circuits were made practical by technological advancements in metal–oxide–semiconductor (MOS) device fabrication. Since their origins in the 1960s, the size, speed, and capacity of chips have progressed enormously, driven by technical advances that fit more and more MOS transistors on chips of the same size – a modern chip may have many billions of MOS transistors in an area the size of a human fingernail. These advances, roughly following Moore's law, mean that computer chips today possess millions of times the capacity and thousands of times the speed of the computer chips of the early 1970s.
ICs have two main advantages over discrete circuits: cost and performance. Cost is low because the chips, with all their components, are printed as a unit by photolithography rather than being constructed one transistor at a time. Furthermore, packaged ICs use much less material than discrete circuits. Performance is high because the IC's components switch quickly and consume comparatively little power because of their small size and proximity. The main disadvantage of ICs is the high cost to design them and fabricate the required photomasks. This high initial cost means ICs are only practical when high production volumes are anticipated.
Terminology
An integrated circuit is defined as:[1]
A circuit in which all or some of the circuit elements are inseparably associated and electrically interconnected so that it is considered to be indivisible for the purposes of construction and commerce.
Circuits meeting this definition can be constructed using many different technologies, including thin-film transistors, thick-film technologies, or hybrid integrated circuits. However, in general usage integrated circuit has come to refer to the single-piece circuit construction originally known as a monolithic integrated circuit.[2][3]
History
Early concepts of an integrated circuit go back to 1949, when German engineer Werner Jacobi[4] (Siemens AG)[5] filed a patent for an integrated-circuit-like semiconductor amplifying device[6] showing five transistors on a common substrate in a 3-stage amplifier arrangement. Jacobi disclosed small and cheap hearing aids as typical industrial applications of his patent. An immediate commercial use of his patent has not been reported.
The idea of an integrated circuit was conceived by Geoffrey Dummer (1909–2002), a radar scientist working for the Royal Radar Establishment of the British Ministry of Defence. Dummer presented the idea to the public at the Symposium on Progress in Quality Electronic Components in Washington, D.C. on 7 May 1952.[7] He presented his ideas at many public symposia and unsuccessfully attempted to build such a circuit in 1956. Between 1953 and 1957, Sidney Darlington and Yasuro Tarui (Electrotechnical Laboratory) proposed similar chip designs where several transistors could share a common active area, but there was no electrical isolation to separate them from each other.[4]
The monolithic integrated circuit chip was enabled by the surface passivation process, which electrically stabilized silicon surfaces via thermal oxidation, making it possible to fabricate monolithic integrated circuit chips using silicon. The surface passivation process was developed by Mohamed Atalla at Bell Labs in 1957. This was the basis for the planar process, developed by Jean Hoerni at Fairchild Semiconductor in early 1959, which was critical to the invention of the monolithic integrated circuit chip.[8][9][10] A key concept behind the monolithic IC is the principle of p–n junction isolation, which allows each transistor to operate independently despite being part of the same piece of silicon. Atalla's surface passivation process isolated individual diodes and transistors,[11] which was extended to independent transistors on a single piece of silicon by Kurt Lehovec at Sprague Electric in 1959,[12] and then independently by Robert Noyce at Fairchild later the same year.[13][14]
First integrated circuits
A precursor idea to the IC was to create small ceramic substrates (so-called micromodules),[15] each containing a single miniaturized component. Components could then be integrated and wired into a two-dimensional or three-dimensional compact grid. This idea, which seemed very promising in 1957, was proposed to the US Army by Jack Kilby and led to the short-lived Micromodule Program (similar to 1951's Project Tinkertoy).[15][16][17] However, as the project was gaining momentum, Kilby came up with a new, revolutionary design: the IC.
Newly employed by Texas Instruments, Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working example of an integrated circuit on 12 September 1958.[18] In his patent application of 6 February 1959,[19] Kilby described his new device as "a body of semiconductor material … wherein all the components of the electronic circuit are completely integrated."[20] The first customer for the new invention was the US Air Force.[21] Kilby won the 2000 Nobel Prize in Physics for his part in the invention of the integrated circuit.[22] However, Kilby's invention was a hybrid integrated circuit (hybrid IC), rather than a monolithic integrated circuit (monolithic IC) chip.[23] Kilby's IC had external wire connections, which made it difficult to mass-produce.[24]
Half a year after Kilby, Robert Noyce at Fairchild Semiconductor invented the first true monolithic IC chip.[25][24] It was a new variety of integrated circuit, more practical than Kilby's implementation. Noyce's design was made of silicon, whereas Kilby's chip was made of germanium. Noyce's monolithic IC put all components on a chip of silicon and connected them with aluminium lines.[24] Noyce's monolithic IC was fabricated using the planar process, developed in early 1959 by his colleague Jean Hoerni. In turn, Hoerni's planar process was based on Mohamed Atalla's surface passivation process.[26][27][28] Modern IC chips are based on Noyce's monolithic IC,[25][24] rather than Kilby's hybrid IC.[23]
MOS integrated circuits
Nearly all modern IC chips are metal–oxide–semiconductor (MOS) integrated circuits, built from MOSFETs (metal–oxide–semiconductor field-effect transistors).[29] The MOSFET (also known as the MOS transistor), which was invented by Mohamed Atalla and Dawon Kahng at Bell Labs in 1959,[30] made it possible to build high-density integrated circuits.[31] Atalla first proposed the concept of the MOS integrated circuit chip in 1960, followed by Kahng in 1961, both noting that the MOS transistor's ease of fabrication made it useful for integrated circuits.[32][33] The list of IEEE milestones includes the first integrated circuit by Kilby in 1958,[34] Hoerni's planar process and Noyce's planar IC in 1959, and the MOSFET by Atalla and Kahng in 1959.[35]
The earliest experimental MOS IC to be fabricated was a 16-transistor chip built by Fred Heiman and Steven Hofstein at RCA in 1962.[36] General Microelectronics later introduced the first commercial MOS integrated circuit in 1964,[37] a 120-transistor shift register developed by Robert Norman.[36] Following the development of the self-aligned gate (silicon-gate) MOS transistor by Robert Kerwin, Donald Klein and John Sarace at Bell Labs in 1967,[38] the first silicon-gate MOS IC technology with self-aligned gates, the basis of all modern CMOS integrated circuits, was developed at Fairchild Semiconductor by Federico Faggin in 1968. In the early 1970s, MOS integrated circuit technology allowed the integration of more than 10,000 transistors in a single chip.[39]
Advances in IC technology, primarily smaller features and larger chips, have allowed the number of MOS transistors in an integrated circuit to double every two years, a trend known as Moore's law. Moore originally stated it would double every year, but he went on to change the claim to every two years in 1975.[40] This increased capacity has been used to decrease cost and increase functionality. In general, as the feature size shrinks, almost every aspect of an IC's operation improves. The cost per transistor and the switching power consumption per transistor go down, while the memory capacity and speed go up, through the relationships defined by Dennard scaling (MOSFET scaling).[41] Because speed, capacity, and power consumption gains are apparent to the end user, there is fierce competition among the manufacturers to use finer geometries. Over the years, transistor sizes have decreased from tens of micrometers in the early 1970s to 10 nanometers in 2017,[42] with a corresponding million-fold increase in transistors per unit area. As of 2016, typical chip areas range from a few square millimeters to around 600 mm2, with up to 25 million transistors per mm2.[43]
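As a rough illustration of the doubling trend described above, the short Python sketch below projects a transistor count forward under an idealized two-year doubling period; the starting figures are hypothetical round numbers rather than data from any particular product.

```python
# Illustrative sketch of Moore's law as an exponential with a two-year
# doubling period. Starting count and years are hypothetical examples.

def transistors_after(start_count, start_year, end_year, doubling_period_years=2.0):
    """Projected transistor count assuming ideal exponential growth."""
    elapsed = end_year - start_year
    return start_count * 2 ** (elapsed / doubling_period_years)

if __name__ == "__main__":
    # e.g. a chip with roughly 2,300 transistors in 1971, projected to 2016:
    print(f"{transistors_after(2_300, 1971, 2016):,.0f} transistors")
```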
The expected shrinking of feature sizes and the needed progress in related areas was forecast for many years by the International Technology Roadmap for Semiconductors (ITRS). The final ITRS was issued in 2016, and it is being replaced by the International Roadmap for Devices and Systems.[44]
Initially, ICs were strictly electronic devices. The success of ICs has led to the integration of other technologies, in an attempt to obtain the same advantages of small size and low cost. These technologies include mechanical devices, optics, and sensors.
Charge-coupled devices, and the closely related active pixel sensors, are chips that are sensitive to light. They have largely replaced photographic film in scientific, medical, and consumer applications. Billions of these devices are now produced each year for applications such as cellphones, tablets, and digital cameras. Work in this sub-field of ICs was recognized with the Nobel Prize in Physics in 2009.[45]
Very small mechanical devices driven by electricity can be integrated onto chips, a technology known as microelectromechanical systems. These devices were developed in the late 1980s[46] and are used in a variety of commercial and military applications. Examples include DLP projectors, inkjet printers, and accelerometers and MEMS gyroscopes used to deploy automobile airbags.
Since the early 2000s, the integration of optical functionality (optical computing) into silicon chips has been actively pursued in both academic research and in industry resulting in the successful commercialization of silicon based integrated optical transceivers combining optical devices (modulators, detectors, routing) with CMOS based electronics.[47] Integrated optical circuits are also being developed, using the emerging field of physics known as photonics.
Integrated circuits are also being developed for sensor applications in medical implants or other bioelectronic devices.[48] Special sealing techniques have to be applied in such biogenic environments to avoid corrosion or biodegradation of the exposed semiconductor materials.[49]
As of 2018, the vast majority of all transistors are MOSFETs fabricated in a single layer on one side of a chip of silicon in a flat two-dimensional planar process. Researchers have produced prototypes of several promising alternatives, such as:
various approaches to stacking several layers of transistors to make a three-dimensional integrated circuit (3DIC), such as through-silicon via, "monolithic 3D",[50] stacked wire bonding,[51] and other methodologies.
transistors built from other materials: graphene transistors, molybdenite transistors, carbon nanotube field-effect transistor, gallium nitride transistor, transistor-like nanowire electronic devices, organic field-effect transistor, etc.
fabricating transistors over the entire surface of a small sphere of silicon.[52][53]
modifications to the substrate, typically to make "flexible transistors" for a flexible display or other flexible electronics, possibly leading to a roll-away computer.
As it becomes more difficult to manufacture ever smaller transistors, companies are using multi-chip modules, three-dimensional integrated circuits, 3D NAND, package on package, and through-silicon vias to increase performance and reduce size, without having to reduce the size of the transistors.[54][55][56]
Design
The cost of designing and developing a complex integrated circuit is quite high, normally in the multiple tens of millions of dollars.[57] Therefore, it only makes economic sense to produce integrated circuit products with high production volume, so the non-recurring engineering (NRE) costs are spread across typically millions of production units.
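The economics of non-recurring engineering can be illustrated with a minimal sketch: a fixed design cost spread over the production volume, plus a per-chip manufacturing cost. The dollar figures below are placeholders, not industry data.

```python
# Minimal sketch of NRE amortization: per-unit cost falls as volume rises.
# All figures are illustrative placeholders.

def unit_cost(nre_cost, marginal_cost, volume):
    """Per-unit cost = amortized NRE plus per-chip manufacturing cost."""
    return nre_cost / volume + marginal_cost

for volume in (10_000, 1_000_000, 100_000_000):
    print(f"{volume:>11,} units -> ${unit_cost(50e6, 2.0, volume):,.2f} each")
```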
Modern semiconductor chips have billions of components, and are too complex to be designed by hand. Software tools to help the designer are essential. Electronic Design Automation (EDA), also referred to as Electronic Computer-Aided Design (ECAD),[58] is a category of software tools for designing electronic systems, including integrated circuits. The tools work together in a design flow that engineers use to design and analyze entire semiconductor chips.
Types
Digital integrated circuits can contain anywhere from one[62] to billions[43] of logic gates, flip-flops, multiplexers, and other circuits in a few square millimeters. The small size of these circuits allows high speed, low power dissipation, and reduced manufacturing cost compared with board-level integration. These digital ICs, typically microprocessors, DSPs, and microcontrollers, work using Boolean algebra to process "one" and "zero" signals.
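As a toy illustration of the Boolean processing such ICs perform in hardware, the following Python snippet models a few logic gates as functions on 0/1 values, building other gates from NAND in the way gate-level logic can.

```python
# Toy model of Boolean logic gates operating on 0/1 values.

def NAND(a, b):
    return 1 - (a & b)

def NOT(a):            # NOT built from NAND
    return NAND(a, a)

def AND(a, b):         # AND built from NAND + NOT
    return NOT(NAND(a, b))

def OR(a, b):          # OR built from NANDs via De Morgan's law
    return NAND(NOT(a), NOT(b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "-> AND:", AND(a, b), "OR:", OR(a, b))
```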
Among the most advanced integrated circuits are the microprocessors or "cores", which control everything from personal computers and cellular phones to digital microwave ovens. Digital memory chips and application-specific integrated circuits (ASICs) are examples of other families of integrated circuits that are important to the modern information society.
In the 1980s, programmable logic devices were developed. These devices contain circuits whose logical function and connectivity can be programmed by the user, rather than being fixed by the integrated circuit manufacturer. This allows a single chip to be programmed to implement different LSI-type functions such as logic gates, adders and registers. Programmability comes in at least four forms: devices that can be programmed only once, devices that can be erased and then re-programmed using UV light, devices that can be (re)programmed using flash memory, and field-programmable gate arrays (FPGAs) which can be programmed at any time, including during operation. As of 2016, FPGAs can implement the equivalent of millions of gates and operate at frequencies up to 1 GHz.[63]
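A minimal conceptual sketch of FPGA programmability: a look-up table (LUT) stores a truth table, so loading a different table "reprograms" the logic function. Real devices combine many thousands of LUTs with configurable routing; the code below is only an illustrative model, not a description of any vendor's architecture.

```python
# Conceptual model of a 2-input FPGA look-up table (LUT): the stored
# truth table *is* the program.

def make_lut(truth_table):
    """Return a 2-input logic function defined by a 4-entry truth table."""
    assert len(truth_table) == 4
    def lut(a, b):
        return truth_table[(a << 1) | b]
    return lut

xor_gate = make_lut([0, 1, 1, 0])   # "reprogram" by loading a different table
and_gate = make_lut([0, 0, 0, 1])

print(xor_gate(1, 0), and_gate(1, 1))   # -> 1 1
```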
Analog ICs, such as sensors, power management circuits, and operational amplifiers (op-amps), work by processing continuous signals. They perform analog functions such as amplification, active filtering, demodulation, and mixing. Analog ICs ease the burden on circuit designers by having expertly designed analog circuits available instead of designing and/or constructing a difficult analog circuit from scratch.
ICs can also combine analog and digital circuits on a single chip to create functions such as analog-to-digital converters and digital-to-analog converters. Such mixed-signal circuits offer smaller size and lower cost, but must carefully account for signal interference. Prior to the late 1990s, radios could not be fabricated in the same low-cost CMOS processes as microprocessors. But since 1998, a large number of radio chips have been developed using RF CMOS processes. Examples include Intel's DECT cordless phone, or 802.11 (Wi-Fi) chips created by Atheros and other companies.[64]
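To make the analog-to-digital conversion mentioned above concrete, here is a simplified software model of an ADC that samples a continuous waveform and quantizes it to n-bit codes; the reference voltage, resolution, and test signal are arbitrary example values, not parameters of any real converter.

```python
# Simplified ADC model: quantize a voltage in [0, v_ref) to an n-bit code.

import math

def adc_sample(value, v_ref=1.0, bits=8):
    levels = 2 ** bits
    code = int(value / v_ref * levels)
    return max(0, min(levels - 1, code))   # clamp to the valid code range

# Sample one period of an example sine wave centred at 0.5 V:
samples = [adc_sample(0.5 + 0.4 * math.sin(2 * math.pi * t / 16))
           for t in range(16)]
print(samples)
```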
Modern electronic component distributors often further sub-categorize the huge variety of integrated circuits now available:
Digital ICs are further sub-categorized as logic ICs, memory chips, interface ICs (level shifters, serializer/deserializer, etc.), power management ICs, and programmable devices.
Analog ICs are further sub-categorized as linear ICs and radio frequency (RF) ICs.
Mixed-signal integrated circuits are further sub-categorized as data acquisition ICs (including A/D converters, D/A converters, and digital potentiometers) and clock/timing ICs.
Manufacturing
Fabrication
The semiconductors of the periodic table of the chemical elements were identified as the most likely materials for a solid-state vacuum tube. Starting with copper oxide, proceeding to germanium, then silicon, the materials were systematically studied in the 1940s and 1950s. Today, monocrystalline silicon is the main substrate used for ICs although some III-V compounds of the periodic table such as gallium arsenide are used for specialized applications like LEDs, lasers, solar cells and the highest-speed integrated circuits. It took decades to perfect methods of creating crystals with minimal defects in semiconducting materials' crystal structure.
Semiconductor ICs are fabricated in a planar process which includes three key process steps – photolithography, deposition (such as chemical vapor deposition), and etching. The main process steps are supplemented by doping and cleaning.
Mono-crystal silicon wafers are used in most applications (or for special applications, other semiconductors such as gallium arsenide are used). The wafer need not be entirely silicon. Photolithography is used to mark different areas of the substrate to be doped or to have polysilicon, insulators or metal (typically aluminium or copper) tracks deposited on them. Dopants are impurities intentionally introduced to a semiconductor to modulate its electronic properties. Doping is the process of adding dopants to a semiconductor material.
Integrated circuits are composed of many overlapping layers, each defined by photolithography, and normally shown in different colors. Some layers mark where various dopants are diffused into the substrate (called diffusion layers), some define where additional ions are implanted (implant layers), some define the conductors (doped polysilicon or metal layers), and some define the connections between the conducting layers (via or contact layers). All components are constructed from a specific combination of these layers.
In a self-aligned CMOS process, a transistor is formed wherever the gate layer (polysilicon or metal) crosses a diffusion layer.
Capacitive structures, in form very much like the parallel conducting plates of a traditional electrical capacitor, are formed according to the area of the "plates", with insulating material between the plates. Capacitors of a wide range of sizes are common on ICs.
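For a rough sense of scale, an on-chip capacitor of this kind can be estimated with the parallel-plate formula C = ε0·εr·A/d. The plate dimensions and insulator permittivity in the sketch below are illustrative assumptions, not values from any specific process.

```python
# Rough parallel-plate estimate for an on-chip capacitor, C = eps0 * eps_r * A / d.
# Dimensions and permittivity are illustrative assumptions.

EPS_0 = 8.854e-12  # vacuum permittivity, F/m

def plate_capacitance(area_m2, gap_m, eps_r):
    return EPS_0 * eps_r * area_m2 / gap_m

# e.g. a 100 um x 100 um plate over 10 nm of SiO2 (eps_r ~ 3.9):
c = plate_capacitance(100e-6 * 100e-6, 10e-9, 3.9)
print(f"{c * 1e12:.1f} pF")
```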
Meandering stripes of varying lengths are sometimes used to form on-chip resistors, though most logic circuits do not need any resistors. The ratio of the length of the resistive structure to its width, combined with its sheet resistivity, determines the resistance.
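The relation described here, resistance = sheet resistance x (length / width), can be shown in a few lines; the sheet-resistance figure used below is an arbitrary example.

```python
# Sheet-resistance relation for an on-chip resistive strip: R = R_sheet * L / W.

def strip_resistance(sheet_ohms_per_square, length_um, width_um):
    """Resistance of a straight resistive strip of given length and width."""
    return sheet_ohms_per_square * (length_um / width_um)

# e.g. a meander of total length 200 um and width 1 um on a 100 ohm/sq film:
print(strip_resistance(100.0, 200.0, 1.0), "ohms")   # -> 20000.0 ohms
```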
More rarely, inductive structures can be built as tiny on-chip coils, or simulated by gyrators.
Since a CMOS device only draws current on the transition between logic states, CMOS devices consume much less current than bipolar junction transistor devices.
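A back-of-the-envelope version of this effect is the usual dynamic-power relation P ≈ α·C·V²·f, in which current flows mainly when nodes switch between logic states. All numbers in the sketch are illustrative assumptions.

```python
# Dynamic (switching) power for CMOS: P = activity * C * V^2 * f.
# All values below are illustrative assumptions.

def dynamic_power(activity, capacitance_f, vdd, freq_hz):
    return activity * capacitance_f * vdd ** 2 * freq_hz

# e.g. 10% switching activity, 1 nF of switched capacitance, 1.0 V, 1 GHz:
print(f"{dynamic_power(0.1, 1e-9, 1.0, 1e9):.2f} W")   # -> 0.10 W
```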
A random-access memory is the most regular type of integrated circuit; the highest density devices are thus memories; but even a microprocessor will have memory on the chip. Although the structures are intricate – with widths which have been shrinking for decades – the layers remain much thinner than the device widths. The layers of material are fabricated much like a photographic process, although light waves in the visible spectrum cannot be used to "expose" a layer of material, as they would be too large for the features. Thus photons of higher frequencies (typically ultraviolet) are used to create the patterns for each layer. Because each feature is so small, electron microscopes are essential tools for a process engineer who might be debugging a fabrication process.
Each device is tested before packaging using automated test equipment (ATE), in a process known as wafer testing, or wafer probing. The wafer is then cut into rectangular blocks, each of which is called a die. Each good die (plural dice, dies, or die) is then connected into a package using aluminium (or gold) bond wires which are thermosonically bonded[65] to pads, usually found around the edge of the die. Thermosonic bonding, first introduced by A. Coucoulas, provided a reliable means of forming these vital electrical connections to the outside world. After packaging, the devices go through final testing on the same or similar ATE used during wafer probing. Industrial CT scanning can also be used. Test cost can account for over 25% of the cost of fabrication on lower-cost products, but can be negligible on low-yielding, larger, or higher-cost devices.
As of 2016, a fabrication facility (commonly known as a semiconductor fab) can cost over US$8 billion to construct.[66] The cost of a fabrication facility rises over time because of increased complexity of new products. This is known as Rock's law. Today, the most advanced processes employ the following techniques:
The wafers are up to 300 mm in diameter (wider than a common dinner plate).
As of 2016, a state of the art foundry can produce 14 nm transistors, as implemented by Intel, TSMC, Samsung, and GlobalFoundries. The next step, to 10 nm devices, is expected in 2017.[67]
Copper interconnects, in which copper wiring replaces aluminium.
Low-κ dielectric insulators.
Silicon on insulator (SOI).
Strained silicon in a process used by IBM known as strained silicon directly on insulator (SSDOI).
Multigate devices such as tri-gate transistors, manufactured by Intel from 2011 in its 22 nm process.
Packaging
The earliest integrated circuits were packaged in ceramic flat packs, which continued to be used by the military for their reliability and small size for many years. Commercial circuit packaging quickly moved to the dual in-line package (DIP), first in ceramic and later in plastic. In the 1980s pin counts of VLSI circuits exceeded the practical limit for DIP packaging, leading to pin grid array (PGA) and leadless chip carrier (LCC) packages. Surface mount packaging appeared in the early 1980s and became popular in the late 1980s, using finer lead pitch with leads formed as either gull-wing or J-lead, as exemplified by the small-outline integrated circuit (SOIC) package – a carrier which occupies an area about 30–50% less than an equivalent DIP and is typically 70% thinner. This package has "gull wing" leads protruding from the two long sides and a lead spacing of 0.050 inches.
In the late 1990s, plastic quad flat pack (PQFP) and thin small-outline package (TSOP) packages became the most common for high pin count devices, though PGA packages are still used for high-end microprocessors.
Ball grid array (BGA) packages have existed since the 1970s. Flip-chip Ball Grid Array packages, which allow for much higher pin count than other package types, were developed in the 1990s. In an FCBGA package the die is mounted upside-down (flipped) and connects to the package balls via a package substrate that is similar to a printed-circuit board rather than by wires. FCBGA packages allow an array of input-output signals (called Area-I/O) to be distributed over the entire die rather than being confined to the die periphery. BGA devices have the advantage of not needing a dedicated socket, but are much harder to replace in case of device failure.
Intel transitioned away from PGA to land grid array (LGA) and BGA beginning in 2004, with the last PGA socket released in 2014 for mobile platforms. As of 2018, AMD uses PGA packages on mainstream desktop processors,[69] BGA packages on mobile processors,[70] and LGA packages on high-end desktop and server microprocessors.[71]
Electrical signals leaving the die must pass through the material electrically connecting the die to the package, through the conductive traces (paths) in the package, and through the leads connecting the package to the conductive traces on the printed circuit board. The materials and structures along this path have very different electrical properties from those encountered by signals travelling between parts of the same die. As a result, off-chip signals require special design techniques to ensure they are not corrupted, and they need much more electric power than signals confined to the die itself.
When multiple dies are put in one package, the result is a system in package, abbreviated SiP. A multi-chip module (MCM), is created by combining multiple dies on a small substrate often made of ceramic. The distinction between a large MCM and a small printed circuit board is sometimes fuzzy.
Packaged integrated circuits are usually large enough to include identifying information. Four common sections are the manufacturer's name or logo, the part number, a part production batch number and serial number, and a four-digit date-code to identify when the chip was manufactured. Extremely small surface-mount technology parts often bear only a number used in a manufacturer's lookup table to find the integrated circuit's characteristics.
The manufacturing date is commonly represented as a two-digit year followed by a two-digit week code, such that a part bearing the code 8341 was manufactured in week 41 of 1983, or approximately in October 1983.
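Decoding such a code is straightforward; the short sketch below assumes a 20th-century part, since the two-digit year in the code does not itself carry the century.

```python
# Decode a four-digit YYWW date code (e.g. "8341" -> week 41 of 1983).
# The century is a simplifying assumption; the code does not encode it.

def decode_date_code(code, century=1900):
    """Return (year, week) for a four-digit YYWW date code."""
    year = century + int(code[:2])
    week = int(code[2:])
    return year, week

print(decode_date_code("8341"))   # -> (1983, 41)
```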
Intellectual property
The possibility of copying by photographing each layer of an integrated circuit and preparing photomasks for its production on the basis of the photographs obtained is a reason for the introduction of legislation for the protection of layout-designs. The Semiconductor Chip Protection Act of 1984 established intellectual property protection for photomasks used to produce integrated circuits.[72]
A diplomatic conference held at Washington, D.C., in 1989 adopted the Treaty on Intellectual Property in Respect of Integrated Circuits,[73] also called the Washington Treaty or IPIC Treaty (signed at Washington on 26 May 1989). The treaty is currently not in force, but was partially integrated into the TRIPS agreement.[74]
National laws protecting IC layout designs have been adopted in a number of countries, including Japan,[75] the EC,[76] the UK, Australia, and Korea. The UK enacted the Copyright, Designs and Patents Act, 1988, c. 48, § 213, after it initially took the position that its copyright law fully protected chip topographies. See British Leyland Motor Corp. v. Armstrong Patents Co.
Criticisms of inadequacy of the UK copyright approach as perceived by the US chip industry are summarized in Further chip rights developments.[77]
Australia passed the Circuit Layouts Act of 1989 as a sui generis form of chip protection. Korea passed the Act Concerning the Layout-Design of Semiconductor Integrated Circuits.
Other developments
Future developments seem to follow the multi-core multi-microprocessor paradigm, already used by Intel and AMD multi-core processors. Rapport Inc. and IBM started shipping the KC256, a 256-core microprocessor, in 2006. Intel, as recently as February–August 2011, unveiled a prototype, "not for commercial sale" chip that bears 80 cores. Each core is capable of handling its own task independently of the others. This is in response to the heat-versus-speed limit that is about to be reached using existing transistor technology (see: thermal design power). This design provides a new challenge to chip programming. Parallel programming languages such as the open-source X10 programming language are designed to assist with this task.[78]
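As a minimal illustration of the parallel-programming shift this creates (using Python's standard multiprocessing module rather than X10), the sketch below spreads independent placeholder tasks across the available cores.

```python
# Minimal sketch of spreading independent work across CPU cores.
# The workload itself is a placeholder computation.

from multiprocessing import Pool

def work(n):
    return sum(i * i for i in range(n))   # stand-in for an independent task

if __name__ == "__main__":
    with Pool() as pool:                   # one worker per available core
        results = pool.map(work, [100_000] * 8)
    print(len(results), "tasks completed")
```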
Generations
In the early days of simple integrated circuits, the technology's large scale limited each chip to only a few transistors, and the low degree of integration meant the design process was relatively simple. Manufacturing yields were also quite low by today's standards. As the technology progressed, millions, then billions[79] of transistors could be placed on one chip, and good designs required thorough planning, giving rise to the field of electronic design automation, or EDA.
Name | Meaning | Year | Number of transistors[80] | Number of logic gates[81]
---|---|---|---|---
SSI | small-scale integration | 1964 | 1 to 10 | 1 to 12
MSI | medium-scale integration | 1968 | 10 to 500 | 13 to 99
LSI | large-scale integration | 1971 | 500 to 20,000 | 100 to 9,999
VLSI | very-large-scale integration | 1980 | 20,000 to 1,000,000 | 10,000 to 99,999
ULSI | ultra-large-scale integration | 1984 | 1,000,000 and more | 100,000 and more
SSI, MSI and LSI
The first integrated circuits contained only a few transistors. Early digital circuits containing tens of transistors provided a few logic gates, and early linear ICs such as the Plessey SL201 or the Philips TAA320 had as few as two transistors. The number of transistors in an integrated circuit has increased dramatically since then. The term "large scale integration" (LSI) was first used by IBM scientist Rolf Landauer when describing the theoretical concept;[82] that term gave rise to the terms "small-scale integration" (SSI), "medium-scale integration" (MSI), "very-large-scale integration" (VLSI), and "ultra-large-scale integration" (ULSI). The early integrated circuits were SSI.
SSI circuits were crucial to early aerospace projects, and aerospace projects helped inspire development of the technology. Both the Minuteman missile and Apollo program needed lightweight digital computers for their inertial guidance systems. Although the Apollo guidance computer led and motivated integrated-circuit technology,[83] it was the Minuteman missile that forced it into mass-production. The Minuteman missile program and various other United States Navy programs accounted for the total $4 million integrated circuit market in 1962, and by 1968, U.S. Government spending on space and defense still accounted for 37% of the $312 million total production.
The demand by the U.S. Government supported the nascent integrated circuit market until costs fell enough to allow IC firms to penetrate the industrial market and eventually the consumer market. The average price per integrated circuit dropped from $50.00 in 1962 to $2.33 in 1968.[84] Integrated circuits began to appear in consumer products around the turn of the 1970s. A typical application was FM inter-carrier sound processing in television receivers.
The first MOS chips were small-scale integration chips for NASA satellites.[85]
The next step in the development of integrated circuits, taken in the late 1960s, introduced devices which contained hundreds of transistors on each chip, called "medium-scale integration" (MSI).
MSI devices were attractive economically because while they cost a little more to produce than SSI devices, they allowed more complex systems to be produced using smaller circuit boards, less assembly work because of fewer separate components, and a number of other advantages.
Further development, driven by the same economic factors, led to "large-scale integration" (LSI) in the mid-1970s, with tens of thousands of transistors per chip.
The masks used to process and manufacture SSI, MSI and early LSI and VLSI devices (such as the microprocessors of the early 1970s) were mostly created by hand, often using Rubylith tape or similar materials.[87] For large or complex ICs (such as memories or processors), this was often done by specially hired professionals in charge of circuit layout, working under the supervision of a team of engineers, who would also, along with the circuit designers, inspect and verify the correctness and completeness of each mask.
Integrated circuits such as 1K-bit RAMs, calculator chips, and the first microprocessors, that began to be manufactured in moderate quantities in the early 1970s, had under 4,000 transistors. True LSI circuits, approaching 10,000 transistors, began to be produced around 1974, for computer main memories and second-generation microprocessors.
Some SSI and MSI chips, like discrete transistors, are still mass-produced, both to maintain old equipment and build new devices that require only a few gates. The 7400 series of TTL chips, for example, has become a de facto standard and remains in production.
VLSI
The final step in the development process, starting in the 1980s and continuing through the present, was "very-large-scale integration" (VLSI). The development started with hundreds of thousands of transistors in the early 1980s; as of 2016, transistor counts continue to grow beyond ten billion transistors per chip.
Multiple developments were required to achieve this increased density. Manufacturers moved to smaller design rules and cleaner fabrication facilities so that they could make chips with more transistors and maintain adequate yield. The path of process improvements was summarized by the International Technology Roadmap for Semiconductors (ITRS), which has since been succeeded by the International Roadmap for Devices and Systems (IRDS). Electronic design tools improved enough to make it practical to finish these designs in a reasonable time. The more energy-efficient CMOS replaced NMOS and PMOS, avoiding a prohibitive increase in power consumption. Modern VLSI devices contain so many transistors, layers, interconnections, and other features that it is no longer feasible to check the masks or do the original design by hand. Instead, engineers use EDA tools to perform most functional verification work.[88]
In 1986 the first one-megabit random-access memory (RAM) chips were introduced, containing more than one million transistors. Microprocessor chips passed the million-transistor mark in 1989 and the billion-transistor mark in 2005.[89] The trend continues largely unabated, with chips introduced in 2007 containing tens of billions of memory transistors.[90]
ULSI, WSI, SoC and 3D-IC
To reflect further growth in complexity, the term ULSI, which stands for "ultra-large-scale integration", was proposed for chips of more than 1 million transistors.[91]
Wafer-scale integration (WSI) is a means of building very large integrated circuits that uses an entire silicon wafer to produce a single "super-chip". Through a combination of large size and reduced packaging, WSI could lead to dramatically reduced costs for some systems, notably massively parallel supercomputers. The name is taken from the term Very-Large-Scale Integration, the current state of the art when WSI was being developed.[92]
A system-on-a-chip (SoC or SOC) is an integrated circuit in which all the components needed for a computer or other system are included on a single chip. The design of such a device can be complex and costly, and whilst performance benefits can be had from integrating all needed components on one die, the cost of licensing and developing a one-die machine still outweighs that of having separate devices. With appropriate licensing, these drawbacks are offset by lower manufacturing and assembly costs and by a greatly reduced power budget: because signals among the components are kept on-die, much less power is required (see Packaging).[93] Further, signal sources and destinations are physically closer on die, reducing the length of wiring and therefore latency, transmission power costs and waste heat from communication between modules on the same chip. This has led to an exploration of so-called Network-on-Chip (NoC) devices, which apply system-on-chip design methodologies to digital communication networks as opposed to traditional bus architectures.
A three-dimensional integrated circuit (3D-IC) has two or more layers of active electronic components that are integrated both vertically and horizontally into a single circuit. Communication between layers uses on-die signaling, so power consumption is much lower than in equivalent separate circuits. Judicious use of short vertical wires can substantially reduce overall wire length for faster operation.[94]
Silicon labelling and graffiti
To allow identification during production, most silicon chips have a serial number in one corner. It is also common to add the manufacturer's logo. Ever since ICs were created, some chip designers have used the silicon surface area for surreptitious, non-functional images or words. These are sometimes referred to as chip art, silicon art, silicon graffiti or silicon doodling.
ICs and IC families
The 555 timer IC
The 741 operational amplifier
7400 series TTL logic building blocks
4000 series, the CMOS counterpart to the 7400 series (see also: 74HC00 series)
Intel 4004, generally regarded as the first commercially available microprocessor, which led to the famous 8080 CPU and then the IBM PC's 8088, 80286, 486 etc.
The MOS Technology 6502 and Zilog Z80 microprocessors, used in many home computers of the early 1980s
The Motorola 6800 series of computer-related chips, leading to the 68000 and 88000 series (used in some Apple computers and in the 1980s Commodore Amiga series)
The LM-series of analog integrated circuits
See also
Integrated injection logic
Ion implantation
Monolithic microwave integrated circuit
Multi-threshold CMOS
Silicon-germanium
Sound chip
SPICE
Chip carrier