Digital Integrated Circuits Ppt

Maggie Szydlowski
Aug 4, 2024, 4:37:56 PM

An integrated circuit (IC), also known as a microchip, computer chip, or simply chip, is a small electronic device made up of multiple interconnected electronic components such as transistors, resistors, and capacitors. These components are etched onto a small piece of semiconductor material, usually silicon. Integrated circuits are used in a wide range of electronic devices, including computers, smartphones, and televisions, to perform various functions such as processing and storing information. They have greatly impacted the field of electronics by enabling device miniaturization and enhanced functionality.

The IC's mass production capability, reliability, and building-block approach to integrated circuit design have ensured the rapid adoption of standardized ICs in place of designs using discrete transistors. ICs are now used in virtually all electronic equipment and have revolutionized the world of electronics. Computers, mobile phones, and other home appliances are now essential parts of the structure of modern societies, made possible by the small size and low cost of ICs such as modern computer processors and microcontrollers.


ICs have three main advantages over circuits constructed out of discrete components: size, cost, and performance. Size and cost are low because the chips, with all their components, are printed as a unit by photolithography rather than being constructed one transistor at a time. Furthermore, packaged ICs use much less material than discrete circuits. Performance is high because the IC's components switch quickly and consume comparatively little power because of their small size and proximity. The main disadvantage of ICs is the high initial cost of designing them and the enormous capital cost of factory construction. This high initial cost means ICs are only commercially viable when high production volumes are anticipated.


An integrated circuit is commonly defined as a circuit in which all or some of the circuit elements are inseparably associated and electrically interconnected so that it is considered to be indivisible for the purposes of construction and commerce.


In strict usage, integrated circuit refers to the single-piece circuit construction originally known as a monolithic integrated circuit, built on a single piece of silicon.[2][3] In general usage, circuits not meeting this strict definition are sometimes referred to as ICs, which are constructed using many different technologies, e.g. 3D IC, 2.5D IC, MCM, thin-film transistors, thick-film technologies, or hybrid integrated circuits. The choice of terminology frequently appears in discussions related to whether Moore's Law is obsolete.


An early attempt at combining several components in one device (like modern ICs) was the Loewe 3NF vacuum tube, first made in 1926.[4][5] Unlike ICs, it was designed for tax avoidance: in Germany, radio receivers were taxed according to the number of tube holders they contained, and the 3NF allowed a receiver to operate with a single tube holder. One million were manufactured, and they were "a first step in integration of radioelectronic devices".[6] The device contained an amplifier composed of three triodes, two capacitors, and four resistors in a six-pin package.[7]


Early concepts of an integrated circuit go back to 1949, when German engineer Werner Jacobi[8] (Siemens AG)[9] filed a patent for an integrated-circuit-like semiconductor amplifying device[10] showing five transistors on a common substrate in a three-stage amplifier arrangement. Jacobi disclosed small and cheap hearing aids as typical industrial applications of his patent. An immediate commercial use of his patent has not been reported.


A precursor idea to the IC was to create small ceramic substrates (so-called micromodules),[13] each containing a single miniaturized component. Components could then be integrated and wired into a two- or three-dimensional compact grid. This idea, which seemed very promising in 1957, was proposed to the US Army by Jack Kilby[13] and led to the short-lived Micromodule Program (similar to 1951's Project Tinkertoy).[13][14][15] However, as the project was gaining momentum, Kilby came up with a new, revolutionary design: the IC.


However, Kilby's invention was not a true monolithic integrated circuit chip, since it had external gold-wire connections, which would have made it difficult to mass-produce.[21] Half a year after Kilby, Robert Noyce at Fairchild Semiconductor invented the first true monolithic IC chip.[22][21] Noyce's chip was more practical than Kilby's implementation: it was made of silicon rather than germanium, was fabricated using the planar process (developed in early 1959 by his colleague Jean Hoerni), and included the critical on-chip aluminum interconnecting lines. Modern IC chips are based on Noyce's monolithic IC,[22][21] rather than Kilby's.


Building processors from dozens of TTL integrated circuits was a standard method of construction for minicomputers and mainframe computers. Computers such as IBM System/360 mainframes, PDP-11 minicomputers, and the desktop Datapoint 2200 were built from bipolar integrated circuits,[25] either TTL or the even faster emitter-coupled logic (ECL).


The earliest experimental MOS IC to be fabricated was a 16-transistor chip built by Fred Heiman and Steven Hofstein at RCA in 1962.[35] General Microelectronics later introduced the first commercial MOS integrated circuit in 1964,[36] a 120-transistor shift register developed by Robert Norman.[35] By 1964, MOS chips had reached higher transistor density and lower manufacturing costs than bipolar chips. MOS chips further increased in complexity at a rate predicted by Moore's law, leading to large-scale integration (LSI) with hundreds of transistors on a single MOS chip by the late 1960s.[37]


Following the development of the self-aligned gate (silicon-gate) MOSFET by Robert Kerwin, Donald Klein and John Sarace at Bell Labs in 1967,[38] the first silicon-gate MOS IC technology with self-aligned gates, the basis of all modern CMOS integrated circuits, was developed at Fairchild Semiconductor by Federico Faggin in 1968.[39] The application of MOS LSI chips to computing was the basis for the first microprocessors, as engineers began recognizing that a complete computer processor could be contained on a single MOS LSI chip. This led to the inventions of the microprocessor and the microcontroller by the early 1970s.[37] During the early 1970s, MOS integrated circuit technology enabled the very large-scale integration (VLSI) of more than 10,000 transistors on a single chip.[40]


At first, MOS-based computers only made sense when high density was required, as in aerospace applications and pocket calculators. Computers built entirely from TTL, such as the 1970 Datapoint 2200, were much faster and more powerful than single-chip MOS microprocessors such as the 1972 Intel 8008 until the early 1980s.[25]


Advances in IC technology, primarily smaller features and larger chips, have allowed the number of MOS transistors in an integrated circuit to double every two years, a trend known as Moore's law. Moore originally stated that the count would double every year, but he revised the claim to every two years in 1975.[41] This increased capacity has been used to decrease cost and increase functionality. In general, as the feature size shrinks, almost every aspect of an IC's operation improves. The cost per transistor and the switching power consumption per transistor go down, while the memory capacity and speed go up, through the relationships defined by Dennard scaling (MOSFET scaling).[42] Because speed, capacity, and power consumption gains are apparent to the end user, there is fierce competition among manufacturers to use finer geometries. Over the years, transistor sizes have decreased from tens of microns in the early 1970s to 10 nanometers in 2017,[43] with a corresponding million-fold increase in transistors per unit area. As of 2016, typical chip areas range from a few square millimeters to around 600 mm², with up to 25 million transistors per mm².[44]
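The scaling figures in this paragraph can be sanity-checked with a few lines of arithmetic. The following sketch (plain Python; the function and variable names are ours, and the feature sizes are the approximate values quoted above, not exact data) confirms that a roughly 1000-fold linear shrink from ~10 micrometers to 10 nanometers implies a million-fold gain in transistors per unit area, and that two-year doubling over the same span lands in the same order of magnitude.

# Illustrative sanity check of the scaling claims above; the feature
# sizes are the approximate values quoted in the text, not exact data.

def relative_transistor_count(years, doubling_period_years=2.0):
    """Moore's law: relative transistor count after `years`."""
    return 2.0 ** (years / doubling_period_years)

feature_1971_nm = 10_000.0  # "tens of microns" taken as ~10 um
feature_2017_nm = 10.0      # the 10 nm figure cited for 2017

linear_shrink = feature_1971_nm / feature_2017_nm  # 1000x
area_density_gain = linear_shrink ** 2             # density scales with area

print(f"linear shrink:     {linear_shrink:,.0f}x")     # 1,000x
print(f"area density gain: {area_density_gain:,.0f}x") # 1,000,000x

# Two-year doubling over 1971-2017 (46 years, 23 doublings):
print(f"Moore's-law gain:  {relative_transistor_count(2017 - 1971):,.0f}x")

The last line prints roughly 8.4 million, which agrees to within an order of magnitude with the million-fold density figure; density tracks the inverse square of feature size because transistors occupy area.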


The expected shrinking of feature sizes and the needed progress in related areas was forecast for many years by the International Technology Roadmap for Semiconductors (ITRS). The final ITRS was issued in 2016; it has since been replaced by the International Roadmap for Devices and Systems.[45]


Initially, ICs were strictly electronic devices. The success of ICs has led to the integration of other technologies, in an attempt to obtain the same advantages of small size and low cost. These technologies include mechanical devices, optics, and sensors.


As of 2018, the vast majority of all transistors are MOSFETs fabricated in a single layer on one side of a chip of silicon in a flat two-dimensional planar process. Researchers have produced prototypes of several promising alternatives to this planar approach.


As it becomes more difficult to manufacture ever smaller transistors, companies are using multi-chip modules, three-dimensional integrated circuits, package on package, High Bandwidth Memory, and through-silicon vias with die stacking to increase performance and reduce size without having to reduce the size of the transistors. Such techniques are collectively known as advanced packaging.[56] Advanced packaging is mainly divided into 2.5D and 3D packaging: 2.5D describes approaches such as multi-chip modules, while 3D describes approaches where dies are stacked in one way or another, such as package on package and High Bandwidth Memory. All of these approaches involve two or more dies in a single package.[57][58][59][60][61] Alternatively, approaches such as 3D NAND stack multiple layers on a single die. Techniques have also been demonstrated for improving cooling, including microfluidic cooling on integrated circuits[62] and, in flip-chip packages, Peltier thermoelectric coolers on solder bumps or thermal solder bumps used exclusively for heat dissipation.[63][64]
