Integrated circuit

An integrated circuit (often IC or chip) is a section of electronic circuitry (transistors, resistors, capacitors, and the wires connecting them together) fabricated as a small monolithic entity. ICs are usually produced by a photolithographic process on small pieces of semiconductor.

The first ICs were independently invented in the late 1950s by Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor. (Several people had described the basic concept before them, but they were the first to reduce the idea to working hardware.)

Experience and improved production technology allowed the amount of circuitry in ICs to increase rapidly. The initial ICs of the early 1960s, so-called Small-Scale Integration or SSI, contained only a few transistors; the next generation, Medium-Scale Integration or MSI, in the late 1960s contained hundreds; this was followed in the early 1970s by Large-Scale Integration or LSI, with tens of thousands of transistors per chip.

Further growth required a whole new design methodology to be developed; with this in hand, the process moved on to Very-Large-Scale Integration or VLSI in the early 1980s, with up to a million transistors per chip. Today's largest ICs contain billions.
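The figures above imply a roughly exponential growth in transistor counts. As a rough illustration only, the following Python sketch takes approximate dates and counts assumed from the generations described above (the exact years and counts are assumptions for the sake of the calculation) and computes the doubling time implied by each step:

 # Rough calculation of the growth in IC transistor counts implied by the
 # generations described above. Dates and counts are approximate figures
 # assumed from the text; the resulting doubling times are only illustrative.
 import math
 
 generations = [
     ("SSI",   1962, 10),             # "only a few transistors"
     ("MSI",   1968, 500),            # "hundreds"
     ("LSI",   1972, 20_000),         # "tens of thousands"
     ("VLSI",  1982, 1_000_000),      # "up to a million"
     ("today", 2020, 5_000_000_000),  # "billions"
 ]
 
 for (name_a, year_a, count_a), (name_b, year_b, count_b) in zip(generations, generations[1:]):
     years = year_b - year_a
     doublings = math.log2(count_b / count_a)
     print(f"{name_a} -> {name_b}: {years} years, "
           f"~{years / doublings:.1f} years per doubling of transistor count")

Under these assumed figures, each step works out to a doubling of transistor count every one to two years or so, which matches the often-quoted pace of growth in the industry.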

Because an IC packs a large mass of circuitry into a small space, increasing speed and reducing power requirements, and because it is easy to produce in high volume at low cost, it has utterly revolutionized the computer field (and indeed, electronics as a whole).

Early ICs were usually packaged in plastic or ceramic Dual In-line Packages; in recent years, surface-mount packaging, which avoids the need for through-holes and can be denser, has become usual.