Integrated circuit

An integrated circuit (often IC or chip) is a section of electronic circuitry (transistors, resistors, capacitors, and the wires connecting them together) fabricated as a single small monolithic entity. ICs are usually produced by a photolithographic process on small pieces of semiconductor.

The initial ICs, produced in the early 1960s and termed Small-Scale Integration or SSI, contained only a few transistors; the next generation, Medium-Scale Integration or MSI, in the late 1960s, contained hundreds; this was followed by Large-Scale Integration or LSI in the early 1970s, which contained tens of thousands of transistors.

Since an IC packs a large amount of circuitry into a small space, thereby increasing speed and reducing power requirements, and since it is easy to produce in high volume at low cost, it has utterly revolutionized the computer field (and indeed, electronics as a whole).

The first ICs were independently invented in the late 1950s by Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor. (Several people had described the basic concept before them, but they were the first to reduce the idea to working hardware.)