Parallel computer
Latest revision as of 12:21, 13 May 2024
A parallel computer is one which uses a parallel implementation for operations on larger units of data (e.g. adding two numbers a whole word at a time, rather than bit by bit). Essentially all computers are now parallel, since ICs have reduced the cost of transistors, the usual basic functional unit of modern computers, to essentially nothing, and the fundamental disadvantages of serial computers (principally their slower speed) make them inherently a bad approach.
For example, in a machine which is binary internally, instead of having only a single-bit adder (so that to add two numbers, they are fed into it a bit at a time, one bit on each clock tick, starting with the least significant bit), a parallel computer would have a word-wide adder, which would add two words together in a single clock tick.
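The contrast between a bit-serial adder and a word-wide adder can be sketched in Python. This is an illustrative model, not a description of any real machine: `WORD_SIZE`, the function names, and the tick counts are assumptions chosen to make the timing difference concrete.

```python
WORD_SIZE = 8  # hypothetical word width, for illustration only

def bit_serial_add(a, b, width=WORD_SIZE):
    """Model a serial computer: one bit is added per 'clock tick',
    starting with the least significant bit, with a carry between ticks.
    Returns (sum modulo 2**width, number of ticks taken)."""
    result = 0
    carry = 0
    ticks = 0
    for i in range(width):
        bit_a = (a >> i) & 1
        bit_b = (b >> i) & 1
        total = bit_a + bit_b + carry
        result |= (total & 1) << i
        carry = total >> 1
        ticks += 1  # one clock tick per bit position
    return result & ((1 << width) - 1), ticks

def parallel_add(a, b, width=WORD_SIZE):
    """Model a parallel computer: a word-wide adder produces the
    whole sum in a single clock tick."""
    return (a + b) & ((1 << width) - 1), 1

print(bit_serial_add(11, 6))  # (17, 8): same sum, but 8 ticks
print(parallel_add(11, 6))    # (17, 1): same sum in 1 tick
```

Both functions compute the same result; the difference is only in how many clock ticks the model charges for it, which is exactly the speed trade-off the article describes.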
Serial computers were more common in the early stages of computing; they are slower, and have more complex control logic, but seemingly used fewer components - an acceptable trade-off at that stage, when the technology (e.g. vacuum tubes) was expensive and physically bulky. In reality, the extra control logic meant they were not as economical in their use of components as had been thought, and their slower speed turned out to be too high a price to pay for the modest savings they did achieve.