Difference between revisions of "Parallel computer"

From Computer History Wiki
Latest revision as of 13:21, 13 May 2024

A parallel computer is one which uses a parallel implementation for operations on larger units of data (e.g. the addition of two numbers). Essentially all computers are now parallel, since ICs have reduced the cost of transistors, the usual basic functional unit of modern computers, to almost nothing, and the fundamental disadvantages of serial computers (principally slower speed) make them an inherently poor approach.

For example, in a machine which is binary internally, instead of having only a single-bit adder (so that to add two numbers, they are fed into it a bit at a time, one bit on each clock tick, starting with the least significant bit), a parallel computer would have a word-wide adder, which would add two words together in a single clock tick.
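The contrast above can be sketched in code (a minimal illustration, not from the article; the 8-bit word width and function names are assumptions): a bit-serial machine runs a single-bit full adder once per clock tick, least significant bit first, while a parallel machine's word-wide adder produces the whole sum in one tick.

```python
WIDTH = 8  # assumed word width in bits, for illustration

def serial_add(a, b, width=WIDTH):
    """Bit-serial addition: one bit per 'clock tick', LSB first,
    reusing a single 1-bit full adder and carrying its state between ticks."""
    result = 0
    carry = 0
    for tick in range(width):           # one clock tick per bit of the word
        bit_a = (a >> tick) & 1
        bit_b = (b >> tick) & 1
        s = bit_a ^ bit_b ^ carry       # sum output of the 1-bit full adder
        carry = (bit_a & bit_b) | (carry & (bit_a ^ bit_b))  # carry output
        result |= s << tick
    return result & ((1 << width) - 1)  # discard any final carry out

def parallel_add(a, b, width=WIDTH):
    """Word-wide addition: the whole sum is available after a single tick."""
    return (a + b) & ((1 << width) - 1)

print(serial_add(100, 55))    # 155, after 8 ticks
print(parallel_add(100, 55))  # 155, in a single tick
```

Both functions compute the same 8-bit sum; the difference the article describes is in hardware cost versus time, with the serial version needing eight ticks where the parallel one needs one.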

Serial computers were more common in the early stages of computing; they are slower and have more complex control logic, but seemingly used fewer components - an acceptable trade-off at that stage, when the technology (e.g. vacuum tubes) was expensive and physically bulky. In practice, however, their extra control complexity meant they were not as economical in component count as had been expected, and that modest saving proved too high a price to pay for the added complexity and slower speed.