Parallel computer
A parallel computer is one which operates on all the bits of a unit of data (e.g. the two words in an addition) simultaneously, rather than one bit at a time. Essentially all computers are now parallel: integrated circuits have reduced the cost of transistors, the usual basic functional element of modern computers, to almost nothing, and the fundamental disadvantage of serial computers (principally their slower speed) makes them inherently a poor approach.
For example, in a machine which is binary internally, instead of having only a single-bit adder (so that to add two numbers, they are fed into it a bit at a time, one bit on each clock tick, starting with the least significant bit), a parallel computer would have a word-wide adder, which would add two words together in a single clock tick.
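The contrast can be sketched in Python (the function names and 8-bit word width here are illustrative, not drawn from any particular machine): the serial version applies a single-bit full adder once per simulated clock tick, starting with the least significant bit, while the parallel version adds the whole word in one step.

```python
def serial_add(a, b, width=8):
    """Bit-serial addition: one bit per 'clock tick', LSB first."""
    carry = 0
    result = 0
    for i in range(width):            # each iteration ~ one clock tick
        bit_a = (a >> i) & 1
        bit_b = (b >> i) & 1
        s = bit_a ^ bit_b ^ carry     # single-bit full adder: sum bit
        carry = (bit_a & bit_b) | (carry & (bit_a ^ bit_b))
        result |= s << i
    return result & ((1 << width) - 1)

def parallel_add(a, b, width=8):
    """Word-wide addition: the whole word in a single step."""
    return (a + b) & ((1 << width) - 1)
```

Both produce the same answer (modulo the word width); the difference is that the serial version needs `width` ticks, and the control logic to sequence them, where the parallel adder needs one.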
Serial computers were more common in the early days of computing; they are slower and need more complex control logic, but seemingly used fewer components - an acceptable trade-off at a stage when the technology (e.g. vacuum tubes) was expensive and physically bulky. In reality, because of that extra control complexity, they were not as economical in their use of components as had been hoped, and their slower speed turned out to be too high a price to pay for the modest savings they did achieve.