Programming language

From Computer History Wiki

A programming language is a human-readable means of giving a computer directions on what computations to perform.

The earliest programming languages were assembly languages, which were simply human-readable forms of the machine's basic object code. Rather than being in binary (expressed in whatever base), they instead used mnemonics to indicate the instructions.
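The correspondence between mnemonics and numeric opcodes can be illustrated with a toy assembler; the instruction set and opcode values below are invented for the sketch and do not correspond to any real machine.

```python
# Toy illustration of what an assembler does: translate human-readable
# mnemonics into numeric opcodes. The mnemonics and opcode values here
# are hypothetical, not from any real instruction set.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def assemble(source):
    """Translate lines like 'ADD 32' into (opcode, operand) pairs."""
    program = []
    for line in source.strip().splitlines():
        parts = line.split()
        operand = int(parts[1]) if len(parts) > 1 else 0
        program.append((OPCODES[parts[0]], operand))
    return program

machine_code = assemble("""
LOAD 10
ADD 32
STORE 20
HALT
""")
print(machine_code)  # [(1, 10), (2, 32), (3, 20), (255, 0)]
```

A real assembler additionally handles symbolic labels, addressing modes, and the encoding rules of a particular machine, but the core idea is this one-to-one translation.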

It soon became apparent that it was desirable to allow people to give these directions in a form which was closer to the high-level tasks they were trying to accomplish.

For example, since an important use for early computers was to perform mathematical computations, FORTRAN (one of the earliest computer languages) allowed directions to be given in equations of a form familiar to engineers, etc.

The development of programming languages has been a never-ending process since then, as experience allows better languages to be developed, with capabilities more suited to the tasks to be accomplished.

Compilers and interpreters

Programming languages are supported in two general ways:

  • compilers translate the high-level language statements into a sequence of machine instructions which will perform the specified actions;

  • interpreters read the high-level language statements and carry out the specified actions directly, without producing machine instructions.

It is quite common for a language to have both an interpreter and a compiler, since each has advantages and disadvantages. A program which has been compiled generally runs faster, but interpreters usually provide superior debugging tools.
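The interpretation side of this trade-off can be sketched with a minimal interpreter for a toy accumulator machine; the instruction set and the pre-loaded memory contents are invented for the example. Note how each instruction is decoded and acted on directly at run time, which is exactly the overhead a compiler removes by translating the program once, ahead of time.

```python
# Minimal sketch of an interpreter for a hypothetical accumulator
# machine: each instruction is decoded and executed on the fly,
# with no translation to native machine code.
def interpret(program):
    memory = {10: 7, 20: 0}   # toy memory, pre-loaded for the demo
    acc = 0                   # the accumulator register
    for op, arg in program:
        if op == "LOAD":      # load a memory cell into the accumulator
            acc = memory[arg]
        elif op == "ADD":     # add an immediate value
            acc += arg
        elif op == "STORE":   # write the accumulator back to memory
            memory[arg] = acc
        elif op == "HALT":
            break
    return memory

result = interpret([("LOAD", 10), ("ADD", 35), ("STORE", 20), ("HALT", 0)])
print(result[20])  # 42
```

Because the interpreter retains the source-level structure of the program while it runs, it can easily report which statement failed or let the user inspect and change values mid-run, which is why interpreters tend to offer the better debugging experience.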