Computer
A computer is, unfortunately, not defined in a way that is both generally agreed upon and precise. This leads to ferocious debates about which machine was the first 'computer', because, as Michael Williams observed, "If you add enough adjectives to a description you can always claim [a particular machine to be the 'first']."
The meaning of 'computer' is now generally 'stored-program computing device': the program (which, it must be emphasized, is a specified series of elementary steps) is stored in the device in some form, in a way that the user can relatively easily change to make the device perform some other computation. Programmability is not simply the ability to make physical changes to the device to change which calculation it performs, even if those changes are instantiated in something added to the device for exactly that purpose, such as the plug-boards of the ENIAC.
An open question is whether the program must be stored in memory that the computer can modify, allowing a program to change itself as it runs. This has historically been seen as part of what makes a 'computer', both because Turing machines (the theoretical foundation of computers) had that capability, and because the first electronic computers generally could do so as well: they used the same memory to hold both the program and the data.
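The shared program/data memory described above can be sketched with a toy accumulator machine. Everything here (the instruction set, the opcodes, the example program) is invented for illustration, not drawn from any historical machine: instructions and data live in the same memory list, so the program can overwrite one of its own instructions before execution reaches it.

```python
# Toy stored-program machine (hypothetical design, for illustration only).
# Program and data share one memory; STORE_INSTR rewrites an instruction,
# demonstrating self-modifying code.

def run(mem, pc=0, acc=0):
    """Execute instructions in mem starting at pc; return acc on HALT."""
    while True:
        op, arg = mem[pc]
        pc += 1
        if op == "LOADI":            # load immediate value into accumulator
            acc = arg
        elif op == "STORE_INSTR":    # overwrite the instruction at address arg
            mem[arg] = ("LOADI", acc)
        elif op == "HALT":
            return acc

program = [
    ("LOADI", 99),        # 0: acc = 99
    ("STORE_INSTR", 3),   # 1: replace instruction 3 with ("LOADI", 99)
    ("LOADI", 5),         # 2: acc = 5
    ("LOADI", 0),         # 3: originally loads 0, but was rewritten at step 1
    ("HALT", None),       # 4: stop; acc is now 99, not 0
]

print(run(list(program)))  # → 99
```

Had instruction 3 been held in unmodifiable storage, the machine would have halted with 0 instead; the difference is exactly the self-modification capability under discussion.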
Whether the ability to modify its own program is essential for something to be classified as a 'computer' can be debated. Modern computers almost always run pure code (in part because self-modifying code can be difficult to understand and debug), so it is probably not critical; indeed, embedded systems usually use ROM for their program storage. On the other hand, program modifiability is a key aspect of classical Turing machines; but a machine running out of ROM could emulate a Turing machine, arguably making such a machine a Turing machine (since it can provide one). It is also possible that the ability to perform conditional branches is an alternative to program modifiability, as far as Turing completeness is concerned.
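The point that a fixed, ROM-resident program can still provide Turing-machine behaviour can be sketched with a tiny interpreter. The interpreter and the example machine below are invented for this illustration: the transition table (the 'program') is treated as read-only, and only the tape (the data) is ever written.

```python
# Minimal Turing-machine interpreter (illustrative sketch, not from the
# article). The rules table plays the role of a program in ROM; only the
# tape is modified during execution.

def run_tm(rules, tape, state="start", head=0, max_steps=10_000):
    """Run a Turing machine until it reaches 'halt' or exceeds max_steps.

    rules: dict mapping (state, symbol) -> (write, move, next_state);
           move is -1 (left), +1 (right), or 0 (stay). Never modified.
    tape:  dict mapping cell position -> symbol; absent cells read as '_'.
    """
    for _ in range(max_steps):
        if state == "halt":
            return tape
        symbol = tape.get(head, "_")
        write, move, state = rules[(state, symbol)]
        tape[head] = write   # only the data store changes
        head += move
    raise RuntimeError("step limit exceeded")

# Example machine: invert a string of bits, halting at the first blank cell.
FLIP = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", 0, "halt"),
}

result = run_tm(FLIP, {0: "1", 1: "0", 2: "1"})
print(result[0], result[1], result[2])  # → 0 1 0
```

The fixed table plus a modifiable tape is all the machinery needed; making the interpreter universal would only require encoding another machine's table onto the tape as data.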
Further reading
- Paul E. Ceruzzi, Reckoners: The Prehistory of The Digital Computer, From Relays to the Stored Program Concept, 1935-1945, Greenwood, Westport, 1983
- William Aspray (editor), Computing Before Computers, Iowa State University Press, Ames, 1990
- Brian Randell (editor), The Origins of Digital Computers: Selected Papers, Springer-Verlag, Berlin, Heidelberg, New York, 1973, 1982 (3rd edition)
- Nicholas Metropolis, Jack Howlett, Gian-Carlo Rota (editors), A History of Computing in the Twentieth Century, Academic Press, New York, 1980
- Raúl Rojas, Ulf Hashagen (editors), The First Computers: History and Architectures, MIT Press, Cambridge, 2002
External links
- Disambiguating the first computer
- How to Make Zuse's Z3 a Universal Computer - it turns out one doesn't even need conditional branching to have a UTM