Decimal

Formally, decimal refers to the base-10 number system, which was used at the hardware level of most early computing devices (ranging from Babbage's Difference and Analytical Engines up through the ENIAC), and in some early computers as well (sometimes in the form of bi-quinary). It can also refer to numbers or other data stored in this form.
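
As a minimal illustrative sketch (not drawn from any particular machine's documentation), the following C program shows the idea behind bi-quinary coding: each decimal digit d is split into a "bi" part worth five (0 or 1) and a "quinary" part (0-4), so that d = 5*bi + quinary. Machines such as the IBM 650 stored digits this way, activating one of two bi lines and one of five quinary lines per digit.

 #include <stdio.h>

 int main(void) {
     /* Decompose every decimal digit into its bi-quinary pair. */
     for (int d = 0; d <= 9; d++) {
         int bi = d / 5;      /* 0 for digits 0-4, 1 for digits 5-9 */
         int quinary = d % 5; /* position within the group of five  */
         printf("digit %d -> bi=%d quinary=%d\n", d, bi, quinary);
     }
     return 0;
 }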