Decimal


Formally, decimal refers to the base-10 number system, which was used at the hardware level of most early computing devices (from Babbage's Difference and Analytical Engines up through the ENIAC), and also in some early computers (sometimes in the form of bi-quinary coding). It can also refer to numbers or other data stored in this form.
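
As a minimal sketch of the bi-quinary idea mentioned above (used in machines such as the IBM 650): each decimal digit is split into a "bi" component worth five and a "quinary" component worth one, so that the digit equals 5*bi + quinary. The function names here are illustrative, not from any historical machine's documentation.

 def to_biquinary(digit: int) -> tuple[int, int]:
     """Split a decimal digit 0-9 into its (bi, quinary) components."""
     if not 0 <= digit <= 9:
         raise ValueError("decimal digits are 0-9")
     return divmod(digit, 5)  # bi = digit // 5, quinary = digit % 5
 
 def from_biquinary(bi: int, quinary: int) -> int:
     """Recombine the components: digit = 5*bi + quinary."""
     return 5 * bi + quinary
 
 # 7 -> (1, 2): one "five" plus two "ones"
 assert to_biquinary(7) == (1, 2)
 assert from_biquinary(1, 2) == 7

One appeal of this coding in hardware was error detection: in a valid bi-quinary digit, exactly one "bi" indicator and exactly one "quinary" indicator are active at a time, so many single faults are immediately visible.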