From Computer History Wiki
Latest revision as of 06:18, 7 August 2024

Formally, decimal refers to the base-10 number system, which was used at the hardware level of most early computing devices (ranging from Babbage's Difference and Analytical Engines up through the ENIAC), and in some early electronic computers as well (sometimes in the form of bi-quinary coding). It can also refer to numbers or other data stored in this form.
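
The bi-quinary coding mentioned above splits each decimal digit into a two-state "bi" part (selecting the 0–4 or 5–9 group) and a five-state "quinary" part (the position within the group), as in the IBM 650. The following is a minimal sketch of that scheme, assuming a one-hot bit representation for each part; the function names are illustrative, not from any historical machine's documentation.

```python
def encode_biquinary(digit):
    """Encode a decimal digit 0-9 as (bi, quinary) one-hot bit tuples,
    in the style of bi-quinary coding (illustrative sketch)."""
    if not 0 <= digit <= 9:
        raise ValueError("digit must be 0-9")
    bi = digit // 5        # 0 selects the 0-4 group, 1 the 5-9 group
    qu = digit % 5         # position within the selected group
    bi_bits = tuple(1 if i == bi else 0 for i in range(2))
    qu_bits = tuple(1 if i == qu else 0 for i in range(5))
    return bi_bits, qu_bits

def decode_biquinary(bi_bits, qu_bits):
    """Recover the decimal digit from its bi-quinary bit tuples."""
    return 5 * bi_bits.index(1) + qu_bits.index(1)
```

For example, the digit 7 encodes as the "5" bi bit plus the "2" quinary bit: `encode_biquinary(7)` gives `((0, 1), (0, 0, 1, 0, 0))`. A practical advantage of the scheme was error checking: exactly one bi bit and exactly one quinary bit must be set in any valid digit.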