Serial decimal

In computers, a serial decimal numeric representation is one in which ten bits are reserved for each digit, with exactly one of the ten bits set to indicate which digit is intended (a 1-of-10 code). ENIAC used this representation.[1]
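
A minimal Python sketch of the 1-of-10 scheme described above (illustrative only, not from the article; the helper names encode_digit and decode_digit are invented here). Each digit occupies ten bits, with exactly one bit turned on:

    def encode_digit(d: int) -> int:
        # Ten bits per digit: bit d, and only bit d, is turned on.
        if not 0 <= d <= 9:
            raise ValueError("digit must be 0-9")
        return 1 << d

    def decode_digit(bits: int) -> int:
        # Valid patterns are exactly the ten single-bit values 1, 2, 4, ..., 512.
        if bits not in {1 << d for d in range(10)}:
            raise ValueError("not a valid 1-of-10 pattern")
        return bits.bit_length() - 1

    # Every digit round-trips, and each pattern uses one of its ten bits.
    for d in range(10):
        bits = encode_digit(d)
        assert decode_digit(bits) == d
        print(f"{d} -> {bits:010b}")

Running the loop prints, for example, "3 -> 0000001000": the digit 3 is represented by turning on only bit 3 of the ten reserved bits.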