Serial decimal
In computers, a serial decimal numeric representation is one in which ten bits are reserved for each digit, with exactly one bit turned on to indicate which of the ten possible digit values is intended. ENIAC used this representation.[1]
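As a rough illustration of the coding scheme described above (not of any particular machine's circuitry), the following Python sketch encodes and decodes a single decimal digit as a one-out-of-ten bit pattern; the function names and list representation are illustrative assumptions, not part of any historical design:

```python
def encode_digit(d: int) -> list[int]:
    """Encode a decimal digit as ten bits, with only bit d turned on."""
    if not 0 <= d <= 9:
        raise ValueError("digit must be in the range 0-9")
    return [1 if i == d else 0 for i in range(10)]

def decode_digit(bits: list[int]) -> int:
    """Recover the digit: exactly one of the ten bits must be set."""
    if len(bits) != 10 or sum(bits) != 1:
        raise ValueError("not a valid serial decimal digit")
    return bits.index(1)

# Example: the digit 3 becomes the pattern 0 0 0 1 0 0 0 0 0 0.
assert encode_digit(3) == [0, 0, 0, 1, 0, 0, 0, 0, 0, 0]
assert decode_digit(encode_digit(3)) == 3
```

Note that this coding uses ten bits to carry what four bits can express in binary-coded decimal, trading storage density for the simplicity of detecting a digit by testing a single bit.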

