Talk:Extended Binary Coded Decimal Interchange Code

From Wikipedia, the free encyclopedia

EBCD

The article on the IBM 5100 says EBCD was used in the IBM 2741. --Gbleem 05:30, 1 April 2006 (UTC)

No, the 2741 used a 6-bit character code (plus shift-up and shift-down control codes). The communications controller (an IBM 270x-series device) interpreted the up/down shift codes and converted them into a seventh bit to store in memory, and generated up-down shift codes from the seventh bit on output. This seven-bit code was generally converted to/from EBCDIC by software. The 6-bit code was an encoding of the Selectric's typeball tilt amount (2 bits) and rotation (4 bits). In models where an ordinary office typeball was used, the resulting character code was called "correspondence code". Other models used a code closer to computational BCD, achieved mostly by using a typeball with the characters arranged differently. (This latter form required more logic circuitry to translate keyboard inputs appropriately.) 209.245.22.27 (talk) 18:07, 28 November 2007 (UTC)
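
The tilt/rotate packing described above can be sketched in a few lines of Python. This is illustrative only: the helper names and bit ordering are hypothetical, and the real 2741 wire format (including its parity handling and shift codes) is not reproduced here.

```python
# Illustrative sketch: pack a Selectric typeball position into a
# 6-bit code as the comment above describes -- 2 bits of tilt plus
# 4 bits of rotation. Bit ordering here is an assumption.
def pack_tilt_rotate(tilt, rotate):
    assert 0 <= tilt < 4 and 0 <= rotate < 16
    return (tilt << 4) | rotate          # 6-bit result: ttrrrr

def unpack_tilt_rotate(code):
    return (code >> 4) & 0x3, code & 0xF

code = pack_tilt_rotate(2, 11)
print(code, unpack_tilt_rotate(code))
```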

Addition Request

The ISO/IEC 8859 article has a nice table showing the various parts. It would be nice to have a similar table showing the EBCDIC variants. One could then see at a glance where they were the same and where they were different. Such a table should have (at a minimum) CCSIDs 037, 285, and 500.

Agreed. The most common code page used in the U.S. was 037, but this has been replaced in recent years by 1047 (at least on S/390 systems running Linux). — Loadmaster 23:28, 13 November 2006 (UTC)

Query

Is 5A not the exclamation mark? —Preceding unsigned comment added by 194.81.254.10 (talk) 02:01, 2 November 2007 (UTC)

It seems that the code page table was wrong. Row 8, with the letters a-i, should be moved one column to the left, that is, a=81, b=82 and so on. —Preceding unsigned comment added by 88.131.23.18 (talk) 16:53, 17 December 2007 (UTC) , fixed. JoeBackward (talk) 03:31, 9 January 2008 (UTC)
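
For anyone verifying the corrected table, Python's built-in CP037 codec (one common U.S. EBCDIC variant) makes a handy cross-check; note that the answer to the 5A question depends on which variant is meant.

```python
# Cross-check EBCDIC code page 037 with Python's built-in codec.
# In CP037, 0x5A is '!'; other EBCDIC variants assign it
# differently (e.g. ']' in CP500).
assert 'a'.encode('cp037') == b'\x81'   # a = 81, b = 82, ... i = 89
assert 'i'.encode('cp037') == b'\x89'
assert bytes([0x5A]).decode('cp037') == '!'
print('CP037 spot checks pass')
```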

Support

I would guess that the word support, as in "the computer supports EBCDIC", was originally marketspeak. It implies that the use of EBCDIC is a desirable option instead of a requirement. --Gbleem 22:15, 31 August 2006 (UTC)

The IBM S/360 had an "ASCII/EBCDIC" bit in the program status word "register", supposedly to control what zone nibbles were created by zoned decimal conversion opcodes. I think the theory was that instead of generating "F0 F0" (which is EBCDIC "00"), it would generate "30 30" (which is ASCII "00"). This control bit was removed in later versions of the hardware. — Loadmaster 23:26, 13 November 2006 (UTC)
Actually, the S/360 would generate zone values of "50 50" for ASCII zeroes, because IBM assumed (was hoping) that the industry would accept extending 7-bit ASCII to 8 bits by shifting the first 3 bits to the left and inserting a duplicate of the high-order bit into the fourth bit position (from the left). The logic of packed decimal arithmetic in some instructions, such as "Edit", depended on the notion that the "sign digit" would have a value that fell beyond the 0-9 range. (A full discussion of this might be interesting content for Wikipedia, but not in the EBCDIC article.) In the end, IBM apparently decided that the best way to support ASCII was to use EBCDIC internally and then convert character data to ASCII by use of the TR (Translate) instruction. The bit in the PSW (Program Status Word) assigned to specifying "ASCII" mode was re-assigned in S/370 to control "Extended Control" mode. This was safe because IBM never created an operating system that set the ASCII bit to 1, and setting the bit could only be done by privileged (i.e. OS) code. -- RPH 12:29, 27 June 2007 (UTC)
Correction: Actually IBM's concept of an 8-bit version of ASCII (or USASCII, as it was known later in the life of the System/360) was more complex, as described in the System/360 Principles of Operation. IBM had proposed an 8-bit extension of (US)ASCII by applying a mapping transform in which the three high-order bits of the byte were taken from the first two bits of the ASCII character code, followed by the high-order bit, repeated. This had the effect of "stretching" the ASCII code points across the 0-255 range. For example, the numeric values would be mapped to hex 50-59 instead of 30-39. IBM apparently hoped that this arrangement would be accepted by the committee, because it would avoid architectural problems with the implementation of packed-decimal instructions. For example, the "Edit" (ED) and "Edit-and-Mark" (EDMK) instructions used character values of 20 and 21 as "digit select" and "significance start" characters, but that wouldn't work properly if the space were still mapped to the hex 20 code point. Under IBM's re-mapping, the value of a Space character would be hex 40 (the same as in EBCDIC). Since the standards committee never agreed to IBM's 8-bit mapping, IBM dropped the "ASCII-mode" bit in the Program Status Word in the following generation of processors, replacing the bit with one that indicated "extended control mode". ASCII would be supported by using the Translate instruction upon input and output. This information would be an interesting historical background for both EBCDIC and the System/360, although probably in a separate article. RPH 20:40, 10 September 2007 (UTC) (reedit: RPH 21:30, 23 October 2007 (UTC))
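
The proposed 7-to-8-bit transform can be sketched directly from the description above (top two ASCII bits, then a repeat of the high-order bit, then the low five bits unchanged). Only the digit and space mappings actually stated above are asserted; the function name is a placeholder.

```python
# Sketch of IBM's proposed 8-bit USASCII mapping, per the
# description above: bits 7-6 of the byte come from the top two
# bits of the 7-bit code, bit 5 repeats the high-order bit, and
# the low five bits pass through unchanged.
def usascii8(c):
    b = ord(c) & 0x7F
    hi = (b >> 6) & 1              # high-order bit of the 7-bit code
    return (b >> 5 << 6) | (hi << 5) | (b & 0x1F)

print(hex(usascii8('0')), hex(usascii8(' ')))  # 0x50 0x40
```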

Pronunciation?

How is EBCDIC pronounced? Eb-ka-dic? --Dgies 18:10, 3 November 2006 (UTC)

The jargon file gives "eb-see-dic" together with two less euphonic variants; but I'd really like an "official pronunciation" added into the article. --tyomitch 03:49, 7 November 2006 (UTC)
Most mainframe programmers I've heard (in the U.S.) pronounce it "eb'-se-dik". — Loadmaster 23:23, 13 November 2006 (UTC)
I've heard "eb-see-dic" a lot too. --Memming (talk) 21:27, 5 May 2008 (UTC)

Relation to Hollerith Code

I have read that EBCDIC is a descendant of Hollerith Code (e.g. http://www.columbia.edu/acis/history/census-tabulator.html). If this is accurate, it should be mentioned. (Even if it is false, that should be mentioned, since the claim is out there.) —überRegenbogen 12:26, 24 February 2007 (UTC)

I added a mention of "Extended Hollerith" as the card-code that corresponds to EBCDIC in the S/360+ systems. Much more could be said on this topic, such as including a code-chart that demonstrates the logical nature of the mapping between EBCDIC and the extended card-code. Such a chart is found in the IBM S/360 Principles of Operation manual and many other publications from that period. Actually, such a chart, showing EBCDIC in its original form, would be more instructive than the somewhat disingenuous inclusion of one of the National Language Support extensions to EBCDIC, apparently to make the point that EBCDIC is a chaotic mess, even though extensions to ASCII for this purpose have had essentially the same effect on that code as well. -- RPH 12:42, 27 June 2007 (UTC)
I removed the international characters from the chart, leaving only the common EBCDIC characters. I also shaded the invariant code points, which represent the same characters in all EBCDIC variants. (Corrections welcome, of course.) — Loadmaster (talk) 19:14, 12 April 2008 (UTC)

Usage of EBCDIC

All IBM mainframe peripherals and operating systems (except Linux on zSeries or iSeries) use EBCDIC as their inherent encoding, but software can translate to and from other encodings.

At exactly which places is EBCDIC used within IBM products? I can only think up EBCDIC being used as a text file encoding, but with Unicode even that usage is obsolete. --Abdull 09:51, 7 June 2007 (UTC)

As far as I know, every IBM mainframe still uses EBCDIC as its primary character set. That means that every mainframe disk file (dataset), data storage tape, or CD contains text data in EBCDIC form. Can you name any IBM mainframe systems that actually use Unicode? — Loadmaster (talk) 19:19, 12 April 2008 (UTC)
It is still used in financial transaction processing for debit networks. --Mirell (talk) 15:30, 14 April 2008 (UTC)
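
The TR-based conversion mentioned in earlier comments is just a 256-byte table lookup, and Python's `bytes.translate` works the same way. A minimal sketch, assuming CP037 as the EBCDIC variant and using '?' for bytes with no ASCII equivalent:

```python
# Build a 256-entry EBCDIC(CP037) -> ASCII translate table, then
# apply it byte-by-byte -- the same table-lookup scheme as the
# S/360 TR instruction.
table = bytearray(256)
for i in range(256):
    ch = bytes([i]).decode('cp037')   # CP037 decodes every byte
    table[i] = ord(ch) if ord(ch) < 0x80 else ord('?')

ebcdic = 'Hello, world'.encode('cp037')
print(ebcdic.translate(bytes(table)))  # b'Hello, world'
```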

Redundant

Under the "Criticism and humor" section, isn't "Another popular complaint is that the EBCDIC alphabetic characters follow an archaic punch card encoding rather than a linear ordering like ASCII. " equivalent to the snippet from esr: "...such delights as non-contiguous letter sequences..."? --WayneMokane 22:36, 18 October 2007 (UTC)

EBCDIC niceties?

The example given, "while in EBCDIC there is one bit which indicates upper or lower case", is not valid, since the same applies to ASCII: the third most significant bit signifies lowercase. Anyone have a replacement, or is EBCDIC without niceties? --Luke-Jr (talk) 07:43, 23 March 2008 (UTC)

Yes. In its original form, that is, as an extension of Binary Coded Decimal Interchange Code (BCDIC, a six-bit IBM code), the addition of two high-order bits allowed the characters to be unfolded into four groups, or "quadrants", numbered 0 to 3. Quadrant 0 contained control codes, generally used only for terminals. Quadrant 1 contained the space and all "special" characters (punctuation marks and symbols). Quadrant 2 contained the lower-case letters (rarely used in the 1960's, and not part of the original BCD code). Quadrant 3 contained the capital letters and numeric characters, in that order. Overall, this mapping put the characters into a good sorting order, while at the same time simplifying the logic circuits that translated the old 6-bit BCD into EBCDIC, into quadrants 1 and 3. Since BCDIC had no control characters or lower-case letters, this was done by setting the leftmost bit according to whether the character was alphameric or special, with the second bit always set to 1. Early peripherals for the 360, such as the 1403 and 1443 printers, carried over from the previous generation of systems, worked without modification using the last 6 bits of the character code, although an extra-cost feature called UCS (Universal Character Set) permitted use of the full 8 bits, to support lower-case and other characters. The old 7-track tapes (6 bits plus parity) written on earlier systems could be read and translated into EBCDIC by the tape control unit electronics. Most installations had mainly 9-track drives, plus one or two 7-track drives for tape compatibility with the older IBM systems, which continued to be used alongside the 360's until they were phased out. So EBCDIC was meant to be a transitional code, but conversion to ASCII turned out to be tougher than anticipated, made more difficult when IBM's proposal for an 8-bit mapping of ASCII was rejected by the standards committee, in which IBM had little voting power.
This made the ASCII-mode bit in the 360's Program Status Word essentially useless. The ASCII-mode was dropped in the System/370, and the bit became the "extended control mode" bit, enabling the new 370 features, such as virtual memory. Since the bit was always set to 0 by older operating systems, it became a handy way of enabling compatible operation of 360-based operating system code. RPH (talk) 14:49, 12 April 2008 (UTC)
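
On the original question of a case bit: both codes do reserve a single bit for letter case, just in different positions -- 0x20 in ASCII and 0x40 in EBCDIC. This can be checked with Python's CP037 codec (CP037 chosen here as a representative EBCDIC variant):

```python
# One bit separates the cases in each code: bit 0x20 in ASCII,
# bit 0x40 in EBCDIC (CP037 shown: a = 0x81, A = 0xC1).
assert ord('a') ^ ord('A') == 0x20                 # ASCII case bit
a, A = 'a'.encode('cp037')[0], 'A'.encode('cp037')[0]
assert a ^ A == 0x40                               # EBCDIC case bit
assert bytes([a | 0x40]).decode('cp037') == 'A'    # force upper case
print('case-bit checks pass')
```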

IBM ASCII support citation request

Can someone find a citation to the following paragraph?

Interestingly, IBM was a chief proponent of the ASCII standardization committee. However, IBM did not have time to prepare ASCII peripherals (such as card punch machines) to ship with its System/360 computers, so the company settled on EBCDIC at the time. The System/360 became wildly successful, and thus so did EBCDIC.