Origin of ASCII: A(merican) S(tandard) C(ode for) I(nformation) I(nterchange).
- (obsolete) plural form of ascian
- (computing) Alternative form of ASCII.
- (computing) American Standard Code for Information Interchange.
ascii - Computer Definition
A standard coding scheme oriented toward data processing applications, ASCII was developed in 1963 and modified in 1967 by the American National Standards Institute (ANSI). ASCII employs a 7-bit coding scheme supporting 128 (2^7) characters, which is sufficient for the upper- and lowercase letters of the English alphabet and similarly simple Roman alphabets, Arabic numerals, punctuation marks, a reasonable complement of special characters, and a modest number of control characters. Because ASCII was designed for use in asynchronous communications (involving non-IBM computers, in those days), relatively few control characters were required, making a 7-bit scheme acceptable. IBM mainframes, which were relatively complex, used the 8-bit EBCDIC coding scheme to accommodate the larger complement of control characters they required. Table A-2 shows the ASCII code.

Table A-2: ASCII Code
[Table A-2 omitted: ASCII code chart indexed by bit positions 1-4 (rows) and bit positions 5-7 (columns).]
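The 7-bit layout described above can be checked directly. The following Python sketch (not part of the original entry) confirms the character counts: 128 total codes, of which 33 are control characters (0-31 plus 127, DEL) and 95 are printable.

```python
# 7-bit ASCII: 2**7 = 128 code points in total.
assert 2 ** 7 == 128

# Codes 0-31 (plus 127, the DEL character) are control characters.
control = [c for c in range(128) if c < 32 or c == 127]
assert len(control) == 33

# The remaining 95 codes (32-126) are printable: letters, digits,
# punctuation, and special symbols such as $ and %.
printable = [chr(c) for c in range(32, 127)]
assert len(printable) == 95
assert "A" in printable and "a" in printable and "$" in printable
```

Note that DEL (127) is usually counted as a control character even though it falls outside the 0-31 block reserved for communications and printer control.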
(American Standard Code for Information Interchange) Pronounced "ask-ee," it is the built-in binary code for representing characters in all computers except IBM mainframes, which use the EBCDIC coding system. ASCII was originally developed for communications and uses only seven bits per character, providing 128 combinations that include upper- and lowercase letters, the numeric digits, and special symbols such as $ and %. The first 32 characters are set aside for communications and printer control (see ASCII chart).

A Byte Holds ASCII and Then Some
Since the common storage unit in a computer is an 8-bit byte (256 character combinations) and ASCII uses only the first 128 (0-127), the second set of 128 characters (128-255) is technically not ASCII, but those values are typically used for foreign-language and math symbols. In the first PCs running DOS, they also contained elementary graphics symbols. On the Mac, the additional values can be defined by the user.

ASCII vs. Hex
In technical applications typically used by developers, you may have a choice between entering data in ASCII or "hex" for editing or searching. ASCII is entered by typing regular text, but because there are not enough keys on the keyboard to enter 256 distinct characters, the hexadecimal (hex) numbering system is used instead. Hex is entered by typing only the digits 0 to 9 and the letters A to F, and it provides a precise way of specifying any of the 256 possible values of a byte, whether control codes (0-31) or the upper set (128-255). See hex chart, ASCII file and Unicode.
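The ASCII-versus-hex relationship described above can be sketched in a few lines of Python (the variable names here are illustrative, not from the original entry): the same bytes can be written as regular text or as hex digits, and true ASCII values always fit in the lower 7 bits.

```python
# The ASCII text "ASCII" and its hex spelling name the same bytes.
data = "ASCII".encode("ascii")
hex_form = data.hex().upper()            # e.g. '41' is the code for 'A'
assert bytes.fromhex(hex_form) == data

# Every genuine ASCII code is below 128; bytes 128-255 are not ASCII.
assert all(b < 128 for b in data)

# A character outside ASCII cannot be encoded in the 7-bit scheme.
try:
    "é".encode("ascii")
    raised = False
except UnicodeEncodeError:
    raised = True
assert raised

# Hex covers the full byte range 0-255, including control codes (0-31)
# and the upper set (128-255) that the keyboard cannot type directly.
assert bytes.fromhex("00FF") == bytes([0, 255])
```

Two hex digits always specify exactly one byte, which is why hex editors pair a hex column with an ASCII column for the printable subset.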