ASCII uses only 7 bits rather than all 8 mainly for historical and technical reasons dating from the early days of computing.

Explanation:

  1. Original Design of ASCII:

    • ASCII (the American Standard Code for Information Interchange) was developed in the early 1960s.
    • It was designed to represent text characters and control codes using 7 bits, which allows for 128 different characters (i.e., 2^7 = 128 possible values).
    • These 128 characters include:
      • Control characters (like NUL, BEL, ESC) at codes 0 to 31, plus DEL at code 127.
      • Printable characters (letters, digits, punctuation, and the space) at codes 32 to 126; a short sketch after this list enumerates both ranges.
  2. Why Not Use the 8th Bit?

    • At the time, 7-bit codes were a good fit for the hardware: teletypes and early communication links commonly transmitted 7 data bits per character, and every extra bit per character added transmission cost.
    • The 8th bit was commonly used for parity checking:
      • A parity bit is an extra bit added to a group of bits so that the total number of 1-bits is even (even parity) or odd (odd parity).
      • This gave a simple form of error detection: a single flipped bit changes the parity and can be caught by the receiver (see the parity sketch after this list).
  3. Evolution to 8-bit Systems:

    • As technology advanced, 8-bit systems became more standard, and computers started using 8-bit bytes.
    • Once each character occupied a full byte, the values 128 to 255 that 7-bit ASCII never used became available for additional characters.
    • This led to extended ASCII and other encoding schemes (such as ISO 8859-1 and Windows-1252) that use all 256 possible values of an 8-bit byte (i.e., 2^8 = 256).
    • These extensions added support for accented letters, special symbols, and characters from other languages; a short decoding sketch at the end of this answer shows the difference in practice.
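
To make the counts in item 1 concrete, here is a minimal Python sketch (an illustration only, not part of any ASCII specification) that enumerates the 2^7 = 128 code points and splits them into the control and printable ranges described above:

    # Classify the 128 seven-bit ASCII code points.
    # Control characters: 0-31 plus 127 (DEL); printable characters: 32-126.
    control = [code for code in range(128) if code < 32 or code == 127]
    printable = [code for code in range(128) if 32 <= code <= 126]

    print(len(control))                            # 33 control characters
    print(len(printable))                          # 95 printable characters
    print(len(control) + len(printable) == 2**7)   # True: exactly 128 values

    print(chr(65), chr(97), chr(48))               # 'A', 'a', '0' are printable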

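The parity scheme from item 2 can be illustrated just as briefly. The sketch below uses hypothetical helper names (add_even_parity and check_even_parity are not standard functions) to pack a 7-bit ASCII code and an even-parity bit into one 8-bit byte and to verify it on the receiving side:

    # Even parity: set the 8th bit so the whole byte has an even number of 1-bits.
    def add_even_parity(code: int) -> int:
        assert 0 <= code < 128              # the character must fit in 7 bits
        ones = bin(code).count("1")         # count the 1-bits in the 7-bit code
        parity = ones % 2                   # 1 if that count is odd, else 0
        return (parity << 7) | code         # the parity bit occupies the high (8th) bit

    def check_even_parity(byte: int) -> bool:
        return bin(byte).count("1") % 2 == 0    # total number of 1-bits must be even

    # 'A' is 0x41 = 0b1000001 (two 1-bits), so its parity bit stays 0.
    # 'C' is 0x43 = 0b1000011 (three 1-bits), so its parity bit is set.
    print(hex(add_even_parity(ord("A"))))   # 0x41
    print(hex(add_even_parity(ord("C"))))   # 0xc3
    print(check_even_parity(0xC3))          # True
    print(check_even_parity(0xC2))          # False: a single flipped bit is detected

Parity of this kind only detects an odd number of flipped bits in a byte; two errors cancel out, which is one reason later protocols moved to stronger checksums.
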
Summary

  • ASCII used 7 bits because 128 code points were enough for basic English text and control codes, while leaving the 8th bit free for error checking or other special purposes.
  • The 8th bit was left out of the code itself largely so it could serve as a parity bit for error detection.
  • As computing technology evolved, the full 8-bit range was utilized in extended character sets to accommodate more symbols and international characters.
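
To see the practical difference between strict 7-bit ASCII and an 8-bit extension, the short Python sketch below decodes the byte 0xE9, which is out of range for ASCII but maps to 'é' in ISO 8859-1 (Latin-1):

    data = bytes([0x48, 0x69, 0xE9])    # 'H', 'i', and the 8-bit value 0xE9

    print(data.decode("latin-1"))       # 'Hié': 0xE9 is 'é' in ISO 8859-1

    try:
        data.decode("ascii")            # strict ASCII only covers 0-127
    except UnicodeDecodeError as exc:
        print("not valid 7-bit ASCII:", exc)

The same byte value above 127 can stand for different characters in different 8-bit extensions, which is why the encoding must be known before such bytes can be interpreted.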