Generally speaking, information theory shows that for any particular application there is an optimum code; it does not, unfortunately, tell how to devise the code. Morse code, consisting of a series of dots and dashes, or marks and spaces, is commonly used in telegraphy. In a computer, information is digitally encoded as strings of binary digits, or bits. ASCII, the American Standard Code for Information Interchange, and Unicode are two ways of representing alphanumeric characters in binary form.
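The character encodings mentioned above can be illustrated with a brief sketch: each character is mapped to a numeric code point, which is then written as a string of bits (the helper name below is ours, chosen for illustration).

```python
# Illustrative sketch: characters are mapped to numeric code points,
# which a computer stores as strings of binary digits (bits).
def to_bits(text):
    """Return each character's ASCII/Unicode code point as an 8-bit string."""
    return [format(ord(ch), "08b") for ch in text]

# 'A' has code point 65 in both ASCII and Unicode.
print(to_bits("A"))
```

Real systems use standardized encodings such as UTF-8, which represent code points beyond 127 with multiple bytes; the fixed 8-bit form here is only a simplification.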
Special error-detecting codes are used extensively in digital systems to ensure the successful transfer of data. One method appends an extra bit, called a parity-check bit, to each word of data; since each bit is a 1 or a 0, the sum of the bits in a word can always be made even (or odd) by properly setting the parity bit. Errors are detected on the receiving end simply by checking whether each received word still has an even (or odd) sum. Audio data on a compact disc is digitally encoded, and a special error-correcting code is used to detect and correct errors that may have been introduced by manufacturing defects or during the reading or playing process.
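The even-parity scheme described above can be sketched in a few lines (the function names are ours, for illustration only):

```python
# A minimal sketch of even-parity error detection.
def add_parity(bits):
    """Append a parity bit so the total number of 1s in the word is even."""
    return bits + [sum(bits) % 2]

def check_parity(word):
    """Return True if the received word's 1s sum to an even number."""
    return sum(word) % 2 == 0

word = add_parity([1, 0, 1, 1])  # three 1s, so the parity bit is 1
assert check_parity(word)        # received intact: sum is even
word[2] ^= 1                     # a single bit is flipped in transit
assert not check_parity(word)    # the error is detected
```

Note that a single parity bit detects any odd number of flipped bits but cannot locate the error or detect an even number of flips; error-correcting codes such as those used on compact discs add more redundancy so that errors can be located and repaired, not merely detected.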
Certain arbitrary codes are used to ensure secrecy of communication; the message alone, without the rules by which its symbols are associated, will not provide an eavesdropper with an understandable version of it (see cryptography). See also signaling.
See P. Lunde, ed., The Book of Codes (2009).
The Columbia Electronic Encyclopedia, 6th ed. Copyright © 2012, Columbia University Press. All rights reserved.