How Does Shannon Define Information?
The bandwidth falls to zero. The extraordinary pattern of information processing that was Claude Shannon finally succumbed to the depredations of thermodynamic entropy in February 2001. But some of the signal generated by Shannon lives on, expressed in the information technology in which our own lives are now immersed. A year after he launched information theory, Shannon published a paper proving that unbreakable cryptography was possible. (He had done this work in 1945, but at the time it was classified.) The scheme is called the one-time pad, or the Vernam cipher, after Gilbert Vernam, who invented it near the end of World War I. The idea is to encode the message with a random series of digits (the key) so that the encoded message is itself completely random. The catch is that the key must be as long as the message to be encoded, and no key may ever be used twice. Shannon's contribution was to prove rigorously that this code is unbreakable. To this day, no other encryption scheme is known to be unbreakable. The problem with the one-time pad (so called because an agent would carry a copy of the key on a pad and destroy each page of digits after it was used) is that the two parties to the communication must each have a copy of the key, and the key must be kept secret from spies or eavesdroppers. Quantum cryptography solves that problem. More properly called quantum key distribution, the technique uses quantum mechanics and entanglement to generate a random key that is identical at each end of the quantum communications channel. The quantum physics ensures that no one can eavesdrop and learn anything about the key: any surreptitious measurement would disturb subtle correlations that can be checked, much like the error-correction checks performed on data transmitted over a noisy communications line.
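The one-time pad described above can be sketched in a few lines. A standard modern realization of Vernam's scheme XORs the message bytes with an equally long random key; decryption is the same XOR applied again. This is only an illustrative sketch (the message and variable names are invented here), but it shows why the key must be as long as the message and must never be reused:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR two equal-length byte strings (Vernam/one-time-pad operation)."""
    if len(data) != len(key):
        raise ValueError("key must be exactly as long as the message")
    return bytes(d ^ k for d, k in zip(data, key))

message = b"ATTACK AT DAWN"               # illustrative plaintext
key = secrets.token_bytes(len(message))   # random key, same length, used once

ciphertext = xor_bytes(message, key)      # encrypt
recovered = xor_bytes(ciphertext, key)    # decrypt with the same key

assert recovered == message
```

Because the key bytes are uniformly random, every possible plaintext of the same length is equally consistent with a given ciphertext, which is the intuition behind Shannon's proof of perfect secrecy. Reusing a key breaks this: XORing two ciphertexts encrypted under the same key cancels the key and exposes the XOR of the two plaintexts.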
Again, energy is the most conspicuous example: since it is present in essentially all the theories of physics, it is not tied to any one in particular; it has different physical manifestations in different domains; nevertheless, the concept of energy is perhaps the physical concept par excellence. Mutatis mutandis, the same can be said of information, so its theoretical neutrality is not an insurmountable obstacle to interpreting the concept in physical terms. The difference between the epistemic and the physical interpretations of information is not merely nominal, but may yield different conclusions regarding certain common physical situations. For instance, in the influential philosophical tradition that explains scientific observation in terms of information (Shapere 1982, Brown 1987, Kosso 1989), the way in which information is conceived leads to very different consequences regarding observation. This becomes particularly clear in the so-called ‘negative experiments’ (see Jammer 1974), in which it is assumed that an object or event has been observed by noting the absence of some other object or event.
I can only invite you to go further and learn more. This is what is commonly called opening your mind. I am going to conclude with this, but in Shannon's language: increase the entropy of your thoughts!
Shapere, D. (1982). “The Concept of Observation in Science and Philosophy.” Philosophy of Science, 49: 485–525.
Kosso, P. (1989). Observability and Observation in Physical Science. Dordrecht: Kluwer.
Stonier, T. (1990). Information and the Internal Structure of the Universe: An Exploration into Information Physics. New York-London: Springer.
Timpson, C. (2006). “The Grammar of Teleportation.” The British Journal for the Philosophy of Science, 57: 587–621.