How Does Shannon Define Information?

In Shannon's theory, ‘information’ is fully determined by the probability distribution on the set of possible messages and is unrelated to the meaning, structure, or content of individual messages. In many cases this is problematic, since the distribution generating the outcomes may be unknown to the observer or, worse, may not exist at all.

For example, can we answer a question like “what is the information in this book” by viewing it as an element of a set of possible books with a probability distribution on it? This seems unlikely. Kolmogorov complexity provides a measure of information that, unlike Shannon's, does not rely on (often untenable) probabilistic assumptions, and that takes into account the phenomenon that ‘regular’ strings are compressible.
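The contrast above can be made concrete with a short sketch in Python (the particular strings are illustrative, not from the source): Shannon entropy depends only on the probability distribution, while compressed length serves as a practical stand-in for Kolmogorov complexity, which is uncomputable in general.

```python
import math
import random
import zlib

def shannon_entropy(probs):
    """H(X) = -sum p * log2(p) over the distribution's probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Entropy sees only probabilities, never the content of messages.
print(shannon_entropy([0.5, 0.5]))  # fair coin: exactly 1 bit per toss
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.469 bits per toss

# Compressed length as a rough proxy for Kolmogorov complexity:
# a 'regular' string compresses far better than a random one of
# the same length.
regular = b"ab" * 500
noise = bytes(random.getrandbits(8) for _ in range(1000))
print(len(zlib.compress(regular)))  # a handful of bytes
print(len(zlib.compress(noise)))    # close to the original 1000 bytes
```

Note that the second half needs no probabilistic model of where the string came from, which is exactly the advantage claimed for Kolmogorov complexity above.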

In the 1990s, in one of life's tragic ironies, Shannon came down with Alzheimer's disease, which could be described as the insidious loss of information in the brain. The communications channel to one's memories--one's past and one's very personality--is progressively degraded until every effort at error correction is overwhelmed and no meaningful signal can pass through. The bandwidth falls to zero. The extraordinary pattern of information processing that was Claude Shannon finally succumbed to the depredations of thermodynamic entropy in February 2001. But some of the signal generated by Shannon lives on, expressed in the information technology in which our own lives are now immersed.

A year after he founded information theory, Shannon published a paper proving that unbreakable cryptography was possible. (He did this work in 1945, but at the time it was classified.) The scheme is called the one-time pad, or the Vernam cipher, after Gilbert Vernam, who had invented it near the end of World War I. The idea is to encode the message with a random series of digits--the key--so that the encoded message is itself completely random. The catch is that one needs a random key that is as long as the message to be encoded, and one must never use any key twice. Shannon's contribution was to prove rigorously that this code is unbreakable. To this day, no other encryption scheme is known to be unbreakable.

The problem with the one-time pad (so called because an agent would carry his copy of the key on a pad and destroy each page of digits after it was used) is that the two parties to the communication must each have a copy of the key, and the key must be kept secret from spies and eavesdroppers.

Quantum cryptography solves that problem. More properly called quantum key distribution, the technique uses quantum mechanics and entanglement to generate a random key that is identical at each end of the quantum communications channel. Quantum physics ensures that no one can eavesdrop and learn anything about the key: any surreptitious measurement would disturb subtle correlations that can be checked, much like the error-correction checks performed on data transmitted over a noisy line.
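The mechanics of the Vernam cipher are simple enough to sketch in a few lines of Python (the message here is a made-up example; a real deployment would need a truly random, pre-shared key that is never reused):

```python
import secrets

def vernam(message: bytes, key: bytes) -> bytes:
    """XOR each message byte with the corresponding key byte.

    The key must be at least as long as the message, truly random,
    and used only once -- the conditions Shannon's proof requires.
    """
    if len(key) < len(message):
        raise ValueError("key must be at least as long as the message")
    return bytes(m ^ k for m, k in zip(message, key))

msg = b"attack at dawn"                 # hypothetical plaintext
key = secrets.token_bytes(len(msg))     # one fresh page of the pad
ciphertext = vernam(msg, key)

# Decryption is the same XOR with the same key: (m ^ k) ^ k == m.
assert vernam(ciphertext, key) == msg
```

Because the key is uniformly random, every possible plaintext of the same length is equally consistent with the ciphertext, which is the intuition behind Shannon's unbreakability proof.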

However, this conclusion is not unavoidable. Many physical concepts have, through the evolution of the discipline, undergone a process of abstraction and generalization, so that at present they are no longer tied to a specific theory but permeate the whole of physics. Energy is the most conspicuous example: present in essentially all the theories of physics, it is tied to none in particular and has different physical manifestations in different domains; nevertheless, energy is perhaps the physical concept par excellence. Mutatis mutandis, the same can be said of information, and therefore its theoretical neutrality is not an insurmountable obstacle to interpreting the concept in physical terms. The difference between the epistemic and the physical interpretations of information is not merely nominal, but may yield different conclusions regarding certain common physical situations. For instance, in the influential philosophical tradition that explains scientific observation in terms of information (Shapere 1982, Brown 1987, Kosso 1989), how information is conceived leads to very different consequences for observation.

This becomes particularly clear in the so-called ‘negative experiments’ (see Jammer 1974), in which an object or event is taken to have been observed by noting the absence of some other object or event.

Overall, Shannon did not only write the 1948 paper. In fact, his first major breakthrough came while he was a Master's student at MIT. His thesis is by far the most influential Master's thesis of all time: it showed how Boolean algebra could be exploited to build machines that compute anything. In other words, in his Master's thesis, Shannon drew the blueprints of the computer. Shannon also made crucial progress in cryptography and artificial intelligence, and I can only invite you to go further and learn more. This is what is commonly called opening your mind.
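To give a flavor of the idea in Shannon's thesis, that Boolean operations suffice to compute, here is a half adder, one-bit addition built purely from XOR and AND. (The Python rendering is mine for illustration; Shannon worked with relay and switching circuits, not software.)

```python
def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits using only Boolean operations.

    sum bit   = a XOR b
    carry bit = a AND b
    Chaining such adders yields arithmetic on numbers of any width,
    which is the sense in which Boolean algebra underlies computers.
    """
    return a ^ b, a & b

# Truth table: the two output bits always encode a + b.
for a in (0, 1):
    for b in (0, 1):
        s, carry = half_adder(a, b)
        print(a, "+", b, "=", 2 * carry + s)
```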

I’m going to conclude with this, but in Shannon’s language… Increase the entropy of your thoughts!

Shapere, D. (1982). “The Concept of Observation in Science and Philosophy.” Philosophy of Science, 49: 485-525.

Kosso, P. (1989). Observability and Observation in Physical Science. Dordrecht: Kluwer.

Stonier, T. (1990). Information and the Internal Structure of the Universe: An Exploration into Information Physics. New York-London: Springer.

Timpson, C. (2006). “The Grammar of Teleportation.” The British Journal for the Philosophy of Science, 57: 587-621.
