What Does It Mean According to Shannon to Transmit Information?
Originally developed by Claude Shannon and Warren Weaver in 1948, this model describes communication as a linear process in which a sender, or speaker, transmits a message to a receiver, or listener. The sender is the source of the message, which may consist of the sounds, words, or behaviours in a communication interaction. The message is transmitted through a channel, the pathway or route for communication, to a receiver, the target or recipient of the message. There may also be obstacles in the communication process, called noise: any interference in the channel or distortion of the message itself. It is a fairly simple model in which a message is simply passed from sender to receiver.
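The sender-channel-receiver-noise pipeline can be made concrete with a small simulation. The sketch below models the simplest case, a binary symmetric channel in which each transmitted bit is independently flipped with some probability; the function name `transmit` and the sample message are illustrative choices, not part of the Shannon-Weaver paper.

```python
import random

def transmit(message_bits, noise_prob=0.1, seed=42):
    """Simulate a noisy channel: each bit is flipped with probability noise_prob.

    A fixed seed makes the 'noise' reproducible for demonstration purposes.
    """
    rng = random.Random(seed)
    return [bit ^ 1 if rng.random() < noise_prob else bit
            for bit in message_bits]

sent = [1, 0, 1, 1, 0, 0, 1, 0]
received = transmit(sent, noise_prob=0.25)
errors = sum(s != r for s, r in zip(sent, received))  # bits corrupted in transit
print(sent, received, errors)
```

With `noise_prob=0.0` the message arrives intact; as the probability rises, the received message increasingly diverges from the one sent, which is precisely the gap Shannon's coding theorems address.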
Classical information science, by contrast, sprang forth about 50 years ago from the work of one remarkable man: Claude E. Shannon. In a landmark paper written at Bell Labs in 1948, Shannon defined in mathematical terms what information is and how it can be transmitted in the face of noise. What had been viewed as quite distinct modes of communication (the telegraph, telephone, radio and television) were unified in a single framework.

Shannon was born in 1916 in Petoskey, Michigan, the son of a judge and a teacher. Among other inventive endeavors, as a youth he built a telegraph out of fencing wire from his house to a friend's. He graduated from the University of Michigan in 1936 with degrees in electrical engineering and mathematics and went to M.I.T., where he worked under computer pioneer Vannevar Bush on an analog computer called the differential analyzer.

Shannon's M.I.T. master's thesis in electrical engineering has been called the most important of the 20th century: in it the 22-year-old Shannon showed how the logical algebra of the 19th-century mathematician George Boole could be implemented using electronic circuits of relays and switches. The most fundamental feature of digital computers' design (the representation of "true" and "false", or "1" and "0", as open or closed switches, and the use of electronic logic gates to make decisions and carry out arithmetic) can be traced back to the insights in Shannon's thesis.

Shannon defined the quantity of information produced by a source (for example, the quantity in a message) by a formula similar to the equation that defines thermodynamic entropy in physics. In its most basic terms, Shannon's informational entropy is the number of binary digits required to encode a message. Today that sounds like a simple, even obvious, way to define how much information is in a message. But in 1948, at the very dawn of the information age, digitizing information of any sort was a revolutionary step.
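Shannon's entropy formula, H = -Σ p_i log2(p_i), gives the average number of bits needed per symbol when each symbol i occurs with probability p_i. A minimal sketch of the calculation, using a message's observed symbol frequencies as the probabilities (the function name `shannon_entropy` and the sample strings are illustrative):

```python
from collections import Counter
from math import log2

def shannon_entropy(message):
    """Average bits per symbol: H = sum(p * log2(1/p)) over symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    # Writing each term as p * log2(1/p) is equivalent to -p * log2(p).
    return sum((c / n) * log2(n / c) for c in counts.values())

print(shannon_entropy("aabb"))  # two equiprobable symbols: 1 bit per symbol
print(shannon_entropy("aaaa"))  # a certain outcome carries no information: 0 bits
```

A message drawn uniformly from four symbols yields 2 bits per symbol, matching the intuition that log2(4) = 2 binary digits suffice to name one of four alternatives.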
His paper was also among the first to use the word "bit," short for binary digit, a term Shannon credited to his colleague John W. Tukey.
Although the use of the word ‘information’, with different meanings, can be traced back to ancient and medieval texts, it is only in the 20th century that the term begins to acquire its present-day sense (Adriaans 2013). Nevertheless, the pervasiveness of the notion of information both in our everyday life and in our scientific practice does not imply agreement about the content of the concept. As Luciano Floridi (2010, 2011) stresses, it is a polysemantic concept associated with different phenomena, such as communication, knowledge, reference, meaning, truth, etc. In the second half of the 20th century, philosophy began to direct its attention to this omnipresent but intricate concept in an effort to unravel the tangle of significances surrounding it. According to a deeply rooted intuition, information is related to data: it has or carries content. In order to elucidate this idea, the philosophy of information has coined the concept of semantic information. At present, the philosophy of information has put on the table a number of open problems related to the concept of information (see Adriaans and van Benthem 2008): the possibility of unifying various theories of information, the question about a logic of information, the relations between information and thermodynamics, the meaning of quantum information, the links between information and computation, among others. Given this wide panoply of open issues, it might be supposed that any question about the meaning and interpretation of Shannon information has a clear and undisputed answer. However, this is not the case. In this paper we will see that, in spite of the agreement concerning the traditional and well-understood formalism, there are many points about Shannon’s theory that still remain obscure or have not been sufficiently stressed. Moreover, the very interpretation of the concept of information is far from unanimous (Floridi 2011).
Ultimately, the crucial question for Communication Studies is: to what extent does the message received correspond to the message transmitted? That is where all the other factors in the communication process come into play. The Shannon-Weaver model, and others like it, tends to portray the message as a relatively uncomplicated matter. Yet frequently the messages have meaning; that is, they refer to or are correlated according to some system with certain physical or conceptual entities. For Shannon, these semantic considerations were irrelevant to the engineering problem. That exclusion may, however, be a fair criticism of applying Shannon's model to the more general area of human-to-human communication.
Adriaans, P. (2013). “Information.” In E. N. Zalta (ed.), The Stanford Encyclopedia of Philosophy (Fall 2013 Edition).
Adriaans, P. and van Benthem, J. (eds) (2008). Handbook of Philosophy of Information. Amsterdam: Elsevier Science Publishers.
Bar-Hillel, Y. (1964). Language and Information: Selected Essays on Their Theory and Application. Reading, MA: Addison-Wesley.
Floridi, L. (2011). The Philosophy of Information. Oxford: Oxford University Press.