The Significance of Shannon's Work
Note: It was reported that Claude Shannon died at the beginning of this year. He was a great man with an amazing mind. His outstanding research earned him the distinguished title of the Father of All Digital Computers. Claude Shannon's work is essential to the idea of Noise to Knowledge engineering. The creation in the 1940s of the subject of information theory is one of the great intellectual achievements of the twentieth century. Information theory has had a fundamental influence on mathematics, particularly on probability theory and ergodic theory. Shannon's mathematics has been a profound contribution to pure mathematics.
Shannon's work was primarily in the context of communication engineering, and it is in this area that his work stands as unique research. It dealt with the measurement of information in the presence of noise. In his paper of 1948 and its sequels, he formulated the modern model of a communication system, distinctive for its generality and its amenability to mathematical analysis. In doing this work he formulated the central problems of theoretical interest and gave brilliant and elegant solutions that are still used today.
Shannon saw communication processes as stochastic. His work did not include the Noise to Knowledge concept because the semantic meaning of information plays no role in the theory. In the Shannon paradigm, information from a "source" is defined as a stochastic process which must be transmitted through a "channel." This channel is defined by a transition probability law relating the channel output to the input. A system designer is allowed to place a device called an "encoder" between the source and channel which can introduce a fixed but finite coding delay. A "decoder" can be placed at the output of the channel. His theory seeks to answer questions such as how rapidly and how reliably information from the source can be transmitted over the channel, and how to optimize the performance of that encoder/decoder.
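The transition probability law described above can be made concrete with the binary symmetric channel, the textbook example of a Shannon channel: each bit is flipped independently with a fixed crossover probability. The sketch below is illustrative only (the parameter names and the simulation are not from Shannon's paper):

```python
import random

def bsc(bits, p, rng):
    # Binary symmetric channel: each bit is flipped independently
    # with crossover probability p (the transition probability law).
    return [b ^ (rng.random() < p) for b in bits]

rng = random.Random(0)
msg = [rng.randint(0, 1) for _ in range(10_000)]
received = bsc(msg, 0.1, rng)
error_rate = sum(a != b for a, b in zip(msg, received)) / len(msg)
print(error_rate)  # empirically close to the crossover probability 0.1
```

The encoder and decoder of the Shannon paradigm would sit on either side of `bsc`, adding redundancy before the channel and removing errors after it.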
By creating this model, Shannon gave elegant answers to such questions. His solution had two parts.
First, he established a fundamental limit: for a given source and channel, it is impossible to achieve a fidelity or reliability level better than a certain value. Second, he showed that with large enough encoder delays it is possible to achieve performance essentially as good as that fundamental limit. To do this, however, the encoder would have to make use of a complex code that was not implementable in practice at that time.
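For the binary symmetric channel, the fundamental limit has a famous closed form: the capacity is C = 1 - H2(p) bits per channel use, where H2 is the binary entropy function. A small sketch (illustrative, with hypothetical function names):

```python
from math import log2

def h2(p):
    # Binary entropy function H2(p), in bits.
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    # Shannon capacity of a binary symmetric channel with
    # crossover probability p: C = 1 - H2(p) bits per channel use.
    return 1 - h2(p)

print(bsc_capacity(0.0))          # noiseless channel: 1 bit per use
print(bsc_capacity(0.5))          # pure noise: 0 bits per use
print(bsc_capacity(0.11))         # roughly half a bit per use
```

No encoder/decoder, however complex, can transmit reliably above this rate; with enough coding delay, any rate below it is achievable.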
Perhaps Shannon's most brilliant insight was the separation of the problem in which the encoder must take both the source and channel into account into two distinct coding problems. He showed that, with no loss of generality, one can study the source and channel separately and assume that they are connected by a binary digital interface. The source encoder/decoder is designed to optimize the source-to-digital performance, and the channel encoder/decoder to optimize the performance of the channel as a transmitter of digital data. Solutions of the source and channel problems then lead immediately to the solution of the original joint source-channel problem. The fact that a digital interface between the source and channel can be optimized in this manner has profound implications in the modern era of digital storage and communication of all types of information.
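The separation principle can be sketched as a pipeline: a source coder maps symbols to bits, and an independent channel coder protects those bits. The toy codes below (a fixed 2-bit source code for a hypothetical 4-symbol alphabet, and a rate-1/3 repetition channel code) are far from optimal and serve only to show the digital interface between the two stages:

```python
import random

# Hypothetical 4-symbol source with a fixed 2-bit source code.
SYMBOLS = "abcd"
ENC = {s: [(i >> 1) & 1, i & 1] for i, s in enumerate(SYMBOLS)}
DEC = {tuple(v): s for s, v in ENC.items()}

def source_encode(text):
    return [b for s in text for b in ENC[s]]

def source_decode(bits):
    return "".join(DEC[tuple(bits[i:i + 2])] for i in range(0, len(bits), 2))

def channel_encode(bits, r=3):
    # Rate-1/3 repetition code: repeat each bit r times.
    return [b for b in bits for _ in range(r)]

def channel_decode(bits, r=3):
    # Majority vote over each group of r received bits.
    return [int(sum(bits[i:i + r]) > r // 2) for i in range(0, len(bits), r)]

def bsc(bits, p, rng):
    # Binary symmetric channel with crossover probability p.
    return [b ^ (rng.random() < p) for b in bits]

rng = random.Random(1)
msg = "abcdabcdaabb"
sent = channel_encode(source_encode(msg))
recv = bsc(sent, 0.02, rng)
out = source_decode(channel_decode(recv))
print(out)  # usually recovers msg for small crossover probability
```

Note that the channel coder never inspects the source symbols, and the source coder knows nothing of the channel: they meet only at the binary interface, exactly as the separation theorem permits.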
Thus the revolutionary elements of Shannon's contribution were the invention of the source-encoder-channel-decoder-destination model and the elegant, remarkably general solutions of the fundamental problems he was able to pose in terms of this model. Equally significant were the demonstration of the power of coding with delay in a communication system, the separation of the source and channel coding problems, and the mathematics for establishing fundamental natural limits on communication.
Shannon introduced several new mathematical concepts. Primary among these are the notion of the entropy of a random variable (and, by extension, of a random sequence), the mutual information between two random variables or sequences, and a calculus that relates these quantities and their derivatives. He also achieved success with his technique of random coding, in which he showed that an encoder chosen at random from the universe of possible encoders will, with high probability, give essentially optimal performance.
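The two central quantities named above have simple definitions for discrete distributions: H(X) = -Σ p(x) log2 p(x), and I(X;Y) = H(X) + H(Y) - H(X,Y). A minimal sketch (the dictionary representation of the joint distribution is an illustrative choice, not standard notation):

```python
from math import log2

def entropy(p):
    # H(X) = -sum p(x) log2 p(x), in bits; zero-probability terms contribute 0.
    return -sum(q * log2(q) for q in p if q > 0)

def mutual_information(joint):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), from a joint distribution
    # given as a dict {(x, y): probability}.
    px, py = {}, {}
    for (x, y), q in joint.items():
        px[x] = px.get(x, 0) + q
        py[y] = py.get(y, 0) + q
    return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(entropy([0.9, 0.1]))   # biased coin: less than 1 bit
# A uniform bit sent through a noiseless channel (Y = X) carries 1 bit:
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))
```

When X and Y are independent the mutual information is zero; when Y determines X it equals H(X), which is why it measures how much the channel output tells us about the input.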
Shannon's work provides a crucial knowledge base for the discipline of communication engineering. The communication model is general enough that the fundamental limits and general intuition provided by Shannon theory give an extremely useful "roadmap" to designers of communication and information storage systems. For example, the theory tells us that English text is not compressible to fewer than about 1.5 binary digits per English letter, no matter how complex and clever the encoder/decoder. (Shannon deliberately took a narrow, engineering perspective on communication.)
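A crude version of this compressibility bound can be computed from letter frequencies alone. The zeroth-order estimate below (an illustrative sketch on a tiny sample) gives roughly 4 bits per letter for English; the much lower figure of about 1.5 bits cited above comes from also exploiting the statistical dependence between neighboring letters, which this simple estimate ignores:

```python
from collections import Counter
from math import log2

def letter_entropy(text):
    # Zeroth-order entropy estimate: bits per letter if each letter
    # were drawn independently from its empirical frequency.
    counts = Counter(text)
    n = len(text)
    return -sum(c / n * log2(c / n) for c in counts.values())

sample = "the theory tells us that english text is not compressible"
print(letter_entropy(sample))  # single-letter estimate, in bits per letter
```

Any lossless compressor is bounded below by the true entropy rate of the source, so better statistical models of English yield tighter (lower) estimates, approaching the limit Shannon identified.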
Shannon's theory showed how to design more efficient communication and storage systems by demonstrating the enormous gains achievable through coding and by guiding the correct design of coding systems. All sophisticated coding schemes used in modern systems owe their basis to the insight provided by Shannon theory. As time goes on, and as our ability to implement more and more complex processors increases, that theory becomes ever more relevant to communications.
Copyright (c) 2001-2007 RDFollendoreIII All Rights Reserved