- Claude Elwood Shannon is considered the founding father of the electronic communications age. He was an American mathematical engineer whose work on technical and engineering problems within the communications industry laid the groundwork for both the computer industry and telecommunications. After noticing the similarity between Boolean algebra and telephone switching circuits, Shannon applied Boolean algebra to electrical systems at the Massachusetts Institute of Technology (MIT) in 1940. He then joined the staff of Bell Telephone Laboratories in 1941. While working at Bell Laboratories, he formulated a theory explaining the communication of information and worked on the problem of transmitting information most efficiently. The mathematical theory of communication was the climax of Shannon's mathematical and engineering investigations. An important feature of Shannon's theory was the concept of entropy, which he demonstrated to be a measure of the uncertainty, and hence the information content, of a message.

- Claude Elwood Shannon was born in Gaylord, Michigan, on April 30, 1916, to Claude Elwood and Mabel Wolf Shannon. Shannon's father was a judge in Gaylord, a small Michigan town of about three thousand people. Although the elder Shannon did not work in mathematics, he was mathematically able and knew what he was talking about. Shannon's mother, Mabel, was the principal of the high school in Gaylord. The strongest scientific influence on Shannon came not from his father but from his grandfather, an inventor and farmer who invented a washing machine along with other farm machinery. On March 27, 1949, Shannon married Mary Elizabeth Moore, and together they had three children: Robert James, Andrew Moore, and Margarita Catherine.

- Shannon's large house stood just a few miles from the Massachusetts Institute of Technology. The house is filled with musical instruments: five pianos and 30 other instruments, from piccolos to trumpets. His chess-playing machines include one that moves the pieces with a three-fingered arm, beeps, and makes wry comments. A chair lift that he built to take his three children 600 feet down to the lakeside has been taken down now that they are grown. Shannon's lifelong fascination with balance and controlled instability led him to design a unicycle with an off-center wheel to keep the rider steady while juggling. Shannon has loved juggling since he was a child. In his toy room is a machine with soft beanbag hands that juggles steel balls. His juggling masterpiece is a tiny stage on which three clowns juggle 11 rings, 7 balls, and 5 clubs, all driven by an invisible mechanism of clockwork and rods.

- Shannon was educated at the University of Michigan, where he earned his B.S. degree in 1936. He then went to the Massachusetts Institute of Technology, where he studied both electrical engineering and mathematics, receiving a master's degree and a doctorate. For his master's degree in electrical engineering, he applied George Boole's logical algebra to the problem of electrical switching. At the time, Boole's system for logically manipulating 0 and 1 was little known, but it is now the nervous system of every computer in the world. For his doctorate, he applied mathematics to genetics. Shannon received both his master's degree and his doctorate in 1940.

- He was named a National Research Fellow and spent a year at Princeton's Institute for Advanced Study. In addition to his work at Bell Laboratories, Shannon spent many years teaching at MIT. He was a visiting professor of electrical communication in 1956, and in 1957 he was named professor of communications sciences and mathematics. In 1958 he returned to MIT as Donner Professor of Science, a post he held until he retired. Throughout his life, Shannon received many honors, including the Morris Liebmann Memorial Award in 1949, the Ballantine Medal in 1955, and the Mervin J. Kelly Award of the American Institute of Electrical Engineers in 1962. In addition, he was awarded the National Medal of Science in 1966, as well as the Medal of Honor that same year from the Institute of Electrical and Electronics Engineers. Likewise, he received the Jacquard Award in 1978, the John Fritz Medal in 1983, and the Kyoto Prize in Basic Science in 1985, along with numerous other prizes and more than a dozen honorary degrees. He is also a member of the American Academy of Arts and Sciences, the National Academy of Sciences, the National Academy of Engineering, the American Philosophical Society, and the Royal Society of London.

- Besides his theory of communication, Shannon published a classic paper, "A Symbolic Analysis of Relay and Switching Circuits." This paper pointed out the identity between the two "truth values" of symbolic logic and the binary values 1 and 0 of electronic circuits. Shannon showed how a "logic machine" could be built using switching circuits corresponding to the propositions of Boolean algebra.
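The correspondence Shannon identified can be sketched in a few lines of Python (an illustrative sketch with made-up function names, not Shannon's own notation): switches wired in series behave like logical AND, and switches wired in parallel behave like logical OR, so any Boolean proposition maps to a circuit.

```python
def series(a: bool, b: bool) -> bool:
    """Two switches in series: the circuit conducts only if both are closed (AND)."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel: the circuit conducts if either is closed (OR)."""
    return a or b

def circuit(a: bool, b: bool) -> bool:
    """An example proposition, (A AND B) OR (NOT A), realized as switches."""
    return parallel(series(a, b), not a)

# Enumerate the truth table, with 1 = closed/on and 0 = open/off.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, int(circuit(bool(a), bool(b))))
```

Reading the truth table off the circuit, rather than off the logic, is exactly the identity the paper describes.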

- Shannon joined Bell Telephone Laboratories as a research mathematician in 1941 and worked on the problem of transmitting information most efficiently. He soon recognized the similarity between Boolean algebra and telephone switching circuits. By 1948, Shannon had turned his efforts toward a fundamental understanding of the problem and had evolved a method of expressing information in quantitative form. The fundamental unit of information is a yes-no situation: either something is or it is not. This is easily expressed in Boolean two-value binary algebra by 1 and 0, so that 1 means "on" when the switch is closed and the power is on, and 0 means "off" when the switch is open and the power is off. Under these circumstances, 1 and 0 are binary digits, a phrase that can be shortened to "bits." Thus the unit of information is the bit. More complicated information can be viewed as built up out of combinations of bits. The game of "twenty questions," for example, shows how quite complicated objects can be identified in twenty bits or fewer, using the rules of the game. Something much more elaborate, such as what is seen by the human eye, can also be measured in bits: each cell of the retina might be viewed as recording "light" or "dark" ("yes" or "no"), and it is the combination of these yes-no situations that makes up the complete picture.
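The twenty-questions observation can be checked with a short calculation (the helper name below is ours, not Shannon's): each yes-no answer contributes one bit, so n questions can distinguish at most 2**n objects, and a strategy that halves the candidates each time needs only about log2 of the number of candidates.

```python
import math

# Twenty yes/no answers are twenty bits, enough to single out
# one object among more than a million possibilities.
print(2 ** 20)  # 1048576 distinct objects

def questions_needed(n_objects: int) -> int:
    """Questions (bits) required if each answer halves the candidates."""
    return math.ceil(math.log2(n_objects))

print(questions_needed(1_000_000))  # 20
```

The same counting applies to the retina example: a million light/dark cells amount to a million bits per picture.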

- One of the most important features of Shannon's theory was the concept of entropy, which he demonstrated to be a measure of the uncertainty, and hence the information content, of a message. According to the second law of thermodynamics, formulated in the 19th century, entropy, the degree of randomness in any system, always increases. Because ordinary language is highly redundant, many sentences can be significantly shortened without losing their meaning. Shannon proved that over a noisy channel a signal can always be sent without distortion: if the message is encoded in such a way that it is self-checking, signals will be received with the same accuracy as if there were no interference on the line. A language, for example, has a built-in error-correcting code. This is why a noisy party conversation remains partly intelligible: roughly half the language is redundant. Shannon's methods were soon seen to have applications not only to computer design but to virtually every subject in which language was important, such as linguistics, psychology, cryptography, and phonetics.
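Shannon's entropy can be illustrated with a small sketch (the function name is ours): H = -Σ p·log2(p) gives the average number of bits per symbol a source produces, and it is largest when the outcomes are most uncertain.

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))         # 1.0: a fair coin carries one full bit
print(round(entropy([0.9, 0.1]), 3))  # a biased coin carries less than a bit
print(entropy([1.0]))              # 0.0: a certain outcome carries nothing
```

A source whose symbols are predictable (low entropy) is redundant, which is precisely what lets messages be shortened or made self-checking.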

Achievements:

- Shannon's mathematical theory of communication models any communication system in terms of:
- a source of information, and a transmitting device that transforms the information or "message" into a form suitable for transmission by a particular means.
- the means or channel over which the message is transmitted.
- a receiving device which decodes the message back into some approximation of its original form.
- the destination or intended recipient of the message.
- a source of noise (i.e., interference or distortion) which changes the message in unpredictable ways during transmission.
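The parts above can be sketched as a toy pipeline in Python (all names are illustrative, and a simple triple-repetition code stands in for a "self-checking" encoding): the transmitter encodes the source message, the channel flips bits at random, and the receiver recovers the message for the destination by majority vote.

```python
import random

def transmit(bits):
    """Transmitter: encode each source bit by repeating it three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def channel(bits, p=0.05, seed=0):
    """Noisy channel: flip each transmitted bit with probability p."""
    rng = random.Random(seed)
    return [b ^ (rng.random() < p) for b in bits]

def receive(bits):
    """Receiver: decode each triple by majority vote."""
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

message = [1, 0, 1, 1, 0, 0, 1, 0]   # the source's message
decoded = receive(channel(transmit(message)))
print(decoded == message)            # the destination's copy matches
```

Even when the channel corrupts an occasional bit, the redundancy lets the receiver correct it, which is the point of Shannon's noise-source component.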

- The theory then defines and relates three key quantities:

- the rate at which information is produced at the source.
- the capacity of the channel for handling information.
- the average amount of information in a message of any particular type.
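The relationship between noise and channel capacity can be illustrated for the simplest case, a binary symmetric channel that flips each bit with probability p; its capacity, a standard result of Shannon's theory, is C = 1 - H(p) bits per channel use, where H is the binary entropy function.

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), the entropy of a biased coin."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0: a noiseless channel carries one bit per use
print(bsc_capacity(0.5))   # 0.0: pure noise carries no information at all
```

Shannon showed that as long as the source's rate stays below this capacity, encoding can make the error rate as small as desired.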

- Shannon is a living legend at seventy-nine. What makes him stand out from other mathematicians is that he is never content just to know a topic well. He constantly rearranges it and tries it in different settings until he gets it into a form in which he can explain it, sometimes literally, to the people in the street. He is the founding father of the information age, the man who laid down its most important principles. His contributions are saluted around the world: his work not only helped translate circuit design from an art into a science, but its central tenets underlie modern communication and computing.