| Born | April 30, 1916, Petoskey, Michigan, U.S. |
| Died | February 24, 2001 (aged 84), Medford, Massachusetts, U.S. |
| Spouses | Norma Levor (1940-41); Betty Shannon (1949) |
| Fields | Mathematics and electronic engineering |
| Doctoral advisor | Frank Lauren Hitchcock |
Claude Elwood Shannon (April 30, 1916 - February 24, 2001) was an American mathematician, electrical engineer, and cryptographer known as "the father of information theory". Shannon is noted for having founded information theory with a landmark paper, "A Mathematical Theory of Communication", which he published in 1948.
He is also well known for founding digital circuit design theory in 1937, when--as a 21-year-old master's degree student at the Massachusetts Institute of Technology (MIT)--he wrote his thesis demonstrating that electrical applications of Boolean algebra could construct any logical numerical relationship. Shannon contributed to the field of cryptanalysis for national defense during World War II, including his fundamental work on codebreaking and secure telecommunications.
The Shannon family lived in Gaylord, Michigan, and Claude was born in a hospital in nearby Petoskey. His father, Claude Sr. (1862-1934), was a businessman and, for a while, a judge of probate in Gaylord. His mother, Mabel Wolf Shannon (1890-1945), was a language teacher who also served as the principal of Gaylord High School. Claude Sr. was a descendant of New Jersey settlers, while Mabel was a child of German immigrants.
Most of the first 16 years of Shannon's life were spent in Gaylord, where he attended public school, graduating from Gaylord High School in 1932. Shannon showed an inclination towards mechanical and electrical things. His best subjects were science and mathematics. At home he constructed such devices as models of planes, a radio-controlled model boat and a barbed-wire telegraph system to a friend's house a half-mile away. While growing up, he also worked as a messenger for the Western Union company.
Shannon's childhood hero was Thomas Edison, whom he later learned was a distant cousin. Both Shannon and Edison were descendants of John Ogden (1609-1682), a colonial leader and an ancestor of many distinguished people.
In 1932, Shannon entered the University of Michigan, where he was introduced to the work of George Boole. He graduated in 1936 with two bachelor's degrees: one in electrical engineering and the other in mathematics.
In 1936, Shannon began his graduate studies in electrical engineering at MIT, where he worked on Vannevar Bush's differential analyzer, an early analog computer. While studying the complicated ad hoc circuits of this analyzer, Shannon designed switching circuits based on Boole's concepts. In 1937, he wrote his master's degree thesis, A Symbolic Analysis of Relay and Switching Circuits. A paper from this thesis was published in 1938. In this work, Shannon proved that his switching circuits could be used to simplify the arrangement of the electromechanical relays that were used then in telephone call routing switches. Next, he expanded this concept, proving that these circuits could solve all problems that Boolean algebra could solve. In the last chapter, he presented diagrams of several circuits, including a 4-bit full adder.
Using this property of electrical switches to implement logic is the fundamental concept that underlies all electronic digital computers. Shannon's work became the foundation of digital circuit design, as it became widely known in the electrical engineering community during and after World War II. The theoretical rigor of Shannon's work superseded the ad hoc methods that had prevailed previously. Howard Gardner called Shannon's thesis "possibly the most important, and also the most noted, master's thesis of the century."
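A minimal sketch of the correspondence, assuming nothing beyond the Boolean view described above (the function names and the 4-bit example are illustrative, not Shannon's notation): relay contacts wired in series behave as AND, contacts wired in parallel behave as OR, and such gates compose into the kind of adder that closed his thesis.

```python
# Series relay contacts conduct only if both are closed (AND);
# parallel contacts conduct if either is closed (OR).
def series(a: bool, b: bool) -> bool:
    return a and b

def parallel(a: bool, b: bool) -> bool:
    return a or b

def xor(a: bool, b: bool) -> bool:
    # Exclusive OR built from the primitives above plus negation.
    return parallel(series(a, not b), series(not a, b))

def full_adder(a: bool, b: bool, carry_in: bool) -> tuple[bool, bool]:
    """Return (sum_bit, carry_out) for one binary digit."""
    s = xor(xor(a, b), carry_in)
    carry_out = parallel(series(a, b), series(carry_in, xor(a, b)))
    return s, carry_out

def add_4bit(x: list[bool], y: list[bool]) -> list[bool]:
    """Add two 4-bit numbers, bits given least-significant first."""
    result, carry = [], False
    for a, b in zip(x, y):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result + [carry]

# 5 + 3 = 8: bits listed least-significant first.
print(add_4bit([True, False, True, False], [True, True, False, False]))
```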
Shannon received his PhD from MIT in 1940. Vannevar Bush had suggested that Shannon should work on his dissertation at the Cold Spring Harbor Laboratory, in order to develop a mathematical formulation for Mendelian genetics. This research resulted in Shannon's PhD thesis, called An Algebra for Theoretical Genetics.
In 1940, Shannon became a National Research Fellow at the Institute for Advanced Study in Princeton, New Jersey. In Princeton, Shannon had the opportunity to discuss his ideas with influential scientists and mathematicians such as Hermann Weyl and John von Neumann, and he also had occasional encounters with Albert Einstein and Kurt Gödel. Shannon worked freely across disciplines, and this ability may have contributed to his later development of mathematical information theory.
Shannon then joined Bell Labs to work on fire-control systems and cryptography during World War II, under a contract with section D-2 (Control Systems section) of the National Defense Research Committee (NDRC).
For two months early in 1943, Shannon came into contact with the leading British mathematician Alan Turing. Turing had been posted to Washington to share with the U.S. Navy's cryptanalytic service the methods used by the British Government Code and Cypher School at Bletchley Park to break the ciphers used by the Kriegsmarine U-boats in the north Atlantic Ocean. He was also interested in the encipherment of speech and to this end spent time at Bell Labs. Shannon and Turing met at teatime in the cafeteria. Turing showed Shannon his 1936 paper that defined what is now known as the "Universal Turing machine". This impressed Shannon, as many of its ideas complemented his own.
In 1945, as the war was coming to an end, the NDRC was issuing a summary of technical reports as a last step prior to its eventual closing down. Inside the volume on fire control, a special essay titled Data Smoothing and Prediction in Fire-Control Systems, coauthored by Shannon, Ralph Beebe Blackman, and Hendrik Wade Bode, formally treated the problem of smoothing the data in fire-control by analogy with "the problem of separating a signal from interfering noise in communications systems." In other words, it modeled the problem in terms of data and signal processing and thus heralded the coming of the Information Age.
Shannon's work on cryptography was even more closely related to his later publications on communication theory. At the close of the war, he prepared a classified memorandum for Bell Telephone Labs entitled "A Mathematical Theory of Cryptography", dated September 1945. A declassified version of this paper was published in 1949 as "Communication Theory of Secrecy Systems" in the Bell System Technical Journal. This paper incorporated many of the concepts and mathematical formulations that also appeared in his A Mathematical Theory of Communication. Shannon said that his wartime insights into communication theory and cryptography developed simultaneously and that "they were so close together you couldn't separate them". In a footnote near the beginning of the classified report, Shannon announced his intention to "develop these results ... in a forthcoming memorandum on the transmission of information."
While he was at Bell Labs, Shannon proved that the cryptographic one-time pad is unbreakable in his classified research that was later published in October 1949. He also proved that any unbreakable system must have essentially the same characteristics as the one-time pad: the key must be truly random, as large as the plaintext, never reused in whole or part, and be kept secret.
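A minimal sketch of the one-time pad in Python, using XOR as the combining operation (the common modern formulation rather than Shannon's own notation); the key is drawn from a cryptographically secure source and, per the requirements above, must be as long as the message and never reused.

```python
import secrets

def generate_key(length: int) -> bytes:
    # The key must be truly random, as long as the plaintext,
    # used only once, and kept secret.
    return secrets.token_bytes(length)

def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

plaintext = b"ATTACK AT DAWN"
key = generate_key(len(plaintext))
ciphertext = xor_bytes(plaintext, key)   # encryption
recovered = xor_bytes(ciphertext, key)   # decryption is the same operation
assert recovered == plaintext
```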
In 1948, the promised memorandum appeared as "A Mathematical Theory of Communication", an article in two parts in the July and October issues of the Bell System Technical Journal. This work focuses on the problem of how best to encode the information a sender wants to transmit. In this fundamental work, he used tools in probability theory, developed by Norbert Wiener, which were in their nascent stages of being applied to communication theory at that time. Shannon developed information entropy as a measure of the information content in a message, which is a measure of uncertainty reduced by the message, while essentially inventing the field of information theory. In 1949 Claude Shannon and Robert Fano devised a systematic way to assign code words based on probabilities of blocks. This technique, known as Shannon-Fano coding, was first proposed in the 1948 article.
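For a discrete source with symbol probabilities p_i, Shannon's entropy is H = -Σ p_i log2 p_i bits per symbol. A small illustrative computation (the probabilities below are invented, not taken from Shannon's paper):

```python
from math import log2

def entropy(probabilities: list[float]) -> float:
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less.
print(entropy([0.5, 0.5]))    # 1.0
print(entropy([0.9, 0.1]))    # about 0.469
print(entropy([0.25] * 4))    # 2.0: four equally likely symbols need two bits
```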
The book, co-authored with Warren Weaver, The Mathematical Theory of Communication, reprints Shannon's 1948 article and Weaver's popularization of it, which is accessible to the non-specialist. Warren Weaver pointed out that the word "information" in communication theory is not related to what you do say, but to what you could say. That is, information is a measure of one's freedom of choice when one selects a message. Shannon's concepts were also popularized, subject to his own proofreading, in John Robinson Pierce's Symbols, Signals, and Noise.
Information theory's fundamental contribution to natural language processing and computational linguistics was further established in 1951, in Shannon's article "Prediction and Entropy of Printed English", which gave upper and lower bounds on the entropy of English from its statistics - providing a statistical foundation to language analysis. In addition, he showed that treating whitespace as the 27th letter of the alphabet actually lowers uncertainty in written language, providing a clear quantifiable link between cultural practice and probabilistic cognition.
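A rough way to get the flavor of such estimates is a zeroth-order (single-letter frequency) calculation over a 27-symbol alphabet; the sample text below is an arbitrary placeholder, and this is far cruder than Shannon's prediction experiments, which exploited longer-range context.

```python
from collections import Counter
from math import log2

def unigram_entropy(text: str, include_space: bool) -> float:
    """Zeroth-order entropy estimate (bits per character) from letter frequencies."""
    alphabet = "abcdefghijklmnopqrstuvwxyz" + (" " if include_space else "")
    symbols = [c for c in text.lower() if c in alphabet]
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((n / total) * log2(n / total) for n in counts.values())

sample = "this is a small placeholder sample of printed english text"
print(unigram_entropy(sample, include_space=False))  # 26-letter alphabet
print(unigram_entropy(sample, include_space=True))   # space as the 27th symbol
```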
Another notable paper published in 1949 is "Communication Theory of Secrecy Systems", a declassified version of his wartime work on the mathematical theory of cryptography, in which he proved that all theoretically unbreakable ciphers must have the same requirements as the one-time pad. He is also credited with the introduction of sampling theory, which is concerned with representing a continuous-time signal from a (uniform) discrete set of samples. This theory was essential in enabling telecommunications to move from analog to digital transmissions systems in the 1960s and later.
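In modern notation (not necessarily the form Shannon used), the sampling theorem states that a signal containing no frequencies above B hertz is completely determined by samples spaced T = 1/(2B) seconds apart, and can be reconstructed exactly by sinc interpolation:

```latex
% Whittaker-Shannon interpolation formula, sampling interval T = 1/(2B)
x(t) = \sum_{n=-\infty}^{\infty} x(nT)\,
       \operatorname{sinc}\!\left(\frac{t - nT}{T}\right),
\qquad T = \frac{1}{2B}
```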
In 1956 Shannon returned to MIT to hold an endowed chair, working in the Research Laboratory of Electronics (RLE). He continued to serve on the MIT faculty until 1978.
Outside of Shannon's academic pursuits, he was interested in juggling, unicycling, and chess. He also invented many devices, including a Roman numeral computer called THROBAC, juggling machines, and a flame-throwing trumpet. He built a device that could solve the Rubik's Cube puzzle.
Shannon met his second wife Betty Shannon (née Mary Elizabeth Moore) when she was a numerical analyst at Bell Labs. They were married in 1949. Betty assisted Claude in building some of his most famous inventions. They had three children.
There are six statues of Shannon sculpted by Eugene Daub: one at the University of Michigan; one at MIT in the Laboratory for Information and Decision Systems; one in Gaylord, Michigan; one at the University of California, San Diego; one at Bell Labs; and another at AT&T Shannon Labs. After the breakup of the Bell System, the part of Bell Labs that remained with AT&T Corporation was named Shannon Labs in his honor.
According to Neil Sloane, an AT&T Fellow who co-edited Shannon's large collection of papers in 1993, the perspective introduced by Shannon's communication theory (now called information theory) is the foundation of the digital revolution, and every device containing a microprocessor or microcontroller is a conceptual descendant of Shannon's publication in 1948: "He's one of the great men of the century. Without him, none of the things we know today would exist. The whole digital revolution started with him." The unit shannon is named after Claude Shannon.
The Bit Player, a feature film about Shannon directed by Mark Levinson, premiered at the World Science Festival in 2019. Drawn from interviews conducted with Shannon in his house in the 1980s, the film was released on Amazon Prime in August 2020.
"Theseus", created in 1950, was a mechanical mouse controlled by an electromechanical relay circuit that enabled it to move around a labyrinth of 25 squares. The maze configuration was flexible and it could be modified arbitrarily by rearranging movable partitions. The mouse was designed to search through the corridors until it found the target. Having travelled through the maze, the mouse could then be placed anywhere it had been before, and because of its prior experience it could go directly to the target. If placed in unfamiliar territory, it was programmed to search until it reached a known location and then it would proceed to the target, adding the new knowledge to its memory and learning new behavior. Shannon's mouse appears to have been the first artificial learning device of its kind.
In 1949 Shannon completed a paper (published in March 1950) which estimates the game-tree complexity of chess, which is approximately 10^120. This number is now often referred to as the "Shannon number", and is still regarded today as an accurate estimate of the game's complexity. The number is often cited as one of the barriers to solving the game of chess using an exhaustive analysis (i.e. brute force analysis).
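Shannon's rough argument in the paper was that each pair of moves (one by White, one by Black) admits on the order of 10^3 continuations, and a typical game lasts about 40 such pairs, giving:

```latex
\left(10^{3}\right)^{40} = 10^{120}
```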
On March 9, 1949, Shannon presented a paper called "Programming a Computer for Playing Chess". The paper was presented at the national convention of the Institute of Radio Engineers (IRE) in New York. He described how to program a computer to play chess based on position scoring and move selection. He proposed basic strategies for restricting the number of possibilities to be considered in a game of chess. In March 1950 the paper was published in Philosophical Magazine, and it is considered one of the first articles published on the topic of programming a computer to play chess, and of using a computer to solve the game.
His process for having the computer decide on which move to make was a minimax procedure, based on an evaluation function of a given chess position. Shannon gave a rough example of an evaluation function in which the value of the black position was subtracted from that of the white position. Material was counted according to the usual chess piece relative value (1 point for a pawn, 3 points for a knight or bishop, 5 points for a rook, and 9 points for a queen). He considered some positional factors, subtracting ½ point for each doubled pawn, backward pawn, and isolated pawn; mobility was incorporated by adding 0.1 point for each legal move available.
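A minimal sketch of this kind of evaluation and move selection in Python, using a toy position record rather than a real chess engine; the weights follow the values described above, the position/side names are invented for illustration, and the minimax routine is a generic skeleton rather than Shannon's full procedure.

```python
from dataclasses import dataclass, field

PIECE_VALUES = {"pawn": 1, "knight": 3, "bishop": 3, "rook": 5, "queen": 9}

@dataclass
class Side:
    pieces: dict = field(default_factory=dict)  # e.g. {"pawn": 8, "rook": 2}
    doubled: int = 0      # doubled pawns
    backward: int = 0     # backward pawns
    isolated: int = 0     # isolated pawns
    mobility: int = 0     # number of legal moves available

def side_score(s: Side) -> float:
    material = sum(PIECE_VALUES[p] * n for p, n in s.pieces.items())
    pawn_penalty = 0.5 * (s.doubled + s.backward + s.isolated)
    return material - pawn_penalty + 0.1 * s.mobility

def evaluate(white: Side, black: Side) -> float:
    # Shannon's convention: black's score subtracted from white's,
    # so positive values favour White.
    return side_score(white) - side_score(black)

def minimax(position, depth, white_to_move, children, static_eval):
    """Generic minimax: White maximises the evaluation, Black minimises it."""
    moves = children(position)
    if depth == 0 or not moves:
        return static_eval(position)
    scores = (minimax(p, depth - 1, not white_to_move, children, static_eval)
              for p in moves)
    return max(scores) if white_to_move else min(scores)

# Example evaluation of an invented material/mobility balance.
white = Side(pieces={"pawn": 7, "knight": 2, "bishop": 2, "rook": 2, "queen": 1},
             doubled=1, mobility=34)
black = Side(pieces={"pawn": 8, "knight": 1, "bishop": 2, "rook": 2, "queen": 1},
             isolated=2, mobility=28)
print(evaluate(white, black))   # positive means the position favours White
```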
Shannon formulated a version of Kerckhoffs' principle as "The enemy knows the system". In this form it is known as "Shannon's maxim".
The Shannon centenary, 2016, marked the life and influence of Claude Elwood Shannon on the hundredth anniversary of his birth on April 30, 1916. It was inspired in part by the Alan Turing Year. An ad hoc committee of the IEEE Information Theory Society, including Christina Fragouli, Rüdiger Urbanke, Michelle Effros, Lav Varshney, and Sergio Verdú, coordinated worldwide events. The initiative was announced in the History Panel at the 2015 IEEE Information Theory Workshop in Jerusalem and in the IEEE Information Theory Society Newsletter.
A detailed listing of confirmed events was available on the website of the IEEE Information Theory Society.
Shannon described himself as an atheist and was outwardly apolitical.