Claude Shannon: short biography, interesting facts, and his information theory. Claude Shannon as the founder of statistical information theory

Claude Elwood Shannon was a leading American scientist in the fields of mathematics, engineering, and cryptanalysis.

He gained worldwide fame for his discoveries in information technology and for introducing the “bit” (1948) as the smallest unit of information. He is considered the founder of information theory, whose main principles remain relevant to high-tech and modern communications.

Shannon also introduced the concept of “entropy,” which denotes the measure of uncertainty in transmitted information.

He was the first to apply a scientific approach to the idea of information and to the laws of cryptography, substantiating his views in works on the mathematical theory of communication and on the theory of communication in secret systems.

He also made a great contribution to the development of cybernetics, providing such key elements as probabilistic schemes, the game-theoretic approach, and ideas on the creation of automata and control systems.

Childhood and adolescence

Claude Shannon was born on April 30, 1916, in Petoskey, Michigan, USA.

The father of the future scientist practiced law and was later appointed a judge. His mother taught foreign languages and eventually became a school principal in Gaylord.

Shannon Sr. had mathematical inclinations, but the key role in shaping Claude's bent for scientific activity was played by his grandfather, a farmer and inventor.

The grandfather's inventions included a washing machine and several kinds of applied agricultural machinery. It is also noteworthy that Thomas Edison was a distant relative of the family.

At the age of 16, Claude graduated from the high school where his mother taught. He managed to work as a courier for Western Union and occupied himself with designing various devices.

He was interested in aircraft modeling and radio equipment, and in repairing small radio sets. With his own hands he built a radio-controlled boat and a telegraph line for communicating with a friend.

By Claude's own account, the only things that did not interest him were politics and faith in God.

Student years

The University of Michigan opened its doors to Shannon in 1932. His studies there exposed him to the works of George Boole. In 1936 Claude received his bachelor's degree in mathematics and electrical engineering.

His first job was as a research assistant at the Massachusetts Institute of Technology, where he worked as an operator of the mechanical computing machine built by his teacher Vannevar Bush.

Having delved deeply into Boole's conceptual work, Shannon realized that it could be applied in practice. In 1937 he defended his master's thesis and prepared from it a paper on the symbolic analysis of relay and switching circuits; his later doctoral work was supervised by Frank L. Hitchcock. He subsequently moved to the famous Bell Telephone Laboratories.

The paper was published in 1938 in the journal of the American Institute of Electrical Engineers.

The article's main contribution was to improve telephone call routing by replacing electromechanical relays with switching circuits. The young scientist demonstrated that circuits of switches could realize any relationship of Boolean algebra and, conversely, that Boolean algebra could describe any switching circuit.
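
A minimal Python sketch of the correspondence Shannon established (illustrative only; the function names are ours, not Shannon's): relay contacts wired in series behave like Boolean AND, contacts wired in parallel like Boolean OR.

```python
# Relay contacts modelled as Boolean values: True = closed, False = open.

def series(a, b):
    """Two contacts in series conduct only if both are closed: Boolean AND."""
    return a and b

def parallel(a, b):
    """Two contacts in parallel conduct if either is closed: Boolean OR."""
    return a or b

# Any Boolean expression maps to a circuit and vice versa; e.g., an XOR built
# from series/parallel combinations of normal (x) and inverted (not x) contacts:
def xor(a, b):
    return parallel(series(a, not b), series(not a, b))

for a in (False, True):
    for b in (False, True):
        print(a, b, "->", xor(a, b))
```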

This work earned Shannon the Alfred Noble Prize of the American engineering societies (1940), not to be confused with the Nobel Prize, and became the basis for the design of logical digital circuits. The master's thesis proved a real scientific breakthrough of the twentieth century, laying the foundation for modern electronic computing.

Bush recommended that Shannon pursue a doctoral dissertation in mathematics. The dissertation devoted serious attention to mathematical research in close connection with Mendel's laws of genetic inheritance. But this work never received due recognition and was first published only in 1993.

Shannon devoted much effort to building a mathematical foundation for various disciplines, above all information technology. This was facilitated by his contact with the prominent mathematician Hermann Weyl, as well as with John von Neumann, Einstein, and Gödel.

War period

From the spring of 1941 until 1956, Claude Shannon worked for US defense, developing fire-control and enemy-detection systems for air defense. He also helped create a secure intergovernmental communication link between the US President and the British Prime Minister.

He was awarded the National Research Award for his paper on the design of two-terminal switching circuits (1942).

In 1943 the scientist became interested in the ideas of the Englishman Alan Turing on speech encryption, and in 1945 he published a work on data smoothing and prediction for fire-control systems, co-authored with Ralph B. Blackman and Hendrik Bode. By modeling a special system that processed information and signals, they ushered in the information age.

Shannon's secret memorandum on the mathematical theory of cryptography (1945) proved that cryptography and communication theory are inseparable.

Post-war period

This period was marked by his 1948 memorandum on the mathematical theory of communication, concerning the encoding of transmitted texts.

Shannon's subsequent work applied information theory to games, in particular to the roulette wheel, to a mind-reading machine, and to solving the Rubik's cube.

The scientist also realized the idea of compressing information in such a way that it is not lost when unpacked.

Shannon founded a school, periodically holding seminars in which he taught students to find new approaches to solving problems.

His research is also well known in financial mathematics, including an electrical-circuit model of the flow of money in American pension funds and a rationale for choosing an investment portfolio when allocating monetary assets.

Many compare Claude Shannon's fame to that of Isaac Newton.

After retiring in 1978, he took up the theory of juggling and designed a special juggling machine.

A collection of Shannon's articles, comprising 127 of his scientific works, was published in 1993.

The final stage of life

He spent his last years in a Massachusetts nursing home because of Alzheimer's disease. There, according to his wife Mary Elizabeth, Claude took part in research into methods of treating the disease.

His whole family was constantly at his side. He died on February 24, 2001.

Shannon was survived by his wife, whom he had married in March 1949, and their three children: Robert, Andrew, and Margarita.

In the 1940s, the American scientist Claude Shannon, who specialized in questions of channel capacity and message encoding, gave the measure of the quantity of information a more universal form: the amount of information came to be understood as the amount by which the total entropy of a system decreases when that system receives information. His formula expresses the entropy of a message as a sum of probabilities multiplied by their logarithms, and relates only to the entropy (uncertainty) of the message.
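
In modern notation, for a message source whose symbols occur with probabilities $p_1, \dots, p_n$, the formula reads

$$H = -\sum_{i=1}^{n} p_i \log_2 p_i \quad \text{(bits per symbol).}$$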

Entropy – a quantitative measure of the uncertainty that is removed when information is obtained.

In other words, the information content of a message is inversely related to its obviousness, predictability, and probability: the less predictable, obvious, and likely a message is, the more information it carries for the recipient. A completely obvious message (one with probability equal to 1) is as empty as no message at all (i.e., a message whose probability is obviously equal to 0). Both, on Shannon's assumption, are uninformative and convey nothing to the recipient. For a number of mathematical reasons, and for convenience of formalization, Shannon describes the entropy of a message as a function of the distribution of random variables.
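
A minimal Python sketch (illustrative only) makes the point concrete: a fair coin is maximally unpredictable and carries one bit per toss, a heavily biased coin carries far less, and a certain event carries nothing.

```python
import math

def entropy(probs):
    """Shannon entropy in bits; zero-probability outcomes contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))    # fair coin: 1.0 bit (maximally unpredictable)
print(entropy([0.99, 0.01]))  # heavily biased coin: ~0.08 bits
print(entropy([1.0]))         # certain event: 0.0 bits (an "empty" message)
```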

The article “A Mathematical Theory of Communication” was published in 1948 and made Claude Shannon world famous. In it, Shannon set out the ideas that later became the basis of modern theories and techniques for the processing, transmission, and storage of information. The results of his work on the transmission of information through communication channels launched a huge number of studies around the world. Shannon generalized Hartley's ideas and introduced the concept of the information contained in transmitted messages. As a measure of the information in a transmitted message M, Hartley had proposed using a logarithmic function. Shannon was the first to consider transmitted messages and noise in communication channels from a statistical point of view, treating both finite and continuous sets of messages.

The information theory developed by Shannon helped solve the main problems of message transmission, namely eliminating the redundancy of transmitted messages and coding messages for transmission over noisy communication channels.

Solving the redundancy problem for the message to be transmitted allows the communication channel to be used with maximum efficiency. For example, modern, widely used methods for reducing redundancy in television broadcasting systems make it possible to transmit up to six digital television programs in the frequency band occupied by a single conventional analog television signal.
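
As an illustration of redundancy removal, here is a toy Huffman-coding sketch in Python (not the broadcast codecs referred to above, but the same principle): frequent symbols receive short codewords, so the encoded length approaches the entropy limit.

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a prefix code; frequent symbols get shorter codewords."""
    # Heap entries: (frequency, tie-breaker, {symbol: codeword}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)   # two least frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

text = "abracadabra"
code = huffman_code(text)
encoded = "".join(code[s] for s in text)
print(code)
print(len(encoded), "bits vs", 8 * len(text), "bits in a fixed 8-bit encoding")
```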

Solving the problem of message transmission over noisy communication channels showed that, at a given ratio of useful-signal power to noise power at the receiving point, messages can be transmitted over the channel with an arbitrarily low probability of error. This same ratio determines the channel capacity. Reliable transmission is ensured by the use of interference-resistant codes, provided that the rate of message transmission over the channel is kept below its capacity.
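
This relationship between bandwidth, signal-to-noise ratio, and capacity is the Shannon-Hartley theorem, C = B·log2(1 + S/N). A small Python sketch (the 8 MHz band and 30 dB SNR figures are illustrative assumptions, roughly corresponding to one analog TV channel):

```python
import math

def channel_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley: maximum error-free bit rate of a noisy channel."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Assumed figures: an 8 MHz band (about one analog TV channel) at 30 dB SNR.
c = channel_capacity(8e6, 30)
print(f"Capacity: {c / 1e6:.1f} Mbit/s")  # ~79.7 Mbit/s
```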

Claude Shannon's brief biography and interesting facts from the life of the American engineer, cryptanalyst and mathematician, the father of the information age, are presented in this article.

Claude Shannon short biography

Claude Elwood Shannon was born on April 30, 1916, in the town of Petoskey, Michigan. His father was a lawyer, and his mother taught foreign languages. In 1932 the young man graduated from high school, having also been educated at home. Claude's father constantly bought his son amateur radio kits and construction sets, encouraging his technical creativity, while his older sister gave him in-depth lessons in mathematics. His love of technology and mathematics was therefore no surprise.

In 1932, the future scientist entered the University of Michigan, graduating in 1936 with a bachelor's degree in mathematics and electrical engineering. At the university he read George Boole's works “The Calculus of Logic” and “The Mathematical Analysis of Logic,” which largely determined his future scientific interests.

Soon he was invited to work at the Massachusetts Institute of Technology as a research assistant in the electrical engineering laboratory. Shannon worked on upgrading an analog computer, Vannevar Bush's differential analyzer.

In 1936, Claude enrolled in a master's program, and a year later he wrote his thesis. Based on it, he prepared the article “A Symbolic Analysis of Relay and Switching Circuits,” published in 1938 in the journal of the American Institute of Electrical Engineers. The article attracted the interest of the electrical engineering community, and in 1940 he was awarded the Alfred Noble Prize for it. Even before completing his master's thesis, Shannon had begun work on a doctorate in mathematics touching on problems of genetics; the dissertation was entitled “An Algebra for Theoretical Genetics.”

In 1941, at the age of 25, he began working in the mathematics department of the Bell Laboratories research center. By that time hostilities were under way in Europe, and America financed Shannon's research in the field of cryptography. He pioneered the analysis of encrypted texts by information-theoretic methods. In 1945 the scientist completed a large classified report, “A Mathematical Theory of Cryptography.”

What contributions did Claude Shannon make to computer science?

In his research the scientist developed the concepts of information theory. In 1948 Shannon published “A Mathematical Theory of Communication,” which modeled mathematically a source of information, a communication channel for its transmission, and a receiver; all that remained was to translate it into simpler language and convey the achievement to humanity. Claude Shannon introduced the concept of information entropy as the measure of the amount of information, with the bit as its unit. The scientist said that a mathematician had advised him to use the term entropy. Claude Shannon formulated six conceptual theorems that are the foundation of his information theory (two of them are stated formally after the list):

  • Theorem for quantitative assessment of information.
  • Theorem for rational packing of symbols during primary encoding.
  • Theorem for matching the flow of information with the capacity of a communication channel without interference.
  • Theorem for matching the information flow with the capacity of a binary communication channel with noise.
  • Theorem for estimating the capacity of a continuous communication channel.
  • Theorem for error-free reconstruction of a continuous signal.
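
In modern notation, a hedged paraphrase of two of these results: the source coding theorem bounds the average codeword length $\bar{L}$ of any uniquely decodable code from below by the source entropy, and the noisy-channel coding theorem identifies the capacity below which arbitrarily reliable transmission is possible:

$$\bar{L} \ge H(X), \qquad C = \max_{p(x)} I(X;Y),$$

where $I(X;Y)$ is the mutual information between channel input and output; for every rate $R < C$ there exist codes whose error probability tends to zero.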

In 1956 the scientist left Bell Laboratories and took up a professorship at the Massachusetts Institute of Technology in two departments: electrical engineering and mathematics.

When he turned 50, he stopped teaching and devoted himself entirely to his favorite hobbies. He created a unicycle with two saddles, robots that solve the Rubik's cube and juggle balls, and a folding knife with many blades. In 1965 he visited the USSR. In his final years Claude Shannon was seriously ill; he died in February 2001 of Alzheimer's disease in a Massachusetts nursing home.

Claude Shannon interesting facts

Shannon's love of science was instilled by his grandfather, an inventor and farmer, who invented a washing machine along with many other kinds of useful agricultural equipment.

As a teenager he worked as a messenger at Western Union.

He was fond of playing the clarinet, listened to music and read poetry.

Shannon married Mary Elizabeth Moore Shannon, whom he met at Bell Labs, on March 27, 1949. She worked there as an analyst. The couple had three children: Andrew Moore, Robert James and Margarita Katerina.

Claude Shannon liked to go to Las Vegas on weekends with his wife Betty and a colleague to play blackjack. Shannon and his friend even designed the world’s first wearable “card counting” computer.

He was involved in the development of devices that detected enemy aircraft and aimed anti-aircraft guns at them. He also created a cryptographic system for the US government, ensuring the secrecy of negotiations between Roosevelt and Churchill.

He loved to play chess and to juggle. Colleagues from his early years at Bell Laboratories recalled how he rode around the company's corridors on a unicycle while juggling balls.

He created a unicycle with two saddles, a folding knife with a hundred blades, robots that solve the Rubik's cube, and a robot that juggles balls.

Shannon, in his own words, was an apolitical person and an atheist.

Old encyclopedias had no entry for “Information”: the “Polish Inflanty” were immediately followed by the “Infralapsarians.” And when such an article did appear in a public reference work, the Small Soviet Encyclopedia of 1929, the meaning of the term was very far from the modern one:

“Information (Lat.), awareness. Informational – informative. In periodicals, the information department is the part of a newspaper, magazine, etc., containing telegrams, correspondence, interviews, and items supplied by reporters.”

The numerous informants of those years, who supplied inquisitive organizations with data on the lives of colleagues and neighbors, were for obvious reasons not mentioned in public reference works. It is curious, though, that while in communication technology, especially on fixed channels, the Morse key had by then given way to start-stop devices, the activity of those informants was popularly described with the laconic word “knocking.” Only in the next century did Computerra have occasion to tell its readers that a woodpecker in the forest works like a living modem, and the wise collective unconscious had already given those numerous, modest information workers the lapidary name “woodpecker”!

The ancient Romans used the word informatio in the sense of interpretation or presentation, but the modern, exceptionally capacious concept of information is the fruit of the development of both science and technology. The Latin in-formo (to give form, to compose, to imagine) was used by Cicero and is connected, through the very polysemantic Greek eidos, with Plato's ideas. But the full-fledged term “information” appeared relatively late, later even than the first electronic computers, nuclear reactors, and bombs.

It was introduced by Claude Shannon in his 1948 work “A Mathematical Theory of Communication,” and it was brought to life by the harsh practical need to transmit messages through channels full of noise, just as, generations earlier, the need to account for the dynamic characteristics of telegraph lines had given rise to the ingenious works of Heaviside.

The concept of information is introduced purely mathematically. A fairly popular yet accurate treatment is given in the old work of the Yaglom brothers, very famous in the USSR. Information is introduced as the opposite of entropy: of noise, chaos, uncertainty, “not knowing.”

Entropy itself had been introduced into communication theory even earlier, in 1928, in the pioneering work of the American electronics engineer Ralph Vinton Lyon Hartley. Let us note, by the way, that even while discussing purely practical communication problems, Hartley mentioned “psychological factors” influencing the measure of uncertainty. The mathematical apparatus he (and later Shannon) used was probability theory, a discipline that arose from the French aristocracy's love of gambling.

In those days the concept of entropy carried an emotional connotation even in physics, and no wonder: the very first consequence of its early formulation was the realization that a perpetual motion machine of the second kind, a device that could produce work using only the heat dissipated around it, is impossible.

The penetration of these concepts beyond the scientific community was extremely dramatic, above all the idea of the “heat death” of the Universe. To imagine how this concept was perceived in the years of its formulation, one must recall the intellectual atmosphere of the nineteenth century. Although it began with the meat grinder perpetrated by Napoleon's military-bureaucratic machine, that century turned out to be a time of the triumph of progress.

And then the splendor of progress, the shining road to complexity and happiness, was invaded by the idea of the non-decrease of chaos, the “heat death” of the Universe predicted by SCIENCE. This was not the religions' End of the World, followed by new life under new heavens and on a new earth; it was scientific, tangible, and hopeless universal destruction, very distant but absolutely inevitable.

And it was not only scientists who realized this. Let's turn to Chekhov's "The Seagull".

Here is Nina Zarechnaya's monologue in the episode with the inserted play by a deliberately bad writer: “People, lions, eagles and partridges, horned deer, geese, spiders, silent fish that lived in the water, starfish and those that could not be seen with the eye - in a word, all lives, all lives, having completed their sad circle, have faded away... <...> Cold, cold, cold. Empty, empty, empty. Scary, scary, scary.”

Let us recall the suicide of K. E. Tsiolkovsky’s son, caused, according to biographers, by the horror of the impending “heat death”. Indeed, for supporters of materialism, which was very widespread in intellectual circles of that time, the cessation of molecular motion due to the achievement of maximum entropy meant UNIVERSAL destruction. Life became meaningless for a person prone to generalizations and extrapolations.

Ralph Hartley's introduction of entropy into the transmission of messages, and Claude Shannon's introduction of its antithesis, information itself, were, for all their apparent technicality, events of colossal ideological significance. For the first time the positive sciences, hand in hand with engineering, then the frontier of high technology, entered territory where philosophy, metaphysics, and theology had previously reigned.

The funniest reaction to information theory and cybernetics came in the Stalinist USSR. One might have expected a triumph of materialism, but no: in the Bolshevik country there was no place for information theory (or for relativity, quantum mechanics, or the non-stationary Universe) among the ideological foundations. The classics of Marxism had not had time to pronounce on it, and could no longer do so. The very young science of cybernetics was declared a corrupt girl of imperialism, and attempts to rehabilitate it in the early sixties settled nothing. However, that is a topic for another discussion.

And the concept of information introduced by Shannon, having transformed the world around us and become the basis of Second Nature, is increasingly returning to theoretical physics. The Israeli physicist Jacob David Bekenstein (b. 1947), creator of the thermodynamic theory of black holes, has suggested that this is a general trend in modern natural science.

In the phenomenon of quantum entanglement of particles, information appears in one of its new guises, and perhaps with new traits of character. But the main thing is that the more we learn about our surroundings, the more clearly we see the deep connections between information and the objective world.

From the magazine "Computerra"
Information forever

Ralph Hartley

Ralph Hartley was born in Spruce, Nevada, on November 30, 1888. He graduated with an A.B. from the University of Utah in 1909. As a Rhodes Scholar he received a B.A. in 1912 and a B.Sc. in 1913 from Oxford University. After returning from England, Hartley joined the Western Electric Company research laboratory and took part in the creation of a radio receiver for transatlantic tests. During the First World War, Hartley solved problems that had hampered the development of sound-based direction finders.

After the war the scientist came to grips with the problem of transmitting information (in particular, sound). During this period he formulated the law “that the total amount of information that can be transmitted is proportional to the frequency range transmitted and the time of transmission.” Hartley was a pioneer of information theory. He introduced the notion of “information” as a random variable and was the first to attempt to define a “measure of information” (1928: “Transmission of Information,” Bell System Technical Journal, vol. 7, pp. 535-563). Publishing in the same journal as Nyquist, yet without citing Nyquist (or anyone else, for that matter), Hartley developed the concept of information on the basis of “physical as contrasted with psychological considerations” for use in the study of electronic communications. Hartley does not actually define this basic concept itself; instead, he speaks of the “precision... of information” and the “amount of information.”

Information exists in the transmission of symbols, symbols that have “certain meanings to the parties communicating.” When someone receives information, each received symbol allows the recipient to “eliminate possibilities,” excluding other possible symbols and their associated meanings.

“The precision of the information depends upon what other symbol sequences might have been chosen”; the measure of these other sequences provides an indication of the amount of information transmitted. Hartley then suggests taking “as our practical measure of information the logarithm of the number of possible symbol sequences.” Thus, if we had 4 different symbols occurring with equal frequency, each would represent 2 bits. Hartley received awards for excellence in science and was a member of the American Association for the Advancement of Science. He held more than 70 patents. Ralph V. L. Hartley died on May 1, 1970, at the age of 81.
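
Hartley's measure in modern notation: a message of $n$ symbols drawn from an alphabet of $s$ equally likely symbols carries

$$H = \log_2 s^n = n \log_2 s \ \text{bits},$$

so with $s = 4$ equally likely symbols, each symbol carries $\log_2 4 = 2$ bits, as in the example above.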

Claude Elwood Shannon


Claude Elwood Shannon (1916-2001) was an American engineer and mathematician, the man who is called the father of modern theories of information and communication.

On an autumn day in 1989, a correspondent for Scientific American magazine walked into an old house overlooking a lake north of Boston. But the owner who met him, a 73-year-old slender old man with a lush gray mane and a mischievous smile, did not at all want to remember “the affairs of bygone days” and discuss his scientific discoveries of 30-50 years ago. Perhaps the guest would rather look at his toys?

Without waiting for an answer, and ignoring the admonitions of his wife Betty, the owner led the amazed journalist into the next room, where, with the pride of a ten-year-old boy, he showed off his treasures: seven chess machines, a circus pole with a spring and a gasoline engine, a folding knife with a hundred blades, a two-seat unicycle, a juggling mannequin, and a computer that calculates in Roman numerals. It did not matter that many of these creations had long been broken and were quite dusty; their owner was happy.

Who is this old man? Was it really he who, as a young engineer at Bell Laboratories, wrote the “Magna Carta” of the information age, “A Mathematical Theory of Communication,” in 1948? Was his work called “the greatest work in the annals of technical thought”? Was his pioneering intuition compared to the genius of Einstein? Yes, this is all about him. And in those same 1940s he designed a flying disc with a rocket engine and rode a unicycle along the corridors of Bell Labs while juggling. This is Claude Elwood Shannon, the father of cybernetics and information theory, who proudly declared: “I have always followed my interests without thinking about what they would cost me or their value to the world. I have wasted a lot of time on completely useless things.”

Claude Shannon was born in 1916 and grew up in Gaylord, Michigan. From childhood he became acquainted both with the concreteness of technical devices and with the generality of mathematical principles. He constantly tinkered with the detector receivers and radio kits that his father, an assistant judge, brought him, and solved the mathematical problems and puzzles supplied by his older sister Katherine, who later became a professor of mathematics. Claude fell in love with both of these worlds, so different from each other: technology and mathematics.

As a student at the University of Michigan, from which he graduated in 1936, Claude majored in both mathematics and electrical engineering. This duality of interests and education determined the first major success that Shannon achieved in his graduate years at the Massachusetts Institute of Technology. In his dissertation, defended in 1940, he proved that the operation of switches and relays in electrical circuits can be represented by the algebra invented in the mid-19th century by the English mathematician George Boole. “It just happened that no one else was familiar with both areas at the same time!” was Shannon's modest explanation of his discovery.

Nowadays, it is completely unnecessary to explain to readers of a computer publication what Boolean algebra means for modern circuitry. In 1941, 25-year-old Claude Shannon went to work at Bell Laboratories. During the war, he was involved in the development of cryptographic systems, and this later helped him discover error-correcting coding methods. And in his free time, he began to develop ideas that later resulted in information theory. Shannon's original goal was to improve the transmission of information over a telegraph or telephone channel affected by electrical noise. He quickly came to the conclusion that the best solution to the problem was to package information more efficiently.

But what is information? How is its quantity to be measured? Shannon had to answer these questions even before he began researching the capacity of communication channels. In his works of 1948-49 he defined the amount of information through entropy, a quantity known in thermodynamics and statistical physics as a measure of a system's disorder, and took as the unit of information what was later dubbed the “bit”: the choice of one of two equally probable options. Shannon later liked to say that it was the famous mathematician John von Neumann who advised him to use entropy, on the grounds that few mathematicians and engineers knew anything about it, which would give Shannon a great advantage in the inevitable disputes. Joke or not, it is hard for us now to imagine that only half a century ago the concept of “amount of information” still needed a strict definition, and that this definition could provoke controversy.

On the solid foundation of his definition of the amount of information, Claude Shannon proved an amazing theorem about the capacity of noisy communication channels, published in full in his works of 1957-61 and now bearing his name. What is the essence of Shannon's theorem? Every noisy communication channel is characterized by a maximum rate of information transfer, called the Shannon limit. At transmission rates above this limit, errors in the transmitted information are inevitable. But the limit can be approached from below as closely as desired: appropriate coding can make the probability of error arbitrarily small for any noisy channel.
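
A concrete instance (a standard textbook example, not taken from the article itself): for a binary symmetric channel that flips each transmitted bit with probability $p$, the Shannon limit is

$$C = 1 - H_2(p), \qquad H_2(p) = -p \log_2 p - (1-p) \log_2 (1-p),$$

so a channel that corrupts 11% of bits ($p = 0.11$) still has capacity $C \approx 0.5$ bits per channel use.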

These ideas of Shannon turned out to be too visionary and could not find application in the years of slow tube electronics. But in our time of high-speed microcircuits, they work everywhere where information is stored, processed and transmitted: in a computer and a laser disk, in a fax machine and an interplanetary station. We don't notice Shannon's theorem, just as we don't notice air.

In addition to information theory, the irrepressible Shannon worked in many areas. He was one of the first to suggest that machines could play games and learn. In 1950 he built Theseus, a mechanical mouse remotely controlled by a complex electronic circuit, which learned to find its way out of a maze. In honor of this invention, the IEEE established the international “micromouse” competition, which thousands of engineering students still enter. In the same 1950s, Shannon created a machine that “read minds” in the game of matching coins: a person chose “heads” or “tails,” and the machine guessed the choice with probability above 50%, because a person cannot avoid falling into patterns that the machine can exploit.
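
A minimal sketch of the idea behind such a machine, under the simplifying assumption that it merely counts what the player did after each pair of previous moves (the actual Bell Labs machines used a similar but more elaborate state description):

```python
from collections import defaultdict
import random

class MindReader:
    """Predicts the player's next 0/1 choice from their last two moves."""
    def __init__(self):
        self.counts = defaultdict(lambda: [0, 0])  # context -> [count of 0, count of 1]
        self.history = []

    def predict(self):
        if len(self.history) < 2:
            return random.randint(0, 1)
        c = self.counts[tuple(self.history[-2:])]
        if c[0] == c[1]:
            return random.randint(0, 1)
        return 0 if c[0] > c[1] else 1

    def observe(self, move):
        if len(self.history) >= 2:
            self.counts[tuple(self.history[-2:])][move] += 1
        self.history.append(move)

# Against a human with habits the machine wins over 50% in the long run;
# against this biased simulated player it quickly learns the tendency.
machine, wins, rounds = MindReader(), 0, 1000
for _ in range(rounds):
    player = 1 if random.random() < 0.7 else 0  # a player biased toward 1
    wins += machine.predict() == player
    machine.observe(player)
print(f"machine guessed right {wins / rounds:.0%} of the time")
```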

Shannon left Bell Labs in 1956 and became a professor at the Massachusetts Institute of Technology the following year, retiring in 1978. Among his students was, in particular, Marvin Minsky, along with other famous scientists working in the field of artificial intelligence.

Shannon's works, which are treated with reverence by scientists, are just as interesting for specialists solving purely applied problems. Shannon also laid the foundation for modern error correction coding, which is essential to every hard drive or streaming video system today, and perhaps to many products yet to see the light of day.
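
The simplest error-correcting code, a threefold repetition code with majority voting, already shows the principle (a toy sketch; modern drives and video streams use far stronger codes such as Reed-Solomon or LDPC):

```python
import random

def encode(bits):
    """Repeat every bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(bits):
    """Majority vote over each group of three received bits."""
    return [1 if sum(bits[i:i + 3]) >= 2 else 0 for i in range(0, len(bits), 3)]

def noisy(bits, p):
    """Binary symmetric channel: flip each bit with probability p."""
    return [b ^ (random.random() < p) for b in bits]

message = [random.randint(0, 1) for _ in range(10_000)]
received = noisy(encode(message), p=0.05)
decoded = decode(received)
errors = sum(a != b for a, b in zip(message, decoded))
# The raw channel flips ~5% of bits; after majority voting only ~0.7% remain
# (3 * 0.05**2 * 0.95 + 0.05**3), at the cost of tripling the transmission.
print(f"residual error rate: {errors / len(message):.3%}")
```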

At MIT and in retirement, he was completely captivated by his lifelong passion for juggling. Shannon built several juggling machines and even created a general theory of juggling, which, however, did not help him break his personal record - juggling four balls. He also tried his hand at poetry, and also developed various stock exchange models and tested them (successfully, according to him) on his own shares.

But since the early 60s, Shannon has done virtually nothing more in information theory. It looked as if he had become tired of the theory he had created after only 20 years. This phenomenon is not uncommon in the world of science, and in this case they say one word about the scientist: burnt out. Like a light bulb, or what? It seems to me that it would be more accurate to compare scientists with stars. The most powerful stars do not shine for long, about a hundred million years, and end their creative life with a supernova explosion, during which nucleosynthesis occurs: the entire periodic table is born from hydrogen and helium. You and I are made up of the ashes of these stars, and our civilization also consists of the products of the rapid combustion of the most powerful minds. There are stars of the second type: they burn evenly and for a long time, and for billions of years they provide light and heat to inhabited planets (at least one). Researchers of this type are also very much needed by science and humanity: they provide civilization with the energy of development. And third-class stars - red and brown dwarfs - shine and warm a little, just under their breath. There are plenty of such scientists, but it is simply indecent to talk about them in an article about Shannon.

In 1985, Claude Shannon and his wife Betty unexpectedly attended the International Symposium on Information Theory in the English city of Brighton. Shannon did not appear at conferences for almost a generation, and at first no one recognized him. Then the symposium participants began to whisper: that modest gray-haired gentleman over there is Claude Elwood Shannon, the same one! At the banquet, Shannon said a few words, did a little juggling with three (alas, only three) balls, and then signed hundreds of autographs for the stunned engineers and scientists who lined up in a long line. Those standing in line said that they experienced the same feelings that physicists would experience if Sir Isaac Newton himself appeared at their conference.

Claude Shannon died in 2001 in a Massachusetts nursing home from Alzheimer's disease at the age of 84.

Based on materials from an article by Sergei Sery in the newspaper Computer News, No. 21, 1998.

Awards and prizes


  • Alfred Noble Prize of the AIEE (1940);
  • Morris Liebmann Memorial Prize of the IRE (1949);
  • IEEE Medal of Honor (1966);
  • National Medal of Science (1966);
  • Harvey Award (1972);
  • Kyoto Prize (1985).

Biography

In 1985, Claude Shannon and his wife Betty attended the International Symposium on Information Theory in Brighton. Shannon had not attended international conferences for a long time, and at first he was not even recognized. At the banquet Claude Shannon gave a short speech, juggled just three balls, and then signed hundreds and hundreds of autographs for the amazed scientists and engineers who stood in a long line, feeling reverence for the great scientist and comparing him to Sir Isaac Newton.

He was the developer of the first industrial radio-controlled toy, produced in the 1950s in Japan. He also developed a device that could solve the Rubik's cube, a minicomputer for the board game Hex that always defeated its opponent, and a mechanical mouse that could find its way out of a maze. He also realized the idea of the comic “Ultimate Machine.”

Communication theory in secret systems

Shannon's paper “Communication Theory of Secrecy Systems” (1945), classified as “secret” and declassified and published only in 1949, marked the beginning of extensive research in the theory of coding and transmission of information and, by common consent, gave cryptography the status of a science. It was Claude Shannon who first studied cryptography with a scientific approach. In this article Shannon defined the fundamental concepts of the theory of cryptography, without which cryptography is no longer conceivable. An important merit of Shannon's is the study of absolutely secure systems and the proof of their existence, as well as of the existence of cryptographically strong ciphers and the conditions they require. Shannon also formulated the basic requirements for strong ciphers. He introduced the now-familiar concepts of diffusion and confusion, as well as methods for building cryptographically strong encryption systems from simple operations. This article is the starting point for the study of the science of cryptography.

The article “A Mathematical Theory of Communication”

  • The Nyquist-Shannon sampling theorem (in Russian-language literature, Kotelnikov's theorem) on the unambiguous reconstruction of a signal from its discrete samples (its reconstruction formula is given below).
  • Shannon's source coding theorem (also called the noiseless coding theorem) sets the limit of maximum data compression and gives a numerical value to Shannon entropy.
  • The Shannon-Hartley theorem on the capacity of a noisy continuous channel.
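
In modern notation, the sampling theorem's reconstruction (the Whittaker-Shannon interpolation formula listed under “See also”) reads: a signal $x(t)$ containing no frequencies above $W$ is completely determined by its samples taken every $1/(2W)$ seconds:

$$x(t) = \sum_{n=-\infty}^{\infty} x\!\left(\frac{n}{2W}\right)\,\operatorname{sinc}(2Wt - n), \qquad \operatorname{sinc}(u) = \frac{\sin \pi u}{\pi u}.$$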

See also

  • Whittaker-Shannon interpolation formula

Literature

  • Shannon C. E. A Mathematical Theory of Communication // Bell System Technical Journal. 1948. Vol. 27. P. 379-423, 623-656.
  • Shannon C. E. Communication in the Presence of Noise // Proceedings of the Institute of Radio Engineers. Jan. 1949. Vol. 37. No. 1. P. 10-21.
  • Shannon C. Works on Information Theory and Cybernetics. Moscow: Foreign Literature Publishing House, 1963. 830 p.

