
How Claude Shannon invented the future


Scientists try to discover the basic laws of nature, mathematicians search for new theorems, and engineers build systems to meet human needs. The three disciplines are interrelated but distinct, and it is very rare for a single person to make significant contributions to all of them. Claude Shannon was unique.


He never achieved wide fame, although a documentary, The Bit Player, was made about his life. He never won a Nobel Prize, and neither during his life nor after his death in 2001 did he become a scientific celebrity like Albert Einstein or Richard Feynman. But more than 70 years ago he published an article that laid the foundations for the information age and for the communication infrastructure of today's world.

Shannon was born in Gaylord, Michigan, in 1916, the son of a local businessman and a teacher. He earned degrees in electrical engineering and mathematics from the University of Michigan, then a master's degree from the Massachusetts Institute of Technology. In his thesis he applied Boolean algebra to the analysis and synthesis of switching circuits. It was a landmark piece of work, giving circuit design a firm scientific basis, and the thesis is now regarded as the starting point of digital circuit design.

Then the scientist turned to a more ambitious problem: communication.


Communication is one of the most basic human needs. From smoke signals and carrier pigeons to the telephone and television, people have always sought ways to send messages farther and faster, with a better chance that the message reaches its recipient. But all of this work was practical in nature: engineers developed communication systems with specific senders in mind, using specific physical media. Shannon, by contrast, asked: "Is it possible to create a general, unified theory of communication?" In 1939 he sketched his initial ideas on the fundamental properties of general systems for the transmission of intelligence in a letter to Vannevar Bush, whom he considered a mentor. After roughly 10 years of work, in 1948, Shannon finally published his masterpiece, "A Mathematical Theory of Communication."

At its core is a simple, general model: the transmitter encodes the information into a signal, the signal is corrupted by noise, and the receiver decodes it. The model looks simple, but it has two crucial features. First, Shannon separated the information source and the noise source from the specific physical communication system. Second, he modeled both sources probabilistically. The sender always generates one of many possible messages, so each message can be assigned a probability; the noise adds a random perturbation (whose probability can likewise be estimated) that must be accounted for when the receiver decodes the message.
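
Purely as an illustration (none of this code appears in Shannon's paper, and every name in it is invented for the sketch), the model can be mocked up in a few lines of Python, with both the source and the noise drawn from probability distributions:

```python
import random

# A toy version of Shannon's model: a probabilistic source, a transmitter
# that encodes the message as bits, a channel that adds random noise, and
# a receiver that decodes what it can. All names here are illustrative.

def source():
    """The sender picks one of several possible messages at random."""
    return random.choice(["HELLO", "WORLD", "CLAUDE", "SHANNON"])

def encode(message):
    """Transmitter: turn the message into a bit sequence, 8 bits per character."""
    return [int(b) for ch in message for b in format(ord(ch), "08b")]

def channel(bits, p=0.01):
    """Noise source: flip each bit independently with probability p."""
    return [bit ^ (random.random() < p) for bit in bits]

def decode(bits):
    """Receiver: reassemble characters from the (possibly corrupted) bits."""
    return "".join(chr(int("".join(map(str, bits[i:i + 8])), 2))
                   for i in range(0, len(bits), 8))

print(decode(channel(encode(source()))))  # usually intact, sometimes garbled
```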


Before Shannon published his article, communication was viewed from a deterministic perspective. The basic question was: how can the received, perturbed signal, transformed as it is by the physical communication medium, be processed to reconstruct the originally transmitted information as faithfully as possible? But Shannon had a great insight: the key to communication is uncertainty. After all, if you knew from the very beginning what I was going to write in this article, there would be no point in publishing it.

With this single observation, Shannon shifted the problem of communication from the physical to the abstract, in isolation from any specific physical medium: uncertainty can be modeled using probability. To the telecom engineers of the time, this was a real shock.

Photo: DobriZheglov (CC BY-SA 4.0)

Since uncertainty and probability play the fundamental role in Shannon's theory, his article set out to identify the fundamental limits of communication. He divided his conclusions into three parts. Their most important ingredient is the concept of the bit, which Shannon introduced as the basic unit of uncertainty. A bit, short for "binary digit," is either a 0 or a 1. Shannon's article was the first scientific publication to use the word (although it had appeared earlier in a memo by the mathematician John Tukey).


First, Shannon derived a formula for the entropy of a source, which he denoted by the letter H. It measures the uncertainty about which of the possible messages the source will generate: the lower the entropy, the lower the uncertainty, and the easier the message is to compress. For example, if we have one minute to type a text message of 100 characters drawn from the 26-letter basic Latin alphabet, there are 26¹⁰⁰ possible messages. Encoding all of them requires 470 bits, since 2⁴⁷⁰ ≈ 26¹⁰⁰, so this source produces information at 470 bits per minute. But only in theory. In the real world, some combinations of letters in a text message are far more likely than others, so the entropy is much lower, allowing for more compression.
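
To make the arithmetic concrete, here is a small sketch that recomputes both numbers; the letter frequencies below are rough illustrative values, not measured data:

```python
import math

# Uniform case: 100 characters, each one of 26 equally likely letters.
uniform_entropy = 100 * math.log2(26)
print(round(uniform_entropy))  # 470 bits, since 2**470 ~= 26**100

# Non-uniform case: approximate English letter frequencies lower the entropy.
# These frequencies are rough illustrative values only.
freqs = {"e": 0.127, "t": 0.091, "a": 0.082, "o": 0.075, "i": 0.070,
         "n": 0.067, "s": 0.063, "h": 0.061, "r": 0.060}
other = (1 - sum(freqs.values())) / (26 - len(freqs))  # spread the remainder
probs = list(freqs.values()) + [other] * (26 - len(freqs))

per_letter = -sum(p * math.log2(p) for p in probs)  # Shannon entropy per letter
print(round(100 * per_letter))  # roughly 430 bits: noticeably fewer than 470
```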

Second, Shannon provided a formula for the maximum number of bits per second that can be transmitted across a noisy channel and still be decoded correctly at the receiver. He called this the channel capacity, denoted C. In other words, C is the maximum rate at which the receiver can resolve the message's uncertainty, and thus the upper limit on the speed of information transfer.
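
Shannon's paper derives the capacity for each channel model. As one standard, concrete instance (the binary symmetric channel; this particular formula is a textbook example rather than one quoted in this article), the capacity is C = 1 − H₂(p) bits per channel use, where p is the bit-flip probability:

```python
import math

def binary_entropy(p):
    """H2(p): entropy of a coin that lands heads with probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel that flips bits w.p. p."""
    return 1 - binary_entropy(p)

for p in (0.0, 0.01, 0.1, 0.5):
    print(f"flip probability {p}: C = {bsc_capacity(p):.3f} bits per use")
# A noiseless channel carries 1 bit per use; at p = 0.5 the output is
# independent of the input and the capacity drops to zero.
```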

Third, Shannon showed that reliable communication of information in the face of noise is possible if and only if H < C. A simple analogy: if the rate of water flow is less than the capacity of the pipe, then everything works fine. The same rule applies to information.
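
A toy check ties the two previous sketches together; the channel-use rate here is an assumed number, chosen only to illustrate the condition:

```python
import math

# Toy check of Shannon's condition H < C (all numbers illustrative).
p = 0.1                                            # assumed bit-flip probability
C_per_use = 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)  # 1 - H2(p)
H = 470                                            # bits per minute (the text-message source)
C = 1000 * C_per_use                               # assume 1000 channel uses per minute
print(f"H = {H}, C = {C:.0f}:", "feasible" if H < C else "infeasible")
```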

The theory applies not only to communication but also to information itself, in particular how it is generated and transmitted. This is why Shannon is considered today the "father of information theory."

His theorems sometimes lead to counterintuitive conclusions. Suppose you are having a conversation in a very noisy place. How do you make sure your message reaches the listener? One idea: repeat it several times. But this method turns out to be inefficient. Yes, the more you repeat a message, the more likely the listener is to catch it, but communication becomes much slower. Shannon showed that there are better ways: a more sophisticated code can communicate faster, up to the limit C, while maintaining the same level of reliability.
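
A small simulation makes the trade-off concrete (a sketch with assumed parameters, not an experiment from the article): repetition does suppress errors, but only by driving the rate toward zero, whereas Shannon's theorem promises codes with a fixed rate below C:

```python
import random

def repetition_error_rate(r, p=0.2, trials=100_000):
    """Empirical error rate of an r-fold repetition code on a channel
    that flips each transmitted bit with probability p."""
    errors = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(r))
        if flips > r // 2:          # majority vote decodes incorrectly
            errors += 1
    return errors / trials

for r in (1, 3, 5, 9):
    print(f"repeat x{r}: rate = {1 / r:.2f} bit/use, "
          f"error ~ {repetition_error_rate(r):.4f}")
# Reliability improves, but the rate falls as 1/r; Shannon showed that
# cleverer codes keep a constant rate below C with vanishing error.
```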

Another unexpected conclusion of Shannon's theory is this: whatever the nature of the information, it is transmitted most efficiently if it is first encoded into bits. It doesn't matter whether it's a Shakespeare sonnet, a Beethoven symphony, or a Kurosawa movie. Take radio, for example. The original sound and the transmitted electromagnetic signal are both analog waves. Yet according to Shannon's theory, it is best to digitize the sound first, "converting" it into bits, and then map those bits onto electromagnetic waves. This surprising conclusion underpins the age of digital information. The bit reigns today as the universal unit of information.
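
As a minimal illustration of "convert to bits first" (the sample rate and bit depth are arbitrary choices, not values from the article), here is an analog-looking waveform sampled and quantized into a bitstream:

```python
import math

# Digitize an "analog" signal: sample it, then quantize each sample to 8 bits.
sample_rate = 16          # samples per period (assumed)
bits_per_sample = 8       # bit depth (assumed)
levels = 2 ** bits_per_sample

samples = [math.sin(2 * math.pi * n / sample_rate) for n in range(sample_rate)]

def quantize(x):
    """Map a value in [-1, 1] to an integer code in [0, levels - 1]."""
    return min(levels - 1, int((x + 1) / 2 * levels))

bitstream = "".join(format(quantize(x), "08b") for x in samples)
print(bitstream[:32], "...")  # these bits, not the wave, are what gets sent
```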


Shannon's general theory seems perfectly natural today, as if he had discovered fundamental laws that govern the universe rather than invented them. In this sense, he was a scientist.

To describe his theory, Shannon needed new mathematics. He introduced innovative concepts, such as the entropy rate of a probabilistic model, which were later used in other areas of mathematics, for example in ergodic theory, the study of the long-term behavior of dynamical systems. From this point of view, Shannon was a mathematician.

Above all, though, he should be considered an engineer. He developed his theory because he wanted to solve practical problems. Engineers of his time considered his concepts esoteric, but today they are the basis of all modern communication systems: optical, underwater, and even interplanetary. I was fortunate enough to take part in the worldwide effort to apply Shannon's general theory to wireless communication, where successive generations of standards have pushed transmission speeds ever closer to his limit. The 5G standard currently being deployed relies on two such error-correcting codes, which approach the speed limit Shannon described.

We owe all this to a theory developed more than 70 years ago. Shannon focused only on the essential features of the problem and ignored everything else; the simplicity of the communication model he created is evidence of that. He also knew to focus on what is possible in principle, not merely on what was achievable at the moment.

This is science at its best. When I began my graduate studies, my adviser told me that the best scientific papers prune the branches of the tree of knowledge. I didn't know what he meant at the time; I thought my job was to add a few twigs here and there. Only when I tried to apply this philosophy did I begin to understand it.

By the time Shannon turned to communication, engineers had already amassed a large collection of techniques. His theory unified the entire field, pruning the branches into one beautiful, graceful tree whose fruit later generations of scientists, mathematicians, and engineers continue to harvest.

Translated by Jan Dzerzhovsky

This article is reprinted with the permission of Quanta Magazine, an editorially independent magazine published by the Simons Foundation. Its mission is to deepen public understanding of science by covering the latest research and trends in mathematics, physics, and the natural sciences.
