
Who was the father of information theory?


Thanks to thinkers like Alan Turing and John von Neumann, the doors to the computer age were opened. Their contribution was essential and decisive. But the person who laid the solid foundations on which to build the new theory of information, who managed to capture the essence of information and abstract it from the material world, was Claude Shannon. Together with Kotelnikov in the Soviet Union, who arrived at similar results independently, Shannon developed the theory of information that would revolutionize the world of telecommunications.

The Shannon family

Gaylord is a small, charming rural town of about 3,000 inhabitants surrounded by greenery: orchards, maple and beech forests. It was in this small Michigan town that Mr. Claude Shannon Sr. lived and ran a furniture store. Mrs. Mabel Wolf, born in Lansing on September 14, 1880, the daughter of a German emigrant and the second wife of Shannon Sr., taught languages at a high school in her hometown of Lansing. Lansing is also where the two were married, in August 1909. Deeply religious and devout believers, they were regulars at a local Protestant church. They had two children: Catherine, the elder, born in 1910, and Claude Jr., born in 1916.

Childhood

Claude Elwood Shannon was born in Petoskey, Michigan, on April 30, 1916, and spent his childhood in the town of Gaylord. It was a quiet, ordinary childhood in which it was not he but his older sister who excelled in the family: a model student with a knack for mathematics, gifted with strong creativity, and passionate about music and the piano. Catherine went on to graduate from the University of Michigan with a degree in mathematics.

Such accomplishment aroused great curiosity in the young Claude and a drive to give his best. He himself later declared: “My sister’s talent for mathematics may have prompted my interest in this subject.” Claude graduated from Gaylord High School in 1932, managing at the same time to hold small jobs, including delivering telegrams and repairing radios.

During the same period, his interest in how things worked and in making things (a maker before the term existed) pushed him toward building his own equipment. He constructed a telegraph that connected his house with that of a friend, wiring it through the barbed-wire fence that surrounded and protected their homes.


Education


Claude Shannon, the father of information theory, graduated from Gaylord High School in 1932. Following in the family footsteps (his mother Mabel and sister Catherine were both college graduates), he enrolled at the University of Michigan, where he reached his first milestone in 1936 with bachelor’s degrees in both mathematics and electrical engineering.

His passion for study, and the growing need for engineers imposed in part by the unfortunate historical events that America and the world were going through, prompted him to continue his university career. He decided to answer an advertisement he saw posted on the bulletin board of the MIT Department of Electrical Engineering: a position as a student assistant in the development of the famous differential analyzer, an electromechanical analog computer designed to solve differential equations. Luck was on his side, and he entered MIT. Of his two Michigan degrees, many of the required exams had overlapped; he would later declare: “Thanks to this overlapping of exams, the second degree didn’t cost me much effort.”

The MIT years


At MIT, he joined the group of Vannevar Bush, creator of the differential analyzer. Thanks to his scientific depth and his renown in the American academic world, Bush was later appointed, expressly by President Roosevelt, to the National Defense Research Committee, contributing to the development of military systems, with roles even in the Manhattan Project.

As an assistant, Shannon’s job was to maintain and check the analyzer’s complex components, which included hundreds and hundreds of relays. The turning point came in 1937, when he went from MIT to New York, to a place where another group of people was simultaneously working to bring mathematical logic closer to circuit design. That place was Bell Laboratories, then the laboratories of the American telephone company, where Shannon went for a summer internship.

The turning point

The Theseus maze, an electromechanical play device designed by Shannon in 1952 and currently on display at the MIT Museum in Cambridge, Massachusetts.

After this summer internship, Shannon began to assemble his mental puzzle: for the first time, he put together his knowledge of Boole’s logic with circuits, combining Boolean algebra (the algebra developed in the mid-19th century by the English mathematician George Boole) with switching electrical circuits, with the aim of simplifying the switch-and-relay switching networks of the telephone system.
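To see the correspondence concretely, here is a minimal Python sketch of Shannon’s insight, in our own illustrative notation rather than that of his thesis: switches in series behave like the Boolean AND, switches in parallel like the Boolean OR.

```python
# A minimal sketch (ours, not Shannon's notation): current flows
# through series switches only if both are closed (AND), and through
# parallel switches if either is closed (OR).

def series(a: bool, b: bool) -> bool:
    """Two switches in series conduct only if both are closed (AND)."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel conduct if either is closed (OR)."""
    return a or b

# The circuit "(A in series with B) in parallel with C" conducts
# exactly when the Boolean expression (A AND B) OR C is true:
for A in (False, True):
    for B in (False, True):
        for C in (False, True):
            assert parallel(series(A, B), C) == ((A and B) or C)
print("the circuit realizes (A AND B) OR C for every input")
```

Designing or simplifying a relay network thus reduces to manipulating a Boolean expression, which is exactly the reduction Shannon’s thesis made rigorous.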


His ideas and studies were published in his MIT thesis, “A Symbolic Analysis of Relay and Switching Circuits,” which appeared in 1938 in the Transactions of the American Institute of Electrical Engineers. In the following decades it came to be called “the most important thesis that has ever been written.”

Shannon and information theory

Shannon was famed for his unparalleled memory and intelligence: he was able to dictate scientific articles from memory and very rarely used sketches or notes. He was enlisted by the Pentagon to research missile guidance, and after the end of the war, in 1948, he published the two-part essay “A Mathematical Theory of Communication.” This work is considered one of the seminal papers that laid the foundation of modern information theory. In the article, Shannon defined the essential components of digital communications (a toy sketch of the resulting pipeline follows the list):

  • The information source, which produces the initial message.
  • The transmitter, which receives the information and translates it into a signal to be sent over the channel.
  • The channel, which acts as the medium carrying the signal to the recipient.
  • The receiver, which picks up the signal transmitted through the channel and decodes it.
  • The destination, a person or a machine, which receives the message and understands its meaning.
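As promised above, here is a toy Python rendering of the five blocks in action. The specific choices (8-bit ASCII encoding, a channel that flips each bit with 1% probability) are our assumptions for illustration, not part of Shannon’s paper.

```python
import random

def transmitter(message: str) -> list[int]:
    """Translate the source message into a signal: here, 8-bit ASCII."""
    return [int(b) for ch in message for b in format(ord(ch), "08b")]

def channel(bits: list[int], flip_prob: float = 0.01) -> list[int]:
    """A binary symmetric channel: each bit is flipped with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def receiver(bits: list[int]) -> str:
    """Decode the (possibly corrupted) bit stream back into characters."""
    chunks = ["".join(map(str, bits[i:i + 8])) for i in range(0, len(bits), 8)]
    return "".join(chr(int(c, 2)) for c in chunks)

message = "HELLO"                               # the starting information
print(receiver(channel(transmitter(message))))  # usually "HELLO", sometimes garbled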

In his work, Shannon developed the fundamental concepts of information entropy and redundancy, which underlie the theory of codes and source coding in digital communications.
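As a minimal illustration of entropy, the sketch below computes H = -Σ p·log2(p) for two toy sources; the probability values are our assumed examples.

```python
from math import log2

def entropy(probs: list[float]) -> float:
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin is maximally unpredictable
print(entropy([0.9, 0.1]))   # ~0.47 bits: the rest is redundancy that a
                             # source coder can squeeze out
```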

Shannon, the father of the word “bit”

In this article, Shannon also coined the word “bit,” so familiar to us today. A contraction of “binary digit,” the bit is the fundamental unit of measurement used to quantify and represent information in the digital world. A bit can take one of two distinct values, 0 or 1, and represents the smallest unit of data that can be processed or stored in an information processing or transmission system.

The concept of the bit is the basis of the binary system, which is widely used in digital technology, computers, and telecommunications. By combining bits, it is possible to represent more complex information, such as numbers, text, images, and sounds, using specific encodings.
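A couple of lines of Python make the idea concrete; the choice of ASCII and of the letter “A” is ours, purely for illustration.

```python
# n bits distinguish 2**n values; an encoding then assigns meaning to them.
print(2 ** 8)                    # 256: the values one byte (8 bits) can take
print(format(ord("A"), "08b"))   # "01000001": the letter "A" in 8-bit ASCII
```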


Shannon’s theorems

Shannon’s most important contributions were undoubtedly his sampling and coding theorems. What if we told you that we can listen to our favorite music on digital devices thanks to a theorem by Shannon, the founder of information theory? In 1949, he demonstrated in a scientific article the sampling theorem, which establishes the conditions under which an analog signal can be transformed into a digital one without degrading its quality: the sampling rate must exceed twice the highest frequency present in the signal. Properly, the theorem should be called the Whittaker-Nyquist-Kotelnikov-Shannon (WNKS) theorem, after the chronological order of those who demonstrated progressively more general versions.
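A small numpy sketch shows what goes wrong when that condition is violated, with frequencies we picked as an assumed example: sampled at 1000 Hz, a 900 Hz tone is indistinguishable from a 100 Hz tone.

```python
import numpy as np

fs = 1000.0                         # assumed sampling rate, Hz
t = np.arange(0, 0.02, 1 / fs)      # 20 ms of sampling instants

low = np.sin(2 * np.pi * 100 * t)   # 100 Hz: safely below fs / 2
high = np.sin(2 * np.pi * 900 * t)  # 900 Hz: above fs / 2, so it aliases

# The 900 Hz tone produces (up to sign) the very same samples as the
# 100 Hz tone, so no reconstruction can tell them apart:
print(np.allclose(high, -low))      # True
```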

But that’s not all: the other important Shannon theorem used in telecommunications is the theorem of coding in the presence of noise. In information theory, Shannon’s second theorem, also known as the “channel coding theorem,” states that despite the presence of noise in a communication channel, it is still possible to transmit data or information through that channel with an error probability Pe as small as desired, provided the transmission rate stays below a certain maximum rate, the channel capacity.
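Shannon’s proof is non-constructive, but even a crude repetition code already shows the error probability being pushed down, at the cost of rate. The sketch below is our assumed illustration, not a code from Shannon’s paper: each bit is repeated n times over a noisy channel and decoded by majority vote.

```python
import random

def residual_error(n: int, flip_prob: float = 0.1, trials: int = 100_000) -> float:
    """Send each bit n times over a channel that flips bits with probability
    flip_prob; decode by majority vote and estimate the error rate Pe."""
    errors = 0
    for _ in range(trials):
        flipped = sum(random.random() < flip_prob for _ in range(n))
        if flipped > n // 2:          # the majority of copies got corrupted
            errors += 1
    return errors / trials

for n in (1, 3, 5, 7):
    print(n, residual_error(n))       # Pe falls: ~0.1, ~0.028, ~0.0086, ~0.0027
```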

This surprising result, considered the fundamental theorem of information theory, was first presented by Claude Shannon in 1948. The Shannon capacity, or Shannon limit, represents the maximum data transfer rate that a communication channel can sustain at a given signal-to-noise ratio while keeping the error rate as low as desired.
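The Shannon-Hartley formula C = B·log2(1 + S/N) makes that limit computable. The telephone-line numbers below are a classic textbook illustration of the formula, not values taken from Shannon’s article.

```python
from math import log2

def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley limit C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * log2(1 + 10 ** (snr_db / 10))

# A ~3 kHz telephone channel at ~30 dB SNR cannot carry more than
# about 30 kbit/s, however clever the modem design.
print(shannon_capacity(3000, 30))    # ~29,902 bits per second
```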

Academic life

In 1949 he published “Communication Theory of Secrecy Systems,” giving birth to the mathematical theory of cryptography. In 1956 he became a member of the National Academy of Sciences, and in those years he also became a professor and researcher at MIT. Beyond his passion for mathematics and engineering, Shannon was also a juggler and chess player, and dabbled in unicycling.

Claude Shannon died on February 24, 2001 in Medford, Massachusetts. Six statues have been dedicated to him, including one at the University of Michigan, one at MIT, and another at Bell Laboratories.

Main image credits: Estate of Francis Bello / Science Source
