Saturday, January 29, 2022

"How Claude Shannon Helped Kick-start Machine Learning"

From IEEE Spectrum, January 25:

The “father of information theory” also paved the way for AI

Among the great engineers of the 20th century, who contributed the most to our 21st-century technologies? I say: Claude Shannon.

Shannon is best known for establishing the field of information theory. In a 1948 paper, one of the greatest in the history of engineering, he came up with a way of measuring the information content of a signal and calculating the maximum rate at which information could be reliably transmitted over any sort of communication channel. The article, titled “A Mathematical Theory of Communication,” describes the basis for all modern communications, including the wireless Internet on your smartphone and even an analog voice signal on a twisted-pair telephone landline. In 1966, the IEEE gave him its highest award, the Medal of Honor, for that work.
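Shannon's measure of information content is the entropy of the source, in bits per symbol. As a minimal sketch (not from the article), the formula H = -Σ p·log₂(p) can be computed directly:

```python
from math import log2

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol.
    Terms with p == 0 contribute nothing and are skipped."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per toss.
print(entropy_bits([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so it carries less information.
print(entropy_bits([0.9, 0.1]))   # ~0.469
```

The second result is why compression works: predictable sources can be encoded in fewer bits than their raw symbol count suggests.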

If information theory had been Shannon’s only accomplishment, it would have been enough to secure his place in the pantheon. But he did a lot more.

A decade before, while working on his master’s thesis at MIT, he invented the logic gate. At the time, electromagnetic relays—small devices that use magnetism to open and close electrical switches—were used to build circuits that routed telephone calls or controlled complex machines. However, there was no consistent theory on how to design or analyze such circuits. The way people thought about them was in terms of the relay coils being energized or not. Shannon showed that Boolean algebra could be used to move away from the relays themselves, into a more abstract understanding of the function of a circuit. He used this algebra of logic to analyze, and then synthesize, switching circuits and to prove that the overall circuit worked as desired. In his thesis he invented the AND, OR, and NOT logic gates. Logic gates are the building blocks of all digital circuits, upon which the entire edifice of computer science is based....
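The abstraction Shannon introduced is easy to demonstrate: once relay circuits are described as Boolean functions, larger circuits can be analyzed and built by composing the primitives. A minimal sketch (my illustration, not from the thesis):

```python
# Shannon's insight: treat switching circuits as Boolean functions
# of 0/1 values rather than as physical relay coils.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# Composition: XOR built purely from the three primitive gates,
# as (a OR b) AND NOT(a AND b).
def XOR(a, b):
    return AND(OR(a, b), NOT(AND(a, b)))

# Verify against the full truth table.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", XOR(a, b))
```

Because each gate is a pure Boolean function, proving "the overall circuit works as desired" reduces to checking a truth table, which is exactly the kind of analysis Shannon's algebra made possible.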


The Bit Bomb: The True Nature of Information

The subject of this article, Claude Shannon, has a couple of interesting connections to finance/investing/trading beyond 'just' creating information theory (along with MIT's Norbert Wiener, who was coming at the problem from a different angle of attack); more after the jump.
Both Aeon and Climateer are reposting: "The Bit Bomb" first appeared at Aeon on August 30, 2017, and graced our pages over the Labor Day weekend, on September 3, 2017.

Today In Big Numbers, Information Theory Edition: "There are 6×10^80 Bits of Information in the Observable Universe"

A Review of Garry Kasparov’s Deep Thinking: Where Machine Intelligence Ends and Human Creativity Begins

"Claude Shannon, the Las Vegas Shark"
"How Information Got Re-Invented"
"How did Ed Thorp Win in Blackjack and the Stock Market?"
How Big Data and Poker Playing Bots Are Taking the Luck Out of Gambling

There was also a shout-out to Shannon from the quants at Ruffer in July 17's Ruffer Review: "Navigating information"