Saturday, January 20, 2018

The Bit Bomb: The True Nature of Information

The subject of this article, Claude Shannon, has a couple of interesting connections to finance/investing/trading beyond 'just' creating information theory (along with MIT's Norbert Wiener, who was coming in on a different angle of attack); more after the jump.
Both Aeon and Climateer are reposting: "The Bit Bomb" first appeared at Aeon on August 30, 2017, and graced our pages over the Labor Day weekend, September 3, 2017.

From Aeon:

It took a polymath to pin down the true nature of ‘information’. His answer was both a revelation and a return
Just what is information? For such an intuitive idea, its precise nature proved remarkably hard to pin down. For centuries, it seemed to hover somewhere in a half-world between the visible and the unseen, the physical and the evanescent, the enduring medium and its fleeting message. It haunted the ancients as much as it did Claude Shannon and his Bell Labs colleagues in New York and New Jersey, who were trying to engirdle the world with wires and telecoms cables in the mid-20th century.
Shannon – mathematician, American, jazz fanatic, juggling enthusiast – is the founder of information theory, and the architect of our digital world. It was Shannon’s paper ‘A Mathematical Theory of Communication’ (1948) that introduced the bit, an objective measure of how much information a message contains. It was Shannon who explained that every communications system – from telegraphs to television, and ultimately DNA to the internet – has the same basic structure. And it was Shannon who showed that any message could be compressed and transmitted via a binary code of 0s and 1s, with near-perfect accuracy, a notion that was previously pegged as hopelessly utopian. As one of Shannon’s colleagues marvelled: ‘How he got that insight, how he even came to believe such a thing, I don’t know.’

These discoveries were scientific triumphs. But in another way, they brought the thinking about information full-circle. Before it was the province of natural scientists, ‘information’ was a concept explored by poets, orators and philosophers. And while Shannon was a mathematician and engineer by training, he shared with these early investigators a fascination with language.

In the Aeneid, for example, the Roman poet Virgil describes the vast cave inhabited by the god Vulcan and his worker-drones the Cyclopes, in which the lightning bolt of Jupiter is informatum – forged or given shape beneath their hammers. To in-form meant to give a shape to matter, to fit it to an ideal type; informatio was the shape given. It’s in this sense that Cicero spoke of the arts by which young people are ‘informed in their humanity’, and in which the Church Father Tertullian calls Moses populi informator, the shaper of the people.

From the Middle Ages onwards, this form-giving aspect of information slowly gave way, and it acquired a different, more earthy complexion. For the medieval scholastics, it became a quintessentially human act; information was about the manipulation of matter already on Earth, as distinct from the singular creativity of the Creator Himself. Thomas Aquinas said that the intellect and the virtues – but also the senses – needed to be informed, enriched, stimulated. The scientific revolution went on to cement these perceptible and grounded features of information, in preference to its more divine and form-giving aspects. When we read Francis Bacon on ‘the informations of the senses’, or hear John Locke claim that ‘our senses inform us’, we feel like we’re on familiar ground. As the scholar John Durham Peters wrote in 1988: ‘Under the tutelage of empiricism, information gradually moved from structure to stuff, from form to substance, from intellectual order to sensory impulses.’

It was as the study of the senses that a dedicated science of information finally began to stir. While Lord Kelvin was timing the speed of telegraph signals in the 1850s – using mechanisms rigged with magnets, mirrors, metal coils and cocoon silk – Hermann von Helmholtz was electrifying frog muscles to test the firing of animal nerves. And as information became electric, the object of study became the boundary between the hard world of physics and the elusive nature of the messages carried in wires.

In the first half of the 20th century, the torch passed to Bell Labs in the United States, the pioneering communications company that traced its origins to Alexander Graham Bell. Shannon joined in 1941, to work on fire control and cryptography during the Second World War. Outside of wartime, most of the Labs’ engineers and scientists were tasked with taking care of the US’ transcontinental telephone and telegraph network. But the lines were coming under strain as the human appetite for interaction pushed the Bell system to go further and faster, and to transmit messages of ever-higher quality. A fundamental challenge for communication-at-a-distance was ‘noise’, unintended fluctuations that could distort the quality of the signal at some point between the sender and receiver. Conventional wisdom held that transmitting information was like transmitting power, and so the best solution was essentially to shout more loudly – accepting noise as a fact of life, and expensively and precariously pumping out a more powerful signal.
The information value of a message depends on the range of alternatives killed off in its choosing
But some people at the Labs thought the solution lay elsewhere. Thanks to its government-guaranteed monopoly, the Labs had the leeway to invest in basic theoretical research, even if the impact on communications technology lay many years in the future. As the engineer Henry Pollak told us in an interview: ‘When I first came, there was the philosophy: look, what you’re doing might not be important for 10 years or 20 years, but that’s fine, we’ll be there then.’ As a member of the Labs’ free-floating mathematics group, after the war Shannon found that he could follow his curiosity wherever it led: ‘I had freedom to do anything I wanted from almost the day I started. They never told me what to work on.’...MUCH MORE
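A quick aside on the pull-quote above, that a message's information value depends on the range of alternatives killed off in its choosing. Shannon made this precise: picking one outcome from N equally likely alternatives conveys log2(N) bits. A minimal sketch (ours, not from the article):

```python
import math

def bits(num_alternatives: int) -> float:
    """Information, in bits, gained by selecting one outcome
    from `num_alternatives` equally likely alternatives."""
    return math.log2(num_alternatives)

# A fair coin flip kills off one of 2 alternatives: exactly 1 bit.
print(bits(2))   # 1.0

# Naming one card from a 52-card deck rules out far more,
# so it carries more information: about 5.7 bits.
print(round(bits(52), 2))
```

This is why a rare or surprising message is "worth" more, in Shannon's sense, than a predictable one: it eliminates a larger space of alternatives.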
Last year we linked to "Claude Shannon, the Las Vegas Shark," which highlighted some of his dealings with one of the 'quantfathers', Ed Thorp:

The father of information theory built a machine to game roulette, then abandoned it.
Many of Claude Shannon’s off-the-clock creations were whimsical—a machine that made sarcastic remarks, for instance, or the Roman numeral calculator. Others created by the Massachusetts Institute of Technology professor and father of information theory showed a flair for the dramatic and dazzling: the trumpet that spit flames or the machine that solved Rubik’s cubes. Still other devices he built anticipated real technological innovations by more than a generation. One in particular stands out, not just because it was so far ahead of its time, but because of just how close it came to landing Shannon in trouble with the law—and the mob.

Long before the Apple Watch or the Fitbit, what was arguably the world’s first wearable computer was conceived by Ed Thorp, then a little-known graduate student in physics at the University of California, Los Angeles. Thorp was the rare physicist who felt at home with both Vegas bookies and bookish professors. He loved math, gambling, and the stock market, roughly in that order. The tables and the market he loved for the challenge: Could you create predictability out of seeming randomness? What could give one person an edge in games of chance? Thorp wasn’t content just pondering these questions; like Shannon, he set out to find and build answers.

In 1960, Thorp was a junior professor at MIT. He had been working on a theory for playing blackjack, the results of which he hoped to publish in the Proceedings of the National Academy of Sciences. Shannon was the only academy member in MIT’s mathematics department, so Thorp sought him out. “The secretary warned me that Shannon was only going to be in for a few minutes, not to expect more, and that he didn’t spend time on subjects (or people) that didn’t interest him. Feeling awed and lucky, I arrived at Shannon’s office to find a thinnish, alert man of middle height and build, somewhat sharp featured,” Thorp recalled.

Thorp had piqued Shannon’s interest with the blackjack paper, to which Shannon recommended only a change of title, from “A Winning Strategy for Blackjack” to the more mundane “A Favorable Strategy for Twenty-One,” the better to win over the academy’s staid reviewers. The two shared a love of putting math in unfamiliar territory in search of chance insights. After Shannon “cross-examined” Thorp about his blackjack paper, he asked, “Are you working on anything else in the gambling area?”

Thorp confessed. “I decided to spill my other big secret and told him about roulette. Ideas about the project flew between us. Several exciting hours later, as the wintery sky turned dusky, we finally broke off with plans to meet again on roulette.” As one writer, William Poundstone, put it, “Thorp had inadvertently set one of the century’s great minds on yet another tangent.”

Thorp was immediately invited to Shannon’s house. The basement, Thorp remembered, was “a gadgeteer’s paradise. ... There were hundreds of mechanical and electrical categories, such as motors, transistors, switches, pulleys, gears, condensers, transformers, and on and on.” Thorp was in awe: “Now I had met the ultimate gadgeteer.”

It was in this tinkerer’s laboratory that they set out to understand how roulette could be gamed, ordering “a regulation roulette wheel from Reno for $1,500,” a strobe light, and a clock whose hand revolved once per second. Thorp was given inside access to Shannon in all his tinkering glory:...MUCH MORE
If interested, see also 2012's "How did Ed Thorp Win in Blackjack and the Stock Market?" and a couple more from 2017:
A Review of Garry Kasparov’s Deep Thinking: Where Machine Intelligence Ends and Human Creativity Begins
"How Information Got Re-Invented"