Friday 30 December 2016

The Information: A History, a Theory, a Flood



The library will endure; it is the universe. As for us, everything has not been written; we are not turning into phantoms. We walk the corridors, searching the shelves and rearranging them, looking for lines of meaning amid leagues of cacophony and incoherence, reading the history of the past and of the future, collecting our thoughts and collecting the thoughts of others, and every so often glimpsing mirrors, in which we may recognize creatures of the information.
The first time I saw Jurassic Park, I was surprised by the inclusion of Jeff Goldblum's character: why would the initial outsider tour of a dinosaur safari park require the presence of a Chaos Theorist? I decided his character was a popsciencey “dissenting expert” trope; “the bad boy of mathematics” being present to make author Michael Crichton look smarter than the viewer (or the reader of the original book) and to give the Paleontologists in the story someone to argue with. But in light of James Gleick's The Information, it's that Jeff Goldblum character I keep coming back to: this is yet another book I've read recently that takes as a given that the universe is simply a giant algorithm and humans are merely the machines evolved to compute it. If all of “the information” that we process can be expressed in binary digits, then all of life and all of experience is simply and purely math. As Gleick sketches out in this history of Information Theory, mathematicians have been grasping toward this idea for the past century and a half, and throughout this time, there have always been Jeff Goldblum-type characters in the background, looking like lunatics while speaking the truth. If all of experience can be reduced to math, it follows that math can be used to predict experience, and ultimately, the Chaos Theorist was the most important member of the safari (helpfully reinforced in Crichton's story by having Jeff Goldblum be right about everything from the start). Today, we all benefit from the practical applications of Information Theory – with our laptops and smartphones and the dawn of quantum computing – but we should remember that in the shadows live these mathematician-philosophers who not only advance technology, but who are in effect changing the way that human brains work.

I picked up The Information after having so enjoyed Gleick's latest book, Time Travel. What I most liked about that book were the intersections Gleick created between hard science, philosophy, and literature, and that is essentially the ethos of this earlier work, too. I do not have a math brain, but Gleick was able to lead me through the equations in the history of Information Theory and, with an engaging prose style, always bring the focus back to the fascinating people behind the evolving hypotheses. While discussing information as a flood, Gleick quotes Lewis Carroll's satirical bit about the drawbacks of creating a 1:1 scale map, and that's useful for me to remember as I attempt to record all the ideas in this book that interested me; the impulse is to copy and paste the whole thing.

Gleick begins by describing the earliest modes of long-distance communication faster than a human or animal could carry a message, and I was most fascinated by the idea that both African drumming (far richer and more lyrical than the later Morse code, employing embedded redundancies much like those found later in the gene itself) and a string of bonfires are essentially binary systems (the drums communicated through two distinct tones, and each fire was either lit or unlit). Gleick covers the beginning of writing and the mental leap it took to evolve from hieroglyph-type pictograms to the metalanguage of substituting symbols for words and then for the sounds that make up words. This leads to the Greek philosophers and my first wow moment: Socrates never wrote anything down himself and distrusted the new technology of writing, and his warning survives only because his protege Plato recorded it: that removing the need to remember facts would lessen our collective wisdom (and doesn't that sound familiar today?) When Plato's own protege, Aristotle, began to assemble knowledge systematically in written form, he was essentially inventing the way we now think (and I know that I could have learned, or even intuited, this before, but I didn't: the reason Aristotle became the father of all sciences and all theories of thought is that he was the first [Western] person to ever write it all down).

Logic might be imagined to exist independent of writing – syllogisms can be spoken as well as written – but it did not. Speech is too fleeting to allow for analysis. Logic descended from the written word, in Greece as well as India and China, where it developed independently. Logic turns the act of abstraction into a tool for determining what is true and what is false: truth can be discovered in words alone, apart from concrete experience. Logic takes its form in chains: sequences whose members connect one to another. Conclusions follow from premises. These require a degree of constancy. They have no power unless people can examine and evaluate them. In contrast, an oral narrative proceeds by accretion, the words passing by in a line of parade past the viewing stand, briefly present and then gone, interacting with one another via memory and association.
Because of Aristotle, we are now people who think in terms of categories (a trait found even in illiterate people living in a literate society, but not in wholly oral societies), but it wasn't until the invention of the printing press that knowledge became standardized.
Thomas Hobbes, in the seventeenth century, resisted his era’s new-media hype: “The invention of printing, though ingenious, compared with the invention of letters is no great matter.” Up to a point, he was right. Every new medium transforms the nature of human thought. In the long run, history is the story of information becoming aware of itself.
Not long after Hobbes came Newton, and while I knew that he invented calculus and ignited the Scientific Revolution in England, I hadn't really appreciated that Newton also invented the language of physics, being the first to use terms like “mass” and “force” in their modern sense; pretty much doing for physics what Aristotle had earlier done for written thought. The next parts I found interesting involve the evolution of long-distance communication, and especially the optical telegraph towers – looking somewhat like windmills, they spoke a kind of semaphore that could be seen and relayed from tower to tower – which was the height of communication right up until it wasn't: with the advent of electrical telegraphy and Morse code, which was the height of communication right up until it wasn't: with the advent of telephones, which were the height... And in the background there are all these mathematicians, looking like lunatics while speaking the truth, and I especially enjoyed the image of Charles Babbage and his steampunk computer, and his contemporary, Ada Lovelace (Lord Byron's only legitimate daughter and, like Babbage, a genius thinker born before her time; forced to express her mathematical insights as philosophy because technology hadn't yet caught up to her vision). There are delightful smaller details, too: imagine the first English dictionary needing to begin by explaining what “alphabetical” means and how to use that system to find entries within the book itself. Moving into the twentieth century, I was fascinated by Gödel's incompleteness theorem, Maxwell's Demon, the Turing Machine, Wiener's cybernetics, Claude Shannon's Information Theory, and Richard Dawkins's selfish gene; for what is a human if not a means for a gene to transmit its information?
The macromolecules of organic life embody information in an intricate structure. A single hemoglobin molecule comprises four chains of polypeptides, two with 141 amino acids and two with 146, in strict linear sequence, bonded and folded together. Atoms of hydrogen, oxygen, carbon, and iron could mingle randomly for the lifetime of the universe and be no more likely to form hemoglobin than the proverbial chimpanzees to type the works of Shakespeare. Their genesis requires energy; they are built up from simpler, less patterned parts, and the law of entropy applies. For earthly life, the energy comes as photons from the sun. The information comes via evolution.
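A quick aside of my own, not something from the book: what finally made Shannon's idea click for me is that “information” can literally be counted in bits, and it doesn't matter whether those bits are drum tones, bonfires, or voltages on a wire. Here is a toy sketch in Python, purely for illustration – the phrase and the numbers are mine, not Gleick's:

# Toy sketch: Shannon entropy estimates the average number of bits -- two drum
# tones, a bonfire lit or unlit, a 0 or a 1 -- needed per symbol to encode a
# message, given how often each symbol occurs in it.
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average bits per symbol implied by the symbol frequencies in the message."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

phrase = "the information"
bits_per_symbol = shannon_entropy(phrase)                # about 3.4 bits for this phrase
print(f"{bits_per_symbol:.2f} bits per symbol")
print(f"~{bits_per_symbol * len(phrase):.0f} bits for the whole phrase")   # roughly 51 bits

Two symbols are enough to carry any message at all, which is why the drummers and the bonfire-keepers were already doing information theory; the redundancy built into the drum language is simply the price of getting a message through noise intact, a trade-off Shannon later made mathematically precise.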
So here's the tl;dr: I have read other reviews that lay out all the ways in which Gleick supposedly stressed the wrong ideas or misinterpreted others, but as a primer on the history of Information Theory, I found The Information to be an accessible and fascinating read. As a non-expert, I have no idea if I understood this book down to its bones, but I do now appreciate why Jeff Goldblum's character was in Jurassic Park.