The Information: A History, a Theory, a Flood

by James Gleick

Richard Dawkins’ fundamental contribution to science, says Gleick, is the idea that “Genes, not organisms, are the true units of natural selection” (Kindle Locations 5328-5329). He cites The Selfish Gene, which I really ought to get around to reading again soon. But then he takes it someplace I’m not sure Dawkins intended (although given Dawkins’ conclusion that memes act like genes in the real world, maybe he did…), and suggests that genes are not in fact the strings of base pairs seen under the microscope. They are ideas. After all, Gleick says, “There is no gene for long legs; there is no gene for a leg at all. To build a leg requires many genes…[and what about] more complex qualities—genes for obesity or aggression or nest building or braininess or homosexuality. Are there genes for such things? Not if a gene is a particular strand of DNA that expresses a protein. Strictly speaking, one cannot say there are genes for almost anything—not even eye color. Instead, one should say that differences in genes tend to cause differences in phenotype (the actualized organism)” (Kindle Locations 5414-5421).

So what are genes? The information? Or the observed changes in phenotypes that result? Gleick concludes, “The gene is not an information-carrying macromolecule. The gene is the information” (Kindle Location 5462). But what we observe depends on our focus, our values. So once again it’s a confusion between information as signals and information as meaningful data we care about. Aside: have memes already jumped the shark?

In his section on probability and entropy, Gleick mentions that an infinitely long random string will ultimately include every possible combination. “Given a long enough random string, every possible short-enough substring will appear somewhere. One of them will be the combination to the bank vault. Another will be the encoded complete works of Shakespeare.
But they will not do any good, because no one can find them” (Kindle Locations 5814-5816). But isn’t that the point, if we end up saying the universe is information (which is where this is going)? Because Shakespeare DID find them…

“Researchers have established that human intuition is useless both in predicting randomness and in recognizing it. Humans drift toward pattern willy-nilly” (Kindle Locations 5819-5820). See Rosencrantz and Guildenstern Are Dead. Pi is not random, because it is computable. But if you took, say, the digits between positions 1,000 and 2,000,000 of the string, wouldn’t THAT be a random number? So, in the real world, where context and completeness are not always discernible, don’t we get a lot of apparent randomness that might well be orderly? And that’s not even counting the mysteriousness produced by chaos and quantum indeterminacy. You just can’t get away from mystery.

“Given an arbitrary string of a million digits,” Gleick says, “a mathematician knows that it is almost certainly random, complex, and patternless—but cannot be absolutely sure.” He continues, “A chaotic stream of information may yet hide a simple algorithm. Working backward from the chaos to the algorithm may be impossible” (Kindle Locations 6070-6095). You can’t decompile the program, or unstir the coffee (also from Tom Stoppard).

Gleick discusses compression, which at its heart is a process of finding patterns that can be expressed in fewer bits than the original message. But again, we’re operating on something that is already an abstraction. It’s a photograph, or a digitized sound, or a string of text. So all we’re talking about is human perception and language efficiency. Lossy compression is the key to human consciousness. We can’t deal with the reality all around us, so we filter it. This is old philosophy.
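Compression as pattern-finding is easy to see in a few lines of Python (my sketch, not Gleick’s): a repetitive message shrinks dramatically, while bytes from a seeded random generator barely compress at all, which is exactly the mathematician’s working test for patternlessness.

```python
import random
import zlib

# A highly patterned message: one short motif repeated 2,500 times.
patterned = b"ABAB" * 2500                      # 10,000 bytes

# "Random" bytes (seeded so the run is reproducible): no motif to exploit.
random.seed(42)
noisy = bytes(random.randrange(256) for _ in range(10_000))

for label, data in [("patterned", patterned), ("random", noisy)]:
    compressed = zlib.compress(data, 9)         # level 9 = max effort
    print(f"{label}: {len(data)} bytes -> {len(compressed)} bytes")
```

The patterned message collapses to a few dozen bytes; the random one actually comes out slightly larger than it went in, because there is no shorter description of it than itself.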
John Archibald Wheeler said “It from Bit”: “Every it — every particle, every field of force, even the space-time continuum itself — derives its function, its meaning, its very existence … from bits” (Kindle Locations 6350-6351). But the bits are answers to yes-no questions. They require the questions in order to have any meaning. So once again, we’re talking not about reality, but about human perception of reality. It’s David Hume all over again.

Finally, at the end of it all, Gleick admits, “The birth of information theory came with its ruthless sacrifice of meaning — the very quality that gives information its value and its purpose” (Kindle Locations 7462-7463). Yes! Finally!! So the obvious thing to do at this point is to regain subjectivity. At long last we realize “words are not themselves ideas, but merely strings of ink marks; we see that sounds are nothing more than waves. In a modern age without an Author looking down on us from heaven, language is not a thing of definite certainty, but infinite possibility; without the comforting illusion of meaningful order we have no choice but to stare into the face of meaningless disorder…” (Kindle Locations 7505-7507). And make our own meaning.
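A postscript on Wheeler’s yes-no questions, since they have a concrete everyday form: the guessing game. In this little sketch (my illustration, not Gleick’s), pinning down one of sixteen possibilities always takes exactly log2(16) = 4 binary answers; that is what it means to say the answer carries four bits.

```python
# Wheeler's "It from Bit" in miniature: each bit is the answer to one
# yes/no question, and each answer halves the remaining possibilities.
def identify(secret, lo=1, hi=16):
    """Find `secret` in lo..hi by asking yes/no questions; count them."""
    questions = 0
    while lo < hi:
        mid = (lo + hi) // 2
        questions += 1          # one question asked = one bit gained
        if secret <= mid:       # the question: "is it <= mid?" (yes/no)
            hi = mid
        else:
            lo = mid + 1
    return lo, questions

for s in (1, 7, 16):
    print(identify(s))          # finds s, and always in exactly 4 questions
```

Notice the “it” (the number) is fully determined by the string of answers, but each answer only means anything relative to the question asked, which is the point above about bits requiring questions.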