[personal profile] fiefoe
James Gleick covered a huge amount of ground in this book and I was very much with it for about 2/3 of it, testament to the author's appealing and illuminating narrative style. Sadly the quantum stuff in the later parts taxed my poor comprehension too much.
  • “A Mathematical Theory of Communication”—and the message was hard to summarize. But it was a fulcrum around which the world began to turn. Like the transistor, this development also involved a neologism: the word bit, chosen in this case not by committee but by the lone author, a thirty-two-year-old named Claude Shannon. The bit now joined the inch, the pound, the quart, and the minute as a determinate quantity—a fundamental unit of measure.
  • one could see a chain of abstraction and conversion: the dots and dashes representing letters of the alphabet; the letters representing sounds, and in combination forming words; the words representing some ultimate substrate of meaning, perhaps best left to philosophers.
  • Shannon’s theory made a bridge between information and uncertainty; between information and entropy; and between information and chaos... Information processing was born, along with information storage and information retrieval. People began to name a successor to the Iron Age and the Steam Age. <> We can see now that information is what our world runs on: the blood and the fuel, the vital principle.
  • Evolution itself embodies an ongoing exchange of information between organism and environment. <> “The information circle becomes the unit of life,” says Werner Loewenstein after thirty years spent studying intercellular communication. He reminds us that information means something deeper now: “It connotes a cosmic principle of organization and order, and it provides an exact measure of that.” The gene has its cultural analog, too: the meme.
  • as scientists finally come to understand information, they wonder whether it may be primary: more fundamental than matter itself. They suggest that the bit is the irreducible kernel and that information forms the very core of existence. Bridging the physics of the twentieth and twenty-first centuries, John Archibald Wheeler, the last surviving collaborator of both Einstein and Bohr, put this manifesto in oracular monosyllables: “It from Bit.”
------ch1
  • Instead of “don’t be afraid,” they would say, “Bring your heart back down out of your mouth, your heart out of your mouth, get it back down from there.” The drums generated fountains of oratory. This seemed inefficient.
  • That result was a technology much sought in Europe: long-distance communication faster than any traveler on foot or horseback. Through the still night air over a river, the thump of the drum could carry six or seven miles. Relayed from village to village, messages could rumble a hundred miles or more in a matter of an hour.
  • In the Aeschylus version, Clytemnestra gets the news of the fall of Troy that very night, four hundred miles away in Mycenae... The meaning of the message had, of course, to be prearranged, effectively condensed into a single bit. A binary choice, something or nothing: the fire signal meant something, which, just this once, meant “Troy has fallen.” To transmit this one bit required immense planning, labor, watchfulness, and firewood. Many years later, lanterns in Old North Church likewise sent Paul Revere a single precious bit, which he carried onward, one binary choice: by land or by sea.
  • The annals of invention offered scarcely any precedent. How to convert information from one form, the everyday language, into another form suitable for transmission by wire taxed his ingenuity more than any mechanical problem of the telegraph. It is fitting that history attached Morse’s name to his code, more than to his device.
  • In solving the enigma of the drums, Carrington found the key in a central fact about the relevant African languages. They are tonal languages... When they transliterated the words they heard into the Latin alphabet, they disregarded pitch altogether. In effect, they were color-blind.
  • the drum language went a difficult step further. It employed tone and only tone. It was a language of a single pair of phonemes, a language composed entirely of pitch contours... So in mapping the spoken language to the drum language, information was lost. The drum talk was speech with a deficit... The extra drumbeats, far from being extraneous, provide context. Every ambiguous word begins in a cloud of possible alternative interpretations; then the unwanted possibilities evaporate.
  • The resemblance to Homeric formulas—not merely Zeus, but Zeus the cloud-gatherer...—is no accident. In an oral culture, inspiration has to serve clarity and memory first. The Muses are the daughters of Mnemosyne.
  • Neither Kele nor English yet had words to say, allocate extra bits for disambiguation and error correction. Yet this is what the drum language did.
  • Ralph Hartley, even had a relevant-looking formula: H = n log s, where H is the amount of information, n is the number of symbols in the message, and s is the number of symbols available in the language.
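Hartley's formula is simple enough to check by hand. A quick sketch of my own (the function name and the base-2 choice are mine; using log base 2 makes H come out in bits):

```python
import math

def hartley(n, s):
    """H = n * log2(s): information in n symbols drawn from an s-symbol alphabet."""
    return n * math.log2(s)

# A three-letter message over the 26-letter alphabet:
print(hartley(3, 26))  # ≈ 14.1 bits
```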
------ch2
  • Language is not a technology, no matter how well developed and efficacious. It is not best seen as something separate from the mind; it is what the mind does. “Language in fact bears the same relationship to the concept of mind that legislation bears to the concept of parliament,” says Jonathan Miller: “it is a competence forever bodying itself in a series of concrete performances.”
  • Plato (channeling the nonwriter Socrates) warned that this technology meant impoverishment: <> For this invention will produce forgetfulness in the minds of those who learn to use (writing)
  • The power of this first artificial memory was incalculable: to restructure thought, to engender history... whereas the total vocabulary of any oral language measures a few thousand words, the single language that has been written most widely, English, has a documented vocabulary of well over a million words... Each word has a provenance and a history that melts into its present life. <> With words we begin to leave traces behind us like breadcrumbs: memories in symbols for others to follow.
  • The writing system at the opposite extreme took the longest to emerge: the alphabet, one symbol for one minimal sound. The alphabet is the most reductive, the most subversive of all scripts... The alphabet was invented only once... the same original ancestor, which arose near the eastern littoral of the Mediterranean Sea, sometime not much before 1500 BCE, in a region that became a politically unstable crossroads of culture, covering Palestine, Phoenicia, and Assyria.
  • Greece had not needed the alphabet to create literature—a fact that scholars realized only grudgingly, beginning in the 1930s. That was when Milman Parry, a structural linguist... proposed that the Iliad and the Odyssey not only could have been but must have been composed and sung without benefit of writing.
  • Conclusions follow from premises. These require a degree of constancy. They have no power unless people can examine and evaluate them. In contrast, an oral narrative proceeds by accretion, the words passing by in a line of parade past the viewing stand, briefly present and then gone, interacting with one another via memory and association. There are no syllogisms in Homer. Experience is arranged in terms of events, not categories. Only with writing does narrative structure come to embody sustained rational argument.
  • Logic implicates symbolism directly: things are members of classes; they possess qualities, which are abstracted and generalized. Oral people lacked the categories that become second nature even to illiterate individuals in literate cultures
  • The Babylonians computed linear equations, quadratic equations, and Pythagorean numbers long before Pythagoras.
  • “This is the procedure” was a standard closing, like a benediction, and for Knuth redolent with meaning. In the Louvre he found a “procedure” that reminded him of a stack program on a Burroughs B5500.
  • Putting pen to parchment in this time of barest literacy, (John of Salisbury) tried to examine the act of writing and the effect of words: “Frequently they speak voicelessly the utterances of the absent.” The idea of writing was still entangled with the idea of speaking. The mixing of the visual and the auditory continued to create puzzles, and so also did the mixing of past and future: utterances of the absent. Writing leapt across these levels. <> Every user of this technology was a novice... (They found it awkward to keep tenses straight, like voicemail novices leaving their first messages circa 1980.)
  • Writers loved to discuss writing, far more than bards ever bothered to discuss speech. They could see the medium and its messages, hold them up to the mind’s eye for study and analysis. And they could criticize it—for from the very start, the new abilities were accompanied by a nagging sense of loss... Unfortunately the written word stands still.
------ch3
  • In fact, few had any concept of “spelling”—the idea that each word, when written, should take a particular predetermined form of letters. The word cony (rabbit) appeared variously
  • He recognized another motivating factor: the quickening pace of commerce and transportation made other languages a palpable presence, forcing an awareness of the English language as just one among many.
  • Of all the world’s languages English was already the most checkered, the most mottled, the most polygenetic. Its history showed continual corruption and enrichment from without. Its oldest core words, the words that felt most basic, came from the language spoken by the Angles, Saxons, and Jutes, Germanic tribes that crossed the North Sea into England in the fifth century, pushing aside the Celtic inhabitants.
  • his dates of birth and death are uncertain; he is said to have visited London in 1611 and there to have seen a dead crocodile; and little else is known. His Expositour appeared in 1616
  • Not until the OED, though, did lexicography attempt to reveal the whole shape of a language across time. The OED becomes a historical panorama.
  • Other new words appear without any corresponding innovation in the world of real things. They crystallize in the solvent of universal information.
------ch4
  • in 1871 the Times obituarist declared (Charles Babbage) “one of the most active and original of original thinkers” but seemed to feel he was best known for his long, cranky crusade against street musicians and organ-grinders... Examining the economics of the mail, he pursued a counterintuitive insight, that the significant cost comes not from the physical transport of paper packets but from their “verification”—the calculation of distances and the collection of correct fees—and thus he invented the modern idea of standardized postal rates.
  • Babbage’s machine was designed to manufacture vast quantities of a certain commodity. The commodity was numbers. The engine opened a channel from the corporeal world of matter to a world of pure abstraction.
  • In any case the computers, being human, made errors, so the same work was often farmed out twice for the sake of redundancy. (Unfortunately, being human, computers were sometimes caught saving themselves labor by copying from one another.)
  • (Napier’s:) “till by Logarithms it was found out: for otherwise it requires so many laborious extractions of roots, as will cost more paines than the knowledge of the thing is accompted to be worth.” Knowledge has a value and a discovery cost, each to be counted and weighed. <> Even this exciting discovery took several years to travel as far as Johannes Kepler, who employed it in perfecting his celestial tables in 1627. “A Scottish baron has appeared on the scene (his name I have forgotten) who has done an excellent thing,” Kepler wrote a friend, “transforming all multiplication and division into addition and subtraction.”
  • He joined with two other promising students, John Herschel and George Peacock, to form what they named the Analytical Society, “for the propagation of d’s” and against “the heresy of dots,” or as Babbage said, “the Dot-age of the University.” (He was pleased with his own “wicked pun.”) In their campaign to free the calculus from English dotage,
  • It replaced muscles everywhere. It became a watchword: people on the go would now “steam up” or “get more steam on” or “blow off steam.” Benjamin Disraeli hailed “your moral steam which can work the world.”
  • To decide whether a machine qualified as automatic, he needed to ask a question that would have been simpler if the words input and output had been invented: “Whether, when the numbers on which it is to operate are placed in the instrument, it is capable of arriving at its result by the mere motion of a spring, a descending weight, or any other constant force.” This was a farsighted standard.
  • the drawings covering more than 400 square feet. The level of complexity was confounding. Babbage solved the problem of adding many digits at once by separating the “adding motions” from the “carrying motions” and then staggering the timing of the carries. The addition would begin with a rush of grinding gears, first the odd-numbered columns of dials, then the even columns. Then the carries would recoil across the rows. To keep the motion synchronized, parts of the machine would need to “know” at critical times that a carry was pending. The information was conveyed by the state of a latch. For the first time, but not the last, a device was invested with memory.
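Just to make the trick concrete for myself, a toy software version of the staggered scheme (entirely my own sketch; Babbage did this in brass, not code): add every column first, latch the pending carries, then sweep the carries in a second pass.

```python
# Digits are stored least-significant first; carries off the top are dropped.
def staggered_add(a_digits, b_digits):
    sums, latches = [], []
    for a, b in zip(a_digits, b_digits):   # the "adding motions"
        s = a + b
        sums.append(s % 10)
        latches.append(s >= 10)            # a latch remembers a pending carry
    for i, pending in enumerate(latches):  # the "carrying motions"
        if pending:
            j = i + 1
            while j < len(sums):
                sums[j] += 1
                if sums[j] < 10:
                    break
                sums[j] = 0
                j += 1
    return sums

print(staggered_add([9, 9, 1], [1, 0, 0]))  # 199 + 1 -> [0, 0, 2]
```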
  • Lardner’s own explanation of “carrying,” for example, was epic. A single isolated instant of the action involved a dial, an index, a thumb, an axis, a trigger, a notch, a hook, a claw, a spring, a tooth, and a ratchet wheel:
  • He was a mathematical raconteur—that was no contradiction, in this time and place. Lyell reported approvingly that he “jokes and reasons in high mathematics.”
  • She was a prodigy, clever at mathematics, encouraged by tutors, talented in drawing and music, fantastically inventive and profoundly lonely. When she was twelve, she set about inventing a means of flying... For a while she signed her letters “your very affectionate Carrier Pigeon.”
  • Inspiring him, as well, was the loom on display in the Strand, invented by Joseph-Marie Jacquard, controlled by instructions encoded and stored as holes punched in cards. <> What caught Babbage’s fancy was not the weaving, but rather the encoding, from one medium to another, of patterns... He meant the cogs and wheels to handle not just numbers but variables standing in for numbers. Variables were to be filled or determined by the outcomes of prior calculations, and, further, the very operations—such as addition or multiplication—were to be changeable, depending on prior outcomes.
  • As he traversed the rails, he realized that a peculiar danger of steam locomotion lay in its outracing every previous means of communication. Trains lost track of one another. Until the most regular and disciplined scheduling was imposed, hazard ran with every movement. One Sunday Babbage and Brunel, operating in different engines, barely avoided smashing into each other.
  • His obsessions belonged to no category—that is, no category yet existing. His true subject was information: messaging, encoding, processing. <> He took up two quirky and apparently unphilosophical challenges, which he himself noted had a deep connection one to the other: picking locks and deciphering codes.
------ch5
  • A telegraph message outraced him with his description (“in the garb of a kwaker, with a brown great coat on”—no Q’s in the English system); he was captured in London and hanged in March. The drama filled the newspapers for months. It was later said of the telegraph wires, “Them’s the cords that hung John Tawell.”
  • Some worried that the telegraph would be the death of newspapers, heretofore “the rapid and indispensable carrier of commercial, political and other intelligence,”… as the infallible Telegraph will contradict their falsehoods as fast as they can publish them.
  • In this manner, the telegraph may be made a vast national barometer, electricity becoming the handmaid of the mercury. <> This was a transformative idea. In 1854 the government established a Meteorological Office in the Board of Trade... Meteorologists began to understand that all great winds, when seen in the large, were circular, or at least “highly curved.”
  • It requires no small intellectual effort to realize that this is a fact that now is, and not one that has been. <> History (and history making) changed, too. The telegraph caused the preservation of quantities of minutiae concerning everyday life.
  • This process—the transferring of meaning from one symbolic level to another—already had a place in mathematics. In a way it was the very essence of mathematics. Now it became a familiar part of the human toolkit. Entirely because of the telegraph,
  • Two motivations went hand in glove: secrecy and brevity. Short messages saved money—that was simple. So powerful was that impulse that English prose style soon seemed to be feeling the effects. Telegraphic and telegraphese described the new way of writing. Flowers of rhetoric cost too much,... Almost immediately, newspaper reporters began to contrive methods for transmitting more information with fewer billable words. “We early invented a short-hand system, or cipher,”
  • That word, differences, must have struck Wilkins’s readers (few though they were) as an odd choice. But it was deliberate and pregnant with meaning. Wilkins was reaching for a conception of information in its purest, most general form... Any difference meant a binary choice. Any binary choice began the expressing of cogitations. Here, in this arcane and anonymous treatise of 1641, the essential idea of information theory poked to the surface of human thought, saw its shadow, and disappeared again for three hundred years.
------ch6
  • At the peak, American farmers, ranchers, and railroads laid more than a million miles a year. Taken collectively the nation’s fence wire formed no web or network, just a broken lattice... Unwilling to wait for the telephone companies to venture out from the cities, rural folk formed barbed-wire telephone cooperatives. They replaced metal staples with insulated fasteners. They attached dry batteries and speaking tubes and added spare wire to bridge the gaps.
  • In a deeply abstract way, these problems lined up. The peculiar artificial notation of symbolic logic, Boole’s “algebra,” could be used to describe circuits. <> This was an odd connection to make. The worlds of electricity and logic seemed incongruous. Yet, as Shannon realized, what a relay passes onward from one circuit to the next is not really electricity but rather a fact: the fact of whether the circuit is open or closed.
  • “It is possible to perform complex mathematical operations by means of relay circuits,” he wrote. “In fact, any operation that can be completely described in a finite number of steps using the words if, or, and, etc. can be done automatically with relays.” As a topic for a student in electrical engineering this was unheard of... it pointed to the future. Logic circuits. Binary arithmetic. Here in a master’s thesis by a research assistant was the essence of the computer revolution yet to come.
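The thesis's central identification is easy to restate in code (my sketch; Shannon worked with relay diagrams, not Python): switch contacts in series compute AND, contacts in parallel compute OR, and everything else follows by composition.

```python
# Series contacts pass current only if both switches are closed: AND.
def series(a, b):
    return a and b

# Parallel contacts pass current if either switch is closed: OR.
def parallel(a, b):
    return a or b

# Any logic function follows by composition, e.g. exclusive-or:
def xor(a, b):
    return parallel(series(a, not b), series(not a, b))

print([xor(a, b) for a in (False, True) for b in (False, True)])
# [False, True, True, False]
```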
  • Then, the processes of genetic combination and cross-breeding could be predicted by a calculus of additions and multiplications. It was a sort of road map, far abstracted from the messy biological reality.
  • This statement is false. The statement cannot be true, because then it is false. It cannot be false, because then it becomes true. It is neither true nor false, or it is both at once. But the discovery of this twisting, backfiring, mind-bending circularity does not bring life or language crashing to a halt—one grasps the idea and moves on—because life and language lack the perfection, the absolutes, that give them force.
  • S is the set of all sets that are not members of themselves. <> This version is known as Russell’s paradox. It cannot be dismissed as noise. <> To eliminate Russell’s paradox Russell took drastic measures. The enabling factor seemed to be the peculiar recursion within the offending statement: the idea of sets belonging to sets. Recursion was the oxygen feeding the flame.
  • Within PM, and within any consistent logical system capable of elementary arithmetic, there must always be such accursed statements, true but unprovable. Thus Gödel showed that a consistent formal system must be incomplete; no complete and consistent system can exist.
  • Gödel’s conclusion sprang not from a weakness in PM but from a strength. That strength is the fact that numbers are so flexible or “chameleonic” that their patterns can mimic patterns of reasoning.… PM’s expressive power is what gives rise to its incompleteness.
  • It may sound ridiculous to say that Bell and his successors were the fathers of modern commercial architecture—of the skyscraper... Suppose there was no telephone and every message had to be carried by a personal messenger? How much room do you think the necessary elevators would leave for offices?
  • in Lowell, Massachusetts... An epidemic of measles broke out, and Dr. Moses Greeley Parker worried that if the operators succumbed, they would be hard to replace. He suggested identifying each telephone by number.
  • exchanges everywhere discovered that boys were wild, given to clowning and practical jokes, and more likely to be found wrestling on the floor than sitting on stools to perform the exacting, repetitive work of a switchboard operator. A new source of cheap labor was available, and by 1881 virtually every telephone operator was a woman.
  • But new difficulties arose in understanding the action of networks; network theorems were devised to handle these mathematically. Mathematicians applied queuing theory to usage conflicts; developed graphs and trees to manage issues of intercity trunks and lines; and used combinatorial analysis to break down telephone probability problems.
  • Although the fluid may be at rest and the system in thermodynamic equilibrium, the irregular motion perseveres, as long as the temperature is above absolute zero. By the same token, he showed that random thermal agitation would also affect free electrons in any electrical conductor—making noise. <> Physicists paid little attention to the electrical aspects of Einstein’s work, and it was not until 1927 that thermal noise in circuits was put on a rigorous mathematical footing,
------ch7
  • Also, because every number corresponds to an encoded proposition of mathematics and logic, Turing had resolved Hilbert’s question about whether every proposition is decidable. He had proved that the Entscheidungsproblem has an answer, and the answer is no. An uncomputable number is, in effect, an undecidable proposition. <> So Turing’s computer—a fanciful, abstract, wholly imaginary machine—led him to a proof parallel to Gödel’s.
  • Turing encoded instructions as numbers. He encoded decimal numbers as zeroes and ones. Shannon made codes for genes and chromosomes and relays and switches. Both men applied their ingenuity to mapping one set of objects onto another: logical operators and electric circuits; algebraic functions and machine instructions. The play of symbols and the idea of mapping, in the sense of finding a rigorous correspondence between two sets, had a prominent place in their mental arsenals. This kind of coding was not meant to obscure but to illuminate:
  • The linguist Edward Sapir wrote of “symbolic atoms” formed by a language’s underlying phonetic patterns. “The mere sounds of speech,” he wrote in 1921, “are not the essential fact of language, which lies rather in the classification, in the formal patterning.… Language, as a structure, is on its inner face the mold of thought.” Mold of thought was exquisite.
  • (Shannon) established the scientific principles of cryptography. Among other things, he proved that perfect ciphers were possible—“perfect” meaning that even an infinitely long captured message would not help a code breaker... he also proved that the requirements were so severe as to make them practically useless. In a perfect cipher, all keys must be equally likely, in effect, a random stream of characters; each key can be used only once; and, worst of all, each key must be as long as the entire message.
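Those severe requirements describe the one-time pad. A minimal sketch of my own (XOR over bytes; Shannon's proof is about any such cipher, not this particular code):

```python
import secrets

def otp(message: bytes, key: bytes) -> bytes:
    # Perfect secrecy needs a truly random key, used only once,
    # and exactly as long as the message -- the "worst of all" condition.
    assert len(key) == len(message)
    return bytes(m ^ k for m, k in zip(message, key))

msg = b"TROY HAS FALLEN"
key = secrets.token_bytes(len(msg))  # random, message-length, single-use
ciphertext = otp(msg, key)
assert otp(ciphertext, key) == msg   # XOR is its own inverse
```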
  • Information is uncertainty, surprise, difficulty, and entropy: <> “Information is closely associated with uncertainty.” Uncertainty, in turn, can be measured by counting the number of possible messages. If only one message is possible, there is no uncertainty and thus no information.
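In the equally-likely case the measure is just the logarithm of the count of possible messages (sketch mine, following that definition):

```python
import math

# Uncertainty, in bits, when N messages are equally likely: log2(N).
# A single possible message means zero uncertainty, hence zero information.
def info_bits(num_possible_messages):
    return math.log2(num_possible_messages)

print(info_bits(1))  # 0.0
print(info_bits(8))  # 3.0
```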
  • Shannon sidestepped this problem by treating the signal as a string of discrete symbols. Now, instead of boosting the power, a sender can overcome noise by using extra symbols for error correction... Shannon considered the discrete case to be more fundamental in a mathematical sense as well. And he was considering another point: that treating messages as discrete had application not just for traditional communication but for a new and rather esoteric subfield, the theory of computing machines.
  • If English is 75 percent redundant, then a thousand-letter message in English carries only 25 percent as much information as one thousand letters chosen at random. Paradoxical though it sounded, random messages carry more information. The implication was that natural-language text could be encoded more efficiently for transmission or storage.
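The arithmetic behind that claim, as I read it (the 75 percent figure is from the text; the back-of-envelope is mine):

```python
import math

# A random letter from a 26-letter alphabet carries log2(26) ≈ 4.7 bits.
# At 75 percent redundancy, an English letter carries only a quarter of that.
bits_random = 1000 * math.log2(26)
bits_english = 0.25 * bits_random
print(round(bits_random))   # 4700
print(round(bits_english))  # 1175
```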
  • Whether removing redundancy to increase efficiency or adding redundancy to enable error correction, the encoding depends on knowledge of the language’s statistical structure. Information cannot be separated from probabilities. A bit, fundamentally, is always a coin toss.
------ch8
  • (Wiener) worried that it would devalue the human brain as factory machinery had devalued the human hand. <> He developed the human-machine parallels in a chapter titled “Computing Machines and the Nervous System.” First he laid out a distinction between two types of computing machines: analog and digital
  • By now Miller’s argument had become something in the nature of a manifesto. Recoding, he declared, “seems to me to be the very lifeblood of the thought processes.”... This was the beginning of the movement called the cognitive revolution in psychology, and it laid the foundation for the discipline called cognitive science, combining psychology, computer science, and philosophy.
  • Throughout the 1950s, Shannon remained the intellectual leader of the field he had founded. His research produced dense, theorem-packed papers, pregnant with possibilities for development, laying foundations for broad fields of study. What Marshall McLuhan later called the “medium” was for Shannon the channel, and the channel was subject to rigorous mathematical treatment. The applications were immediate and the results fertile: broadcast channels and wiretap channels, noisy and noiseless channels, Gaussian channels, channels with input constraints and cost constraints, channels with feedback and channels with memory, multiuser channels and multiaccess channels.
------ch9
  • IT WOULD BE AN EXAGGERATION TO SAY that no one knew what entropy meant. Still, it was one of those words. The rumor at Bell Labs was that Shannon had gotten it from John von Neumann, who advised him he would win every argument because no one would understand it. Untrue, but plausible. The word began by meaning the opposite of itself. It remains excruciatingly difficult to define.
  • The ability of a thermodynamic system to produce work depends not on the heat itself, but on the contrast between hot and cold… It is the unavailability of this energy—its uselessness for work—that Clausius wanted to measure. He came up with the word entropy, formed from Greek to mean “transformation content.”
  • Maxwell had first considered entropy as a subtype of energy: the energy available for work. On reconsideration, he recognized that thermodynamics needed an entirely different measure. Entropy was not a kind of energy or an amount of energy; it was, as Clausius had said, the unavailability of energy.
  • But there were too many to measure and calculate individually. Probability entered the picture. The new science of statistical mechanics made a bridge between the microscopic details and the macroscopic behavior.
  • The clever young Thomasina says in Tom Stoppard’s Arcadia, “You cannot stir things apart,” and this is precisely the same as “Time flows on, never comes back.” Such processes run in one direction only. Probability is the reason. What is remarkable—physicists took a long time to accept it—is that every irreversible process must be explained the same way. Time itself depends on chance, or “the accidents of life,”
  • The demon replaces chance with purpose. It uses information to reduce entropy. Maxwell never imagined how popular his demon would become, nor how long-lived. Henry Adams, who wanted to work some version of entropy into his theory of history, wrote to his brother Brooks in 1903, “Clerk Maxwell’s demon who runs the second law of Thermo-dynamics ought to be made President.”
  • Then the demon began to haunt Leó Szilárd, a very young Hungarian physicist with a productive imagination who would later conceive the electron microscope and, not incidentally, the nuclear chain reaction.
  • for them, in a simpler time, it was as if the information belonged to a parallel universe, an astral plane, not linked to the universe of matter and energy, particles and forces, whose behavior they were learning to calculate. <> But information is physical. Maxwell’s demon makes the link. The demon performs a conversion between information and energy, one particle at a time... Every time the demon makes a choice between one particle and another, it costs one bit of information. The payback comes at the end of the cycle, when it has to clear its memory (Szilárd
  • To the physicist, entropy is a measure of uncertainty about the state of a physical system: one state among all the possible states it can be in... To the information theorist, entropy is a measure of uncertainty about a message: one message among all the possible messages that a communications source can produce.
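Both uses come down to the same formula over a set of possibilities (Shannon's H; the code sketch is mine):

```python
import math

# H = -sum(p * log2(p)) over the possible states/messages.
# Certainty gives 0; a uniform choice among N possibilities gives log2(N).
def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.25] * 4))  # 2.0 -- one of four equally likely outcomes
```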
  • (Schrödinger) mocked this idea—vis viva or entelechy—and he also mocked the popular notion that organisms “feed upon energy.” Energy and matter were just two sides of a coin, and anyway one calorie is as good as another. No, he said: the organism feeds upon negative entropy.
  • The mathematical reckoning of order and chaos remains more ticklish, the relevant definitions being subject to feedback loops of their own.
  • Schrödinger felt something was missing. Crystals are too orderly—built up in “the comparatively dull way of repeating the same structure in three directions again and again.”... Life must depend on a higher level of complexity, structure without predictable repetition, he argued. He invented a term: aperiodic crystals. This was his hypothesis: We believe a gene—or perhaps the whole chromosome fiber—to be an aperiodic solid.
------ch10
  • For the next decade, the struggle to understand the genetic code consumed a motley assortment of the world’s great minds, many of them, like Gamow, lacking any useful knowledge of biochemistry. For Watson and Crick, the initial problem had depended on a morass of specialized particulars: hydrogen bonds, salt linkages, phosphate-sugar chains with deoxyribofuranose residues. They had to learn how inorganic ions could be organized in three dimensions; they had to calculate exact angles of chemical bonds. They made models out of cardboard and tin plates. But now the problem was being transformed into an abstract game of symbol manipulation.
  • An unexpected cast of scientists joined the hunt: Max Delbrück, an ex-physicist now at Caltech in biology; his friend Richard Feynman, the quantum theorist; Edward Teller, the famous bomb maker; another Los Alamos alumnus, the mathematician Nicholas Metropolis; and Sydney Brenner, who joined Crick at the Cavendish. <> They all had different coding ideas.
  • They thought protein synthesis couldn’t be a simple matter of coding from one thing to another; that sounded too much like something a physicist had invented. It didn’t sound like biochemistry to them.… So there was a certain resistance to simple ideas like three nucleotides’ coding an amino acid; people thought it was rather like cheating.
  • Douglas Hofstadter was the first to make this connection explicitly, in the 1980s: “between the complex machinery in a living cell that enables a DNA molecule to replicate itself and the clever machinery in a mathematical system that enables a formula to say things about itself.” In both cases he saw a twisty feedback loop. “Nobody had ever in the least suspected that one set of chemicals could code for another set,”
  • To them it looked like a classic problem in Shannon coding theory: “the sequence of nucleotides as an infinite message, written without punctuation, from which any finite portion must be decodable into a sequence of amino acids by suitable insertion of commas.” They constructed a dictionary of codes. They considered the problem of misprints.
  • When the genetic code was solved, in the early 1960s, it turned out to be full of redundancy. Much of the mapping from nucleotides to amino acids seemed arbitrary—not as neatly patterned as any of Gamow’s proposals. Some amino acids correspond to just one codon, others to two, four, or six.
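The redundancy is easy to see once the code is laid out as a lookup table. A toy translator reading the message in non-overlapping triplets, as the 1960s solution settled it (the table is a small excerpt of the standard genetic code, chosen to show the one-, two-, and four-codon cases; the function is my own sketch):

```python
# A small excerpt of the standard genetic code (DNA coding-strand triplets),
# illustrating the uneven redundancy of the codon-to-amino-acid mapping.
CODONS = {
    "ATG": "Met", "TGG": "Trp",                              # one codon each
    "GAA": "Glu", "GAG": "Glu",                              # two codons for glutamate
    "GGT": "Gly", "GGC": "Gly", "GGA": "Gly", "GGG": "Gly",  # four codons for glycine
}

def translate(dna):
    """Split the sequence into non-overlapping triplets and map each to an amino acid."""
    triplets = (dna[i:i + 3] for i in range(0, len(dna) - 2, 3))
    return [CODONS.get(t, "?") for t in triplets]

print(translate("ATGGAAGGG"))  # ['Met', 'Glu', 'Gly']
```

No commas needed after all: the reading frame alone, three letters at a stride, does the decoding that Gamow's group agonized over.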
  • Brenner said. “Biochemists only thought about the flux of energy and the flow of matter. Molecular biologists started to talk about the flux of information.
  • Samuel Butler had said a century earlier—and did not claim to be the first—that a hen is only an egg’s way of making another egg.
  • You might as well say (as the geneticist John Maynard Smith did, mockingly) that there is a gene for tying shoelaces. But Dawkins was undaunted. He pointed out that genes are about differences, after all. So he began with a simple counterpoint: might there not be a gene for dyslexia?
-------ch11
  • Dawkins’s way of speaking was not meant to suggest that memes are conscious actors, only that they are entities with interests that can be furthered by natural selection. Their interests are not our interests. “A meme,” Dennett says, “is an information packet with attitude.” When we speak of fighting for a principle or dying for an idea, we may be more literal than we know.
  • Rhyme and rhythm are qualities that aid a meme’s survival, just as strength and speed aid an animal’s. Patterned language has an evolutionary advantage. Rhyme, rhythm, and reason—for reason, too, is a form of pattern.
  • Before anyone understood anything of epidemiology, its language was applied to species of information. An emotion can be infectious, a tune catchy, a habit contagious.
  • Researchers studying the Internet itself as a medium—crowdsourcing, collective attention, social networking, and resource allocation—employ not only the language but also the mathematical principles of epidemiology.
  • “It is a perfect milieu for self-replicating programs to flourish.” Indeed, the Internet was in its birth throes. Not only did it provide memes with a nutrient-rich culture medium; it also gave wings to the idea of memes. Meme itself quickly became an Internet buzzword.
-------ch12
  • But why do we say π is not random? Chaitin proposed a clear answer: a number is not random if it is computable—if a definable computer program will generate it. Thus computability is a measure of randomness.
  • He first saw Claude Shannon’s Mathematical Theory of Communication rendered into Russian in 1953, purged of its most interesting features by a translator working in Stalin’s heavy shadow. The title became Statistical Theory of Electrical Signal Transmission. The word information was everywhere replaced with data.
  • And I know that Academician Kolmogorov has the same feeling when reading my works.” But the feeling was evidently not shared. Kolmogorov steered his colleagues toward Shannon instead.
  • Soviet telephony was notoriously dismal, a subject for eternally bitter Russian humor. As of 1965, there was still no such thing as direct long-distance dialing.
  • Kolmogorov introduced a new word for the thing he was trying to measure: complexity. As he defined this term, the complexity of a number, or message, or set of data is the inverse of simplicity and order and, once again, it corresponds to information. The simpler an object is, the less information it conveys.
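Kolmogorov complexity itself is uncomputable, but an off-the-shelf compressor gives a crude, computable stand-in for the idea: the simpler and more orderly the data, the shorter the description that regenerates it. A sketch using Python's zlib (my illustration, not anything from the book):

```python
import random
import zlib

orderly = b"0123456789" * 100  # 1,000 bytes of pure repetition
random.seed(42)
disorderly = bytes(random.randrange(256) for _ in range(1000))  # 1,000 noisy bytes

# The repetitive string compresses to a tiny fraction of its length; the random
# string barely compresses at all — more order, less information per symbol.
print(len(zlib.compress(orderly)))
print(len(zlib.compress(disorderly)))
```

In Kolmogorov's terms the first string has low complexity (a short description suffices: "0123456789, a hundred times over"), while the random string admits no description much shorter than itself.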
  • The three are fundamentally equivalent: information, randomness, and complexity—three powerful abstractions, bound all along like secret lovers.
  • In effect, Solomonoff, too, had been figuring out how a computer might look at sequences of data—number sequences or bit strings—and measure their randomness and their hidden patterns. When humans or computers learn from experience, they are using induction: recognizing regularities amid irregular streams of information. From this point of view, the laws of science represent data compression in action. A theoretical physicist acts like a very clever coding algorithm.
  • Solomonoff, Kolmogorov, and Chaitin tackled three different problems and came up with the same answer... They all arrived at minimal program size. And they all ended up talking about complexity.
  • The rational numbers are not normal, and there are infinitely many rational numbers, but they are infinitely outnumbered by normal numbers. Yet, having settled the great and general question, mathematicians can almost never prove that any particular number is normal.
  • Bennett brought it back. There is no logical depth in the parts of a message that are sheer randomness and unpredictability, nor is there logical depth in obvious redundancy—plain repetition and copying. Rather, he proposed, the value of a message lies in “what might be called its buried redundancy—parts predictable only with difficulty, things the receiver could in principle have figured out without being told, but only at considerable cost in money, time, or computation.” When we value an object’s complexity, or its information content, we are sensing a lengthy hidden computation.
-------ch13
  • QUANTUM MECHANICS HAS WEATHERED in its short history more crises, controversies, interpretations (the Copenhagen, the Bohm, the Many Worlds, the Many Minds), factional implosions, and general philosophical breast-beating than any other science. It is happily riddled with mysteries. It blithely disregards human intuition.
  • In 1989 (Wheeler) offered his final catchphrase: It from Bit. His view was extreme. It was immaterialist: information first, everything else later. “Otherwise put,” he said, every it—every particle, every field of force, even the space-time continuum itself—derives its function, its meaning, its very existence … from bits. <> Why does nature appear quantized? Because information is quantized. The bit is the ultimate unsplittable particle.
  • If the black hole evaporates, where does the information go? According to quantum mechanics, information may never be destroyed. The deterministic laws of physics require the states of a physical system at one instant to determine the states at the next instant; in microscopic detail, the laws are reversible, and information must be preserved. Hawking was the first to state firmly—even alarmingly—that this was a problem challenging the very foundations of quantum mechanics.
  • What is the physical cost of logical work? “Computers,” (Bennett) wrote provocatively, “may be thought of as engines for transforming free energy into waste heat and mathematical work.” Entropy surfaced again. A tape full of zeroes, or a tape encoding the works of Shakespeare, or a tape rehearsing the digits of π, has “fuel value.” A random tape has none.
  • Landauer devoted his career to establishing the physical basis of information. “Information Is Physical” was the title of one famous paper, meant to remind the community that computation requires physical objects and obeys the laws of physics. Lest anyone forget, he titled a later essay—his last, it turned out—“Information Is Inevitably Physical.”
  • it seemed that most logical operations have no entropy cost at all. When a bit flips from zero to one, or vice-versa, the information is preserved. The process is reversible. Entropy is unchanged; no heat needs to be dissipated. Only an irreversible operation, he argued, increases entropy... In every case, Bennett found, heat dissipation occurs only when information is erased. Erasure is the irreversible logical operation... Forgetting takes work.
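Bennett's distinction can be checked on the smallest possible example. NOT is reversible — each output has exactly one possible input — while AND merges three input pairs into the single output 0, so the input cannot be recovered and a bit of history is destroyed. A sketch of the counting argument (my own illustration):

```python
# Enumerate preimages: which inputs could have produced each output?
inputs = [(a, b) for a in (0, 1) for b in (0, 1)]

# NOT: every output has exactly one preimage, so the operation is reversible.
not_preimages = {y: [x for x in (0, 1) if 1 - x == y] for y in (0, 1)}

# AND: output 0 has three preimages, so the input is unrecoverable — erasure.
and_preimages = {y: [p for p in inputs if p[0] & p[1] == y] for y in (0, 1)}

print(not_preimages)  # {0: [1], 1: [0]}
print(and_preimages)  # {0: [(0, 0), (0, 1), (1, 0)], 1: [(1, 1)]}
```

This is the asymmetry Landauer and Bennett tied to heat: the reversible flip costs nothing in principle; collapsing many histories into one is the step that must be paid for.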
  • The qubit has this dreamlike character, too. It is not just either-or. Its 0 and 1 values are represented by quantum states that can be reliably distinguished—for example, horizontal and vertical polarizations—but coexisting with these are the whole continuum of intermediate states, such as diagonal polarizations, that lean toward 0 or 1 with different probabilities. So a physicist says that a qubit is a superposition of states; a combination of probability amplitudes. It is a determinate thing with a cloud of indeterminacy living inside.
  • “Feynman’s insight,” says Bennett, “was that a quantum system is, in a sense, computing its own future all the time. You may say it’s an analog computer of its own dynamics.”
  • In quantum computing, multiple qubits are entangled. Putting qubits at work together does not merely multiply their power; the power increases exponentially. In classical computing, where a bit is either-or, n bits can encode any one of 2^n values. Qubits can encode these Boolean values along with all their possible superpositions. This gives a quantum computer a potential for parallel processing that has no classical equivalent.
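The exponential claim is concrete: describing n qubits classically takes 2^n amplitudes, one per basis value, all present at once. A plain-Python sketch of the uniform superposition that applying a Hadamard gate to each qubit would produce (my illustration; no quantum library involved):

```python
def uniform_superposition(n):
    """Equal amplitude on each of the 2**n basis states of an n-qubit register."""
    dim = 2 ** n
    amp = dim ** -0.5  # each basis state gets amplitude 1/sqrt(2**n)
    return [amp] * dim

state = uniform_superposition(3)
print(len(state))                 # 8 amplitudes — 3 classical bits hold just one value
print(sum(a * a for a in state))  # squared amplitudes (probabilities) sum to 1
```

Doubling the register to 6 qubits doubles nothing classically visible but squares the amplitude count to 64 — the exponential bookkeeping that makes classical simulation of quantum systems so costly.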
  • “Many people can read a book and get the same message,” Bennett says, “but trying to tell people about your dream changes your memory of it, so that eventually you forget the dream and remember only what you said about it.” Quantum erasure, in turn, amounts to a true undoing: “One can fairly say that even God has forgotten.”
-------ch14
  • Edgar Allan Poe, following Babbage’s work eagerly, saw the point. “No thought can perish,” he wrote in 1845, in a dialogue between two angels. “Did there not cross your mind some thought of the physical power of words? Is not every word an impulse on the air?”
  • Each atom, once disturbed, must communicate its motion to others, and they in turn influence waves of air, and no impulse is ever entirely lost. The track of every canoe remains somewhere in the oceans.
  • in Tom Stoppard’s drama Arcadia: We shed as we pick up, like travelers who must carry everything in their arms, and what we let fall will be picked up by those behind. The procession is very long and life is very short. We die on the march. But there is nothing outside the march so nothing can be lost to it. The missing plays of Sophocles will turn up piece by piece, or be written again in another language.
  • Even so, the experts responsible for the third edition (“in Eighteen Volumes, Greatly Improved”), a full century after Isaac Newton’s Principia, could not bring themselves to endorse his, or any, theory of gravity, or gravitation. “There have been great disputes,” the Britannica stated.
  • In his novel The Infinities, John Banville imagines the god Hermes saying: “A hamadryad is a wood-nymph, also a poisonous snake in India, and an Abyssinian baboon. It takes a god to know a thing like that.” Yet according to Wikipedia, hamadryad also names a butterfly, a natural history journal from India, and a Canadian progressive rock band.
  • The collision of names, the exhaustion of names—it has happened before, if never on this scale. Ancient naturalists knew perhaps five hundred different plants and, of course, gave each a name.
-------ch15
  • The sense of when we are—the ability to see the past spread out before one; the internalization of mental time charts; the appreciation of anachronism—came with the shift to print.
  • Different media have different event horizons—for the written word, three millennia; for recorded sound, a century and a half—and within their time frames the old becomes as accessible as the new... Most of Bach’s music was unknown to Beethoven; we have it all—partitas, cantatas, and ringtones.
-------epilogue
  • It takes a human—or, let’s say, a “cognitive agent”—to take a signal and turn it into information. “Beauty is in the eye of the beholder, and information is in the head of the receiver,” says Fred Dretske. At any rate that is a common view, in epistemology—that “we invest stimuli with meaning, and apart from such investment, they are informationally barren.”
  • A world of information glut and gluttony; of bent mirrors and counterfeit texts; scurrilous blogs, anonymous bigotry, banal messaging. Incessant chatter. The false driving out the true.
