Bit By Bit, 'The Information' Reveals Everything

The Information
By James Gleick
Hardcover, 544 pages
List Price: $29.95

The Information, written by James Gleick, covers nearly everything — jungle drums, language, Morse code, telegraphy, telephony, quantum mechanics, thermodynamics, genetics and more — as it relates to information, which he describes as the "fundamental core of things." Information theory can now be seen as the overarching concept for our times, describing how scientists in many disciplines see a common thread to their work.

Gleick's book spans centuries and continents, but one figure anchors the story across its nearly 400 pages: Claude Shannon, an engineer and mathematician who worked at Bell Labs in the mid-20th century. Shannon created what is now called information theory, Gleick tells Robert Siegel on All Things Considered:

"He was the first person to use the word 'bit' as a scientific unit of measuring this funny abstract thing that until this point in time scientists had not thought of as a measurable scientific quantity."

Bits are most familiar as the 1s and 0s that computers use to store and share information, but in this context a bit can also be thought of as a yes/no, either/or, on/off switch. Gleick describes the bit as "the irreducible quantum of information," upon which all things are built.
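This yes/no framing can be made concrete: singling out one of N equally likely possibilities takes about log2(N) yes/no questions, and that count is the number of bits required. A minimal sketch (the function name here is illustrative, not from the book):

```python
import math

def bits_needed(n_outcomes):
    """Yes/no questions needed to single out one of n equally likely outcomes."""
    return math.ceil(math.log2(n_outcomes))

# One coin flip distinguishes 2 outcomes: 1 bit.
print(bits_needed(2))    # 1
# Picking one letter of a 26-letter alphabet takes 5 yes/no questions.
print(bits_needed(26))   # 5
# Eight bits (one byte) distinguish 256 values.
print(bits_needed(256))  # 8
```

The same logic scales to any alphabet of symbols, which is why the bit works as a universal unit of measure.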

Just as Isaac Newton took words like "force" and "mass," which had only fuzzy contemporary meanings, and gave them precise mathematical definitions, "information" now carries a specific scientific meaning, measured in bits.

"Binary yes or no choices are at the root of things," Gleick explains. The physicist John Archibald Wheeler coined an epigram to encapsulate the concept behind information theory: "It from bit." It described the idea that the smallest particle of every piece of matter is a binary question, a 1 or a 0. From these pieces of information, other things could develop — like DNA, matter and living organisms. The field of information theory, in addition to creating new meanings for words like "information," also builds upon knowledge from other scientific disciplines such as thermodynamics, even though the result may be a little tough to understand.

James Gleick also wrote Chaos: Making a New Science, which popularized the idea of the butterfly effect. His books have been finalists for the Pulitzer Prize and the National Book Award.


"When Claude Shannon first wrote his paper and made a connection between information and the thermodynamic concept of entropy, a rumor started around Bell Labs that the great atomic physicist John von Neumann had suggested to Shannon, 'Just use the word entropy — no one will know what you're talking about, and everyone will be scared to doubt you.' "

Though it may be a difficult subject to conceptualize, entropy does have a deep connection to information science, Gleick says. Entropy is associated with disorder in thermodynamic systems, and analogously so in informational systems. Though it may seem paradoxical to link information to disorder, Gleick explains that each new bit of information is a surprise — if you knew what a particular message contained, there would not be information in it.

"Information equals disorder, disorder equals entropy and a lot of physicists have been both scratching their heads and making scientific progress ever since," Gleick says.

In the everyday — not scientific — sense, an object like the moon only seems to contain information when we perceive it and develop thoughts about it, whether that's the man in the moon, the moon being made of cheese or the moon driving people to madness. But Gleick says that even without our perceiving it, the moon is more than just matter — it still has its own bits of intrinsic information.

"It sounds mystical, and I can't pretend that I fully understand it either, but it's just one of the many ways in which scientists have discovered a conception of information that helps them solve problems in a whole range of disciplines."

Excerpt: 'The Information'


We can see now that information is what our world runs on: the blood and the fuel, the vital principle. It pervades the sciences from top to bottom, transforming every branch of knowledge. Information theory began as a bridge from mathematics to electrical engineering and from there to computing. What English speakers call "computer science" Europeans have long since known as informatique, informatica, and Informatik. Now even biology has become an information science, a subject of messages, instructions, and code. Genes encapsulate information and enable procedures for reading it in and writing it out. Life spreads by networking. The body itself is an information processor. Memory is stored not just in brains but in every cell. No wonder genetics bloomed along with information theory. DNA is the quintessential information molecule, the most advanced message processor at the cellular level—an alphabet and a code, 6 billion bits to form a human being. "What lies at the heart of every living thing is not a fire, not warm breath, not a 'spark of life,'" declares the evolutionary theorist Richard Dawkins. "It is information, words, instructions. . . . If you want to understand life, don't think about vibrant, throbbing gels and oozes, think about information technology." The cells of an organism are nodes in a richly interwoven communications network, transmitting and receiving, coding and decoding. Evolution itself embodies an ongoing exchange of information between organism and environment.

"The information circle becomes the unit of life," says Werner Loewenstein after thirty years spent studying intercellular communication. He reminds us that information means something deeper now: "It connotes a cosmic principle of organization and order, and it provides an exact measure of that." The gene has its cultural analog, too: the meme. In cultural evolution, a meme is a replicator and propagator—an idea, a fashion, a chain letter, or a conspiracy theory. On a bad day, a meme is a virus.

Economics is recognizing itself as an information science, now that money itself is completing a developmental arc from matter to bits, stored in computer memory and magnetic strips, world finance coursing through the global nervous system. Even when money seemed to be material treasure, heavy in pockets and ships' holds and bank vaults, it always was information. Coins and notes, shekels and cowries were all just short-lived technologies for tokenizing information about who owns what.

And atoms? Matter has its own coinage, and the hardest science of all, physics, seemed to have reached maturity. But physics, too, finds itself sideswiped by a new intellectual model. In the years after World War II, the heyday of the physicists was at hand. The great news of science appeared to be the splitting of the atom and the control of nuclear energy. Theorists focused their prestige and resources on the search for fundamental particles and the laws governing their interaction, the construction of giant accelerators and the discovery of quarks and gluons. From this exalted enterprise, the business of communications research could not have appeared further removed. At Bell Labs, Claude Shannon was not thinking about physics. Particle physicists did not need bits.

And then, all at once, they did. Increasingly, the physicists and the information theorists are one and the same. The bit is a fundamental particle of a different sort: not just tiny but abstract—a binary digit, a flip-flop, a yes-or-no. It is insubstantial, yet as scientists finally come to understand information, they wonder whether it may be primary: more fundamental than matter itself. They suggest that the bit is the irreducible kernel and that information forms the very core of existence. Bridging the physics of the twentieth and twenty-first centuries, John Archibald Wheeler, the last surviving collaborator of both Einstein and Bohr, puts this manifesto in oracular monosyllables: "It from bit." Information gives rise to "every it—every particle, every field of force, even the space-time continuum itself." This is another way of fathoming the paradox of the observer: that the outcome of an experiment is affected, or even determined, when it is observed. Not only is the observer observing; she is asking questions and making statements that must ultimately be expressed in discrete bits. "What we call reality," Wheeler writes coyly, "arises in the last analysis from the posing of yes-no questions." He adds: "All things physical are information-theoretic in origin, and this is a participatory universe." The whole universe is thus seen as a computer—a cosmic information-processing machine.

How much does it compute? How fast? How big is its total information capacity, its memory space? What is the link between energy and information; what is the energy cost of flipping a bit? These are hard questions, but they are not as mystical and metaphorical as they sound. Physicists and quantum information theorists, a new breed, struggle with them together. They do the math and produce tentative answers. ("The bit count of the cosmos, however it is figured, is ten raised to a very large power," according to Wheeler. According to Seth Lloyd: "No more than 10^120 ops on 10^90 bits.") They look anew at the mysteries of thermodynamic entropy and at those notorious information swallowers, black holes. "Tomorrow," Wheeler declares, "we will have learned to understand and express all of physics in the language of information."

Excerpted from The Information: A History, A Theory, A Flood by James Gleick. Copyright © 2010 by James Gleick. Excerpted by permission of Pantheon, a division of Random House, Inc. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Books Featured In This Story

The Information

A History, a Theory, a Flood

by James Gleick

Hardcover, 526 pages
