MELISSA BLOCK, Host:
From NPR News, this is ALL THINGS CONSIDERED. I'm Melissa Block.
ROBERT SIEGEL, Host:
And I'm Robert Siegel.
BLOCK: James Gleick is the author of a new book, "The Information: A History, A Theory, A Flood."
It is about jungle drums, language, Morse code, telegraphy, telephony, quantum mechanics, thermodynamics, genetics and much more. It's the story of how we came to speak of what's in our DNA, what's in our brains, and even matter itself as information, not just in a passing metaphorical way but as the stuff of serious, rigorous scientific discourse.
James Gleick, thanks for joining us.
JAMES GLEICK: It's a pleasure.
SIEGEL: And is it fair to say that information is now the overarching concept for our times that describes how scientists across many disciplines see their work in common?
GLEICK: Yes. It's fair to say that information is, we now know, the vital principle of our world. It's what the world runs on. And we've been talking about the information age for 50 years now. It's more than just a metaphor. Information is really the currency that's most important to us.
SIEGEL: There are many individuals who figure in the story, as you tell it, but one stands out and nearly survives to page 400, and that is Claude Shannon. I want you to tell us who Claude Shannon was and how important he was.
GLEICK: Well, he was an engineer and a mathematician born in the Midwest of the United States in the early part of the 20th century, who, at Bell Labs, at a particular point in time, in 1948, created what is now called information theory.
He was the first person to use the word bit as a scientific unit of measuring this funny, abstract thing that, until this point in time, scientists had not thought of as a measurable scientific quantity.
SIEGEL: A bit being kind of an either-or choice, a toggle switch.
GLEICK: That's right, the irreducible quantum of information: yes or no, either-or, on or off, if it's an electrical circuit. And, of course, we all know that that's what our computers are filled with, all that information in the form of ones and zeros.
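Shannon's point that the bit is an either-or choice can be sketched in a few lines of Python (an illustration added here, not anything discussed on air): each additional yes/no choice doubles the number of alternatives you can distinguish.

```python
def bits_needed(n_alternatives):
    """Smallest number of yes/no choices needed to single out
    one of n_alternatives possibilities (ceil of log2)."""
    return (n_alternatives - 1).bit_length()

print(bits_needed(2))   # 1 -- a single toggle switch
print(bits_needed(26))  # 5 -- enough to pick any letter of the alphabet
```

Five switches give 2**5 = 32 combinations, which is why five bits suffice for 26 letters.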
SIEGEL: There's a concept that Claude Shannon applied to language and information, the concept of entropy. It used to be that I didn't understand this concept very well as one of thermodynamics and dissipated energy. Now I also don't entirely understand it as a concept of information. I want you to explain: What does entropy have to do with information?
GLEICK: Well, no one understands it, and in fact, when Claude Shannon first wrote his paper and made a connection between information and the thermodynamic concept of entropy, a rumor started around Bell Labs that the great atomic physicist John von Neumann had suggested to Shannon: Just use the word entropy. No one will know what you're talking about, and everybody will be scared to doubt you.
(SOUNDBITE OF LAUGHTER)
GLEICK: But there is a deep connection there. You know, entropy is associated thermodynamically, in systems involving heat, with disorder. And in an analogous way, information is associated with disorder, which seems paradoxical. But when you think about it, a bit of information is a surprise.
If you already knew what the message contained, there would be no new information in it. And so, information equals disorder, and disorder equals entropy, and a lot of physicists have been both scratching their heads and making scientific progress ever since.
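Gleick's point that "a bit of information is a surprise" is exactly what Shannon's entropy formula measures: the average surprise of a message source, in bits. A short illustration (added here for clarity, not part of the broadcast):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: the average surprise of a source
    whose messages occur with the given probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally surprising: 1 bit per flip.
print(shannon_entropy([0.5, 0.5]))    # 1.0
# A heavily biased coin is predictable, so it carries little new information.
print(shannon_entropy([0.99, 0.01]))  # about 0.08
```

A message you could already predict has low entropy and tells you almost nothing, which is the sense in which information, disorder and entropy line up.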
SIEGEL: Now, as I was reading your book, "The Information," I was trying to grapple with this notion that everything is information, that information scientifically describes all things. And for some reason, the moon occurred to me.
And I understand one could imagine memes, kind of ideas that have a lot of legs and relevance, about the man in the moon or the moon being made of cream cheese or people who are influenced by the moon being lunatics.
But I kept wondering: Doesn't the moon, without our perceiving it, exist in a way that isn't information? It's just something. It's matter. It's substance. And information only enters the picture when we start observing and thinking about it.
GLEICK: That's what you'd think. And it's hard to get past that. I'm not suggesting that physicists have given up on the idea that there are such things as matter and energy. Far from it. But physicists have started to talk as though at the fundamental core of things lies information.
The great, late physicist John Archibald Wheeler, who was the last surviving collaborator of both Einstein and Bohr, had an expression. He would say: It from bit. Matter is based ultimately on information. Binary, yes or no choices are at the root of things.
Now, it sounds mystical, and I can't pretend that I fully understand it, either, but it's just one of the many ways in which scientists have discovered a conception of information that helps them solve problems in a whole range of disciplines.
SIEGEL: Well, James Gleick, thank you very much for talking with us.
GLEICK: Robert, it's been my pleasure.
SIEGEL: James Gleick is the author of "The Information: A History, A Theory, A Flood." He spoke with us from Key West, Florida.
NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.