Nicholas Carr asks us to look up from our laptops long enough to appreciate the way multitasking and technology are changing the way we think. In his book The Shallows, he laments all that we are losing in exchange for our dynamic, interconnected, Internet-fueled world.

'The Shallows': Has The Internet Rewired Your Brain?

The Shallows: What the Internet Is Doing to Our Brains
By Nicholas Carr
Hardcover, 276 pages
W.W. Norton & Co.
List price: $26.95

Nicholas Carr wants you to know he's not a Luddite. The former executive editor of the Harvard Business Review and author of The Big Switch: Rewiring the World, from Edison to Google, Carr has been, and still is, very much taken with technology, going all the way back to the first Apple home computers. Social networking, blogging -- he's embraced it. Devices such as Wi-Fi-equipped DVD players that allow people to stream music, movies and YouTube videos through their entertainment systems … well, "I have to confess: It's cool," he writes. "I'm not sure I could live without it."

Yet Carr wants us to know what we're losing in exchange for our dynamic, interconnected, Internet-fueled world. The Shallows is a rebuttal to those who unquestioningly accept a life in which information is unlimited, easily accessed but fractured and unmoored from context, and in which people are constantly online, multitasking among e-mail, Facebook and websites. Drawing on Western thinkers from Plato to Marshall McLuhan, and guided by recent, pertinent discoveries in neuroscience, Carr argues that the Internet physically "rewires" our brains until we end up acting like computers -- avaricious gobblers of information -- and our grip on what it means to be human slackens.

Nicholas Carr is also the author of The Big Switch: Rewiring the World, from Edison to Google. He blogs at Rough Type.

Joanie Simon

A large part of what it means to be human, he writes, is our capacity for "deep reading," an ability bestowed on us by Gutenberg's printing press, which fostered an "intellectual tradition of solitary, single-minded concentration." (It's a testament to Carr's seriousness and thoughtfulness that he spends half his book providing a crisp, fascinating history of the written word, all to carefully show how the Internet is but "the latest in a long series of tools that have helped mold the human mind." In other words, he has taken pains to ensure The Shallows is far from being a half-baked screed.)

Deep reading, which requires "sustained, unbroken attention to a single, static object," has for ages allowed people to make "their own associations, draw their own inferences and analogies, foster their own ideas." The Internet works against this, Carr writes. "Dozens of studies by psychologists, neurobiologists, educators and Web designers point to the same conclusion: when we go online, we enter an environment that promotes cursory reading, hurried and distracted thinking, and superficial learning." Sure, deep reading is possible, but that's not the kind of reading "the technology encourages and rewards."

With The Shallows, Carr attempts to snap us out of the hypnotic pull of our iPhones, laptops and desktops. He reveals why we're suddenly having a hard time focusing at length on any given thing, and why we compulsively check our e-mail accounts and Twitter feeds and never seem to be able to get our work done. (It's because we've been abusing our brains.) He wants us to value wisdom over knowledge, and to use new technology intelligently. "We shouldn't allow the glories of technology to blind our inner watchdog to the possibility that we've numbed an essential part of our self," Carr pleads. It remains to be seen if he's shouted down or listened to.

Excerpt: 'The Shallows'


Pundits have been trying to bury the book for a long time. In the early years of the nineteenth century, the burgeoning popularity of newspapers -- well over a hundred were being published in London alone -- led many observers to assume that books were on the verge of obsolescence. How could they compete with the immediacy of the daily broadsheet? "Before this century shall end, journalism will be the whole press -- the whole human thought," declared the French poet and politician Alphonse de Lamartine in 1831. "Thought will spread across the world with the rapidity of light, instantly conceived, instantly written, instantly understood. It will blanket the earth from one pole to the other -- sudden, instantaneous, burning with the fervor of the soul from which it burst forth. This will be the reign of the human word in all its plenitude. Thought will not have time to ripen, to accumulate into the form of a book -- the book will arrive too late. The only book possible from today is a newspaper."

Lamartine was mistaken. At the century's end, books were still around, living happily beside newspapers. But a new threat to their existence had already emerged: Thomas Edison's phonograph. It seemed obvious, at least to the intelligentsia, that people would soon be listening to literature rather than reading it. In an 1889 essay in the Atlantic Monthly, Philip Hubert predicted that "many books and stories may not see the light of print at all; they will go into the hands of their readers, or hearers rather, as phonograms." The phonograph, which at the time could record sounds as well as play them, also "promises to far outstrip the typewriter" as a tool for composing prose, he wrote. That same year, the futurist Edward Bellamy suggested, in a Harper's article, that people would come to read "with the eyes shut." They would carry around a tiny audio player, called an "indispensable," which would contain all their books, newspapers, and magazines. Mothers, wrote Bellamy, would no longer have "to make themselves hoarse telling the children stories on rainy days to keep them out of mischief." The kids would all have their own indispensables.

Five years later, Scribner's Magazine delivered the seeming coup de grace to the codex, publishing an article titled "The End of Books" by Octave Uzanne, an eminent French author and publisher. "What is my view of the destiny of books, my dear friends?" he wrote. "I do not believe (and the progress of electricity and modern mechanism forbids me to believe) that Gutenberg's invention can do otherwise than sooner or later fall into desuetude as a means of current interpretation of our mental products." Printing, a "somewhat antiquated process" that for centuries "has reigned despotically over the mind of man," would be replaced by "phonography," and libraries would be turned into "phonographotecks." We would see a return of "the art of utterance," as narrators took the place of writers. "The ladies," Uzanne concluded, "will no longer say in speaking of a successful author, ‘What a charming writer!' All shuddering with emotion, they will sigh, 'Ah, how this "Teller's" voice thrills you, charms you, moves you.'"

The book survived the phonograph as it had the newspaper. Listening didn't replace reading. Edison's invention came to be used mainly for playing music rather than declaiming poetry and prose. During the twentieth century, book reading would withstand a fresh onslaught of seemingly mortal threats: moviegoing, radio listening, TV viewing. Today, books remain as commonplace as ever, and there's every reason to believe that printed works will continue to be produced and read, in some sizable quantity, for years to come. While physical books may be on the road to obsolescence, the road will almost certainly be a long and winding one. Yet the continued existence of the codex, though it may provide some cheer to bibliophiles, doesn't change the fact that books and book reading, at least as we've defined those things in the past, are in their cultural twilight. As a society, we devote ever less time to reading printed words, and even when we do read them, we do so in the busy shadow of the Internet. "Already," the literary critic George Steiner wrote in 1997, "the silences, the arts of concentration and memorization, the luxuries of time on which ‘high reading' depended are largely disposed." But "these erosions," he continued, "are nearly insignificant compared with the brave new world of the electronic." Fifty years ago, it would have been possible to make the case that we were still in the age of print. Today, it is not.

Some thinkers welcome the eclipse of the book and the literary mind it fostered. In a recent address to a group of teachers, Mark Federman, an education researcher at the University of Toronto, argued that literacy, as we've traditionally understood it, "is now nothing but a quaint notion, an aesthetic form that is as irrelevant to the real questions and issues of pedagogy today as is recited poetry -- clearly not devoid of value, but equally no longer the structuring force of society." The time has come, he said, for teachers and students alike to abandon the "linear, hierarchical" world of the book and enter the Web's "world of ubiquitous connectivity and pervasive proximity" -- a world in which "the greatest skill" involves "discovering emergent meaning among contexts that are continually in flux."

Clay Shirky, a digital-media scholar at New York University, suggested in a 2008 blog post that we shouldn't waste our time mourning the death of deep reading -- it was overrated all along. "No one reads War and Peace," he wrote, singling out Tolstoy's epic as the quintessence of high literary achievement. "It's too long, and not so interesting." People have "increasingly decided that Tolstoy's sacred work isn't actually worth the time it takes to read it." The same goes for Proust's In Search of Lost Time and other novels that until recently were considered, in Shirky's cutting phrase, "Very Important in some vague way." Indeed, we've "been emptily praising" writers like Tolstoy and Proust "all these years." Our old literary habits "were just a side-effect of living in an environment of impoverished access." Now that the Net has granted us abundant "access," Shirky concluded, we can at last lay those tired habits aside.

Such proclamations seem a little too staged to take seriously. They come off as the latest manifestation of the outré posturing that has always characterized the anti-intellectual wing of academia. But, then again, there may be a more charitable explanation. Federman, Shirky, and others like them may be early exemplars of the post-literary mind, intellectuals for whom the screen rather than the page has always been the primary conduit of information. As Alberto Manguel has written, "There is an unbridgeable chasm between the book that tradition has declared a classic and the book (the same book) that we have made ours through instinct, emotion and understanding: suffered through it, rejoiced in it, translated it into our experience and (notwithstanding the layers of readings with which a book comes into our hands) essentially become its first readers." If you lack the time, the interest, or the facility to inhabit a literary work -- to make it your own in the way Manguel describes -- then of course you'd consider Tolstoy's masterpiece to be "too long, and not so interesting."

Although it may be tempting to ignore those who suggest the value of the literary mind has always been exaggerated, that would be a mistake. Their arguments are another important sign of the fundamental shift taking place in society's attitude toward intellectual achievement. Their words also make it a lot easier for people to justify that shift -- to convince themselves that surfing the Web is a suitable, even superior, substitute for deep reading and other forms of calm and attentive thought. In arguing that books are archaic and dispensable, Federman and Shirky provide the intellectual cover that allows thoughtful people to slip comfortably into the permanent state of distractedness that defines the online life.

Our desire for fast-moving, kaleidoscopic diversions didn't originate with the invention of the World Wide Web. It has been present and growing for many decades, as the pace of our work and home lives has quickened and as broadcast media like radio and television have presented us with a welter of programs, messages, and advertisements. The Internet, though it marks a radical departure from traditional media in many ways, also represents a continuation of the intellectual and social trends that emerged from people's embrace of the electric media of the twentieth century and that have been shaping our lives and thoughts ever since. The distractions in our lives have been proliferating for a long time, but never has there been a medium that, like the Net, has been programmed to so widely scatter our attention and to do it so insistently.

David Levy, in Scrolling Forward, describes a meeting he attended at Xerox's famed Palo Alto Research Center in the mid-1970s, a time when the high-tech lab's engineers and programmers were devising many of the features we now take for granted in our personal computers. A group of prominent computer scientists had been invited to PARC to see a demonstration of a new operating system that made "multitasking" easy. Unlike traditional operating systems, which could display only one job at a time, the new system divided a screen into many "windows," each of which could run a different program or display a different document. To illustrate the flexibility of the system, the Xerox presenter clicked from a window in which he had been composing software code to another window that displayed a newly arrived e-mail message. He quickly read and replied to the message, then hopped back to the programming window and continued coding. Some in the audience applauded the new system. They saw that it would enable people to use their computers much more efficiently. Others recoiled from it. "Why in the world would you want to be interrupted -- and distracted -- by e-mail while programming?" one of the attending scientists angrily demanded.

The question seems quaint today. The windows interface has become the interface for all PCs and for most other computing devices as well. On the Net, there are windows within windows within windows, not to mention long ranks of tabs primed to trigger the opening of even more windows. Multitasking has become so routine that most of us would find it intolerable if we had to go back to computers that could run only one program or open only one file at a time. And yet, even though the question may have been rendered moot, it remains as vital today as it was thirty-five years ago. It points, as Levy says, to "a conflict between two different ways of working and two different understandings of how technology should be used to support that work." Whereas the Xerox researcher "was eager to juggle multiple threads of work simultaneously," the skeptical questioner viewed his own work "as an exercise in solitary, single-minded concentration." In the choices we have made, consciously or not, about how we use our computers, we have rejected the intellectual tradition of solitary, single-minded concentration, the ethic that the book bestowed on us. We have cast our lot with the juggler.

Excerpted from The Shallows: What the Internet Is Doing to Our Brains by Nicholas Carr. Copyright 2010 by Nicholas Carr. Excerpted by permission of W.W. Norton & Co.