The Uses of Half-True Alarms

Nicholas Carr’s lucid if tendentious book improves on his essay in the Atlantic a couple of years ago, which was more memorably—and misleadingly—titled with the self-answering question, “Is Google Making Us Stupid?” Carr’s article was all the more interesting because he was not a grumpy and decadent humanist but an engaging tech writer and a former executive editor of the Harvard Business Review. He was asking out loud a question that was deservedly on a lot of contemporary minds. The Shallows is a less catchy and more accurate title for his alarm, which turns out to have little to do with Google. It is much bigger than that.

Carr grabs our lapels to insist that the so-called information society might be more accurately described as the interruption society. It pulverizes attention, the scarcest of all resources, and stuffs the mind with trivia. Our texting, IM-ing, iPhoning, Twittering, computer-assisted selves—or self-assisted computing networks—are so easily diverted that our very mode of everyday thought has changed, changed utterly, degraded from “calm, focused, undistracted” linearity into “a new kind of mind that wants and needs to take in and dole out information in short, disjointed, often overlapping bursts.” Google searches, too, break our concentration, which only makes matters worse: “Google is, quite literally, in the business of distraction,” Carr writes. Because we are always skimming one surface after another, memories do not consolidate and endure. So we live in a knife-edge present. We turn into what the playwright Richard Foreman called “pancake people—spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.” We collect bits and the bits collect us.

Worse still, no one has dragooned us into the shallows. Nobody is forcing us from pixel to post. We are our own victimizers, because we crave interruption. When we grow up texting every few minutes, legato—which now feels like an eternity—yields to staccato. Taking a break from writing this review to watch a recent Lakers-Suns playoff game, I observed a couple of women in four-figure courtside seats behind the Suns’ bench working their thumbs on BlackBerries as the camera panned over them. Maybe they were live-blogging, or day-trading on Asian markets.

With so many interruptions so easy to arrange, Carr argues, it is no wonder that we cannot concentrate, or think straight, or even think in continuous arabesques. Where deep reading encourages intricacies of thought, the electronic torrent in which we live—or which lives in us—turns us into Twittering nerve nodes. The more links in our reading, the less we retain. We are what we click on. We no longer read; we skim. With Wikipedia a click away, are we more knowledgeable? Or even more efficient? Multi-tasking, Carr quotes the neuroscientist David Meyer as saying, “is learning to be skillful at a superficial level.”

After all, the brain that has been re-wired online governs us offline, too. The more we multi-task, the more distractible we are. But aren’t we more sophisticated at “visual-spatial skills”? Sure, but at the price of “a weakening of our capacities for the kind of ‘deep processing’ that underpins ‘mindful knowledge acquisition, inductive analysis, critical thinking, imagination, and reflection,’” writes Carr, quoting a Science article that reviewed more than fifty relevant studies.

And so we devolve inexorably into “lab rats constantly pressing levers to get tiny pellets of social or intellectual nourishment.” These sweet tidbits are rotting our mental teeth. This is so, Carr maintains, because “the Net delivers precisely the kind of sensory and cognitive stimuli—repetitive, intensive, interactive, addictive—that have been shown to result in strong and rapid alterations in brain circuits and functions,” and, consequently, “with the exception of alphabets and number systems, the Net may well be the single most powerful mind-altering technology that has ever come into general use.”

It is undeniable that some of the analyses I have quoted suffer from exaggeration and overkill. Carr is not shy about plunging headlong into extravagant claims. “The computer screen bulldozes our doubts with its bounties and conveniences.” “We become mindless consumers of data.” “The strip-mining of ‘relevant content’ replaces the slow excavation of meaning.” Perhaps aware of this propensity, Carr at other times pulls back from the brink with weasel-word conditionals such as “may well be,” as in: “The consequences [of multitasking online] for our intellectual lives may prove ‘deadly.’” Well, yes—but whatever may prove deadly may also not prove deadly.

So Carr, alert as well as alarmed, confronts himself and his reader with the classic smoke-fire problem. His alarms come clanging on almost every page. What to make of them? They cannot be dismissed as the mutterings of an obsolescent graybeard—Carr is in his early forties. To his credit, moreover, he pauses to address some objections to his line of argument—for example, the striking, well-established finding that IQ scores almost everywhere have been rising for a century while the means of distraction have been multiplying exponentially. “If we’re so dumb,” he italicizes, “why do we keep getting smarter?”

But we don’t, Carr argues—at least not in any simple way. The notion that smartness comes in a single variety is too crude. The testing signals are actually mixed. Some skills have increased as computers spread, but “tests of memorization, vocabulary, general knowledge, and even basic arithmetic have shown little or no improvement.” Worse, there are some recent signs of loss. While math scores have held steady over the past decade, verbal scores have declined. Between 1992 and 2005, something called “literary reading aptitude” dropped 12 percent. A testing skeptic might doubt the worth of any such findings, but it does seem to be the case that something real is being measured, and that whatever it is, it is slipping. Clumsy statistics are not foolproof evidence, and neither are the dumbing-down anecdotes any reader can supply. But they are not nothing.  

Unfortunately, Carr does not entertain the possibility of unexpected gifts from the internet. He does not ask whether associational thinking—thinking that leaps horizontally, connecting dots that previously were segregated or “siloed”—might actually benefit from the non-stop multitasking in which one’s center of consciousness is constantly intruded upon by fragments of the periphery. Could it be that the great electronic torrent of bits, bytes, and buzz not only turns all minds into short-term data dumps but might also promote the creative discerning of patterns where none were evident before? This strikes me as an unanswerable question but not a worthless one, even though it can only be properly asked if one reverts to weasel-word qualifiers.

One might be more prone to ask such questions if one were more attentive to the fact that they are not altogether new. Carr himself quotes T. S. Eliot, who anticipated the courtside BlackBerriers (“strained time-ridden faces/Distracted from distraction by distraction/Filled with fancies and empty of meaning”) in 1935. An English writer once clucked at the unpleasant development that “reading has to be done in snatches”—and that was in 1890. “Prolonged reflection almost gives people a bad conscience”—thus Nietzsche in 1882. Perhaps the difference between 1882 and 2010 is that the conscience about taking the mind’s time is gone. Modernity is nothing if not a long-running speed-up, with the world unceasingly going to the speedier dogs. Much of what Carr notices, or fears, was already in play, and accelerating, long before the internet. It was an alarmed and anti-modern Henry Adams who first pondered the idea of “the acceleration of history” way back in 1907—though arguably the history of today is spinning its wheels.

Carr would no doubt respond that a repeated alarm is not necessarily a false alarm, and he would be right. There is good reason, after all, why we are living through something of a backlash against the frenzy of attention dispersion, a backlash for which Carr’s book will become canonical. The tech engineer-promoter Jaron Lanier, who coined the term “virtual reality,” has a book out called You Are Not a Gadget: A Manifesto, denouncing “the hive mind” of the internet and declaring that the “widespread practice of fragmentary, impersonal communication has demeaned interpersonal interaction.” Even old-school capitalists object to the new dispensation, although not always for Carr’s reasons. In another recent book, Googled: The End of the World As We Know It, the business journalist Ken Auletta catches media mogul Barry Diller recalling that at a meeting with Google co-founder Larry Page, Page “did not lift his head from his PDA device.” “It’s one thing if you’re in a room with twenty people and someone is using their PDA,” Diller told Auletta. “I said to Larry, ‘Is this boring?’” “No. I’m interested. I always do this,” said Page. “Well, you can’t do this,” said Diller. “Choose.” “‘I’ll do this,’ said Page matter-of-factly, not lifting his eyes from his handheld device.” That’s new-media power for you—the power to offend Barry Diller.

So Carr, however loud his tocsins, is worth attending to. I am not the only professor these days who has a pretty good idea what the fidgety fingers of students are doing as they ostensibly take notes, and, having watched a graduate student click away at the keys for long stretches of a seminar over an entire semester, I am on the verge of banning laptops from my classrooms. The arts of contemplation have been hard to practice for centuries, but that is no reason to make them any harder.

Todd Gitlin is the author of, among other books, Media Unlimited. His new book (co-authored with Liel Leibovitz), The Chosen Peoples: America, Israel, and the Ordeals of Divine Election, will be published by Simon & Schuster in September.