The Perils of Peak Attention

Two new books assess the quality of our digital lives: How do we shake off the village when we carry the world in our pocket?


“I am alarmed,” wrote Henry David Thoreau in “Walking,” his 1862 essay, “when it happens that I have walked a mile into the woods bodily, without getting there in spirit.” The point of his saunter had been to “forget all my morning occupations, and my obligations to society.” Alas: “It sometimes happens that I cannot easily shake off the village.” His thoughts were elsewhere. With a gentle lashing of self-reproach, he asks: “What business have I in the woods, if I am thinking of something out of the woods?”

Thoreau was surely being dogmatic: Must one only think arboreal thoughts on a tree-lined path? Kierkegaard said we walk ourselves into our best thoughts, but one doesn’t need to stroll down library stacks teeming with philosophy tomes to think philosophically. And yet Thoreau, in an age long before digital detox, was getting at something that is very much on our minds, when those minds have time for such reflection: the task of paying attention. But how do we shake off the village when we carry the world in our pocket?

Early on in The Attention Merchants, Tim Wu’s startling and sweeping examination of the increasingly ubiquitous commercial effort to capture and commodify our attention, we are presented with a sort of cognitive paradox: To pay attention to one thing we need to screen other things out. But that very “capacity to ignore,” as Wu puts it, “is limited by another fact: We are always paying attention to something.” It is these “in between” moments, when our attention may be about to shift from one thing to another, that the attention merchants have long sought to colonize—in everything from nineteenth-century Parisian street posters to the twenty-first century’s “long flight of ad-laden clickbaited fancy.”

THE ATTENTION MERCHANTS by Tim Wu
Alfred A. Knopf, 416 pp., $28.95

Wu is a law professor at Columbia University, and in his 2010 book, The Master Switch, he chronicled how fledgling communications networks, from radio to the internet, went from heady, freewheeling hobbyist playgrounds to entities corralled by mercantilist and governmental control. While there have always been grand “claimants to attention,” Wu notes—organized religion being one—he dates the emergence of “industrialized” attention to Benjamin Day’s tabloid newspaper the New York Sun, which launched on September 3, 1833, with, among other things, a story of a suicide by laudanum. Such sensational stories—the “you won’t believe what happened next” of their day—were meant to sell papers, but the Sun’s costs weren’t covered by the newsstand price. To make money, Day needed to sell ads, which meant he needed to sell his readers: “What Day understood—more firmly, more clearly than anyone before him,” Wu writes, “was that while his readers may have thought themselves his customers, they were in fact his product.”

Day wanted eyeballs, so he published stuff you couldn’t look away from, including the infamous Great Moon Hoax of 1835—a series of six articles detailing the presence of winged creatures on the moon—which was a huge, circulation-building success. The Sun’s advertisers were snake-oil salesmen, literally: medical quacks who boiled rattlesnakes and bottled the skimmed-off froth. The term would acquire a pejorative cast—one that shadowed advertising itself—but snake-oil salesmen, among other vendors of patent medicines, were the engines of the early attention industry. Thoreau, as you might guess, was hardly a fan of these publications—as he wrote in Walden, “I am sure that I never read any memorable news in a newspaper.”

The carnivalesque history of early American advertising, with its hodgepodge of pseudoscience and quasi-religious transcendence, has been told before, most memorably in Jackson Lears’s 1994 Fables of Abundance. But Wu’s succinct reexamination is important, for he shows how the nascent attempt at capturing attention was marked by a set of events—the emergence of a new medium whose power or use was not yet fully understood, a land rush by commercial interests, a cultural backlash against that commercialization, and an ongoing quest for legitimacy by advertisers—that would be repeated in decades to come, each time drawing closer to the privacy of our inner lives.

When broadcast radio arrived in the 1920s, the idea that it should transmit advertising was anathema. Radio was a “utopian” medium, Wu notes, too close to the family hearth. (Herbert Hoover himself said it was “inconceivable” that such a vital platform could be “drowned in advertising chatter.”) But before the decade was over, Amos ’n’ Andy was delivering “Super Bowl audiences each and every evening.” Experiments blending advertising and entertainment had previously shown them to be incompatible: A silent movie house that had patrons watch ads, rather than buy tickets, was a typical failure. But the toothpaste company Pepsodent, which had taken a gamble by sponsoring Amos ’n’ Andy, saw its sales double. Closer to the hearth, radio was also closer to our personal lives. Several decades later, television, an even more immersive, attention-demanding medium, ushered us into the era of what Wu calls “peak attention”: “the moment when more regular attention was paid to the same set of messages at the same time than ever before or since.”

Catching a Pokémon in Arlington National Cemetery. New technologies that capture our attention, writes Tim Wu, encroach on “the privacy of our inner lives.”

And yet the “conquest of attention” was still an incomplete project. While media had entered the home, “the domain of the interpersonal remained inviolate.” You didn’t hear an ad when you placed a phone call, for instance; whether that was due to a line of sanctity or a lack of entrepreneurial imagination is an open question. But then the internet arrived, haltingly and awkwardly. Far from being attentive to it, we struggled to know what to do with this new medium. In 1991, AOL charged $9.95 for the first five hours a month—beyond that, you paid extra, the implication being that the kind of usage we now see in a single day was plenty for a month. As before, there was the usual clamor over the sacrosanct nature of the revolutionary new medium. But, as Wu writes in what might be the book’s central thesis, “Where the human gaze goes, business soon follows.” When that gaze eventually shifted to the smartphone—portable, social, location-aware, always on—whatever last reserves of human attention were still left unexploited were suddenly on the table. The smartphone would become “the undisputed new frontier of attention harvesting in the twenty-first century, the attention merchants’ manifest destiny.”

Picture Thoreau now, on his obligation-shedding saunter through the Massachusetts woods. There are unanswered emails from the morning’s business a twitchy finger away. Facebook notifications fall upon him like leaves. The babbling brook is not only lovely, but demands to be shared via Instagram, once the correct filter (“Walden,” natch) has been applied. Perhaps a quick glance at the Health app to track his steps, or a browse of the TripAdvisor reviews of Walden Pond (“serene and peaceful”). There may be Pokémon Go baubles to collect—the app may have even compelled his walk in the first place.

We are these days, suggests Laurence Scott in his pensive, provocative book, The Four-Dimensional Human, “inhabiting space in a way that could be called four-dimensional.” The lines between the physical and online—still so robust when getting on AOL for one’s five hours a month meant an aching process of dialing up a working local access number via a creaking modem—have been virtually erased. We no longer “surf” the internet, Scott notes, because we are always already submerged by the waves.

One of the implications here is that it is not only looking at the screen that consumes our attention, as in the old days, for the screen has changed the way we look at the world. It is the way we think of Twitter excerpts as we read an article, or pre-frame the world in Instagram-worthy moments—as Scott notes, our phones “consume” concerts and dinner before we do, while “the real, biologically up-to-the-minute ‘me’ thus becomes a ghost of my online self.” It is our strange view, intimate yet indirect, of people’s Potemkin collages on Facebook—“a place of full-frontal glimpses, where we encounter the periphery head on,” writes Scott. It is the way we worry offline about what has happened online (Did my post get any likes?), and how what is online can cause us to worry about what is happening offline. A friend recently told me how his teenage daughter had not been invited to the weekend gathering of a few of her friends. At one time, the snub, hurtful though it may have been, would have been encountered only on Monday morning (You guys did what?); now, however, she watched her friends’ weekend unfold on Instagram, a live crawl of social anxiety.

WASTING TIME ON THE INTERNET by Kenneth Goldsmith
Harper Perennial, 256 pp., $14.99

Kenneth Goldsmith, a conceptual artist, writing professor, and “first poet laureate of the Museum of Modern Art,” wants to challenge what has become a standard litany of critique and alarm about time spent online: that it weakens our powers of concentration, that it is antisocial, that it is almost entirely frivolous. Even when “getting away” from our digital devices, as Goldsmith suggests in his winningly cheeky manifesto Wasting Time on the Internet—an outgrowth of his University of Pennsylvania creative writing class, where all communication took place in chat rooms and on social media—we are often just thinking about them, the ghostly phantom limbs of consciousness. He points to an article written by a journalist going “off the grid” for the weekend on a Swedish island: The first thing she does upon returning is tweet a link to her article. Does this negate her time away? No, he counters: “Our devices amplify our sociability.” Looking at people looking at their phones, he notes: “I’ve never seen such a great wealth of conversation, focus, and engagement.” For all the time it wastes, Facebook is the “greatest collective autobiography that a culture has ever produced” (albeit one already so vast it could never be fully comprehended by anything but artificial intelligence). The same phone that hosts frivolous games might also record a once-occluded atrocity. And yet, Goldsmith suggests, we have come to be rather one-dimensional in thinking about the four-dimensional life. “There’s something perverse about how well we use the web yet how poorly we theorize our time spent on it.” Just as new, globe-shrinking mediums fail to live up to their early, uplifting promise, there is a parallel history of doom-laden jeremiads accompanying those same forms, warnings that can look quaint in retrospect—never forget that many of those dowdy Penguin Classics on your shelf were once considered a gateway drug to moral turpitude.

One question that Wu, in The Attention Merchants, never really resolves is what exactly constitutes a meaningful use of one’s attention. He laments that we have taken our attention and parted with it “cheaply and unthinkingly,” but at one point, he seems to hold up shows like House of Cards and Game of Thrones as harbingers of “deep engagement.” Exactly why ten hours of binge-watching is qualitatively better or more life-affirming than ten hours of pursuing one’s active interests online, he does not convincingly say, but it speaks to the reflexive distrust of time spent, as Goldsmith terms it, “clicking around.” But clicking around virtually defines my job as a journalist these days; what matters, I suppose, is where, and how, you are clicking.

A Pokéball against a mural: “The screen has changed the way we look at the world.”

Goldsmith tells the story of sipping wine on a terrace overlooking the Adriatic Sea and suddenly seeing a “giant peachy yellow moon” emerge, captivating everyone except the guy having a text conversation with his girlfriend. There is a lament that he is not “in the moment.” But it gives Goldsmith pause. “Why is looking at the moon somehow perceived to be more ‘present’ than looking at your phone?” How do we even know, I might add, per Thoreau, that the person sitting there, looking at the moon, is actually thinking about the moon? Maybe he is thinking about his girlfriend, a continent away. The threat of our devices is that they take away, among other things, the possibility of this introspective longing and sense of absence.

Cultural critic Virginia Heffernan has noted that our wholesale move to digital life has “moments of magic and an inevitable experience of profound loss”; not to acknowledge these polarities, she argues, is “propaganda.” At times, Goldsmith, in making his vigorous counterclaims, leans too much on this magic. “When I click on a link, I literally press down on language, something that never happens when I’m reading a book.” And yet—so what? How does this “pressing down on language” achieve anything different from turning a page? Sometimes when we tweet, he argues, “we feel like we’ve posted a small literary jewel.” But to what end? People are still reading Montaigne’s essays, centuries on, while last week’s Twitter is long forgotten. To claim that he can productively jog through Manhattan, listen to music, and dictate the outline of his book seems a bit of a stretch. (Studies show that people’s walking begins to suffer as they talk on a phone.) His description of device-distracted pedestrians as merely surrealistic urban “sleepwalkers,” whose attention is geographically distributed, sounds a lot better on paper than on the street, when you’re almost knocked over by someone braying into FaceTime.

In one of the book’s most interesting passages, Goldsmith makes the compelling case for the artist Joseph Cornell as a kind of avant la lettre oracle of the digital age. Cornell’s rampant assemblages were like a living Tumblr, his filmic collages were predecessors of the internet “supercut,” and his famous boxes, suggests Goldsmith, presage today’s onscreen life—the division into “windows,” odd collections of things thrown together, “launching pads for interior voyages.” Most boxes, Goldsmith writes, had mirrors, so people could see themselves in the artwork, like the way today’s digital interfaces constantly reflect us. Nothing wrong with that—self-love and all that—or is there? In his latest collection of essays, Utopia Is Creepy, journalist Nicholas Carr, who has become one of the most eminent of those unnamed critics of digital life alluded to by Goldsmith, quotes Lewis Mumford: “When one is completely whole and at one with the world one does not need the mirror.” When we are feeling less than whole we reach out to the “lonely image,” but online, we can at least all be “alone together,” Carr says—and unlike in real life, “we can be confident that the simulation is real.”

The arc of this story is the historical evolution of media forms that captured ever more of our attention and made “us” into an ever more tangible, and saleable, construct. An organ like the New York Sun didn’t sell newspapers so much as readers; Google’s goal is “no longer to read the web,” writes Carr. “It’s to read us.” Eventually the absorption became so complete that traditional divisions were hard to read. Our attention was being sold by a company like Facebook, but, as Carr suggests, we paid not only with our attention but by providing the energy and material to capture further attention. We’ve become the consumers, the producers, and the content. We are selling ourselves to ourselves.