Micro-Targeted

It’s not difficult to imagine the political manipulation of social-media platforms. You can see the trolls, maybe similar to the employees of the Internet Research Agency in St. Petersburg that Adrian Chen investigated for The New York Times back in 2015, more than a year before Donald Trump’s and the Kremlin’s complementary successes in the last U.S. presidential election. You can see these trolls squinting into screens, clacking away on keyboards, making their fake content show up in bad fonts down the Facebook feeds of gullible people all across America. You can see the targets of this manipulation shaking their heads in disdain over Hillary Clinton’s latest conspiracy against freedom. You might be inclined to have sympathy for the targets: They are, after all, citizens of the United States of America; their heads shouldn’t be messed with—especially by a Foreign Power. But just the same you might suppose them to be, as a type, more politically excitable than informed. You might even suppose they’re stupid. Not like you, right?

That’s what’s difficult to imagine: the political manipulation of you. I’ll flatter The New Republic by assuming that if you’re reading this (thank you), you’re a sophisticated person. How are any of these vandals of democracy ever going to get to you? This month, Sue Halpern shows us. The advanced micro-targeting she describes doesn’t principally exploit ignorance or unintelligence, and it isn’t principally represented by wild falsehoods networked through fools; it exploits the whole background of thought and feeling, as our online activity encodes them, against which we make our political choices; and it looks and sounds utterly reasonable. As Halpern writes:

Political campaigns contend that their use of data . . . enables an accurate ideological alignment of candidate and voter. That could very well be true. Even so, the manipulation of personal data to advance a political cause undermines a fundamental aspect of American democracy that begins to seem more remote with each passing campaign: the idea of a free and fair election. That, in the end, is the most important lesson of Cambridge Analytica: It didn’t just “break Facebook,” it broke the idea that campaigns work to convince voters that their policies will best serve them.

But if that still seems abstract, let’s consider what we already know from our lived experience of the social media platforms where all this targeting happens: We willingly help these platforms, the entire purpose of which is to monetize our data, build customized mind spaces that powerfully normalize in-group consensus and suppress constructive dissent. And yet when we’re on these platforms, we imagine we’re just thinking.