
Life in the Age of Algorithms

As society becomes more wedded to technology, it's important to consider the formulas that govern our data.

Every age has its organizing principles. The nineteenth century had the novel, and the twentieth had TV; in our own time, they come and go more quickly than ever—on Web 1.0 it was the website, for example, and a few years later, for 2.0, it was the app. Now another shift is underway: Today’s organizing principle is the algorithm. (Though you could productively argue that our new lingua franca will be either artificial intelligence or virtual reality.) Algorithms rule the modern world, silent workhorses aligning datasets and systematizing the world. They’re everywhere, in everything, and you wouldn’t know unless you looked. For some of the most powerful companies in the world—Google, Facebook, etc.—they’re also closely held secrets, the most valuable intellectual property a company owns. And they are not neutral.

Gizmodo reminded us of this recently, when it reported that Facebook’s “trending topics”—which many assumed to be decided by an algorithm—were not only curated by a team of contractors, but that those same people purportedly suppressed news from right-leaning sites, particularly those prone to hyperbolic claims. Condemnation was swift, both from liberals such as Glenn Greenwald and from those allegedly suppressed, as in this eminently reasonable response from RedState. For its part, Facebook responded by saying it suppresses nothing, arguing that its own technology prevents such prejudice and that it works only to weed out junk news.

All of that, however, was undercut by a subsequent report in The Guardian later that week, which found that, far from relying on algorithms, Facebook shapes every part of trending topics through human intervention, relying in particular on ten trusted sources (such as the BBC or Fox News) and pushing stories the company feels should be highlighted. Facebook had lied. (The company further clarified its policies in a memo.)

But the story here involves more than the gap between assumed neutrality and editorializing. As one former contractor hired to curate trending topics put it to Gizmodo: “[W]e felt like we were part of an experiment that, as the algorithm got better, there was a sense that at some point the humans would be replaced.” Facebook’s aim, it seems, was eventually to replace its human curators with smarter formulas.

Is it naïve to believe algorithms should be neutral? Perhaps, but it’s also deceptive to advance the illusion that Facebook and the algorithms that power it are bias-free. Choosing which stories are and are not worth reading is part of every editor’s job, and the fact that Facebook hired curators at all suggests it wanted that judgment to be exercised. But it also suggests that the company sees itself as an editorial force, not merely a platform. What’s wrong is to market the service as the latter while privately acting as the former.

With now well over a billion users, and still growing, it’s worth asking: What role should Facebook play in shaping public discourse? And just how transparent should it be?

Because algorithms are mathematical formulas, we often feel they are more objective than people. As a result, Facebook’s trending column enjoyed a degree of trust that wouldn’t necessarily be given to the BBC or Fox News. Yet algorithms can’t be neutral. To illustrate, consider the questions a designer must answer: Should trending topics reflect the most-used words, what is shared most, or what generates the most conversation? Should they consider only posts in English, or also those in Spanish or Mandarin or Hindi? What if a white supremacist post becomes extremely popular? Should Facebook push that to millions of users? And these are only a few of the questions; the answers only become knottier from there.
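To make those choices concrete, here is a minimal sketch, in Python, of what a trending-topic scorer might look like. Every name, weight, and filter below is invented for illustration and says nothing about Facebook’s actual system; the point is that each parameter answers one of the questions above, and a person has to set it.

```python
# Hypothetical sketch only: none of these names or weights come from
# Facebook. Each default is a human, editorial judgment in disguise.
from collections import Counter
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    language: str  # e.g. "en", "es", "zh", "hi"
    shares: int
    comments: int


def trending_terms(posts, languages=frozenset({"en"}), share_weight=2.0,
                   comment_weight=1.0, blocked_terms=frozenset()):
    """Rank terms by a weighted popularity score.

    - languages: whose conversations count at all;
    - share_weight vs. comment_weight: whether virality or discussion
      matters more;
    - blocked_terms: which popular content never trends.
    """
    scores = Counter()
    for post in posts:
        if post.language not in languages:   # choice: which languages matter
            continue
        score = share_weight * post.shares + comment_weight * post.comments
        for term in set(post.text.lower().split()):
            if term not in blocked_terms:    # choice: what is suppressed outright
                scores[term] += score
    return scores.most_common(10)
```

Change any default (the language set, the weights, the blocklist) and a different, equally “neutral”-looking list of topics comes out the other end.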

Deceiving users into believing a service is bias-free is one thing. But to do so in service of denying Facebook’s enormous role in public discourse is quite another. Our experience of the site and the information we receive there are the product of decisions about ethics, branding, user experience, and more—no algorithm will ever give you a pure response, because algorithms reflect the biases of their designers. Facebook knows this, too. It’s why, as the Gizmodo story reports, Facebook’s curators made sure the Black Lives Matter movement made it onto the trending bar. (As it well should have.) Facebook can either clearly outline how its trending-topics algorithm works—e.g., sorting articles by number of shares—or it can express a set of corporate ethics. What it cannot do is both at once.
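The bind is visible in code, too. A fully disclosable rule fits in a line; each ethical commitment is an override layered on top of it. Again a hypothetical sketch, with all names invented:

```python
def rank_by_shares(articles):
    # The transparent, mechanical rule: sort by share count, nothing else.
    return sorted(articles, key=lambda a: a["shares"], reverse=True)


def apply_editorial_policy(ranked, promoted_topics, banned_topics):
    # Each override expresses an ethic, not a formula: suppress some
    # popular items, surface others regardless of their numbers.
    kept = [a for a in ranked if a["topic"] not in banned_topics]
    boosted = [a for a in kept if a["topic"] in promoted_topics]
    rest = [a for a in kept if a["topic"] not in promoted_topics]
    return boosted + rest
```

Publish only the first function and the second makes the disclosure false; publish both and the output is no longer the product of a neutral formula.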

Still, the fact that we as Facebook users ever wanted neutrality speaks to a belief in digital democracy. That is the contrast Facebook itself has set up: The company deliberately positioned itself as a distribution network, explicitly not an editorial entity. Facebook is meant to be the home of what the world is talking about. Its business model depends on it, even if that’s an impossible goal.

Instead, we’ve come face to face with something quite different: what we created. In his long analysis of the rise of presumptive Republican presidential nominee Donald Trump, Andrew Sullivan suggests that the populist candidate’s ascent is a side effect of “too much democracy”—Sullivan believes the flattened hierarchies and increased tolerance that characterize the postmodern era actually allow for the rise of an authoritarian demagogue like Trump. It’s not an airtight argument—it certainly ignores the virulent racism and belief in white supremacy that Trump’s policies reflect—but surely some of Trump’s popularity stems from precisely the kind of space Facebook provides. In the filter bubbles of Facebook feeds and comment threads, support can build for a candidate with a shaky relationship to truth and a ham-fisted grasp of policy. My point isn’t that editorial choices on Facebook’s part would somehow have stopped Trump; rather, it’s that Trump’s rise makes scrutiny of Facebook’s role in public discourse all the more important.

After all, Facebook is mind-bogglingly massive. It accounts for a huge portion of the traffic directed to news sites; small tweaks to its feed algorithm can have serious consequences for media companies’ bottom lines. And trending topics are just part of the bigger issue, which is the newsfeed itself: The algorithms and editorial choices that determine what most of us see when we log in are the problem with trending topics writ large. It’s important to remember that Facebook is a private company with its own vested interests.

What can be done? At least one response has come from the Senate Commerce Committee, which has asked Facebook to prepare its staff to brief the Senate on how the social network filters trending topics. It is a tacit acknowledgment that, in order for public discourse to work, the systems that shape speech must first answer to the public.