
Facebook Is Even Worse Than Anyone Imagined

A trove of internal documents reveals a company that is willfully spreading misinformation and hate in pursuit of profit and growth.

Facebook founder Mark Zuckerberg (Chip Somodevilla/Getty Images)

Speaking to The Verge’s Casey Newton earlier this year, Facebook founder Mark Zuckerberg made his case for Facebook. “I think if you look at the grand arc here, what’s really happening is individuals are getting more power and more opportunity to create the lives and the jobs that they want,” Zuckerberg said. “And to connect with people they want. And to connect to the ideas that they want and to share the ideas that they want. And I just think that that will lead to a better world. It will be different from the world that we had before. I think it will be more diverse, I think more different ideas and models will be able to exist.”

But Zuckerberg suggested that a problem remained. By empowering individuals, Facebook was disrupting a hierarchy that had existed for generations: “My concern is that we’re too frequently telling the negative sides of it, from the perspective of the institutions that may be not on the winning side of these changes.” For the last several years, Facebook had been besieged by a narrative that its products were making the world an uglier and more divisive place. Here, Zuckerberg inverted the critique: The real victims of Facebook’s rise weren’t its users but a number of dusty institutions raging as their power was redistributed to the people. In this version of events, Facebook wasn’t just empowering its users; it was liberating them.

Over the last few days, that pretty little picture has taken a serious hit, as several news organizations have begun reporting out revelations from internal Facebook documents provided to them by whistleblower (and crypto enthusiast) Frances Haugen, who worked at the company for two years before leaving in May. These documents illuminate a Facebook that is the opposite of Zuckerberg’s rose-tinted vision: a company that knowingly provides a product used to spread misinformation and hate speech and to facilitate terrorism and sex trafficking, and whose meager efforts to stop these things have often failed—as they did, most notably, in the lead-up to the January 6 insurrection.

These are the most damning internal leaks from Facebook to come to light yet—more hair-raising than the revelations contained in Sheera Frenkel and Cecilia Kang’s insider account, An Ugly Truth: Inside Facebook’s Battle for Domination; perhaps even more damning than the Cambridge Analytica scandal that rocked the social network three years ago. These new disclosures reveal that Facebook executives know exactly what their product is doing to its users—and, by extension, to the world—and are aware of the inadequacy of their halting efforts to mitigate these society-damaging impacts. In almost every instance, the company privileged its own profits, growth, and—in particular—its efforts to boost its anemic popularity with young people over the health and well-being of its user base.

“Time and again, [researchers at Facebook] determined that people misused key features or that those features amplified toxic content, among other effects,” reported The New York Times’ Mike Isaac. “In an August 2019 internal memo, several researchers said it was Facebook’s ‘core product mechanics’—meaning the basics of how the product functioned—that had let misinformation and hate speech flourish on the site. ‘The mechanics of our platform are not neutral,’ they concluded.”

It’s hard to think of a more damning determination: Facebook’s product inevitably led to the spread of hate speech and misinformation. But this conclusion is inescapable when you look at other findings. Election misinformation continued to proliferate rapidly in the aftermath of the 2020 election; one data scientist warned that 10 percent of content viewed in its wake alleged widespread fraud. Facebook discovered that its product would begin recommending QAnon content within a matter of days to users who had merely shown interest in conservative topics. “The body of research consistently found Facebook pushed some users into ‘rabbit holes,’ increasingly narrow echo chambers where violent conspiracy theories thrived,” NBC News reported. “People radicalized through these rabbit holes make up a small slice of total users, but at Facebook’s scale, that can mean millions of individuals.”

The documents also show that Facebook’s efforts to stop anti-vax misinformation from spreading were often wildly deficient and that the company was slow to understand just how woeful its response was—an especially shocking revelation given that the last five years have repeatedly demonstrated that the platform is overtaken by misinformation with an ease that suggests it was built to do just that.

The situation in the rest of the world, meanwhile, is worse than in America. A handful of countries—the United States among them—receive extensive content moderation attention. While these moderation efforts are often inadequate, they’re significantly better than what most of the world gets. Researchers have found that Facebook has been used to facilitate everything from hate speech to ethnic cleansing. Mark Zuckerberg, meanwhile, has intervened on behalf of authoritarian governments: Given the choice between helping Vietnam’s autocratic government censor posts and ceasing to do business in the country, he personally elected to go the former route. Again and again, you see Facebook making this choice, in a slew of different ways: The company always chooses profits and growth, even when that choice demonstrably sows discord, spreads misinformation or violent incitement, or makes the world a worse place.

Most striking of all is the pervasive sense that desperation drives many of these choices; that Facebook is always facing some existential threat even as growth is pushed to the exclusion of all other considerations. Facebook has lately struggled to attract younger users, and the company is doing everything it can to reverse that trend, a choice that predictably leads to devastating results that likely exacerbate the original problem. That is, I suppose, a silver lining—it suggests that Facebook may at some point organically cease to be as important to the world as it is now. But that moment, if it comes, is a long, long way away—and if the company is truly in some sort of doom loop, there’s no telling what will govern its next decision or what harm the next quick fix or incautious action might unleash. For now, it is enough to note that the company is clearly doing great harm to the entire world, and it is doing so knowingly.