YouTube’s Fake News Problem Isn’t Going Away

The vulnerabilities that Parkland conspiracy theorists are exploiting are baked into the video site’s DNA.

It’s become an inevitability in the social media era. Whenever a massacre occurs, a flood of fake news follows. The narrative is predictable, even if the details are often convoluted. What transfixed and horrified the country was all a stunt, or the victims are paid actors, or a mix of the two. The conclusion is that some sinister plot was advanced, usually involving the confiscation of firearms. The shooting at Marjory Stoneman Douglas High School in Parkland, Florida, that left seventeen students and teachers dead has been no different.

Earlier this week, a video circulated alleging that David Hogg, a Stoneman Douglas student who has been an outspoken gun control advocate in the wake of the tragedy, was an actor. It racked up hundreds of thousands of views and briefly claimed the top spot on YouTube’s “trending” rankings. YouTube pulled the video hours later, saying it had “been removed for violating YouTube’s policy on harassment and bullying.” The video had used an interview Hogg had done over the summer with a local news station in California, about a fight between a lifeguard and a swimmer, as “proof” that Hogg was a “crisis actor.”

YouTube characterized the incident as an anomaly. “Because the video contained footage from an authoritative news source, our system misclassified it,” YouTube said in a statement. “As soon as we became aware of the video, we removed it from Trending and from YouTube for violating our policies. We are working to improve our systems moving forward.”

Most of the discussion of “fake news” following the 2016 election has centered on Facebook and Twitter. But YouTube has increasingly come under scrutiny as well. It has a fake news problem, of course. It also has a creepy kids video problem. And a Logan Paul problem. YouTube says it is taking steps to “improve its systems,” but there are questions about what it can actually do to curb fake news. That’s partly because sophisticated groups seeking to sow chaos will find new ways to exploit vulnerabilities in social networks. But it’s also because the vulnerabilities being exploited are baked into YouTube’s DNA.

In the introduction to Videocracy, his recently published celebration of online videos, Kevin Allocca, YouTube’s head of culture and trends, highlights the inherently democratic nature of sites like YouTube. “The manner in which we obtain and spread knowledge—and share our immediate experiences—has become more personal and direct, influencing the way we see the world and one another,” Allocca writes. “When niche passions and interests drive programming, seemingly small communities can come to have huge influence in our lives, and the entertainment we consume can begin to reflect deeper, less conscious needs that often go unacknowledged in our media.”

“The result?” Allocca continues. “A new type of pop culture is forming.”

This is the story that YouTube likes to tell about itself. Once upon a time, there were cultural gatekeepers who controlled who saw what and when they saw it. Advances in streaming video bypassed those gatekeepers, who have spent the past decade-plus running around like chickens with their heads cut off as their cultural power was transferred to digital upstarts. Allocca is right to highlight YouTube’s power to provide platforms for subcultures that might otherwise have struggled to gain prominence. The democratic potential of platforms like YouTube can elevate marginal groups.

The Parkland survivors who have spoken out powerfully about the need for gun control are part of that generation of digital natives. Born in the 21st century, they’ve spent their entire conscious lives in a world overflowing with screens. When Facebook opened up to non-college students, they were toddlers. These students’ knowledge of social media has allowed them to exploit the same kinds of feedback loops that Donald Trump did during the 2016 campaign: They tweet something, cable news networks cover it, and then they respond to the coverage they’ve received. Without social media, it’s hard to see how the Parkland students could have acquired so much influence so quickly.

But the aftermath of Parkland and other massacres shows the double-edged sword of this power. As The New York Times’s John Herrman pointed out earlier this week, “After the massacre in Las Vegas last October, YouTubers filled a void of information about the killer’s motives with dark speculation, crowding the site with videos that were fonts of discredited and unproven information, including claims that the tragedy had been staged.” Similar videos circulated after Sandy Hook, Sutherland Springs, and San Bernardino, among other tragedies.

The problem is that one of the “deeper, less conscious needs” YouTube is built to fulfill leads people to insane conspiracy theories. There is something deeply human about this—the desire to find meaning and order amid chaos. But that doesn’t change the fact that YouTube is both a giant invitation to, and mechanism for, circulating misinformation after events like the Parkland massacre.

Part of the problem is that the social concept of “trending” is inherently broken. As New York’s Brian Feldman wrote earlier this week, “it selects and highlights content with no eye toward accuracy, or quality. Automated trending systems are not equipped to make judgments; they can determine if things are being shared, but they cannot determine whether that content should be shared further.” YouTube is built to amplify videos that people are gravitating toward, even if those videos have no basis in reality.

This is part of a broader quality problem as well. YouTube has actively rewarded attention-grabbing narcissists like Logan Paul on the basis of a similar principle: If people are viewing it, it has value. But as YouTube has become increasingly easy to hijack by people seeking to grab our attention—for reasons good and ill—this philosophy seems increasingly antiquated. YouTube has built a platform that is free of judgment, and that existential flaw has come back to haunt it.

YouTube still points to its early viral successes, most of which were harmless, as the core of what it does. But the most memorable YouTube videos of the past year haven’t been of dudes having their minds blown by double rainbows. They’ve been vehicles for crazy conspiracy theories.