Facebook also has a terrorism problem.

On Monday, the families of three victims of the Pulse nightclub massacre in Orlando filed a lawsuit against Facebook, Twitter, and Google. The suit, first reported by Fox News, was brought by the families of Tevin Crosby, Javier Jorge-Reyes, and Juan Ramon Guerrero, who contend that the three Silicon Valley tech giants provided “the terrorist group ISIS with accounts they use to spread extremist propaganda, raise funds, and attract new recruits.”

This is not the first time these companies have faced such a suit. In June, the father of Nohemi Gonzalez, the only American killed in the November 2015 Paris attacks, claimed the three companies provided “material support” instrumental to the rise of ISIS.

The lawsuit takes aim at Section 230 of the Communications Decency Act of 1996, which states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” In other words, online platforms are not liable for content published by other people.

But the provision reflects how difficult it is for our laws to keep pace with the rapid changes and growing influence of technology. Those changes demand a re-examination of the moral responsibilities of Silicon Valley. As Sarah Jones wrote in The New Republic, “Algorithms can’t solve everything.” From potentially influencing an election to endangering the lives of citizens, this year revealed the real-life consequences of internet phenomena.