A New York state judge has ruled that several social media companies, including Facebook, YouTube, Discord, Twitch, and Reddit, are not legally liable for the radicalization and actions of the gunman who carried out the 2022 mass shooting in Buffalo. The decision is a major victory for tech firms frequently under fire for hosting extremist content.
Families Sue for Accountability After Tragic Hate Crime
Families of the 10 Black victims who were fatally shot at a Tops supermarket in Buffalo had filed a civil suit, arguing that social media platforms enabled the 18-year-old white supremacist, Payton Gendron, to become radicalized and carry out the racially motivated attack. Their lawsuit sought to hold the companies responsible for spreading hate speech and failing to remove violent content.
Section 230 Shields Tech Firms Once Again
Judge Paula Feroleto concluded that the companies were protected under Section 230 of the Communications Decency Act, a federal law that immunizes platforms from liability for third-party content. The ruling emphasized that although the shooter may have consumed and shared hateful material online, the platforms were not legally responsible for his criminal actions.
Content Access Doesn’t Equal Causation, Says Court
The court stated that while it was “deeply sympathetic” to the victims’ families, legal precedent and statutory protections mean social media exposure is not sufficient to establish direct causation or liability. “Publishing content—even abhorrent content—is not the same as participating in a crime,” the judge wrote.
Civil Rights Advocates Disappointed but Not Surprised
Advocates and legal representatives of the families expressed disappointment, noting that the decision underlines the urgent need for legislative reform. “Big Tech must not continue to hide behind outdated laws while their platforms become breeding grounds for domestic terrorism,” said one attorney involved in the case.
Wider Implications for Future Lawsuits
The ruling could have lasting consequences for future legal actions attempting to hold social media companies accountable for offline violence linked to online radicalization. It reinforces a string of similar outcomes where courts have largely sided with platforms under Section 230 protections.
Background: A Livestreamed Tragedy
On May 14, 2022, Gendron livestreamed the deadly attack on Twitch, wearing tactical gear and using a semi-automatic rifle. Authorities later found that he had posted manifestos and violent plans on Discord, Reddit, and 4chan. His content was widely condemned and eventually removed, but not before it had been viewed and shared across the internet.
The Legal and Ethical Debate Continues
While this ruling marks the end of the lawsuit in New York courts, it adds fuel to the ongoing political and ethical debate surrounding tech company responsibility, algorithmic amplification of hate, and the need for tighter digital content regulation. Lawmakers are once again being urged to revisit Section 230 and establish clearer accountability mechanisms.