Biden Admin. Urges SCOTUS to Narrow Protections for Big Tech

Facts

  • In a filing to SCOTUS on Wednesday, Pres. Biden’s admin. argued that there should be limits to Section 230 protections for social media companies, and that companies should face liability in cases where their algorithms recommend dangerous content to users.
  • Section 230 of the Communications Decency Act was passed by Congress in 1996 to shield all websites, including social media platforms, from being sued over the dissemination of third-party content. It has been criticized from both sides of the political aisle because of its potential contribution to the spread of hate speech and misinformation, as well as political discrimination.
  • SCOTUS is presiding over a lawsuit filed by the family of Nohemi Gonzalez, an American who was killed by Islamist militants in a mass shooting at a Paris bistro in 2015. Gonzalez’s family argues Alphabet — the parent company of Google and YouTube — should share liability for the deaths because YouTube recommended videos by the Islamic State group to the killers through an algorithm.
  • In its brief, the Dept. of Justice didn’t argue for Google to be held liable in this specific case and voiced support for most of the protections Section 230 provides. But the Dept. suggested that recommendation algorithms like the ones used by YouTube and other platforms should face greater scrutiny.
  • Sen. Josh Hawley (R-Mo.) also filed a brief in this case, writing that Section 230 hasn’t been applied correctly to social media platforms, and “far from making the internet safer for children and families, Section 230 now allows platforms to escape any real accountability for their decision-making — as the tragic facts, and procedural history, of this case make clear.”

Sources: Reuters, CNN, and Archive.

Narratives

  • Narrative A, as provided by Axios. Reducing Section 230 protections would cause a tidal wave of litigation and the financial collapse of social media platforms. These companies have invested heavily in content-moderating technology and are doing their best to keep dangerous content from spreading. Leave Section 230 alone; it serves an important function in protecting platforms from overwhelming lawsuits.
  • Narrative B, as provided by Bloomberg Law. It’s time for Big Tech to stop ducking responsibility for the harm its platforms cause and to face the same liability as other industries. For too long, social media platforms have gotten away with aiding and abetting horrific events and movements. It’s time for the antiquated Section 230 to be reformed — the reality of the internet of the 1990s does not apply to 2022.