Study: Meta Algorithm Has Little Effect on Political Polarization

by Improve the News Foundation
Image credit: Unsplash

Facts

  • Four research papers released by the journals Science and Nature on misinformation and political polarization on social media have revealed that tweaks made to the Facebook and Instagram algorithms "did not sway political attitudes." [1]
  • In the studies, researchers from several institutions, including Princeton University, Dartmouth College, and the University of Texas, were granted access by Facebook and Instagram owner Meta to social media data pertaining to the 2020 US presidential election. [2]
  • Of particular interest to the researchers were social media "echo chambers," in which users are mostly shown partisan content that agrees with their worldview. An analysis of data from 208M Facebook users found stark differences in the content left-wing and right-wing users engage with. [3]
  • In an experiment, the researchers altered the feeds of thousands of Meta users in late 2020 to show the most recent posts rather than those Meta's algorithm deemed most engaging. Users given the chronological feed spent less time on the platforms, but follow-up surveys found no change in political polarization. [4]
  • Other interventions, such as limiting content from like-minded users, also failed to alter political behavior online or offline. Joshua Tucker, a lead investigator on the project, concluded that proposals to reduce social media echo chambers would have had a limited impact on the 2020 election. [3]
  • According to the research, right-wing users were also more likely to consume information flagged as misinformation by Meta. The authors noted, however, that the studies don't capture the long-term impact of the algorithm on users, and further studies are forthcoming. [5]

Sources: [1] Guardian, [2] CNBC, [3] Nature, [4] Washington Post, [5] Associated Press.

Narratives

  • Establishment-critical narrative, as provided by Guardian. These studies are carefully crafted pieces designed to shift blame away from the social media platforms that fuel intense political polarization. Social media platforms are in no way absolved of the damage they have caused democracy, and they have even reversed some of their misinformation guardrails in the run-up to the 2024 election. No matter how odious the content, engagement is the only thing of value to these companies.
  • Pro-establishment narrative, as provided by Renew Democracy Initiative. Everybody loves a scapegoat, but the inconvenient reality of political polarization is that social media platforms play a limited role. The algorithms of Meta and others simply make it easier for people to see the content they want to see without affecting their underlying beliefs or values. The issues in America's democracy go deeper than Facebook and speak to political discourse in profound need of repair. Polarization is driven by people, not platforms, and the solution lies within everyday people, not Silicon Valley.
