Study: Facebook Altered Algorithm During Study on Trustworthy News
Image credit: Nikolas Kokovlis/Contributor/NurPhoto via Getty Images

New research claims that 63 temporary Facebook algorithm changes, made by parent company Meta between November 2020 and March 2021, affected a study published in 2023 concerning whether the social media platform encouraged untrustworthy news.

by Improve the News Foundation

Facts

  • New research claims that 63 temporary Facebook algorithm changes, made by parent company Meta between November 2020 and March 2021, affected a study published in 2023 concerning whether the social media platform encouraged untrustworthy news.[1]
  • The study, conducted jointly between academic researchers and Meta, analyzed data from September to December 2020. It found that Meta's machine-learning algorithm didn't cause noticeable changes in polarization and showed less untrustworthy news content than a reverse chronological feed.[2][3]
  • At the time, the results of Guess et al.'s study were covered by multiple media outlets including The Washington Post, AP News, and The Atlantic.[3][4][5]
  • However, the new analysis found that while the algorithm changes beginning in November 2020 briefly decreased misinformation by 24%, misinformation levels began rising back to pre-study levels starting in March 2021.[6][7]
  • The editorial team for the journal Science said the research 'casts doubt on Facebook's contentions.' This contradicts a previous statement by Meta President of Global Affairs Nick Clegg, who, citing the 2023 paper, claimed Facebook and Instagram did not 'serve people content that keeps them divided.'[8][9]
  • Meta's collaborative studies face growing scrutiny, with critics calling for more government access to social media companies' internal data. This comes ahead of another scheduled study, to be conducted by the Center for Open Science, on the mental health effects of Meta-owned Instagram.[8]

Sources: [1]Phys.org, [2]Science (a), [3]The Washington Post, [4]The Atlantic, [5]Associated Press, [6]Zenodo, [7]EurekAlert, [8]The Wall Street Journal and [9]Science (b).

Narratives

  • Narrative A, as provided by The Daily Wildcat and Los Angeles Times. Social media has played a key part in the rise of polarization, with algorithms and fake news seeking to exploit audiences who continue to spend an increasing amount of time online. Unless there's an immediate and widespread effort to teach key skills to help identify and combat this digital epidemic, social cohesion will continue to crumble in the face of division and hatred.
  • Narrative B, as provided by The Conversation. Social media is one of a multitude of independent factors that can be attributed to today's tide of polarization — ranging from a country's political freedom to the unique psychological state of any given individual. While social media can't be denied as an important part of the puzzle, solving the problem of an increasingly divided world involves more than placing sole blame on online platforms.
  • Cynical narrative, as provided by Washington Examiner and UnHerd. Everyone knows that misinformation is a problem — across the world and with respect to all ideologies — but that doesn't mean we should give tech companies or governments the power to define what is true. Whether it's the Chinese government-linked TikTok app or politicians in Western democracies, so-called 'defenders' of democracy are going to use this fearmongering as a Trojan horse to impose rules in their favor.

