Major Publishers Preparing to Sue AI Firms
A handful of major publishers are reportedly close to banding together to file a lawsuit against artificial intelligence (AI) platforms and to push for legislative action regulating the use of publishers’ content to train AI models.
Facts
- A handful of major publishers are reportedly close to banding together to file a lawsuit against artificial intelligence (AI) platforms and to push for legislative action regulating the use of publishers’ content to train AI models.1
- The New York Times and News Corp., owner of the Wall Street Journal, are among the larger entities in the group. Joey Levin, CEO of IAC, another company in the group, warned that an AI takeover of the media "could be more profound than" the question of "whether AI is going to take over the world to eliminate humans and all that stuff."1
- The publishers' biggest worry is reportedly that AI could allow Google to answer searchers' questions directly rather than directing users to a link or providing attribution along with the information.2
- IAC chairman Barry Diller has been an outspoken critic of AI, recently calling it "overhyped." However, he says the lawsuit is necessary to protect publishers' copyrights going forward, as generative AI could abuse the fair use doctrine by ingesting large amounts of copyrighted material.3
- Multiple lawsuits have been filed in the past few months against Google, OpenAI, Meta, and other AI companies on behalf of content creators — including comedian Sarah Silverman — accusing the companies of copyright infringement.2
- Meanwhile, Google recently revealed it is developing Genesis, an AI content creation tool, which could use existing data to produce news articles.4
Sources: 1Semafor, 2Forbes, 3THE DECODER, and 4WinBuzzer.
Narratives
- Narrative A, as provided by MIT Technology Review. With legislative action unlikely to come out of Congress, lawsuits are the best way to act against AI platforms that may be violating the law. Even the threat of litigation should motivate AI companies to build and train their models in a fairer, more equitable way, and could ultimately get content creators compensated.
- Narrative B, as provided by Tech News World. With so much data being used to train these AI products, it will be difficult to identify what content has been used and who originally produced it. Winning a lawsuit, which would be costly to both the winner and the loser, could prove difficult. In addition, punishing AI for learning from others' work could invite similar claims against humans who learn by observing the work of others.