Australian Mayor Readies Defamation Suit Over ChatGPT
Facts
- Brian Hood, the mayor of Hepburn Shire, Australia, said he may sue OpenAI, the maker of the artificial intelligence (AI) chatbot ChatGPT, amid reports that the bot has falsely claimed he served time in prison for bribery.1
- ChatGPT has reportedly said that Hood went to prison in connection with a foreign bribery scandal involving Note Printing Australia, a subsidiary of the Reserve Bank of Australia, in the early 2000s. His lawyers said he did work there but that he was the one who notified authorities of the scheme.2
- Hood's lawyers also said they sent OpenAI a letter of concern on March 21, giving the company 28 days to correct the errors about their client or face a possible defamation lawsuit.3
- According to the legal team, the San Francisco-based company hasn't yet responded to their letter. If the lawsuit goes forward, it would likely be the first time a person has sued OpenAI for claims made by the automated language product.4
- One of Hood's lawyers, James Naughton, argued that 'He's an elected official, his reputation is central to his role,' adding that 'it makes a difference to him if people in his community are accessing this material.'2
- Hood doesn't know exactly how many users accessed the false information, a factor that would help determine the size of any payout, but Naughton said the nature of the claims was serious enough that he may seek more than A$200K ($134K).1
Sources: 1Reuters, 2Mint, 3Reason.com, and 4The Straits Times.
Narratives
- Narrative A, as provided by Wall Street Journal. While the harm to Hood's reputation is troubling, it will be very difficult to prove an AI algorithm is at fault for disseminating defamatory information. Legally speaking, to defame someone the perpetrator must have knowingly spread the falsehoods with malice, but how could a computer do that? Such cases involving public figures will fuel an ever-growing debate over AI and its role in public discourse, but suing a robot isn't a winnable course of action.
- Narrative B, as provided by Gizmodo. The case against OpenAI has nothing to do with the algorithm and everything to do with the company's delayed response to Hood's request. Once Hood showed the information to be false, OpenAI should have removed it from the platform immediately, but, according to the lawyers, it hasn't done so and has therefore opened itself up to a legitimate allegation of defamation.