Watchdog Groups: Investigate Technology Behind AI Systems

Facts

  • 13 artificial intelligence (AI) watchdog groups have written to their respective European regulators to urge them to investigate the underlying technology that powers chatbots such as ChatGPT.1
  • The groups also urged US Pres. Joe Biden to commence research into the technology and its potential harms to humanity.2
  • Consumer groups from Sweden, Italy, Spain, the Netherlands, Greece, Denmark, and other countries cited potential problems with AI, including providing incorrect medical information, manipulating the public, creating fake news stories, and illegally gathering people's data from around the internet.3
  • The groups' goal, citing research from the Norwegian Consumer Council, is for governments to take action on emerging AI and for regulators to study it thoroughly before implementing pending or future laws.2
  • Some EU countries have already acted; Italy's privacy watchdog, for instance, ordered ChatGPT owner OpenAI to temporarily halt its processing of users' personal information. The entire European bloc is finalizing the world's first set of AI laws, though they aren't expected to take effect for two years.1

Sources: 1Associated Press, 2CryptoRank, and 3The Economic Times.

Narratives

  • Narrative A, as provided by The New York Times. Western lawmakers' ignorance of this topic has left their countries well behind in regulating AI. Laws like those proposed in Europe are a good start, but the public also deserves to know exactly how these companies build the systems they've released into the world. AI is a technology that can pose an existential risk, and it requires serious intervention by the public sector.
  • Narrative B, as provided by Reuters. AI isn't actually new; it has served consumers' convenience for years. Although there's always the potential for human abuse, combating it doesn't necessarily require over-regulation. What the public needs instead is a more nuanced understanding of the promise and context of this evolving technology.