Reports: Israel Using Sophisticated AI in War in Gaza
Image credit: Ahmad Hasaballah/Stringer/Getty Images News via Getty Images

by Improve the News Foundation

Facts

  • As the war in Gaza enters its seventh month, the media has increasingly reported on Israel's use of AI-based technologies across several areas of the war. On Friday, The Intercept reported that Israel's use of Google Photos violated the company's rules.[1]
  • Israel has reportedly used Google Photos for its facial recognition program in Gaza, with an Israeli official saying it worked better than any alternative facial recognition technology and helped compile a 'hit list' of alleged Hamas fighters who participated in the Oct. 7 attack.[2]
  • The Intercept argued that Israel's use of Google Photos breached the company's terms prohibiting 'dangerous and illegal activities' that 'cause serious and immediate harm to people.' Google reportedly did not comment on the matter.[1]
  • Earlier this week, Israeli outlet +972 Magazine reported that Israel has been using an AI-based program named 'Lavender' to mark 'all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ).'[3]
  • The report claimed that 37K Palestinians were flagged as suspected militants and that their homes were marked for possible airstrikes. The magazine added that the military purposefully targeted militants at night, as it was 'easier to locate the individuals in their private houses,' killing thousands of civilians as a result.[3]
  • The army also allegedly decided during the first weeks of the war that, for every junior Hamas operative Lavender marked, it was permissible to kill up to 15 or 20 civilians, breaking from past rules meant to avoid 'collateral damage' in assassinations of low-ranking militants.[3]

Sources: [1] The Intercept, [2] New York Times, and [3] +972 Magazine.

Narratives

  • Pro-establishment narrative, as provided by Foreign Policy. AI-based systems in war don't have to be perfect; they just need to be better than humans. Of course, there's always a danger that policymakers will go too far in using AI in war, but that doesn't mean AI-based intelligence gathering and weapons systems should be limited altogether. The AI arms race is here, and countries must adapt to this new frontier.
  • Establishment-critical narrative, as provided by Guardian. This could very well be a war crime. Israel is using systems that are largely untested and known to make errors. Still, they've put together 'kill lists' with as many as 37K names on them, with human reviewers acting as little more than a rubber stamp rather than a check on the technology's accuracy. This is why there are so many civilian deaths, and Israel must be held accountable.
  • Technoskeptic narrative, as provided by Al Jazeera. Israel's use of AI in its brutal war in Gaza demonstrates the necessity of approaching technological development with caution. These dystopian programs acquire and kill targets with ruthless efficiency and little oversight. There must be a moratorium on AI-based technologies in war, as they're rapidly being used to commit unspeakable crimes.


