Google Apologizes for Gemini's Historical Inaccuracies
Image credit: Michael M. Santiago/Staff/Getty Images News via Getty Images


by Improve the News Foundation

Facts

  • Google on Thursday issued an apology after its artificial intelligence (AI) chatbot Gemini generated historically inaccurate or implausible images in its attempt to be diverse and inclusive.1
  • Google paused image generation by Gemini after several examples of this were posted online, including racially diverse Nazi-era German soldiers and a Black pope. Gemini apparently also failed to generate images of White people.2
  • One Gemini user said it would generate an image of 'a black family' but refused when asked for 'a white family.' When asked to create an image of America's Founding Fathers, it included a woman and people of color.3
  • Gemini also reportedly provided answers about the Israel-Palestine conflict with a bias against Israel.4
  • Jack Krawczyk, a senior director on Google's Gemini team, said Wednesday that Gemini's depiction of diverse people was 'generally a good thing because people around the world use it,' but added that it's still 'missing the mark.'5
  • The image-generation feature was added to Gemini, formerly known as Bard, earlier this month.6

Sources: 1The New York Times, 2Quartz, 3The Telegraph, 4Daily Wire, 5The Guardian and 6The Verge.

Narratives

  • Left narrative, as provided by Forbes. These issues are to be expected because most AI developers are White men, who are trying to avoid embedding their innate biases into the technology. There's been an overcorrection, but it's fixable and preferable to AI adopting disturbing stereotypes about other races.
  • Right narrative, as provided by New York Post. Gemini was designed by humans who seem to think American and European history is too white. What was previously only assumed about Big Tech leaders and their view of Western civilization has now been confirmed. No apology can make up for their clear disregard for actual history.
  • Narrative C, as provided by Digital Information World. Gemini and other AI products are in their infancy and it's better to have this debate out in the open than to allow the technology to develop biases one way or another. It will take far more training, but eventually, AI will strike the right balance when it comes to understanding and explaining history.
