Gemini by Google

Google Temporarily Halts AI Tool’s Image Generation of People After Racial Bias Backlash

Google has temporarily suspended its artificial intelligence tool Gemini’s ability to generate images of people, following widespread criticism on social media that the tool produced historically inaccurate images, predominantly depicting people of color in place of White individuals.

The controversy underscores the ongoing struggle of AI tools to grapple with issues of race and ethnicity. Similar instances have been observed with other AI models, such as OpenAI’s Dall-E image generator, which has faced backlash for perpetuating harmful stereotypes. Google’s attempt to mitigate these issues with Gemini has seemingly resulted in the tool having difficulty generating images of White individuals.

Gemini, like many AI models including ChatGPT, relies on vast datasets sourced from the internet for training. However, experts caution that these datasets often contain inherent biases, leading AI models to inadvertently replicate and perpetuate these biases.

When prompted to generate an image of a pope, Gemini produced images of individuals who were not White. Similarly, requests for images of a “1943 German Soldier” resulted in images primarily depicting people of color.

In response to the backlash, Google announced on Thursday that it would temporarily halt Gemini’s image generation of people and release an improved version soon. The decision follows an earlier statement in which Google defended the tool’s diverse image generation while acknowledging that it had “missed the mark” in this instance.

Jack Krawczyk, Google’s lead product director for Gemini, emphasized the company’s commitment to designing image generation capabilities that reflect its global user base. However, the incident serves as another setback for Google as it competes with entities like OpenAI in the competitive generative AI landscape.

Last February, Google’s share price briefly dipped after a demo video of the tool, then known as Bard, showed it producing a factually inaccurate response about the James Webb Space Telescope.

