Google pauses its Gemini AI tool after critics blasted it as ‘too woke’ for generating images of Asian Nazis in 1940s Germany, Black Vikings and female medieval knights

by Elijah
X user Frank J. Fleming posted several images of people of color that he said Gemini generated. Each time, he said, he had been trying to get the AI to give him a picture of a white man, and each time it returned people of color instead.

Google is pausing its new Gemini AI tool after users criticized the image generator for being “too woke” by replacing white historical figures with people of color.

The AI tool produced Vikings, knights, founding fathers and even Nazi soldiers of various races.

Artificial intelligence programs learn from the information at their disposal, and researchers have warned that AI tends to recreate the racism, sexism and other prejudices of its creators and society at large.

In this case, Google may have overcorrected in its efforts to address discrimination, as some users sent it prompt after prompt in failed attempts to get the AI to generate an image of a white person.

Google’s Communications team issued a statement Thursday announcing that it would pause Gemini’s generation of images of people while the company works to “address recent issues.”

“We are aware that Gemini is offering inaccuracies in some historical image generation depictions,” the company’s communications team wrote in a post on X on Wednesday.

The historically inaccurate images led some users to accuse the AI of being racist against whites or too woke.

In its initial statement, Google admitted to having “missed the mark,” while maintaining that Gemini’s racially diverse images are “generally a good thing because people around the world use them.”

On Thursday, the company’s Communications team wrote: ‘We are already working to resolve recent issues with Gemini’s image generation feature. While we do this, we’re pausing the generation of images of people and will re-release an improved version soon.’

But even the announcement of the pause failed to appease critics, who responded with ‘go woke, go broke’ and other fed-up retorts.

After initial controversy earlier this week, Google’s Communications team issued the following statement:

‘We are working to improve these kinds of depictions immediately. Gemini’s AI image generation does generate a wide range of people. And that’s generally a good thing because people around the world use it. But it’s missing the mark here.’

One of the Gemini prompts that generated controversy was ‘German soldiers from 1943’. Gemini showed a white man, two women of color, and a black man.

“I’m trying to come up with new ways to ask for a white person without explicitly saying so,” wrote user Frank J. Fleming, whose requests did not return any images of a white person.

In one case that upset Gemini users, a user’s request for an image of the Pope was met with a photo of a South Asian woman and a black man.

Historically, every Pope has been a man. The vast majority (more than 200 of them) have been Italian. Three popes throughout history came from North Africa, but historians have debated their skin color because the most recent, Pope Gelasius I, died in 496.

Therefore, it cannot be said with absolute certainty that the image of a black Pope is historically inaccurate, but there has never been a female Pope.

In another, the AI responded to a request for medieval knights with four people of color, including two women. While European countries were not the only ones to have horses and armor during the medieval period, the classic image of a “medieval knight” is that of Western Europe.

In perhaps one of the most egregious mishaps, a user asked about a German soldier from 1943 and was shown a white man, a black man, and two women of color.

The German army of World War II did not include women and certainly did not include people of color. In fact, it was dedicated to exterminating the peoples Adolf Hitler considered inferior to the blond, blue-eyed ‘Aryan’ race.

Google launched Gemini’s AI image generation feature in early February, competing with other generative AI programs like Midjourney.

Users could type a prompt in plain language and Gemini would spit out several images in seconds.

In response to Google’s announcement that it was pausing Gemini’s image generation features, some users posted ‘Go woke, go broke’ and other similar sentiments.

X user Frank J. Fleming repeatedly asked Gemini to generate images of historically white groups, including the Vikings. Gemini returned results showing dark-skinned Vikings, including a woman.

This week, however, a flood of users began criticizing the AI for prioritizing racial and gender diversity over historical accuracy in the images it generated.

The week’s events appeared to stem from a comment made by a former Google employee, who said it was “embarrassingly difficult to get Google Gemini to acknowledge that white people exist.”

That remark appeared to spark a wave of efforts by other users to recreate the problem, generating fresh examples to be angry about.

The problems with Gemini appear to arise from Google’s efforts to address bias and discrimination in AI.

Former Google employee Debarghya Das said: “It’s embarrassingly difficult to get Google Gemini to acknowledge that white people exist.”

Researchers have found that because of the racism and sexism that is present in society and because of the unconscious biases of some AI researchers, supposedly impartial AIs will learn to discriminate.

But even some users who agree with the mission to increase diversity and representation commented that Gemini was wrong.

“I have to point out that it is good to portray diversity **in certain cases**,” wrote one X user. “Representation has material outcomes on how many women or people of color go into certain fields of study. The stupid thing here is that Gemini isn’t doing it in a nuanced way.”

Jack Krawczyk, senior product manager for Gemini at Google, posted on X on Wednesday that the historical inaccuracies reflect the tech giant’s “global user base” and that it takes “representation and bias” seriously.

“We will continue to do this for open-ended prompts (images of a person walking a dog are universal!),” Krawczyk added. “Historical contexts have more nuance and we will further tune to accommodate that.”
