
Google Gemini is accused of being racist towards white people: Users claim the AI bot refuses to create images of Caucasian people – after asking for photos of Popes, Vikings, and country music fans

by Elijah

It is one of the most popular AI chatbots in the world.

But Google’s Gemini has been accused of being racist toward white people.

The tool uses artificial intelligence to create images from prompts in a matter of seconds.

But users claim the AI bot refuses to create images of Caucasian people, after testing it with requests for Popes, Vikings, and country music fans.

‘New game: Try to get Google Gemini to create an image of a Caucasian man. So far I have had no success,’ wrote one user on X (formerly Twitter).

Google last week launched its ‘next generation model’, Gemini 1.5, which includes the image generation option.

“You can create captivating images in seconds with Gemini Apps,” Google said.

“Whether it’s work, play, or anything else, Gemini Apps can help you generate images that help bring your imagination to life.”

To create an image, users can type simple prompts for the AI bot, and Google recommends they start with words like draw, generate, and create.

Several enthusiastic fans have tested the tool, creating images of everything from dogs riding surfboards to flying cars.

However, some early users have noticed that the vast majority of images of people generated by the tool are not white.

The problem was first pointed out on X by Frank J. Fleming, a former computer engineer and children’s television writer.

Fleming first asked Google Gemini to “create an image of a Pope” and discovered that both generated images were of people of color.

He then tried to “come up with new ways to ask about a white person without explicitly saying so.”

However, his requests for images of medieval knights, someone eating a mayonnaise sandwich on white bread, someone dancing badly, a country music fan, and a Viking also resulted in images of people of color.

He finally managed to create an image of a white man and a white woman when he asked Google Gemini for an image of “people who might be named Seamus.”

Fleming initially questioned whether a ‘diversity’ algorithm was to blame for the problem.

To test this theory, he asked the bot for images of Zulu warriors and samurai, but once again found that they were all depicted as people of color.

‘This is interesting to me now as a programmer. I just want to explore it until I can figure out what the algorithm is,’ he wrote.

‘At first glance, simply trying to diversify any prompt (i.e. giving you Latino Zulus) seems easier than what it is actually doing.

‘First you have to determine whether a prompt would normally be answered mostly with white people, and only then force it to diversify through some algorithm.’

Another user specifically asked Google Gemini for an image of a Caucasian Pope and claimed that the resulting image came “with a free lecture.”

Alongside a photo of Pope Benedict, the tool wrote: “While it is not appropriate to assume that all Popes are Caucasian, many have been of European descent.”

In response, several users have called Google Gemini “racist” and “woke.”

‘So I went to see if Google’s AI called Gemini really discriminates and is racist towards whites… And yes, it is,’ one user tweeted.

‘I’ve tried over a hundred different prompts (literally) and Gemini doesn’t recognize that white people exist. This is fucking scary and racist.’

Another wrote: “I spent 10 minutes playing with Google Gemini, it’s a woke joke.”

And one said: “It’s embarrassingly difficult to get Google Gemini to recognize that white people exist.”

MailOnline has contacted Google for comment.
