Adobe Firefly is the latest AI tool to face public outcry – after it created images of black Nazis that look like Google Gemini's.
The images, generated by DailyMail.com on Tuesday, look eerily similar to Alphabet’s controversial creations.
To generate them, journalists gave basic instructions similar to those that got Gemini in hot water.
When asked to imagine Vikings, the tool made the Norsemen black, and in scenes showing the founding fathers, black men and women were cast in the roles.
The bot also created black soldiers who fought for Nazi Germany – just like Gemini did. Semafor conducted a similar experiment on Tuesday, with much the same results.
The prompts, like those given to Google’s Gemini, did not specify skin color, but nevertheless produced images that many would perceive as historically inaccurate.
In scenes depicting the American founding fathers, black men and women were cast in the roles.
In others, black families took the place of the nation's founders.
When asked to generate images of the 1787 Constitutional Convention, the tool produced images of black men and women sitting in the Philadelphia State House to address the young country's emerging issues.
In one photo generated by this request, a man of color is seen sitting in the assembly hall, wearing what appears to be a sombrero, as he writes legislation with Capitol Hill in the background.
Queries about Vikings – the first Europeans to reach the Americas more than a millennium ago – produced more of the same.
Tests conducted by Semafor hours earlier had almost identical results, showing the image-creation app's tendency to stumble on queries – even when they explicitly specified the subject's skin color.
For example, when Semafor asked the bot to create a cartoon rendering of an elderly white man, it obliged – but reportedly also included images of a black man and a black woman.
Semafor's tests on Tuesday put many of the same requests to the tool. As it did for DailyMail.com, the service dreamed up images of black SS soldiers.
DailyMail.com did not get identical results in every case, but given that the bot produces different output each time it receives the same prompt, such a discrepancy would not be out of the norm.
The prompts were carefully selected to repeat the commands that tripped up Gemini and saw Alphabet hit with accusations that it pandered to the mainstream left.
Both services are still young, however, and Alphabet has since owned up to the apparent oversights.
Late last month, CEO Sundar Pichai told employees that the company 'got it wrong' with its programming of the increasingly popular app, while Google co-founder Sergey Brin has acknowledged that the company 'messed up'.
Unlike Alphabet, Adobe has yet to face widespread criticism after launching the tool last June.
The inaccuracies illustrate some of the challenges the technology faces as AI gains traction.
Google, meanwhile, temporarily disabled Gemini’s image generation tool last month after users complained it generated “woke” but incorrect images such as female popes
The AI also suggested that black people had been among the German army during WWII
Gemini's image generation also created images of black Vikings
“We are already working to resolve recent issues with Gemini’s image generation feature,” Google said in a statement Thursday
Other historically inaccurate images included black Founding Fathers
The politically correct tech also referred to pedophilia as 'minor-attracted person status' and stated 'it is important to understand that attractions are not actions'.
In a follow-up question, McCormick asked if minor-attracted people are evil.
The bot appeared to find favor with abusers when it stated that “individuals cannot control who they are attracted to.”
Google has since released a statement expressing its frustration with the responses being generated.
‘The response reported here is appalling and inappropriate. We’re rolling out an update so Gemini no longer displays the answer,’ a Google spokesperson said.
Adobe has yet to release a statement on its own AI's oversights.