If you were asked to imagine an Australian Olympian, you might think of swimmer Emma McKeon, cyclist Grace Brown or equestrian Chris Burton.
But if you ask the same question of an artificial intelligence image generator, the answer is very different.
Amid Olympic excitement, researchers at Edith Cowan University asked the AI-powered image generation platform Midjourney to create images of 40 nations’ Olympic teams.
Interestingly, the AI tool depicted the Australian team with kangaroo bodies and koala heads, while the Greek team was shown wearing ancient armor.
Researchers asked Midjourney to generate images representing Olympic teams from 40 countries, including Australia, Ireland, Greece and India.
The resulting images highlight several biases built into the AI’s training data, spanning gender, sporting events, culture and religion.
Men were five times more likely to appear in the images than women, while several teams, including Ukraine and Turkey, featured only men.
Of the total athletes depicted in the 40 images, 82 percent were men, while only 17 percent were women.
The researchers also discovered a notable event bias.
The Canadian team was represented by hockey players, Argentina by footballers and the Netherlands by cyclists.
According to the team, this indicates that AI tends to stereotype countries based on their most internationally recognized sports.
In terms of cultural bias, the Australian team was oddly depicted with kangaroo bodies and koala heads.
Meanwhile, the Nigerian team was shown in traditional attire and the Japanese team was dressed in kimonos.
A religious bias was evident in the Indian team, as all its members were depicted wearing a bindi, a religious symbol associated primarily with Hinduism.
“This representation homogenized the team based on a single religious practice, overlooking the religious diversity within India,” the researchers explained.
The Greek Olympic team was bizarrely depicted wearing ancient armor, and the Egyptian team was shown wearing what looks like a pharaoh’s costume.
The emotions shown on the athletes’ faces also varied greatly between teams.
The South Korean and Chinese teams wore serious expressions, while the Irish and New Zealand teams were smiling.
“Biases in AI are driven by human biases informing the AI algorithm, which the AI takes literally and cognitively,” said Dr Kelly Choong, a senior lecturer at Edith Cowan University.
‘In AI, human judgments and biases are constructed and presented as facts, and the lack of critical thinking and evaluation means that the validity of information is not questioned, only the goal of completing a task.’
Dr Choong said these biases can quickly lead to equity issues, harmful generalizations and discrimination.
“As society increasingly relies on technology for information and answers, these perceptions can end up creating real disadvantages for people of diverse identities,” she added.
‘A country’s association with certain sports can create the perception that everyone in that country is proficient at them – for example, Kenya’s association with athletics, Argentina’s with football, Canada’s with ice hockey.
‘These distorted “realities” can also become entrenched in individuals who believe these stereotypes and unwittingly reinforce them in real life.’
Researchers hope the images will highlight the need for developers to improve their algorithms to reduce such biases.
“Technology will find a way to improve its algorithm and its results, but it will still be focused on completing a task, rather than providing a truthful representation,” Dr Choong said.
‘Society will need to question the validity of, and critically evaluate, information generated by AI.
‘Educating users will be essential for the coexistence of AI and information, as will the ability to question its results.’