
ChatGPT ‘racially discriminates’ against job applicants by filtering out ‘black names’ in hiring searches

by Elijah
ChatGPT ‘racially discriminates’ against job seekers by favoring different names from different racial groups for different jobs, a Bloomberg News investigation has found.

Developer OpenAI sells the technology behind its AI-powered chatbot to companies that want to use it to help with human resources and recruiting.

Because ChatGPT relies on large amounts of data, such as books, articles, and social media posts, its results may reflect biases that already exist in that data.

Bloomberg selected real names from census data that are associated with particular races and ethnicities at least 90 percent of the time and attached them to equally qualified resumes.

The resumes were then submitted to ChatGPT 3.5, the most widely used version of the chatbot, which ranked candidates of different races differently depending on the job it was asked to rate their suitability for.


The experiments show “that using generative AI for recruiting and hiring poses a serious risk of automated discrimination at scale,” Bloomberg concluded.

When asked 1,000 times to rank eight equally qualified resumes for a real financial analyst position at a Fortune 500 company, ChatGPT was least likely to pick the resume bearing a name distinct to Black Americans as the top candidate.
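To see why repeated ranking trials expose bias, it helps to picture the baseline: with eight resumes and an unbiased ranker, each name group should land in the top spot roughly 1/8 (12.5%) of the time. The sketch below simulates that null case; the group labels and counting logic are illustrative assumptions, not Bloomberg's actual code.

```python
import random
from collections import Counter

# Eight equally qualified resumes, each tagged with a name distinct to a
# demographic group (labels are hypothetical placeholders).
GROUPS = [
    "Asian woman", "Asian man", "Black woman", "Black man",
    "Hispanic woman", "Hispanic man", "White woman", "White man",
]

def simulate_unbiased_trials(n_trials=1000, seed=0):
    """Shuffle the eight resumes n_trials times and count who ranks first."""
    rng = random.Random(seed)
    top_counts = Counter()
    for _ in range(n_trials):
        order = GROUPS[:]
        rng.shuffle(order)          # an unbiased "ranking"
        top_counts[order[0]] += 1   # record the top-ranked group
    return top_counts

counts = simulate_unbiased_trials()
for group, n in sorted(counts.items()):
    print(f"{group}: {n / 10:.1f}%")  # each should hover near 12.5%
```

A real model's rankings that persistently deviate from this near-uniform baseline, as Bloomberg reported, indicate a systematic preference rather than noise.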

The bot ranked resumes with names distinct to Asian women as the top candidate for the financial analyst position more than twice as often as those with names distinct to Black men.

The same experiment was conducted for four different job openings: human resources business partner, senior software engineer, retail manager, and financial analyst.

The analysis found that ChatGPT’s racial and gender preferences differed depending on the particular job for which a candidate was being evaluated.

According to Bloomberg, Black Americans were the least likely to be ranked as top candidates for financial analyst and software engineer positions.

The bot rarely ranked names associated with men as top candidates for historically female-dominated roles, such as retail and human resources.

Resumes with names distinct to Hispanic women were almost twice as likely to be ranked as top candidates for the human resources position as those with names distinct to men.

In response to the findings, OpenAI told Bloomberg that the results produced by using “out-of-the-box” GPT models may not reflect the results produced by customers of its product, who may adjust the software’s responses to their individual hiring needs.

Companies could, for example, remove names before including resumes in a GPT model, OpenAI explained.


OpenAI also periodically conducts adversarial testing and red teaming on its models to investigate how bad actors could use them to cause harm, the company added.

SeekOut, an HR technology company, has developed its own AI recruiting tool that takes a job description from a listing, runs it through GPT, and then displays a ranked list of candidates for the position drawn from sources like LinkedIn and GitHub.

Sam Shaddox, general counsel at SeekOut, told Bloomberg that hundreds of companies are already using the tool, including technology companies and Fortune 10 companies.

“From my perspective, saying, ‘Hey, there’s all this bias, but let’s ignore it,’ is not the right answer,” Shaddox said.

“The best solution is GPT, a large language model technology that can identify some of those biases, because then you can work to overcome them.”

Emily Bender, a professor of computational linguistics at the University of Washington, is more skeptical.

Bender argues that people tend to believe that machines are unbiased in their decision-making, particularly compared to humans, a phenomenon called automation bias.

However, if such systems “result in a pattern of discriminatory hiring decisions, it is easy to imagine companies using them saying, ‘Well, we didn’t have any bias here, we just did what the computer told us to do.'”
