
Google’s non-consensual explicit image problem is getting worse


In early 2022, two Google policy staff members met with a trio of women who were victims of a scam that resulted in explicit videos of them circulating online, including through Google search results. The women were among hundreds of young adults who responded to ads seeking swimsuit models and were coerced into performing in sexual videos distributed by the website GirlsDoPorn. The site closed in 2020 after a producer, an accountant, and a cameraman pleaded guilty to sex trafficking, but the videos kept appearing in Google searches faster than the women could request their removal.

The women, joined by a lawyer and a security expert, presented a host of ideas for how Google could better keep criminal and degrading clips hidden, according to five people who attended or were briefed on the virtual meeting. They wanted Google search to ban websites devoted to GirlsDoPorn and videos bearing its watermark. They suggested that Google could borrow the 25-terabyte hard drive on which the women’s cybersecurity consultant, Charles DeBarber, had saved all the GirlsDoPorn episodes, take a mathematical fingerprint, or “hash,” of each clip, and block the clips from reappearing in search results.
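The “hash and block” idea is conceptually simple. A minimal sketch in Python using an exact cryptographic hash is shown below; note that production systems of this kind (StopNCII among them) rely on perceptual hashes that survive re-encoding and cropping, whereas an exact hash misses any modified copy. The function names and sample bytes here are illustrative, not any real system’s API:

```python
import hashlib
import io

def fingerprint(stream, chunk_size=1 << 20):
    """Return a SHA-256 hex digest of a file-like object, read in chunks."""
    h = hashlib.sha256()
    while chunk := stream.read(chunk_size):
        h.update(chunk)
    return h.hexdigest()

# Hypothetical blocklist built from known abusive clips.
blocklist = {fingerprint(io.BytesIO(b"known abusive clip bytes"))}

def should_block(stream):
    """True if the content's fingerprint matches a blocklisted clip."""
    return fingerprint(stream) in blocklist
```

An exact match like this only catches byte-identical re-uploads, which is why the industry tools mentioned later in this article exchange perceptual hashes instead.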

The two Google employees who attended the meeting hoped to use what they learned to get more resources from higher-ups. But the victims’ lawyer, Brian Holm, left with doubts. The policy team was in “a difficult situation” and “did not have the authority to make changes within Google,” he said.

His instinctive reaction was the right one. Two years later, none of the ideas raised at the meeting have been put into practice and the videos continue to appear in searches.

WIRED has spoken to five former Google employees and 10 victim advocates who have been in contact with the company. All of them say they appreciate that, because of recent changes Google has made, survivors of image-based sexual abuse, such as the GirlsDoPorn scam, can more easily and successfully request the removal of unwanted search results. But they are frustrated that the search giant’s management has not approved proposals, such as the hard drive idea, that they believe would more fully restore and preserve the privacy of millions of victims around the world, most of them women.

The sources describe previously unreported internal deliberations, including Google’s justification for not using an industry tool called StopNCII that shares information about non-consensual intimate images (NCII) and the company’s failure to require pornographic websites to verify consent to qualify for search traffic. Google’s own research team has published measures that technology companies can take against NCII, including the use of StopNCII.

Sources believe such efforts would help contain a growing problem, fueled in part by expanded access to AI tools that create explicit deepfakes, including deepfakes of the GirlsDoPorn survivors. Overall reports to the UK’s revenge porn hotline more than doubled last year, to around 19,000, as did the number of cases involving synthetic content. Half of the more than 2,000 Britons in a recent survey worried about becoming a victim of deepfakes. In May, the White House urged lawmakers and industry to act more quickly to curb NCII in general. In June, Google joined seven other companies and nine organizations in announcing a working group to coordinate responses.

Right now, victims can demand that abusers be prosecuted or file lawsuits against the websites hosting the content, but neither of those avenues is guaranteed and both can be costly due to legal fees. Having Google remove results may be the most practical tactic and serves the ultimate goal of keeping the infringing content out of the sight of friends, hiring managers, potential landlords, or dates, who are likely to turn to Google to search for people.

A Google spokeswoman, who requested anonymity to avoid harassment from perpetrators, declined to comment on the call with GirlsDoPorn victims. She says combating what the company calls non-consensual explicit imagery (NCEI) remains a priority and that Google’s actions go far beyond what the law requires. “Over the years, we’ve invested heavily in industry-leading policies and protections to help protect people affected by this harmful content,” she says. “Google teams continue to work diligently to strengthen our safeguards and thoughtfully address emerging challenges to better protect people.”
