Meta revoked a job offer to a prominent cyber intelligence analyst immediately after he criticized Instagram for failing to protect children online.
Paul Raffile had been offered a job as a human exploitation investigator focusing on issues such as sextortion and human trafficking. On April 24, he participated in a webinar on protection against financial sextortion schemes, during which he criticized Instagram for allowing children to fall victim to scammers and offered possible solutions.
“The only reason I can think of for the offer to be rescinded is that I’m trying to shed light on this big issue of these crimes happening on Instagram, and that Instagram has done little to prevent it so far,” Paul Raffile said.
Raffile was a co-organizer of the webinar, which featured the parents of four children who had died after being scammed on Instagram. Among the 350 attendees were employees from Meta, the National Center for Missing and Exploited Children (NCMEC), law enforcement agencies, the United Nations Office on Drugs and Crime, Visa, Google and Snap.
Raffile told The Guardian that his contribution to the webinar was limited to brief introductory remarks that took less than a minute to deliver.
Raffile was due to start his new $175,000-a-year position the following Monday, but received the call rescinding the offer just hours after the webinar concluded. Meta’s hiring manager did not give a reason for the rescission, saying only that the directive came from “many salary levels above us,” Raffile said.
Meta declined to comment, calling the situation an “individual personnel matter.”
Raffile said: “This shows that Meta is not willing to take this issue seriously. I have raised legitimate concerns and recommendations, and they are potentially unwilling to be aggressive enough to address this issue.”
Financial sextortion schemes have skyrocketed over the past two years, with more than 26,700 cases of underage victims reported to NCMEC in 2023 alone. According to the FBI, sextortion is the fastest growing cybercrime in the United States.
The victims are mainly teenage boys, who are approached by scammers posing as attractive girls. After coaxing a victim into sending sexually explicit images of himself, the scammer threatens to distribute the photos to his friends and family unless he pays a ransom.
A significant share of these cases stems from cybercriminals in Nigeria targeting teenagers abroad. The scammers refer to themselves as ‘Yahoo Boys’ and typically operate on Instagram and Snapchat. The crime can be deadly: minors are often overwhelmed by the scammers’ threats, and financial sextortion led to at least 20 teen suicides between October 2021 and March 2023, the FBI has said.
Meta said in a statement that it has strict rules against non-consensual sharing of intimate images.
Raffile questioned why Meta and other social media companies have failed to take effective measures against financial sextortion.
“I had faced Yahoo Boys at previous employers, which were financial institutions and technology companies,” he said.
Previously he held positions at the consulting firms Booz Allen Hamilton and Teneo.
He said: “We were able to eradicate them from our platforms in four to six months. However, social media platforms have had two years to deal with this.”
A Meta spokesperson said its expert teams are aware that sextortion actors are disproportionately based in several countries, including West Africa.
Raffile said Instagram’s design features help facilitate these cybercrimes, including its plans to encrypt direct messages, which offer greater privacy but can hinder investigations. Another major problem is that users cannot keep their follower and following lists private, meaning a blackmailer can see a victim’s friends and family, he said.
“They message the victim and say, ‘Hey, I have your nudes and I took screenshots of all your friends and family, your followers.’ Meta is not taking teen privacy seriously enough,” Raffile said.
Raffile also criticized Meta’s April announcement that it would blur images containing detected nudity by default for users under 18, noting that teens can still choose to view them.
“It seems illegal to allow minors to view these images on your platform,” he said. “Why not just block them?”
Meta said in a statement: “This feature aims to strike a balance between protecting people from viewing nude images and educating them about the risks of sharing them, without impeding or interrupting people’s important conversations.”