
Instagram will blur nudity in its messages to protect teenagers

by Elijah

Instagram says it is rolling out new tools to protect young people and combat sexual extortion, including a feature that will automatically blur nudity in direct messages.

The social media company said in a blog post on Thursday that it was testing the features as part of its campaign to combat sextortion scams and other forms of “image abuse,” and to make it more difficult for criminals to contact teenagers.

Sexual extortion, or sextortion, involves persuading a person to send explicit photographs online and then threatening to make the images public unless the victim pays money or performs sexual favors. Recent high-profile cases include two Nigerian brothers who pleaded guilty to sexually extorting teenagers and young men in Michigan, including one who took his own life, and a Virginia sheriff’s deputy who sexually extorted and kidnapped a 15-year-old girl.

Instagram said scammers often use direct messages to request “intimate images.” To counter this, it will soon begin testing a nudity protection feature for direct messages that blurs any images with nudity “and encourages people to think twice before sending nude images.”

“The feature is designed to not only protect people from seeing unwanted nudity in their direct messages, but also to protect them from scammers who may send nude images to trick people into sending their own images in return,” Instagram said.

The feature will be turned on by default worldwide for users under 18. Adult users will receive a notification encouraging them to activate it. Images containing nudity will be blurred behind a warning, giving users the option to view them. They will also be able to block the sender and report the chat.

People who send direct messages containing nudity will see a message reminding them to be careful when sending “sensitive photos.” They will also be told that they can unsend the photos if they change their mind, but that others may have already seen them. Meta also owns Facebook and WhatsApp, but the nude-blurring feature will not be added to messages sent on those platforms.

Instagram and other social media companies have faced growing criticism for not doing enough to protect young people. Mark Zuckerberg, CEO of Meta Platforms, which owns Instagram, apologized to parents of victims of such abuse during a Senate hearing earlier this year. New Mexico’s attorney general has sued Meta, alleging that its social media sites are the world’s “largest market for pedophiles.” The lawsuit follows a two-year investigation by The Guardian into Meta’s fight to curb child sex trafficking.

Instagram said it was working on technology to help identify accounts that could be involved in sexual extortion scams, “based on a variety of signals that could indicate sextortion behavior.”

To prevent criminals from connecting with young people, it is also taking steps such as not showing the “message” button on a teen’s profile to potential sextortion accounts, even if they already follow each other, and testing new ways to hide teens from these accounts.
