Instagram is developing a tool to protect users from receiving unsolicited nude photos in their DMs

Instagram is developing a tool that can block unsolicited nude photos sent in a direct message (DM), a spokesperson for its parent company Meta has confirmed.

Known as ‘Nudity Protection’, the feature will reportedly work by detecting a nude image and covering it before giving the user the option to open it or not.

More details should be released in the coming weeks, but Instagram claims it won’t be able to see the actual photos or share them with third parties.

That spokesperson, Liz Fernandez, Meta’s head of product communications, said it will help users ‘protect themselves from nudity as well as other unwanted messages’.

She told The Verge: ‘We are working closely with experts to ensure these new features preserve people’s privacy while giving them control over the messages they receive.’

Known as ‘Nudity Protection’, the feature will reportedly work by detecting a nude image and then filtering that message from the inbox

It is still in the early stages of development, but will hopefully help reduce ‘cyber-flashing’ incidents. Cyber-flashing is when a person is sent an unsolicited sexual image on their mobile device by an unknown person nearby (stock image)

News of the feature was first revealed on Twitter by leaker and mobile developer Alessandro Paluzzi.

He said that ‘Instagram is working on nudity protection for chats’ and posted a screenshot of what users can see when they open the feature.

It said: ‘Discover and cover nudity safely. The technology on your device covers images that may contain nudity in chats. Instagram cannot access the photos.

‘Choose to see pictures or not. Images remain covered unless you choose to view them.

‘Get safety tips. Learn ways to stay safe if you interact with sensitive images.

‘Turn on or off at any time. Update in your settings.’

Liz Fernandez, Meta’s head of product communications, said the tool will help users ‘protect themselves from nudity as well as other unwanted messages’

Ms Fernandez compared the feature to the ‘Hidden Words’ feature on Instagram that was introduced last year.

This allows users to automatically filter messages that contain words, phrases and emojis they don’t want to see.

She also confirmed that Nudity Protection will be an optional feature that users can turn on and off as they please.

It is still in the early stages of development, but will hopefully help reduce ‘cyber-flashing’ incidents.

Cyber-flashing is when a person is sent an unsolicited sexual image on their mobile device by an unknown person nearby.

This can happen through social media, messaging or other sharing features such as AirDrop or Bluetooth.

In March, it was announced by British ministers that men who send unsolicited ‘d**k pictures’ will soon face up to two years in prison (stock image)

HOW DOES THE ‘NUDITY PROTECTION’ TOOL WORK?

The new ‘Nudity Protection’ tool will reportedly work by detecting images sent to the user in a chat that may contain nudity.

It will automatically cover the image and the user can choose whether to see it or not when they open the message.

Instagram will not be able to access the photos and the user can turn the feature on or off at any time.
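
The article does not reveal how Instagram has built the feature, but the flow described above (on-device detection, a covered preview and an explicit choice to reveal) can be illustrated with a minimal Python sketch. Everything below is an assumption for illustration only: looks_like_nudity stands in for whatever classifier would actually run on the device, and a Gaussian blur stands in for the ‘cover’.

from dataclasses import dataclass
from PIL import Image, ImageFilter


def looks_like_nudity(image: Image.Image) -> bool:
    # Placeholder for a hypothetical on-device classifier; not a real Instagram API.
    raise NotImplementedError("plug in an on-device nudity classifier here")


@dataclass
class ChatImage:
    original: Image.Image
    covered: bool = False

    def preview(self) -> Image.Image:
        # While covered, the chat only shows a heavily blurred version.
        if self.covered:
            return self.original.filter(ImageFilter.GaussianBlur(radius=40))
        return self.original

    def reveal(self) -> Image.Image:
        # The recipient explicitly chooses to view the image.
        self.covered = False
        return self.original


def receive_image(image: Image.Image, nudity_protection_on: bool) -> ChatImage:
    # The check runs locally, so the photo never needs to leave the device.
    flag = nudity_protection_on and looks_like_nudity(image)
    return ChatImage(original=image, covered=flag)

Because the classification happens on the device itself, the sketch mirrors the claim that the platform never accesses the photo, and the covered flag models the opt-in reveal and the ability to switch the feature off.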

In March, it was announced that men who send unsolicited ‘d**k pictures’ will soon face up to two years in prison.

Ministers confirmed that laws banning this behavior will be included in the government’s Online Safety Bill, due to be passed in early 2023.

The move will apply to England and Wales, as cyber-flashing has been illegal in Scotland since 2010.

It came after a study by the UCL Institute of Education found that the practice of sharing images without consent was ‘particularly pervasive and consequently normalized and accepted’.

Researchers surveyed 144 boys and girls aged 12 to 18 in focus groups and another 336 in a survey about digital photo sharing.

Some 37 percent of the 122 girls surveyed had received an unwanted sexual image or video online.

A shocking 75 percent of the girls in the focus groups had also been sent an explicit image of male genitalia, with the majority of these ‘unsolicited’.

Snapchat was the most common platform used for image-based sexual harassment, according to the study’s findings.

But reporting on Snapchat was deemed ‘useless’ by young people because the photos are automatically deleted.

Furthermore, research by YouGov found that four in ten millennial women have been sent a picture of a man’s genitalia without consent.

Men who send unsolicited ‘d*** pics’ may be NARCISSISTS and usually expect to receive ‘something in return’

Men who send other people unsolicited pictures of their genitalia are likely to be more narcissistic and sexist than those who don’t, psychologists have found.

Researchers at Pennsylvania State University surveyed over a thousand men to compare the personalities and motivations of those who sent intimate photos and those who did not.

Rather than for personal gratification, men who share pictures of their genitalia typically do so in hopes of arousing the recipient and getting pictures back in return.

A small minority of participants reported that they sent the private photos to deliberately provoke a negative reaction from women.

The researchers conclude that the practice can neither be perceived as exclusively sexist nor as a positive sexual outlet.

