Unsolicited nude photos are a massive problem on social media, but Instagram is reportedly working on a tool that could help. An early screengrab tweeted by researcher Alessandro Paluzzi indicates that “Nudity protection” technology “covers photos that may contain nudity in chat,” giving users the option to view them or not. Instagram parent Meta confirmed to The Verge that it’s in development. 

Meta said the aim is to help shield people from nude images or other unsolicited messages. As further protection, the company said it can’t view the images itself nor share them with third parties. “We’re working closely with experts to ensure these new features preserve people’s privacy, while giving them control over the messages they receive,” a spokesperson said. Meta plans to share more details in the coming weeks, ahead of any testing.

The new feature is akin to the “Hidden Words” tool launched last year, Meta added. That feature allows users to filter abusive messages in DM requests based on keywords. If a request contains any filter word you’ve chosen, it’s automatically placed in a hidden folder that you can choose to never open, though it’s not completely deleted.

Sending unwanted nude photos, also known as “cyberflashing,” has been targeted by multiple jurisdictions, including California and the UK. In the UK, it could become a criminal offense if the Online Safety Bill is passed by parliament. California didn’t go quite that far, but last month the state’s assembly and senate voted unanimously to allow users to sue over unsolicited nude photos and other sexually graphic material.