Instagram announced on April 11 that it is implementing new tools to protect young people and fight sexual extortion, including a feature that will automatically blur nudity in direct messages (DMs), the Associated Press (AP) reported.
The social media platform said in a blog post on Thursday that it was testing the features as part of its campaign to combat sexual extortion and other forms of "image abuse," and to make it harder for criminals to reach teenagers.
Sexual extortion involves persuading a person to send explicit photos online and then threatening to make the images public unless the victim pays money or performs sexual acts.
Recent high-profile cases include two Nigerian brothers who pleaded guilty to sexually extorting teenage boys and young men in the US state of Michigan, one of whom took his own life, and a Virginia sheriff's deputy who sexually extorted and kidnapped a 15-year-old girl.
Instagram and other social media companies are facing growing criticism for not doing enough to protect young people, the AP reports.
Mark Zuckerberg, CEO of Instagram owner Meta Platforms, apologized to the parents of victims of such abuse during a United States Senate hearing earlier this year.
Meta also owns Facebook and WhatsApp, but nudity blurring will not be added to messages sent on those platforms.
Instagram said scammers often use direct messages to solicit "intimate photos."
"This feature is designed not only to protect people from seeing unwanted nudity in their direct messages (DMs), but also to protect them from scammers," Instagram said.

This feature will be turned on for teens under 18. Adult users will receive a notification encouraging them to activate it.
Photos containing nudity will be blurred and accompanied by a warning giving users the option to view them. Users will also be able to block the sender and report the chat.
People sending direct messages containing nudity will receive a notification reminding them to be careful when sending "sensitive photos." They will also be told that they can unsend the photos if they change their mind, but that other users may have already seen them.
Instagram said it is working on technology that will help identify accounts that could potentially engage in sexual extortion scams, "based on a range of signals that could indicate sexual extortion."