Instagram, owned by Meta, is taking steps to shield users from unsolicited nude photos. The app will now automatically blur such images in direct messages (DMs), using AI to detect nudity and blur the image before the recipient sees it.
This feature aims to combat “intimate image abuse” and prevent harmful situations like sextortion scams.
For users aged 13 to 17, nudity protection will be enabled by default; adult users will see a prompt encouraging them to turn it on. When someone receives a nude photo, it will initially be hidden, and the recipient can choose whether to view it. Recipients can also block the sender and report the chat.
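To make that flow concrete, here is a minimal sketch of the decision logic as described above, written in Python. The class names, fields, and age check are illustrative assumptions for this article, not Meta's actual code or API.

```python
# Hypothetical sketch of the nudity-protection flow described above; not Meta's
# actual implementation. The detector result, field names, and thresholds are assumptions.
from dataclasses import dataclass
from typing import Optional


@dataclass
class IncomingImage:
    sender_id: str
    contains_nudity: bool           # result of an on-device classifier (assumed)
    hidden: bool = True             # blurred/hidden until the recipient opts in


@dataclass
class Recipient:
    age: int
    protection_enabled: Optional[bool] = None  # None = user hasn't chosen yet

    def nudity_protection_active(self) -> bool:
        # Enabled by default for ages 13-17; adults are prompted to opt in.
        if 13 <= self.age <= 17:
            return True if self.protection_enabled is None else self.protection_enabled
        return bool(self.protection_enabled)


def deliver(image: IncomingImage, recipient: Recipient) -> str:
    """Decide how a DM image is presented, following the flow described in the article."""
    if image.contains_nudity and recipient.nudity_protection_active():
        image.hidden = True
        # The recipient may later choose to view it, block the sender, or report the chat.
        return "blurred: tap to view, block sender, or report chat"
    image.hidden = False
    return "shown normally"


# Example: a 15-year-old receives a flagged image; protection is on by default.
teen = Recipient(age=15)
print(deliver(IncomingImage(sender_id="u123", contains_nudity=True), teen))
```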
Additionally, Instagram will discourage forwarding nude images and provide safety tips for both recipients and senders. However, some critics argue that these measures do not go far enough to protect children from online harm, especially given the platform's end-to-end encryption.