Instagram will blur messages with nudity for children – 04/11/2024 – Tech


Instagram will test features that blur messages containing nudity to protect teenagers and prevent potential scammers from reaching them, Meta, the app’s parent company, announced this Thursday (11).

The company faces accusations in the United States and Europe that it makes its applications addictive and thereby contributes to mental health problems among young people.

Meta said the protection feature for Instagram’s direct messages will use machine learning to analyze whether sent images contain nudity.

The feature will be turned on by default for users under 18, and the company will notify adults to encourage them to turn it on.

“Because images are analyzed on the device itself, nudity protection will also work in end-to-end encrypted chats, where Meta will not have access to these images unless someone decides to report them to us,” the company said.
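Meta has not published how its classifier works. Purely as illustration, the sketch below shows the general idea of on-device screening: a local model scores an incoming image, and anything above a confidence threshold is shown blurred instead of in the clear. The `nudity_score` stub, the threshold value, and the function names are all hypothetical, not Meta's actual implementation.

```python
from PIL import Image, ImageFilter

NUDITY_THRESHOLD = 0.8  # hypothetical confidence cutoff, chosen for illustration


def nudity_score(image: Image.Image) -> float:
    """Placeholder for an on-device classifier.

    A real implementation would run a local ML model and return the
    probability that the image contains nudity. This stub always
    returns 0.0 and exists only to make the example self-contained.
    """
    return 0.0


def prepare_incoming_image(path: str) -> Image.Image:
    """Blur an incoming image locally if the classifier flags nudity.

    Because the analysis happens on the recipient's device, the image
    never has to be decrypted or inspected by a server, which is how
    the feature can still work inside end-to-end encrypted chats.
    """
    image = Image.open(path)
    if nudity_score(image) >= NUDITY_THRESHOLD:
        # Show a heavily blurred preview; the user can still choose to reveal it.
        return image.filter(ImageFilter.GaussianBlur(radius=40))
    return image
```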

Unlike Messenger and WhatsApp, direct messages on Instagram are not end-to-end encrypted, but the company said it plans to bring encryption to the service.

Meta also said it is developing technology to help identify accounts that may potentially be involved in extortion scams and that it is testing new messaging for users who may have interacted with such accounts.

Prosecutors in 33 U.S. states sued the company in October, saying it repeatedly misled the public about the risks on its platforms.

In Europe, the European Commission has been seeking information on how Meta protects children from illegal and harmful content.
