Meta Platforms’ Instagram unit detailed plans to trial picture-blurring technology for images containing nudity, part of a package of tools designed to boost protection for younger users.
The social media platform stated nudity protection would become a default setting for users aged under 18 years, with older users encouraged to turn the tool on as well.
Instagram stated the machine learning-based tool would shield users from viewing unwanted nude pictures, part of efforts to tackle so-called sextortion, in which scammers attempt to extract explicit pictures or money from users.
The system appears centred on direct messages, with Instagram explaining warnings would be displayed to senders, recipients and anyone considering forwarding a picture.
Along with warnings, Instagram intends to connect users to various online support groups.
Picture analysis will be handled “on the device itself”, meaning the protection tool will “work in end-to-end encrypted chats” which the social media company cannot access unless a user reports the content.
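Instagram has not published how the feature is built, but the on-device approach it describes can be illustrated with a minimal sketch: the encrypted image is decrypted only on the handset, a local classifier scores it, and the client alone decides whether to blur it. The names below (classify_nudity, present_image, IncomingImage) and the 0.8 threshold are illustrative assumptions, not the company’s actual API or model.

```python
# Conceptual sketch of client-side nudity screening in an end-to-end
# encrypted chat. All identifiers and thresholds are hypothetical.
from dataclasses import dataclass


@dataclass
class IncomingImage:
    sender: str
    pixels: bytes  # decrypted image data, available only on the device


def classify_nudity(pixels: bytes) -> float:
    """Stand-in for an on-device ML model returning a nudity probability.

    A real deployment would run a local image classifier so the plaintext
    never leaves the handset, which is what allows the feature to work
    inside end-to-end encrypted chats.
    """
    # Placeholder heuristic purely so the sketch runs end to end.
    return 0.9 if pixels.startswith(b"NSFW") else 0.1


def present_image(image: IncomingImage, threshold: float = 0.8) -> str:
    """Decide client-side whether to show the image or a blurred warning."""
    score = classify_nudity(image.pixels)
    if score >= threshold:
        # The server never sees the image or the verdict; the local client
        # blurs it and can offer safety resources or a reporting option.
        return f"Blurred image from {image.sender}: tap to view, report or get support"
    return f"Image from {image.sender} displayed normally"


if __name__ == "__main__":
    print(present_image(IncomingImage("alice", b"NSFW-example-bytes")))
    print(present_image(IncomingImage("bob", b"ordinary-photo-bytes")))
```

The key design point the sketch captures is that detection and blurring happen entirely on the recipient’s device, so the platform can offer the protection without breaking the encryption of the message itself.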
The company stated it takes “severe action when we become aware of people engaging in sextortion”, removing their accounts and reporting them to relevant police authorities and online support groups.
Another tool it plans to test will prevent potential scammers from “finding and interacting with teen accounts”, a move Instagram explained builds on existing features preventing unknown adults from directly messaging younger users.
Instagram added it is “also testing hiding teens from these accounts” in people’s lists of connections “and making it harder for them to find teen” users through the search function.
Instagram peppered its announcement with supportive comments from various child protection agencies, but the company and its parent have faced criticism and legal action over a perceived lack of protection for younger users.