Meta Platforms-owned Instagram has introduced safety features for the profiles of teenagers under the age of 16, including stricter content moderation and tighter controls over who can see and contact them.
In a blog post, Instagram explained that users under the age of 16 will be automatically moved onto “Teen Accounts”, which feature built-in protections limiting who can message them and the type of content they see.
The measures will only allow “age-appropriate” content, and Teen Accounts will be set to private by default.
Changing Teen Account settings will require a parent’s permission, and Instagram added that a “parental supervision” feature can be activated for greater oversight of how teenagers use the content-sharing platform. The feature also allows parents to manage the account directly.
Instagram added that messaging restrictions and limited interaction settings will mean Teen Accounts can only receive messages or “mentions” from people the teenagers know.
“This new experience is designed to better support parents, and give them peace of mind that their teens are safe with the right protections in place,” the blog post read.
The system also comes with a new tool that lets teenagers select topics they want to see more of in their feeds, allowing them “to focus on the fun, positive content”. It also offers time-limit and night-mode features.
Instagram will begin identifying accounts owned by users under 16 today (17 September) and will move them onto Teen Accounts within the next 60 days.
The safety features will be rolled out in the US, UK, Canada, Australia and the European Union.
The update comes nearly two weeks after Australia’s online safety commissioner asked eight social media companies including Instagram, TikTok and Snap to report how many children are using their platforms and the ways they enforce age limits.