Facebook and Instagram are to start hiding more types of content for teenagers as part of an effort to better protect younger users from harmful material online.
As part of the changes, teenage users will no longer see posts from others discussing their personal struggles with thoughts of self-harm or suicide – even if they follow the user in question.
Meta said it was placing all under-18s into the most restrictive content control settings on Instagram and Facebook, and was restricting additional terms in Search on Instagram.
This setting already applies to new users who join the site, but is now being expanded to all teenagers using the apps.
Meta said the settings make it more difficult for people to come across potentially sensitive content or accounts across the apps, including in the Explore section.
The new measures will be rolled out on the two platforms over the coming months.
On self-harm and suicide content on Instagram, Meta said it was “focused on ways to make it harder to find”, while also offering support to those who post about it.
“While we allow people to share content discussing their own struggles with suicide, self-harm and eating disorders, our policy is not to recommend this content and we have been focused on ways to make it harder to find,” the social media firm said in a blog post.
“Now, when people search for terms related to suicide, self-harm and eating disorders, we’ll start hiding these related results and will direct them to expert resources for help.
“We already hide results for suicide and self harm search terms that inherently break our rules and we’re extending this protection to include more terms. This update will roll out for everyone over the coming weeks.”
In addition, Meta said it would begin sending notifications to teenagers, reminding them to check and update their privacy settings.