
Meta rolls out new content policies for teenagers

Meta will now automatically place all teenager accounts into the most restrictive content control settings on Instagram and Facebook

When people search for terms related to suicide, self-harm and eating disorders, Meta will hide the related results. (REUTERS)

Meta has announced new content policies for teenagers on Facebook and Instagram, following rising pressure from regulators around the world who claim that content on its apps harms the mental well-being of children.

On Tuesday, in a blog post, the Mark Zuckerberg-led company said it wants teens to “have safe, age-appropriate experiences” on its apps. The company will regularly consult with experts in adolescent development, psychology and mental health on how to make its social media platforms safe for young people, including improving its understanding of which types of content may be less appropriate for teens, the blog post said.

Also read: How to ensure safe online experiences for kids?

This move comes after a slew of lawsuits were filed against Meta. According to a Reuters report, the company is under pressure in the US and Europe over claims that its apps are addictive and have helped fuel a youth mental health crisis. In Europe, the European Commission has asked for information on how Meta protects children from illegal and harmful content.

Meta is now placing teenagers’ accounts in the most restrictive content settings on both platforms and will prevent them from searching for terms that might be harmful. For instance, if someone has posted about their ongoing struggle with thoughts of self-harm, Meta will remove this type of content from teens’ experiences on Instagram and Facebook. The same applies to other types of age-inappropriate content, the post explains.

When people search for terms related to suicide, self-harm and eating disorders, Meta will hide the related results and instead direct them to expert resources for help. The company already hides results for suicide and self-harm search terms that break its rules, and is now extending this to cover more terms. The update will roll out for everyone over the coming weeks, Meta said in the post.

Teens will also have the option to “Turn on recommended settings”. This will automatically change their settings to restrict who can repost their content, tag or mention them, or include their content in Reels remixes. These settings will also ensure that only their followers can message them and help hide offensive comments.

“Meta’s new policies to hide content that might be less age-appropriate will give parents more peace of mind,” Vicki Shotbolt, CEO of ParentZone.org, said in the blog post.

Also read: The battle to build a child-friendly metaverse
