On Tuesday, Meta revealed that it will be enhancing its efforts to ensure the safety of young users by introducing new settings for teenage users of Facebook and Instagram. These settings will include restrictions on certain types of content and the hiding of search results for terms associated with self-harm and suicide. Meta, the parent company of both social media platforms, stated that these new policies will complement the current range of over 30 well-being and parental control tools designed to safeguard young users.
The announcement on Tuesday follows months of increased scrutiny on Meta's potential impact on teenage users. In November, former Facebook employee and whistleblower Arturo Bejar testified before a Senate subcommittee, revealing that Meta's top executives, including CEO Mark Zuckerberg, had ignored warnings for years about the harm to teens on platforms like Instagram. Bejar specifically raised concerns about the sexual harassment of teens by strangers on Instagram.
(Image: Meta headquarters, pictured on February 2, 2023, in Menlo Park, California. Justin Sullivan/Getty Images)
In the same month, court filings in a lawsuit against the company were unsealed, revealing internal company documents indicating that Zuckerberg repeatedly blocked initiatives aimed at promoting the well-being of teenagers.
Further court documents unsealed in a separate lawsuit weeks later alleged that Meta knowingly failed to deactivate the majority of accounts belonging to children under 13 years old, and collected their personal information without parental consent.
New Mexico's Attorney General filed a lawsuit against Meta in December, alleging that the company has created an environment that fosters child predators. This legal action follows the release of internal documents by a whistleblower, Frances Haugen, two years ago, which brought to light concerns about Meta's approach to protecting the safety of young users. The documents, referred to as the "Facebook Papers," sparked backlash from both lawmakers and the public, leading Meta and other social platforms to enhance their safety measures for teenage users.
In a blog post released on Tuesday, Meta declared its intention to create safe, age-appropriate experiences for teens on its apps. The company specified that it will begin concealing "age-inappropriate content," including discussions of self-harm and eating disorders, nudity, and restricted goods, from teenagers' feeds and stories, even when the content is shared by someone they follow.
The company also announced that it will now apply its most restrictive content recommendation settings to all teens using Facebook and Instagram, making it harder for them to encounter potentially sensitive content in Search or Explore by default. That policy previously applied only to new teen users. These changes will be rolled out for users under the age of 18 in the near future.
In addition to its current efforts, the company is broadening the list of search terms associated with self-harm, suicide, and eating disorders that it filters from search results, instead directing users to support resources. The expanded list, which includes terms like "self-harm thoughts" and "bulimic," will roll out in the coming weeks, according to Meta.
Meta announced its commitment to continuing to provide resources from organizations like the National Alliance on Mental Illness for users who post content related to self-harm or eating disorders. In addition, the platform will prompt teenage users to review their safety and privacy settings.
Users will have the option to easily enable recommended settings with just one tap. These settings will automatically adjust their preferences to limit who can repost their content, "remix" and share their reels, tag, mention, or message them. Additionally, Meta stated that the settings will assist in concealing offensive comments.
The updated privacy settings address concerns raised by whistleblower Bejar and New Mexico Attorney General Raúl Torrez about adult strangers messaging and propositioning young users on Facebook and Instagram. Tuesday's changes add to Meta's existing safety and parental control features, which allow parents to monitor their kids' app usage, receive notifications about long scrolling sessions, and be alerted if their teen reports another user.