Meta is teaming up with Snapchat and TikTok as part of an initiative to prevent content featuring suicide or self-harm from spreading across the social media platforms, Meta said Thursday in a blog post.
The program, named Thrive, was created in conjunction with the Mental Health Coalition, a group of mental health organizations working to destigmatize such issues.
Through the Mental Health Coalition’s Thrive program, Meta, Snap and TikTok will be able to share signals about violating suicide or self-harm content with one another so they can investigate and act if the same or similar content has been posted on their respective apps. A spokesperson for Meta described Thrive as a database that all participating companies will have access to.
Meta said it is using technology it created and uses in conjunction with the Tech Coalition’s Lantern program — which aims to make technology safe for children and whose members include Amazon, Apple, Google, Discord and OpenAI — to ensure that data is shared in Thrive securely.
The spokesperson for Meta said that when content featuring suicide or self-harm is discovered, it will be removed and flagged in the Thrive database so other social media companies can act.
Meta’s blog post made it clear that the program is intended to target content — not users.
“We’re prioritizing this content because of its propensity to spread across different platforms quickly. These initial signals represent content only, and will not include identifiable information about any accounts or individuals,” Antigone Davis, Meta’s global head of safety, wrote in the post.
When content featuring suicide or self-harm is identified on a Meta platform, it will be assigned a number known as a “hash,” according to the spokesperson. The hash can be checked by the other social media companies so they can search for the same content and remove it if it exists on one of the participating platforms.
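The hash-matching flow described above can be sketched in a few lines of Python. This is a simplified illustration, not Meta's actual implementation: the function names and the in-memory "database" are hypothetical, and production signal-sharing systems typically use perceptual hashes (which also match visually similar media) rather than the exact SHA-256 digest used here.

```python
import hashlib

# Hypothetical shared database of hashes for content a participating
# platform has already removed. In practice this would be a secure,
# access-controlled service, not an in-memory set.
shared_hash_db: set[str] = set()

def content_hash(content: bytes) -> str:
    """Derive a stable fingerprint for a piece of content.

    SHA-256 is used here for illustration; it only matches byte-identical
    files. Real systems often use perceptual hashing instead.
    """
    return hashlib.sha256(content).hexdigest()

def flag_content(content: bytes) -> str:
    """A platform removes violating content and shares only its hash —
    no account or user information is included in the signal."""
    h = content_hash(content)
    shared_hash_db.add(h)
    return h

def matches_known_violation(content: bytes) -> bool:
    """Another platform checks an upload against the shared hashes."""
    return content_hash(content) in shared_hash_db

# One platform flags a piece of content; another detects a re-upload.
flag_content(b"example violating media bytes")
print(matches_known_violation(b"example violating media bytes"))  # True
print(matches_known_violation(b"unrelated content"))              # False
```

Because only the hash travels between companies, each platform can check whether the same content exists on its own service without any identifying information about the account that posted it changing hands.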
Social media platforms, including Meta, TikTok and Snapchat, have long been criticized for not doing more to moderate content teens consume, including video and images of self-harm. All three platforms have been sued by parents and communities who said content on the platforms led to suicides. In addition, in 2021, leaked internal research, known as the “Facebook Papers,” revealed Meta was aware that Instagram, which it owns, could have harmful effects on teen girls.
Daniel Weiss, chief advocacy officer of the nonprofit media and technology watchdog Common Sense Media, said the Thrive program, while well-intentioned, appears to be similar to safety measures companies have taken under pressure “from lawmakers and advocates to make their products safer for kids.”
“We are glad to see these companies working together to remove the types of content associated with self-harm and suicide,” Weiss said. “Without proper regulatory guardrails, the jury is out on whether this will have a significant impact on the harms that kids and teens are facing online.”
A study archived in the National Library of Medicine found that a major uptick in minors’ use of social media has led to an increase in depression and suicidal ideation among those users. The study also suggests that young people who self-harm are more active on social media.
Meta announced this year that it would begin removing and limiting sensitive, “age-inappropriate” content from teenagers’ feeds on its apps and that it had plans to hide search results and terms relating to suicide, self-harm and eating disorders for all users.
In its blog post Thursday, Meta said it removed 12 million pieces of content featuring suicide and self-harm from Facebook and Instagram from April to June. While Meta said it still wants to facilitate important conversations around self-harm and suicide, it hopes Thrive will help to keep graphic content off participating platforms.
If you or someone you know is in crisis, call or text 988 to reach the Suicide and Crisis Lifeline or chat live at 988lifeline.org. You can also visit SpeakingOfSuicide.com/resources for additional support.
CORRECTION (Sept. 13, 2024, 12:54 p.m.): A previous version of this article misstated the role of the National Library of Medicine. It archived a study on social media use and depression; it did not publish the study.