Meta, Snap, and TikTok Join Forces to Address Self-Harm Content in News Initiative

Thursday, 12 September 2024, 09:14

Meta is collaborating with Snapchat and TikTok to curb the spread of self-harm content online. Through the new Thrive program, the companies will identify and flag content involving suicide and self-harm, with the aim of making social media interactions healthier for users.
Source: NBC News

Meta, Snap, and TikTok Unveil Thrive Program

Meta is taking a stand against the proliferation of self-harm content by teaming up with Snapchat and TikTok. Announced on Thursday, the initiative, named Thrive, aims to prevent disturbing content related to suicide and self-harm from circulating across these major social media platforms. Developed in collaboration with The Mental Health Coalition, Thrive represents a significant step toward promoting mental health and safeguarding users.

How Thrive Works

  • Thrive enables Meta to identify content with suicide or self-harm themes and flag it for Snapchat and TikTok.
  • A shared database will help all participating companies address the same harmful content.
  • Signals are exchanged securely using Meta's technical infrastructure, the same technology used by the Tech Coalition's Lantern program.

Key Points Highlighted by Meta

Meta emphasizes that this initiative is focused on the content, not the users. According to Antigone Davis, Meta's global head of safety, “We’re prioritizing this content because of its propensity to spread across different platforms quickly.” The system assigns a unique hash, effectively a digital fingerprint, to each piece of identified harmful content, allowing other platforms to locate and remove identical material on their own services.
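
For readers curious about how this kind of hash-matching works in practice, here is a minimal sketch in Python. It is illustrative only: Thrive's actual APIs and hash format are not public, all names below are hypothetical, and production systems typically rely on perceptual hashes (which also catch near-duplicates) rather than the exact-match SHA-256 shown here.

```python
# Minimal sketch of hash-based signal sharing between platforms.
# Illustrative only; all names are hypothetical.

import hashlib

# Stand-in for the shared, cross-platform database of flagged hashes.
shared_hash_db: set[str] = set()

def fingerprint(content: bytes) -> str:
    """Return a hex digest identifying this exact piece of content."""
    return hashlib.sha256(content).hexdigest()

def flag_content(content: bytes) -> None:
    """Platform A shares the hash of violating content; the content
    itself and any information about who posted it stay private."""
    shared_hash_db.add(fingerprint(content))

def is_flagged(content: bytes) -> bool:
    """Platform B checks an upload against the shared database."""
    return fingerprint(content) in shared_hash_db

# One platform flags a file; another detects the identical bytes.
sample = b"example media bytes"
flag_content(sample)
print("match found" if is_flagged(sample) else "no match")  # match found
```

The design point this illustrates is that only the fingerprint leaves a platform, never the media or any user data, which is consistent with Meta's statement that the program targets content rather than users.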

Addressing Concerns with Social Media

These collaborative efforts come amid growing scrutiny of how social media affects teenage mental health. Research available through the National Library of Medicine links heavier social media use among minors to increased risks of depression and suicidal ideation.

Earlier this year, Meta announced proactive measures to keep sensitive content out of teenagers’ feeds and to hide search results associated with suicide and self-harm. From April to June, Meta reported removing 12 million pieces of such content across Facebook and Instagram.

Support Resources

If you or someone you know is experiencing a crisis, resources are available. Reach out by calling or texting 988 for the Suicide and Crisis Lifeline or chat live at 988lifeline.org. Further assistance can be found at SpeakingOfSuicide.com/resources.


Disclaimer: The information provided on this site is for informational purposes only and is not intended as medical advice. We are not responsible for any actions taken based on the content of this site. Always consult a qualified healthcare provider for medical advice, diagnosis, and treatment. We source our news from reputable sources and provide links to the original articles. We do not endorse or assume responsibility for the accuracy of the information contained in external sources.

This article was prepared using information from open sources in accordance with the principles of Ethical Policy. The editorial team is not responsible for absolute accuracy, as it relies on data from the sources referenced.

