
## Social Media Giants Unite to Combat Harmful Content
*September 15, 2024*
**San Francisco, CA** – In a groundbreaking move, Meta, Snapchat, and TikTok have joined forces with the Mental Health Coalition to launch “Thrive,” a program designed to combat harmful content related to suicide and self-harm.
Thrive will utilize a secure signal-sharing system, allowing participating companies to anonymously share information about flagged content. This will enable platforms to identify and remove similar content across their respective networks, effectively preventing its spread.
“Like many other types of potentially problematic content, suicide and self-harm content is not limited to any one platform,” stated a Meta blog post. “That’s why we’ve worked with the Mental Health Coalition to establish Thrive, the first signal-sharing program to share signals about violating suicide and self-harm content.”
The program relies on the sharing of hashes: irreversible digital fingerprints derived from pieces of content, which identify a file without revealing the content itself. When a participating company detects harmful content, it will share these hashes with the other tech companies, allowing them to quickly identify and remove matching content from their own platforms.
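As a rough illustration of how hash-based signal sharing works in principle, the sketch below uses SHA-256 exact-match hashing; Thrive's actual hashing scheme, matching logic, and exchange protocol have not been made public, so all names here are hypothetical.

```python
import hashlib

def content_hash(content: bytes) -> str:
    """Derive an irreversible fingerprint of a piece of content."""
    return hashlib.sha256(content).hexdigest()

# Platform A flags a piece of harmful content and shares only its hash;
# the raw content never leaves the originating platform.
shared_hashes = set()
flagged = b"<bytes of the flagged image or video>"
shared_hashes.add(content_hash(flagged))

# Platform B checks incoming uploads against the shared hash set.
def is_known_harmful(upload: bytes) -> bool:
    return content_hash(upload) in shared_hashes

print(is_known_harmful(flagged))         # True: exact match found
print(is_known_harmful(b"benign post"))  # False: no match
```

Note that exact cryptographic hashes only catch byte-identical copies; real content-matching systems typically use perceptual hashing so that resized or re-encoded variants still match.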
This collaboration represents a significant step forward in the fight against online harm. While artificial intelligence can help flag potentially harmful content, human review remains crucial for catching the more nuanced and subtle forms it can take.
“It’s good to see social media platforms… actually taking some responsibility and working together,” said Matt, TechRadar’s expert on fitness and wellness. “This should just be the first step on the road to success.”
The launch of Thrive signals a positive shift in how social media platforms approach content moderation. While challenges remain, this collaborative effort is a promising sign that tech giants are taking concrete steps against the harmful content that plagues their platforms.