Study Reveals Major Platforms’ Failures in Addressing Harmful Content

Mary

A recent investigation conducted by the Molly Rose Foundation has brought to light the inadequacies of major social media platforms in proactively identifying and removing harmful content, particularly content related to suicide and self-harm.

Between September 2023 and April 2024, the foundation analyzed more than 12 million moderation decisions made by six prominent platforms: Instagram, Facebook, TikTok, Pinterest, Snapchat, and X (formerly Twitter).

The study found that suicide and self-harm content accounted for a relatively small 2% of all decisions during this period; most moderated content instead concerned illegal activities (33%), unsafe or illegal products (23%), and pornographic or sexualized material (18%).

Notably, Pinterest accounted for 74% of all content moderation decisions concerning suicide and self-harm, with TikTok following at 24%. In contrast, Facebook and Instagram each made up only around 1%, while X and Snapchat lagged far behind at 0.14% and 0.04%, respectively.

The report highlighted a significant disparity in the platforms’ responses to harmful content, pointing to a clear lack of commitment and investment from Meta’s platforms, Instagram and Facebook, in effectively addressing violative suicide and self-harm content.

Moreover, the study revealed that platforms applied more than one restrictive measure in fewer than 0.2% of moderation decisions. For instance, TikTok detected nearly three million instances of suicide and self-harm content but suspended only two accounts.

The analysis also highlighted how the platforms’ moderation focused on different content formats. While Instagram’s decisions primarily concerned image-based posts, TikTok’s were spread across video, image, and audio content, with a notable share relating to audio, a format whose potential role in promoting harmful behaviors is often overlooked.

The research underscored the need for major platforms to respond to harmful content swiftly and consistently. TikTok stood out as the most responsive, making 94.4% of its moderation decisions on the same day the content was posted.

In conclusion, the study highlighted the urgent need for major platforms to address these shortcomings and ensure a safer online environment for users, particularly concerning sensitive topics like suicide and self-harm.
