The study revealed that of more than 12 million content moderation decisions, over 95% of successful detections were attributed to just two platforms, Pinterest and TikTok, pointing to a significant failure by the other major platforms. Instagram and Facebook performed remarkably poorly, each accounting for only around 1% of harmful content detected.
The Molly Rose Foundation criticized the inconsistency and inadequacy of content moderation on platforms like Instagram and Facebook, emphasizing the need for stronger regulation to prevent harm to children.
Ian Russell highlighted the urgent need for more robust regulation, noting that almost seven years after the tragic loss of his daughter, the failure of major tech companies to act shows a troubling disregard for youth safety.
The foundation argues that the Online Safety Act is insufficient to address these systemic failures in content moderation, and it is urging the government to pursue a new, more effective Online Safety Bill.