Bluesky's 2024 moderation report shows how quickly harmful content grew as new users flocked in
Briefly

In 2024, Bluesky's explosive growth drove a 17-fold increase in user reports, underscoring the platform's challenges in moderation and user safety.
Bluesky's report confirms its moderation team has expanded to approximately 100 members, with specialized roles dedicated to key policy areas such as child safety and harassment.
Planned improvements include updating users on the actions taken against content they report and enabling in-app appeals of takedown decisions.
With 26 million users, Bluesky now contends with harassment, misinformation, and impersonation, demanding substantial investment in moderation and the handling of user reports.
Read at Engadget