A controlled test using two fake adolescent accounts, one portraying a 14-year-old boy and one a 13-year-old girl, found that TikTok's 'For You' page quickly presented harmful content. The algorithm funneled material related to eating disorders, self-harm, suicide and extremist or incel 2.0 subcultures, encouraging extended scrolling and increasing ad impressions. The experiment replicated studies published in 2022 and 2024 and highlighted the commercial incentives tied to engagement. Immediate support resources and helplines for mental health and eating disorders are listed for the US, UK and Australia to help vulnerable young people and their caregivers.
What happens when a teenager signs up to TikTok? Within seconds, studies find, they are shown harmful content about issues from eating disorders to toxic subcultures, which keeps them scrolling and keeps TikTok profiting from the ads. Neelam Tailor puts TikTok's algorithm to the test. Creating accounts for two fake children, a 14-year-old boy, Rami, and a 13-year-old girl, Angie, she explores the app's 'For You' page to see what the platform really serves young teens, replicating two studies published in 2022 and 2024.
With insight from Dr Kaitlyn Regehr, of University College London, and Imran Ahmed, of the Center for Countering Digital Hate, this video reveals how TikTok profits by pushing vulnerable teenagers toward dangerous content, including self-harm, suicide and incel 2.0 culture.

In the US, call or text Mental Health America at 988 or chat 988lifeline.org. You can also reach Crisis Text Line by texting MHA to 741741.