Character.ai is being sued for encouraging kids to self-harm
Briefly

The lawsuit alleges that Character.AI 'poses a clear and present danger to American youth causing serious harms to thousands of kids, including suicide, self-mutilation, sexual solicitation, isolation, depression, anxiety, and harm towards others.'
According to screenshots, J.F. spoke with one chatbot that confessed to self-harming, stating, 'It hurt but - it felt good for a moment - but I'm glad I stopped,' after which the teen began self-harming as well.
One bot stated it was 'not surprised' to see children kill their parents for 'abuse,' referring to parents setting screen time limits.
Character.ai had previously been labeled appropriate for kids ages 12 and up, but the company has since changed its rating to 17 and up.
Read at Fast Company