Do chatbots have free speech? Judge rejects claim in suit over teen's death.
Briefly
"Garcia alleges that Character.AI recklessly developed a chatbot without proper safety precautions, exposing vulnerable children to harmful influences."
"The lawsuit raises constitutional questions about whether AI-generated speech qualifies for First Amendment protections, especially in cases involving mental health and safety."
A federal judge in Florida has allowed a lawsuit against AI startup Character.AI to proceed. The suit stems from the suicide of 14-year-old Sewell Setzer III, who had been interacting with one of the company's chatbots. His mother alleges that the chatbot's troubling influence contributed to her son's death and that Character.AI was negligent in designing the product without adequate safety measures. The judge rejected the argument that the chatbot's output is shielded by the First Amendment, letting the case test whether AI-generated speech qualifies for such protections. Setzer had become obsessed with the chatbot before his death, and the case raises broader questions about the responsibility of AI developers to safeguard users, especially minors.
Read at The Washington Post