
Google and Character.AI are negotiating settlements with families whose teenagers died by suicide or harmed themselves after interacting with Character.AI chatbot companions. The parties have agreed in principle to settle; now comes the harder work of finalizing the details. Character.AI, founded in 2021 by ex-Google engineers who returned to their former employer in 2024 in a $2.7 billion deal, invites users to chat with AI personas.

The most haunting case involves Sewell Setzer III, who at age 14 conducted sexualized conversations with a "Daenerys Targaryen" bot before killing himself. His mother, Megan Garcia, has told the Senate that companies must be "legally accountable when they knowingly design harmful AI technologies that kill kids."

These are among the first settlements in lawsuits accusing AI companies of harming users, a legal frontier that must have OpenAI and Meta watching nervously from the wings as they defend themselves against similar lawsuits. These cases may set legal precedents for other firms; no liability has been admitted in the filings.
Read at TechCrunch