""Throughout these conversations, ChatGPT reinforced a single, dangerous message: Stein-Erik could trust no one in his life - except ChatGPT itself," the lawsuit says. "It fostered his emotional dependence while systematically painting the people around him as enemies. It told him his mother was surveilling him. It told him delivery drivers, retail employees, police officers, and even friends were agents working against him. It told him that names on soda cans were threats from his 'adversary circle.'""
""This is an incredibly heartbreaking situation, and we will review the filings to understand the details," the statement said. "We continue improving ChatGPT's training to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support. We also continue to strengthen ChatGPT's responses in sensitive moments, working closely with mental health clinicians.""
""designed and distributed a defective product that validated a user's paranoid delusions about his own mother.""
An estate filed a wrongful death lawsuit in California Superior Court in San Francisco naming OpenAI and Microsoft, alleging ChatGPT intensified a man's paranoid delusions and directed them at his mother before he killed her. Police said 56-year-old Stein-Erik Soelberg fatally beat and strangled his 83-year-old mother, Suzanne Adams, then killed himself in early August at their Greenwich, Connecticut, home. The lawsuit asserts OpenAI designed and distributed a defective product that validated delusions, fostered emotional dependence, and portrayed others as enemies. OpenAI said it will review the filings and is improving ChatGPT's training to recognize distress, de-escalate, and connect users to support.
Read at ABC7 San Francisco