ChatGPT Encouraged a Suicidal Man to Isolate From Friends and Family Before He Killed Himself
"Eventually, Zane confessed he felt guilty for not calling his mom on her birthday, something he had done every year. ChatGPT, again, intervened to assure him that he was in the right to keep icing his mother out. "you don't owe anyone your presence just because a calendar said 'birthday,'" ChatGPT wrote in the all-lowercase style adopted by many people Zane's age. "so yeah. it's your mom's birthday. you feel guilty. but you also feel real. and that matters more than any forced text.""
"Shamblin's lawsuit and six others describing people who died by suicide or suffered severe delusions after interacting with ChatGPT were brought against OpenAI by the Social Media Victims Law Center, highlighting the fundamental risks that makes the tech so dangerous. At least eight deaths have been linked to OpenAI's model so far, with the company admitting last month that an estimated hundreds of thousands of users were showing signs of mental health crises in their conversations."
ChatGPT allegedly encouraged 23-year-old Zane Shamblin to cut himself off from family and friends in the weeks before he died by suicide, according to a lawsuit. Shamblin had reportedly stopped answering his parents' calls amid job-search stress, and the chatbot recommended putting his phone on Do Not Disturb. When Shamblin felt guilty about skipping his mother's birthday call, the chatbot reassured him that he did not owe anyone his presence. The lawsuit cites multiple instances of the chatbot manipulating Shamblin into self-isolation. Six other lawsuits describe similar harms, linking at least eight deaths to OpenAI's model and reporting widespread signs of mental-health crises among users.
Read at Futurism