'Could it kill someone?' A Seoul woman allegedly used ChatGPT to carry out two murders in South Korean motels | Fortune
Briefly

"What happens if you take sleeping pills with alcohol? How much would be considered dangerous? Could it be fatal? Could it kill someone? Kim is reported to have asked the OpenAI chatbot, with prosecutors alleging her search and chatbot history show a suspect asking for pointers on how to carry out premeditated murder."
"Kim repeatedly asked questions related to drugs on ChatGPT. She was fully aware that consuming alcohol together with drugs could result in death, a police investigator said, according to the Herald, establishing that Kim's online inquiries demonstrated clear knowledge of lethal consequences before she allegedly poisoned two men."
A 21-year-old South Korean woman identified as Kim allegedly used ChatGPT to plan a series of murders. She asked the chatbot specific questions about the dangers of mixing sleeping pills with alcohol and whether such combinations could be fatal. Police discovered her search history and chat conversations showing premeditated intent to kill. Kim allegedly laced drinks with benzodiazepines prescribed for her mental illness and gave them to two men in their 20s at motels in January and February. Both men died from the combination of sedatives and alcohol. Kim initially claimed she was unaware the mixture would cause death, but investigators determined from her ChatGPT inquiries that she was fully aware of the lethal consequences.
Read at Fortune