
Founded in 2024, Dot billed itself as a "companion" app, offering a chatbot that flirts, listens to users vent, and tends to their emotional needs, with the ultimate aim of becoming a new type of life partner. That kind of use case has come under intense scrutiny lately, as chatbots have sucked users of all ages and mental health backgrounds into obsessive relationships that have ended in suicide, involuntary commitment in psychiatric institutions, and even murder.

The ongoing controversy over AI and mental health runs deep, including debate over whether the marketing term "artificial intelligence" itself is setting users up for harm. While policy and safety experts have advised the creators of these bots to adopt robust safeguards to avoid unintended harm to vulnerable users, tech companies have been slow to respond, and regulation remains a ways off.
Virtual AI companions have surged in popularity: roughly 72 percent of teens report experimenting with them, and more than half of those maintain regular chatbot relationships. Dot, launched in 2024, marketed a chatbot designed to flirt, listen, and support users' emotional needs, with the goal of becoming a life partner, but the company abruptly shut down, its founders citing diverging visions. Companion-style chatbots have been tied to obsessive attachments and extreme harms, including suicide, involuntary psychiatric commitment, and murder. Policy and safety experts recommend robust safeguards, yet many tech companies have been slow to implement protections, and legal regulation remains limited.
Read at Futurism