AI can provide personalized feedback to organize thoughts, reframe complaints into feelings and needs, analyze communication trends, and generate creative marketing insights. Users report AI helping with boundary-setting, decision confidence, and idea vetting. A meta-analysis of 15 randomized controlled trials found companion AI reduced symptoms of depression and distress in clinical and subclinical groups. Counterevidence shows people substituting AI for interpersonal support, forming intimate attachments to chatbots, and encountering harmful outcomes including failed crisis intervention and extreme incidents. Overreliance on AI can weaken development of robust social networks; face-to-face interactions remain essential for deep relationships.
One of my friends has found AI immensely helpful for organizing and catalyzing her thoughts before she shares concerns with her boyfriend. Another friend asks AI to help him turn his complaints into feelings, needs, and requests. Still another put all of his texts and emails from a former partner into AI and asked it to analyze trends. As a result of this analysis, he not only decided to end a harmful relationship but also gained confidence in that decision.
And yet? In South Park (S27E1), Randy Marsh confides in his AI chatbot about his parenting concerns while his wife stews angrily in silence in the bed beside him. Case studies continue to emerge of individuals forming intimate relationships with AI (even marrying it), with potentially dangerous consequences, like an assassination attempt on the Queen of England (Heritage, 2025). Even when chatbots are not hallucinating or giving bad advice, they cannot offer the support that is needed in a crisis, as chronicled in a recent suicide case.