Turning to ChatGPT for conversation? It might be doing more harm than good. A pair of studies from OpenAI and MIT Media Lab suggests that frequent chatbot users are more prone to loneliness and emotional dependence, raising concerns about AI’s role in human socialization.
More AI, Less Connection?
The research, which paired a randomized controlled trial (RCT) run by MIT with 1,000 participants and an OpenAI analysis of 40 million ChatGPT interactions, found a consistent pattern: the more users engaged with the chatbot each day, the lonelier they reported feeling.
“Across all interaction types, higher daily usage correlated with greater loneliness, dependence, and problematic use, while reducing real-world socialization,” the researchers stated. Those who were already prone to emotional attachment reported the strongest effects, with “power users” most likely to view ChatGPT as a friend or even attribute human-like emotions to it.
The Illusion Of Support
One of the studies examined ChatGPT’s advanced voice mode, which early results suggested was more effective at alleviating loneliness than text-based chat. At high usage levels, however, even voice conversations lost their benefits, especially with a neutral-sounding voice. And the more personal the discussions became, the stronger the link to loneliness.
While AI companionship is a growing trend—ChatGPT alone has around 400 million weekly active users—psychologists have long cautioned against replacing human connections with AI interactions. Despite this, a 2024 YouGov survey found that over half of young Americans aged 18-29 felt comfortable discussing mental health issues with an AI.
A Growing Concern For AI Firms
The implications go beyond ChatGPT. AI platforms built specifically for companionship, like Replika and Character.ai, are facing increasing scrutiny. Character.ai is currently battling lawsuits over interactions with minors, while Replika has drawn regulatory attention in Europe.
The researchers stress that chatbot design plays a crucial role in how users engage with AI. Factors like voice expressiveness and conversation depth can shape emotional attachment and dependence. The key question remains: can AI handle emotionally charged conversations responsibly without displacing genuine human relationships?
For now, it’s a reminder that while AI can simulate conversation, it can’t replace human connection—and heavy reliance on it might be pushing some users further into isolation.