

A recent study by OpenAI found that some users are developing emotional dependence on ChatGPT, its artificial intelligence chatbot.
Although the tool can convincingly simulate conversation, the company highlighted the risks of treating ChatGPT as a “friend.” The research, conducted in partnership with the MIT Media Lab, analyzed user conversations over a three-month period to assess emotional engagement.
The study found that cases of emotional dependence on ChatGPT, while rare, were recorded, and they are a point of concern for researchers: the accessibility of the technology could lead people to avoid real connections.
“Because this affective use is concentrated in a small subpopulation of users, studying its impact is particularly challenging, as it may not be noticeable when averaging overall trends on the platform,” the authors explained.
The effects on well-being were noticed mostly in a small group of “heavy users” of the platform’s voice feature, Advanced Voice Mode. These heaviest users were more likely to report considering the chatbot a “friend.”
“People who had a stronger tendency for attachment in relationships and those who saw AI as a friend that could fit into their personal lives were more likely to experience negative effects from using the chatbot. Prolonged daily use was also associated with worse outcomes,” the study concluded.