Fine for the Majority
The research revealed that the vast majority of people surveyed didn’t engage emotionally with ChatGPT.
But the findings carry a warning: for some people, there is a growing reliance on, and even an emotional need for, the chatbot.
The MIT research identified a profile of user who was more likely to suffer these negative effects. “People who had a stronger tendency for attachment in relationships and those who viewed the AI as a friend that could fit in their personal life were more likely to experience negative effects from chatbot use,” the researchers explain.
This means that people who are perhaps already lonely or unhappy are the most vulnerable. As Futurism writes: “…the neediest people are developing the deepest parasocial relationship with AI — and where that leads could end up being sad, scary, or somewhere entirely unpredictable.”
OpenAI says that the research will help the company “…lead on the determination of responsible AI standards, promote transparency, and ensure that our innovation prioritizes user well-being.”
It adds: “We are focused on building AI that maximizes user benefit while minimizing potential harms, especially around well-being and overreliance.”