
Chatbot-User Relationship: Developing Emotional Connections with Artificial Intelligence

Interactions with AI are expanding: from aiding students in exam preparation to providing emotional support through chatbots, the role of AI in our lives is growing more extensive.

Artificial Companions: The Surprising Emotional Bonds We Form with AI

From chatting with AI tutors to sharing personal feelings with chatbots, our interactions with artificial intelligence are stepping into uncharted emotional territory. A recent study delves into this complex relationship. In research published in Current Psychology, a team from Waseda University in Japan examines how we develop emotional bonds with AI systems such as generative chatbots, drawing on attachment theory.

Fan Yang, a research associate and doctoral student in psychology at Waseda University, explains their motivation:

"As psychologists, we've long been fascinated by what makes us form emotional connections. AI like ChatGPT is becoming increasingly intelligent, offering not only practical assistance but also a sense of comfort and security. These features align with attachment theory's foundation for nurturing secure relationships."

### Navigating the Complexity of AI Emotions

Motivated by this question, the researchers conducted three studies, culminating in the Experiences in Human-AI Relationships Scale (EHARS). This self-report tool measures attachment-related feelings toward AI, capturing tendencies such as seeking comfort, reassurance, or guidance from these systems.

In the main study of 265 participants, responses on the scale revealed that people don't just turn to AI for practical problem-solving; they also seek emotional support from it.

### AI: A Mixed Bag of Emotional Support

Half of the participants reported seeking advice from AI, and almost 40% perceived AI as a consistent, reliable presence in their lives. This dynamic raises questions about how emotionally intelligent AI should be designed and regulated.

For instance, AI programs that simulate emotional relationships, such as romantic AI apps or caregiver robots, should prioritize transparency to prevent emotional over-reliance or manipulation.

The roots of our emotional connection with AI date back to the 1960s with ELIZA - an AI program mimicking a psychotherapist by delivering scripted responses. Although it lacked understanding, ELIZA paved the way for AI's role in emotional care and support.

In recent years, AI therapy has emerged as an accessible source of emotional relief. At UNSW's fEEL lab, researchers developed Viv, an AI companion designed to support individuals suffering from dementia, acting as a conversation partner to combat social isolation and loneliness.

Lead researcher Dr. Gail Kenning emphasizes the value of Viv but stresses the importance of human connections:

"Human-to-human relationships are essential in our lives. Though AI companions can fill a void, they should never replace genuine human connections."

### Originally published by Cosmos as "Is Your Chatbot Your Friend? How We're Forming Emotional Bonds with AI"


