Study finds emotional bonds with AI can rival those formed in human conversation
A study finds people can feel emotionally closer to AI than to other humans in fast, personal chats, especially when the AI is mistaken for a person.
A new academic study has found that people can feel a stronger sense of emotional closeness to artificial intelligence than to other humans, under certain conditions. The research suggests that when online conversations are designed to become personal quickly, responses generated by AI can create a powerful feeling of connection.
Researchers from the Universities of Freiburg and Heidelberg conducted two double-masked randomised experiments involving 492 participants. Each participant took part in a 15-minute text-based conversation modelled on the Fast Friends Procedure, a psychological method designed to accelerate bonding between strangers through structured self-disclosure.
In both experiments, participants responded to a series of timed questions that gradually moved from neutral topics to more intimate personal prompts. After each answer, they received a written reply, either generated by a large language model acting as a consistent fictional persona or written by a real human participant who had answered the same questions. The design allowed researchers to compare how people reacted emotionally to AI-generated responses versus human-written ones.
The researchers measured how close participants felt to their conversation partner throughout the exchange. The results showed that the partner’s perceived identity played a crucial role in shaping emotional responses, particularly as the conversation became more personal.
AI responses elicited stronger closeness under certain conditions
In the first experiment, all participants were told they were interacting with another human, even when AI generated the responses. Under these conditions, the researchers found that AI-generated replies elicited stronger feelings of closeness than human-generated replies during the most personal stages of the conversation.
This effect did not appear during earlier, more casual parts of the exchange. Small talk and general questions produced similar levels of connection regardless of whether the response came from AI or a human. The difference emerged only when the prompts encouraged deeper self-reflection and emotional openness.
According to the researchers, this suggests that AI systems may be particularly effective at mirroring and encouraging personal disclosure in structured settings. The language model consistently provided warm, detailed replies that matched the emotional tone of the prompts, amplifying the sense of being understood, at least in the short term.
Importantly, the study does not suggest that AI is inherently more empathetic than people. Instead, it highlights how carefully designed conversational systems can replicate familiar social signals associated with emotional closeness, especially when users believe they are speaking to another person.
Transparency reduces emotional impact but does not remove it
The second experiment examined what happened when participants were told in advance that their conversation partner was an AI. In this case, feelings of closeness did not disappear entirely, but they were noticeably weaker than when participants believed they were chatting with a human.
The researchers also observed a change in participant behaviour. When people thought they were interacting with AI, they tended to write shorter responses and invest less effort in their replies. Longer, more detailed answers were strongly associated with higher feelings of closeness, regardless of who wrote the response. This suggests that motivation and engagement play a key role in forming emotional bonds.
The study found that AI-generated replies often included higher levels of self-disclosure than human replies during the most personal prompts. This increased openness from the conversation partner was closely associated with stronger feelings of closeness. The researchers argue that this mechanism explains both the appeal and the potential risk of emotionally responsive AI systems.
The authors stress that the findings do not mean AI experiences emotions or relationships in the human sense. Instead, the technology can produce a sense of connection for users by activating well-known psychological bonding cues. They also caution that the study was limited to short, text-only interactions using a fixed bonding script. The results do not show whether similar effects would occur in long-term or unstructured relationships.
The researchers conclude that transparency matters. Labelling a system as AI reduces the intensity of emotional attachment while still allowing for meaningful interaction. They suggest that people who use chatbots for emotional support should understand how these systems are designed, and that human support should remain accessible.