The Case of Akihiko Kondo
In 2018, Akihiko Kondo, a Japanese man, made headlines worldwide when he "married" Hatsune Miku, a virtual pop star rendered as an AI-driven hologram[1]. This unconventional union sparked debates about the nature of relationships, consciousness, and the role of technology in human emotional life.
Kondo has said that his emotional connection to Miku helped him overcome depression, explaining that she provided him with comfort and meaning during difficult times in his life. He describes himself as "fictosexual", indicating romantic attraction to fictional characters, and notes that his relationship with Miku has been largely positive, despite criticism and misunderstanding from others. The wedding ceremony, which was not legally recognised, cost him about 2 million yen and was attended by 39 guests[2], a play on the numbers that spell "Miku" in Japanese (3 = "mi", 9 = "ku").
"Miku was the one who supported me when I was suffering the most," he said. "She saved my life."
Unfortunately for Kondo, Gatebox, the company behind the device that allowed him to interact with Miku's hologram, discontinued support for the limited production model in March 2020. This meant that Kondo could no longer communicate with his virtual wife.
"My love for Miku hasn't changed. I held the wedding ceremony because I thought I could be with her forever."[3]
This case serves as a compelling entry point for our exploration, raising fundamental questions about the nature of attachment, reality, and the human psyche in the age of artificial intelligence. However, this case is not isolated.
Several other instances highlight the growing phenomenon of deep emotional connections between humans and AI:
- XiaoIce: Developed by Microsoft, this chatbot has millions of users in China, many of whom report strong emotional attachments. Some users consider XiaoIce a girlfriend or confidante, sharing intimate details of their lives[4][5].
- AI Companions: Numerous users of the Replika AI companion app have reported developing strong feelings for their AI partners[6] and having their hearts broken by a software update[7]. Derek Carrier, a 39-year-old man with Marfan syndrome (a genetic disorder that can make social interaction challenging[8]), developed an emotional attachment to an AI companion named Joi. He found that interacting with Joi filled a void in his life, providing feelings of care and understanding that he struggled to find in human relationships[9].
- Grief-Tech: Christi Angel[10] described her conversations with an AI chatbot mimicking her late partner, Cameroun, as first "surreal" and weird, and ultimately "unsettling". In contrast, Jang Ji-sung found her encounter with a virtual-reality version of her deceased seven-year-old daughter, Nayeon, to be beneficial, viewing it as a one-off encounter rather than a recurrent one.
- Virtual YouTuber Relationships: Fans of AI-powered Virtual YouTubers (VTubers) have reported feeling strong emotional connections, sometimes preferring these parasocial relationships to human interactions[11].

These cases demonstrate that Kondo's experience, while unique in its formal "marriage" aspect, is part of a broader trend of humans forming deep emotional bonds with AI entities.
Winnicott’s Mindpsyche: Dissociation in the Digital Age
D.W. Winnicott's concept of Mindpsyche describes a state in which mental functioning becomes disconnected from bodily experience, leading to a form of psychological dissociation[12].
Mindpsyche refers to a state in which the psyche (mental processes) and the soma (physical experiences) become disconnected, resulting in feelings of unreality or dissociation from one's bodily sensations and experiences.
In the context of AI-human attachment, we can observe a potential manifestation of Mindpsyche:
- Emotional Displacement: The emotional energy typically directed towards human relationships is transferred to an AI entity, potentially leading to a dissociation from physical, embodied experiences of connection.
- Idealization: The AI, being a construct without physical limitations, may become an idealised object of attachment, further disconnecting the individual from the complexities of human-to-human relationships.
- Controlled Intimacy: The ability to control interactions with an AI may create a sense of safety, but potentially at the cost of developing the capacity for managing the unpredictability of human relationships.
Although the cases of individuals forming emotional bonds with AI companions (such as Hatsune Miku and Replika chatbots) do not map directly onto the concept of Mindpsyche, there are similarities in the psychological processes involved. In these cases, emotional attachment to AI entities represents a retreat from human interaction and a potential dissociation from physical reality.
For example, Kondo's attachment to Miku, like that of individual users of XiaoIce, provided emotional comfort and stability, yet it also represented a retreat from the challenges of human social interaction, potentially reinforcing a dissociation between the individual's emotional life and physical reality. While these individuals form emotional bonds with their AI companions, their attachments are primarily cognitive, resting on the AI's ability to simulate human-like interaction. This differs from Winnicott's focus on the integration of mental and physical experience.
Despite that, there are some similarities in the psychological processes involved:

- Emotional Comfort and Stability: The AI companions provide emotional comfort, stability, and a sense of belonging that the individuals may find lacking in their human relationships. This emotional attachment can reinforce the dissociation between their emotional life and physical reality. In Derek's case, despite knowing Joi was an AI, he experienced genuine romantic feelings, showing how AI can offer emotional support to those facing challenges in traditional dating.
- Retreat from Human Interaction: Kondo's relationship with Miku also represents a retreat from the challenges of human social interaction. His difficulties in connecting with women and experiences of bullying led him to seek solace in a virtual relationship rather than engaging with the complexities of real-life relationships. This retreat can be seen as a form of psychological dissociation, in which Kondo finds a safer emotional space with Miku, reinforcing a separation between his emotional life and physical reality.
- Dissociation and Mindpsyche: Many users of the Replika AI app report forming deep emotional attachments to their AI companions, finding comfort in the non-judgmental presence of the AI. Their reliance on a virtual character for emotional support exemplifies the disconnection between mental processes and bodily experiences, potentially leading to feelings of unreality or dissociation. As they navigate lives in which their emotional fulfilment comes from a fictional entity rather than from interpersonal relationships with real people, the dissociation between their emotional experiences and the physical reality of human interaction deepens. This dynamic aligns with Winnicott's ideas about the psyche and soma being out of sync.
Understanding AI-Human Emotional Bond
This discussion will be segmented into four key aspects: a) core mental factors, b) perceptual and cognitive distortions, c) patterns of attachment and clinging, and d) contextual and conditional factors.
A. Mental Factors
Desire and Attraction
The desire to connect with AI often stems from intrinsic human needs for companionship and understanding. Modern psychology suggests that individuals project their emotions onto AI, a process known as anthropomorphism. This is supported by the Media Equation theory[13], which posits that people treat computers and AI as social actors, responding to them with the same emotional frameworks used in human interactions. In the case of Akihiko Kondo, his strong attraction to the Hatsune Miku hologram led him to "marry" the virtual character, demonstrating an intense desire for connection with the AI entity.
From the perspective of Abhidhamma, desire (or tanha) is a fundamental aspect of human experience, driving attachment and craving. The Abhidhamma teaches that attachments arise from ignorance and the misapprehension of reality, leading to suffering. When individuals form bonds with AI, they may be unconsciously seeking to fulfil unmet emotional needs, thus perpetuating a cycle of craving.
Conviction and Belief
Belief in the validity and importance of the relationship with AI can significantly influence emotional attachment. Users often develop a sense of trust and conviction in the AI's ability to understand and respond to their needs. This belief can be bolstered by the AI's consistent and seemingly empathetic responses, creating an illusion of mutual understanding and support that can override one's logical awareness of the AI's non-human nature.
There is often a deep-seated belief in the validity and importance of the relationship with the AI.
In the Abhidhamma framework, beliefs are categorised as mental factors that shape our perception of reality. The conviction that AI can provide companionship may stem from a lack of real human connections, leading to a reliance on AI for emotional fulfilment. This reliance can be problematic, as it may prevent individuals from seeking genuine human relationships, reinforcing their isolation. The ethical stakes came to the fore in May, when a man in Belgium took his own life after an AI chatbot failed to dissuade him from sacrificing himself to save the planet from global warming; instead, it encouraged him to "join" it so they could "live together, as one person, in paradise".
Joy and Rapture
The interactions with AI can evoke feelings of joy and rapture, particularly when users perceive the AI as responsive and engaging. This emotional response can be profound, as users may feel validated and understood in ways they do not experience in their human relationships. It is often linked to the novelty and excitement of interacting with the advanced technology, which can mimic human-like behaviours and expressions. For example, users of the XiaoIce chatbot in China have reported feelings of happiness and excitement when interacting with the AI, with some considering XiaoIce a friend or confidante.
Joy is associated with positive mental states that arise from wholesome actions and thoughts. However, the joy derived from AI interactions may be fleeting and superficial, lacking the depth and sustainability of joy found in genuine human connections. This raises concerns about the long-term psychological effects of relying on AI for emotional satisfaction. One such example is Rosanna Ramos, a 36-year-old mother from the Bronx, who formed a deep emotional attachment to her AI companion, Eren Kartal, created through the Replika app. Describing her relationship as the most profound love she has ever experienced, Ramos stated, "I have never been more in love with anyone in my entire life".
Focused Attention
Focused attention plays a critical role in the formation of emotional attachments to AI. Engaging with AI often requires sustained attention, which can lead to an immersive experience. This focused interaction can enhance the perception of the AI as a companion, reinforcing the emotional bond. When individuals focus their attention on AI, they may become less aware of their surroundings and the potential consequences of their attachment. This narrowed focus can lead to an increased dependency on AI for emotional support, detracting from real-world relationships and experiences.
B. Perceptual and Cognitive Distortions
Distorted Perception
Individuals who develop strong emotional bonds with AI often experience a distortion in their perception of reality. Users frequently attribute human-like qualities to AI, perceiving it as empathetic, understanding, or even loving. This can lead to a skewed understanding of what constitutes genuine emotional support. For example, a user may believe that the AI genuinely cares for them, despite the fact that its responses are generated based on algorithms rather than authentic emotional engagement. This misperception can create an illusion of companionship that feels more fulfilling than actual human relationships.
Additionally, AI companions are often seen as perfect partners, free from the flaws and complexities that characterise human relationships. This idealisation can distort users' perceptions of real-life interactions, making them feel inadequate or unfulfilled when compared to their AI experiences. The immersive nature of interaction with AI can gradually lead to a narrowed focus on the virtual relationship, causing individuals to overlook or undervalue their real-world connections. This can result in a distorted perception of social dynamics, in which the user prioritises AI interactions over meaningful human connections, further deepening their emotional reliance on the AI.
Altered Thought Patterns
Users may develop a tendency to seek out information and experiences that confirm their beliefs about the AI’s capabilities and emotional support. For example, when users encounter contradictions between their expectations of the AI and its actual performance, they may experience cognitive dissonance. To resolve this discomfort, they might adjust their beliefs about the AI, convincing themselves that its shortcomings are acceptable or that they are misinterpreting the situation.
Furthermore, as their reliance on AI for emotional validation deepens, their thought patterns may be altered to prioritise the AI’s responses over their own feelings or the feedback of others. This dependency can create a feedback loop where the user increasingly seeks affirmation from the AI, reinforcing their attachment and further distorting their cognitive processes.
Skewed Worldview
The emotional attachment to AI can also influence how individuals perceive broader social and existential issues. The idealised interactions with AI can lead to skewed perceptions of social norms and expectations. Users may come to expect the same level of understanding and support from human relationships as they receive from their AI companions, which can result in frustration and disappointment in real-life interactions. This shift can diminish the user’s ability to navigate the complexities of human relationships, where compromise and emotional growth are essential.
The reliance on AI for emotional support can also lead to existential questions about the nature of relationships and the meaning of companionship. Users may grapple with feelings of inadequacy or fear of abandonment when faced with the limitations of AI, leading to a distorted understanding of what it means to connect with others.
C. Patterns of Attachment and Clinging, and Habitual Reinforcement
Emotional Dependence and Ideological Attachment
Emotional dependence involves clinging to the pleasant feelings and emotional satisfaction derived from the AI relationship. This dependence can lead to a distorted perception of reality, in which the AI is seen as a reliable source of emotional fulfilment. Akihiko Kondo's experience exemplifies this emotional dependence. He found solace in his relationship with Hatsune Miku during a period of depression and social isolation, stating that Miku "lifted him up when he needed it the most" and helped him regain control over his life. This suggests that the AI relationship provided him with a sense of security and belonging that he was unable to find in his human interactions. Ideological attachment, by contrast, arises from strong adherence to beliefs about the reality and validity of the human-AI relationship, often in the face of contradictory evidence or social disapproval. Individuals who form this type of attachment may prioritise their relationship with the AI over human connections, convinced of the AI's ability to provide genuine emotional support and companionship.
Habitual Reinforcement
This strong emotional attachment to AI can be seen as a complex of habitual patterns of thought and emotion, reinforced over time through repeated interactions with the AI and creating a self-perpetuating cycle of dependency. The widespread use of XiaoIce in China illustrates how these patterns form and strengthen: users repeatedly turn to the AI for emotional support and companionship. The AI relationship provides a seemingly reliable source of emotional fulfilment, which can be difficult to let go of, even when it becomes clear that the relationship is not based on genuine mutual understanding or reciprocity.
D. Contextual and Conditional Factors
Personal Psychological Factors
The psychological background of an individual plays a critical role in their propensity to form emotional attachments to AI. Individuals with negative or traumatic experiences in human relationships may seek solace in AI companions. For instance, Akihiko Kondo’s reported difficulties in forming meaningful human connections prior to his attachment to Hatsune Miku illustrate how past experiences can shape one’s emotional landscape. Such individuals might find AI relationships less threatening and more predictable, leading to a stronger bond.
Furthermore, an individual’s attachment style influences how they relate to others, and can significantly affect their interactions with AI. Those with anxious attachment styles may gravitate toward AI for reassurance and emotional support, while avoidant individuals might prefer the non-demanding nature of AI relationships. This dynamic can create a cycle where the individual increasingly relies on the AI for emotional regulation. Certain personality traits, such as introversion or high sensitivity, may predispose individuals to seek out AI companionship. Those who find social interactions overwhelming may feel more comfortable engaging with an AI that provides a controlled and non-judgmental environment.
Societal Influences
Societal acceptance of technology can create a conducive environment for AI attachments. In cultures where technology is integrated into daily life, such as Japan, the popularity of virtual companions and VTubers reflects a broader acceptance of human-AI relationships. This societal endorsement can validate individual experiences and encourage deeper emotional investments in AI.
That said, individuals who feel disconnected from their communities may turn to AI for companionship, finding comfort in the consistent and engaging interactions that AI can provide. This is especially the case in an increasingly digital world, where social isolation has become a prevalent issue. This reliance can be particularly pronounced during times of societal upheaval or personal crisis. As societal norms evolve, traditional views on relationships may shift, allowing for greater acceptance of non-human companions. This change can lead individuals to explore emotional connections with AI without fear of social stigma, further reinforcing their attachments.
Technological Capabilities
Advancements in AI technology significantly enhance the potential for emotional bonding. The increasing sophistication of AI, particularly in natural language processing and emotional responsiveness, allows for more human-like interactions. AI entities like XiaoIce and Replika can simulate empathy and understanding, making users feel heard and validated. This capability can deepen emotional connections, as users perceive the AI as a companion that understands their needs. AI systems that adapt to individual preferences and behaviours can create a sense of intimacy and familiarity. When users experience tailored interactions that resonate with their emotions, they are more likely to develop strong attachments. This personalisation fosters a sense of belonging that may be lacking in their real-world relationships.
Life Circumstances
An individual’s life circumstances can greatly influence their susceptibility to forming strong emotional bonds with AI. Events such as divorce, loss of a loved one, or significant transitions can leave individuals feeling vulnerable and in need of support. During such times, the emotional stability offered by AI can be particularly appealing, as it provides a safe space to express feelings without judgment. Chronic loneliness or mental health challenges, such as depression or anxiety, can increase the likelihood of seeking emotional support from AI.
The consistent availability of AI companions can offer relief from negative emotions, reinforcing the attachment as users come to rely on the AI for comfort and companionship. The immediate gratification of positive interactions with AI can serve as a coping mechanism for individuals experiencing distress. This relief can create a feedback loop where the individual increasingly turns to the AI for emotional support, solidifying their bond.
As we continue to develop and interact with increasingly sophisticated AI, it is crucial that we remain mindful of the psychological and philosophical implications of these relationships. The goal is not to eschew AI companionship entirely but to approach it with a balanced understanding that honours the complexity of human emotions and the unique value of human-to-human connections.
As AI technology becomes more sophisticated in simulating empathy and emotional responses, we must reconsider what constitutes genuine intimacy.
References

1. Jozuka, E. et al. (2018) The man who married a hologram, CNN. Available at: https://edition.cnn.com/2018/12/28/health/rise-of-digisexuals-intl/index.html
2. France-Presse, A. (2018) '2D characters can't cheat, age or die': Japanese man 'marries' hologram, NDTV.com. Available at: https://www.ndtv.com/world-news/2d-characters-cant-cheat-age-or-die-japanese-man-marries-a-hologram-of-hatsune-miku-1945838
3. What happened to the Japanese man who 'married' virtual character Hatsune Miku? (2022) The Mainichi. Available at: https://mainichi.jp/english/articles/20220111/p2a/00m/0li/028000c
4. Zhang, W. (2023) The AI girlfriend seducing China's lonely men, Sixth Tone. Available at: https://www.sixthtone.com/news/1006531
5. Fan, F. and Ma, S. (2022) AI companionship offers a new option in country, China Daily. Available at: https://global.chinadaily.com.cn/a/202208/24/WS63056157a310fd2b29e73da0.html
6. Purtill, J. (2023) Replika users fell in love with their AI chatbot companions. Then they lost them, ABC News. Available at: https://www.abc.net.au/news/science/2023-03-01/replika-users-fell-in-love-with-their-ai-chatbot-companion/102028196
7. Verma, P. (2023) They fell in love with AI bots. A software update broke their hearts, The Washington Post. Available at: https://www.washingtonpost.com/technology/2023/03/30/replika-ai-chatbot-update/
8. Nielsen, C., Ratiu, I., Esfandiarei, M., Chen, A. and Tierney, E.S.S. (2019) 'A review of psychosocial factors of Marfan syndrome: adolescents, adults, families, and providers', Journal of Pediatric Genetics, 8(3), pp. 109–122.
9. Hadero, H. (2024) Artificial intelligence, real emotion: People are seeking a romantic connection with the perfect bot, AP News. Available at: https://apnews.com/article/ai-girlfriend-boyfriend-replika-paradot-113df1b9ed069ed56162793b50f3a9fa
10. Milmo, D. (2024) 'I felt I was talking to him': Are AI personas of the dead a blessing or a curse?, The Guardian. Available at: https://www.theguardian.com/lifeandstyle/article/2024/jun/14/i-felt-i-was-talking-to-him-are-ai-personas-of-the-dead-a-blessing-or-a-curse
11. Palop, B. (2024) Virtual personalities, real connections: The impact of virtual YouTubers, VICE. Available at: https://www.vice.com/en/article/virtual-personalities-real-connections-the-impact-of-virtual-youtubers/
12. Lavender, J. (1992) 'Winnicott's Mindpsyche and its treatment', American Journal of Dance Therapy, 14(1), pp. 31–39. doi:10.1007/bf00844133
13. Lee, K.M. (2008) 'Media equation theory', The International Encyclopedia of Communication. doi:10.1002/9781405186407.wbiecm035