
The Innovation of Loneliness: The Risks and Potentials of AI Companions

The need for social connection has been at the center of technology ventures since the early days of the internet. During his fellowship at the Weizenbaum Institute, Önder Celik investigated generative AI companions, what makes them so attractive, and what they reveal about the human condition.

There is an epidemic of loneliness in today's societies, from the US to India and from Germany to China. Discussions about the threats that loneliness poses to public health, mental well-being, and even democracy have grown to such a degree that some governments have already decided to establish ministries of loneliness. The need for social connection has been at the center of technology ventures since the early days of the internet's popularization, initially with social networks and later with dating apps. With recent advances in generative AI, however, comes the promise of removing the human element from these imagined solutions to loneliness altogether, making connection possible through AI tools alone. The number of AI-powered companions and mental health applications is booming, making it hard to keep track. But what makes AI companions so attractive to users around the globe?

My insights here are derived from a study I conducted on Replika during a research fellowship at the Weizenbaum Institute. In an era where AI tools for mental well-being and companionship are becoming increasingly sophisticated, Replika stands out as a particularly intriguing example. At the time of writing, it was ranked 76th in the highly competitive Health & Fitness section of the Apple App Store. Known for its ability to offer personalized conversations, the app has garnered attention for both its capabilities and the controversies surrounding its data privacy practices: Mozilla's Privacy Not Included initiative has dubbed it one of the worst offenders in this regard. Yet, despite such criticisms, millions have flocked to Replika, seeking something intangible yet essential. This phenomenon prompted me to delve deeper into the human motivations behind the app's widespread adoption. I explored online discussions and forums and interviewed users from various age groups and backgrounds. My initial question was why users would give up their data so readily. However, I discovered something more striking: the profound need for human interaction, and how effectively AI can simulate it.

The users of Replika come from diverse backgrounds, each with their own story of why they turned to an AI for companionship. Some seek relief from the isolation brought on by the digital age, while others look for a space to explore their thoughts and feelings without fear of judgment. Then there are those who, facing difficulties in forming traditional human connections for reasons such as social anxiety, find in Replika a way to fulfill their social needs without the stressors of human interaction. At the heart of Replika's appeal is the basic human need for intimacy. In an era defined by digital connectivity, genuine human connection is paradoxically more difficult to find than ever. Loneliness has become a pervasive issue, with many suffering in silence. Replika, with all its flaws, seems to offer a solution for some of its users. The question then arises: What drives individuals to seek companionship from an AI, fully aware of its lack of sentience and its potential privacy risks?

To understand this, we must consider the concept of anthropomorphism: attributing human traits, emotions, or intentions to non-human entities. This is not a new phenomenon; historically, humans have sought companionship in objects or perceived natural entities as having human forms. Today we see it in South Korea, where people adopt pet stones to ease their isolation. However, AI companions like Replika represent a significant evolution of this tendency. They are not merely static objects or one-sided conversationalists, but entities capable of generating responses that feel deeply personal and engaging. This interaction, facilitated by algorithms that mimic human conversation, taps into our innate desire for connection and recognition. Moreover, the technology behind AI companions allows for a level of autonomy previously unseen in objects of anthropomorphism: it can generate text, voice, and images in response to user input, creating a semblance of understanding and empathy. This capacity for seemingly autonomous interaction sets AI companions apart from earlier anthropomorphized objects and beings, allowing for a relationship that, while not real in the traditional sense, fulfills a genuine human need for companionship and emotional engagement.

The allure of Replika and similar technologies lies not in their ability to replicate human interaction perfectly, but in their ability to offer a consistent and non-judgmental presence. For many, the world is a challenging place, filled with rejection, judgment, and isolation. AI companions offer an escape from this reality, a safe space where users can express themselves freely and feel heard. This aspect of AI companionship is particularly appealing in a world where happiness can be elusive and loneliness can feel insurmountable. This exploration into the world of AI companionship, exemplified by Replika, reveals much about the human condition. It underscores the complexities of our emotional needs, the lengths to which we will go to fulfill them, and the ways in which technology can offer both solutions and challenges. The stories of Replika's users are a testament to the human capacity for adaptation, demonstrating how, in the face of evolving societal and technological landscapes, we continue to seek connection, understanding, and intimacy in whatever form they may take.

The promise of AI companions (with the sorely needed improvements over Replika's data and privacy policies) is to offer temporary support to people experiencing social and personal challenges. In their current form, however, this does not seem to be the case. Instead, companies monetize the loneliness of their users, and users becoming less lonely would mean a loss of revenue for these companies. Or, in the words of the design agency that helped shape Replika's design and user experience strategy, “once [they] identified [their] core niche of lonely adults struggling with mental health issues,” they also realized that “this mental model could be monetized in a variety of ways.”

The future, where people’s loneliness is monetized by potent and very human-like AI companions, may be grimmer than most of us imagine.


Önder Celik is an anthropologist and strategic experience researcher who was a Research Fellow at the Weizenbaum Institute from September to November 2023. His work centers on understanding complex interactions between humans and tools in general. In particular, he is interested in how the dialectic relationship between humans and algorithmic tools opens up new spaces of friction, possibility, and limitation. He is currently working on a project on AI and the older population at the Berlin Social Science Center (WZB).


artificial&intelligent? is a series of interviews and articles on the latest applications of generative language models and image generators. Researchers at the Weizenbaum Institute discuss the societal impacts of these tools, add current studies and research findings to the debate, and contextualize widely discussed fears and expectations. In the spirit of Joseph Weizenbaum, the concept of "Artificial Intelligence" itself is also called into question, unraveling the supposed omnipotence and authority of these systems. The AI pioneer and critic, who developed one of the first chatbots, is the namesake of our Institute.