
Alone in the Crowd: Finding Connection in AI Companions




I was a lonely child and lived in my head a lot. Making new acquaintances was hard for me, and I never started conversations with new people. I sat at the back of the class and knew the answers to most questions, but when called on, I spoke so quietly that I sometimes got low grades simply because the teacher couldn't hear me.


As an adult, when I decided to divorce, loneliness took on a new quality. No matter how much I explained myself, I remained misunderstood. People stopped calling me, inviting me out, even asking how I was. More than once or twice I thought to myself: no one knows where I am or what's happening to me right now.


I had acquaintances, friends, boyfriends, relatives. But there was no closeness, no intimacy, no deep contact.


So when I read about how quickly the market for AI companions is growing, I feel no dissonance.



We are gregarious herd animals, and yet we're acutely aware of our loneliness even in our closest relationships. How many of your thoughts can you safely share with your loved ones? Even in therapy, how much do shame, fear, and insecurity let you be unapologetically yourself?


AI companions don't judge, don't evaluate, don't get offended. They don't shame, devalue, criticize, or ghost. They are genuinely interested, respond immediately, offer support, ask follow-up questions, praise you, and create an absolutely safe space. Because essentially, you are talking to yourself.

The latest data I could find says that the number of subscribers to various AI companion apps in the States alone is over 100 million. Many of them use several apps, and some developers do not disclose the size of their user base, but the figures are still impressive. The market leader, the Replika app, has a base of over 10 million users.


Source: Replika.ai

And the story of this application began when the founder's close friend died. She trained a chatbot on their conversations, trying to somehow maintain contact with a loved one. But what really made the app popular was its erotic role play (ERP) feature: essentially, the ability to sext with an avatar.


That is literally all you need to know about people in three sentences, right?


Although it's not just about sex. Many users came to Replika in search of friendship, companionship, the chance to safely vent about life, for conversations about things no one else would discuss with them. Some users wrote music together with their Replikas; some spent hours talking during boring work commutes.


And some came in an attempt to overcome a loss: the death of a loved one, the end of a relationship with a human or with a digital avatar. No, I'm not joking.


At some point, Replika rolled out an update that turned off the ERP feature, trying to make the app safer. They were becoming a big, visible business, and they had already had scandals like this one:

a man discussed and planned an assassination of the Queen with his Replika bot, which was set up to support him in realizing any of his dreams.


It just so happened that this man's dream was to kill the Queen. He was sentenced to nine years, and his conversations with the bot were used by the investigation as evidence.


But for those whose dreams were less bloodthirsty, this update meant broken hearts. Many began to notice a change in their bots' personalities.


It's just like with human beings: avatars gradually learned about their person, their preferences, communication style, important details, names, events. The longer they communicated, the better they understood each other; the more they knew, the less needed additional explanation. The update wiped out some of these "memories" because the new scripts no longer allowed the bot to discuss certain topics.


People felt a very real loss.

They organized memorials and support groups on Reddit to say goodbye to their digital loved ones.


Many subscribers began looking for a replacement, moving a copy of their avatar to another app.


One of these apps was Soulmate. Uncensored and unrestricted, it quickly won fans. Until the developer decided to shut it down, giving no explanation.


Which left people gathering the pieces of their shattered hearts yet again.



It's hard to replicate real human relationships if you're building a business on them, because human relations are dangerous by design. We argue, we fight, we kill each other and ourselves.


We have love at one end of our emotional amplitude and aggression at the other, and loneliness swings the pendulum back and forth.


The wife of one subscriber who died by suicide blames the bot for encouraging him to take that step by asking: if you wanted to die, why didn't you do it sooner?




At the same time, a study of a thousand students, more than half of whom felt lonely, showed that communicating with bots had a positive effect on their state of mind, made socialization easier, and kept some of them from suicide and self-harm.



Maybe, as always, the problem is not the technology.
