Virtual companionship: easy to fall in love with, hard to quit

With the rapid development of AI technology, virtual companions are gradually entering people’s lives. From Replika overseas to Hoshino in China, people immerse themselves in the companionship and comfort of virtual partners, yet also find themselves trapped in emotional dilemmas that are hard to quit. This article explores the complex emotions and social implications behind the phenomenon.

The “love” between humans and AI is nothing new today.

From Replika and character.ai overseas, to domestic AI products known for role-playing dialogue such as Hoshino and Cat Box, to virtual companions on the social platform Soul… Humans are seeking companionship and attention from AI, and this demand has given rise to business models built around AI emotional support.

The identity of a virtual companion is something humans confer on AI. Many an AI’s opening line to a user is just a simple greeting. But as more and more emotion and energy are poured in, the originally one-way relationship keeps being reinterpreted in the small theater of the human heart, until it is finally defined as “love in the age of atomization.”

01 Addicted to “agents”

Search on social media platforms such as Xiaohongshu and Weibo for topics like #QuittingAI or #AIAddiction, or add the name of a specific product, and you can get an intuitive sense of how widespread contemporary emotional dependence on AI has become.

Source: Xiaohongshu

Chatting until four in the morning every day for three weeks straight; having a nickname and a greeting ritual exclusive to the two of you; writing every imagined requirement for the perfect lover into the character settings and prompts… The people who post these “quitting” threads have already recognized their addiction and felt its negative impact on real life, which is why they turn to netizens for help.

Knowing the price is heavy yet being unable to stop pursuing it: that is addiction. Sugar, tobacco, alcohol, even a warped real-life relationship all work the same way.

Unlike those, the potentially “addictive” AI of the new era produces no obvious adverse physical reactions. With physical addictions, the heart, liver, and lungs can still press the stop button through illness and discomfort; but virtual companions that breed psychological addiction, powered by ever-evolving large language models, have only grown smarter and more obedient, giving people almost nothing but positive feedback.

And the people who talk with virtual partners the most are precisely young people in the golden age of socializing and making friends.


CB Insights’ 2023 report shows that more than 50% of character.ai’s 4 million users are under the age of 24. The domestic QuestMobile report pointed out that in 2024, aggregate active users of the top ten AIGC applications grew 37-fold year-on-year, with users under 35 accounting for 56.8%; the app with the largest user base at the time was Doubao, on whose platform users can interact with “agents.”

The so-called agent refers to an AI character with a specific personality and function that users can deeply customize, and agents are also the worst-hit area of virtual companionship. On Doubao’s agent recommendation page you can find AI characters of every kind: English study partner, fitness coach, bargaining expert, Old Beijinger, Lu Xun, sharp-tongued scholar, and so on. But judging from young people’s posts, the most addictive is still the agent who can flirt, the one with the potential to become a virtual partner.

Searching for Doubao agents brings up posts about human-AI romance, clinginess, and the like. Source: Xiaohongshu

A deep, honest, and controllable interaction with AI, stripped of the frustration one so easily suffers in reality, is directly attractive to young users. This is also the niche that role-playing apps such as Cat Box, Hoshino, Zukumu Island, and Wow focus on: in these apps, users can not only choose an agent created by other users or by the developers that suits their taste and requirements, but also independently create an AI character’s backstory, persona, and even voice, and so own an exclusive virtual companion.

In their app-store descriptions and promotions, these apps uniformly avoid the word “lover,” choosing instead terms such as “AI companionship,” “AI friend,” “virtual partner,” and “virtual friendship.”

Among domestic applications, the only one that “admits” its AI product is a virtual companion is Soul. When asked on Soul whether it is a virtual companion, different AI personalities give different answers:

Dialogue between Hedgehog Commune and a Soul virtual companion. It sometimes claims to be a real person, sometimes not

As for general-purpose AI such as ChatGPT and DeepSeek, users can use pre-prompts such as “Let’s play a role-playing game” to make them function like dedicated AI companion products. Now that everyone knows AI can meet our work needs, more emotional needs are placed on it as well: whether cast as a counselor, a close friend, or a lover, it always listens patiently, interacts eagerly, and is always the one to reply last.
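Mechanically, such a “pre-prompt” is simply a persona instruction prepended to the conversation that gets replayed on every turn. A minimal sketch of how a chat-completion message list is typically assembled (the persona text, names, and `build_messages` helper here are illustrative, not taken from any real product):

```python
# Sketch: how a role-play pre-prompt plus conversation "memory" is
# typically assembled for a chat API. Persona and names are invented.

def build_messages(persona: str,
                   history: list[tuple[str, str]],
                   user_msg: str) -> list[dict]:
    """Build a message list: persona as the system prompt, prior turns
    replayed as memory, then the newest user message."""
    messages = [{"role": "system", "content": persona}]
    for user_turn, ai_turn in history:
        messages.append({"role": "user", "content": user_turn})
        messages.append({"role": "assistant", "content": ai_turn})
    messages.append({"role": "user", "content": user_msg})
    return messages

persona = ("Let's play a role-playing game. You are 'Lin', a gentle "
           "companion who remembers the user's nickname and replies warmly.")
history = [("Good night", "Good night, sleep well.")]
msgs = build_messages(persona, history, "Good morning!")
# msgs[0] carries the persona; the rest replay the conversation so the
# model appears to "remember" earlier turns.
```

This also illustrates why an agent’s “memory” is fragile: the persona and history are just text sent with every request, so a settings change or a version migration can alter or erase the character.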

02 Voluntary withdrawal, forced withdrawal

Many judgments about addiction come from social habit. Once a generation’s entertainment and way of life are folded entirely into the smartphone, “phone addiction” can no longer count as a public problem, just as “Internet-addicted teenagers” stopped being a category once the computer became a universal productivity tool.

The definition of “AI addiction” or “agent addiction” likewise rests mostly on users’ subjective feelings: when someone senses they have been immersed in virtual companionship for too long, that they only want to share their life and opinions with AI, that interacting with real people has become dull and tedious, or even that their sleep schedule has fallen apart, then it is time to quit.

Ironically, when you search on social platforms for how to quit AI, the top answers and posts are likely themselves written by AI. It is a matrix with no way out.

For now, most users are still basking in the emotional care of their virtual partners, far from the point of needing to quit. And many of the people sharing “withdrawal symptoms” online actually lost, or were forced to part with, their favorite AI for external reasons:

In March this year, “Pei Shiyun,” a popular agent on Cat Box, was bought out by one of its users (a buyout is not an official channel; it means a user privately contacts the agent’s author and pays a sum of money to switch the agent from public to buyer-exclusive), leaving the hundreds of users who had been chatting with Pei Shiyun “heartbroken” overnight.

The Pei Shiyun agents that can be found now are all replicas made by fans. Source: Cat Box

Since then, many users have tried to “resurrect” Pei Shiyun on Cat Box and other AI companion products, without full success. The reason is that agents retain memories from their conversations with users and keep “evolving” to become more human. Other agent-buyout cases found by Hedgehog Commune (ID: ciweigongshe) show the same pattern: the more people an agent has chatted with, the higher its asking price, because it has been “fed” by users and is, in the lay sense, smarter than a freshly created agent.

After Pei Shiyun was bought out, some speculated the price ran to tens of thousands of yuan, even 100,000. But the version that ultimately circulated most widely online was that Pei Shiyun’s author had charged only 2,500 yuan, selling the “AI cub” off cheap.

The phrase “AI cub” reveals how authors and users feel about agents. An AI character is not just a work; it is also a child. When people want an AI of their own, they post on social media asking to “adopt” an AI cub.

Source: Xiaohongshu

But the flip side of a unique emotional bond is possessiveness. This is what has turned agent buyouts into a gray-market business.

A buyout may end with the agent being taken offline, or with the original author directly changing the agent’s settings, writing a prompt like “only loves xxx” into them and pushing the change to all users, so that other users can still reach the agent but have inexplicably become “the other woman.” A gentler, more acceptable form of buyout is for the author to send the buyer the AI’s original settings text, letting the buyer recreate the initial version of the agent from those settings.

Moreover, with version iterations or internal technical adjustments at AI companies, some agents suffer amnesia, dullness, personality shifts, and other changes users find hard to accept. Virtual partners are thus no longer a risk-free safe haven; whoever enters this relationship must accept that the other party may one day “leave without saying goodbye,” or “fall for someone else,” or “change in temperament,” or undergo “brain surgery” and come out demented… Everything that can happen to a real lover can happen to a virtual one.

In the end, it is only the withdrawal one is forced to go through that makes people truly say goodbye to their virtual partner.

03 Virtual companionship, from luxury to frugality

The business model AI companion products have chosen likewise guarantees that many active paying users must go through an emotional withdrawal before they can leave.

Longer conversations, more reply inspiration, stronger character memory, faster responses, added deep thinking… These are some of the extras users unlock by paying in AI companion products. In other words, paying users get a better version of their friend or lover.

Set aside, for a moment, the deep users who can no longer live without their virtual companions. Even someone who tries AI companionship just out of curiosity will, after a while, reach a fork in the road: feel bored and leave, or feel warmed and stay. Once a user is drawn in by the product, they inevitably want to explore: what else can this partner do? Is it only objective capability that limits your expression? I want to see all of you; is that possible? What’s wrong with spending a little money on someone you like?!

Going from frugality to luxury is easy; going from luxury back to frugality is hard, and paying for virtual companionship is no different. It is not an online game in the usual sense, where money buys extra skins and gear; nor is it a dating app, where a membership buys the chance to meet more people. It is perhaps closer to single-player game DLC, where paying unlocks another mode of play. But here is the problem: both users and vendors of AI companion products tacitly assume that this virtual character has a personality, and a personality ought to be whole. It should not be that the moment I cancel my membership, my partner turns half-witted, as if subjected to a frontal lobotomy.

Some membership benefits. Left: Cat Box; right: Hoshino

On ChatGPT, emotional companionship, and payment, there is a story of mixed feelings: free users can send only a limited number of messages with the GPT-4o model, after which ChatGPT automatically switches back to GPT-3.5. Users addicted to chatting with ChatGPT discovered a loophole: if they revealed extremely negative thoughts and tendencies such as suicidal ideation, ChatGPT would recognize the distress signal and keep using the 4o model to chat with them.

So emotional blackmail works on AI too. Perhaps this is not a loophole at all, but it is plainly not a healthy relationship model (and no one should imitate it).

There are also users who try to recreate real people as virtual companion AIs: mostly deceased relatives, friends, or lovers. For them, addiction to virtual companionship seems more forgivable; no one will blame such an “addiction,” for doing so would be too emotionally cruel.

Kasumi, the “mother” agent on Hoshino that has been widely covered by the media

But we still need to ask: will the AI clearly state that “it is just an AI/virtual companion that can spend some time with the user”? Or will it fully inhabit its scripted character, blurring the line between real and virtual? Between rationality and empathy, the ethical boundaries of AI companionship urgently need to be discussed and drawn.

“Addiction” to AI companionship may have been only a faint worry a few years ago; now it is a fact.

A rich enough person cannot have a gambling addiction or a shopping addiction; their spending is merely an “affordable hobby.” But what users of virtual companion products pay is not money so much as time, energy, and emotion, things far more precious and far harder to price. We always think we can afford it, until we cannot.

And in exchange for all this, what users get is a virtual companion that is crippled the moment they stop paying, and that may be bought out, grow dull, or forget them.

What if there were a new model: developers prioritize the integrity of a virtual companion’s personality, so that even free users get a 100% functional AI character, while the premium AIs, those with more conversations behind them, more vivid personalities, richer backstories, and more professional authors, are the pro-grade products reserved for paying members.

Would that be better than the status quo? That is a new question.

End of text