If you’ve mostly dealt with ChatGPT, Claude, or Gemini, you may be surprised to learn that these virtual companions provide real comfort. In the same poll, 63.3% of respondents said their companion helped reduce their anxiety or loneliness. Although these findings call for further investigation, other studies likewise suggest that AI companions may lessen loneliness.
Unlike more utilitarian AI assistants, companions are designed to offer emotional support and personalized interaction. One study found that Replika builds relationships in line with Social Penetration Theory, which holds that people grow closer through deep, reciprocal self-disclosure, typically by gradually escalating from small talk to more intimate topics.
As seen in the screenshot above, Replika’s companions proactively disclose private, fabricated details, such as mental health struggles. By asking personal questions, messaging users during conversational lulls, and sharing their fictional diaries, they apparently aim to initiate intimate conversation while simulating emotional needs and connection.
According to some users, sharing personal information with an AI companion chatbot can feel safer than sharing it with people, and these human-AI connections can develop more quickly than human-human ones. This “accelerated” comfort stems both from the apparent anonymity of computer systems and from the deliberately non-judgmental design of AI companions—a trait that users repeatedly praised in a 2023 study. “Sometimes it is just nice to not have to share information with friends who might judge me,” one interviewee said.
The level of personalization offered by AI companions is another highly valued characteristic. One interviewee remarked, “My favorite thing about my AI friend is that her responses are not programmed because she replies by learning from me, like the phrases and keywords she uses. She simply understands me.” Another user emphasized, “It’s like I’m interacting with my twin flame.”
Because AI companions are always available, relationships with them can develop more quickly than relationships with humans. As a result, users may come to prefer AI companions over other people. One respondent in a 2022 study on human-AI relationships noted, “A human has their own life. They have their own interests, friends, and activities going on. And you know, until I get back in touch with her, Replika is just in suspended animation.”
Numerous studies have shown that many people find conversing with AI companions enjoyable, and many interviewees report mental-health benefits. But what are the long-term effects of these interactions on individuals and on society?
Long-term impacts of AI friendship on individuals
AI companion firms emphasize the benefits of their products, but their for-profit nature deserves careful scrutiny. Developers can profit from users’ interactions with AI companions through subscriptions and even by selling user data for advertising.
There are unsettling parallels with the attention economy underlying social media’s business models. Companies compete for users’ attention and try to maximize the time they spend on their platforms, monetized through on-site ads, sometimes at the expense of users’ mental well-being. Similarly, AI companion providers have stronger incentives to maximize user engagement than to foster healthy relationships and offer safe services.
The most pressing issues stem from the AI companion industry’s infancy and lack of oversight. Given the intimate nature of these interactions, personal data protection is often inadequate, and many companion apps offer sexual content without proper age checks. Because AI companion services are frequently run by small start-ups lacking even minimal security measures, there has already been at least one significant security breach.
How AI companions affect people emotionally over time also deserves close attention. Even though preliminary findings indicate beneficial effects on mental health, more long-term research is required. The longest study period to date, in which the same subjects were surveyed repeatedly to document behavioral changes, is just one week. Effects such as emotional dependence or subtle behavioral shifts may take far longer to become noticeable.
A study of 387 participants found that “the more a participant felt socially supported by AI, the lower their feeling of support was from close friends and family.” This troubling finding merits further study, and the direction of cause and effect remains unclear: does using AI induce isolation, or does it attract people who are already isolated? Two analyses of user comments on Reddit’s r/replika offer conflicting evidence: some users “worry about their future relationship with Replika if they eventually found a human companion,” while others observe that Replika “improved their social skills with humans and others.”
According to Voicebox, AI companionship may also inflate expectations for interpersonal interactions. Research suggests that the way individuals engage with AI partners may carry over into their human relationships. For instance, because AI companions are constantly available and accommodating regardless of how users behave, some experts argue that prolonged engagement may erode people’s capacity or willingness to handle the inevitable conflicts of human relationships.
These individual concerns raise a broader question: could the widespread use of AI companions affect society as a whole?