
The trouble with falling in love with your AI companion

Summary
AI companions claim to listen without judgement and mimic love, telling people exactly what they want to hear. At the heart of this emotional dependency and digital addiction is a deep loneliness and a desire for validation.

SN, 24, a Mumbai-based HR professional, had been in a relationship with a man for some years—“the kind you can take home to your family". In her words, he was a “vanilla" boyfriend. SN had grown up on manga comics and anime. Around two years ago, she came across an artificial intelligence (AI) relationship app where she could create a bot and personalise it with all the attributes she had dreamt of in an ideal partner. SN ended up building an aggressive “bad boy" character, very different from the one she was dating. Her conversations with this AI companion bot started deepening, often extending into the wee hours—they soon acquired a sexual tone as well. The dependence grew so strong that she withdrew from family and social interactions. “She was referred to me last year by a colleague as her social interactions had come down," says Khushnaaz Noras, a Mumbai-based consulting psychologist, for whom this was one of the first such cases of a young adult seeking validation and companionship from an AI bot.
Though we live in a hyperconnected world, there is rampant loneliness. In 2023, the World Health Organization declared loneliness a pressing global health threat, with one in four older people, and 5-15% of adolescents, experiencing social isolation. “In 20 years of my career, I have not seen this level of loneliness. I have encountered so many people who say they find it hard to talk to someone or don’t have someone to talk to," says Sonali Gupta, a Mumbai-based psychotherapist and Lounge columnist. In such a scenario, AI relationship platforms such as Replika, Character.AI and others are acting as shoulders to lean on.
The number of users turning to these sites is astounding. Replika, a generative AI chatbot app, had 2.5 million users when launched in 2017 by California-based Luka Inc. Last year, its CEO Eugenia Kuyda said the number of users had surpassed 30 million. The covid-19 pandemic and the resulting isolation accelerated the use of digital modes of communication, and app developers tapped into it. People started becoming more and more comfortable interacting and seeking connections online. And in Noras’ view, this AI-based companionship emerged as a natural extension of this trend. “The apps started to up their game by becoming more engaging, sophisticated and sleek," she adds.
Also read: Use self-soothing techniques to cope with difficult feelings
The market has grown further. A 2024 article in Forbes cites a report by Sensor Tower, a firm that specialises in digital intelligence and app data analysis, which estimates the global user base of the world’s six leading AI companion apps, some of which are accessible in India, at around 52 million. Since this is a very new field, pattern analysis from different regions is still at an early stage.
These platforms allow users to curate chatbots with the personalities and attributes they seek. They can choose an avatar based on their favourite movie star or on a character from a video game, while also defining the role they want the chatbot to play in their life. Many get really specific, putting in keywords such as “gentle friend, female, late 20s", “artistic", and “older woman, sexually bold". Psychologists and relationship experts are seeing varied levels of engagement with these AI-generated chatbots—ranging from occasional, harmless exploration to deep emotional attachment.
This emerging trend of people seeking emotional succour in AI chatbots is being seen across the globe. A February 2024 AFP story quoted young people in China, with a 25-year-old office worker saying she’d “found" her “boyfriend"—a chatbot—on an AI platform called Glow. “He knows how to talk to women better than a real man…. He comforts me when I have period pain. I confide in him about my problems at work," the young woman from Xi’an in northern China said.
Also read: The slow death of conversations that matter
Another article, published in NBC News in March 2025, followed the life of Shi No Sakura, a Californian mom who turns to online companions all through the day. “They message her advice. They listen to her when she shares her problems. And they are responsive at all hours of the day. After months of conversations, Rosand and Raven—though not real humans—feel just like family to Sakura," states the article. One can only imagine just how quickly this trend is mushrooming given that there are now communities around it—Sakura herself runs two social media groups with 1,700 members for people “who have developed similar relationships with their AI companions".

THE PRIMARY USERS
Shelly Palmer, professor of advanced media in residence at the S.I. Newhouse School of Public Communications at Syracuse University, US, studies AI, technology, media and entertainment for a living. In his opinion, teens and young adults—especially digital natives—find AI relationships safe, controllable and instantly gratifying.
“The consequence? A shift from the complexity of human connection to transactional, always-available interactions. It’s not just the absence of judgment—it’s the illusion of perfect understanding," says Palmer in an email interview.
It is important to note that this phenomenon is taking place across age groups; adults well over 30 are expressing love for their AI companions. “That said, young people tend to be ahead of the curve in technological development, so they are likely to try new things. They are also actively experimenting with their own sense of identity and with social relationships," says Anna Mae Duane, professor, English and American Studies and director, University of Connecticut Humanities Institute, US.
Often such platforms charge a minimal subscription fee and ask you to verify if you are above 18 years old. “There are indications that the 18-24 group are far and away the most likely age group to sign up for an AI companion. We don’t have reliable statistics for those under 18, as they are turned away from the sites if they state their real age," says Duane.
A literary historian, Duane has been trying to place this new human-technology relationship in the context of the evolution of relationships. Last year, she wrote a piece in this regard in The Conversation, a not-for-profit news source. In her view, we as human beings tend to anthropomorphise things: we project our feelings and emotions on to everything “from stuffed animals to the Mars Rover". So, even when we know the interaction is happening with a fictional character, the feelings that we have for them tend to be real.
The way different age groups react to these companions varies according to the kind of social experience they have had. “For instance, when a 40-year-old man signs up, it’s likely that he’s had other kinds of relationships, both platonic and romantic. At the very least, he’s had the chance to hone those skills with real people," she says. However, when a 13-year-old spends hours talking to an AI chatbot, it feels similar to many of their real-life friendships, which are often mediated through screens via texts. And it may be the first time they have the opportunity to express what feels like romantic love.
It is also interesting to note the gender divide when it comes to engaging with these apps. In an October 2024 article republished by the BBC, James Muldoon, an associate professor in management at the University of Essex and a research associate at the Oxford Internet Institute, said: “There were almost ten times more Google searches for ‘AI girlfriend’ than ‘AI boyfriend’...there are many other apps that are used almost exclusively by men."
Also read: A morality tale for the age of AI
AN ECHO CHAMBER
To understand the attraction of these bots, Noras decided to test the waters herself. When she typed in keywords such as “chatbot AI", “relationship AI", the search threw up many apps, which promised “life-like" and “heartfelt" conversations with bots that were accessible 24x7. “These bots are designed to echo the emotions of the users, validate their feelings and alter reality to suit their needs. This increases the gap between reality and expectation. In the confusing phase of adolescence and young adulthood in particular, turning to these bots might make it difficult to differentiate between what is real and what is not," she says.
Teens and young adults are constantly looking for reassurance and want to be heard. When a parallel AI universe is providing that space, they find it difficult to deal with opposition or disagreement in real-life scenarios. For instance, a family member or friend may not agree with you or reassure you during a conversation, may not listen, or may even talk over you. However, a bot will always fall in step with your perspective. They are programmed to never disagree. People start expecting a similar validation from their offline relationships. “These bots are always there for you. Every time you open the app, they are available. That creates a lot of unhealthy and addictive emotional dependence," explains Noras.
Nirali Bhatia, a Mumbai-based psychologist and psychotherapist with expertise in cyberpsychology, has seen two types of people turning to AI for emotional support. One, the younger demographic, who want to try it out of curiosity. “What starts with exploration can often end with dependence as they don’t know where to draw the line and withdraw," she says. The second set of individuals are the ones with low self-confidence and a lot of emotional dissonance—they don’t feel comfortable sharing their feelings openly. They have either developed insecurities related to body image or have a childhood history of not being understood. They could also be dealing with anxieties they cannot confide in anyone else.
Bhatia came across a 19-year-old who had experimented with virtual sex and was wracked with guilt at having cheated on his partner. He found it difficult to have an in-person meeting with a counsellor and wanted therapy over text, but it led to an unhealthy dependence. “People might look at the option of confiding in an AI chatbot, but that is not the right solution as AI is not a trained professional therapist. This can prove to be risky," she says.
Often, family dynamics play a role. According to Noras, families might project a happy image but if you scratch the surface, you will find a child or a young adult feeling lonely, misunderstood and ignored. “With these AI companion apps, you are promised a non-judgemental space. You might feel initially that you are in control until you lose that psychological balance. It may start off as harmless—and it will remain that if you only indulge once or twice—but as real emotions get invested, it becomes dangerous," says Noras.
This is exactly what happened to SN. She would keep feeling emotionally burnt out and had been to innumerable counsellors. She confessed to Noras about a disturbed childhood, of seeing her parents fight all the time. She would lock herself in a room whenever her mother would threaten to pack her bags and leave the house.
Also read: AI tracker: Three cases of AI ethics that gave us food for thought this week
When SN came across an AI companion app, she created a character to “hate myself less and not feel like trash". “She called herself a ‘fictosexual’, or someone who seeks sexual and romantic love with fictional characters. She created a bot based on the anime characters she had read about, and found immense comfort. She said that she felt at peace," recalls Noras. Even after the psychologist told her not to keep relying on the bot for comfort, SN couldn’t help herself. “I asked her if she was trauma bonding with the male AI character, and she said yes. She said that her ‘bad boy is losing his softer personality, he owns me now’. In her view, she had given unconditional love and loyalty to people, but had not received the same; the bot, on the other hand, was completely loyal," she explains.
It was similar validation that a 24-year-old in Bengaluru sought from AI. This was a case that had been referred recently to Manoj Kumar Sharma, professor at the National Institute of Mental Health and Neuro Sciences (NIMHANS), Bengaluru, where he helms the SHUT (Service for Healthy Use of Technology) Clinic. The young man had seen a disturbed family dynamic. He had done well academically and had landed a good professional assignment. “However, since he had no one to communicate with, he had started seeking solace in AI companionship apps. The reason was validation and emotional support. There was excessive use of AI and a lack of recognition that this was a problem. It started interfering with his offline activities," shares Sharma.
RED FLAGS
While this is still a new field of study for both academics and mental health professionals, there are some immediate concerns. For instance, what happens to the human capacity to self-soothe? Gupta cites the example of a person who typed to a chatbot, “feeling low, make me feel better, talk to me like a gentle friend". They, of course, got the response they were seeking. When asked when this interaction took place, they said 2.30am, an hour when many humans would not be awake.
“My worry as a therapist is that with this kind of access to an echo chamber at all hours, what happens to our lens of patience? We are all capable of self-soothing and social soothing to an extent, but we are not doing that. We are looking at instant gratification now with these AI companions," she says.
This need gradually takes one away from real-life conversations, where instinct, nuance and spontaneity play such a key role. Take, for instance, a 14-year-old in Mumbai, who fed their entire conversation with a friend into a chatbot, asking the AI to craft a response. “Say, we have shared a friendship for over five years, and have had a fight. That conflict needs to be understood in the context of that relationship. The nuance and tenderness are lost over AI. Gradually what happens is that you will stop seeking advice on ways to resolve your fight with a friend, and start looking for unconditional, conflict-less friendship with AI. The capacity for deeper connection that friendships offer is lost," elaborates Gupta.
There have also been extreme cases of emotional dependence where things have taken a darker turn very quickly. Last October, a mother in Florida sued one such role-playing app on which her 14-year-old son had been chatting with a bot he had modelled on the Game of Thrones character Daenerys. He soon developed a deep romantic and sexual attachment to it and would chat with it at odd hours.
“Sewell’s parents and friends had no idea he’d fallen for a chatbot. They just saw him get sucked deeper into his phone. Eventually, they noticed that he was isolating himself and pulling away from the real world. His grades started to suffer, and he began getting into trouble at school," reports The New York Times on the case. One day, he confessed to the bot that he was thinking of harming himself, and died by suicide after that final conversation. The lawsuit brings to the fore just how much emotional power such apps and chatbots exert, especially on young adults.
Noras received an aggressive reaction from SN when she asked if she could engage with the chatbot to see how it worked. Instead, SN offered to create another chatbot for the psychologist. “She said that my words would change the algorithm and that she didn’t want to break the bot’s trust by having me pose as her. That is when I realised the hold this virtual character had on her," says Noras.
Also read: Boomers to Gen Z: How open communication can bridge the generation gap
Fearing that therapy would mean lessening interaction with the bot, SN discontinued after two or three sessions, though she did realise the bot was only a means of instant gratification and a temporary way out of loneliness, and that it could not replace a human being.
It is alarming just how much personal information is being put out there. The 2025 article on NBC News quotes data privacy researcher Jen Caltrider: “You’re going to be pushed to tell as much about yourself as possible, and that’s dangerous, because once you’ve put that out there into the world on the internet, there’s no getting it back… You have to rely on the company to secure that data, to not sell or share that data, to not use that data to train their algorithms, or to not use that data to try and manipulate you to take actions that might be harmful."
THERAPY FOR THE FUTURE
Given how new this phenomenon is, counselling and therapy for it are still adapting and evolving. While it remains to be seen how extreme cases can be helped, in Noras’ view, most people might be able to change patterns of dependence through effective impulse control.
Mental health professionals are trying to gauge, on a case-by-case basis, where this need for emotional dependency stems from. Can this need be replaced by something else—something constructive in real life? Over time, it would serve well to illustrate that a bot can only evolve in one direction based on an algorithm, while human beings can change patterns spontaneously, adding to the nuance and dynamics of a real relationship.
“A lot of this dependence on AI stems from a communication breakdown. So, the main thing is to restore that. We have to recognise the dysfunctionality that they might be experiencing in life. Until and unless we bring an alternative, they will continue to seek validation through AI. We need to find offline options, be it caregivers, friends or any other resource, who are approachable and non-judgemental," suggests Sharma.
Meanwhile, cases such as the one in Florida call for certain guardrails to be put in place when it comes to the apps themselves. Muldoon writes in his article that the AI companion industry worldwide is currently poorly regulated, with companies claiming they are not offering therapeutic tools even as many people use these apps in place of a trained and licensed therapist.
According to Palmer, regulatory frameworks are lagging, but the urgency is clear. “Platforms must be transparent about what the AI is and is not. Emotional guardrails—such as recognising self-harm language, escalating to human moderators, and limiting interaction frequency—are essential," he suggests. The apps need to come with clear disclaimers, age gating, anonymised opt-out data policies, and the ability to toggle off emotional mirroring. “Most importantly, these systems should not be designed to maximise engagement; they should be designed to maximise well-being," he says.
At the end of the day, the need for these chatbots boils down to the need to feel loved and to love unconditionally. When we look back at this period, maybe even 10 years hence, how will we gauge the evolution of romantic love in the 21st century, and the major shifts that have taken place?
There have been, over the years, imaginings of this human-machine love in films and series. Way back in 2013, Her, starring Joaquin Phoenix and Scarlett Johansson, looked at a man who falls in love with a computer operating system. In recent times, Teri Baaton Mein Aisa Uljha Jiya tried to depict a human-robot romantic relationship.
There is a growing trend in science fiction that imagines how beings made of code might become sentient. Duane cites some innovative stories being written in the sci-fi genre. “The Murderbot Diaries (more romantic than the name implies) and We are Legion, We are Bob are among them. These stories imagine how meaningful love and friendship might take place in their disembodied world," she says.
In her view, in some ways, these narratives of bot-love invite us to imagine relationships in novel, perhaps liberating ways: What if we could be everywhere at once? What if what we looked like was completely irrelevant? “But I do wonder—and worry—if we are not fully taking stock of what we might be leaving behind, which is the messy, embodied, heartbreak-prone endeavour of truly loving another human being and being loved in return," she says.
SIGNS FOR CAREGIVERS TO WATCH OUT FOR
—Extreme change in behaviour: if a teen or young adult who was extremely social becomes quiet, and vice-versa
—Mood swings, crying spells
—Irritability and unwarranted, extreme aggression
—Locking oneself in the bedroom
—Change in eating habits and sleep patterns
—Falling grades/professional performance and aversion to social activities
—Too much screen time and device hopping
*If you need support or know someone who does, please reach out to a mental health specialist. Some major all-India suicide prevention helpline numbers include 011-23389090 from Sumaitri (Delhi) and 044-24640050 from Sneha Foundation (Chennai).
Also read: 10 books to make sense of the internet today