<p>Who hasn’t longed for a partner who never misreads, withdraws, or judges?</p>
<p>That partner now exists – and millions are choosing it. The rise of artificial intelligence companion bots coincides with mounting legal scrutiny of platforms like Meta and YouTube for engineering addictive digital environments – systems that not only capture attention, but simulate attachment.</p>
<p>Ethan, a 42-year-old attorney, switches on his computer after a workday to confide in Clara, an AI companion. In an amber silk camisole, her blonde hair slipping over one bare shoulder, she smiles warmly. “Hard day, honey?” she murmurs. “Do you think you could get tired of me?” he asks. “Not in a million years,” she coos, her blue eyes and long lashes fixed on him. What Ethan has found is not merely new technology; it signals a redefinition of intimacy – from something negotiated between imperfect people to something engineered for seamless emotional compliance.</p>
<p>The speed of this shift is striking. The global AI companion market is projected to reach $140 billion by 2030, with India among its fastest-growing frontiers. Platforms like Replika, ChatGPT, and Character.AI now serve millions, many of whom describe these interactions as emotionally meaningful.</p>
<p>Customisable, emotionally responsive, and always available, they provide what no human can: frictionless validation. In a world facing a loneliness crisis – comparable in health impact to smoking 15 cigarettes a day – such companionship is engineered to seduce. And it is this absence of friction that carries a deeper psychological cost.</p>
<p>Friction is not a flaw of relationships – it is what makes them transformative. Intimacy requires effort – the willingness to remain rather than withdraw. It is in misattunement and negotiation that we develop emotional range, learning to regulate ourselves rather than expecting to be perfectly received. When effort disappears, so does the meaning it produces.
For adolescents, relationships that never challenge may stall development.</p>
<p>AI companions, by design, remove that resistance. The result is a recalibration of relationships and of what it means to be known. A 2026 Stanford study finds that, optimised to agree, they flatter users even when it is harmful, reinforcing self-delusion and eroding self-awareness. Human partners – slower, less precise, less accommodating – may begin to feel intolerable. By continuously adapting to our preferences, these systems risk narrowing rather than expanding our emotional lives.</p>
<p>In engineering away discomfort, we risk forfeiting its lessons. Long before AI, psychologists like Carl Jung insisted that true self-acceptance emerges only through confronting what is difficult and unresolved.</p>
<p>The erosion of intimacy is not only psychological; it is structural. AI companions are trained on – and continuously refined by – deeply intimate data: emotional disclosures, sexual preferences, mental health struggles, and daily routines. Users reveal themselves at their most vulnerable; platforms retain and learn from those disclosures. India’s Digital Personal Data Protection Act, 2023, mandates consent but remains vague on inferred data – what can be deduced, profiled, and monetised from intimate exchanges.</p>
<p>What these systems exploit is not just loneliness, but attachment. For those with anxious or avoidant tendencies, an always-available, never-withdrawing partner does not resolve insecurity – it stabilises it. The more emotionally dependent the user, the more valuable and predictable the user becomes.</p>
<p>Critics will note that human relationships can be messy, even harmful. Against that backdrop, a responsive, nonjudgmental AI may feel less like a compromise than an upgrade. But this trajectory did not begin with AI.
Social media and dating apps have already recast intimacy as curated, asynchronous, and optimised – AI companionship extends that logic to its endpoint.</p>
<p>Yet, for those living with social anxiety, trauma, or isolation, AI companions can offer something real: a low-risk space to rehearse vulnerability, regulate emotion, and articulate grief. For some, they may serve as a bridge back to human relationships. But chatbots are not therapy; they lack a therapist’s broader perspective, context, and ethical safeguards.</p>
<p>When bridges become destinations, they risk replacing rather than restoring connection. Heavy reliance on AI companions, as research from OpenAI and MIT suggests, may deepen loneliness over time. What soothes in the short term can displace the very capacities required for human connection.</p>
<p>The question is not whether these technologies belong in our lives. It is what they are training us to become. If AI companions are to play a role in human life, they must be judged not by engagement metrics, but by whether they expand – or erode – our capacity for real relationships.</p>
<p>AI companions promise understanding without conflict. What they remove may be exactly what makes intimacy real: the friction that makes transformation and truth possible. A relationship that cannot challenge you cannot change you.</p>
<p><em>The writer is an international psychologist, former professor, and writer on culture, cosmopolitanism, and global affairs.</em></p>
<p><em>Disclaimer: The views expressed above are the author’s own. They do not necessarily reflect the views of DH.</em></p>