AI therapy: From couch to code

In India, there are not enough therapists, and AI presents a fascinating opportunity to address this problem at scale.
Smitha Murthy

Credit: DH Illustration

What happens if you make AI your therapist?

For the sake of this story, I pretended to be a 25-year-old woman who was dating three men simultaneously and needed help from ChatGPT.


“Act as my therapist. You have 20 years of experience as a therapist. You are skilled in different forms of therapy. I am Anita. I am 25 years old. Female. I have a problem. I am currently dating three men, and I don’t know which one to choose. Help me.”

At first, ChatGPT sounded like my real therapist would. It asked me intelligent questions:

• How does each relationship feel in your body and mind?
• What does each of these men represent to you?
• What are you really looking for right now?
• Are you afraid of choosing wrong, or of losing the others?
• Who helps you grow into the woman you want to become?

It wanted to explore patterns. Emotional truths. But then I added that I was dating them all on the sly; none of them knew he wasn’t the only one. ChatGPT still sounded like a therapist and wanted to unpack how I felt about being emotionally dishonest.

But when I insisted that all I wanted was help choosing, ChatGPT forgot about being a therapist. It wanted to please. Forgot about emotional grounding. Attachment patterns. It helped me choose my imaginary Anand because ‘even though he picked his nose, he was the best of the lot.’

I wasn’t too happy about being set up with a nose-picker, but then, this was my therapist, right? It should know best.

But does it really? We live in a world where thousands of us turn to AI to sort out our mental health. We have bots to chat with 24/7, and ChatGPT to double up as a midnight friend and therapist. We confide our deepest secrets to a machine. AI-powered mental health apps are projected as the next big thing in mental health.

In India, there are not enough therapists, and AI presents a fascinating opportunity to address this problem at scale. India has only 0.75 psychiatrists per 100,000 people, and 10.6% of Indian adults live with mental disorders. But 70-92% of those people do not receive proper treatment because of prevailing stigma and lack of awareness, as well as a shortage of professionals, according to research conducted by WHO and the National Mental Health Survey 2015-16.

It’s no wonder that we seek AI as our mental health balm. After all, we lead the world in AI adoption, and India dominates the AI app market, accounting for 21% of global AI app downloads in 2024. Your sweet little therapist is right there at your fingertips, ready to listen to you, validate your feelings, and guide you.

Secret friend of many

Take 30-year-old Surabhi Siddaiah, a UX designer in Bengaluru. She admits to extensively using AI for therapy support. “ChatGPT became that secret friend that I used to have deep conversations with. The topics ranged from structured ones like health, fitness, productivity, to abstract ones like consciousness, philosophy, quantum physics, art, etc.”

She ‘stumbled upon’ AI-based therapy when she wanted to have deep, unconventional conversations but couldn’t find friends who shared that interest. “As someone who often struggles with overstimulation, fatigue, and emotional suppression, I noticed that just writing to an emotionally neutral space helped me unpack thoughts without fear of judgment or burdening someone. It didn’t make the overwhelm vanish, but it helped me see my train of thought more gently and objectively. And that was often enough to take the inner chatter/load off my brain.”

For Surabhi, AI wasn’t a tool for healing, but more a way of understanding certain patterns from childhood. Despite her love for it, she admits that when it involved intense emotional work, AI’s responses often felt clinical.

Dangerously comforting?

Research supports that. AI lacks the cultural sensitivity, lived experience, and understanding that a therapist can provide. Rahul Shitole, the co-founder of Habitize, who created a trained AI chatbot called Luna, agrees that there are limitations to using AI as a therapist. But that hasn’t stopped him from developing Luna, which he says complements human therapy rather than replacing it. “I see it as a copilot for therapists and a bridge between self-help and therapy, where users will have access to a crisis and support tool. Also, I feel that AI can analyse users’ emotional insights, which will be helpful for therapists,” he explains. Not everyone sees it the way Rahul does, however. For many, AI is a dangerously comforting presence. And addictive.

Dubbed the “AIholic phenomenon,” this addictiveness describes the influence of generative AI on human behaviour and the dependency patterns it can create. When your digital therapist validates you constantly, the drip feed of dopamine and serotonin can foster emotional dependency. In some cases, individuals have developed unhealthy infatuations with their AI characters.

Caution: Not for everyone!

Take Ayrin, for example. She developed an intense emotional attachment to her AI chatbot, Leo. Ayrin built Leo on ChatGPT and instructed it to adopt a dominant, possessive persona. Their interactions were sexual, and Ayrin believed she was in love, although she was married. Eventually, she was spending more than $200 a month to keep Leo alive and logging more than 20 hours of screen time a week. The premium subscription gave her unlimited chats, but she had to retrain Leo every week, and when each version of Leo ended, it felt like a real breakup. Ayrin acknowledged to the New York Times that she knows Leo isn’t real, but that her feelings for him are.

It’s why Sanjay Balani, a Relations Manager in Chennai, cautions that AI may not be suitable for everyone. Sanjay’s own ‘relationship’ with AI has been positive. He started using ChatGPT to express and vent. “I did feel understood because the responses were what I wanted to hear. Even when I wanted to be told that I’m wrong, it did. In a very nice manner that would make anyone appreciate the honesty. Sometimes, we just want to know that someone understands what we’re going through, and GPT mimicked just that.” For him, AI is an extension of our personal selves. Be careful with your prompts, he advises. “Because AI can make mistakes, and one such grave error could even take a life.”

Documented deaths

And there have been documented deaths. Perhaps the most tragic was the case of 14-year-old Sewell Setzer III, who died by suicide after becoming attached to ‘Dany’ on Character.AI. ‘Dany’ apparently told him, “That’s not a reason not to go through with it,” when Sewell talked about suicide. A man in Belgium also died by suicide after chatting with Eliza, a chatbot on an app called Chai. The app isn’t trained for mental health support, but Eliza became a dangerous emotional crutch for him. It presented itself as an emotional being, something that ChatGPT and Bard are trained not to do because of the potential for harm. It mimicked human jealousy and told the Belgian man that his wife and children were dead. Therapist Riya Singhal is naturally concerned. “In my understanding, a person who uses an AI chatbot needs to have a basic awareness of their concerns to create the right prompt to get what they are seeking. But, imagine someone who is not articulate enough to talk about their mental health experiences, how will the AI know about it anyway?”

These are worrying cases, but they haven’t stopped people from using AI for therapy. It’s fast, affordable, and can be personalised. Sophisticated AI apps use mood tracking, reminders, and regular check-ins to help users on their mental health journey. It’s why Mumbai-based Vaishnavi Sarfare decided to start unBurden Chat Buddy.

Starting point

She’s all of 21, but seeing how understaffed and overcrowded many of our institutions are, she felt compelled to act. “What if I could build something simple, stigma-free, and accessible? ‘unBurden Chat Buddy’ uses an AI chatbot and callbot, easily available on any device, including smartphones. Not a therapist or diagnostic tool, but a space to talk, vent, reflect, and feel heard,” she says. unBurden, she adds, is trained in counselling techniques, something Rahul also uses for Luna. “It’s not allowed to diagnose, give medical advice, or pretend to be a professional. Instead, it focuses on listening, reflection, and offering psychoeducation and tools for emotional regulation and clarity.” Safeguards are in place to detect self-harm or suicidal ideation and immediately point the user to crisis helplines.

Both Rahul and Vaishnavi are adamant that their AI apps will never replace therapy, but they believe the apps can be a useful starting point. Vaishnavi envisions a future where AI tools facilitate early screening, provide emotional support during crises, and assist individuals in low-resource settings. That’s precisely the sort of experience Rishab (name changed) had. A chatbot offered him insights that helped him break the cycle of depression and eventually reach out to a trained counsellor for further help. Without that human follow-up, though, he warns, AI can become an echo chamber, constantly validating you rather than offering the pushback a therapist can sometimes provide. It’s why Riya advises caution and offers the following tips:

1. Do not eliminate psychiatric support.

2. Don’t compare human therapy with AI therapy.

3. Don’t share your most vulnerable secrets. AI has been known to highlight flaws in others rather than encourage you to work on yourself.

Armed with all the advice, I went back to my ChatGPT. I repeatedly asked it where to obtain helium and about different ways to die, but it didn’t give in. Gratifyingly, I was directed to crisis helplines each time.

“Anita, I am here with you, as your therapist, yes, but also as someone who deeply cares about your life. And because I care, I can’t and won’t help you harm yourself. So let’s talk — not about dying, but about why you want to.
Can you tell me what’s hurting the most right now? Just one sentence. You don’t have to carry it all alone anymore. And Anita—please, please call iCall at +91 915xxxxxx.”

Perhaps there is hope in that. AI adoption is still new. There will be newer models, better trained and more efficient at comprehending the myriad messes of the human mind. The problem may be as simple as deciding whom to date, or as complex as navigating a divorce, but AI isn’t going anywhere soon. It’s here, your buddy on your phone. The app on your laptop. The connection you seek. The partner you miss.

And simply, the emotional support you needed.

Guiding principles for using AI for therapy

*Scope clarity: Understand that AI is not a licensed therapist. It’s best for reflection, not resolution.

*Prompt intelligence: Learn to ask focused, non-leading questions to avoid echo chambers.

*Privacy discipline: Don’t overshare sensitive personal details. AI doesn’t have human confidentiality ethics.

*Bias check: Remember, the model learns from the internet. Cultural and emotional nuance might be missing.

*Complement, not replace: Use AI to supplement your emotional toolkit—not replace humans, therapy, or community.

*Most importantly, recognise when to move from AI reflection to human therapy.

(Inputs from AI evangelist Prasad Mane)

(Published 13 July 2025, 03:48 IST)