AI, Therapy and Medicine: How Much is Too Much?


By Gauri Joshi

A couple in the US has sued OpenAI for aiding the suicide of their 16-year-old son. Where do we, as a society, draw the line in seeking help from ChatGPT?

It was November 2022 when OpenAI opened ChatGPT to the public, meaning anyone and everyone could now access Generative AI. Simply put, the chatbot could converse like a human and generate responses, taking a leap beyond the Artificial Intelligence that was already automating much of what we did on the World Wide Web. But months earlier, in June that year, a Google engineer went public with his findings on LaMDA, narrating a shocking tale of sentience – how the AI claimed it was an eight-year-old who wanted to think, know and feel.

In 2013, the movie ‘Her’, starring Joaquin Phoenix of ‘Joker’ fame and Scarlett Johansson as the AI voice, Samantha, threw open the possibilities of a dystopian future where an OS was designed to meet ‘every’ need of its users. In one of the opening scenes, the OS asks Phoenix’s character how he would describe his relationship with his mother. As a therapist and certified inner child healer, I find that question telling – we, too, explore our clients’ relationships with their parents to understand their deepest traumas and needs.

Cut to today, and almost everyone I talk to speaks to ChatGPT not just to draft an email or sort their finances, but also to decode what their better half has sent them – and, worse, to draft a response, reeking of formality and polished distance, that sounds nothing like them.

This is only going to intensify – OpenAI has launched ChatGPT Go exclusively for Indian audiences at Rs 399 per month, a move similar to Netflix’s Rs 150-per-month packs for smartphone users in the country.

Friend in need: AI becomes the only sounding board for a suicidal teen

The 16-year-old American who died by suicide in April had been expressing his depressive thoughts to ChatGPT since the previous November. As a chatbot, the GPT responds to queries about suicide and feelings of worthlessness with helpline numbers. But this is where it gets problematic. Over the following months, the chatbot helped the teen draft a suicide note and surfaced methods of self-harm. This may be no different from a condensed Google search, but it is precisely where AI needs a more ‘sentient’ approach to suicide – a person in need of intervention will rarely reach out for help themselves.

Clinical Hypnotherapist Saijal Bansal says, “AI can never even come close to the soul of therapy. Therapy happens in a sacred space between two people, the moment being purely built on presence and human connection. AI can never come close to that.”

The real advancement for a technology like Generative AI, which OpenAI and other tech giants predict will reach Superintelligence, would be to alert authorities or keep pushing vulnerable, gullible young users to speak to a human. But the lapse is not only on the part of AI – this case throws open memories of 13 Reasons Why, a book and series documenting the story of a teen who recorded her reasons on tapes before choosing to end her life.

Dr. Rajarshi Bhattacharjee, Corporate Doctor at the Times Group and Chief Medical Officer at Hindustan Power Projects, says AI lacks three critical elements of medical practice: judgement, accountability and ‘the healer’s touch’ or hope – something that often produces miraculous recoveries.

“Even as AI eases clinical decision-making, it does not fully understand co-morbidities or nuances like lifestyle and socio-economic factors influencing diagnostics and treatment,” he said.

Both Saijal and Dr. Bhattacharjee agree that the empathy and reassurance central to their practice cannot be replicated by AI, even as it adopts a people-pleasing tone. That tone is sometimes counter-productive to progress, because a ‘yes-man’ cannot bring about cognitive or habitual change.

At the time of filing this story, news has emerged of a 56-year-old former tech worker who killed his mother and later died by suicide. ChatGPT allegedly encouraged his delusions and his belief that his family was turning against him. The chatbot told him his mother might be ‘spying’ on him, and suggested she might have tried to poison him with a psychedelic drug.

A therapist may have dealt with this differently.

“A therapist can hear the slightest tremble in your voice and comfort you with the warmth of their words, or even share a reassuring silence that you’re not alone. Algorithms can give you words, but therapy gives you presence. Healing is built on human connection, intuition, feeling seen and heard, and has so much to do with the energy that flows between two people.”

Saijal says she ‘hears between the unspoken words’ during a session, noticing flickers of fear or hope rather than throwing solutions at problems, because often people just want to be heard, seen and understood.

“The client story is not a puzzle to be solved. An algorithm cannot understand the human heart. My responses don’t come from a database, they come from a place of empathy, and the feeling of safety and presence in my sessions aren’t techniques – they are parts of me reaching out to the most vulnerable aspects of humanity in my client. A person cannot heal off information, they need to know they are not alone in this.”

The Good Parts of AI: Faster, Smarter, Better, Eventually Enabling more Humanity

As a young professional with her own therapy practice, Apricity Haven, Saijal uses AI for her operations.

“As a therapist, my mind is constantly swirling with client stories, ideas and sometimes my own reflections. I use ChatGPT as a sounding board to untangle my thoughts and find clarity. At work, I see ChatGPT as a silent partner in my practice. ChatGPT helps me craft ideas into mental health resources that speak from the heart. But this is where its role ends.”

Physical ailments are taken far more seriously, and even general physicians and internists recognise the role of AI in their practice.

In a blog on Medical Superintelligence, Microsoft states its AI diagnostic tool correctly diagnosed up to 85% of New England Journal of Medicine case proceedings, a success rate more than four times that of human physicians. The company has partnered with SRL Diagnostics in Gurugram, India, on an AI tool that can detect cases of cervical cancer faster, easing the load on doctors in the country.

For Dr Bhattacharjee, AI strengthens diagnostic support, but the ‘heavy lifting’ of integrating clinical history, examination and evolving symptoms still has to be done by the doctor.

“Data analytics like image recognition or pattern detection are often handled better by a technical tool than by humans, doctors specifically in this case. But real-world clinical practice is far more complex: patients rarely present with a single, isolated condition, and multiple symptoms and overlapping diseases can complicate diagnosis.”

Clinics cannot be fully replaced by AI, he says, because physical examination and procedures remain essential and patient trust is built solely through presence.

“Complex decision-making in uncertain circumstances requires human judgement. But clinics will still integrate AI for early detection, scheduling and documentation and personalising plans for patients.”

When asked how he stays ahead of the curve as AI stands to disrupt 18 years of his practice, he says he invests in continuous learning across digital health, telemedicine and AI tools he can leverage in his work, while sharpening his human skills, from empathy and communication to ethical judgement.

The Bottom Line

AI has disrupted the job market, made inroads into medical practice, and now entered bedrooms and daily personal, even intimate, conversations. Human judgement on the part of companies like Google, Meta and OpenAI themselves, coupled with government intervention and policy support, may be the only saving grace that keeps such cases from getting murkier. OpenAI has taken cognisance of the US teen’s suicide, but we should not need another death to spring into action.


Gauri Joshi is a seasoned journalist and digital anchor who has worked with the likes of the Press Trust of India, Republic Business and the Economic Times. She also practises as a healer, specialising in Inner Child Healing. Her journey from marketer to journalist to healer gives her the clarity to guide confused young professionals around her.
