Ananya Sethia, 33, a Bengaluru-based software programmer, confesses that she has been hooked on ChatGPT ever since it launched in late November 2022. “Honestly, it doesn’t feel like you’re talking to a bot. It’s a near-human experience,” she says. Sethia, an avid user of the bot, says she has been asking it how to navigate toxic relationships, flirting, family issues, and other intimate, private matters. “I don’t feel judged, it is a private conversation and the responses I receive are pushing me to think and behave positively. I feel AI has a lot of usage in therapeutic interventions,” she says.
Another Bengaluru-based programmer, Aseem Upadhye, 29, feels the same way. “I’ve been struggling to find a decent therapist, and for someone like me, with financial constraints, ChatGPT is a great tool.” Upadhye has been chatting with the AI since it launched and finds comfort in its responses. “The functionality has improved greatly, and now it may seem as good as speaking with a therapist with decades of experience.”
While programmers and laypeople alike might find ChatGPT and similar AI tools useful in difficult times, mental health practitioners continue to debate whether ChatGPT can be effective for therapeutic interventions.
Disrupting the mental health domain
Today, people can access support from others on community platforms, engage with self-care tools, and consume clinically backed content from several credible websites. In some pockets of the world, researchers are also exploring virtual reality (VR) therapy for the treatment of different mental health conditions.
Most mental health practitioners agree that artificial intelligence is evolving and disrupting every industry, including the mental health space. “It is difficult to predict how it’s going to influence our daily lives from now onwards,” remarks Ajith Abraham, Dean, Faculty of Computing and Data Sciences, FLAME University, Pune. “Neurological analytics bring tremendous opportunities and challenges because of the availability of huge amounts of data for analysis. Several machine learning methods have been found successful in diagnosing mental health conditions,” he observes. While computer vision helps in understanding non-verbal cues such as facial expressions, gestures, posture and eye gaze, natural language processing has been used to simulate human conversations via computer programs for self-assessments and therapy sessions.
Mahima Sahi, the chief psychologist at heyy, a mental health app, says there has been a huge rise in the use of machine learning algorithms to train AIs to offer relief to users. “Some popular apps like Wysa have made use of an AI to solve user concerns with their ‘relief-bot’.”
A cautionary tale?
Practitioners such as Sahi, though, are wary of AI being perceived as a “replacement” for traditional therapy. Having used ChatGPT, she points out that while you certainly get an answer, you get no visibility into the sources from which it was drawn. That could be fixed by its creators, but as a psychologist, Sahi believes an AI like this can narrow the brain’s processing over time, habituating users to receiving neatly packaged answers to any problem and never letting them develop their own complex cognitive or executive functioning. “Initial concerns similar to these are already traceable from reports of students who are using the AI to complete their assignments,” she states.
At the same time, many people fear therapy so much that they are trying to find replacements for it. As Dr Rashi Agarwal, a psychiatrist based in Meerut, says, “It’s good to take charge of your health and well-being, but some things need to be done by experts, people who have been trained specifically to handle the situations. We have to use technology for our benefit and not the other way round.”
Dr Meghana Dikshit, an author and performance coach who has used the chatbot, feels that it is all about perspective, since the output depends on the question the user asks. “In my experience, when it comes to mental health, a lot of times what people say and what they actually mean are very different. They may come up with a certain challenge but beneath it, they have a completely different issue. AI is not always right, it’s full of biases. So, to get the best output we need a human in the loop.”
In many ways, ChatGPT operates like a search engine (for content) or a chatbot (for coping tools). And while having access to content and coping tools is important for a person with mental health difficulties, for the treatment of clinical conditions it’s best to rely on the advice of a trained professional. Dr Amit Malik, founder and CEO of Amaha Health, says, “I don’t see the platform replacing therapy or treatments delivered by a human anytime soon, especially given the diversity of presentations in mental health, where even within the same diagnosis, different individuals will have different nuances that require a personalised therapeutic approach. Additionally, some people with clinical challenges will require medication support, for which they must consult a trained psychiatrist.”
How to use AI for emotional wellbeing
Emotional and mental health challenges are complex. Several factors need to be considered in order to understand, evaluate, and treat them. These include, but are not limited to, a person’s lifestyle, family history, past medical and psychological history, current environment, and personality traits.
Sahi believes that AI, if trained with personalisation, accuracy, and an understanding of users’ concerns in mind, can definitely aid the mental health space and help create scalable mental health solutions in the years to come. “MHPs and institutes should, thus, invest some time and energy in doing credible research for finding effective means for personalisation and effective tracking of outcomes, of something as intangible as mental health, tangibly yet effectively!” she says.
AI can potentially aid clinicians in making faster, more accurate diagnoses and in analysing client data to predict relapses and other high-risk situations. Dr Dikshit adds that AI can pre-empt a person’s behaviour using algorithms and, in turn, suggest a course correction in advance.
Over the last few years, several factors, including the impact of the COVID-19 pandemic, have aggravated mental health issues. The numbers are steadily rising, and there is a shortage of mental healthcare professionals. This is where AI can come into play. Accessibility and affordability are among its biggest advantages, believes Dr Abraham, who adds that AI can help clinicians evolve treatment plans. That said, all the experts are on board with Dr Malik on this: in clinical situations, AI is best used in conjunction with expert human support to provide safe, effective and personalised care.
Divya Naik is a Mumbai-based therapist.