How startups are using technology to address mental health issues
- Wysa, another startup that addresses mental health issues, goes a step further. It uses chatbots that run on an advanced algorithm and machine learning
- YOURDost has handled 15 lakh cases since it began in 2015
Over the years, several startups have mushroomed, using technologies like chatbots and online counselling to address mental health issues.
YOURDost, for instance, provides mental health support through online counselling. The Bengaluru-based startup was set up by IITians Richa Singh and Puneet Manuja after a personal loss. During their IIT days, a friend of Singh’s died by suicide over anxiety about placements. There was an on-campus counselling department, but her friend did not seek help. “This incident triggered us to do a survey, which suggested the prevalence of stigma. Most people don’t go to a counsellor because they are afraid of being called mad or other names," says Manuja, who also has an engineering and software background. The problems of accessibility and awareness also cropped up during their research. People lacked basic awareness about mental health, says Manuja. “Stress is part of life," he adds. “How do I figure out that stress has crossed a certain threshold and requires preventive measures? When do I go to a counsellor?"
Manuja and Singh realized that rather than setting up two or three clinics, they should take the technology route, which could address the issues of stigma and the lack of accessibility and awareness. It provides anonymity, spreads awareness fast through online portals, and is accessible anytime, the co-founders say. They use cognitive behavioural therapy, or CBT, telephonic calls and chat-based counselling. “Now the access is pan-India, it has 24x7 coverage and you can be connected to a specialist late at night," says Manuja.
YOURDost has handled 15 lakh cases since it began in 2015. About 60% of the cases come from metro cities and 40% from tier 2 and tier 3 cities. The cases fall broadly into three kinds: relationship issues, both pre-marital and post-marital, from the college-going and younger crowd; work-life balance concerns, career confusion and office stress from young professionals; and the anxieties and fears of young entrepreneurs.
Wysa, another startup that addresses mental health issues, goes a step further. It uses chatbots that run on an advanced algorithm and machine learning.
Before setting up Wysa, which is named after ELIZA, the first chatbot, built in the 1960s, its founders Ramakant Vempati and Jo Aggarwal used machine learning and phone data to detect depression. “Your phone is a biomarker of your mood and mental health," says Vempati. But when the people whose passive phone data had been used to establish the correlation between that data and mental health refused to come forward and be treated for depression, the duo launched Wysa in 2017 as an artificial intelligence-based mental health platform that protects the identity of the user.
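The article does not describe how the founders' phone-data model actually worked. As a purely illustrative sketch of the general idea of passive sensing, a handful of usage signals might be combined into a risk score; every feature name and weight below is made up for illustration:

```python
# Illustrative sketch only: the article does not disclose Wysa's founders'
# actual features or model. All names and weights here are hypothetical.
from dataclasses import dataclass
import math

@dataclass
class PhoneSignals:
    night_screen_hours: float    # avg. screen time between midnight and 5 am
    calls_per_day: float         # outgoing calls, a rough proxy for social contact
    km_travelled_per_day: float  # mobility estimated from location data

def risk_score(s: PhoneSignals) -> float:
    """Toy logistic score: more late-night use, fewer calls and less
    movement push the score towards 1.0 (higher flagged risk)."""
    z = 1.2 * s.night_screen_hours - 0.8 * s.calls_per_day - 0.3 * s.km_travelled_per_day
    return 1.0 / (1.0 + math.exp(-z))

# Two contrasting, made-up usage patterns:
withdrawn = PhoneSignals(night_screen_hours=3.0, calls_per_day=0.5, km_travelled_per_day=1.0)
active = PhoneSignals(night_screen_hours=0.5, calls_per_day=6.0, km_travelled_per_day=12.0)
print(risk_score(withdrawn) > risk_score(active))  # the withdrawn pattern scores higher
```

In a real system the weights would be learned from labelled data rather than hand-set, which is presumably where the founders' machine learning came in.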
The chatbot uses more than 100 natural language processing (NLP) models, built on 80 million conversations, to detect and understand user input. It responds with an appropriate conversation and draws on self-help techniques such as CBT, meditation and motivational interviewing to address users’ concerns.
The response has been overwhelming. Wysa, which has a team of psychoanalysts and technicians monitoring the briefs provided by the chatbot to check that its responses are safe, now has 13 million users spread across 30 countries. Most of them are from South Asia, the US, the UK and Canada.
Nimesh Desai, professor of psychiatry and director of the Institute of Human Behaviour and Allied Sciences, Delhi, says online counselling and chatbots are a mixed bag, with both merits and demerits. “Hiding the identity and protecting individual details are obvious merits," he says. “It may also be useful as an initial help leading to in-person consultation. But in some cases, treatable severe mental disorders can be missed, leading to complications, including suicide." According to him, service providers need to ensure all legal obligations are met, and they should take responsibility for complications if they do arise. “It’s their obligation to keep the clients informed of all aspects, and just like the print pamphlet you get in a medicine box, there should be an explicit disclaimer."