The Role of Chatbots in the Healthcare Sector


In the last feature we looked at how AI-powered chatbots were transforming the world of customer relationship management and e-commerce.

Here we’ll examine some of the more specialist roles chatbots are taking on, particularly in the healthcare sector.


Let’s face it, not everyone wants to talk to a human being, particularly when discussing intimate medical ailments, mental health issues or reporting sexual harassment in the workplace.

So in a growing number of such cases, chatbot apps are fulfilling the role of confidant, adviser and instructor.

Take the chatbot Emily, for example. It takes people step by step through the process of providing a urine sample at home, then analysing the results using a dipstick and a clever image-recognition algorithm on the user’s phone camera. The results are sent to a doctor, who then advises on the best treatment.

The Israeli start-up initially failed to receive approval from the US Food and Drug Administration for its app and testing kit, but not because there was anything wrong with the algorithm or the science.

It was because the user interface – the instructions – was not deemed easy enough to understand, particularly for older users. The creation of Emily – a friendly, no-nonsense female chatbot – was key to winning regulatory approval, the start-up says.

Now people can test for a range of conditions, from urinary tract infections to diabetes, without having to waste time queuing at a clinic or suffering embarrassment. The start-up hopes the service will save health services millions.

Chatbots are cropping up everywhere in healthcare, and Intelligent Mobile is at the forefront of this, assisting medical pioneers with initiatives related to pregnancy, autism and a healthier lifestyle.

Digital doctors, such as Babylon Health, Your.MD and Ada Health, are acting like triage nurses, assessing your symptoms and monitoring your health over the long term to provide predictive and proactive care.

They can also have a valuable role in simply reminding patients to take their medications. Patients who fail to complete a drug course are a cause of huge inefficiency in most health systems. 

Mental health is an obvious area where anonymity and a non-human interface could be an advantage, particularly given that in some cultures mental illness is still a taboo subject and difficult to talk about.

Even in more liberal cultures there is still a stigma attached to mental illness. Yet its treatment is estimated to cost the US economy alone more than $200bn a year. So there is clearly a huge gap in the market for cheaper, more convenient digitised mental health services.

One well-known chatbot, Woebot, claims users can experience a reduction in the symptoms of anxiety and depression after just two weeks of using its cognitive behavioural therapy algorithm.

It says its solutions are backed by solid evidence-based science and that all information shared by users is encrypted, anonymised and de-identified.

Over time advances in natural language processing will help make the interactions more conversational and intimate, the company says.

There are plenty of other chatbots in this field adopting a similar CBT-based approach, including Wysa, Catch It and Tess, an Arabic-speaking version of which was recently offered to Syrian refugees suffering from post-traumatic stress in Lebanon.

And anonymity is valuable when people need to report sexual harassment in the workplace. The majority of people never report cases because they are worried about getting into trouble or are too embarrassed or ashamed to talk to a human.

Chatbots like Spot use interview techniques adopted by police forces to elicit full and frank reports of such cases that can then be used as evidence in future.

Of course, none of this technology is perfect yet by any means, and there have been critics among the professional classes – doctors and therapists – who’ve assessed these chatbots and found shortcomings.

For example, some doctors think the Babylon Health chatbot misdiagnoses some symptoms, while there are concerns that the mental health apps Woebot and Wysa failed to respond correctly to reports of child sexual abuse, eating disorders and drug use.

But to be fair, most healthcare chatbot developers would not claim that their products are designed to cope with crisis situations or that they have made human specialists redundant – not yet at least.

These are works in progress, and much more is needed in the sphere of emulating human empathy – developing conversations that sound unscripted and natural, and systems that can truly understand what’s being said.

There are also concerns about data privacy. While many healthtech start-ups claim to offer encrypted, fully anonymised services, the public could be forgiven for being a little sceptical given the number of data breaches, even involving tech giants such as Facebook and Google.

But chatbots offer convenience, lower costs and a non-judgmental intimacy that brings many benefits in these specialised fields. As they crunch more and more data, learn and improve, they are only going to get better at what they do.

Pioneers in the medical profession have picked up on this, so expect their adoption to be rapid in the next 12 months.

Megan Dickie