EHR data and AI to predict response to antidepressant treatment

Antidepressants are frequently prescribed for adults with depression, a common and often disabling psychiatric condition. However, identifying the most effective treatment for a particular patient is often a trial-and-error process that can result in prolonged morbidity, disability, and exposure to adverse effects, as well as substantial healthcare costs. Precision psychiatry aims to optimise treatment matching using patient-specific profiles, but there are few evidence-based predictors available to clinicians initiating antidepressant treatment.

Although average response rates are similar across different antidepressant classes, individual responses can vary widely in clinical practice. Therefore, accurately and scalably guiding antidepressant selection presents specific challenges. The gold standard for characterising antidepressant response from electronic health records (EHRs) remains expert chart review, which is labor- and time-intensive.

However, advances in machine learning (ML) and the growing availability of large-scale health data, such as EHRs, offer new opportunities for developing clinical decision-support tools that may address this challenge. In a recent study published in the peer-reviewed open-access medical journal npj Digital Medicine, researchers used machine learning models to predict differential treatment response probabilities for individual patients across antidepressant classes based on real-world EHR data. The pipeline incorporated both AI-derived and non-AI features, as well as unstructured data (i.e. clinical notes), to maximize the use of information contained in EHRs.

The study included 17,556 patients who received a new antidepressant prescription from non-psychiatrist providers, with data drawn from EHRs spanning January 1990 to August 2018. Each patient had at least one International Classification of Diseases (ICD) code for depression and at least one ICD code for non-recurrent depression in their history.

ICD codes were obtained from EHR data for adult patients (age ≥ 18 years) meeting two criteria. First, each patient had at least one visit with a diagnostic ICD code for a depressive disorder (ICD-9-CM: 296.20–6, 296.30–6, and 311; ICD-10-CM: F32.0–9, F33.0–9) co-occurring with an antidepressant prescription; the first such visit is defined as the patient's "index visit". Second, each patient had at least one ICD code for non-recurrent depression (ICD-9-CM: 296.20–6 and 311; ICD-10-CM: F32.0–9) at any time during their history.
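The inclusion criteria above can be sketched as a simple cohort filter. This is an illustrative toy, not the study's actual pipeline: the `patients` data structure, the `select_cohort` function, and the visit fields are all invented for the example; only the ICD code ranges come from the text.

```python
import re

# Patterns for the depressive-disorder codes listed above.
# Any depression: ICD-9-CM 296.20-6, 296.30-6, 311; ICD-10-CM F32.0-9, F33.0-9.
DEPRESSION_ANY = re.compile(r"^(296\.2[0-6]|296\.3[0-6]|311|F32\.[0-9]|F33\.[0-9])$")
# Non-recurrent depression only: excludes the recurrent codes 296.3x / F33.x.
NON_RECURRENT = re.compile(r"^(296\.2[0-6]|311|F32\.[0-9])$")

def select_cohort(patients):
    """Return patients meeting both inclusion criteria.

    `patients` is assumed to be a list of dicts with keys 'age' and 'visits',
    where each visit is a dict with 'codes' (list of ICD code strings) and
    'antidepressant_rx' (bool).
    """
    cohort = []
    for p in patients:
        if p["age"] < 18:
            continue
        # Criterion 1: a visit where a depression code co-occurs with a prescription.
        qualifying_visits = [
            v for v in p["visits"]
            if v["antidepressant_rx"] and any(DEPRESSION_ANY.match(c) for c in v["codes"])
        ]
        # Criterion 2: any non-recurrent depression code anywhere in the history.
        has_non_recurrent = any(
            NON_RECURRENT.match(c) for v in p["visits"] for c in v["codes"]
        )
        if qualifying_visits and has_non_recurrent:
            # The first qualifying visit serves as the "index visit".
            cohort.append(dict(p, index_visit=qualifying_visits[0]))
    return cohort
```

A patient aged 17, or one whose only depression codes are recurrent (e.g. F33.x), would be excluded under these rules.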

The resulting models achieved good accuracy, discrimination, and positive predictive value, which could underpin further efforts to provide clinical decision support for prescribers. However, the researchers noted several limitations, including missing data in EHRs (e.g. patients who may receive some of their care outside of the healthcare system) and secular trends in clinician prescribing or documentation practices that may have affected model performance.
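For readers unfamiliar with these evaluation metrics, here is a minimal sketch of how accuracy and positive predictive value are computed from binary response predictions. The labels and predictions below are invented toy data, not figures from the study.

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the observed response labels."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def positive_predictive_value(y_true, y_pred):
    """Of the patients predicted to respond, the fraction who actually did."""
    outcomes_of_predicted_positives = [t for t, p in zip(y_true, y_pred) if p == 1]
    return sum(outcomes_of_predicted_positives) / len(outcomes_of_predicted_positives)

# Toy example: 1 = responded to the antidepressant class, 0 = did not.
y_true = [1, 0, 1, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1]
print(accuracy(y_true, y_pred))                   # 4/6 ≈ 0.667
print(positive_predictive_value(y_true, y_pred))  # 2/3 ≈ 0.667
```

Discrimination is typically reported as the area under the ROC curve, which requires predicted probabilities rather than hard labels and is omitted here for brevity.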

In summary, the study presents a novel computational pipeline based on real-world EHR data for predicting differential responses to commonly used classes of antidepressants. The approach demonstrated here could be adapted to a wide variety of other clinical applications for optimising and individualising treatment selection.


  1. Sheu, Yh., Magdamo, C., Miller, M. et al. AI-assisted prediction of differential response to antidepressant classes using electronic health records. npj Digit. Med. 6, 73 (2023).

Voice-to-text medical software using NLP technology

When a doctor sits down with you during a visit, they normally spend a lot of time recording the how and the why of what is happening to you, conventionally in a paper-based case note or medical record.

The problem with these free-text narratives is further aggravated by the fact that not all doctors "speak the same way" when creating and managing notes.

As a result, the information in these notes about your condition is not easily extracted in a form that a computer can analyze.

The good news is that this unstructured free text has increasingly given way to digital record-keeping in electronic health record (EHR) systems, a move away from the days of trying to decipher doctors' medical lingo on handwritten medical records and reports. However, EHRs hold patient data that is just as unstructured as their cousin, the paper-based medical record.

Inevitably, EHRs create their own frustrations for doctors, who face additional data-entry responsibilities, get bogged down by form-filling across the many clicks and screens required to navigate their EHRs, and spend additional hours keeping records up to date.

Accurate and immediate EHRs became even more important with the scourge of the COVID-19 pandemic and the increased reliance on contact-free consultations between doctors and patients.

Ultimately, huge volumes of unstructured patient data continue to be entered into EHRs on a daily basis. Because healthcare documentation is mostly unstructured, it goes largely unutilised: mining and extracting this data is challenging and resource-intensive.

Medical Natural Language Processing (NLP) is steadily proving to be a solution to this challenge, creating new and exciting opportunities for healthcare delivery and patient experience. The adoption of NLP in healthcare is rising because of its recognized potential to search, analyze and interpret mammoth amounts of patient data.

Human language, in text and spoken words, is full of homonyms, homophones, sarcasm, idioms, metaphors, grammar and usage exceptions, and variations in sentence structure: ambiguities and irregularities whose usage only human beings readily understand.

NLP is a branch of artificial intelligence (AI) concerned with giving computers the ability to understand text and spoken words in much the same way we human beings can.

It is the main concept behind translation and personal assistance apps like Google Translate, OK Google, Siri, Cortana, and Alexa.

Without NLP-based healthcare tools capable of scrubbing large sets of unstructured health data, that data remains in a format that modern computer-based algorithms cannot easily access, extract, or accurately interpret, leaving the clinical documentation of the actual patient record buried in text form.
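As a toy illustration of the kind of structure NLP aims to recover from free text, even a simple rule-based extractor can pull discrete fields out of a clinical note. The note text and regex patterns below are invented for the example; real clinical NLP systems use far more robust methods than hand-written patterns.

```python
import re

# A hypothetical free-text clinical note.
note = (
    "Pt is a 54 yo male c/o chest pain x 3 days. "
    "BP 142/90, HR 88. Started on lisinopril 10 mg daily."
)

# Toy patterns for age, blood pressure, and a medication order.
age = re.search(r"(\d+)\s*yo", note)
bp = re.search(r"BP\s*(\d+)/(\d+)", note)
med = re.search(r"Started on\s+(\w+)\s+(\d+)\s*mg", note)

# Structured fields recovered from the unstructured narrative.
extracted = {
    "age": int(age.group(1)),
    "systolic_bp": int(bp.group(1)),
    "diastolic_bp": int(bp.group(2)),
    "medication": med.group(1),
    "dose_mg": int(med.group(2)),
}
print(extracted)
```

Once the note is reduced to fields like these, it becomes queryable and analyzable by downstream software in a way the raw narrative never was.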

NLP services give voice to the unstructured data of the healthcare universe by processing the content of long chart notes in medical records. This yields incredible insight for understanding quality, improving methods, and achieving better results for patients, and it helps determine disease burden and provides valuable decision support.

Augnito is voice-to-text medical software using NLP technology that aims to improve healthcare. For now it has been developed specifically for the Indian market; it launched six months ago and is now being used in 24 states across India.

Voice has become one of the most powerful tools in technology today. Talking is the most natural way for humans to communicate, and with gadgets like Alexa we are now able to do sophisticated and important jobs by voice alone.

Like Alexa, which performs sophisticated and important jobs using voice-controlled NLP technology, Augnito (available by monthly subscription on both Mac and Windows platforms) types out notes that are dictated to it and saves them in an editable text format on a cloud server.

The Augnito voice recognition software has a pre-programmed list of medical terms (its vocabulary database is constantly updated in keeping with doctors’ requirements and feedback), a built-in editor, report templates and keyboard shortcuts that help reduce repetitive typing.

Voice recognition software like Augnito, using NLP technology, has the potential to boost a doctor's productivity at a time of increased online consultations.

MRPM.VOW.02.14: Healthcare’s four-letter word? It’s ‘silo’


To continue the series of videos for 2014, this week’s Video of the Week (VOW) pick is about the change that must come while it remains common practice for hospitals, clinics and doctors’ offices, and digital health devices and services, to keep close tabs on their information and silo their data.

Those of you who have been following healthcare conferences and exhibitions for some time will recognise the pattern: at one event after another, vendors show off systems that work great on their own, but we soon realise that they don’t get along with each other. Keeping up with new technologies and innovations has become less a desire and more a requirement in the rapidly changing healthcare industry.

You will also already be aware that departments within a hospital or health network, along with digital health devices and services, each keep close tabs on their information and control a certain subset of data that they are not keen on sharing; they all seem to silo their data.

And those of you who are familiar with mHealth will be all too aware that mHealth devices continue to over-emphasize the collection of vital signs and their real-time transmission to healthcare providers.

Enter Patrick Soon-Shiong, a South African-born surgeon who is pushing a vision of integrated healthcare through a network of digital, genomic and clinical solutions. He envisions a future healthcare system as an integrated one that connects all the dots, following a human being through the continuum of life and serving a patient throughout his or her life, not just in sickness.

In the accompanying video (click to watch; it will open in a new tab of your current browser window), watch and listen to Soon-Shiong discussing, among other things, the following views:

  • healthcare has to break the rule of capturing vital signs at all times and focus more on gathering data and identifying trends;
  • he likens a health journey to a long plane trip, during which a true operating system encompassing clinical decision support, machine learning and “adaptive amplified intelligence” pulls in data from all sources, “integrates pieces of the puzzle” and takes inputs from the consumer, so that the caretaker can plot a course, adjust that course as things happen, and manage outputs;
  • the Electronic Medical Record (EMR) is “basically a flight log” that needs to be tapped for information at times; it could be part of that solution, but not the whole solution;
  • he believes that healthcare isn’t being held back by technology; rather, the real problem is workflow management, since technology is not being used properly, and healthcare is falling behind other industries like banking and entertainment;
  • nobody is taking the trouble to take each of these siloed pieces and integrate them into a single healthcare system; and
  • he concludes that change management is the next challenge, given the fear that leads healthcare to resist wholesale change.


  1. Healthcare’s four-letter word? It’s ‘silo’, mHealth News, viewed 28 April 2014, <>


Five Reasons Why Electronic Medical Records Are Good For Patients

Investment in developing a good Electronic Medical Record (EMR) system can provide value to patients by driving up safety, quality, operational excellence, transparency and access, as shown by the example of Cleveland Clinic Abu Dhabi. Its carefully designed EMR system is modelled after the famous one at Cleveland Clinic in Ohio, United States, a long-time leader in EMR systems.

The infographic below (click on the image to open a larger version in a new tab of your current window) summarises five (5) good reasons why EMRs are good for patients, drawn from the example at Cleveland Clinic Abu Dhabi.



  1. Five Reasons Why Electronic Medical Records Are Good For Patients, Marc, H 2013, LinkedIn, viewed 15 July 2013, <>