It’s an annoyance of modern life. You’re busily working in your office in Vancouver when a text or email alert flashes on your mobile phone. Your bank wants to know if you’ve purchased gas in Orlando with your debit card that morning.
That fraudulent activity, more likely than not, was flagged by your bank’s artificial intelligence software. Mental health researchers, too, are beginning to use similar predictive algorithms to identify patients likely to attempt suicide and get them help before it’s too late.
As hospitals and research institutes amass ever-larger troves of electroencephalography (EEG), neuroimaging, genetic, and other patient data, researchers are designing algorithms that can learn and problem-solve, analyzing millions of data points in the hope of identifying trends and associations, says Joshua Gordon, director of the National Institute of Mental Health (NIMH). “Researchers are throwing a whole bunch of data into these programs, and with a large enough sample, they may be able to identify how different brain areas communicate with one another, who is at elevated risk for suicide, and who will respond well to certain drugs,” he explains.
The United States is experiencing a marked rise in suicide rates. From 1999 to 2016, the U.S. Centers for Disease Control and Prevention (CDC) documented increases in nearly every state, ranging from just under 6 percent in Delaware to more than 57 percent in North Dakota.
Those dramatic numbers serve as the backdrop for an effort by researchers from the Army and the Department of Veterans Affairs (VA) to deploy AI techniques to prevent suicides among veterans. Armed with vast troves of medical and psychological data about service members, the Army and VA researchers have developed a program that uses AI to predict a person’s risk of suicide. The Recovery Engagement and Coordination for Health – Veterans Enhanced Treatment (REACH VET) program sifts through a patient’s medical history, highlighting dozens of strong predictors of suicide, including past pain diagnoses and heavy use of emergency room care, to flag patients who may be at high risk. For those patients, physicians can provide more comprehensive support: more frequent clinician-initiated check-ins, along with reviews and potential revisions of current treatment regimens.
Civilian healthcare systems are beginning to explore predictive models to prevent suicides as well. Kaiser Permanente’s Mental Health Research Network created a computer model that uses information from electronic medical records to predict which patients across its network may be at highest risk. The model can help ensure that those patients get the help they need before they ever attempt suicide.
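At their core, risk-flagging systems like these score structured record features and surface patients whose scores cross a review threshold. The sketch below illustrates the general idea only; the feature names, weights, and threshold are invented for illustration and are not taken from REACH VET or Kaiser Permanente’s actual models.

```python
# Toy risk-scoring sketch: weight binary record features, sum them into a
# score, and flag patients whose score crosses a clinician-review threshold.
# All feature names and weights below are hypothetical.

WEIGHTS = {
    "prior_pain_diagnosis": 1.2,
    "frequent_er_visits": 1.5,
    "recent_medication_change": 0.8,
}
REVIEW_THRESHOLD = 2.0  # scores above this trigger a clinician check-in

def risk_score(record):
    """Sum the weights of the features present in a patient record."""
    return sum(w for feat, w in WEIGHTS.items() if record.get(feat))

def flag_for_review(records):
    """Return the IDs of patients whose score exceeds the threshold."""
    return [pid for pid, rec in records.items()
            if risk_score(rec) > REVIEW_THRESHOLD]

patients = {
    "A": {"prior_pain_diagnosis": True, "frequent_er_visits": True},
    "B": {"recent_medication_change": True},
}
print(flag_for_review(patients))  # ['A']
```

Real systems learn the weights from millions of records rather than hand-assigning them, but the output is the same kind of thing: a short list of patients for a clinician to look at first.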
“This is a great area for AI — and it’s an area that has translational relevance,” said Gordon. “Not everyone has easy access to mental healthcare. With these kinds of programs, a primary care provider could be able to help identify who is at risk and help direct them to the help they need.”
AI also offers benefits to those working with patients with depression. Recent data from the Substance Abuse and Mental Health Services Administration (SAMHSA) suggest that more than 16 million adults in the United States have experienced at least one major depressive episode, and finding the right treatment can be quite difficult, says David Benrimoh, a psychiatry resident at McGill University. AI can help psychiatrists better personalize treatment, making it more likely that the first medication prescribed is the right one.
“You can have two patients who have a diagnosis of depression but that doesn’t mean that they share all of the same symptoms or will respond to the same treatments,” he says. “AI can learn patterns from large sets of data, and those patterns can offer us new insights into what’s different between sets of patients. It allows us to parcellate patients in a way we haven’t been able to do before — and better determine what treatment is best without having to go through a long, trial and error type process.”
Benrimoh is currently working with technology experts at a new AI start-up, Aifred Health, to create a model that links specific patient features (initially clinical and demographic, with plans to add promising biomarkers) to the most effective treatment. Their approach and initial results helped them secure first place in the ongoing IBM Watson AI XPRIZE.
“We’ve been trying for years to find a way to partition patients based simply on symptoms and it hasn’t worked out that well,” he said. “But by looking at remission — what treatments helped patients — and clustering people based on their remission using AI, we can see patterns that we haven’t been able to see before about what types of patients will do the best with what types of medications.”
Gordon says we are only scratching the surface of what AI can offer the mental health community, and he looks forward to new and more diverse applications in the future.
“Today, psychiatrists rely on just one 45-minute interaction a week to make some pretty complex decisions for each patient,” he said. “But AI offers the possibility, using a wide range of different data types, to give psychiatrists new tools that could improve diagnosis and treatment. It’s very promising.”