Will AI revolutionise psychiatry?
Researchers are working on ways to apply machine learning to psychiatry, which will hopefully help clinicians better monitor their patients, writes Dr Bob Murray.
Thanks to advances in artificial intelligence, computers can now assist doctors in diagnosing disease and help monitor patients’ vital signs from hundreds of miles away. Currently, researchers are working to apply machine learning to psychiatry, with a speech-based mobile app that can categorise a patient’s mental health status as well as, or better than, a human can.
What the researchers say
“We are not in any way trying to replace clinicians,” said the co-author of a new paper in Schizophrenia Bulletin that lays out the promise and potential pitfalls of AI in psychiatry. “But we do believe we can create tools that will allow them to better monitor their patients.”
Low-ball estimates claim one in five US adults live with a mental illness, many in remote areas where access to psychiatrists or psychologists is scarce. Others can’t afford to see a clinician frequently, don’t have time or can’t get in to see one.
Even when a patient does make it in for an occasional visit, therapists base their diagnosis and treatment plan largely on listening to a patient talk – an age-old method that can be subjective and unreliable.
“Humans are not perfect. They can get distracted and sometimes miss out on subtle speech cues and warning signs,” the researchers claim. “Unfortunately, there is no objective blood test for mental health.” (Actually, there are for some mental illnesses – researchers should do their research).
In pursuit of an AI version of a blood test, the team developed machine learning technology able to detect day-to-day changes in speech that hint at mental health decline.
“For instance, sentences that don’t follow a logical pattern can be a critical symptom in schizophrenia. Shifts in tone or pace can hint at mania or depression. And memory loss can be a sign of both cognitive and mental health problems,” they say.
Again, some further research is needed before they make such pronouncements. Memory loss can also be a sign of impending stroke, the result of a mini-stroke (which can be quite harmless), or even an ocular migraine (which is quite common).
“Language is a critical pathway to detecting patient mental states,” said the lead researcher. “Using mobile devices and AI, we are able to track patients daily and monitor these subtle changes.”
The new mobile app asks patients to answer a five- to 10-minute series of questions by talking into their phone.
Among various other tasks, they're asked about their emotional state, asked to tell a short story, asked to listen to a story and repeat it, and given a series of touch-and-swipe motor-skills tests.
The AI system that assesses those speech samples compares them with previous samples by the same patient and the broader population and rates the patient’s mental state.
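The paper does not publish the model itself, but the core idea described above — comparing features of a new speech sample against a patient's own historical baseline and flagging large deviations — can be sketched in miniature. The sketch below is purely illustrative: the two features (words per sentence and vocabulary diversity) and the function names are my own invented stand-ins, far cruder than the acoustic and semantic features a real system would use.

```python
from statistics import mean, stdev


def speech_features(transcript: str) -> dict:
    """Toy features of a speech transcript: average sentence length and
    vocabulary diversity. Real systems extract far richer acoustic and
    semantic features; these are illustrative placeholders."""
    sentences = [s for s in transcript.replace("!", ".").replace("?", ".").split(".")
                 if s.strip()]
    words = transcript.lower().split()
    return {
        "words_per_sentence": len(words) / max(len(sentences), 1),
        "type_token_ratio": len(set(words)) / max(len(words), 1),
    }


def flag_change(baseline_samples: list, new_sample: str, threshold: float = 2.0) -> dict:
    """Compare a new sample against the patient's own baseline and return
    any features that deviate by more than `threshold` standard deviations."""
    baseline = [speech_features(t) for t in baseline_samples]
    current = speech_features(new_sample)
    flags = {}
    for name, value in current.items():
        history = [b[name] for b in baseline]
        sd = stdev(history) if len(history) > 1 else 0.0
        if sd and abs(value - mean(history)) / sd > threshold:
            flags[name] = value
    return flags
```

A monitoring app of the kind the researchers describe would run something like `flag_change(last_month_samples, todays_sample)` each day and alert a clinician only when the returned dictionary is non-empty. Note that even this toy version bakes in the problem raised later in this piece: what counts as a "deviation" depends entirely on whose baseline, and whose thresholds, the system was built around.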
In one recent study, the team asked human clinicians to listen to and assess speech samples of 225 participants – half with severe psychiatric issues, half healthy volunteers – in rural Louisiana and Northern Norway. They then compared those results with those of the machine learning system.
“We found that the computer’s AI models can be at least as accurate as clinicians,” they said.
The researchers envision a day when the AI systems they’re developing for psychiatry could be in the room with a therapist and a patient to provide additional data-driven insight or serve as a remote-monitoring system for the severely mentally ill.
If the app detected a worrisome change, it could notify the patient’s doctor to check in.
I can think of so many reasons why this is a bad idea that it would take several TR-length essays to cover them all. Firstly, as I have noted above, many physical illnesses can have cognitive consequences – heart disease is just one of them. Secondly, one person’s “normal” is another’s “abnormal”, and that bias is going to be built into the AI system (look at the differences in symptomologies and diagnoses of common mental problems in the various editions of the standard psychiatric reference book DSM).
AI in this area would make an efficient technology for use by a coercive state looking to detect changes in allegiance or nonconformity with its preferred norms of thought or behaviour.
A friend of ours recently endured a four-hour compulsory psychiatric exam as part of applying for a job. Pretty soon there’ll be an AI app for that, too.
Finally (for now), God preserve us from AI researchers who have done insufficient research in the area they are attempting to bring AI into (like the ones who gave us the now laughed-at sentencing app relied on by judges). The lead researcher behind this study is the originator of the AI that grades school essays. Just imagine the biases incorporated in that!
“Kindness is the language that the deaf can hear and the blind can see.” – Mark Twain