Would You Talk to a Machine Therapist?
Professor of Psychiatry
Prof. Dr. med. Gerhard Gründer is head of the Molecular Neuroimaging Department at the Central Institute of Mental Health, Mannheim.
In my recent blog post, I reported on the optimistic view that “digital phenotyping” with smartphone technology would improve psychiatric diagnosis and possibly even treatment. Building on this, I discussed broader advances towards bringing big data into psychiatry. Another important aspect of the digitization of psychiatry is the development of machine therapists in mental health care: conversational systems driven by artificial intelligence. Adam Miner and colleagues from Stanford University give a brief overview of the current state of the field in their article “Talking to machines about personal mental health problems”.1
Machine therapists that communicate with patients are already in use in the USA and in China. Two such “conversational agents” are called “Gabby” and “Ellie”. They conduct psychiatric interviews and may someday even be able to deliver formal psychotherapy. Miner and colleagues are optimistic about the potential usefulness of conversational agents: “Optimism is growing that conversational agents can now be deployed in mental health to automate some aspects of clinical assessment and treatment.”1 According to them, “Some data suggests that people respond to them as though they are human.”1 This could be helpful, “especially to improve access for underserved populations.”1 Interestingly, one study suggests that people who know they are talking to a computer are more willing to open up.
Miner et al. further state: “The bridge from human responses and machine responses has already been crossed in ways that are not always made clear to users. Chinese citizens engage in intimate conversations with a text-based conversational agent named Xiaoice.”1 The authors admit, however, that conversational agents have not been evaluated in clinical trials and might therefore not only be ineffective, but even cause harm. Confidentiality could become another problem for the technology in the future: is a machine therapist bound by the same duty of confidentiality as a human?
Furthermore, most of the current technology seems to rely on text communication, meaning it operates on semantics alone. Communication in psychiatry, by contrast, is highly contextual: empathy cannot be coded in words.
While we once believed psychiatry to be the most human of medical specialties, some scientists now seem to believe it will be the first specialty to be replaced by computers. Would you talk to a machine therapist about your emotions, your conflicts, your desires?