What assumptions have you made about artificial intelligence? Are there alternative viewpoints?
The above are examples of Socratic questioning, a key component of cognitive behavioral therapies such as cognitive processing therapy, in which open-ended questions encourage reflection. At the Road Home Program: The National Center of Excellence for Veterans and Their Families at Rush, where these therapies are used, researchers have developed an AI tool called Socrates 2.0 that uses Socratic questioning to ethically enhance and supplement traditional mental health care.
Over the past decade, AI has rapidly progressed from a science fiction trope to a tool deeply embedded in many aspects of everyday life. The advent and rapid progress of large language model chatbots like ChatGPT (where GPT stands for generative pre-trained transformer) set off a domino effect for AI tools across a multitude of specialties, including mental health care.
“There are many therapy apps and chatbots out there, but very few of those have been tested,” said Philip Held, PhD, Road Home’s research director.
In 2023, Road Home researchers set out to create something different — a tested tool that could help put the evidence-based practices clients learn in therapy at their fingertips, around the clock. Researchers asked themselves: Could a chatbot trained in Socratic dialogue help people challenge distressing beliefs they hold about themselves, others and the world?
Large language models seemed well suited to Socratic dialogue. By answering a chatbot's questions, users can evaluate their own beliefs and adjust them as needed to be more helpful or realistic. This process, known as cognitive restructuring, has been associated with symptom improvement.
“What makes Socrates 2.0 unique is that it involves multiple AI entities that interact with one another to provide the user with a high-quality experience — beyond what current therapy chatbots on the market can do,” Dr. Held said. “It’s essentially like having an AI treatment team with an AI therapist and two AI supervisors by your side at all times.
“This initiative marks a new era in innovation, not just within our organization but potentially setting a precedent in the integration of generative AI and mental health care nationwide.”
How can traditional therapy be improved?
Traditionally, therapists who provide cognitive behavioral therapies have given their clients worksheets so they can practice the skills they're learning on their own time. While these worksheets have proven effective, they aren't engaging and often go underused by clients.
Researchers wondered if they could replicate those effective therapeutic tools in a more engaging way using modern technology.
Road Home’s Sarah Pridgen, MA, LCPC, manager of research operations, brought this question to her husband, Sean Pohorence, PhD, an independent researcher who works with AI and machine learning. He offered to help build the prototype for Road Home, and within a week, Socrates 1.0 impressed the team.
During testing, the chatbot displayed some limitations that are common for large language models, like giving repetitive answers in conversations. Knowing this could be distressing, particularly in conversations about mental health, Dr. Held and his team asked how they could improve on it.
“That’s where the fun began,” Dr. Held said. “We had the realization that this can be a good tool, but it had limitations. We connected this to how we handle training people. Psychologists can sometimes cling to the use of one question. So, what do we do for them? We provide them with supervision to improve their style of questioning.”
That’s where the idea for Socrates 2.0 was born.
How can this tool help clients?
When psychologists are in training, supervisors provide them with feedback on how to make their sessions more effective. Dr. Held and his team translated this idea into the creation of an AI supervisor that would provide the AI chatbot with input on how to improve its questioning.
Humans can sense when someone's beliefs shift during a conversation, but Socrates 1.0 had a difficult time recognizing this. Adding an AI supervisor to help improve conversations and an AI assessor to monitor shifts in beliefs reduced looping and redundant answers, and it helped the chatbot acknowledge progress and wrap up sessions appropriately. Altogether, these three AI entities — therapist, supervisor and assessor — form Socrates 2.0.
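For readers curious how such a multi-agent setup might be wired together, here is a minimal sketch in Python. The `chat()` helper, the agent prompts, the 0-to-100 belief rating and the wrap-up threshold are all illustrative assumptions for this article, not Road Home's actual implementation:

```python
# Hypothetical helper: a thin wrapper around any LLM chat-completion API.
def chat(system_prompt: str, transcript: list[str]) -> str:
    raise NotImplementedError("plug in an LLM client here")

# Illustrative agent prompts; not Road Home's actual wording.
THERAPIST = "Ask one open-ended Socratic question about the user's stated belief."
SUPERVISOR = ("Review the therapist's draft question against the transcript. "
              "If it is repetitive or leading, rewrite it; otherwise return it as is.")
ASSESSOR = ("On a 0-100 scale, rate how far the user's belief has shifted toward "
            "a more balanced view. Reply with the number only.")

def session_turn(transcript: list[str]) -> tuple[str, bool]:
    draft = chat(THERAPIST, transcript)                # therapist drafts a question
    question = chat(SUPERVISOR, transcript + [draft])  # supervisor refines the draft
    shift = int(chat(ASSESSOR, transcript))            # assessor scores belief change
    return question, shift >= 70                       # wrap up once the shift is clear
```

The design mirrors the article's description: the therapist agent never responds to the user directly until its output has passed through the supervisor, and the assessor's score is what lets the system acknowledge progress instead of looping.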
Dr. Held, Pridgen and their team completed a feasibility study of Socrates 2.0 in which users were given unlimited access to the chatbot for a month. Most users interacted with the chatbot for about five minutes at a time, as it fit into their day. Use of the tool was associated with moderate reductions in common symptoms of depression, anxiety and chronic stress.
People felt they could open up to the chatbot without fear of human judgment. They used it as a practice ground before talking to their therapist or to handle difficult situations, like a panic attack at 2 a.m.
Now, Road Home researchers are conducting additional feasibility and safety studies on Socrates 2.0. They are collaborating with several partners to test the tool in various contexts.
“The idea is that you would use this tool in tandem with working with a trained professional — a person who can help guide you on how to use it to shift the beliefs you’re struggling with,” Dr. Held said. “We want clients to develop a sense of self-efficacy and have the tools and means to feel better.”
How can therapists benefit from this tool?
In a separate feasibility study with therapists, some participants expressed concerns about Socrates 2.0 replacing their jobs. After using it, however, most found it to be a helpful addition to their therapeutic toolbox, much like the CBT worksheets mentioned earlier that clients use outside the clinic. Some therapists also used the tool to workshop their approach to their own clients.
This led researchers to develop Socrates Coach, a separate chatbot that aims to help therapists hone their Socratic dialogue skills. In this version, the chatbot roleplays as a client whom the therapist works to treat. Therapists receive feedback from AI “supervisors” on how to improve their dialogue.
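Conceptually, the coaching variant reverses the roles in the earlier sketch: the model plays the client, and an AI "supervisor" critiques the human therapist's questions. Again, this is a hypothetical illustration reusing the `chat()` helper defined above, not the actual Socrates Coach design:

```python
# Illustrative prompts for a role-reversed, coaching-style setup.
CLIENT = ("Roleplay a therapy client who holds a distressing belief, and "
          "respond in character to the therapist's latest question.")
COACH = ("Give the human therapist brief feedback on how open-ended and "
         "Socratic their latest question was, and suggest one improvement.")

def coaching_turn(transcript: list[str], therapist_question: str) -> tuple[str, str]:
    turn = transcript + [therapist_question]
    reply = chat(CLIENT, turn)    # the model answers in character as the client
    feedback = chat(COACH, turn)  # an AI "supervisor" critiques the question
    return reply, feedback
```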
This initiative was funded by a grant from Face the Fight®, an organization founded by USAA, Reach Resilience and the Humana Foundation that aims to raise awareness and support for veteran suicide prevention.
“We’re fortunate to be part of that because they’ve entrusted us with building Socrates Coach and a couple of other tools to improve clinicians’ abilities to develop effective interventions,” Dr. Held said.
He and his team are in the midst of a larger feasibility study of Socrates Coach with clinicians undergoing training for evidence-based PTSD therapy.
Can this tool influence mental health care everywhere?
While many throughout the mental health field are developing AI tools, Road Home was one of the first to develop a multi-agent approach and complete feasibility studies.
“Many people are looking to Socrates 2.0 as an example,” Dr. Held said. “We’re developing science in a clinically applicable way. We’re getting this tool out there, in front of people, getting information from real users and using feedback to make changes.”
This approach is not going unnoticed. Rush is one of a handful of nationwide collaborators in the Center for Responsible and Effective AI Technology Enhancement of Treatments for PTSD, or the CREATE Center, at Stanford University. The center received an $11.5 million grant from the National Institutes of Health to develop tools like Socrates 2.0 that can assist clinicians while meeting specific criteria for patient safety, privacy and effectiveness.
Ultimately, Dr. Held and his team hope their work can advance mental health care by giving clients a tool to supplement their care between therapy sessions or while they wait for treatment.
“Once we know the tool is safe and works as intended, the hope and dream is for people to have the option of replacing boring worksheets with Socrates for practice, which can lead to consistent and better outcomes,” he said.
And dreaming even bigger, the team hopes this type of tool, partnered with a coaching model, can become its own form of treatment for certain clients.
“One size does not fit all,” Dr. Held said. “That’s Road Home’s entire driver. The next frontier for care may be to use tools younger generations already know, but have these tools be developed and tested by mental health experts.”