PTSD Treatment: How AI Is Helping Veterans With Post-Traumatic Stress Disorder

A model (not an actual soldier) poses with Ellie, a virtual counselor. Courtesy of Gale Lucas

There is a real appeal to shouting into the void. The ubiquity of Google search as confessional, the popularity of PostSecret, the draw of confiding in a trusted friend with the hope, verging on an understanding, that our secrets won't be shared: all of it points to that appeal. A group of researchers from the University of Southern California, with funding from DARPA, the Department of Defense's research agency, believes the same desire might make veterans with PTSD more willing to discuss their symptoms anonymously with a computerized avatar.

They found that the service members who volunteered for the study disclosed more symptoms of PTSD when speaking with a computerized "virtual human" than when filling out a symptoms checklist on the military's Post Deployment Health Assessment (PDHA). They even reported more symptoms to the virtual human than they did on a completely anonymized version of the PDHA. The researchers suggest that people are more willing to disclose their symptoms when they know the data is anonymous.

It's unclear whether that will fly in real life. Whether the program will truly help veterans remains to be seen, and its implementation raises questions about medical ethics and about the stigma around mental health, in the military and in the culture at large.

The Institute for Creative Technologies at USC got plenty of buzz for its original research and for introducing the world to Ellie, a digital diagnostic tool that strongly resembles, but cannot replace, a human therapist. Ellie, an avatar of a woman in a cardigan with olive-toned skin and a soothing voice, does what any human sounding board does: she listens to the content of what people tell her and scans their facial expressions, tone, and voice for cues that hint at meanings beyond their words. Ellie's appearance was decided upon by the research group's art team. As for how Ellie sounds, "she has a very comforting voice," Gale Lucas, a researcher at the institute, told Newsweek.

(Unlike a human listener, of course, Ellie's reading of these cues is made explicit enough to show in a video.)

Based on their results, Lucas says, the researchers believe that for veterans with PTSD, Ellie combines the best of both worlds: her warm demeanor and sympathetic responses establish the kind of rapport a therapist would create, while the knowledge that she is not actually a person, and, crucially, that she is not built into a chain of reporting, means you can say whatever you want. This builds on past research by the same team, in which participants expressed feelings of sadness more intensely and reported less fear when going through a health screening with a virtual human rather than one they were led to believe was controlled by an actual human.

The way Lucas envisions it, Ellie is an economical and efficient solution. "All you need is a webcam, a laptop, and a microphone." She imagines Ellie installed in a kind of kiosk that can be tucked into a local VA. "I know it sounds creepy to put it in a closet, but you could put it in a closet."

And at least one psychiatrist thinks Ellie has potential. "This technology has amazing potential to drill down into the elements of rapport, and whether it differs by patient characteristics; something that is not possible with real life therapists or clinicians. A simple example is whether the sex and age of the avatar alters the effect," says Joseph Hayes, a psychiatrist at University College London.

However, Hayes believes the anonymity that drives the study's result might be impossible in real life. The participants may have felt more comfortable disclosing to Ellie because, unlike the PDHA, which soldiers know goes on their permanent health record, what they say to her does not. But if Ellie were really integrated into a care center, it's hard to imagine that the data she collects wouldn't also be accessible to treating clinicians.

"For an intervention to be possible ultimately, the disclosure would have to be shared with the same commanding officers who have traditionally received the results of the service members PDHA, and entered into their military health record. Once this is made explicit, would disclosure reduce to levels seen previously?" Hayes wrote. "If so, it is a culture change (reducing public stigma–within the military and more broadly) which is truly going to impact on disclosure and provision of appropriate treatment," Hayes wrote.

A model poses with Ellie, the virtual human. Courtesy of Gale Lucas

Lucas and her colleagues have been thinking about this problem, and she's optimistic they can work out a solution. She maintains that since the research is being implemented within the Department of Defense, it falls under different rules than treatments marketed to civilians.

The way Lucas envisions it, even in real life, a session with Ellie can stay fully anonymous. A veteran can come in, talk with "her," and at the end of the session Ellie can suggest following up with a clinician if the person needs further treatment. But that choice, Lucas hopes, will be up to the patient.

With one big exception.

If a therapist or doctor learns that a person intends to harm or kill themselves or someone else, among other disclosures, they are compelled to break confidentiality and intervene. But Ellie, Lucas maintains, can't be mandated to report these things because she isn't human. Lucas's ideal solution is that, if a veteran comes to Ellie expressing thoughts of self-harm, the program would send a red flag of sorts to a human clinician, who would then be compelled to act.

It has yet to be determined whether that approach is legally, ethically, or practically feasible.

Even if it is feasible, there are still many other problems to tackle when it comes to PTSD in veterans, Hayes says.

"As a clinician, I'd want to know that this technology could effectively detect cases of PTSD, rather than just increasing disclosure of less severe, potentially non-diagnostic, responses to trauma. The bottleneck is not necessarily in the shortage of resources for diagnosis, but a shortage in the resources to deliver effective evidence-based care following diagnosis," Hayes wrote. And while AI technologies that do exactly that are being developed and tested, he thinks it's a long ways away before they can take over that work.

And help is needed.

"Veterans account for 20% of suicides in the US," Hayes said. "Better support systems, beyond the brief provision of therapy may help reduce this shocking statistic."


About the writer


Joseph Frankel is a science and health writer at Newsweek. He has previously worked for The Atlantic and WNYC. 
