
Will AI Replace Your Therapist? Kaiser Won’t Say No


Ilana Marcucci-Morris and other Kaiser therapists strike in Oakland in August 2022. A union battle between mental health workers and Kaiser exposes the current reality and future potential of AI’s role in therapy sessions. (Courtesy of Karna Roa)

Every day, clinical social worker Ilana Marcucci-Morris talks to her patients about the most private, most vulnerable details of their lives, and she’s not interested in having AI software listen in or sharing any of her responsibilities with a chatbot.

A self-described millennial and lover of gadgets, Marcucci-Morris knows artificial intelligence is here to stay in health care, but when it comes to therapy, she wants it to be optional and assistive, a tool that will augment human connection, not diminish it.

She figured that would be a simple assurance her union could win when they sat down at the bargaining table last summer to hash out their next contract with Kaiser Permanente. The therapists with the National Union of Healthcare Workers submitted their proposed contract language — that AI would be used to “assist” mental health clinicians, but not “replace” them — never expecting it to be controversial. After all, Kaiser signed a contract with their sister union in Southern California just months earlier that contained the same language.


But this time, Kaiser refused, sending back a counterproposal in the fall with that paragraph deleted.

“We have asked them point-blank about language to prevent replacing therapists with artificial intelligence, and they have been very clear that they want the ‘flexibility’ to increase AI and reduce their need for us,” Marcucci-Morris said.

The local contract debate taps into an existential question plaguing American workers across professions: When is AI coming for my job? As health systems embrace the technology to save money and time, and consumers increasingly consult AI chatbots for mental health support, the theoretical question has suddenly turned concrete for Kaiser therapists, and they are testing their union power to see if and how they can influence the inevitable transformation of their vocation.

A Kaiser clinician during the previous December 2018 strike. (April Dembosky/KQED)

“AI is not inherently good or bad. It holds promise, but it isn’t without serious risks,” said Maya Sandalow, associate director for health programs at the nonprofit Bipartisan Policy Center. “When we talk about this, we need to be asking, ‘how might this solution improve upon the status quo?’ The status quo is that we are in a mental health crisis.”

Worldwide prevalence of depression and anxiety spiked 25% during the first year of the COVID-19 pandemic, according to the World Health Organization, and today nearly two-thirds of American youth regularly experience mental health distress, though fewer than half of them seek professional help.

Finding a therapist, especially one who accepts insurance, has become notoriously difficult as the field contends with workforce shortages and low reimbursement rates.

Kaiser has been battling these industry dynamics for more than a decade. California regulators have cited the company multiple times and fined it twice for making patients wait too long for mental health appointments, ordering Kaiser to address understaffing.

Administrators are actively exploring how AI tools could expand access to therapists, for example by cutting the time clinicians spend on paperwork so they can spend more time with patients.

Kaiser declined several requests for an interview, but said in a statement that AI tools don’t make medical decisions or replace human care. Rather, they hold “significant potential to benefit health care by supporting better diagnostics, enhancing patient-clinician relationships, optimizing clinicians’ time, and ensuring fairness in care experiences and health outcomes by addressing individual needs.”

Kaiser contracts with mental health workers typically span two to four years. The company did not respond to specific questions about how AI could lead to job losses during that timeframe.

Managers told the union during negotiations that they do not “intend” to lay off therapists because of the technology. But when the union pressed Kaiser to put that commitment in writing in the contract, several union representatives, including Marcucci-Morris, said the company told them, “We can’t predict the future. We need to maintain flexibility,” and “We want to leave our options open.”

How Kaiser uses AI now in mental health care

Kaiser is already deploying AI note-taking technology in mental health care. Piloted first in medical exam rooms, these digital scribes record interactions between doctors and patients, then generate summaries for the patient’s medical record. Many mental health clinicians are optimistic about this innovation, as they typically spend two and a half hours a day, often in the evenings, writing clinical notes.

“It’s called pajama time,” said Jodi Halpern, a psychiatrist and professor of bioethics at UC Berkeley. Her research shows that paperwork is the biggest cause of burnout among clinicians. “So the idea that we could replace that so that human care could grow, I love that idea.”

Kaiser mental health care workers and supporters march from Oakland Kaiser Medical Center to Kaiser’s corporate headquarters on Friday, Aug. 19, 2022, the fifth day of an open-ended strike. (Beth LaBerge/KQED)

The technology is controversial among Kaiser clinicians, though. Some appreciate digital scribe software as a time saver that also allows them to be more present with their clients, making eye contact rather than typing. But many are wary of potential privacy breaches, the ethical implications of using therapy transcripts to train AI models, and whether patients might censor themselves when they’re being recorded. Marcucci-Morris has declined to use it for these reasons, anticipating that only one out of 10 of her patients would consent to it if she asked.

“It’s not the same as talking to your physician about a rash or your vitamin D deficiency,” she said. “I wouldn’t want a recording of my disagreements with a family member or details of the terrible things that have happened to me.”

In light of the unknowns, therapists have asked Kaiser management for a contract clause that stipulates the use of digital scribes will remain optional, or at least “not mandatory,” but Kaiser declined the proposal.

The union is also concerned about Kaiser’s recent introduction of electronic mental health triaging, an optional tool where patients are routed into care based on how they answer questions about anxiety and depression in an online questionnaire.

Brittany Beard, a licensed clinical therapist at Kaiser Permanente, poses for a portrait at her home in Vallejo on Nov. 24, 2025. (Gina Castro/KQED)

Some patients won’t like this, but some will prefer it, said Merage Ghane, a clinical psychologist and director of responsible AI at the Coalition for Health AI. “There are people who really don’t like talking to a real person,” she said.

Vallejo-based therapist Brittany Beard used to do this triage work herself, talking to clients for 15 to 20 minutes on the phone, but after Kaiser outsourced many of those calls to an outside company and developed the e-visit, she was reassigned to a new department. Though still employed at Kaiser, she already feels replaced by an app.

“They sell it as accessing care faster, but I’ve seen the opposite,” Beard said. Now, when some of her patients meet her for their first appointment, “They’re frustrated. It was like they were battling just to get to me.”

Is AI coming for your therapist?

How much AI infiltrates mental health care will be determined, in part, by the consumer. Experts have identified a “trust gap” between health administrators’ eagerness to roll out AI tools and patient concerns; to bridge the divide, they recommend transparency and involving patients in implementation. Qualitative studies show that patients are optimistic about the technology’s potential to improve diagnosis and treatment, but they remain skeptical of “robots” or “machines” taking over from humans.

“The prevailing sentiment really was that AI is at its best when it’s a tool that doctors can use to do their jobs better. Once that moved into the realm of replacing human interaction and experience, that was not a good thing,” said Michele Cordoba, a researcher at Culture IQ, which produced a report for the California Health Care Foundation.


At the same time, the use of commercial AI chatbots for mental health has soared. One study surveyed AI users who have mental health conditions and found nearly half turn to their chatbot for psychological support, and of those, 63% said the advice was helpful.

But mental health professionals have questioned the efficacy of such advice, and several families have sued AI companies, alleging their chatbots encouraged suicidal and self-harming behavior.

In the meantime, clinical psychologists are developing evidence-based chatbots, like TheraBot, to deliver tested therapeutic guidance. The Food and Drug Administration acknowledged the broad demand for such apps at a November meeting and is exploring what kind of authority it might have to regulate them, including requiring human mental health professionals to oversee them.

Kaiser therapists want to know what all these trends mean for their own job security in the immediate and long term. When one of them asked a panel of AI experts to expound on this during a statewide training webinar in October, the 200 therapists in attendance heard a wide range of answers.

“I would encourage you all not to fear for your profession,” said Nicholas Jacobson, a psychologist at Dartmouth and co-creator of TheraBot. “I think there is no possibility in your lifetime that you all will feel replaced by AI.”

But UC Berkeley’s Halpern was much more circumspect, especially in light of chatbots’ popularity among youth. A third of teen AI users said they preferred to have serious conversations with their chatbot rather than with a human. “I am not sure we won’t see a tremendous loss of human interactions,” Halpern said. “I’m very worried about that.”

Ultimately, patients should have choices, psychologist Ghane told KQED. If they live in rural areas and can’t access a therapist, or they have a neurodevelopmental condition where human communication is more aversive than facilitative, she said it’s important they have AI options. In that version of the future, therapists are right to ask if they will be replaced.

“The answer is they can be,” Ghane said. “We can all be replaced at some point.”

