Kaiser Therapists Battle to Fend Off Artificial Intelligence

Michelle Gutierrez Vo, a registered nurse at Kaiser Permanente Medical Center in Fremont, speaks during a rally alongside fellow nurses from across California at Kaiser Permanente on Geary Blvd in San Francisco on April 22, 2024, to advocate for patient safety in the face of artificial intelligence. (Beth LaBerge/KQED)

Airdate: Wednesday, December 17 at 9AM

In recent contract negotiations, Kaiser Permanente therapists asked for language to specify that artificial intelligence would not “replace” humans in mental health care, but the employer has so far refused. Kaiser already uses AI technology in mental health care to take notes and create summaries, but Kaiser therapists worry further use of the technology could usurp their jobs. We talk about the ways AI may be entering our mental health care system and how it could affect therapists and their patients.

Guests:

April Dembosky, health correspondent, KQED News

Jodi Halpern, professor of bioethics and Chancellor's Chair, University of California, Berkeley

Vanessa Coe, secretary-treasurer, National Union of Healthcare Workers

Anna Benassi, therapist, associate professor and executive director of clinics, California Institute of Integral Studies

This partial transcript was computer-generated. While our team has reviewed it, there may be errors.

Alexis Madrigal: Welcome to Forum. I’m Alexis Madrigal. What a strange world we live in. Our health care systems are now contemplating a near future where they might bring in AI to do some heavy lifting — even in mental health care. At least that’s how it sounds from recent negotiations between Kaiser therapists and the organization itself.

Kaiser can already use AI technology in mental health care to take notes and create summaries, but human therapists worry that further use of the technology could take their jobs.

Here to talk with us about this hinge moment in AI and health, we’re joined by April Dembosky, health correspondent with KQED News. Welcome.

April Dembosky: Thank you. Good morning.

Alexis Madrigal: And we’re also joined by Vanessa Coe, secretary-treasurer of the National Union of Healthcare Workers. Welcome.

Vanessa Coe: Thank you. Good morning, everyone.

Alexis Madrigal: So, April, can you give us a quick overview of how AI plays a role in mental health care and how that’s evolved in recent years?

April Dembosky: Sure. I think there are basically two main uses of AI right now in mental health care. One is on the administrative side. You mentioned AI note-taking — sometimes called digital scribes. We’ve seen this mainly on the medical side of health care so far, where doctors bring a cell phone into the appointment and record their interactions with patients.

People tend to like this. It opens up the opportunity for doctors to make more eye contact and really listen, instead of staring at a computer and taking notes. Then the AI technology later summarizes the appointment and adds it to the medical chart.

Now we’re talking about whether — and to what extent — we should bring that into mental health care. There are therapists who are really excited about that, and some who are really concerned. Beyond that, there are therapeutic uses of chatbots.

Alexis Madrigal: This seems to be the thing I’ve heard the most about — people kind of doing, I don’t know, budget therapy by talking to ChatGPT.

April Dembosky: Exactly. We’ve really seen the rise of this with commercial AI companions, where people are confiding in AI like a therapist. They’re seeking advice and mental health support — sometimes with pretty bad outcomes.

Beyond that, there are also clinical psychologists who are actively working to develop evidence-based, tested therapeutic AI chatbots. Those are still in development. There’s a lot of work being done, but there’s broad consensus right now that any use of those tools would be as an adjunct to human-based therapy and would require human oversight. How those may develop over the years — it’s all moving so fast.

Alexis Madrigal: Yeah. For those who might be thinking, “Wait, people are talking to chatbots as therapists?” — there was a study that came out recently showing that about half of current large language model or chatbot users have gone to one of these tools for psychological support. So we know this is happening.

Let’s talk about this specific battle at Kaiser. A crucial part of this story is the language involved. What’s at stake there?

April Dembosky: That’s right. Kaiser therapists — the union of mental health workers at Kaiser — bargain a new contract with Kaiser every two to four years. I’ve covered several of these negotiations over the years. It’s usually about what you’d expect: wages, benefits, working conditions, patient protections.

This year, I was really surprised that one of the first sticking points in negotiations was AI. That was interesting because the sister union representing Southern California therapists had just ratified a contract in May. They secured a provision saying that any AI tools introduced could be used only to assist therapists, not replace them. Both sides agreed to that language.

So when Northern California therapists went into bargaining this summer, they assumed they could secure the same provision. And Kaiser said no.

Alexis Madrigal: Vanessa, talk to me about what Kaiser has said to you, or how they’re thinking about this. We have a statement from them, which I’ll read in a moment, but what’s your perspective?

Vanessa Coe: Sure. What Kaiser is telling us is that they’re refusing to accept any language that would stop them from deploying technology in a way that could replace therapists. At the bargaining table, they’ve said they want flexibility to dramatically increase the use of AI protocols and potentially use those protocols to replace therapists. That’s what we’ve continued to hear.

Alexis Madrigal: Do you see any potential for these tools?

Vanessa Coe: We’re not opposed to technology enhancing the quality of patient care. But we already have major concerns. Kaiser has already started implementing AI in its behavioral health system.

Right now, when patients reach out for mental health care, they’re no longer guaranteed to be triaged by a therapist trained to ask the right questions and understand the nuance and complexity of patient responses. Instead, people get a prompt, and the AI decides their next step.

Our members are telling us — and they’re telling Kaiser — that they’re seeing patients who should have been seen right away, or who were assigned to the wrong level or specialty of care. That’s really concerning.

Alexis Madrigal: Kaiser gave this statement to April, saying these tools, quote, “hold significant potential to benefit health care by supporting better diagnostics, enhancing patient-clinician relationships, optimizing clinicians’ time, and ensuring fairness in care experiences and health outcomes by addressing individual needs.”

April, when I read that, it feels like part of this is about access. Mental health care can be really hard to find, especially on the timelines people need. Are you seeing that argument in this space — that even imperfect AI might be better than the status quo?

April Dembosky: Yeah. Across the industry, there’s an incredible shortage of mental health providers. Demand is huge, and there aren’t enough services to go around. Kaiser has been dealing with this for more than a decade. They’ve been cited and fined by state regulators multiple times for having excessively long wait times — historically four to six weeks.

Kaiser has tried contracting with outside networks of therapists to get people into care sooner. And when you look at administrative uses of AI, therapists can spend two to three hours a day just catching up on notes. I’ve heard people call it “pajama time” — work you do after putting your kids to bed.

So Kaiser sees an opportunity: if AI could save therapists two to three hours a day on paperwork, that time could potentially be used to see more patients. You can see how they view this as a way to expand access.

Alexis Madrigal: Vanessa, what kind of guardrails would you like to see around this technology?

Vanessa Coe: We’re asking that Kaiser consult therapists when implementing any kind of AI. They should talk to the people actually providing the care. We want implementation to be informed by clinicians, evidence-based, and thoughtful — not centered on efficiency or cost, but on what’s actually good for patients.

Alexis Madrigal: What about privacy implications? How is the union thinking about that?

Vanessa Coe: We’re definitely concerned about what’s being recorded, where that data lives, what happens to it, and how AI interprets and outputs that information. Those are serious concerns for us.

Alexis Madrigal: Are these tools being built in-house, or is Kaiser contracting with outside companies?

Vanessa Coe: We’re not totally sure.

Alexis Madrigal: Which makes it harder to assess those privacy risks.

Vanessa Coe: Exactly. We’re also really concerned about the use of chatbots as a form of therapy. When we talk about guardrails, that’s part of it too.

Right now, many people can’t access consistent therapy through Kaiser, so they’re paying out of pocket — sometimes two hundred dollars a session. Our concern isn’t that there’s a shortage of therapists overall in this state. It’s that there’s a refusal to reimburse fairly and provide adequate working conditions.

If we want quality therapy, we need working conditions that make it possible. And Kaiser has sixty-seven billion dollars in reserves. They could be making better decisions around human-centered mental health care.

Alexis Madrigal: We’ve been talking about the ways AI may be entering our health care system. Vanessa Coe is secretary-treasurer of the National Union of Healthcare Workers. Thanks so much for joining us.

Vanessa Coe: No problem.

Alexis Madrigal: We’re also joined by April Dembosky, health correspondent with KQED News. And we want to hear from you. Are you a mental health provider? Are you using AI in your work? How are you thinking about it?

Call us at 866-733-6786. That’s 866-733-6786. Or email forum@kqed.org. Stay tuned.
