
'How Are You Using AI?' Therapists Should Ask You That Question, Experts Argue


The ChatGPT app icon is seen on a smartphone screen, Monday, Aug. 4, 2025, in Chicago. (Kiichiro Sato/AP)

Increasingly, teens and adults are turning to artificial intelligence chatbots for companionship and emotional support, recent studies and surveys show. So, the argument goes, mental health care providers should ask whether and how their patients are using this technology, just as they ask about sleep, diet, exercise and alcohol consumption.

That’s according to a new paper out in JAMA Psychiatry.

“We’re not saying that AI use is good or bad,” says Shaddy Saba, an assistant professor at New York University’s Silver School of Social Work, “just like we wouldn’t say substance use is necessarily good or bad, [or] consulting with a friend about something is good or bad.”

However, learning about a person’s use of AI for emotional support and advice could provide valuable insight into someone’s life and mental health status, he says.

“Our job is to understand why people are behaving as they are — in this case, why they are seeking help from an AI system,” adds Saba. “And to learn about what it’s doing for them, what it’s not doing for them.”

Saba and his co-author’s recommendations are “very aligned” with recommendations by the American Psychological Association (APA) in a health advisory released in November of last year, says the APA’s Vaile Wright.

Asking what a patient is getting out of their conversations with an AI chatbot sets “a foundation for the therapist to better know how they are trying to navigate their emotional wellbeing and their mental illness,” says Wright.

“Treasure trove of information”

“People are using these tools on a regular basis to ask about how to cope with stressful experiences, personal relationship challenges,” explains Saba. 

And some are using chatbots for advice on how to cope with symptoms of anxiety and depression.

“To the extent that we can prompt our clients to bring these conversations, in increasing detail, even into the therapy room, I think there’s potentially a treasure trove of information,” he says.

It could be information about the main causes of stress in someone’s life, or if they are turning to a chatbot as a way to avoid confrontations.

“Let’s say, for example, you have a client who is having relationship issues with their spouse,” says the APA’s Wright. “And instead of trying to have open conversations with their spouse about how to get their needs met, they’re instead going to the chatbot to either fill those needs or to avoid having these difficult conversations with their spouse.”

That background will help a therapist better support the patient, she explains.

“Helping them understand how to have a safe conversation with their spouse, helping them understand the limitations of AI as a tool for filling those gaps in those needs.”

Discussing use of AI is also a chance to learn about things a client might not voluntarily share with a therapist, says psychiatrist Dr. Tom Insel, former director of the National Institute of Mental Health. “People often use the chatbots to talk about things that they can’t talk about with other people because they’re so worried about being judged,” he says.

For example, suicidal thoughts may be something a patient is reluctant to share with their therapist, but that is critical for the therapist to know to keep the patient safe.

Be curious, but don’t judge

When it comes to first broaching the subject with patients, Saba suggests doing it without any judgment.

“We don’t want to make clients feel like we’re judging them,” he says. “They’re just not going to want to work with us in general if we do that.”

He recommends therapists approach the topic with genuine curiosity, and offers suggested language for these conversations.

“‘You know, AI is something that’s kind of rapidly growing, and I’m hearing from a lot of people that they’re using things like ChatGPT for emotional support,” he suggests. “‘Is that the case for you? Have you tried that?'”

He also recommends asking patients specific questions about what they found helpful, so that therapists can better understand how a patient is using these tools.

It could also help a therapist figure out whether a chatbot can complement therapy in helpful ways, says Insel, such as to vet which topics to bring to their sessions or to vent about day-to-day life.

In a way, therapy and chatbots “could be aligned to work together,” says Insel.

Saba and his co-author, William Weeks, also suggest asking patients whether they found any chatbot interactions unhelpful or problematic, and offering to share the risks of using chatbots for emotional support.

One such risk is to data privacy: many AI companies use conversations, even sensitive ones, to further train their models.

There are also risks of treating a chatbot like a therapist, says Insel.

Talking with a chatbot about one’s mental health is “the opposite of therapy,” he says, because chatbots are designed to affirm and flatter, reinforcing users’ thoughts and feelings.

“Therapy is there to help you change and to challenge you,” says Insel, “and to get you to talk about things that are particularly difficult.”

Adopting the advice

Psychologist Cami Winkelspecht has a private practice working primarily with children and adolescents in Wilmington, Del.

She has been considering adding questions about social media and AI use to her intake form, and she appreciated that Saba's paper offered sample questions to include.

ChatGPT’s landing page on a computer screen. (Kiichiro Sato/AP)

Over the past year or so, Winkelspecht has had a growing number of clients and their parents ask her for help with using AI for brainstorming and other tasks in ways that don’t break a school’s honor code. So, she’s had to familiarize herself with the technology to be able to support her clients. Along the way, she’s come to realize that therapists and kids’ parents need to be more aware of how children and teens are using their digital devices — both social media and AI chatbots.

“We don’t necessarily think about what they’re doing with their phones quite as much,” says Winkelspecht. “And I think it’s pretty clear that we need to be doing that more and encouraging ourselves to have that conversation.”

Transcript:

JUANA SUMMERS, HOST:

Recent studies suggest that many Americans with mental health conditions now turn to AI chatbots for mental health advice. Now a new paper in JAMA Psychiatry suggests that therapists should regularly ask patients about their use of AI for emotional support, just as they ask about sleep, exercise and alcohol use. NPR’s Rhitu Chatterjee reports.

RHITU CHATTERJEE, BYLINE: These days, when people feel stressed or anxious, many reach for an AI chatbot like ChatGPT. It’s at their fingertips and easy. Study author Shaddy Saba is an assistant professor at New York University Silver School of Social Work.

SHADDY SABA: You know, people who are using these tools on a regular basis to ask about stressful experiences and how to cope with stressful experiences, personal relationship challenges.

CHATTERJEE: For example, anticipating a tough conversation with a boss or a friend.

SABA: How do I approach it? Do I say this? Do I say that?

CHATTERJEE: People also vent to chatbots and ask for ways to cope with anxiety and depression.

SABA: If they’re doing a back and forth with a chatbot about these things, they might be picking up on ideas of what might be helpful for them. They might also be, you know, exposed to ideas that might be less helpful for them.

CHATTERJEE: That’s why Saba and his coauthors suggest mental health providers ask clients about their use of AI.

SABA: The extent that we can prompt our clients to bring these conversations, you know, in increasing detail even, into the therapy room, I think there’s potentially kind of a treasure trove of information.

CHATTERJEE: Information about the main causes of stress in someone’s life, or whether they’re turning to a chatbot to avoid confrontations.

VAILE WRIGHT: Let’s say, for example, you have a client who is having relationship issues with their spouse.

CHATTERJEE: Psychologist Vaile Wright is with the American Psychological Association.

WRIGHT: And instead of trying to have open conversations with their spouse about how to get their needs met, they’re instead going to the chatbot to either fill those needs or to avoid having these difficult conversations with their spouse.

CHATTERJEE: Wright says understanding this background will help a therapist better support the patient.

WRIGHT: So helping them understand how to have a safe conversation with their spouse, helping them understand the limitations of the AI as a tool for filling those gaps and those needs.

CHATTERJEE: Talking about AI use can also help therapists learn about things that their patient might not voluntarily share with them. Psychiatrist Tom Insel is former director of the National Institute of Mental Health.

TOM INSEL: People often use the chatbots to talk about things that they can’t talk about with other people ’cause they’re so worried about being judged.

CHATTERJEE: For example, if they are having thoughts of suicide. He says discussing AI use also allows mental health providers to educate patients about the risks of using a chatbot like a therapist.

INSEL: Because it’s the opposite of therapy in so many ways, you know, they’re affirming. They may even be sycophantic.

CHATTERJEE: Which only reinforces a user’s thoughts and feelings.

INSEL: Therapy is there to help you change and to challenge you and to get you to talk about things that are particularly difficult.

CHATTERJEE: And helping people understand this can itself be transformative for their mental health in the long run. Rhitu Chatterjee, NPR News.

(SOUNDBITE OF MUSIC)
