Just as adults are using generative AI companion chatbots for solace and connection, children are, too. So, how can parents stay in the conversation? (Anna Fifield/The Washington Post via Getty Images)
For children and teenagers feeling anxious and alienated from their peers and adults, AI companion chatbots can mimic the human compassion they’re longing for.
Regulatory solutions have yet to materialize, but the California legislature is considering Senate Bill 243, introduced by Sen. Steve Padilla, D-San Diego, and soon to be heard in the Senate Appropriations Committee. The bill would require chatbot operators to implement critical safeguards to protect users from the addictive, isolating and influential aspects of AI chatbots.
In a 2023 letter, 54 state attorneys general from both political parties urged Congress to act.
“We are engaged in a race against time to protect the children of our country from the dangers of AI,” they wrote. “The proverbial walls of the city have already been breached. Now is the time to act.”
But if you’re a parent or caregiver whose child uses an AI chatbot — or wants to try one — how can you talk to them about the risks? That’s where human professionals can help. KQED reached out to experts, who’ve offered parents five key pieces of guidance.
Start with listening, rather than telling kids what to do
It’s natural for kids to be curious, said Vicki Harrison, program director for Stanford University’s Center for Youth Mental Health and Wellbeing, who has two teenagers, ages 13 and 15.
Younger children may struggle with the distinction between fantasy and reality, tweens may be vulnerable to parasocial attachment, and teens may use social AI companions to avoid the challenges of building and sustaining real relationships. (Courtesy of Common Sense Media and Stanford's Brainstorm Lab for Mental Health Innovation)
Harrison sympathizes with parents whose first inclination is to panic and demand their child delete the app, but adds, don’t do it. “’Cause they’re not going to ever tell you anything again if you react that way.” She encourages parents to approach the conversation with curiosity instead, even though she acknowledges this is easier said than done.
“If we’re coming in fearful, they’re going to go into a reactive space, and they’re going to want to defend themselves, because they already feel insecure and self-conscious,” said Laurie Cousins, a certified mindfulness teacher who works with children and families in Los Angeles. She also has two children, both now in college.
Both Harrison and Cousins recommend opening the conversation about AI companions by bringing up recent news stories, which makes clear to kids that they aren’t personally in trouble.
Cousins suggests sharing what you learned and asking open-ended questions, such as: “I was curious because I found out this. Do you know anything about that?”
Help your kids understand how AI companion chatbots take advantage of human wiring
AI models don’t “understand” emotions the way humans do. They recognize and respond to textual cues learned from processing massive amounts of data: past conversations, interactions with therapists, therapy-focused websites, and random advice found online on platforms like Reddit.
“We all want, but especially the primitive parts of us, want to feel in control,” Cousins said about the way humans respond to companion chatbots, adding, “Our dopamine receptors are firing, and the oxytocin is firing in the way that it feels relational, it feels like a positive reinforcement.”
Adults with some self-awareness of their own mental health struggles and personal history might be able to course correct when a companion chatbot fails to pick up on signs of depression, anxiety, ADHD and the like — and when it either affirms ideas ungrounded in reality, or encourages risky behavior in real life. Children often don’t have that self-awareness, and may not understand how some of the interactions they’re asking for may be damaging to their mental health.
“Adolescents are in the emotional brain, right?” Cousins said. “It’s not that they don’t have wisdom. They’re so flooded with emotions and maybe don’t have that risk-averse wisdom yet. Parents say, ‘What were you thinking?’ Well, they weren’t thinking.”
Additionally, knowing intellectually that software is designed to emotionally manipulate users into maintaining engagement is not the same thing as those users being able to control their emotional response to it, especially when they’re still developing their sense of identity, connection and belonging.
Cousins also suggests sharing with your kids — in an age-appropriate fashion — that companion chatbots are developed by companies with a profit-seeking motive. You could also explain to them that the companies behind the chatbots are likely sharing or selling all sorts of personal data, given that they are not bound by the same privacy laws as human therapists.
Model the kind of healthy human relationships you want your child to emulate
Whether it’s speaking kindly to a cashier at a store, joining sports leagues, or attending cultural events, Cousins argues parents need to show their children how to relate to other people. “That’s how we feel safer in community, in society, is when we feel we’re relating with one another.”
But that may require adults to address their own habits, given how digitally dependent we’ve all become.
“You and I know the ‘before times.’ We know that it’s possible to interact in the world without chatbots, without googling everything,” Harrison said.
Children, not so much. Increasingly, it’s difficult for all of us to avoid chatbots of one kind or another. “I’m slightly alarmed, being in Silicon Valley, just how prevalent AI has become. We’re unleashing it into the world regardless of the consequences.”
One idea is to encourage kids to make eye contact with others by modeling it yourself. “They need to see that someone is giving them eye contact, that’s meeting them,” Cousins said.
Keep an eye on how much time your child spends staring at a screen
Harrison urges parents to “scaffold” access to digital devices. For instance, rather than handing a smartphone to a child on their tenth or twelfth birthday, “maybe start with a not-so-smart phone, maybe only approve one app at a time.”
Harrison adds that parents can create family media plans and agreements and get their kids’ buy-in. “‘OK, here’s a new responsibility. Here’s my expectations of how you’re going to use it. If you want more privilege, you have to agree to use it in a certain kind of way.’”
Parents play an active role in managing screen time by setting clear boundaries — like approving apps individually — and involving kids in family media plans to build healthy habits early. (Chandan Khanna/AFP via Getty Images)
Cousins recommends periodically spot-checking the histories on children’s devices — at least, until they figure out how to delete their histories. But she also recommends parents monitor the amount of time children spend staring at a screen, regardless of whether they’re interacting with humans or not.
“I’ve worked with young people who are gaming all through the night and going to school with two hours of sleep. That’s a dependency, right? That’s an addiction. I’m saying to the parents, ‘Why do you have this in their room?’”
She argues responsible parenting means setting boundaries, likening digital dependency to an itch. Scratching the itch doesn’t cure the bite or the rash, but inflames both the itch and the urge to continue scratching it. “You got a big dopamine dump and now you’re chasing it, you know?”
In response to a recent report from Common Sense Media and Stanford researchers raising the alarm about minors using AI companion chatbots, a CharacterAI spokesperson wrote to KQED, “Banning a new technology for teenagers has never been an effective approach — not when it was tried with video games, the internet, or movies containing violence.”
That said, the company delivers a specialized version of its large language model to 13–18 year olds. Among other things, this version alerts children once they have spent an hour on the platform.
Seek out additional human resources
allcove A free, welcoming space for youth ages 12–25, developed originally in Australia and now growing across the U.S., including locations in the Bay Area. Think of it as a “one-stop shop” offering mental health support, physical care, peer counseling, and help with substance abuse.
Common Sense Media The nonprofit provides independent, age-based reviews, ratings, and advice for parents, caregivers, and educators about media and technology for children and teens. It covers movies, TV shows, books, video games, apps, websites, and even TikTok and YouTube channels.
Bad Ass Girls An empowering mentorship and community program for preteen and teen girls. It’s built to help them explore confidence, connection, and emotional wellness through guided support and real talk.
#GoodforMEdia This is a youth-led program where teens help other teens make smart, thoughtful choices about their digital lives. It’s like having a mentor who actually understands how tricky social media can be — because they’re living it too.
Greater Good in Education A thoughtful collection of research-based articles, tips, and newsletters for parents and educators. Created by the Greater Good Science Center at UC Berkeley, it focuses on emotional well-being, mindfulness, and building empathy in young people.
Soluna Free for youth in California, this app offers mental health tools, mood tracking, and direct access to counselors. A great option for teens who might not feel ready to talk in person but still need support.
Teenline Teens everywhere can talk or text with trained teen volunteers. It’s anonymous and especially helpful for teens who feel more comfortable opening up to peers.
Children and Screens A national nonprofit based in D.C. that offers an “Ask the Experts” series written with parents in mind — full of clear, science-backed tips.