
Newsom’s Tightrope Walk Between AI Regulation and Silicon Valley Cash


Governor Gavin Newsom speaks at the Google offices in San Francisco on August 7, 2025. President Donald Trump’s calls for a light regulatory touch on AI don’t appear to be intimidating California lawmakers as they push a flurry of bills.  (Tayfun Coskun/Anadolu via Getty Images)

As Gov. Gavin Newsom eyes a potential run for the White House, he faces a political challenge on the homefront. Roughly 30 AI-related bills are moving through the state Legislature in the last weeks of this legislative session, and it’s estimated a dozen or so will land on Newsom’s desk.

Does he veto all or most of them to mollify Silicon Valley donors? Or does he defy President Donald Trump’s industry-friendly light touch and model a tougher state stance on regulation?

At an Aug. 7 event announcing AI training partnerships with Adobe, Google, IBM and Microsoft, Newsom told reporters he is trying to establish a middle ground that provides guardrails for public safety without squelching innovation: “We’ve led in AI innovation, and we’ve led in AI regulation, but we’re trying to find a balance.”

Newsom noted that, even though he vetoed the most controversial bill of the last legislative session, he ultimately signed 18 AI-related bills into law, addressing everything from training data transparency to deepfakes.

Industry voices, chief among them OpenAI’s Chief Global Affairs Officer Chris Lehane, continue to lobby against binding regulation at all levels of government. Lehane often appears to be speaking to California in his public posts.

Assemblymember Rebecca Bauer-Kahan on Feb. 18, 2020. (Eli Walsh/Bay City News)

“Imagine how hard it would have been for the US to win the Space Race if California’s aerospace and tech industries got tangled up in state-by-state regulations impeding the innovation of transistor technology,” Lehane wrote pointedly on LinkedIn last week.

But if Silicon Valley lobbyists see the industry in a “space race” against foreign adversaries, it’s not clear whether the general public shares that sense of urgency for technological advancement at any cost. An Axios/Harris poll released in early June 2025 showed strong majorities of Americans across all age groups want companies to take AI development slowly to “get it right the first time, even if that delays breakthroughs.”

Signing some of the AI bills that make it to his desk could also provide Newsom with an opportunity to stick a thumb in the eye of Trump. The challenge for Newsom is choosing which of the roughly 30 bills making their way through Sacramento are least likely to upset his supporters in Silicon Valley.

Assemblymember Rebecca Bauer-Kahan, D-Orinda, is arguably the foremost advocate for AI regulation this legislative session, having authored six AI bills still in play:

  • AB 222 — Would require data centers that power AI to disclose how much water and electricity they use, so the public can see their environmental footprint.
  • AB 412 — Would require AI developers to disclose when copyrighted works were used to train their models and give rights holders a way to check and challenge that use.
  • AB 621 — Would let victims, including minors, sue creators and facilitators of non-consensual deepfake sexual material and increase the damages they can collect.
  • AB 1018 — Would regulate how AI-driven decision systems are developed, tested, disclosed, audited, and appealed when used to make consequential decisions about people.
  • AB 1064 — Would ban AI chatbots from manipulating children into forming emotional attachments or harvesting their personal and biometric data.
  • AB 1405 — Would require AI auditors to enroll with the state, follow conflict-of-interest and reporting rules, and create a system for the public to report auditor misconduct starting in 2027.

When pressed to pick a personal favorite on the list, Bauer-Kahan chose AB 1064, the Leading Ethical AI Development for Kids Act.

“I have teenage children. You know, I live it every day,” Bauer-Kahan said. “We have to step in and make sure these companies are doing right by our children.

“I’m driven mostly as a mother, and I think that Gov. Newsom is driven as a father,” she said. “You know, his children are going to grow up in the AI age, as are mine. And I think he wants a safe environment, and I’m hopeful that will lead him to find balance between what industry needs and what the public needs.”

Research shows that about 70% of teens use at least one kind of AI tool. (Tatiana Meteleva/Getty Images)

Bauer-Kahan said the long game for AI legislation requires a careful effort to define terms so it’s not easy for companies to sidestep the mandate, and to protect mechanisms of enforcement so the law has teeth. For instance, if a child suffers actual harm as a result of the use of a covered product, AB 1064 allows that child, or a parent or guardian, to sue.

But industry critics say the bill is misguided and full of issues that could lead to unintended consequences.

“AB 1064’s definitions are drafted so broadly that they could unintentionally capture almost all chatbot tools, even basic customer service functions, and would require invasive age-verification to maintain functionality … It risks limiting minors’ access to lawful and beneficial AI tools — raising significant First Amendment concerns,” Robert Boykin, executive director of California and the Southwest for the trade group TechNet, wrote in an email to KQED.

The bill could also spur costly litigation and create substantial regulatory uncertainty, Boykin said.

But a bill that gives individuals no right to sue is a bill without teeth, according to many who watch Sacramento politics. The State Attorney General’s Office has limited budget, staff and attention to carry the entire burden of enforcement on its proverbial shoulders, especially when it’s busy pursuing more than 37 lawsuits against the second Trump administration.

After congressional Republicans rejected a push to ban states from regulating artificial intelligence for a decade in the One Big Beautiful Bill Act, Trump appears to have signaled his intent to roll out a backdoor ban with his AI Action Plan. The executive orders include language like, “AI is far too important to smother in bureaucracy at this early stage, whether at the state or Federal level.”

The implied threat: back off AI regulation, or federal funding could be cut.

“It seemed to be using money as a cudgel, but I’m not sure that will be effective here, or that’s something that really would bother us, or prevent us in California from moving forward,” state Sen. Josh Becker, D-Menlo Park, said.

Becker has introduced two AI bills this session. SB 243 would require companion chatbots to regularly remind users that they are not talking to a person, in order to reduce the risk of emotional manipulation or unhealthy attachment. SB 468 would require AI developers to design and disclose their own security measures to protect personal data, subject to state oversight.

Becker, a former businessman, said he works with industry lobbyists to hash out language companies consider feasible, if not unobjectionable. In the case of SB 243, Boykin of TechNet acknowledged Becker’s constructive engagement, adding, “Our goal has always been to narrow some of the sweeping provisions — particularly around the definition of companion chatbots, reporting obligations, and costly third-party audits.”

But Becker said California lawmakers are also well practiced in battling Silicon Valley lobbyists intent on killing or neutering legislation. “Even before Trump, there [was] a bipartisan effort to preempt our privacy laws here in California. So that’s gonna be a constant fight,” Becker said.

He added that he speaks with people who work in Silicon Valley who tell him they want regulation.

“My anecdotal experience is that there’s much more conversation by the people who work at these companies offline, about the potential impact of these technologies, than you’ll hear out of the communications people,” Becker said.
