A passerby called first responders after hearing a male juvenile “screaming for help,” according to the police report. The officer who responded overheard the passenger saying the Waymo “told him to get out of the vehicle, even though it was in the middle of the street.”
A Waymo spokesperson told The Washington Post that the teen opened the door while the vehicle was traveling at 35 mph and attempted to exit before it had come to a complete stop.
“Waymo — over the past few months — doesn’t have a great track record of being overtly transparent with their data,” said Billy Riggs, a professor at the University of San Francisco School of Management and the director of the Autonomous Vehicles and the City Initiative.
Riggs was referring to Dec. 22, when many of Waymo’s self-driving cars blocked San Francisco streets during a mass power outage, forcing the company to temporarily suspend service and raising questions about the autonomous vehicles’ ability to adapt to real-world driving conditions.
The vehicles, Riggs said, “are driving based on the rules of the road that we give them.” Waymos, he said, follow the speed limit in a school zone, unlike many human drivers.
“That collision would have been a lot more severe at a higher speed,” he added.
The Santa Monica crash happened the same day that the National Transportation Safety Board said it was opening an investigation into Waymo’s behavior around school buses in Austin. Austin Independent School District officials said in November that they had documented 19 cases of Waymos “illegally and dangerously” passing buses since the beginning of the 2025-26 school year.
Riggs said he’s looked into those cases and found Waymos were not entirely at fault in all the incidents. “Some of these situations are a little more complex,” he said. “Similar situations are being reported as if they were the same, and they’re not precisely the same.”
Additionally, he said, “The fleet learns as it scales, and so they can issue these patches, and it shouldn’t repeat the same error twice.”