The collaboration began with good intentions. In early December, Waymo used Austin Independent School District buses to test a range of light patterns and conditions, bringing cameras and equipment so the company's software could better learn to recognize and respond to school bus stop arms and flashing lights. It was meant to be a fresh approach to a growing problem.
By mid-January, it became clear the training exercise had failed. New video showed Waymo's driverless vehicles had again been caught illegally passing stopped school buses in Austin, weeks after the company said it had updated its software to fix the issue and filed a voluntary recall. Austin ISD issued citations as recently as January, and four of the violations occurred after December 10, the date Waymo issued the recall.
The scope of the problem is substantial. KXAN obtained and viewed videos showing at least 24 instances this school year in which district bus cameras recorded Waymo autonomous vehicles passing while the stop arm was out, counted among more than 7,000 violations the district issued to drivers this year. Waymo vehicles represent just 0.3% of that total; the rest were human drivers. The issue is not the volume but the nature of the failure. The videos of Waymo's stop arm violations do not all look the same: in some, the vehicle drives past the bus's first stop arm and stops abruptly alongside the bus before passing the second stop arm; in others, the Waymo stops briefly and then, seconds later, maneuvers around the bus while the stop arm is still deployed.
What makes these incidents particularly troubling is the presence of children. In seven of the videos reviewed, children could be seen in the frame. State law requires all vehicles to come to a complete stop while school buses are loading and unloading students, a rule that applies in every American state.
Waymo's response has been framed as a learning process. A Waymo spokesperson told KXAN the company has seen material improvement in its performance since the software update. The company has also pointed to its broader safety record: Mauricio Peña, Waymo's chief safety officer, said the company safely navigates thousands of school bus encounters weekly across the United States, noted that none of the events in question involved a collision, and expressed confidence that its safety performance around school buses is superior to that of human drivers.
Yet the district has made its position clear. A KXAN investigation found that for nearly a month AISD had been asking the self-driving car company to cease operations during the hours when students are loading and unloading from school buses, and Austin ISD Transportation Director Kris Hafezi said Waymo had not ceased operations as AISD Police Chief Wayne Sneed asked. Waymo voluntarily recalled software in more than 3,000 of its vehicles but has not stopped operating in the area.
The failures have prompted serious federal scrutiny. The National Transportation Safety Board has opened an investigation into Waymo robotaxis following the series of incidents in Austin, Texas, with NTSB investigators traveling to Austin to gather information on cases in which the automated vehicles failed to properly yield to buses with lights flashing and stop signs activated. The probe could take 12 to 14 months to complete, though a preliminary report will be released within 30 days.
Beyond Austin, the problem extends elsewhere. NHTSA opened an investigation prompted by a viral cell phone video from September showing a Waymo in Metro Atlanta failing to remain stopped when approaching a school bus with red lights flashing, the stop arm deployed and the crossing control arm extended; the Waymo passed the entire left side of the bus while students were getting off on the other side.
The Austin case reveals something fundamental about autonomous systems: they can log millions of miles of driving experience yet struggle with a rule human drivers learn in basic driver education. Peña said the company is proud of its safety record but that holding itself to the highest safety standards means recognizing when behavior should be better; after identifying a software issue that contributed to the incidents, the company filed a voluntary software recall with NHTSA and says it will continue analyzing vehicle performance and making necessary fixes. The collaborative approach with Austin schools suggested Waymo was serious about improvement. The fact that violations continued after that collaboration raises harder questions about whether the technology can reliably be trained on rules of this critical importance.