The Waymo School Bus Problem
The "better than a human driver" principle applies, but not the way Waymo wants it to
Waymo robotaxis are blowing past stopped school buses with children present. (Also see reports from KXAN Austin.) The school buses have red lights flashing, swing-out stop signs, and even red flashing lights on the stop signs during dark hours. This problem has been caught on video dozens of times, and there is every reason to believe that this is the tip of an iceberg that extends to locations that do not have automated enforcement cameras mounted on their school buses. (There are credible reports from Atlanta of this issue as well.) We have twice been promised that Waymo fixed the problem, and yet it is still happening.
Screenshot of video from an Austin WFAA TV report, which notes Waymo just expanded their service area in Austin.
The Austin Independent School District (ISD) has called for Waymo to stand down operations during school bus operational hours until they are sure this problem has been fixed. According to the ISD Police Chief regarding Waymo’s response: “They did not agree with our risk assessment and respectfully declined to stop operating.”
Waymo’s response to an Austin TV station on the latest school bus violation: "Our vehicles have 12x fewer crashes involving injuries to pedestrians compared to human benchmarks, and we're invested in demonstrating exceptional driving performance around school bus interactions that exceed human-driven vehicles. We have met with Austin ISD, including on a collaborative data collection of various light patterns and conditions, and are reviewing these learnings. We have seen material improvement in our performance since our software update."
Translation: Waymo claims they’re trying really hard, and they are really safe, so we should ignore them choosing to continue to put school children in avoidable danger. When I read that response to a problem that has occurred dozens of times despite at least two failed attempts to fix it, I feel that Waymo cares more about expanding operations and having a rosy story to tell its investors than children’s safety.
It is also important to note the rhetorical sleight of hand here. Go back and read the quote a couple paragraphs back. Did you come away thinking that they “exceed human-driven vehicle” performance around school buses? But … that’s not what they are really saying. They merely claim they have “invested” in that capability. Not that they have achieved it. I have found that a similar degree of analytic care is required when parsing any of their safety claims and mishap reports.
As we shall see later, Waymo’s real-world performance at stopping for school buses is shockingly bad compared to human drivers according to the metric data we can find. “Material improvement” (whatever that might mean) might sound impressive, but that’s easier to get when you’re 10x worse than you need to be, and the claimed improvements have not stopped the flow of incidents.
Details of the Situation
The timeline is complicated, but involves problems in Atlanta starting with the new school year in September 2025, a purported fix in November, more problems, a NHTSA recall on December 10, and even more problems. The most recent incident caught (thus far) seems to be on Monday January 12, 2026. There are plenty of videos, providing a clear record of the violations.
The explanation in a NHTSA recall given by Waymo is: “Instances of proceeding again before the school bus had deactivated flashing lights and/or retracted the stop arm could occur if, while yielding to the school bus, the ADS determined that it may be impeding the school bus or another priority vehicle, and then reasoned that it should proceed in order to cease impeding the other vehicle.”
This explanation simply does not match up with the videos. Many of the incidents did not involve the Waymo visibly stopping, or even slowing, so they do not seem to be covered by this description at all. At least one stop did not happen until the vehicle was almost past the bus. (We note that Waymo did not actually claim they stopped in time to obey the law in their explanation.) Stop & go incidents generally do not involve a Waymo repositioning to avoid blocking the school bus. Moreover, moving while the stop indicators are active is dangerous regardless of whether the school bus is being blocked. The bus is not going anywhere until after closing its door and turning off the indicators, so there is no reason to move.
This is yet another instance of Waymo’s government/public relations staff deflecting accountability and transparency by telling only part of the story, and using rhetorical tactics to seem to claim more than they actually do upon a closer critical reading. They’re covering for a three-time failure of the engineering team to achieve reasonable operation around stopped school buses.
Waymo was observed to react to this problem only when the public pressure started. And even then they have failed to solve the problem after a software update and then a later recall. We are not talking about a one-off incident. We are talking about dozens of documented issues happening over an entire school semester, now spilling into the next one.
What Waymo Should Be Doing
Waymo can still do the right thing and stand down operations around school buses. That will allow them to maintain public trust and grow their business. Hopefully in the fullness of time they will deliver on the significant safety and societal benefits they promise. But those aspirational benefits do not forgive their behavior in this situation.
Apparently, Waymo is fine taking the risk of hitting children by not stopping for school buses as required by law. That’s just wrong.
This situation exposes the Waymo PR talking point of “The Waymo Driver is designed with safety as its top priority” as the literal chatbot script and bromide it has always been. (BTW, this does not actually say safety is the company’s top priority, but rather that it is a design team goal that might or might not have been met.) Their decision to keep operating even as Austin ISD begged them to stop for the safety of their school children, while incidents grind on week after week, is clearly prioritizing operations over safety.
A robotaxi driving past school buses performing student loading/unloading is one of the most clear-cut road safety violations there is. Don’t do it. Period. Whatever need you might have to move can wait until the school bus door is closed and the indicators turn off.
A key to what Waymo should do hinges on the concept of an Operational Design Domain (ODD). The ODD is the set of intended operational design limits on the system. As examples, parts of a city might be within the ODD, and other parts not. Local roads inside the ODD, highways outside the ODD. Sunny days inside, blizzards outside. And so on. Inside the ODD the robotaxi is supposed to work safely, and outside there is no expectation of continued operations (but safety must still be assured).
At some point early in the dozens of documented incidents, Waymo should have realized that it was treating stopped school buses as inside its ODD, but was getting them wrong. It should have then put stopped school buses outside its validated ODD pending future design improvements. That in turn means making sure to avoid operating near stopped school buses to the degree possible, and doing a safe operational shut-down if one is encountered.
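To make that concrete, here is a minimal sketch of what an ODD guard for stopped school buses could look like. This is written in Python with hypothetical names and data structures; it does not reflect Waymo's actual (non-public) architecture. The idea: if perception reports a nearby school bus with its stop arm out or red lights flashing, treat the situation as outside the validated ODD and execute a minimal risk maneuver rather than trying to reason about who is impeding whom.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    CONTINUE = auto()
    MINIMAL_RISK_MANEUVER = auto()  # safe stop / pull over, request remote assistance


@dataclass
class DetectedBus:
    """Hypothetical perception output for a nearby school bus."""
    distance_m: float
    stop_arm_extended: bool
    red_lights_flashing: bool


def odd_guard(buses: list[DetectedBus], exclusion_radius_m: float = 100.0) -> Action:
    """Treat any nearby school bus showing stop indicators as outside the ODD.

    Deliberately conservative: no attempt to reason about whether the robotaxi
    is "impeding" the bus, because the bus is not moving until its door closes
    and the indicators turn off.
    """
    for bus in buses:
        if bus.distance_m <= exclusion_radius_m and (
            bus.stop_arm_extended or bus.red_lights_flashing
        ):
            return Action.MINIMAL_RISK_MANEUVER
    return Action.CONTINUE


if __name__ == "__main__":
    scene = [DetectedBus(distance_m=35.0, stop_arm_extended=True, red_lights_flashing=True)]
    print(odd_guard(scene))  # Action.MINIMAL_RISK_MANEUVER
```

The point of the sketch is not the particular threshold. It is that the conservative rule is simple to state: if the indicators are on, do not move.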
Waymo should stand down operations in the vicinity of school buses involved in student load/unload until they can prove to school districts that they have this problem completely solved.
Avoiding school buses might be done by avoiding times they are active for school pickups and dropoffs, as requested by Austin ISD. But it might also mean avoiding school buses themselves somehow (Tracking school bus locations? Avoiding known school bus routes at those times? Avoiding scheduled school bus stops near scheduled stop times (every school district has this list)? Treating all stopped school buses as loading/unloading regardless of indicators? I’m sure there are more ideas. Waymo has smart people who can figure something out that is better than what we are seeing on the roads.) The technical approaches are varied in both complexity and practicality, but finding one that works for them is Waymo’s problem if they can’t handle student safety properly yet.
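To illustrate how little exotic technology some of these options would require, here is a rough sketch of geofencing published school bus stops during their scheduled time windows. The data format, function names, and thresholds are hypothetical; real district feeds would differ.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from math import radians, sin, cos, asin, sqrt


@dataclass
class ScheduledStop:
    """One entry from a district's published bus stop schedule (hypothetical format)."""
    lat: float
    lon: float
    scheduled: datetime


def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two lat/lon points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6_371_000 * 2 * asin(sqrt(a))


def near_active_bus_stop(
    lat: float,
    lon: float,
    now: datetime,
    stops: list[ScheduledStop],
    radius_m: float = 200.0,
    window: timedelta = timedelta(minutes=15),
) -> bool:
    """True if the vehicle is close to a published stop near its scheduled time."""
    return any(
        haversine_m(lat, lon, s.lat, s.lon) <= radius_m
        and abs(now - s.scheduled) <= window
        for s in stops
    )
```

A routing layer could use a check like this to steer clear of those zones entirely during pickup and dropoff windows, independent of whether the vision stack recognizes the bus.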
If Waymo keeps operating near school buses, they are putting children at unnecessary risk. Eventually they will be able to fix the problem. But until then, every encounter with a school bus is rolling the dice. If they (and a child, and the child’s family) get unlucky, that is entirely on Waymo’s head. There is no technical reason to keep operating in this reckless manner.
Waymo is making an explicit choice to gamble with children’s lives. They say they like the odds, but it is not their gamble to make.
Deeper Implications
The criterion here is not perfection for all road situations. Rather, the criterion needs to be that:
For any specific situation, a robotaxi should behave at least as safely as a careful and competent human driver would in that same situation.
A robotaxi that cannot pull that off in clear-cut situations which would be well under the control of a reasonable human driver is not fit to be operating. Waymo needs to restrict their ODD accordingly so that they can act properly in their chosen operational environment (and safely stop operations if the robotaxi finds itself outside its ODD). For now, the ODD needs to exclude school bus load/unload scenarios.
There are deeper implications here for Waymo safety in general that stakeholders should heed. For those steeped in system safety, the school bus stopping incidents amount to a collapse of Waymo’s safety argument. Whether the credibility of Waymo’s safety argument can be rehabilitated is unclear.
Two failed attempts at fixing the problem mean we cannot trust Waymo’s vaunted safety engineering and validation process to keep problems from showing up in deployed vehicles, including problems they have specifically attempted to fix. Why should we believe, based only on Waymo’s word, that the next time it will really be fixed? Waymo needs to get its validation house in order. Other stakeholders are justified in being skeptical of claims that future problems have been fixed without concrete evidence supporting such a claim.
This seems to be a problem that has suddenly appeared in the past few months. One can speculate it is associated with a major software change to end-to-end machine learning (Waymo has been cagey about what is really happening there). This puts arguments of millions of miles of experience in doubt. Most of their claimed 100+ million miles of operation were on different software versions, making claims that those miles predict the safety of the current software version questionable. It has been obvious to experienced safety engineers from the start that this sort of failure of an existing capability would eventually happen. But here we have tangible proof of the illusory nature of the Waymo “look at all them miles” safety argument.
Waymo should have known that school bus incidents were happening before someone managed to catch it on camera in Atlanta, but did not act until there was public pressure on them to fix it. And apparently didn’t really take it all that seriously until Austin ISD launched a name and shame campaign. (Why no recall for the first fix?) If Waymo is operating in your city, you should seriously consider how you plan to document and put pressure on them to behave safely when other issues happen in your town. (In fairness, this probably applies to all robotaxi companies. Waymo is just the one in the news these days.)
As noted earlier, this is another lesson in not taking anything Waymo says about safety without doing a very careful rhetorical analysis of what they say — and what they don’t say. If they leave space for ambiguity, in my experience it is reasonable to assume that any empty space contains something unfavorable to their claim of safety, or they would not have created the ambiguity.
For those into the technical detail, I have been predicting for a while that end-to-end machine learning will do better at nominal functionality, but will prove unruly when trying to clean up edge cases. This might turn out to be that problem showing up in practice. Waymo really seems to be struggling to get the behavior right, going so far as to create additional edge case training data for it. We’ll have to see how this plays out.
The Apologists
As with any article pointing out that an automated vehicle company is acting badly, I expect to encounter apologists and what-about-ists. Here are a few of the usual excuses:
“But nobody was hurt”
The law is not that you can drive recklessly (or otherwise endanger others) so long as you get lucky. The law is to stop for school buses.
A human driver cannot use the excuse that they are an excellent driver and therefore should be trusted to break the law in a safe way when there is no exigent circumstance forcing them to do so. That applies to robotaxis as well.
“Saving lives”
Waymo’s consistent response in such situations is to explain that they are safer than human drivers, so we should ignore their avoidable unsafe behaviors. That’s ridiculous.
Being a safe driver does not give you a free pass to ignore the law. Imagine a human driver who contested a school bus ticket for one of these incidents by saying they had done this dozens of times and had not hit a child yet, so they should be allowed to keep driving dangerously. (You don’t get one free hit on a child until you are forced to obey the law.)
“People do it too”
Yes, human drivers go past school buses, but we were promised robotaxis that do not drive like terrible human drivers. When looking at the numbers, keep in mind there are a LOT more people than robotaxis.
Human drivers don’t get a free pass on blowing past school buses. (Some might get away with it, but the cameras on Austin ISD school buses are there to curb that problem.) They get traffic tickets with a fine that is meaningfully large to most drivers. They get points on their license. If they do it dozens of times I expect that they eventually lose their license. This is intended to serve as a corrective force, and it seems to be working pretty well.
From Austin: “We've issued more than 7,000 violations to vehicles passing our school buses, and 12, or in this case now, 24 [Waymo] violations does seem like a small drop in the bucket, but I think it's important to point out that, based on the statistics that we run of everybody that received or was issued a stop arm violation for the first semester, approximately 98% of people who received one did not receive a second one,” Assistant Chief Pickford said.
Yes, those 2% should lose their license if they keep repeating such offenses. But the other 98% get the message and do not repeat the offense.
More importantly, the relevant area in Austin has ballpark 700,000 residents with about two cars per household, while the whole Waymo fleet has 200 cars. The 24 Waymo violations work out to a (24/200) = 12% per-vehicle rate, whereas for human drivers it is more like a 1% problem overall, with few repeat offenders. This does not seem to reflect Waymo’s talking point of “exceptional driving performance around school bus interactions that exceed human-driven vehicles” quoted earlier.
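The back-of-envelope arithmetic, treating roughly 700,000 vehicles as a loose stand-in for the Austin human-driven fleet and the quoted semester totals as order-of-magnitude figures only, looks like this:

```python
# Back-of-envelope only, using the rough figures quoted above.
human_vehicles = 700_000      # ~700k residents, ~2 cars/household: car count in the same ballpark
human_violations = 7_000      # stop-arm violations issued over the semester
waymo_fleet = 200
waymo_violations = 24

human_rate = human_violations / human_vehicles   # ~1% per vehicle over the semester
waymo_rate = waymo_violations / waymo_fleet      # 12% per vehicle over the same period

print(f"human per-vehicle rate: {human_rate:.1%}")                 # ~1.0%
print(f"waymo per-vehicle rate: {waymo_rate:.1%}")                 # 12.0%
print(f"ratio (waymo / human):  {waymo_rate / human_rate:.0f}x")   # ~12x, in the wrong direction
```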
It seems that human drivers are both better behaved and more trainable than a Waymo robotaxi.
Parting Thoughts
It seems odd for Waymo to simultaneously make claims of more than 100 million miles of experience (more than a human driver by far) and yet beg for tolerance because they’re still learning how to obey a life-critical road rule. Waymo admits that “our behavior should be better,” apparently with regard to their vehicles, but the real behavior that needs to get better is that of the company itself.
Waymo needs to stop operating their vehicles in the vicinity of school buses until they are absolutely sure this is fixed. That includes using safety drivers in operations near school buses until they are sure they have a robust solution in practical operation. While they like to talk about how rigorous their software validation is, they have gotten it wrong three times so far (original, first fix, NHTSA recall), so special effort is required. School district stakeholders are justified in both their request for Waymo to stand down operations near school buses, and any desire to witness validation of the fourth try at getting it right.
Hopefully Waymo gets their act together soon. A school kid should not have to be hit by a robotaxi to underscore the point that vulnerable pedestrian safety should not be sacrificed on the altar of Waymo’s growth charts.
Update: Apparently Austin ISD has had a school bus tracking app since 2016. And school bus stops with estimated times are published by every school district I’ve ever dealt with. So it should not be a mystery to Waymo when and where a school bus is stopped for kids even if their vision algorithms can’t figure it out. There are plenty of ways to stop putting children at risk if Waymo wanted to do so.
“For a successful technology, reality must take precedence over public relations, for nature cannot be fooled.” - Richard Feynman, the Rogers Commission Report into the Challenger Crash, Appendix F, 1986.