
On Monday, the NHTSA publicly announced it had launched a preliminary investigation into approximately 2,000 Waymo robotaxis following an incident that happened about a month ago. A fully autonomous vehicle, with no safety driver onboard, failed to stop for a school bus with flashing red lights and an extended stop arm as students disembarked.
The robotaxi initially paused beside the bus, then drove around the front, passing both the stop arm and the crossing control arm. With Waymo logging roughly two million miles per week, the agency warned the “likelihood of other prior similar incidents is high,” raising urgent questions about autonomous vehicles in school zones.
Who Is Waymo?

Waymo LLC, led by co-CEOs Dmitri Dolgov and Tekedra Mawakana, is the autonomous driving technology company at the center of this investigation. Headquartered in Mountain View, California, Waymo is a subsidiary of Alphabet Inc., which also owns Google and other tech ventures. As of June, the company employed about 2,500 people, including 1,474 engineers, and operated more than 1,500 vehicles.
The National Highway Traffic Safety Administration, led by Administrator Jonathan Morrison, opened the probe under case PE25013.
What Actually Happened?

On September 22, 2025, a Waymo robotaxi in Atlanta failed to stop for a school bus with flashing red lights, an extended stop arm, and a crossing control arm deployed. The vehicle initially paused, then slowly drove around the bus, passing where students were disembarking.
Waymo says the vehicle approached from an angle at which the bus’s signals were obscured, and that it maintained a “safe distance from children.”
Why Did NHTSA Open an Investigation?

NHTSA opened the preliminary investigation on October 17, citing the fleet’s high operational mileage of roughly 2 million miles per week and the likelihood of similar incidents. The agency is evaluating whether Waymo’s fifth-generation automated driving system (ADS) complies with federal and state laws governing school bus stops.
The investigation is unprecedented: it is the first federal inquiry into an autonomous vehicle’s failure to comply with school bus safety laws. Could it redefine the rules for all self-driving fleets?
Legal Context and Stakes

All 50 states and D.C. require drivers to stop for school buses. In Georgia, “Addy’s Law” imposes fines of up to $1,000, up to 12 months in jail, and six license points for violations. Children are statistically most vulnerable while boarding or exiting buses: school-bus-related crashes kill 1.6 times more child pedestrians than bus passengers.
The September 22 incident highlights the tension between emerging AI technology and age-old child-protection laws. Will autonomous systems face stricter legal standards in the future?
How Did the Vehicle Fail to Stop?

The Waymo robotaxi encountered an occlusion: the bus’s flashing lights and stop arm were blocked from its sensors’ view. The vehicle paused, then edged forward slowly to gain visibility, passing students while maintaining what the company described as a “safe distance.”
This technical “blind spot” raises urgent questions about the limits of machine vision. If AI misses one of the most universal danger signals, what else might go unseen?
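To make the failure mode concrete, here is a minimal, hypothetical sketch of a fail-safe policy for occluded school bus signals. This is not Waymo’s planner; every name and structure below is invented for illustration. The key design choice is that an occluded signal is treated as an active one, so the default under uncertainty is to stop and wait rather than to creep past the bus.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    PROCEED = auto()
    STOP_AND_WAIT = auto()
    CREEP_FOR_VISIBILITY = auto()  # the maneuver the incident vehicle chose


@dataclass
class BusObservation:
    """Hypothetical perception output for a nearby school bus."""
    is_school_bus: bool      # classifier identified a school bus
    signals_visible: bool    # is the light/stop-arm area in sensor view?
    lights_flashing: bool    # meaningful only if signals_visible is True
    stop_arm_extended: bool  # meaningful only if signals_visible is True


def school_bus_policy(obs: BusObservation) -> Action:
    """Fail-safe rule: an occluded signal is treated as an active one.

    Illustrative only. The conservative choice under occlusion is to stop
    and wait, never to creep around the bus to regain visibility.
    """
    if not obs.is_school_bus:
        return Action.PROCEED
    if not obs.signals_visible:
        # The Atlanta failure mode: bus detected, signals hidden.
        return Action.STOP_AND_WAIT
    if obs.lights_flashing or obs.stop_arm_extended:
        return Action.STOP_AND_WAIT
    return Action.PROCEED


# Occluded signals resolve to STOP_AND_WAIT, not a passing maneuver.
print(school_bus_policy(BusObservation(True, False, False, False)))
```

The vehicle in the incident effectively took the creep-for-visibility branch; the open question for regulators is whether that maneuver should ever be available near a stopped school bus.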
How Waymo’s Technology Works

Waymo’s fifth-generation ADS uses 360-degree cameras, radar, and lidar, processing real-time data with machine learning algorithms. Each vehicle costs roughly $200,000 and operates with no driver onboard. The fleet averages 24 trips per vehicle daily, totaling 2 million miles weekly.
Despite the use of advanced sensors, occlusions still pose a challenge to AI.
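A rough way to see why occlusion defeats even a multi-sensor stack: radar and lidar measure geometry, so a flashing-light state is effectively camera-only, and a blocked camera view can pull fused confidence below a detection threshold. The sketch below is a toy weighted average with invented numbers and weights, not how production fusion (typically a learned model) actually works.

```python
def fuse_confidences(camera: float, lidar: float, radar: float,
                     weights: tuple = (0.5, 0.3, 0.2)) -> float:
    """Toy weighted average of per-sensor confidences in [0, 1].

    Hypothetical weights for illustration; real systems learn the fusion.
    """
    w_cam, w_lidar, w_radar = weights
    return w_cam * camera + w_lidar * lidar + w_radar * radar


THRESHOLD = 0.6  # invented detection threshold

# Unobstructed approach: all sensors report the bus and its signals.
clear = fuse_confidences(camera=0.95, lidar=0.90, radar=0.60)

# Occluded approach angle: the camera cannot see the flashing lights,
# and geometry-only sensors cannot read a light's on/off state.
occluded = fuse_confidences(camera=0.05, lidar=0.40, radar=0.30)

print(f"clear view:    {clear:.2f} -> detected: {clear >= THRESHOLD}")
print(f"occluded view: {occluded:.2f} -> detected: {occluded >= THRESHOLD}")
```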
Why Safety Concerns Are Growing

Children are most at risk while boarding or exiting buses. Over 650,000 students live in Waymo’s operational zones, and each robotaxi encounters thousands of school buses annually. NHTSA warns that similar incidents are likely given Waymo’s 2 million miles per week.
The investigation tests whether AI can replicate human judgment in high-stakes environments.
Who Is Most Affected?

School-age children in operational zones are the most vulnerable. Approximately 650,000 students live in Waymo’s service areas, including Los Angeles, Phoenix, Austin, and San Francisco.
Parents and school districts now face questions about whether robotaxis can safely coexist with students boarding or leaving buses.
Waymo Responds Publicly

Waymo issued a public statement on Monday, October 20, 2025, following the NHTSA’s announcement of the federal investigation. The company emphasized, “Safety is our top priority, as we provide hundreds of thousands of fully autonomous paid trips every week in some of the most challenging driving environments in the U.S.”
The delayed response—nearly a month after the September 22 incident—sparked debate.
Explaining the Incident

Waymo explained that the vehicle approached the school bus from a position where flashing lights and the stop sign were obscured. The robotaxi navigated slowly around the front while maintaining a “safe distance from the children.”
According to Waymo, the failure stemmed from occlusion: the bus was partially blocking a driveway, and from the robotaxi’s approach angle the flashing lights and stop arm were not fully visible to its sensors.
Cooperation With Regulators

Waymo highlighted its collaborative approach: “NHTSA plays a vital role in road safety, and we will continue to work collaboratively with the agency as part of our mission to be the world’s most trusted driver.”
The statement strikes a cooperative tone rather than a combative one.
Corrective Measures Already Taken

Waymo confirmed that improvements related to stopping for school buses had already been implemented, and additional software updates were planned for upcoming releases.
The company emphasized that “driving safely around children has always been one of Waymo’s highest priorities.”
Safety Data Defense

Waymo cited comparative statistics to defend its safety record: “The data shows we are improving road safety in the communities in which we operate, achieving a fivefold reduction in injury-related crashes compared to human drivers, and 12 times fewer injury crashes involving pedestrians.”
Even with positive statistics, one high-profile failure near a school bus raises urgent questions. Can fleet-wide safety metrics reassure regulators and the public?
Historical Significance of the Probe

This is Waymo’s third NHTSA investigation, but the first focused on school bus compliance. It represents an unprecedented scenario: AI encountering a “blind spot” for flashing red lights—the most recognized danger signal in American driving culture.
Regulators are shifting from encouraging innovation to demanding proof of safety.
Critiques of Waymo’s Response

Critics view Waymo’s response as reactive rather than proactive, since the company issued a statement only after the federal investigation was announced. Waymo acknowledged the technical limitation but defended the maneuver as cautious and low-speed, maintaining a safe distance from the children.
The approach underscores the challenges of transparency and crisis communication.
Trust in Autonomous Vehicles Dented

Public narratives have shifted from optimistic acceptance to cautious skepticism. The Waymo incident highlights AI’s limitations in protecting children, sparking debates on accountability and regulation.
Social media erupted with users questioning whether driverless cars are ready. Polls show declining confidence in urban AV deployment.
Industry Expert Comments

AV specialists emphasize that transparency, rapid incident response, and robust safety systems are crucial for maintaining regulatory trust. Waymo’s cooperation and proactive software updates could help rebuild confidence with NHTSA and the public.
Safety advocates call for child-focused AV standards, enhanced reporting, and clear definitions of responsibility. Experts note that proactive measures now could shape future autonomous vehicle regulations around schools.
What to Expect from the NHTSA Investigation

The current preliminary investigation is expected to last 2–14 months, during which time Waymo’s software logs, operational data, and incident reports will be analyzed. NHTSA will also review media coverage and consumer complaints to determine whether similar incidents occurred.
If warranted, a deeper engineering analysis may follow, including scenario testing and evaluation of school bus detection algorithms. Most likely, Waymo will issue voluntary software updates, which would lead to the investigation’s closure.
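If NHTSA does escalate to an engineering analysis, the scenario testing it describes could resemble automated regression tests. The sketch below is purely hypothetical: plan_action stands in for whatever planner interface is actually under test, and the cases encode the legal requirement plus the September 22 failure mode.

```python
import unittest


def plan_action(bus_present: bool, signals_occluded: bool,
                lights_flashing: bool) -> str:
    """Stand-in for a planner under test; returns "stop" or "proceed"."""
    if bus_present and (signals_occluded or lights_flashing):
        return "stop"
    return "proceed"


class SchoolBusComplianceTest(unittest.TestCase):
    def test_stops_for_flashing_lights(self):
        # The baseline legal requirement in all 50 states and D.C.
        self.assertEqual(plan_action(True, False, True), "stop")

    def test_stops_when_signals_occluded(self):
        # The September 22 failure mode: the bus is detected but its
        # lights and stop arm are hidden from the sensors.
        self.assertEqual(plan_action(True, True, False), "stop")

    def test_proceeds_past_inactive_bus(self):
        self.assertEqual(plan_action(True, False, False), "proceed")


if __name__ == "__main__":
    unittest.main()
```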
What Schools, Parents, and Cities Should Do Now

Schools should audit bus stops, enhance visual signaling with reflective markings and beacons, and educate drivers on AV behavior. Establishing clear reporting channels with Waymo ensures safety incidents are logged and addressed.
Parents must teach children to double-check for stopping vehicles and document violations. Cities can coordinate with AV operators, enforce traffic enhancements, and run public awareness campaigns. Immediate action enhances safety while federal oversight is in progress.
Looking Ahead: Balancing Innovation and Safety

Waymo’s NHTSA investigation highlights both the promise and challenges of autonomous vehicles in school zones. While federal outcomes remain pending, proactive steps by schools, parents, and cities can immediately protect children.
As robotaxis expand, balancing innovation with vigilant safety measures will determine how communities safely adopt driverless technology.