
Google’s Waymo Faces 3rd Recall After Robotaxis Blow Past School Buses With Children Present

komocode – Reddit

Waymo, owned by Google’s parent company Alphabet, is recalling software after its robotaxis repeatedly passed stopped school buses illegally in Texas and Georgia. School districts and federal regulators grew alarmed, triggering an investigation by the National Highway Traffic Safety Administration (NHTSA).

No injuries occurred, but the incidents reveal that the company’s AI system failed to follow one of America’s most basic traffic rules: stopping completely when a school bus displays flashing red lights and an extended stop arm.

The Pattern Emerges

A yellow school bus
Photo by NoName 13 on Pixabay

Austin school bus cameras first detected violations in August 2025. By October, the Austin Independent School District documented 12 violations and alerted NHTSA. The incidents weren’t isolated glitches—they occurred repeatedly across multiple locations, several times a week.

By early December, Austin had recorded 20 violations, with 6 additional violations in Atlanta, Georgia. Waymo’s November 17 software update failed to prevent further violations.

A Critical Safety Standard

a yellow school bus driving down a street
Photo by Pedro Gandra on Unsplash

All 50 U.S. states require drivers—human or autonomous—to stop completely for school buses displaying flashing red lights and extended stop-arms. This law protects students boarding and exiting buses who face a serious injury risk. Police enforce this rule with traffic citations.

Waymo’s fifth-generation autonomous system repeatedly violated this fundamental rule, raising urgent questions about whether the company’s technology is suitable for public roads in populated areas.

Building Pressure

a car that is driving down the street
Photo by gibblesmash asdf on Unsplash

In October 2025, Austin ISD sent formal letters to Waymo, demanding action. By November 20, additional violations continued despite Waymo’s claimed fixes.

The district’s legal counsel requested Waymo cease operations from 5:20 a.m. to 9:30 a.m. and 3 p.m. to 7 p.m. on school days. Waymo refused.

Federal regulators are prepared to escalate, with NHTSA moving toward formal demands for extensive documentation and explanations from the company.

The Recall Announcement

black and gray bus seats
Photo by Jesiel Rubio on Unsplash

On December 6, 2025, Waymo announced it would file a voluntary software recall with NHTSA—the company’s third recall in less than two years.

Chief Safety Officer Mauricio Peña stated, “Waymo experiences twelve times fewer injury crashes involving pedestrians than human drivers. Holding highest safety standards means recognizing when our behavior should improve.”

Violations persisted even after Waymo’s November 17 software update, indicating that the fix remained incomplete.

A Student’s Close Call

A white car is stopped at an intersection
Photo by Hoseung Han on Unsplash

On November 20, 2025, a Waymo vehicle approached a stopped school bus on South First Street in Austin. Bus cameras captured the vehicle slowing, then accelerating and passing while a student crossed in front of it and remained in the roadway.

The student had begun crossing only after the vehicle appeared to yield, leaving them exposed when it moved again. The incident crystallized the district’s fear: the system failed not only to stop but also to maintain safe positioning around children.

Multiple Locations, Similar Problems

A Waymo autonomous vehicle on California Street, San Francisco, California, USA
Photo by Dietmar Rabich on Wikimedia

Atlanta schools reported six violations where Waymo vehicles illegally passed stopped school buses.

A September 22, 2025, video captured one incident: a vehicle passed a school bus’s stop arm while students were visibly exiting the bus in broad daylight, with red lights flashing and the stop arm fully extended.

Unlike some of the Austin incidents, which were documented but harder to see, the Atlanta video provided clear evidence of the violation, leaving Waymo little room to minimize its severity.

A Stalled Update Creates Questions

A self-driving car navigates a bustling San Francisco street, capturing urban mobility in action
Photo by Abhishek Navlakha on Pexels

In November 2025, Waymo announced it had identified the software problem and deployed a fix through automatic update on November 17. Company officials claimed the update would “significantly improve performance” around school buses.

However, Austin ISD monitoring systems recorded at least five additional violations after November 17, the latest on December 1, 2025. This pattern suggested that Waymo’s diagnosis was incomplete, its fix insufficient, or that multiple root causes were at play.

Federal Escalation

A Waymo-operated Jaguar I-Pace in San Francisco, operating autonomously with no one in the driver’s seat
Photo by Dllu on Wikimedia

On December 3, 2025—three days before Waymo’s recall announcement—NHTSA sent Waymo a formal investigative letter demanding detailed responses by January 20, 2026.

The agency requested comprehensive documentation of how Waymo’s fifth-generation system recognizes school buses, interprets traffic signals, and decides whether to proceed.

NHTSA demanded video evidence of each violation and explanations of internal tests. Federal regulators signaled that they viewed the voluntary recall as insufficient and were preparing stronger enforcement.

A Troubling History

A Waymo self-driving car making a left turn in Mountain View, side view
Photo by Grendelkhan on Wikimedia

The December 2025 recall marks Waymo’s third in under two years, revealing a pattern beyond school bus issues. In February 2024, Waymo recalled 444 vehicles after two robotaxis struck the same towed pickup truck in Phoenix. In June 2024, it recalled over 670 vehicles after a robotaxi hit a wooden utility pole.

These earlier recalls raised questions about the robustness of Waymo’s object-recognition and obstacle-avoidance systems—the same capabilities required for proper school bus responses. Each incident added to concerns about the readiness of autonomous systems.

Waymo’s Position on Safety

White self-driving car on a city street
Photo by Sam Szuchan on Unsplash

Waymo positions itself as the industry’s safety leader, claiming its vehicles experience “twelve times fewer injury crashes involving pedestrians” compared to human drivers.

The company emphasized that no injuries occurred during school bus incidents and that violations involved either no students in the vehicle’s path or sufficient stopping distance to prevent an incident.

Waymo used this safety record argument as its primary defense against district and regulatory criticism, suggesting actual risk remained low.

District Demands Go Unheeded

A Lexus RX450h retrofitted by Google for its driverless car fleet, with a Tesla Model S electric car parked at left
Photo by Steve Jurvetson on Wikimedia

Austin ISD Police Chief Wayne Sneed publicly criticized Waymo’s refusal to halt operations during school loading times. “If we truly care about student safety, we must stop actions we know create documented safety risks for kids,” Sneed stated.

The district formally requested suspension from 5:20 a.m. to 9:30 a.m. and from 3 p.m. to 7 p.m. on school days, but Waymo declined, claiming that the November 17 updates had resolved the issue. This conflict between local authority and corporate confidence would persist into 2026.

Legal Maneuvering

Inside view of a car’s steering wheel
Photo by Annie Spratt on Unsplash

Austin ISD explored legal remedies, announcing officials were “reviewing all potential legal options to ensure student safety.” The district’s legal counsel, Jennifer Oliaro, documented each violation and maintained correspondence with Waymo demanding immediate action.

By early December, the district issued Waymo 20 citations totaling $2,100 in fines. Whether the district would pursue injunctive relief, petition state transportation authorities, or seek damages remained unclear, but the district shifted from requests to enforcement.

The Police Compliance Issue

two police officers standing on the side of a road
Photo by Adil Edin on Unsplash

Beyond the school bus incidents, the Austin Police Department and Austin ISD authorities raised another safety concern: Waymo vehicles ignored traffic control from on-duty police officers. Officers documented instances where Waymo vehicles proceeded through intersections or maneuvered around officers who were directing traffic with activated emergency lights or hand signals.

This pattern suggested that the autonomous system failed to recognize and respond to law enforcement authority—a gap extending beyond school buses to basic emergency-response protocols that all vehicles must follow.

Expansion Despite Concerns

A Waymo self-driving car on the road in Mountain View, front view
Photo by Grendelkhan on Wikimedia

Despite the escalating school bus crisis, Waymo announced in late November 2025 plans to launch fully autonomous operations in Dallas, Houston, San Antonio, Miami, and Orlando in 2026. The timing looked tone-deaf in hindsight, since the announcement came shortly before the violations became public.

Continuing aggressive expansion while federal regulators demanded explanations for fifth-generation system failures suggested that company leadership either believed the school bus issue was isolated or prioritized market growth over regulatory caution.

Regulatory Momentum

Imported image
Reddit – walky22talky

NHTSA’s December 3 letter signaled that the preliminary investigation could escalate into a formal investigation if Waymo’s responses prove unsatisfactory. The agency’s questions demand documentation and engineering explanations for why the AI system made specific decisions in specific scenarios.

If NHTSA determines a safety defect exists across the fleet, it could trigger broader recalls or operational restrictions. The January 20, 2026, deadline marks a critical decision point for federal regulators regarding Waymo’s future operations.

Automation’s Compliance Gap

a room with many machines
Photo by ZHENYU LUO on Unsplash

The Waymo failures highlight a fundamental challenge in autonomous vehicles: consistently following rules, especially in complex situations. Humans internalize school bus stops instinctively. Autonomous systems must explicitly encode such rules and ensure they override other priorities, such as speed or route efficiency.
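In software terms, one common way to express that precedence is to treat the legal rule as a hard constraint that runs after, and can veto, whatever a cost- or efficiency-driven policy proposes. The sketch below is purely illustrative, with hypothetical scene fields and policy functions rather than anything from Waymo’s actual stack, but it shows the structural idea: because the legal check runs last, no efficiency consideration can override a mandatory stop.

```python
# Illustrative sketch only: a toy planner showing how a hard legal rule can
# veto a softer, efficiency-oriented policy. The fields and functions here
# are hypothetical simplifications, not Waymo's actual system.

from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    PROCEED = auto()
    SLOW = auto()
    FULL_STOP = auto()


@dataclass
class SceneState:
    school_bus_detected: bool   # perception says a school bus is ahead
    red_lights_flashing: bool   # its red warning lights are active
    stop_arm_extended: bool     # its stop arm is deployed
    pedestrian_in_path: bool    # a person is in the vehicle's lane


def efficiency_policy(state: SceneState) -> Action:
    """Soft policy: favors keeping the ride moving."""
    return Action.SLOW if state.pedestrian_in_path else Action.PROCEED


def legal_override(state: SceneState, proposed: Action) -> Action:
    """Hard constraint: an active school-bus stop signal always wins."""
    if state.school_bus_detected and (
        state.red_lights_flashing or state.stop_arm_extended
    ):
        return Action.FULL_STOP
    if state.pedestrian_in_path:
        return Action.FULL_STOP
    return proposed


def plan(state: SceneState) -> Action:
    # The override runs last, so no speed or routing gain can bypass the law.
    return legal_override(state, efficiency_policy(state))


if __name__ == "__main__":
    scene = SceneState(
        school_bus_detected=True,
        red_lights_flashing=True,
        stop_arm_extended=True,
        pedestrian_in_path=False,
    )
    print(plan(scene))  # Action.FULL_STOP
```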

Waymo’s failures suggest its AI system either underweighted legal compliance or failed to reliably detect when compliance rules applied. Industry experts recognize this type of AI problem as a significant obstacle to the safe deployment of autonomous systems.

Public Perception and Trust

Two people shaking hands
Photo by bertholdbrodersen on Pixabay

News of the violations spread across media outlets and online platforms, with parents and education advocates questioning whether autonomous vehicles belong near schools. The phrase “robotaxis blow past school buses with children present” became shorthand for broader AI safety concerns.

Some noted that human drivers can lose their licenses for such violations; others questioned whether Waymo’s safety claims remained credible, given failures at a basic task that teenage drivers routinely master. The incidents became a reference point in national debates over autonomous vehicle adoption.

Precedent and Prediction

Imported image
Reddit – walky22talky

Waymo’s recall parallels earlier autonomous vehicle failures where AI systems struggled with routine human tasks. Tesla’s Autopilot failed to recognize pedestrians in certain lighting conditions. Uber suspended its self-driving program in 2018 after a fatal collision exposed flaws in its decision-making process.

In each case, the underlying problem wasn’t overall safety compared to humans but rather a specific inability to handle predictable scenarios that regulators and the public deemed essential. Waymo’s response—rapid updates and voluntary recall—follows this established pattern but offers no guarantees of success.

The Road Ahead

Imported image
LinkedIn – The Arc of the United States

Waymo’s school bus recall will likely determine whether the autonomous vehicle industry reaches mainstream adoption. The company positioned itself as the gold standard for AI safety, but Austin and Atlanta failures exposed a gap between positioning and reality.

The coming months will reveal whether a software fix resolves the issue, whether regulators impose stricter limits, and whether public confidence recovers. If Waymo cannot reliably follow basic traffic law in predictable situations, the case for expanding autonomous vehicles becomes far weaker.

Sources:
USA Today, December 9, 2025
CBS News, December 5, 2025
PC Magazine, December 8, 2025
NHTSA preliminary investigation, October–December 2025
Historical analysis of autonomous vehicle incidents, 2016–2025
ABC News, CNN, CBS News reporting, December 4–8, 2025