Waymo, the autonomous driving subsidiary of Alphabet, has announced a massive recall of approximately 3,800 robotaxis operating across the United States. The decision comes after reports that the self-driving vehicles were unable to properly detect and avoid flooded roadways, raising critical questions about the safety of AI-driven transport during extreme weather conditions.

The Trigger: A Dangerous Dip in San Antonio
The recall was sparked by a concerning incident on April 20 in San Antonio, Texas. During a period of extreme weather, one of Waymo’s robotaxis drove directly into a flooded lane. While the vehicle was empty at the time and no injuries were reported, the incident highlighted a significant flaw in the system’s ability to navigate hazardous environmental conditions, particularly on high-speed roads where water depth can be deceptive.
Following the event, Waymo conducted a comprehensive review of similar scenarios, focusing on high-speed maneuvers and roads that become impassable due to flash flooding.

How Waymo is Addressing the Flaws
To mitigate these risks, Waymo is deploying a series of software safeguards designed to enhance the vehicles’ environmental awareness. According to company representatives, the updates include:
- Software Refinements: Adding new layers of protection to help the AI better recognize flood risks.
- Operational Adjustments: Fine-tuning how the cars operate during heavy rain and extreme weather.
- Geofencing: Restricting vehicle access to areas known for high flash-flood risks.
The National Highway Traffic Safety Administration (NHTSA) confirmed that Waymo has temporarily narrowed its operating zones to implement these weather-related restrictions and update its mapping data while a permanent solution is developed.
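Conceptually, the geofencing restriction described above amounts to checking planned route points against polygons that mark flood-prone zones. The sketch below is purely illustrative and assumes nothing about Waymo's actual systems; all function names, zone data, and coordinates are hypothetical, and real deployments would use map-referenced geographic coordinates rather than toy values.

```python
# Illustrative sketch only: a minimal geofencing check against
# hypothetical flood-risk zones. Not Waymo code; all names and
# polygon data here are invented for illustration.

def point_in_polygon(point, polygon):
    """Ray-casting test: is (x, y) inside the polygon?"""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edges that a horizontal ray from the point crosses.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def route_allowed(waypoints, flood_zones):
    """Reject a route if any waypoint falls inside a restricted zone."""
    return not any(
        point_in_polygon(wp, zone)
        for wp in waypoints
        for zone in flood_zones
    )

# Hypothetical low-lying area flagged for flash-flood risk.
flood_zone = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]

print(route_allowed([(2.0, 2.0)], [flood_zone]))  # False: waypoint inside zone
print(route_allowed([(5.0, 5.0)], [flood_zone]))  # True: waypoint outside zone
```

In practice, production systems would lean on tested geospatial libraries and live weather feeds rather than a hand-rolled polygon test, but the gatekeeping idea is the same: routes touching restricted areas are never offered to the vehicle.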

A Growing Pattern of Safety Concerns
This recall is not an isolated incident. Waymo is currently under the microscope of multiple federal agencies following a string of safety lapses:
Collision with Pedestrian
The NHTSA is investigating a crash from January 2026, in which a Waymo vehicle collided with a child near an elementary school in Santa Monica, California. The child sustained minor injuries, but the event raised alarms about the technology’s reliability in school zones.
Traffic Law Violations
In March 2026, the National Transportation Safety Board (NTSB) opened an investigation into a Texas incident where Waymo vehicles illegally overtook a school bus that was stopped with its warning lights flashing—a clear violation of state traffic laws.
The Challenge of Real-World Autonomy
These recurring issues—ranging from collisions with pedestrians and pets to abrupt stops and traffic violations—underscore the immense challenge of deploying autonomous vehicles in complex urban environments. While robotaxis hold great promise, the reality of unpredictable weather and human behavior continues to pose a significant hurdle for Alphabet’s ambitious project.
As Waymo works to refine its software, the industry continues to watch closely to see if AI can truly master the unpredictability of the open road.

