Picture this: a school bus parked curbside in sunny Austin, red lights pulsing like a heartbeat, stop arm swung out wide, kids spilling across the street with backpacks bouncing. Now imagine a sleek, driverless SUV—piloted by nothing but code—rolling right on through, as if the whole scene were just a minor detour. That’s not a scene from a dystopian flick; it’s real footage from Texas streets, captured by the Austin Independent School District (AISD) and now fueling a federal probe into Waymo’s autonomous fleet.
The National Highway Traffic Safety Administration (NHTSA) kicked off the investigation on December 3, zeroing in on whether Waymo’s self-driving tech can reliably obey the rules when it comes to stopped school buses.
In a letter to the company, NHTSA noted, “such unexpected driving behaviors or not complying with traffic safety laws concerning school buses may increase the risk of crash, injury, and property damage.”
That’s regulator-speak for “this could get kids hurt,” and it’s a stark reminder that rolling out robotaxis across America demands ironclad safety, not just flashy promises.
AISD police aren’t mincing words either. They’ve slapped Waymo with 20 citations since the school year kicked off, documenting at least 19 close calls where the vehicles breezed past flashing lights and crossing pedestrians.
Chief of Police Wayne Sneed didn’t hold back in his statement to FOX Business: “Despite numerous requests, Waymo has refused to cease operations.” He pointed to a fresh violation on December 1, when one of their rides zipped past a bus that had been idling with signals on for nearly a full minute—students loading up and all. “In this instance, the bus was and had been stopped with red flashing lights and activated for nearly a full minute before their vehicle passed—an unequivocal violation of State Law.”
Waymo caught wind of the trouble back in mid-November and claims it patched things up with software tweaks by November 17. A company spokesperson told FOX Business the fixes have boosted performance “to a level better than even human drivers,” insisting their logs show the cars only edged forward when the coast was clear.
“Improving road safety is our top priority at Waymo and we’re deeply invested in safe interaction with school buses. We swiftly implemented software updates to address this and will continue to rapidly improve,” the rep added.
But here’s the rub: even after those updates, incidents kept piling up, including five more in late November alone, according to AISD’s earlier letter to NHTSA. Sneed put it bluntly: “They reported that programming changes had been implemented in mid-November to correct previous violations. However, the Dec. 1, 2025 incident indicates that those programming changes did not resolve the issue or our concerns.”
This isn’t just a local headache—it’s a test for how America balances cutting-edge tech with the bedrock duty to protect our communities. Self-driving vehicles promise to slash accidents caused by drowsy or distracted drivers, potentially saving lives and freeing up roads for more productive pursuits like hauling goods to market or getting families to jobs on time.
Waymo’s parent, Alphabet, is betting big on this future, with plans to roll out rides in Dallas come 2026 after some quiet road-testing there. Done right, it could supercharge economic growth: think lower logistics costs, fewer insurance headaches, and jobs in software and maintenance that keep dollars flowing stateside.
Yet these bus-bypassing blunders expose the gaps. Texas law—and every state’s, for that matter—demands drivers yield to school buses without question, a rule etched in the hard lessons of past tragedies. When algorithms falter on something so basic, it erodes trust in the very innovation meant to propel us forward. American ingenuity built the interstate system and turned Detroit into a powerhouse; it can crack autonomous driving too. But only if companies like Waymo treat safety as non-negotiable, not an afterthought to expansion.
The feds’ probe could force real fixes—maybe better sensors for spotting those stop arms from afar or AI trained harder on edge cases like bustling school zones. For now, Austin’s cops are left chasing citations while parents eye the streets warily. As Waymo eyes more turf in the Lone Star State, let’s hope this scrutiny sharpens their edge without slamming the brakes on progress. After all, the road to economic revival runs smoother when our tech serves people first, not the other way around.