Let’s paint the picture. It’s a typical school morning in Austin, Texas. A yellow school bus pulls to the curb, red lights flashing, stop arm extended, crossing control arm deployed. A child steps off the bus and begins crossing the street. A Waymo robotaxi approaches, initially slows, and then, inexplicably, proceeds through the stop zone. The child is still on the road.
This wasn’t a one-time glitch. According to the Austin Independent School District’s December 2025 letter to Waymo, this scenario, or a variation of it, has played out at least 19 times since the start of the school year. Five of those violations occurred *after* Waymo claimed it had pushed software updates to fix the problem and after it issued a recall in October.
“Put simply, Waymo’s software updates are clearly not working as intended nor as quickly as required,” wrote Jennifer Bergeron Oliaro, senior counsel for the Austin school district. “We cannot allow Waymo to continue endangering our students while it attempts to implement a fix.”
Welcome to the wild west of autonomous vehicles, where Silicon Valley’s “move fast and break things” philosophy has been deployed on public roads with our children as the beta testers.
The San Francisco Experiment
To understand how we got here, you need to understand how Waymo and Cruise exploited California’s regulatory framework to turn San Francisco into the world’s largest autonomous vehicle test track.
In August 2023, the California Public Utilities Commission voted 3-1 to approve unlimited 24/7 robotaxi operations throughout San Francisco, over the explicit objections of the city’s fire chief, police department, and transportation officials. Fire Chief Jeanine Nicholson had documented 55 instances of robotaxis interfering with emergency responses that year alone. San Francisco police, firefighters, and transit workers pleaded with regulators to pump the brakes.
The commission approved anyway.
“It is not our job to babysit their vehicles,” Chief Nicholson told the commission. She described robotaxis stopping in front of fire stations, running over firefighter hoses, and freezing at emergency scenes for up to 30 minutes at a time.
The Cruise Catastrophe
On October 2, 2023, at approximately 9:30 p.m. on Market Street in downtown San Francisco, a hit-and-run driver struck a pedestrian, throwing her into the path of a Cruise robotaxi. The Cruise vehicle braked hard but ran over the woman. It stopped.
The robotaxi’s sensor system failed to detect that a human being was pinned beneath its chassis. Following its programming to “pull over after a collision,” the vehicle accelerated to 7 mph and dragged the woman 20 feet down the street before coming to a final stop, with its rear wheel resting on her legs.
Emergency crews had to use the jaws of life to extract her.
According to the U.S. Department of Justice, Cruise subsequently filed reports with NHTSA that omitted the dragging entirely. In a videoconference the next morning, Cruise employees showed regulators a video of the incident, but “due to technical difficulties,” the portion depicting the dragging conveniently didn’t play.
In November 2024, Cruise admitted to filing a false report to influence a federal investigation and agreed to pay $500,000 in criminal fines, in addition to a separate $1.5 million civil penalty from NHTSA. The California DMV immediately suspended Cruise’s permits. The company’s CEO resigned. Hundreds of employees were laid off. GM has since lost more than $8 billion on its Cruise investment.
Researchers from Carnegie Mellon who analyzed the crash found the robotaxi had multiple opportunities to avoid hitting the pedestrian in the first place. Its programming prevented it from recognizing a pedestrian about to be struck in an adjacent lane, caused it to lose track of the victim after impact, and then essentially “forgot” it had just run over someone when initiating its pullover maneuver.
The Underride Problem
While Waymo and Cruise were turning San Francisco into an autonomous free-for-all, Tesla’s Autopilot system was developing its own deadly pattern, one with direct implications for the trucking industry.
On May 7, 2016, Joshua Brown was driving his Tesla Model S on Autopilot near Williston, Florida, when a tractor-trailer made a left turn across his path. Tesla’s system failed to detect the white side of the trailer against a brightly lit sky. Neither the Autopilot nor the automatic emergency braking engaged. Brown’s car passed under the trailer at 74 mph, shearing off the roof and killing him instantly.
The NTSB determined that Brown had not touched his steering wheel for 37 minutes before the crash.
Nearly three years later, history repeated itself with horrifying precision. On March 1, 2019, Jeremy Banner engaged Autopilot in his Tesla Model 3 in Delray Beach, Florida. Ten seconds later, his car underrode a semi-trailer crossing the highway. Banner had his hands off the wheel for the final eight seconds. Neither he nor the Autopilot system did anything to avoid the crash.
The NTSB’s subsequent investigation found that “Tesla’s Autopilot was not designed to, and could not, identify the truck crossing the Tesla’s path or recognize the impending crash.” The system’s automatic emergency braking was designed for rear-end collisions, not crossing-path scenarios. Yet Tesla continued to market the system under its suggestive “Autopilot” branding.
Then came the emergency responder crashes. NHTSA’s Office of Defects Investigation documented at least 16 incidents in which Teslas operating on Autopilot plowed into first responder vehicles at crash scenes, despite flashing lights, flares, illuminated arrow boards, and road cones. Most occurred after dark. The vehicles’ systems simply failed to recognize stationary emergency vehicles as obstacles requiring braking.
In December 2023, Tesla issued a recall of more than 2 million vehicles, stating the issue had been resolved through over-the-air software updates. NHTSA subsequently opened a new investigation to determine whether the recall was actually effective.
The AV Truck Difference That Demanded Respect
While robotaxi companies were rushing to deploy in urban environments with minimal testing and aggressive lobbying of state regulators, the autonomous trucking industry took a fundamentally different approach.
Aurora Innovation didn’t begin commercial driverless operations until May 2025, and only after completing over 3 million autonomous miles with safety drivers behind the wheel and delivering more than 10,000 supervised customer loads. The company spent years testing on controlled routes, refining its systems, and building what it calls a “safety case”: a comprehensive evidence package demonstrating that its technology is acceptably safe for public roads.
Kodiak Robotics similarly accumulated over 3 million miles with safety drivers before transitioning to driverless operations in the Permian Basin. The company’s trucks have logged over 750 hours of commercial driverless operation without a driver on board.
The vehicles and their operating environments differ as much as the companies’ approaches. A Waymo robotaxi weighs about 5,400 pounds and operates in dense urban environments filled with pedestrians, cyclists, and unpredictable traffic. A fully loaded Class 8 tractor-trailer weighs up to 80,000 pounds and primarily operates on controlled-access highways with (relatively) predictable traffic patterns.
“Driverless trucks need to look much farther down the road than robotaxis in busy cities,” industry analysts note. “They take steps to respond to situations that won’t unfold for another few seconds. This includes the ability to see pedestrians in the dark from hundreds of yards away, and being able to predict when another car might run a red light.”
Aurora’s sensors can detect objects beyond the length of four football fields, which is critical when you’re piloting a vehicle that can’t stop on a dime.
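How much margin does that sensor range actually buy? Here’s a rough back-of-the-envelope sketch. The deceleration and latency figures are illustrative assumptions, not published specs for any particular vehicle:

```python
# Stopping distance = reaction distance + braking distance (v^2 / 2a).
# The deceleration and latency values below are illustrative assumptions,
# not measured performance figures for any specific truck or car.
def stopping_distance_m(speed_mph: float, decel_ms2: float, latency_s: float) -> float:
    speed_ms = speed_mph * 0.44704             # convert mph to m/s
    reaction = speed_ms * latency_s            # distance covered before braking begins
    braking = speed_ms ** 2 / (2 * decel_ms2)  # basic kinematics: v^2 / (2a)
    return reaction + braking

# A loaded Class 8 truck brakes far less aggressively than a passenger car.
print(f"Truck at 65 mph: {stopping_distance_m(65, 4.0, 0.5):.0f} m")  # ~120 m
print(f"Car at 65 mph:   {stopping_distance_m(65, 7.0, 0.5):.0f} m")  # ~75 m
```

Four football fields is roughly 360 meters, so even under pessimistic assumptions, a sensor with that reach gives the planner several seconds to act before hard braking becomes the only option.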
Inside the Machine
Understanding why autonomous trucks have (so far) avoided the catastrophic failures plaguing robotaxis requires understanding the technology itself.
Modern autonomous vehicles rely on three primary sensor types working in concert: LiDAR (Light Detection and Ranging), radar, and cameras. Each has strengths and weaknesses, and how companies combine them reveals a lot about their approach to safety.
LiDAR fires millions of laser pulses per second, measuring how long each takes to bounce back from objects. This creates a precise 3D “point cloud” map of the environment, accurate to centimeters. It’s particularly effective for detecting pedestrians, cyclists, and oddly shaped objects. However, LiDAR can struggle in heavy rain, fog, or dusty conditions, and until recently was prohibitively expensive. Modern systems from Luminar and Hesai can detect objects beyond 500 meters, which is crucial for highway-speed trucking.
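The underlying math is straightforward time-of-flight: each pulse travels to the target and back at the speed of light, so range is half the round trip. A minimal illustration:

```python
C = 299_792_458  # speed of light, m/s

def lidar_range_m(round_trip_s: float) -> float:
    # The pulse goes out and comes back, so the target sits at half the path length.
    return C * round_trip_s / 2

# A return arriving ~3.34 microseconds after the pulse fires
# corresponds to a target roughly 500 meters away.
print(f"{lidar_range_m(3.34e-6):.0f} m")  # ~501 m
```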
Radar uses radio waves rather than light, making it effective in adverse weather conditions where LiDAR falters. It excels at detecting metal objects and measuring the speed of moving vehicles. However, radar’s lower resolution means it struggles to distinguish between objects of similar size and shape, a limitation that contributed to Tesla Autopilot’s infamous trailer underride failures.
Cameras provide visual context that neither LiDAR nor radar can match, including reading traffic signs, detecting lane markings, and interpreting traffic signals. But cameras are vulnerable to glare, darkness, and weather conditions. Tesla famously bet everything on a camera-centric approach, abandoning radar and LiDAR. The results speak for themselves.
Aurora’s trucks combine all three sensor types through “sensor fusion,” with multiple redundant systems for braking, steering, power, sensing, controls, computing, cooling, and communication. Waymo uses a similar multi-sensor approach in its sixth-generation hardware. The key difference? Aurora spent four years validating that fusion with professional drivers who could intervene when the system made mistakes.
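This is not Aurora’s or Waymo’s actual pipeline, which fuses raw sensor data with far more sophistication, but a toy late-fusion sketch illustrates the core idea: independent sensor types must corroborate each other before the system commits, so a single blinded sensor can’t silently erase an obstacle. Every name and threshold below is invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "lidar", "radar", or "camera"
    range_m: float     # estimated distance to the object
    confidence: float  # the sensor's own confidence, 0.0 to 1.0

def fuse(detections: list[Detection]) -> dict | None:
    """Toy late-fusion rule: confirm an object only when at least two
    independent sensor types agree, then blend their range estimates
    weighted by confidence. Real stacks fuse tracks and raw data with
    far more rigor; this just shows why redundancy catches the case
    where, say, a camera is blinded by glare but radar still sees."""
    if len({d.sensor for d in detections}) < 2:
        return None  # one sensor type alone is not enough to confirm
    total = sum(d.confidence for d in detections)
    fused_range = sum(d.range_m * d.confidence for d in detections) / total
    return {"range_m": fused_range, "confidence": min(1.0, total)}

print(fuse([Detection("lidar", 212.0, 0.9), Detection("radar", 214.5, 0.7)]))
print(fuse([Detection("camera", 212.0, 0.9)]))  # None: unconfirmed
```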
SAE Levels
The autonomous vehicle industry loves throwing around terms like “Level 2” and “Level 4” without explaining what they mean. Here’s the breakdown according to SAE International’s J3016 standard, the industry’s bible on automation taxonomy.
Level 0 (No Automation): The human driver does everything. Warning systems like blind-spot monitoring and lane departure warnings are still considered Level 0 because they don’t control the vehicle; they just alert you.
Level 1 (Driver Assistance): The vehicle can assist with either steering OR acceleration/braking, but not both simultaneously. Adaptive cruise control is Level 1. Lane-keeping assist is Level 1. The driver must remain fully engaged.
Level 2 (Partial Automation): The vehicle can handle steering AND acceleration/braking simultaneously in specific scenarios, but the driver must monitor at all times and be ready to intervene. Tesla’s Autopilot, GM’s Super Cruise, and Ford’s BlueCruise are all Level 2 systems, despite marketing that suggests otherwise. When crashes happen in Level 2 vehicles, the driver is legally responsible.
Level 3 (Conditional Automation): This is the critical inflection point. At Level 3, the vehicle handles all driving tasks within its “operational design domain,” but the human must be ready to take over when requested. Liability begins shifting from the driver to the manufacturer. Mercedes-Benz offers limited Level 3 in some markets. It’s the “uncanny valley” of autonomy that many companies are trying to skip entirely.
Level 4 (High Automation): The vehicle can drive itself without human intervention within a defined geographic area and under specific conditions. No human backup required. This is where Waymo’s robotaxis and Aurora’s trucks operate. The catch: Level 4 vehicles typically cannot operate outside their defined operational domain, hence the geofenced routes and highway-only trucking deployments.
Level 5 (Full Automation): The vehicle can drive anywhere, in any conditions, without human intervention. No steering wheel required. This remains the industry’s holy grail, and it doesn’t exist yet. Anyone claiming otherwise is lying.
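For readers who like their taxonomies executable, here’s the same ladder condensed into a short sketch, paraphrasing J3016 rather than quoting it. The pivotal question at each rung is whether a human must continuously supervise:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving automation levels, paraphrased (not official text)."""
    NO_AUTOMATION      = 0  # warnings only; the driver does everything
    DRIVER_ASSISTANCE  = 1  # steering OR speed control, never both
    PARTIAL_AUTOMATION = 2  # steering AND speed; driver must supervise
    CONDITIONAL        = 3  # system drives in-domain; human takes over on request
    HIGH_AUTOMATION    = 4  # no human needed inside a defined domain
    FULL_AUTOMATION    = 5  # drives anywhere, anytime; does not exist yet

def human_must_supervise(level: SAELevel) -> bool:
    # Below Level 3, the human monitors the road at all times; from Level 3
    # up, the system does, which is where liability starts shifting toward
    # the manufacturer.
    return level <= SAELevel.PARTIAL_AUTOMATION

print(human_must_supervise(SAELevel.PARTIAL_AUTOMATION))  # True  (Tesla Autopilot)
print(human_must_supervise(SAELevel.HIGH_AUTOMATION))     # False (Waymo, Aurora)
```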
Guardrails Coming, But Not Fast Enough
Federal Motor Vehicle Safety Standards were written for vehicles with human drivers. They specify everything from brake performance to lighting requirements, but they assume someone is sitting behind the wheel.
Autonomous vehicles operate in a regulatory gray zone. NHTSA requires crash reporting through Standing General Order 2021-01, but there are no federal performance standards specific to automated driving systems. No mandatory testing protocols. No minimum competency requirements. Companies essentially self-certify that their vehicles are safe.
In April 2025, Transportation Secretary Sean P. Duffy announced a new AV Framework with three principles: prioritize safety, remove unnecessary regulatory barriers, and enable commercial deployment. NHTSA has since announced rulemakings to modify Federal Motor Vehicle Safety Standards for AV-specific scenarios, updating requirements for transmission controls, windshield systems, and lighting that don’t make sense for vehicles without human drivers.
Several pieces of legislation are working through Congress: the AV Accessibility Act, the AV Safety Data Act, and the Autonomous Vehicle Acceleration Act. The Teamsters are pushing for requirements that all autonomous commercial vehicles have a human on board. California, which has so far prohibited driverless heavy trucks, is now reconsidering.
NHTSA has 70 rulemakings on its current agenda and is operating at a fraction of its normal staffing levels. According to Reuters, the agency has lost more than 25% of its employees. The timeline for comprehensive AV safety standards remains measured in years, not months.
Nearly 40,000 Americans died in traffic crashes in 2024. Human drivers are imperfect, often distracted, sometimes impaired, and frequently fatigued. The trucking industry faces a chronic driver shortage, with 3.6 million unfilled positions globally, according to IRU. Long-haul drivers spend weeks away from their families and face one of the most dangerous jobs in America.
Autonomous technology, done right, could change that equation. Aurora claims its trucks can detect pedestrians hundreds of meters away in complete darkness. Its Verifiable AI approach includes guardrails specifically designed to yield for emergency vehicles. Waymo’s own data suggests its robotaxis are involved in 91% fewer serious injury crashes than human drivers. “Done right” is the operative phrase.
What we’ve witnessed with robotaxis is an industry that prioritized deployment speed over systematic safety validation, lobbied its way past local opposition, and treated public streets as testing grounds. When problems emerged (blown school bus stops, struck emergency vehicles, a pedestrian dragged under a chassis), the response was software patches and PR statements, not a fundamental safety reassessment.
The autonomous trucking industry, whether by choice or necessity, has taken a different path. The consequences of an 80,000-pound vehicle failure are catastrophically worse than a 5,400-pound robotaxi error. Insurance requirements are higher. Federal scrutiny from FMCSA is more intense. And the trucking industry’s culture, forged through decades of DOT compliance, hours-of-service regulations, and CSA scores, is inherently more safety-focused than Silicon Valley’s “disrupt everything” ethos.
The Future
As of this writing, Aurora has completed over 100,000 driverless miles on public roads in Texas, operating commercial freight between Dallas, Houston, Fort Worth, and El Paso. The company plans to expand to Phoenix by year’s end and scale to hundreds of trucks by 2026. Kodiak’s driverless trucks are hauling sand in the Permian Basin. Volvo has purpose-built the VNL Autonomous platform with redundant systems throughout.
Meanwhile, Waymo continues expanding to new cities, including Philadelphia, Tokyo, and London, while dealing with ongoing federal investigations, school bus violations, and the December 2025 San Francisco blackout that left its vehicles frozen at intersections throughout the city.
Will we learn from the robotaxi industry’s mistakes before repeating them at scale with 80,000-pound vehicles? That’s the question we should be asking.
The schoolchildren in Austin are lucky that Waymo’s bus stop failures didn’t end in tragedy. One woman in San Francisco will carry the scars, physical and psychological, of being dragged under a robotaxi for the rest of her life.
The trucking industry can follow the robotaxi playbook (deploy fast, pray nothing goes wrong) or continue the methodical, safety-first approach that has characterized the sector’s best operators for generations.
So far, they’ve chosen wisely. Here’s hoping that continues.