Few of Waymo's Most Serious Crashes Were Waymo's Fault
Posted 4 months ago · Active 4 months ago
understandingai.org · Tech · story
Tone: calm, positive
Debate: 40/100
Key topics
- Autonomous Vehicles
- Waymo
- Tesla
An analysis of Waymo's crashes found that most were caused by human error, not the autonomous vehicle system, sparking discussion about the safety and reliability of self-driving cars.
Snapshot generated from the HN discussion
Discussion Activity
Light discussion
- First comment: 13m after posting
- Peak period: 2 comments in 26-28h
- Avg per period: 1.2
Key moments
- Story posted: Sep 18, 2025 at 10:44 AM EDT (4 months ago)
- First comment: Sep 18, 2025 at 10:57 AM EDT (13m after posting)
- Peak activity: 2 comments in 26-28h (hottest window of the conversation)
- Latest activity: Sep 19, 2025 at 1:56 PM EDT (4 months ago)
ID: 45290342 · Type: story · Last synced: 11/20/2025, 5:54:29 PM
> Very few of Waymo’s most serious crashes were Waymo’s fault
>> I looked at 45 major Waymo crashes—most were human error.
Not sure whether you intentionally submitted a misleading title or it was a genuine mistake. Either way, the submitted title grossly changes the meaning.
EDIT: Read the article to see whether the headline was clickbaity. Spoiler: it was.
> At 1:14 AM on May 31st, a Waymo was driving on South Lamar Boulevard in Austin, Texas, when the front left wheel detached. The bottom of the car scraped against the pavement as the car skidded to a stop, and the passenger suffered a minor injury, according to Waymo.
Among the 45 most serious crashes Waymo experienced in recent months, this was arguably the crash that was most clearly Waymo’s fault. And it was a mechanical failure, not an error by Waymo’s self-driving software.
--
That's the most serious of the crashes. The article is written in a way that makes it sound like the autonomous driving had issues, when it was a mechanical failure. Not saying that's bad, but it isn't what the title of the article was trying to say. It could just as well have been $any_car.
That's not ambiguous at all. If you are unable to stop in time when the car in front of you slams on the brakes, you were following too closely, and that is your fault.
My guess is that the ambiguity is about the trade-off: "even if no one should hit you for slamming on the brakes, is it a good risk to take for the sake of a cat?"
https://electrek.co/2025/08/04/tesla-withheld-data-lied-misd...