Waymo Runs Over Beloved Neighborhood Cat in Mission District
Posted 2 months ago · Active about 2 months ago
instagram.com · Tech · story
heated · negative · Debate · 80/100
Key topics
Autonomous Vehicles
Waymo
Safety Concerns
News article now published: https://sfstandard.com/2025/10/28/waymo-kills-cat-san-franci...
A Waymo self-driving car ran over and killed a neighborhood cat in San Francisco's Mission District, sparking debate about the safety and accountability of autonomous vehicles.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 18m after posting
Peak period: 29 comments (36-48h)
Avg per period: 6.9
Comment distribution: 55 data points, based on 55 loaded comments
Key moments
1. Story posted: Oct 28, 2025 at 6:33 PM EDT (2 months ago)
2. First comment: Oct 28, 2025 at 6:51 PM EDT (18m after posting)
3. Peak activity: 29 comments in the 36-48h window, the hottest stretch of the conversation
4. Latest activity: Nov 4, 2025 at 2:57 AM EST (about 2 months ago)
ID: 45740161 · Type: story · Last synced: 11/20/2025, 4:41:30 PM
So even 0 isn't "safe", lol
The interesting point will be when insurance companies reduce your rate if your car doesn’t have a steering wheel (or, equivalently, charge a “driving manually” fee). It might be obscured if car companies take on the risk themselves, but at some point people will start to notice that driving manually costs more.
Not "better than the best", but "safer than the average driver" - and if you aren't the only one on the road, your safety is a mix of your skill and everyone else's.
I was under 18, driving one of those little cars that tops out at around 50 km/h. I slammed on the brakes and managed to stop maybe 2 cm from the kitty, which made it off the street alive.
The scooter rider behind me pulled up and complained that I had almost killed him by slamming on the brakes. To this day, I still don't know if that was the right call. That guy could have been a dad, and I could have killed a father. Still, I couldn't bear the thought of killing the cat either.
The only reason the guy behind me survived is that he veered when I slammed on the brakes.
It sounds like, instead of learning a lesson about following too closely, he turned his mix of anger and fear onto you. Hopefully, with some time removed from the situation, he will realize that he should not follow so closely. And hopefully you won't be too affected by the guilt-tripping. (Obviously, when whatever is behind you cannot slow quickly, it's a good idea not to brake too hard, but in theory a scooter should be able to stop very quickly.) With hard calls, you can only do your best with the information you had at the time.
Still, this cat was on a busy stretch of 16th Street for nearly a decade and was unharmed by human drivers. I think Waymo failed pretty badly here. Some of the dismissive comments I've seen on this topic seem to me like they're making excuses.
Each story like this is probably a sad one, but hmm, an Instagram post about one of them reaching Hacker News because it involved a Waymo? Wow!
At least the number of articles suggesting that trolley problems are highly relevant to self-driving car implementations has gone down.
So it’s safe to go ahead and short it.
Remember it trades under GOOG
Perhaps assign a safety driver who puts their own driving license and criminal liability on the line, so the company cannot evade responsibility.
Pushing companies to investigate, take responsibility, and report these accidents is, overall, going to improve the reliability of the system.
The reality is that if you do not impose strong punishments, these companies won't have the incentive to fix it, or they will push these priorities way down the to-do list.
I think the point is you don't know for certain what you hit if you hit and run. The car should have enough collision detection to know when it's hit something.
That said, this story is sending up red flags with the "allegedly" in the title and lack of evidence beyond hearsay.
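For what it's worth, the self-detection the comment above asks for is plausible at a basic level. A minimal sketch, assuming a simple accelerometer-spike heuristic; the sensor interface, names, and thresholds are all invented here, and a real AV stack would fuse far more signals (bumper sensors, audio, perception-tracked objects):

```python
# Hypothetical sketch: flag a possible contact event from a spike in
# chassis accelerometer readings. Threshold and data layout are
# invented for illustration only; this is not any real vehicle's API.

from dataclasses import dataclass

IMPACT_G_THRESHOLD = 1.5  # made-up jolt threshold, in g

@dataclass
class AccelSample:
    t: float       # timestamp, seconds
    jolt_g: float  # magnitude of the acceleration spike, in g

def detect_impacts(samples):
    """Return timestamps where the jolt exceeds the threshold, i.e.
    candidate 'we hit something' events to log, stop for, and review."""
    return [s.t for s in samples if s.jolt_g >= IMPACT_G_THRESHOLD]

trace = [AccelSample(0.0, 0.1), AccelSample(1.2, 2.3), AccelSample(2.0, 0.2)]
print(detect_impacts(trace))  # -> [1.2]
```

The hard part is not the check itself but tuning it so potholes and curb taps don't drown out real contact events, which is presumably why logging and human review would sit behind any such trigger.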
This is really only true for Waymo, who appear to be the only folks operating at scale who did the work properly. Robotaxi, Cruise and all the others are in a separate bucket and should be statistically separated.
Very bleak and very tech-bro-coded; no wonder regular people have started seeing us as pariahs. We deserve it.
Waymo? How is this ambiguous? Waymo makes the car, writes the software, and operates the vehicle.
https://waymo.com/blog/2025/05/waymo-making-streets-safer-fo...
Waymo should absolutely be responsible for this, though, and be treated as if it were a human who hit the cat.
Also note that there is an enormous issue of trust and dignity.
By "trust" I mean: We have seen how data and statistics are created. They are useful on average, but trusting them on very important, controversial topics, when they come from the private entity that stands to benefit from them, is an unrealistic ask for many normal humans.
By "dignity" I mean: Normal humans will not stand the indignity of their beloved community members, family, or pets being murdered by a robot designed by a bunch of techies chasing profit in silicon valley or wherever. Note that nowhere in that sentence did I say that the techies were negligent - they may have created the most responsible, reliable system possible under current technology. Too bad normal humans have no way of knowing if that's the case. Especially humans who are at all familiar with how all other software works and feels. It's a similar kind of hateful indignity and disgust to when the culpable party is a drunk driver, though qualitatively different. The nature of the cause of death matters a lot to people. If the robot is statistically safer, but when it kills my family it's because of a bug, people generally won't stand for that. But of course we don't know why exactly, as observers of an individual accident - maybe the situation was truly unavoidable and a human wouldn't have improved the outcome. The statistics don't matter to us in the moment when the death actually happens. Statistics don't tell us whether specifically our dead loved one would have died at the hands of a human driver - only that the chances are better on average.
Human nature is the hardest thing for engineers to relate to and account for.
I wonder why the title says "allegedly" but the article doesn't.
You probably get more honest answers by presenting a trolley problem and then requiring a response within a second. It's a great implicit bias probe.
But if this is the worst that can be said about Waymo then that gives me a lot of confidence in their general driving abilities.
Otherwise it's a slippery slope of "well but it's generally good"