Lessons From the PG&E Outage
Key topics
As the lights went out in San Francisco, Waymo's self-driving cars ground to a halt, causing traffic jams and raising questions about their disaster recovery plan. Commenters were quick to point out that the outage highlighted the need for more robust planning, with some arguing that the company's "move fast and break things" approach prioritized profits over public safety. Others defended the technology, noting that stopping in uncertain situations is a reasonable precaution, and that remote dispatch might not always be the best solution. The debate underscored the complexities of deploying autonomous vehicles in real-world scenarios, where unexpected events can have unforeseen consequences.
Snapshot generated from the HN discussion
Discussion Activity
- Very active discussion
- First comment: 1h after posting
- Peak period: 0-12h (78 comments)
- Avg / period: 16 comments
- Based on 160 loaded comments
Key moments
1. Story posted: Dec 23, 2025 at 9:16 PM EST (19 days ago)
2. First comment: Dec 23, 2025 at 10:45 PM EST (1h after posting)
3. Peak activity: 78 comments in 0-12h (hottest window of the conversation)
4. Latest activity: Dec 30, 2025 at 10:31 PM EST (12 days ago)
Want the full context? Read the primary article or dive into the live Hacker News thread when you're ready.
Sounds like it was and you’re not correctly understanding the complexity of running this at scale.
The fact this backlog created issues indicates that it's perhaps Waymo that doesn't understand the complexity of running at that scale, because their systems got overwhelmed.
I'm very happy they're moving fast so hopefully fewer people die in the future
How's that for a real world trolley problem?
This is almost hilariously false. "Oh yeah, those words on paper? Well, they actually physically stopped me from running the red light and plowing into 4 pedestrians!"
> If you think that a giant, private, for-profit company cares about people's lives, you are in for a ride.
I honestly wonder how leftists manage to delude themselves so heavily? I'm sure a bunch of politicians really have my best interests at heart. Lol
It's very clearly proven that hitting a pedestrian at 50 km/h is exponentially more dangerous than hitting them at 30 km/h. It's very clearly proven that having physically separated bike lanes prevents deaths. It's very clearly proven that other measures like speed bumps, one-way streets, and smart traffic routing prevent deaths.
And I am not even going to respond to your idiotic "leftist" statement.
> And I am not even going to respond to your idiotic "leftist" statement.
This says more about you than it does me.
They're one-of-one still. Having ridden in a Waymo many times, there's very little "move fast and break things" leaking into the experience.
They can simulate power outages as much as they want (testing), but the production break had some surprises. This is a technical forum; most of us have been there: bad things happened, plans weren't sufficient. We can measure their next iteration by how they respond to production insufficiencies in the next event.
Also, culturally speaking, "they suck" isn't really a working response to an RCA.
That's what they're learning and fixing for in the future to give the cars more self-confidence.
This kind of attitude to me indicates a lack of experience building complex systems and responding to unexpected events. If they had done the opposite and been overly aggressive in letting Waymos manage themselves at intersections where the lights are out, would you be the first in line criticizing them for some accident happening?
All things being considered, I’m much happier knowing Waymo is taking a conservative approach if the downside means extra momentary street congestion during a major power outage; that’s much rarer than being cavalier with fully autonomous behavior.
Waymo didn't give much info. For example, is loss of contact with the control center a stop condition? After some number of seconds, probably. A car contacting the control center for assistance and not getting an answer is probably a stop condition. Apparently here they overloaded the control center. That's an indication that this really is automated. There's not one person per car back at HQ; probably far fewer than that. That's good for scaling.
Almost certainly no - you don’t want the vehicle to enter a tunnel, then stop half way through due to a lack of cell signal.
Rather, areas where signal dropouts are common would be made into no-go areas for route planning purposes.
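To make the speculation in these comments concrete, here is a minimal sketch of what a connectivity-fallback policy could look like. The thresholds, names, and behaviors are purely illustrative assumptions for this discussion, not anything Waymo has confirmed.

```python
# Illustrative sketch only: a hypothetical fallback policy for a driverless
# vehicle whose link to its remote-assistance center degrades. All names and
# thresholds are assumptions, not Waymo's actual logic.
from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    CONTINUE = auto()   # keep driving autonomously
    REROUTE = auto()    # prefer routes with reliable coverage
    PULL_OVER = auto()  # find a safe place to stop and wait


@dataclass
class LinkState:
    seconds_since_contact: float    # time since last heartbeat from dispatch
    assistance_requested: bool      # the car has asked a human for help
    seconds_awaiting_answer: float  # how long that request has gone unanswered
    in_known_dead_zone: bool        # e.g. a mapped tunnel with no coverage


# Hypothetical thresholds, chosen only to make the example concrete.
HEARTBEAT_TIMEOUT_S = 120.0
ASSIST_TIMEOUT_S = 30.0


def fallback_action(link: LinkState) -> Action:
    """Decide what to do when connectivity to dispatch is degraded."""
    # An unanswered call for help is the strongest signal something is wrong:
    # the car has already decided it cannot proceed on its own.
    if link.assistance_requested and link.seconds_awaiting_answer > ASSIST_TIMEOUT_S:
        return Action.PULL_OVER
    # Brief dropouts in mapped dead zones (tunnels, garages) are expected;
    # stopping halfway through a tunnel would be worse than continuing.
    if link.in_known_dead_zone:
        return Action.CONTINUE
    # A long, unexpected loss of the dispatch heartbeat: keep driving, but
    # bias route planning toward areas with reliable coverage.
    if link.seconds_since_contact > HEARTBEAT_TIMEOUT_S:
        return Action.REROUTE
    return Action.CONTINUE
```

The point the comments circle around is that "no answer from a human" and "no signal at all" are different failure modes, and a sane design would not map both to the same action.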
It is simply false advertising at this point.
Self-driving car advertisers like Musk or Waymo just want to co-opt this term because it sounds cool. The term also deliberately hides the fact that these vehicles surveil and track you.
I think it fits the state of affairs well-enough.
https://www.oxfordlearnersdictionaries.com/us/definition/eng...
The same applies to "autonomous drones", which are the most remote assisted machines imaginable.
But of course the advertising departments want to evoke an image of the Marlboro man saddling his horse rather than a GPS tracked, surveillance riddled, face scanning, remote assisted contraption.
It may even be appropriate to have medical professionals like paramedics (with transport units) standing by. There's no telling how you might react.
You are subject to road signs, traffic, police directions, etc while driving. In the event of a natural disaster it seems feasible that you could end up in a situation where you don't know how you ought to proceed. So neither are you "free of external influence or control" in an absolute sense.
This situation does not require a sophistic argument that we are not autonomous because we rely on the sun. If a child walks alone to school without asking for directions, it walks autonomously. If it has to call its parents or uses a GPS phone, it is not autonomous. This is really not that hard.
Obviously your point here highlights your pedantry: autonomy is not absolute. Despite being a mostly functioning and definitely autonomous human being, I sometimes have to call someone who knows better to ask for directions.
> If it has to call its parents or uses a GPS phone, it is not autonomous.
You ought to be able to imagine plenty of scenarios where this would be the case (ie the child got lost) and yet clearly you still believe the child to qualify as autonomous. Analogously, the vehicles are not necessarily disqualified as being considered such despite being unable to independently navigate in some subset of scenarios.
https://brx-content.fullsight.org/site/binaries/content/asse...
Harvesting outrage is about the only reliable function the internet seems to have at this point. You're not seeing enough of it?
It's approximately one 9/11 a month. And that's just the deaths.
Worldwide, 1.2m people die from vehicle accidents every year; car/motorcycle crashes are the leading cause of death for people aged 5-29 worldwide.
https://www.transportation.gov/NRSS/SafetyProblem
https://www.who.int/news-room/fact-sheets/detail/road-traffi...
I mean really. I’m a self driving skeptic exactly because our roads are inherently dangerous. I’ve been outraged at Cruise and Tesla for hiding their safety shortcomings and acting in bad faith.
Everything I’ve seen from Waymo has been exceptional… and I literally live in a damn neighborhood that lost power, and saw multiple stopped Waymos in the street.
They failed safe: not perfect, definitely needs improvement, but safe. At the same time we have video of a Tesla blowing through a blacked-out intersection, and I saw a damn Muni bus do the same thing, as well as at least a dozen cars doing the same damn thing.
People need to be at least somewhat consistent in their arguments.
- build infrastructure that promotes safe driving, and
- train drivers to show respect for other people on the road
However, those are both non-starters in the US. So your answer, which comes down to "at least self-driving is better than those damn people" might be the one that actually works.
If we could do anything like "train drivers to show respect for other people on the road" at scale, then we'd live in a different world by now.
However, I used to live in a place where every local driver did an 'after you' that included pedestrians, regardless of road rules, and generally drove the speed limit (and usually less).
Both of these places in the United States!
The latter is not impossible, just rare.
The US isn't close to having the highest per-capita traffic fatality rate in the Western Hemisphere.
I count 14 countries higher.
https://en.wikipedia.org/wiki/List_of_countries_by_traffic-r...
Notice Eastern Europe is always left out of social issue discussions.
Some Mediterranean bordering nations are always left out of government efficacy discussions.
It's not about comparing like-ish for like-ish. It's about finding a plausibly deniable way to frame the issue so that the US gets kneecapped by the inclusion of West Virginia or 'bama or New Mexico or Chicago or whatever else it is that is an outlier and tanks its numbers, while the thing on the other side of the comparison exempts that analogue entirely.
[1] https://en.wikipedia.org/wiki/Western_world#/media/File:West...
[2] https://en.wikipedia.org/wiki/Western_world
"English-speaking countries with a similar GDP per capita" (aka the "first-world anglosphere") seems like a reasonable proxy for "countries whose policies we might look to emulate due to their comparable culture & wealth."
Maybe there's something to be said for left-hand driving, I see Japan ranks very highly too. ;)
The real reason is I guess we take road safety seriously, we have strict drink-driving laws, and our driving test is genuinely difficult to pass.
I seem to remember road safety also featuring prominently throughout the primary national curriculum.
And of course, our infamous safety adverts that you never quite forget, such as: https://www.youtube.com/watch?v=mKHY69AFstE
In countries that drive on the right, drivers use their dominant hand for any controls that are on the inward side and their other hand for the controls that are on the outward side of the driver.
Generally that means that the non-dominant hand handles exterior lighting, turn signals, windows, and locks. The dominant hand handles windshield wipers, transmission, and anything on the center console such as the climate and entertainment systems, and often also the navigation system.
In left-drive countries that is mostly reversed for right-handed people, with the possible exception of the exterior lighting, turn signals, and windshield wipers. Those exceptions are the controls that are usually on stalks attached to the steering column. From what I've read, sometimes manufacturers use the same stalk positions in left- and right-drive models instead of reversing them like they do the rest of the controls.
Could dominant vs non-dominant hand for operating things on the center console make a difference? If everyone obeyed safety recommendations I'd expect it to not make enough difference to be noticeable, but not everyone obeys safety recommendations 100% of the time.
If someone for example tried to type in a destination using the on-screen keyboard on the navigation system console while driving I'd expect that they would take longer to do so if they were using their non-dominant hand, so they would be distracted longer.
Large airplanes usually have a pilot on either side of the center console, and they AFAIK take turns operating the airplane, so if it made a difference, I'd expect it to have been studied by the aerospace industry. Given that I've never seen it mentioned in any of the airplane incident reports I've read, it probably isn't a big factor, and I see no reason why it would be different for cars.
Is this a serious comment? Is that actually what you think they meant by "Western"? When people talk about Russia vs "the West", do you also think they mean Russia vs the Western hemisphere?
If I kill someone with my car, I’m probably going to jail. If a Waymo or otherwise kills someone, who’s going to jail?
"Accountability" is fucking worthless, and I am tired of pretending otherwise.
So this is very much not at all true.
https://freakonomics.com/podcast/the-perfect-crime/
My entire point is that we don’t care about human lives on our roads. So yelling about the safety concerns about Waymos makes no sense.
This is rarely true in the US. A driver's license is a license to kill with near impunity.
https://www.cbsnews.com/chicago/news/man-gets-10-days-in-jai...
Also, that still doesn't excuse Waymo blocking roads. These are two different, independent problems. More people die in car crashes than in plane crashes, but that doesn't mean we should replace all cars with planes either.
The education bit can't be fixed by the government in the short term, though, as the outcomes correlate too strongly with stable home life conditions (which have been in free fall over the past 50 years).
"Parental authority" should not be an educational goal.
1. [citation needed]
2. Just because it's theoretically possible doesn't mean it's an option that actually exists. I'm sure you can dream up some plan for a futuristic utopia where everybody lives in a 15-minute city, no private cars are needed, and the whole transportation system is net zero, but that doesn't mean it's a realistic option that'll actually get implemented in the US, nor does it mean that we should reject hybrids or EVs on the basis that they're worse than the utopian solution, even though they're better than the status quo of conventional ICE cars.
Road casualty statistics for Denmark (which are about 7x lower than the US's).
> 2. Just because it's theoretically possible, doesn't mean it's an option that actually exists.
Denmark exists. I've been there.
Imagine that when smartphones were first coming out they could only function with recent battery-tech breakthroughs. Mass adoption was pretty quick, but there was scattered reporting that a host of usage patterns could cause the battery to heat up and explode, injuring or killing the user and everyone in a 5-10 ft radius.
Now, the smartphone is a pretty darn useful device and rapidly changes how lots of businesses, physical and digital, operate. Some are pushing for bans on public usage of this new battery technology until significant safety improvements can be made. Others argue that it's too late, we're too dependent on smartphones and banning their public use would cause more harm than good. Random explosions continue for decades. The batteries become safer, but also smartphone adoption reaches saturation. 40,000 people die in random smartphone explosions every year in the US.
The spontaneous explosions become so common and normalized that just about everyone knows someone who got caught up in one, a dead friend of a friend, at least. The prevailing attitude is that more education about what settings on a phone shouldn't be turned on together is the only solution. If only people would remember, consistently, every time, to turn on airplane mode before putting the phone in a pocket. Every death is the fault of someone not paying sufficient attention and noticing that the way they were sitting was pressing the camera button through their pants. Every phone user knows that that sort of recklessness can cause the phone to explode!
You as an engineer know how people interact with the software you deploy, right? You know that regardless of education, a significant portion of your users are going to misunderstand how to do something, get themselves in a weird state, tap without thinking. What if every instance in your logs of a user doing something strange or thoughtless was correlated with the potential for injury? You'd pull your software from the market, right? Not auto-makers. They fundamentally cannot reckon with the fact that mass adoption of their product means mass death. Institutionally incapable.
The only responsible thing to do is to limit automobile use to those with extensive training and greatly reduce volume. The US needs blue collar jobs anyway, so why not start up some wide-scale mass-transit projects? It's all a matter of political will, of believing that positive change is possible, and that's sorely lacking.
That’s an extraordinary claim.
We all know we can die when we drive poorly or ignore shocks and tires. But we don't like the idea of dying because of someone else.
That would be like every traffic incident ever? I don't think the US has public cars or state-owned utilities.
The only other option I can think of is to build some kind of high-density, low-power, solar-powered IoT network that is independent of current infrastructure, but then where is the spectrum for that?
https://www.telecomtrainer.com/lte-v2x-vs-nr-v2x-key-differe...
That's hardly new. What do you think happens to traffic when a semi flips over on a busy interstate, or electricity goes out, turning all traffic lights into 4 way stops and severely limiting throughput?
What happens when one company's engineering failure does that to most roads?
For reference, the US considers tactically blocking traffic to be something that smart terrorists or nation state adversaries would want to do to significantly harm the US economically.
What do these cars do if Google's entire self driving infrastructure falls over because some component gets misconfigured? It will happen eventually.
How many human drivers did similar because the power went out?
Or at least frequently enough to supply multiple subreddits dedicated to these people.
There were indeed accidents, and so yes, human-driven cars were in fact stopped in the middle of traffic.
I doubt they have more than that.
Does anyone know if a Waymo vehicle will actually respond to a LEO giving directions at a dark intersection, or if it will just disregard them in favour of treating it as a 4 way stop?
In a lot of cases rather than seasonal it will be a surge every weekend.
I often see human drivers being confused, with the police officers gesturing more and more until the person figures it out.
I remember being in a construction area where they had temporary traffic lights, PLUS a flagman directing traffic against the red lights. sigh.
Tesla fanboys gush about how FSD can understand LEOs in irregular traffic conditions, but no company I'm aware of has confirmed their systems are capable.
Waymo said they normally handle traffic light outages as 4-way stops, but sometimes call home for help - perhaps if they detect someone in the intersection directing traffic?
Makes you wonder in general how these cars are designed to handle police directing traffic.
We know that Waymos phone home when needed, but not sure how Tesla handles these situations. I'm not sure how you conclude anything about Tesla based on their current temporary "safety monitor" humans in the cars - this is just a temporary measure until they get approval to go autonomous.
I wonder who gets the ticket when a driverless car does break the law and gets stopped by police? If it's a taxi service (maybe without a passenger in the car) then maybe it'd be the service, but that's a bit different than issuing a traffic ticket to a driver (where there are points as well as a fine).
What if it's a privately owned car: would the ticket go to the car owner, or to the company that built the car?!
They do follow hand signals from police. There are many videos documenting the behaviour. Here is one from waymo: https://waymo.com/blog/2024/03/scaling-waymo-one-safely-acro...
Look for the embed next to the text saying “The Waymo Driver recently interpreting a police officer’s hand signals in a Los Angeles intersection.”
Or here is a video observing the behaviour in the wild: https://youtu.be/3Qk_QhG5whw?si=GCBBNJqB22GRvxk1
Do you want confirmation about something more specific?
That ~1000 drivers on the road are all better trained on what to do in the next power outage is incredible.
There will always be unexpected events and mistakes made on the roads. Continual improvement that is locked in algorithmically across the entire fleet is way better than any individual driver's learning / training / behavior changes.
From my understanding, the reason the Waymos didn't handle this was because humans were breaking traffic rules and going when they shouldn't have been. If most humans had navigated it fine, then Waymos would have handled this better.
Can't just program it to be all "when confused copy others" because it will invariably violate the letter of the law and people will screech. So they pick the legally safe but obviously not effective option, have it behave like a teenager on day 1 of drivers ed and basically freeze up. Of course that most certainly does not scale whatsoever, but it covers their asses so it's what they gotta do.
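As an illustration of the conservative policy this comment describes (treat a dark signal as an all-way stop, escalate to a human rather than copy other drivers), here is a minimal sketch under assumed thresholds and helper names; none of it reflects Waymo's confirmed behavior.

```python
# Illustrative sketch of a conservative dark-intersection policy: behave like
# an all-way stop and escalate to dispatch rather than "copy whatever other
# drivers are doing". All names and numbers are hypothetical.
from enum import Enum, auto


class IntersectionAction(Enum):
    PROCEED = auto()                # signal working, normal right-of-way rules
    TREAT_AS_ALL_WAY_STOP = auto()  # the legally safe default for a dark signal
    REQUEST_REMOTE_HELP = auto()    # ask a human at dispatch for guidance
    HOLD_POSITION = auto()          # stay put until guidance arrives


# Hypothetical tuning values, chosen only to make the example concrete.
CONFUSION_LIMIT_S = 20.0  # how long to try the all-way stop before asking for help
HELP_TIMEOUT_S = 60.0     # how long an unanswered request can sit


def dark_intersection_policy(
    signal_dark: bool,
    officer_directing: bool,
    seconds_unable_to_proceed: float,
    seconds_help_unanswered: float,
) -> IntersectionAction:
    if not signal_dark:
        return IntersectionAction.PROCEED
    # A human directing traffic is the ambiguous, high-stakes case:
    # defer to dispatch rather than guess at gestures.
    if officer_directing:
        return IntersectionAction.REQUEST_REMOTE_HELP
    # Default: the legally safe behavior, even if surrounding drivers improvise.
    if seconds_unable_to_proceed < CONFUSION_LIMIT_S:
        return IntersectionAction.TREAT_AS_ALL_WAY_STOP
    # Still stuck: ask a human; if nobody answers (say, dispatch is overloaded
    # during a citywide outage), hold rather than improvise.
    if seconds_help_unanswered > HELP_TIMEOUT_S:
        return IntersectionAction.HOLD_POSITION
    return IntersectionAction.REQUEST_REMOTE_HELP
```

The last branch is where the dispatch backlog described in the thread bites: an unanswered request degrades to holding position, which is exactly the frozen-in-the-street behavior people observed.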
But over here in the reality of the world that we have to interact with, this concept of perfect rule-following will never, ever happen -- unless something first manages to wipe the last of the stain of humanity off of the earth's face.
Imagine I created a magic bracelet that could reduce bicycling related deaths and injuries from 130,000 a year to 70,000. A great win for humans! The catch is that everyone would need to wear it, even people that do not ride bikes, and those 70,000 deaths and injuries would be randomly distributed among the entire population. Would you wear it?
Safe vs unsafe isn't as simple as who gets a 10/10 on the closed course test. Humans are more predictable than random chance would allow, often even when drunk or distracted. I can't count how many times I've seen someone wobbling on the road and knew to stay back. You can also often tell when someone might yank over into your lane based on them flying up in the other lane, getting just in front of you in that lane and then wiggling a bit, clearly waiting for the first chance to pull in front of you and take off. There are lots of other little 'tells' that, if you're a defensive driver, let you avoid countless accidents.
Being a prudent defensive driver goes out the window when the perfect speed-limit-adhering driver next to you goes straight to -NaN when someone drives past it with Christmas lights on their car, or the sun glares off oversized chrome rims, or an odd-shaped vehicle doesn't match "vehicle" in the database, or, or, or.
* I'm very much not saying that the example I mentioned above is reason enough; I'm saying that I'm not sure enough thought is being put into how many more pages I could go on, and I'm just some schmuck that worked for some number of years on the underlying technology - not the guy watching it fail in imaginative ways on the road.
Something said earlier that really overestimated what’s happening: it doesn’t get smarter, it gets another “if” statement.
That's just what getting smarter is though. I mean, we want to see the human "if" as somehow better than the machine "if" due an obvious bias, but mechanically, what's the difference?
https://old.reddit.com/r/SelfDrivingCars/comments/1pem9ep/hm...
Waymo halts service during S.F. blackout after causing traffic jams
https://news.ycombinator.com/item?id=46342412