Tesla Is Urging Drowsy Drivers to Use 'Full Self-Driving'. That Could Go Wrong
Posted 3 months ago · Active 3 months ago
wired.com · Tech · story
Sentiment: skeptical / negative
Debate: 20/100
Tags: Tesla · Autonomous Vehicles · Road Safety
Key topics
Tesla
Autonomous Vehicles
Road Safety
Tesla is encouraging drivers to use 'Full Self-Driving' mode even when drowsy, raising concerns about safety and the technology's limitations, with commenters questioning the wisdom of this approach.
Snapshot generated from the HN discussion
Discussion Activity
Light discussion
- First comment: N/A
- Peak period: 3 comments in 0-1h
- Avg / period: 2
Key moments
- 01 Story posted: Sep 27, 2025 at 11:52 AM EDT (3 months ago)
- 02 First comment: Sep 27, 2025 at 11:52 AM EDT (0s after posting)
- 03 Peak activity: 3 comments in 0-1h, the hottest window of the conversation
- 04 Latest activity: Sep 27, 2025 at 4:55 PM EDT (3 months ago)
ID: 45396818 · Type: story · Last synced: 11/20/2025, 5:28:51 PM
Obviously the most responsible choice is not to drive while sleep-deprived, or to pull over to a safe location and rest if one is too drowsy to operate a vehicle.
That said, if someone refuses to do that, for whatever reason, I have a hard time believing the upcoming FSD 14 stack will be more dangerous than driving drunk, or than driving while drowsy enough to be comparably impaired.
The older pre-NN stacks definitely weren't "there", but it takes either meaningful ignorance or motivated partisanship to argue that FSD 13 isn't at least approaching the threshold of being a better driver than the worst drivers on the road, if not already definitively better.
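To make that relative-risk framing concrete, here is a back-of-envelope sketch in Python. Every number in it is a placeholder assumption chosen purely for illustration; none of the figures below is measured or published data.

```python
# Hypothetical crashes-per-million-miles figures; all values are
# placeholder assumptions for illustration, not measured data.
crash_rates = {
    "attentive sober driver": 2.0,
    "drunk driver": 10.0,
    "severely drowsy driver": 10.0,  # assumed comparable to drunk driving
    "supervised FSD (assumed)": 4.0, # placeholder; no published figure used
}

baseline = crash_rates["drunk driver"]
for condition, rate in crash_rates.items():
    print(f"{condition}: {rate / baseline:.1f}x the drunk-driving rate")
```

The point of the sketch is only the structure of the argument: the claim being debated is whether the FSD row sits below the drunk/drowsy rows, which is exactly the kind of question that needs audited crash data to settle.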
Speaking of ignorance and motivated partisanship, how do you explain the fact that Tesla has filed a motion in federal court to keep automated crash data away from the public?
If FSD is as good as you say, I would expect them to jump at the chance to show everyone the proof.
https://www.reuters.com/legal/government/musks-tesla-seeks-g...
When (not if) the cameras are obscured, there is no backup; the vehicle is driving blind.
This is a fundamental engineering failure: a lack of redundancy in a critical safety system. In my opinion it can't and won't be fully overcome without an additional, independent sensing channel.
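To illustrate the redundancy argument, here is a minimal sketch (hypothetical function and sensor names, not Tesla's actual software) of why a vision-only estimator fails outright when the camera is obscured, while a fused estimator degrades gracefully:

```python
from typing import Optional

def estimate_distance_vision_only(camera_m: Optional[float]) -> Optional[float]:
    # A single modality: an obscured camera leaves no estimate at all.
    return camera_m

def estimate_distance_fused(camera_m: Optional[float],
                            radar_m: Optional[float]) -> Optional[float]:
    # Two independent modalities: either sensor alone can still produce a
    # (degraded) estimate; both must fail before the system goes blind.
    readings = [r for r in (camera_m, radar_m) if r is not None]
    if not readings:
        return None  # only now is the vehicle truly driving blind
    return sum(readings) / len(readings)

# Camera obscured by glare, fog, or grime (None models a failed sensor):
print(estimate_distance_vision_only(None))   # None -> no backup
print(estimate_distance_fused(None, 42.0))   # 42.0 -> radar carries the load
```

Both sensors must fail before the fused pipeline goes blind; a single-modality pipeline is only ever one obscured lens away from having no estimate at all.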