Tesla’s Autonomous Driving Claims Might Be Coming to an End [video]
Posted 4 months ago · Active 4 months ago
youtube.com · Tech story
heated · negative
Debate
85/100
Key topics
Tesla
Autonomous Driving
Elon Musk
A YouTube video critiques Tesla's autonomous driving claims, sparking a heated discussion on the safety and efficacy of Tesla's Autopilot and Full Self-Driving features.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 7m after posting
Peak period: 28 comments in 0-6h
Avg per period: 5.5
Comment distribution: 55 data points (based on 55 loaded comments)
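The figures in this panel can be reproduced from raw comment timestamps. Here is a minimal sketch, assuming each comment is represented as a POSIX timestamp and that "Avg / period" means total comments divided by the number of non-empty windows; the function name and that interpretation are assumptions, not documented behavior of the snapshot tool.

```python
from collections import Counter

def activity_stats(story_time, comment_times, bucket_hours=6):
    """First-comment delay (minutes), peak window label, peak count, avg per window."""
    delays = sorted(t - story_time for t in comment_times)  # seconds after posting
    first_delay_min = delays[0] / 60

    # Group comments into fixed windows after posting: 0-6h, 6-12h, ...
    window_s = bucket_hours * 3600
    buckets = Counter(int(d // window_s) for d in delays)
    peak_idx, peak_count = max(buckets.items(), key=lambda kv: kv[1])
    avg_per_window = len(delays) / len(buckets)

    label = f"{peak_idx * bucket_hours}-{(peak_idx + 1) * bucket_hours}h"
    return first_delay_min, label, peak_count, avg_per_window
```

With 55 comments spread over ten non-empty 6-hour windows, this would yield the 5.5 "Avg per period" shown above.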
Key moments
- 01 Story posted: Sep 4, 2025 at 7:58 PM EDT (4 months ago)
- 02 First comment: Sep 4, 2025 at 8:05 PM EDT (7m after posting)
- 03 Peak activity: 28 comments in 0-6h, the hottest window of the conversation
- 04 Latest activity: Sep 8, 2025 at 11:29 PM EDT (4 months ago)
ID: 45133607 · Type: story · Last synced: 11/20/2025, 5:54:29 PM
2) Tesla's fraud eclipses Nikola's few-billion-dollar amateur hour.
> Just want to be absolutely clear that everyone’s top priority is achieving an amazing Autopilot demo drive. Since this is a demo, it is fine to hardcode some of it, since we will backfill with production code later in an OTA update. I will be telling the world that this is what the car will be able to do, not that it can do this upon receipt
You could maybe call it staged if they were, say, very selective about the conditions and course, but within those constraints it was actually doing what they said it did. That doesn't seem to have been the case.
Personal computer appliance users truly blame themselves for the failures of their devices, no matter the degree of wreckage and injury caused by bad design, while cheering the men (unbearable a*holes) who foist malfunctioning and dangerous tech upon them.
The captains of this industry are notorious for refusing to take responsibility for their mistakes and actively planning to deploy tech they themselves claim is hazardous, while being continually cheered by investors hungry to make a killing.
Tesla is a case study for the world about the hazards of California Ideology libertarianism and the precedence of greed over personal responsibility and justice.
Since Ronald Reagan, personal responsibility has never been a libertarian (Republican) trait. It's always "Oops I did it again!" and "I forgot!"
No surprise the "trolley problem" is the signature thought experiment for the industry as its technocrats constantly hunt for ways to escape responsibility and seek unearned profits.
With the woeful performance of Musk's cars and robots, DOGE fiasco, Federal schedule drug habit, goonerism, and inability to maintain personal relationships, Musk's plans for a mars adventure are psychotic.
But what fun to watch!
The civil justice system is slow (and COVID made the whole justice system even slower for quite a while), but aside from that, maybe they're not? A quarter billion dollar verdict for a 2019 crash was recently returned against them.
https://www.reuters.com/legal/litigation/tesla-rejected-60-m...
I was glad when they started charging for it, 'cuz it just meant fewer dangerous Teslas on the road.
I have no doubt we'll get to full autopilot...eventually, and we've "gotten there" already with adaptive cruise control, BUT in the interim, if you can't pay full attention while driving you shouldn't be driving.
Self-driving trucks could double that, which would mean much better use of the trucks, faster deliveries, and an "epidemic" of lower costs across the economy.
Things start to improve again once the features get even more capable.
IMO Tesla fsd is well past the hump compared to most current cars with acc+lateral.
But what makes this story noteworthy? 8k per month sounds very naive when you consider that you can indirectly foster a herd of sheep with dark money by paying for and click-farming advertisements on their content, which is already happening today.
- The scary effect music shows it's intended as a hit piece.
- The constant intermixing of Autopilot and Full Self Driving, two very different things.
- Implying that driving just based on visual input is unsafe, when that is how all humans drive.
Of course, that the video is an unserious hit piece doesn't mean these Tesla features are safe. But I need something more serious to be convinced.
Or the accident stats that don't count as an accident any collision without airbag deployment, regardless of injuries?
Except that's not at all how humans drive. Sure, there's visual input, but human vision is largely based on expectations. You see what you expect and ignore a lot of things. The predictive engine of the brain does a lot more than evaluate the present environment. This is both good and bad insofar as safety is concerned.
Not really clear what the argument is meant to be here...
Should I go on? This is trivial stuff, man.
An interesting question might be if the automation system can be evaluated against the human perceptual system, or amalgamation of systems. This seems like an exceedingly difficult premise to evaluate though, given the varied and dynamic nature of the real-world driving environment.
Your argument that human processing is superior to Tesla processing seems orthogonal (that's about how the data is handled, not about what input data is available).
I think I would concede the argument that it's "just processing" once someone has recreated the processing in an automation system. That seems unlikely. Or when an automation system outperforms a human in every situation one might encounter in a chaotic driving environment.
Your analogy is a bad-faith definition of orthogonal.
That's all I meant to say.
> ... Teslas have at least as good a visual input system as humans, so that alone can't be a reason they're worse drivers.
I don't think that's an accurate description of the original claim. At this point I do wish it had been. This perpetuates the discarding of how past experience affects perception, which seems like part of the "visual input system" of humans to me.
Implying that driving just based on single-eye 20/120 vision [1] is safe, when that is in fact illegal.
[1] https://news.ycombinator.com/item?id=44129317
Pretty wild to see several comments condemning their imaginary single camera system. Where do these lies come from?
Here is a diagram: https://www.researchgate.net/figure/Locations-of-cameras-on-...
First of all, it is convenient that you ignored the part about 20/120 vision which is below the legal minimum requirements for driving and thus no human is legally allowed to drive with such poor vision.
As for your misunderstanding, a human has two eyes placed in a binocular orientation on a swivel mount allowing for binocular depth perception when the head is pointed in a direction. That is how humans drive.
Except for the forward direction, the Tesla cameras have largely non-overlapping fields of view. As they are not mounted on a swivel, they have only single-eye vision in any non-forward direction. In the forward direction, the Tesla cameras do have overlapping fields of view, but they are not only mounted too close together to support binocular depth perception, they also have different focal lengths, which prevents it. As such, Teslas have only single-eye vision in the forward direction as well. So they have only single-eye perception of objects in every direction.
Add in visual acuity far below the minimum requirements for legal human driving, and it is safe to say that these sensors are, in fact, not at all how humans drive. Any manufacturer who claims as much is making false claims willfully and knowingly.
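To put a number on the baseline point: in a standard pinhole stereo model, depth is Z = f·B/d, so a fixed disparity error produces a depth error that grows as the baseline shrinks. The sketch below uses entirely hypothetical values for focal length, baseline, and disparity error; it illustrates the scaling only and is not a model of Tesla's actual cameras or of human vision.

```python
def depth_error(z_m, f_px, baseline_m, disparity_err_px=1.0):
    """Approximate depth uncertainty at range z_m for a stereo pair: dZ ~ Z^2 * dd / (f * B)."""
    return (z_m ** 2) * disparity_err_px / (f_px * baseline_m)

f_px = 1000.0                   # assumed focal length in pixels (hypothetical)
for baseline in (0.065, 0.30):  # ~human eye spacing vs. a wider hypothetical mount
    err = depth_error(z_m=50.0, f_px=f_px, baseline_m=baseline)
    print(f"baseline {baseline * 100:.1f} cm -> ~{err:.1f} m depth error at 50 m")
```

Under these assumed numbers, even an eye-width baseline leaves depth at 50 m poorly constrained, which is the point about closely spaced, differently focused cameras behaving more like a single eye than a stereo pair.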
The actual reasons behind it are surely as varied as the users themselves... ;)
Might produce an interesting little dataset for the "data science folks" out there to add a "State your reasoning for this Flag:" requester to the flagging process.
That's the new "science" (or religion / cult?) surrounding lying these days isn't it? You just have to repeat a lie enough times and declare all the actual facts as "fake news" or "hoax" and the lie becomes truth, and the truth a lie.
Even the GP stated "worked for me" over and over to support the claim that it's generally working.
"Cult" is a decent description.
i've had "fsd" for years and basically never use it now. i just don't trust it.
anytime there is a new version update, i do try to have it drive from the house to the market (about 3 miles: two rights at stop signs, two rights and 1 left at stop lights) and there has never been a single time where i didn't have to take over at least once.
and maybe the problem is that i have had "fsd" while it was going through development. the trust is low from the many times it has tried to kill me. so, whenever it is on, there is nothing but stress. and so i'm more apt than not to take over when i see anything even minutely out of the ordinary.
they literally never have, and only offer "10x better" claims via musk. it's 10x of an unknown number.
it's blatantly obvious that it sometimes works. that doesn't mean it reliably always works and never causes a serious catastrophic failure.
i've never had it work end-to-end where i live. i always have to intercept it. and i own 3 teslas. and i like my teslas. but FSD is not even close to the perfection you're decreeing.
https://www.youtube.com/watch?v=6ltU9q1pKKM