this post was submitted on 24 Aug 2023
415 points (89.5% liked)
Technology
Autopilot and FSD Beta have never been something you're supposed to rely on, and every Tesla owner knows this. If they drive over someone, it's the fault of the driver, not the vehicle. Accidents will always happen, and if you focus on individual incidents you're missing the big picture. You're never going to reach 100% safety, and even 99.99% safety still means about 33,000 accidents a year in the US alone. Also, what little statistics we have indicate that drivers with FSD or Autopilot engaged already crash less than the average.
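(A rough sanity check on that figure, assuming it's derived from the US population of roughly 330 million: 99.99% safety leaves 0.01% affected, and 330,000,000 × 0.0001 = 33,000, so on the order of 33,000 people per year would still be involved in an accident even at that rate.)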
Source
Too bad there are so many owners relying on it.
Not at all. In fact, the point is to focus on classes of crashes. Which Tesla fails miserably at.
This is an outright lie. Period. Having owned a Tesla since 2018, I'm quite familiar with the garbage software and with the user community that says, out of one side of its mouth, that no one should trust it, and out of the other side that it's better than a human.
I literally can't debate with you guys, because you flat-out refuse to believe any evidence presented to you and instead base your opinions on anecdotes and individual incidents. If those stats are made up, then provide a better source that backs you up.
You seem to be confusing evidence with marketing. That's your problem. If you actually owned one, you'd know I was right.
That's anecdotal evidence. There's nothing scientific about a sample of one.
I don't know how to break this to you, but you don't have any data at all. Please do link it here for me if you do, though. I'd love to see what you think is "data". If it's Tesla's EOQ slides, I'll at least have got some great laughs out of this thread.
I'm basing my view on the only data I'm aware of, which is linked above. You might not like the source of that data, and that's fair, but your not liking it is not evidence to the contrary. Even the article that refutes those stats doesn't provide any significant evidence that Tesla Autopilot/FSD is more dangerous than a human driver. You can focus on individual incidents all day, but what truly matters is the big picture: whether, on average, it's safer than a human driver or not. Currently it seems that at worst it's about on par with humans, and thus it's just a matter of time until it's better. A lot better.
I have no dog in this fight, and I'm fully willing to grant that the data Tesla hands out is very likely more or less misleading, but that still doesn't make my basic argument wrong: Teslas and their driver-assist systems aren't inherently any more dangerous than the competition. Tesla just gets way more attention in the media.
Of course you link Brad's bullshit. Man I just love it.
The data that does exist is NHTSA's Standing General Order crash data, which shows Tesla absolutely sucks at life. Period.