this post was submitted on 20 Apr 2024
578 points (96.9% liked)
Technology
That's a problem with the software. The passengers in the car were never at risk and the car could have stopped at any time, the issue was that the car didn't know what was happening. This situation wouldn't have engaged the autopilot in the way we are discussing.
As an aside, if what you said is true, people at Tesla should be in jail. WTF
Tesla washes its hands of any wrongdoing with terms of use where the owner agrees they're responsible, bla bla bla.
Here's a related video.