this post was submitted on 05 Oct 2023
Doesn't sound as bad as the other headline that was lurking in another magazine. I do think it's an unexpected situation, and the AI handled it as best it could.
It couldn’t avoid her, no. The bigger problem was that the car parked on her, I think.
"Do nothing" is usually not that bad an approach to dealing with an unknown situation. I could easily see a situation where trying to back away from the person you just hit would increase the damage.
As other comments have suggested, we should wait for the video before judging whether this was really a bad choice by the autonomous car.
That doesn't get any truer even if you repeat it a few more times.
Truth is that a general approach was not sufficient here. This car's programming was NOT good enough. It made a bad decision with bad consequences.
And "no it isn't" isn't a very convincing argument to the contrary.
Yes, in this particular case, maybe the car should have moved a bit. I'm talking about the general case. What are the odds that a car happens to come to a stop with its wheel exactly on top of someone's limb, versus having that wheel finish up somewhere near the person where further movement might cause additional harm? And how can the car know which situation it's currently in?
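The tradeoff being argued here can be framed as an expected-harm comparison between "stay put" and "move". A toy sketch (every probability and harm value below is made up purely to illustrate the structure of the decision, not a real estimate of anything):

```python
# Toy expected-harm comparison for "stay put" vs "back away" after a collision.
# All numbers are hypothetical illustrations, not real estimates.

def expected_harm(p_bad: float, harm_if_bad: float, harm_if_ok: float) -> float:
    """Expected harm of an action, given the chance its worst case occurs."""
    return p_bad * harm_if_bad + (1 - p_bad) * harm_if_ok

# Staying put: small chance a wheel is on a limb (very high harm), else low harm.
stay = expected_harm(p_bad=0.05, harm_if_bad=100.0, harm_if_ok=5.0)

# Moving blind: higher chance of dragging or crushing further (high harm).
move = expected_harm(p_bad=0.30, harm_if_bad=60.0, harm_if_ok=2.0)

best = "stay" if stay <= move else "move"
print(best, stay, move)  # → stay 9.75 19.4
```

With these (invented) numbers, freezing wins on average even though it is catastrophic in the rare limb-under-wheel case, which is exactly the commenter's point: the right choice depends on probabilities the car cannot directly observe.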
Wrong question.
If you want autonomous cars outside in the real world (as opposed to artificial lab and test scenarios), then they have to deal with real world situations. This situation has happened in reality. You don't need to ask about odds anymore.
That is an engineering question. A good one. And again one of these that should have been solved before they let this car out into the real world.
This situation happened, yes. Do you think this is the only time an autonomous car will ever find itself straddling a pedestrian and need to decide which way to move its tires to avoid running over their head? You can't just grab one very specific case and tell the car to treat every situation as if it were identical to that one, when most cases are probably going to be quite different.