this post was submitted on 16 Feb 2024
261 points (100.0% liked)

Technology

Air Canada must pay damages after chatbot lies to grieving passenger about discount | Airline tried arguing virtual assistant was solely responsible for its own actions

[–] [email protected] 1 points 9 months ago* (last edited 9 months ago) (1 children)

Remember that my argument here, and the deciding factor, is specifically about whether or not the customer believes the price they are being offered is genuine.

And that's what happened in this case. The man thought the chatbot was giving him genuine information. "My family member is dying. Do you have a discount for bereavement situations?" The chatbot: [and I guarantee you, it did say this] "I'm so sorry you're going through difficult times. Of course! Here's what you need to do." The customer was already in emotional turmoil, so we can't really expect him to say "wait, this is too good to be true," especially when the answer aligns with what he was asking.

It's not like he was saying "can you put me in first class for free because I feel like it?" It's practically "What's your bereavement discount policy?", which is something airlines offer at their discretion. So, yes, the company must honor it.

I do appreciate your comments.

[–] [email protected] 2 points 9 months ago* (last edited 9 months ago) (1 children)

I agree that's 100% what happened in this specific case. The customer had absolutely no reason to suspect the information they were given was bad, and the airline should have honoured the deal.

By the way, one of the top-level comments on the post was also mine, in which I expressed the same view and said "Shame on Air Canada for even fighting it."

Air Canada were completely and utterly wrong in this case - but I haven't been talking about this case! At least, I wasn't intending to!

If it seemed that way I can understand now why people were so vehemently against me.

My comments in this chain have all actually been trying to discuss how to determine, in the general case, which party is "in the right" when things like this happen.

There are cases like this Air Canada one where the customer is obviously right. We can also imagine hypothetical cases where I personally believe the customer would be in the wrong - for example, if the customer intentionally exploited a flaw in the system to game a $1 flight - which is, again, obviously not what happened here; it's just an example for the sake of argument.

My fundamental point at the start of this comment chain was that I don't actually think we need any new mechanisms to work this out, because the existing mechanisms we already have in place to determine who is right between a company and a customer all still apply and work exactly the same regardless of whether it is AI or not AI.

And that mechanism is, fundamentally, that the customer should generally be considered right as long as they have acted in good faith.

That's why I'm very pleased with the ruling that Air Canada were wrong here and they cannot dodge their responsibilities by blaming the AI.

I'm honestly glad I can put the stress of this days-long comment chain behind me, since it seems we weren't even arguing about the same thing this whole time!

[–] [email protected] 1 points 9 months ago

Haha, it really did seem like that.

What you say is reasonable, but it goes beyond a simple "non-human" system, since the same thing can happen with human beings (e.g. social engineering; didn't an individual send 25 million dollars to a scammer a couple of weeks ago?)

So, should a company honor an absurd offer? Probably not. But the pain they get from irate customers will be well deserved, since it's their fault for deploying a flawed solution to a problem that can be solved more effectively: a human operator, or a very good website search bar.