this post was submitted on 17 May 2024
502 points (94.8% liked)

Technology

59598 readers
3336 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 1 year ago
MODERATORS
[–] [email protected] 56 points 6 months ago (3 children)

I'm a bit annoyed at all the people being pedantic about the term hallucinate.

Programmers use preexisting concepts as allegory for computer concepts all the time.

Your file isn't really a file, your desktop isn't a desk, your recycling bin isn't a recycling bin.

[Insert the entirety of Object Oriented Programming here]

Neural networks aren't really neurons, genetic algorithms aren't really genetics, and the LLM isn't really hallucinating.

But it easily conveys what the bug is. It only personifies the LLM because the English language almost always personifies the subject. The moment you apply a verb to an object, you imply it performed an action, unless you limit yourself to esoteric words/acronyms or use several words to overexplain every time.

[–] [email protected] 14 points 6 months ago* (last edited 6 months ago) (2 children)

It's easily the worst problem of Lemmy. Sometimes one guy has an issue with something and suddenly the whole thread is about that thing, as if everyone thought about it. No, you didn't think about it, you just read another person's comment and made another one instead of replying to it.

I'd never heard anyone complain about the term "hallucination" for AIs, but suddenly in this one thread there are 100 clone comments instead of a single upvoted one.

I get it, you don't like "hallucinate", just upvote the existing comment about it and move on. If you have anything to add, reply to that comment.

I don't know why this specific thing is so common on Lemmy, though; I don't think it happened on Reddit.

[–] [email protected] 3 points 6 months ago

"Hallucination" pretty well describes my opinion on AI generated "content". I think all of their generation is a hallucination at best.

Garbage in, garbage out.

[–] [email protected] 3 points 6 months ago

I don't know why this specific thing is so common on Lemmy, though; I don't think it happened on Reddit.

When you're used to knowing a lot relative to the people around you, learning to listen sometimes becomes optional.

[–] [email protected] 4 points 6 months ago

They're nowadays using it to humanize neural networks, and thus oversell their capabilities.

[–] [email protected] -1 points 6 months ago* (last edited 6 months ago) (2 children)

What I don’t like about it is that it makes it sound more benign than it is. Which also points to who decided to use that term - AI promoters/proponents.

Edit: it’s like all of the bills/acts in congress where they name them something like “The Protect Children Online Act” and you ask, “well, what does it do?” And they say something like, “it lets local police read all of your messages so they can look for any dangers to children.”

[–] zalgotext 16 points 6 months ago (1 children)

The term "hallucination" has been used for years in AI/ML academia. I was reading about AI hallucinations ten years ago when I was in college. The term was originally coined by researchers and mathematicians, not the snake oil salesmen pushing AI today.

[–] [email protected] 5 points 6 months ago (1 children)

I had no idea about this. I studied neural networks briefly over 10 years ago, but hadn’t heard the term until the last year or two.

[–] [email protected] 0 points 6 months ago

We were talking about when it was coined, not when you first heard it.

[–] [email protected] 7 points 6 months ago* (last edited 6 months ago) (1 children)

In terms of LLM hallucination, it feels like the name very aptly describes the behavior and severity. It doesn't downplay what's happening because it's generally accepted that having a source of information hallucinate is bad.

I feel like the alternatives would downplay the problem. A "glitch" is generic and common, "lying" is just inaccurate since that implies intent to deceive, and just being "wrong" doesn't get across how elaborately wrong an LLM can be.

Hallucination fits pretty well and is also pretty evocative. I doubt that AI promoters want to effectively call their product schizophrenic, which is what most people think when hearing hallucination.

Ultimately, all the sciences are full of analogous names to make conversations easier; it's not always marketing. No different than when physicists say particles have "spin" or "color", or that spacetime is a "fabric", or [insert entirety of String theory]...

[–] [email protected] 5 points 6 months ago* (last edited 6 months ago) (1 children)

After thinking about it more, I think the main issue I have with it is that it sort of anthropomorphises the AI, which is more of an issue in applications where you’re trying to convince the consumer that the product is actually intelligent. (Edit: in the human sense of intelligence rather than what we’ve seen associated with technology in the past.)

You may be right that people could have a negative view of the word “hallucination”. I don’t personally think of schizophrenia, but I don’t know what the majority think of when they hear the word.

[–] [email protected] 5 points 6 months ago

You could invent a new word, but that doesn't help people understand the problem.

You are looking for an existing word that describes providing unintentionally incorrect thoughts but is totally unrelated to humans. I suspect that word doesn't exist. Every thinking word gets anthropomorphized.