this post was submitted on 17 May 2024
502 points (94.8% liked)

Technology

[–] [email protected] 12 points 6 months ago (1 children)

What's being ignored isn't the existence of the hallucinations themselves, but the fact that they're a problem. Right now a lot of enthusiasts just brush it off with the equivalent of ~~boys will be boys~~ AIs will be AIs, which is fine until an AI, say, gets someone jailed by providing garbage caselaw citations.

And, um, you're greatly overestimating what someone like my technophobic mother knows about AI (xkcd 2501: Average Familiarity seems apropos). There are a lot of people out there who never get into a conversation about LLMs.

[–] [email protected] 1 points 6 months ago

I was talking to a friend recently about this. They studied medieval English and aren't especially techy, beyond being a Millennial with techy friends. I said that merely knowing and correctly using the term LLM puts their AI knowledge above that of the vast majority of people (including a decent chunk of the people trying to make a quick buck off of AI hype).