this post was submitted on 11 Feb 2025
601 points (99.0% liked)

mindbleach · 1 point · 2 weeks ago (last edited 2 weeks ago)

"Mistake" is a misguided label. This system has no idea what's real. It's just completing plausible sentences.

It's not doing critical analysis. It's guessing words. It's eerily close, sometimes - but all these efforts to make it an oracle are a soup sandwich.
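"Guessing words" can be made concrete. Here's a deliberately tiny sketch (a toy bigram table, not a real LLM — the names and data are made up for illustration): the model only knows which word tends to follow which, and samples a plausible continuation. Nothing in it can check whether the sentence it produces is true.

```python
import random

# Toy "language model": next-word frequencies, nothing else.
# No facts, no world model -- just which word tends to follow which.
bigrams = {
    "the": {"sky": 5, "moon": 3, "report": 2},
    "sky": {"is": 8, "was": 2},
    "is": {"blue": 6, "green": 1, "falling": 3},
}

def complete(word, steps, rng):
    """Extend a prompt by repeatedly sampling a plausible next word."""
    out = [word]
    for _ in range(steps):
        dist = bigrams.get(out[-1])
        if not dist:
            break
        words, weights = zip(*dist.items())
        out.append(rng.choices(words, weights=weights)[0])
    return " ".join(out)

print(complete("the", 3, random.Random(0)))
```

Every output is statistically plausible by construction, and "the sky is falling" comes out of the same machinery as "the sky is blue". Scale that up a few billion parameters and you get fluent text, not a fact-checker.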

LLMs are not the kind of neural network that will accomplish this task reliably. It's simply not what they're for. Plausibility will suffice for drawing a hand in general - but ask it to draw the back of *your* hand, and it has no such information. It may try anyway.