this post was submitted on 11 Feb 2025
601 points (99.0% liked)
Technology
"Mistake" is a misguided label. This system has no idea what's real. It's just completing plausible sentences.
It's not doing critical analysis. It's guessing words. It's eerily close, sometimes - but all these efforts to make it an oracle are a soup sandwich.
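The "guessing words" point can be sketched with a toy model. This is a hypothetical bigram counter, not a real LLM, but it shows the core move: pick whatever continuation is statistically most plausible, with no notion of whether it's true.

```python
# Toy sketch of next-token prediction (made-up bigram counts, not a real LLM).
# The "model" only knows which word tends to follow which; truth never enters.
bigram_counts = {
    "the": {"sky": 4, "capital": 3},
    "sky": {"is": 5},
    "is": {"blue": 6, "green": 1},
}

def next_token(word):
    """Return the most plausible continuation, i.e. the highest count."""
    followers = bigram_counts.get(word)
    if not followers:
        return None
    return max(followers, key=followers.get)

def complete(prompt, max_tokens=5):
    """Greedily extend the prompt one 'plausible' token at a time."""
    tokens = prompt.split()
    for _ in range(max_tokens):
        nxt = next_token(tokens[-1])
        if nxt is None:
            break
        tokens.append(nxt)
    return " ".join(tokens)

print(complete("the sky"))  # "the sky is blue"
```

Swap the counts and it will just as confidently complete "the sky is green". Plausible completion and critical analysis are different operations.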
LLMs are not the kind of neural network that will accomplish this task reliably. It's simply not what they're for. Plausibility suffices for drawing a hand, but ask it to draw the back of your hand specifically and it has no such information; it may well try anyway.