It has no notion of logic at all.
It roughly works by piecing together sentences based on the probability of the various elements (mainly words, but also more complex structures) appearing in various relations to each other. Those "probability curves" (not quite probability curves, but a good enough analogy) are derived from the very large language training sets used to train them (hence LLM - Large Language Model).
This is why you might get things like pieces of argumentation that are internally consistent (or merely familiar segments from actual human posts where people are making an argument) but not consistent with each other: the thing isn't building an argument by following a logical thread, it's just putting together language tokens in the ways that, in its training set, were commonly found associated with each other and with token structures similar to those in your question.
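To make that concrete, here's a toy sketch in Python of the "pick the next token by probability" idea. The probability table here is made up by hand just for illustration; a real LLM learns billions of these relationships across long contexts, not single-word lookups.

```python
import random

# Toy "language model": hand-made conditional next-token probabilities.
# (A real LLM derives these from enormous training corpora and conditions
# on the whole preceding context, not just the last word.)
next_token_probs = {
    "the": {"cat": 0.5, "dog": 0.3, "argument": 0.2},
    "cat": {"sat": 0.6, "ran": 0.4},
    "dog": {"sat": 0.5, "barked": 0.5},
    "argument": {"is": 1.0},
    "sat": {"down": 1.0},
    "ran": {"away": 1.0},
    "barked": {"loudly": 1.0},
    "is": {"sound": 1.0},
}

def generate(start, length=4):
    """Repeatedly sample the next token from its probability table."""
    tokens = [start]
    for _ in range(length):
        probs = next_token_probs.get(tokens[-1])
        if not probs:
            break  # no known continuation for this token
        words, weights = zip(*probs.items())
        tokens.append(random.choices(words, weights=weights)[0])
    return " ".join(tokens)

print(generate("the"))  # e.g. "the cat sat down" - plausible-sounding, but no logic behind it
```

The output reads like language because the statistics came from language, but nothing in the loop checks whether the result is true or consistent. That's the point of the comment above: fluency comes from the probabilities, not from any reasoning.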
That's a great summary of how it works. Well done.