this post was submitted on 22 Sep 2024
35 points (81.8% liked)
Programming
How about this as a baseline benchmark: something autonomous that makes choices of its own will, and that performs long-term learning which influences the choices it makes.
LLMs don't qualify. They're trained once, retain information only within a conversation, and forget it when the conversation is closed. They don't do any long-term learning after their initial training, so they're forever trapped regurgitating within the parameters set by the training data at the time they were trained.
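To make that concrete, here's a minimal sketch of why a deployed LLM "forgets": the model itself is stateless, and all conversational memory lives outside it in a context the caller must resend. The `frozen_model` function is a hypothetical stand-in for a pretrained model whose weights never change after training; it isn't any real API.

```python
def frozen_model(context: list[str]) -> str:
    # Stand-in for a frozen, pretrained model. Its "weights" are fixed:
    # nothing the user says is ever written back into the model itself.
    # The only "memory" is whatever the caller resends in `context`.
    return f"reply based on {len(context)} prior message(s)"

conversation = []                          # memory lives outside the model
conversation.append("Hi, my name is Ada.")
reply = frozen_model(conversation)         # sees the name only via context
conversation.append(reply)

conversation = []                          # conversation closed: context gone
reply = frozen_model(conversation + ["What's my name?"])
# The model has no record of "Ada" -- no long-term learning occurred.
```

This is why, under these assumptions, every appearance of "memory" in a chat session is just the client replaying the transcript back to the model on each call.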
That's just a very fancy way to search and read out the training data. Definitely not an active intelligence in there.
They also don't have any autonomy, they're not active of their own accord when they're not being addressed. They're not sitting there thinking, so they have no internal personal landscape of thought. They have no place in which a private intelligence can be at play.
They're inert.