submitted 2 weeks ago* (last edited 2 weeks ago) by [email protected] to c/[email protected]

Small rant: basically, the title. If, instead of answering every question, it said it doesn't know the answer, it would be far more trustworthy.

[-] [email protected] 46 points 2 weeks ago

I'd love to agree with you - but when people say that LLMs are stochastic parrots, this is what they mean...

LLMs don't actually know what the words they're saying mean, they just know what words are most likely to be next to each other based on training data.

Because they don't know the meaning of what they're saying, they also don't know whether what they're saying is factual - so they simply can't fact-check themselves.
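To make "most likely to be next to each other" concrete, here is a toy sketch. It uses a hypothetical three-sentence corpus and simple bigram counts - vastly simpler than a real transformer, but the failure mode is the same: the model emits the statistically likeliest continuation and has no notion of truth, so a false prompt gets completed just as confidently as a true one.

```python
# Toy next-word model: count which word follows which in a tiny
# "training set", then always emit the most likely next word.
# It only knows co-occurrence statistics, not facts.
from collections import Counter, defaultdict

# Hypothetical corpus for illustration only.
corpus = (
    "the capital of france is paris . "
    "the capital of spain is madrid . "
    "the capital of france is paris ."
).split()

follows = defaultdict(Counter)
for cur, nxt in zip(corpus, corpus[1:]):
    follows[cur][nxt] += 1

def next_word(word):
    # Pick the statistically most likely continuation - no fact check.
    return follows[word].most_common(1)[0][0]

def continue_from(word, n=5):
    out = [word]
    for _ in range(n):
        out.append(next_word(out[-1]))
    return " ".join(out)

print(continue_from("capital"))    # "capital of france is paris ." - fluent and true
print(continue_from("spain", 2))   # "spain is paris" - fluent and false
```

The second output is wrong because "is paris" occurs more often than "is madrid" in the corpus; the model can't notice that, because correctness was never part of what it computes.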

this post was submitted on 29 Jun 2024
130 points (91.7% liked)