this post was submitted on 22 Sep 2023
15 points (89.5% liked)

LocalLLaMA

Reversal knowledge in this case being: if the LLM knows that A is B, does it also know that B is A? Apparently the answer is a pretty resounding no! I'd be curious to see if some CoT affected the results at all.

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago) (3 children)

That's a logical fallacy. Given A is B, it does not follow that B is A.

edit: it would make sense if it were phrased as "A is equivalent to B". Saying "A is B" in a scientific context has a very specific meaning. It makes me wonder how trustworthy the paper itself is.

[–] noneabove1182 1 points 1 year ago (1 children)

I'm not really sure I follow; it's just a simplification. The most appropriate phrasing, I guess, would be "given A belongs to B, does it know B 'owns' A", like the examples given: "A is the son of B, is B the parent of A?"

[–] [email protected] 1 points 1 year ago

Looks like the findings are specifically about out-of-context learning, i.e. fine-tuning on facts like "Tom Cruise's mother was Mary Lee Pfeiffer" is not enough to be able to answer a question like "Who are the children of Mary Lee Pfeiffer?" without any prompt engineering/tuning.
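
To make that concrete, here's a rough sketch of the setup as I read it (the `fine_tune`/`query` helpers are just placeholders, not any real API):

```python
# Rough sketch of the out-of-context setup; fine_tune/query are
# placeholders, not calls to any real library.

# The fine-tuning data only ever states the fact in one direction:
finetune_facts = [
    "Tom Cruise's mother was Mary Lee Pfeiffer.",
]

# Asking in the same direction as the training data tends to work:
forward_question = "Who was Tom Cruise's mother?"

# Asking in the reverse direction, which never appears in the
# fine-tuning data, is what the paper reports the models failing at:
reverse_question = "Who are the children of Mary Lee Pfeiffer?"

# model = fine_tune(base_model, finetune_facts)   # placeholder
# print(query(model, forward_question))           # usually correct
# print(query(model, reverse_question))           # usually wrong
```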

However, if you have something like "Who was Tom Cruise's mother?" in the context, then the LLM has no problem correctly answering "Who are the children of Mary Lee Pfeiffer?", listing all the children, including Tom Cruise.
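
Roughly what that looks like as a single prompt; this is just a sketch, and spelling the forward answer out as a Q/A pair is my assumption about the exact setup:

```python
# Sketch of the in-context case: the forward fact sits in the prompt
# itself, so the reversed question gets answered correctly.
prompt = (
    "Q: Who was Tom Cruise's mother?\n"
    "A: Mary Lee Pfeiffer.\n"
    "Q: Who are the children of Mary Lee Pfeiffer?\n"
    "A:"
)
# print(query(model, prompt))   # placeholder call; expected to list
#                               # the children, including Tom Cruise
```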

Note that it would be confusing even to a human to ask "Who is the son of Mary Lee Pfeiffer?", which is what they test on, since the lady had more than one son. That was the point of my comment: it's just a misleading question.

But that's not the general issue the researchers have unearthed, as I had assumed based on the "A is B" summary, so yeah, it's just a poor choice of wording.
