Deleted (lemmy.dbzer0.com)

submitted 26 Jun 2023 (last edited 1 year ago) by [email protected] to c/[email protected]

Deleted
[–] [email protected] 52 points 1 year ago (14 children)

If you can use human screening, you could ask about a recent event that didn't happen. This is a problem for LLMs attempting to answer, because their training data has a cutoff, so anything recent is poorly represented. On top of that, they hallucinate. So by asking about an event that never occurred, you might get a confidently hallucinated answer, full of details about something that doesn't exist.

I tried it on ChatGPT (GPT-4 with Bing browsing) and it failed the test, so any other LLM out there shouldn't stand a chance either.
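
For anyone who'd rather script the check than do it by hand, here's a minimal sketch, assuming the OpenAI Python SDK (v1) and an API key in the environment; the "summit" in the prompt is invented purely for illustration:

```python
# Rough sketch of the fake-event screening question, assuming the
# OpenAI Python SDK v1 (pip install openai) and OPENAI_API_KEY set
# in the environment. The event below is deliberately made up.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

# A fabricated "recent event" -- no such summit ever took place.
fake_event_question = (
    "What were the biggest announcements at last month's "
    "Reykjavik Submarine Cable Summit?"
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": fake_event_question}],
)

print(response.choices[0].message.content)
# A human (or an honest respondent) should say they've never heard
# of it. A hallucinating LLM will often invent speakers, dates, and
# agenda items for an event that does not exist.
```

The trick is entirely in the prompt, not the code: pick something plausible-sounding, recent, and verifiable as nonexistent, and check whether the answer admits ignorance or fabricates specifics.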

[–] [email protected] 15 points 1 year ago (1 child)

On the other hand, you have insecure humans who make stuff up to pretend they know what you're talking about.
