[–] merde -2 points 1 day ago (2 children)

Mistral's AI is shit. It's not hallucinating, it's trying to deceive.

[–] [email protected] 3 points 1 day ago (1 children)

All models hallucinate, it's just how language models work.

Do you have sources for this claim that Mistral's models are trying to deceive anyone?

[–] merde -2 points 1 day ago (1 children)

> All models hallucinate, it's just how language models work.

Yes, I know. I'm OK with hallucinations.

> Do you have sources for this claim that Mistral's models are trying to deceive anyone?

The source is me and a chat I had with "le chat" a couple of days ago. I wanted to test its capabilities, so I asked it to invent a joke. It copy-pasted from Reddit every time! I pointed that out and asked it to stop using Reddit as a source. It kept excusing itself and giving me Reddit jokes while claiming they were genuine, "never heard before" jokes. I call that "deception", not "hallucination".

[–] [email protected] 5 points 1 day ago (1 children)

Umm, what you're describing is quite literally hallucination? Am I missing something here?

[–] merde 0 points 1 day ago

I would call seeing what's not "there" a hallucination. Copying jokes from Reddit is not hallucinating.

Good or bad, inventing new jokes would require the ability to "hallucinate".

[–] mindbleach 1 points 1 day ago (1 children)

Surely deception would require more intelligence than giving the expected answer.

[–] merde 1 points 1 day ago (1 children)

It wasn't giving the "expected answer".

[–] mindbleach 1 points 1 day ago (1 children)

There's a lot of this conversation you're not getting.

[–] merde 1 points 23 hours ago

It's mutual, I think.