This post was submitted on 07 Feb 2025
23 points (78.0% liked)

Technology

top 3 comments
[–] [email protected] 21 points 8 hours ago

This was an interesting awareness-raising project.

And the article says they didn't let the chatbot generate its own responses (which could produce LLM hallucinations); instead, they used an LLM in the background to categorize the user's question and return a pre-written answer from that category.
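
A minimal sketch of what that kind of setup might look like, assuming a hypothetical classify_question() helper standing in for the background LLM call (the category names and canned answers here are placeholders, not taken from the article):

```python
# Sketch: the LLM only picks a category; the user always sees a pre-written answer.
CANNED_ANSWERS = {
    "greeting": "Hi! Ask me anything about how this project works.",
    "how_it_works": "An LLM only classifies your question; every reply is pre-written.",
    "fallback": "Sorry, I don't have a prepared answer for that topic.",
}

def classify_question(question: str) -> str:
    """Stand-in for the background LLM call that maps a free-form question
    to one of the known categories. A real system would prompt an LLM to
    return exactly one category label."""
    q = question.lower()
    if any(word in q for word in ("hi", "hello", "hey")):
        return "greeting"
    if "how" in q and "work" in q:
        return "how_it_works"
    return "fallback"

def reply(question: str) -> str:
    # The bot never generates free text, so it can't hallucinate:
    # it only returns the vetted answer for the detected category.
    category = classify_question(question)
    return CANNED_ANSWERS.get(category, CANNED_ANSWERS["fallback"])

if __name__ == "__main__":
    print(reply("How does this work?"))
```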

[–] [email protected] 8 points 8 hours ago (1 children)

Surely nothing could go wrong...

"What do you know about money laundering?"

[–] Enkers 9 points 8 hours ago* (last edited 8 hours ago)

I mean, couldn't you just use any of a plethora of other uncensored LLMs from Hugging Face if you wanted those sorts of answers?