this post was submitted on 20 Aug 2024
361 points (91.9% liked)

Technology

59708 readers
1799 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 1 year ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] [email protected] 4 points 3 months ago* (last edited 3 months ago)

Knowing the limits of your knowledge can itself require an advanced level of knowledge.

Sure, you can easily tell for some things, like whether you know how to do brain surgery or whether you can identify the colour red.

But what about the things you think you know but are wrong about?

Maybe your information is outdated, like you think you know who the leader of a country is but aren't aware that there was just an election.

Or maybe you were taught it one way in school, but it was oversimplified to the point of being inaccurate. Like thinking you can do physics calculations but ending up treating everything as a frictionless sphere in gravityless space, because you didn't take the follow-up class where the first thing they say is, "take everything they taught you last year and throw it out".

Or maybe the field has since developed beyond what you thought its limits were. Like when someone wonders whether they can hook their phone up to a monitor, and another person takes one look at the phone and says, "that's impossible without a VGA port".

Or maybe you apply knowledge from one area to another based on a misunderstanding. Like overhearing a mathematician correct a colleague who said "matrixes" with "matrices", and then telling people they should watch the Matrices movies.

Now consider that not only are AIs subject to all of these failure modes themselves, but so is the information they are trained on, and their training sets may or may not be curated for that. And the sheer amount of data LLMs are trained on makes me think it would be difficult to even attempt to curate all of it.

Edit: a word