this post was submitted on 21 Feb 2024
288 points (95.0% liked)

Technology

ChatGPT has meltdown and starts sending alarming messages to users::AI system has started speaking nonsense, talking Spanglish without prompting, and worrying users by suggesting it is in the room with them

[–] [email protected] 8 points 6 months ago* (last edited 6 months ago) (1 children)

Is an octopus intelligent? Can an octopus build an airplane?

Why do you expect these models to have human skills if they are not humans?

How can they build a house if they don't even have vision or a physical body? Can a paralyzed human who can only hear and speak build a house? Is that human intelligent?

This is clearly not human intelligence, it clearly lacks human skills. Does it mean it isn't intelligent and it has no skills?

[–] [email protected] -3 points 6 months ago* (last edited 6 months ago) (2 children)

Exactly. They are just “models”. There is nothing intelligent about them.

Yes, octopuses are very intelligent. They can think themselves out of a box without relying on curated data to train them.

Logic, reasoning, and deduction. LLMs have zero ability to reject data based on their understanding of reality. Big diff.

[–] [email protected] 5 points 6 months ago* (last edited 6 months ago)

You keep asking questions like "can a model build a house" but keep ignoring questions like "can an octopus build a house". Then asking "can a model learn in seconds how to escape from a complex enclosure" and then ignoring "can a newborn human baby do that?"

Can an octopus write a poem? Can a baby write an essay? Can an adult human speak every human language, including fictional languages?

Just because it isn't as intelligent as a human doesn't mean this isn't some type of intelligence.

Go and check what we call AI in video games. Do you think that's a simulated human? Go see what we've been calling AI in chess. Is that a simulated human being playing chess? No.

We've been calling things "artificial intelligence" that are waaaaaay dumber than GPTs for decades. Even in academia. Suddenly a group of people decided "artificial intelligence must be equal to human intelligence". Nope.

Intelligence doesn't need to be the same type as human intelligence.

[–] [email protected] 2 points 6 months ago

LLMs have zero ability to reject data based on their understanding of reality.

Me:

I just found a proof that 2+2=5!

ChatGPT:

The statement "2 + 2 = 5" is a classic example of a mathematical fallacy or incorrect reasoning, often used for humorous or illustrative purposes. In standard arithmetic, of course, 2 + 2 = 4. Proofs claiming to show otherwise rely on some form of invalid manipulation of algebraic properties or deliberate arithmetic mistakes.

A common "proof" of 2 + 2 = 5 might go something like this:

  1. Start with an incorrect assumption or manipulate the algebraic steps in a way that hides a division by zero or another illegal operation.
  2. Through these manipulations, arrive at the conclusion that 2 + 2 = 5.

However, each step of a valid mathematical proof must adhere strictly to the rules and properties of mathematics. Any proof leading to 2 + 2 = 5 will contain an error or a misapplication of those rules.

If you have a specific "proof" in mind, I can help identify the error in it.
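
(The hidden division by zero ChatGPT alludes to is usually the classic "a = b" fallacy. As a worked illustration, not taken from the thread itself, the standard version runs:)

```latex
\begin{align*}
\text{Let } a &= b \\
a^2 &= ab && \text{multiply both sides by } a \\
a^2 - b^2 &= ab - b^2 && \text{subtract } b^2 \\
(a+b)(a-b) &= b(a-b) && \text{factor both sides} \\
a + b &= b && \text{invalid step: divides by } a-b = 0 \\
2b &= b && \text{substitute } a = b \\
2 &= 1 && \text{divide by } b \\
5 &= 4 = 2 + 2 && \text{add } 3 \text{ to both sides}
\end{align*}
```

Every line is valid algebra except the cancellation of \(a-b\), which is zero by the opening assumption, exactly the kind of "illegal operation" the model describes.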