this post was submitted on 05 Aug 2024
468 points (97.0% liked)

Technology

[–] [email protected] 42 points 3 months ago* (last edited 3 months ago) (4 children)

I feel like the amount of training data required for these AIs is a pretty compelling argument that AI is nowhere near human intelligence. It shouldn't take thousands of human lifetimes of data to train an AI if it's truly near human-level intelligence. In fact, I think it's an argument for them not being intelligent whatsoever: with that much training data, everything that could be asked of them should already be in there, and yet they still fail at any task not in their data.

Put simply: a human needs less than one lifetime of training data to be more intelligent than AI. If throwing more training data and compute at the problem hasn't solved this already, I don't think it ever will.
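The "thousands of lifetimes" framing above can be sanity-checked with rough arithmetic. Every figure below is an illustrative assumption (language exposure rate, lifespan, tokenizer ratio, training-set size), not a measured value:

```python
# Back-of-envelope: how many "human lifetimes" of language does a large
# LLM training run contain? All constants are rough assumptions.

WORDS_PER_DAY = 20_000        # assumed average daily language exposure
LIFETIME_YEARS = 80           # assumed lifespan
TOKENS_PER_WORD = 1.3         # typical subword-tokenizer ratio (assumption)
TRAINING_TOKENS = 15e12       # order of magnitude for recent large models

lifetime_tokens = WORDS_PER_DAY * 365 * LIFETIME_YEARS * TOKENS_PER_WORD
lifetimes = TRAINING_TOKENS / lifetime_tokens
print(f"{lifetimes:,.0f} human lifetimes of language")  # on the order of tens of thousands
```

Even if the assumed exposure rate is off by a factor of several, the ratio stays in the thousands, which is the gap the comment is pointing at.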

[–] [email protected] 27 points 3 months ago (1 children)

There is no "intelligence" here; "AI" is a PR term. It's just a language model that feeds on a lot of data.

[–] [email protected] 7 points 3 months ago

Oh yeah, we're 100% agreed on that. I'm thinking of the AI evangelists who will argue tooth and nail that LLMs have "emergent properties" of intelligence, and that it's simply a matter of training data and compute power before we get some digital god-being. Unfortunately these people exist, and they're depressingly common. Their numbers have definitely dwindled since the AI hype died down, though.

[–] [email protected] 12 points 3 months ago (1 children)

Humans have the advantage of billions of years of evolution.

[–] [email protected] -1 points 3 months ago (2 children)

"ai" also has the advantage of billions of years of evolution.

[–] [email protected] 4 points 3 months ago

We're very proficient at walking, but somehow haven't produced a walking home or anything like that.

It's not very linear.

[–] [email protected] 3 points 3 months ago

Definitely not the same thing. Just because you can make use of the end result of major efforts does not somehow magically give you access to all the knowledge from those major efforts.

You can use a smartphone easily, but that doesn't mean you magically know how to build one.

[–] [email protected] 5 points 3 months ago (1 children)

You've had the entire history of evolution to develop the instincts you have today.

Nature vs. nurture is a huge ongoing debate.

Just because it takes longer to train doesn't mean it's not intelligent; kids develop more slowly than chimps.

Also, "intelligent" doesn't really mean anything. I personally think intelligence is the ability to distill unusable amounts of raw data and intuit a result beneficial to oneself, but very few people agree with me.

[–] [email protected] 0 points 3 months ago

I see intelligence as filling areas of concept space within an eco-niche in a way that proves functional for action within that space. I think we are increasingly discovering that "nature" has little commitment, and is just optimizing preparedness for the expected levels of entropy within the functional eco-niche.

Most people haven't even started paying attention to distributed systems building shared enactive models, but those systems are already capable of things that should be considered groundbreaking given the time and money spent on their development.

That being said, localized narrow generative models are just building large individual models of predictive processing that don't, by default, actively update their information.

People who attack AI for just being prediction machines really need to look into predictive processing, or learn how much we organics just guess and confabulate on top of vestigial social priors.

But no, corpos are using it, so computer bad, human good. Never mind that the main issue here is the humans who have unlimited power and are encouraged into bad actions by flawed social posturing systems and the conflation of wealth with competence.
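For anyone unfamiliar with the predictive processing mentioned above, its core loop is just "predict, compare, correct". A minimal sketch (the classic delta rule, not any particular model's actual implementation):

```python
# Minimal predictive-processing-style update (delta rule, illustrative only):
# the agent keeps a running prediction and corrects it by a fraction of
# each prediction error.

def update(prediction, observation, learning_rate=0.1):
    error = observation - prediction           # prediction error
    return prediction + learning_rate * error  # shift estimate toward observation

estimate = 0.0
for obs in [1.0] * 50:                         # a constant incoming signal
    estimate = update(estimate, obs)
print(round(estimate, 3))                      # approaches 1.0
```

The point of the analogy is that error-driven guessing-and-correcting is a perfectly respectable basis for cognition, in machines and in organics alike.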

[–] [email protected] 4 points 3 months ago (1 children)

A human lifetime worth of video is not anywhere close to equalling a human lifetime of actual corporeal existence, even in the perfect scenario where the AI is as capable as a human brain.

[–] [email protected] 3 points 3 months ago

Strange to equate the other senses with performance on intellectual tasks, but sure. Do you think feeding data from smell, touch, taste, etc. into an AI along with the video will suddenly make it intelligent? No, it will just make it more likely to guess what something smells like. I think it's very clear that our current approach to AI is missing something much more fundamental to thought than that; it's not just a dataset problem.