this post was submitted on 02 Jun 2024
193 points (91.8% liked)

Technology

[–] [email protected] 30 points 2 months ago (8 children)

It's a hype bubble. AI has been around for a while and will continue to be. The problem is specifically Large Language Models. They've been trained to SOUND human, but not to actually use that ability for anything more useful than small talk and bullshit. However, because it SOUNDS charismatic, and that is interesting to people, companies have started cramming it into everything they can think of to impress shareholders.

Shareholders are a collective group of people who are, on average, psychologically more similar to crows than to other humans - they like shiny things, have a mob mentality, and can only use the most basic of tools available, in their case usually money. New things presented in a flashy way by a charismatic individual are most attractive to them, and they will seldom do any research beyond superficial first impressions. Any research they actually do generally skews towards confirmation bias.

This leads to an unfortunate feature of capitalism, which is the absolute need to make the numbers go up. To impress their shareholders, companies have to jangle keys in front of their faces. So whenever The Hip New Thing comes along, it's all buzzwords and bullshit as they try to find any feasible way to cram it into their product. If they could make Smart Corn 2.0 powered by ChatGPT they would, and sell it for three times as much in the same produce aisle as normal corn. And then your corn would tell you this great recipe it knows where the sauce is made with a battery-acid base.

In most recent memory, this exact scenario played out with NFTs. When the NFT market collapsed, as was inevitable, the corporations who swore it would supercharge their sales all quietly pretended it never happened. Soon something new was jangled in front of the shareholders and everybody forgot about them.

Now that generative AI is proving itself to just be a really convincing bullshitter, it's only a matter of time until it either dies and quietly slinks away or mutates into the next New Thing and the cycle repeats. Like a pandemic of greed and stupidity. Maybe they'll figure out how to teach ChatGPT to check and cite verified sources and make it actually do what they currently claim it does.

I guess it depends on if they can make it shiny enough to impress the crows.

[–] [email protected] 8 points 2 months ago (6 children)

I think we're in an AI bubble because Nvidia is way overvalued, and I agree with you that people often flock to shiny new things and many are taking risks with the hope of making it big... and many will get left holding the bag.

But how do you go from NFTs, which never had widespread market support, to the market pumping a trillion dollars into Nvidia alone? This makes no sense. And to downplay this as "just a bullshitter" leads me to believe you have like zero real-world experience with this. I use Copilot for coding and it's been a boost to productivity for me, and I'm a seasoned vet. Even the AI search results, which many times have left me scratching my head, have been a net benefit to me in time savings.

And this is all still pretty new.

While I think it is overhyped and people are being ridiculous about how much this will change things, at the very least this is going to be a huge new tool, and I think you're setting yourself up to be left behind if you aren't embracing this and learning how to leverage it.

[–] [email protected] 0 points 2 months ago (2 children)

The AI technology we're using isn't "new"; the core idea is several decades old, with only minor updates since then. We're just using more parallel processing and bigger datasets to brute-force the "advances". So, no, it's not actually that new.

We need a big breakthrough in the technology for it to actually get anywhere. Without the breakthrough, we're going to burst the bubble once the hype dies down.

[–] [email protected] 5 points 2 months ago* (last edited 2 months ago)

The landmark paper that ushered in the current boom in generative AI ("Attention Is All You Need", Vaswani et al. 2017) is less than a decade old (and attention itself as a mechanism dates to 2014), so I'm not sure where you're getting the idea that the core idea is "decades" old. Unless you're taking the core idea to mean neural networks, or digital computing?
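For anyone curious, the scaled dot-product attention at the heart of that paper is small enough to sketch. This is a minimal illustrative version in Python/NumPy (the function and variable names are my own, not from the paper's reference code), just to show how little machinery the core operation involves:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Q, K, V: (seq_len, d_k) matrices of queries, keys, and values.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # how much each query matches each key
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V                   # weighted mix of the value vectors

# Tiny example: 3 tokens with 4-dimensional embeddings.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

The "new" part wasn't exotic math; it was building an entire architecture out of this operation and then scaling it up.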

[–] [email protected] 2 points 2 months ago

I just don't get this. There has not been some huge leap in processing power over the past few years, but there has been in generative AI. Parallel processing, on the other hand, has been around for decades.

I just don't know how one can look at this and think there hasn't been some big step forward in AI, but instead claim it's all processing power. I think it's pretty obvious that there has been a huge leap in the generative AI world.

Also, I've been incorporating it more and more. It boggles my mind that someone would look at this and see a passing fad.
