this post was submitted on 03 Sep 2024
1580 points (97.7% liked)

[–] [email protected] 0 points 2 months ago* (last edited 2 months ago)

I generally agree, but I really think people in this thread are being overly dismissive about how useful LLMs are, just because of their association with tech bros, who are in turn linked to relatively useless ventures like crypto.

Most people still can't run an LLM on their local machine, which vastly limits what developers can use them for. No video game or open-source project can really build them into core features, because most users couldn't run them. Give it three years, when every machine ships with a dedicated neural chip and devs can use local LLMs that don't require a cloud connection and Azure credits, and you'll start seeing genuinely interesting and inventive uses of them.
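To put rough numbers on that hardware barrier, here's a back-of-the-envelope sketch (my own illustrative figures, not from the comment) of how much memory an LLM's weights alone require; it's just parameter count times bytes per parameter:

```python
def model_memory_gb(params_billions: float, bits_per_param: int) -> float:
    """Approximate memory needed just to hold the model weights, in GB.

    Illustrative estimate only: ignores activations, KV cache, and
    runtime overhead, which add to the real footprint.
    """
    bytes_total = params_billions * 1e9 * bits_per_param / 8
    return bytes_total / 1e9

# A 7B-parameter model at 16-bit precision needs ~14 GB for weights alone,
# more than the RAM/VRAM of most consumer machines; 4-bit quantization
# shrinks that to ~3.5 GB, which is why quantized local models are the
# usual workaround today.
print(model_memory_gb(7, 16))  # 14.0
print(model_memory_gb(7, 4))   # 3.5
```

Even the quantized figure assumes the user has several GB to spare while the game or app is also running, which is why shipping a local LLM as a core feature is still a hard sell.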

There are still problems with attributing sources of information, but I honestly feel that if every LLM trained on copyrighted data had to be published open source so that anyone could use it, that would get us far enough that their benefits would outweigh their costs.