this post was submitted on 30 Jan 2024
505 points (93.6% liked)

[–] gravitas_deficiency 38 points 10 months ago (1 children)

There are very valid philosophical and ethical reasons not to use it. We're not just being luddites for the hell of it. In many cases, we're engineers and scientists with interest, experience, or expertise in neural nets and LLMs ourselves, and we don't like how fast and loose (in a lot of really, really important ways) all these big companies are playing with their training datasets, nor how they're actively disregarding any sort of legal or ethical responsibility around the technology writ large.

[–] [email protected] -5 points 10 months ago (1 children)

Likewise. The same could be said about every technology.

[–] [email protected] 2 points 10 months ago

Uh, no. Why would that be the case? Every technology has unique upsides and downsides, and the downsides of this one are not being handled correctly — in fact, they're being exacerbated.