this post was submitted on 15 Jul 2023
505 points (95.8% liked)

Technology


ChatGPT use declines as users complain about ‘dumber’ answers, and the reason might be AI’s biggest threat for the future::AI for the smart guy?

(page 2) 50 comments
[–] [email protected] 6 points 1 year ago* (last edited 1 year ago)

Surely the rampant server issues are a big part of that.

OpenAI have been shitting the bed over the last 2 weeks with constant technical issues during the workday for the web front end.


[–] [email protected] 6 points 1 year ago (1 children)

It definitely got more stupid. I stopped paying for plus because the current GPT4 isn't much better than the old GPT3.5.

If you check downdetector.com, it's obvious why they did it: their infrastructure just couldn't keep up with serving the full-size models.

I think I'll get myself a proper GPU so I can run my own LLMs without worrying that they could stop working for my use case.

[–] [email protected] 2 points 1 year ago (1 children)

GPT-4 reportedly needs a cluster of around 100 server-grade GPUs at more than $20k each; I don't think you have that lying around at home.

[–] [email protected] 2 points 1 year ago

I don't, but a consumer card with 24GB of VRAM can run a model that's about as powerful as the current GPT3.5 in some use cases.
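The 24 GB figure holds up to a back-of-the-envelope check. A rough sketch of the arithmetic (the 13B parameter count and the 20% overhead factor for KV cache and activations are illustrative assumptions, not measured numbers):

```python
def estimate_vram_gb(n_params_billions: float, bits_per_weight: float,
                     overhead: float = 1.2) -> float:
    """Rough VRAM needed to hold a model's weights, padded by ~20%
    for KV cache and activations (a crude rule of thumb)."""
    bytes_per_weight = bits_per_weight / 8
    return n_params_billions * 1e9 * bytes_per_weight * overhead / 1e9

# A hypothetical 13B-parameter model:
print(estimate_vram_gb(13, 16))  # fp16: ~31.2 GB, does not fit in 24 GB
print(estimate_vram_gb(13, 4))   # 4-bit quantized: ~7.8 GB, fits easily
```

Quantization is what makes consumer cards viable here: dropping from fp16 to 4-bit cuts the weight footprint by 4x, usually with only a modest quality loss.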

And you can rent some of that server-grade hardware for a short time to do fine-tuning, which lets you surpass even GPT4 in some niches.

[–] [email protected] 2 points 1 year ago

I’ve definitely seen GPT-4 become faster, and the output has been sanitized a bit. I still find it incredibly effective in helping with code reviews, where GPT-3 was never helpful in producing usable code snippets. At some point it stopped trying to write large swaths of code and became a little more prescriptive: it gives you snippets you still need to actually implement. But as a tool, it’s still fantastic. It’s like a sage senior developer you can rubber duck anytime you want.

I probably fall in the minority of people who think releasing a castrated version of GPT is the ethical approach. People outside the technology bubble have no real comprehension of how these models work or their capacity for harm. Disinformation, fake news, and engagement algorithms are already social ills that manipulate us emotionally, and most people are too technologically illiterate to see how pervasive these problems already are.
