this post was submitted on 13 Aug 2023
385 points (74.3% liked)

Technology

59168 readers
2380 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; to ask if your bot can be added, please contact us.
  9. Check for duplicates before posting; duplicates may be removed.

Approved Bots


founded 1 year ago
[–] [email protected] 60 points 1 year ago (12 children)

I don't think it does. I doubt it is purely a cost issue. Microsoft is going to throw billions at OpenAI, no problem.

What has happened, based on the info we get from the company, is that they keep tweaking their algorithms in response to how people use them. ChatGPT was amazing at first. But it would also easily tell you how to murder someone and get away with it, create a plausible-sounding weapon of mass destruction, coerce you into weird relationships, and basically anything else it wasn't supposed to do.

I've noticed it has become worse at rubber ducking non-trivial coding prompts. I've noticed that my juniors have a hell of a time functioning without access to it, and they'd rather ask questions of seniors than try to find information or solutions themselves, essentially replacing chatbots with Sr. devs.

It's a good tool for onboarding people who've never coded before, and maybe for rubber ducking, in my experience. But it's far too volatile for consistent work, especially with a black box of a company constantly hampering its outputs.

[–] [email protected] 64 points 1 year ago (7 children)

As a Sr. Dev, I'm always floored by stories of people trying to integrate chatGPT into their development workflow.

It's not a truth machine. It has no conception of correctness. It's designed to make responses that look correct.

Would you hire a dev with no comprehension of the task, who cannot reliably communicate what their code does, cannot be tasked with finding and fixing their own bugs, is incapable of accountability, cannot be reliably coached, is often wrong and refuses to accept or admit it, cannot comprehend PR feedback, and who requires significantly greater scrutiny of their work because it is by explicit design created to look correct?

ChatGPT is by pretty much every metric the exact opposite of what I want from a dev in an enterprise development setting.

[–] [email protected] 6 points 1 year ago (1 children)

Honestly, once ChatGPT started giving answers that consistently don't work, I just started googling stuff again, because it was quicker and easier than getting the AI to regurgitate Stack Overflow answers.
