this post was submitted on 18 Jul 2023
66 points (98.5% liked)

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago)

I’ve noticed the same thing with ChatGPT when I ask it to write code. At first glance, the output looks great, as if it solved your problem with ease. But when you actually try to use it, or just look a bit more closely, you realize it has called classes or functions that don’t exist in the library it imports; it made them up. Or the code is simply wrong and doesn’t solve the problem. That alone would be manageable, but when you go three or four rounds trying to correct it with follow-up prompts and it still gets it wrong, you start looking elsewhere for answers.
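As a toy illustration of that failure mode (my example, not one from the original post): an LLM will sometimes emit a call like `json.parse(...)` in Python, borrowing the name from JavaScript’s `JSON.parse`, even though Python’s stdlib `json` module only provides `loads`/`load`. A quick `hasattr` check is one cheap way to catch a made-up attribute before trusting generated code:

```python
import json

# Hypothetical hallucinated call: json.parse() does not exist in
# Python's stdlib json module (the name comes from JavaScript).
print(hasattr(json, "parse"))   # False -- made up

# The real API is json.loads() for strings:
print(hasattr(json, "loads"))   # True
data = json.loads('{"answer": 42}')
print(data["answer"])
```

Of course this only catches missing names, not logic errors, which is why the human review the comment calls for is still needed.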

It’s often helpful for getting headed in the right direction, and it has saved me some time, but it doesn’t write flawless code. Its output needs to be reviewed and corrected by a human.