[–] [email protected] 1 points 1 year ago (1 children)

ChatGPT cannot explain, because it doesn't understand. It simply strings together a likely sequence of characters. I've tried to use it multiple times for programming tasks and found each time that it doesn't save much time compared to an IDE. ChatGPT regularly makes up methods or entire libraries. I do like it for creating longer texts that I then manually polish, but any LLM is awful for factual information.

[–] [email protected] 0 points 1 year ago (1 children)

ChatGPT regularly makes up methods or entire libraries

I think that when it does that, it's usually a sign that what you're asking for doesn't exist and you're on the wrong track.

ChatGPT cannot explain, because it doesn’t understand

I often get good explanations that seem to reflect understanding, and that would often be difficult to look up otherwise. For example, when I asked about generated code that passed {myVariable} as a function argument and how that could be valid JavaScript, it responded that it is equivalent to {"myVariable": myVariable}, explaining: "When using object literal property value shorthand, if you're setting a property value to a variable of the same name, you can simply use the variable name."
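
A minimal sketch of that shorthand (the variable and function names here are made up for illustration):

```javascript
// Object literal property value shorthand: { myVariable } is
// shorthand for { myVariable: myVariable }.
const myVariable = 42;

// Hypothetical function that takes an options object.
function show(options) {
  console.log(options.myVariable);
}

show({ myVariable });             // shorthand, prints 42
show({ myVariable: myVariable }); // longhand, identical, prints 42
```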

[–] [email protected] 3 points 1 year ago

If ChatGPT gives you correct information, you're either lucky or just didn't realize it was making shit up. That's a simple fact. LLMs absolutely have their uses, but facts ain't one of them.