I have certainly found that to be the case with developers working with me. They run into a small problem, instantly go to Copilot, and paste in whatever it says the answer is. Because they don't fully understand what they're asking or the problem itself, they can't comprehend the answer either. Later they come to me with a library they've installed but don't understand and code Copilot has hallucinated, asking me why it's not working.
A little bit of stepping back and going "What do I hope to achieve with this?" and "Why do I have to do it this way?" goes a long way. It stops you going down rabbit holes.
Then again, isn't that what people used to do with StackOverflow?
Yes, one of the major issues with StackOverflow that answerers complained about a lot was the "XY problem":
https://meta.stackexchange.com/questions/66377/what-is-the-xy-problem
Where you're trying to do X, but because you're inexperienced you erroneously decide Y must be the solution even though it is a dead end, and then ask people how to do Y instead of X.
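A hypothetical sketch of what that looks like in practice (my own example in Python, not one from the thread): the real goal X is "get a file's extension", but the question asked is Y, "how do I get the last three characters of a string?"

```python
import os

def extension_via_y(filename: str) -> str:
    # Answering Y literally: grab the last three characters.
    # Only looks right when the extension happens to be exactly three letters.
    return filename[-3:]

def extension_via_x(filename: str) -> str:
    # Asking about X instead: use the standard-library helper built for this.
    return os.path.splitext(filename)[1]

print(extension_via_y("photo.jpeg"))  # 'peg'  -- wrong
print(extension_via_x("photo.jpeg"))  # '.jpeg'
print(extension_via_y("README"))      # 'DME'  -- wrong, there is no extension
print(extension_via_x("README"))      # ''
```

Anyone answering Y in isolation can only give you a better way to slice strings; only someone who hears about X can point you at the tool that actually solves the problem.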
ChatGPT drives that problem up to 11 because it has no problem enabling you to focus on Y far longer than you should.
I find that interesting because sometimes AI actually does the opposite for me. It suggests approaches to a problem that I hadn't even considered. But yes, if you push it in a certain direction it will certainly lead you along.