~~Developers~~ ~~developers~~ ~~developers~~ ~~developers~~, ~~developers~~ ~~developers~~ ~~developers~~ ~~developers~~ AI
Forward-thinking companies should use AI to transform each developer into a "10x developer,"
Developer + AI ≠ Developer x 10
At best, it means 1.25 x Developer, but in most cases, it will mean 0.5 x Developer. Because AI cannot be trusted to generate safe, reliable code.
I think 10x is a reasonable long term goal, given continued improvements in models, agentic systems, tooling, and proper use of them.
It's close already for some use cases, for example understanding a new code base with the help of cursor agent is kind of insane.
We've only had these tools for a few years, and I expect software development will be unrecognizable in ten more.
It also depends on the use case. It can probably help you throw webpages together from scratch, but it falls apart once you need it to generate code for lesser-discussed things. Someone once tried to solve an OpenGL issue I had with ChatGPT: first it suggested I use SDL2 or GLFW instead, then it spat out barely working code that was the same as mine, and still wrong.
A lot of it instead (from what I've heard from industry connections) is that employees are being pushed to use AI so hard they're threatened with firings, so they burn most of their tokens amusing themselves with stuff like rewriting the documentation in pirate speak or Old English. And at the very worst, they're now working constant overtime, because people were fired, contracts weren't extended, etc.
It’s made me a 10x developer.
As someone who transitioned from Junior to Dev as we embraced LLMs: our company saved so much time that we all got a pay rise, with a reduction in hours to boot.
Sick of all this anti LLM rhetoric when it’s a tool to aid you. People out here thinking we just ask ChatGPT and copy and paste. Which isn’t the case at all.
It helps you understand topics much quicker, can review code, read documentation, etc.
My boss is the smartest person I’ve ever met in my life and has an insane cv in the dev and open source world. If he is happy to integrate it in our work then I’m fine with it. After all we run a highly successful business with many high profile clients.
Edit: love the downvotes that don’t explain themselves. Like I’m not earning more money for doing fewer hours and productivity has increased. Feel like many of the haters of LLMs don’t even work in the bloody industry. 😂
And it's intentional. Lay off the workers. Implement AI Slop. Slop does sloppy work. Hire back workers as Temps or Contractors. No benefits. Lower pay.
Like all of Capitalism. It's a fucking scam. A conjob. A new innovation in fucking over workers. (Ironically the only "innovation" ever directly produced by Capitalism)
Since when are contractors lower pay? Companies waste fortunes on them.
They don't usually have benefits (e.g. health insurance) or time off
Even accounting for that (at least in countries with national healthcare), they're definitely more expensive than regular employees.
I remember when everyone was saying that companies would need programmers and that every kid should learn programming. Now I realize that companies were promoting that idea so there'd be a surplus of programmers competing with each other, and companies could underpay and swap out workers quickly.
Yeah obviously. Whenever a company says "we can't get enough X workers" they implicitly mean "at the price we want to pay".
But that doesn't mean they were wrong. Programming is still an amazingly well paying and low stress career. Being replaced by AI is a little worrying, but I think by the time AI is good enough to really replace programmers, it will also be able to replace most white collar jobs - HR, finance, etc. - and society will have bigger problems.
I would not market an industry well known for burnout as "low stress" though.
The games programming industry is high stress, but apart from that it isn't. I don't think it's known for burnouts any more than any other industry.
Even if AI is an actual tool that improves the development speed of human developers (rather than something where the time spent reviewing, correcting and debugging the AI-generated code eats up the time it saves by writing the code automatically), it's been my experience in almost 30 years as a Software Engineer that every single tooling improvement that lets us do more in the same amount of time is eaten up by increasing demands on the capabilities of the software we make.
Thirty years ago user interfaces were either CLI or pretty simple, with no animations. A software system was just a single application - it ran on one machine with inputs and outputs on that machine - not a multi-tiered octopus involving a bunch of back-end data stores, control and data-retrieval middle tiers, another tier doing UI generation with a bunch of intermediate page-definition languages, and a frontend rendering those pages to the user and getting user input, probably with some local code thrown into the mix. Ditto for how cars are now mostly multiple programs running on various microcontrollers, with one or more microprocessors in the mix, all talking over a dedicated protocol. Ditto for how your frigging "smart" washing machine talking to its dedicated smartphone app probably involves a third machine in the form of some server from the manufacturer, and the whole thing runs over TCP/IP and the Internet (hence depending on a lot more machines with their own dedicated software, such as routers and DNS servers) rather than some point-to-point protocol (such as serial) like in the old days.
Anyways, the point being that even if AI actually delivers more upsides than downsides as a tool to improve programmer output, that gain is going to be eaten up by increasing demands on the complexity of the software we build, same as the benefits of better programming languages were, the benefits of better IDEs were, of the widespread availability of pre-made libraries for just about everything were, of templating were, of the ease of finding solutions to the problem one is facing from other people on the Internet were, of better software development processes were, of source control were, of collaborative development tools were, and so on.
Funnily enough, for all those things there were always people claiming they would make the life of programmers easier, when in fact all they did was make the expectations on the software being implemented go up, often just in terms of bullshit that's not really useful (the "smart" washing machine using networking to talk to a smartphone app so that the manufacturer can save a few dollars by not putting as many physical controls on it is probably a good example).
This assumes it is about output. 20 years of experience tell me it's not about output, but about profits, and those can be increased without touching output at all. 🤷‍♂️
*specifically short-term profits. Executives only care about the next quarter and their own incentives/bonuses. Sure the company is eventually hollowed out and left as a wreck, but by then, the C Suite has moved on to their next host org. Rinse and repeat.
Often they only want the illusion of output, just enough to keep the profits eternally rising.
Genuinely a bit shocked to see the number of robolovers in these comments. Very weird, very disheartening. No wonder so much shit online doesn't work properly lol
I don’t honestly believe that AI can save me time as a developer. I’ve tried several AI agents and every single one cost me time. I had to hold its hand while it fumbled around the code base, then fix whatever it eventually broke.
I’d imagine companies using AI will need to hire more developers to undo all the damage the AI does to their code base.
AI can absolutely save you time, if you use it right. Don't expect it to magically be as good as a real programmer... but for instance I made an HTML visualisation of some stuff using Claude, and while it got it a bit wrong, fixing it took me maybe 20 minutes, while writing it from scratch would have taken me at least a couple of hours.
I guess for some simple stuff it can work fine, but the majority of the code I write is not at all simple, and it’s all highly dependent on the libraries I’ve written, which the AI is really bad at learning.
And then in terms of documentation, it is just hopelessly inept.
I've found it can just about be useful for "Here's my data - make a schema of it" or "Here's my function - make an argparse interface". Stuff I could do myself but find very tedious. Then I check it, fix its various dumb assumptions, and go from there.
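Roughly the kind of boilerplate I mean (a minimal Python sketch; the `summarize` function and its flags here are made up, just standing in for whatever you already wrote):

```python
import argparse

def summarize(path: str, top_n: int = 10, verbose: bool = False) -> None:
    """Hypothetical existing function that the generated CLI wraps."""
    print(f"summarizing {path} (top {top_n}, verbose={verbose})")

def main() -> None:
    # The tedious-but-mechanical part: one argparse flag per parameter.
    parser = argparse.ArgumentParser(description="Summarize a data file.")
    parser.add_argument("path", help="input file to summarize")
    parser.add_argument("--top-n", type=int, default=10, help="number of rows to show")
    parser.add_argument("--verbose", action="store_true", help="print extra detail")
    args = parser.parse_args()
    summarize(args.path, top_n=args.top_n, verbose=args.verbose)

if __name__ == "__main__":
    main()
```

Nothing clever, just the sort of mapping-parameters-to-flags busywork that's quick to verify once it's stubbed out.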
Mostly though it's like working with an over-presumptuous junior. "Oh no, don't do that, it's a bad idea because security! What if (scenario that doesn't apply)" (when doing something in a sandbox because the secured production bits aren't yet online and I need to get some work done while IT fanny about fixing things for people that aren't me).
Something I've found it useful for is as a natural language interface for queries that I don't have the terminology for. As in "I've heard of this thing - give me an overview of what the library does?" or "I have this problem - what are popular solutions to it?". Things where I only know one way to do it and it feels like there's probably lots of other ways to accomplish it. I might well reject those, but it's good to know what else exists.
In an ideal world that information would be more readily available elsewhere but search engines are such a bin fire these days.
I mostly use AI as advanced autocomplete. But even just using it for documentation, it is wrong so often that I don't use it for anything more complex than tutorial-level stuff.
I got pretty far with cursor.com when doing basic stuff where I have to spend more time looking up documentation than writing code, but I wouldn't trust it with complex use cases at this point.
I check back every 6 months or so, to keep track of the progress. Maybe I can spend my days as a software developer drinking cocktails by the pool yelling prompts into the machine soon, but so far I am not concerned I'll be replaced anytime soon.
I was in the same boat about...3mos ago. But recent tooling is kind of making me rethink things. And to be honest I'm kind of surprised. I'm fairly anti-AI.
Is it perfect? Fuck no. But with the right prompts and gates, I'm genuinely surprised. Yes, I still have to tweak, but we're talking entire features being 80% stubbed in sub 1 minute. More if I want it to test and iterate.
My major concern is the people doing this and not reviewing the code and shipping it. Because it definitely needs massaging...ESPECIALLY for security reasons.
Which tools are you finding success with?
My theory is that C-suites are actually using "AI efficiency gain" as an excuse for laying off workers without scaring the shareholders.
"I didn't lay off 10% of the workforce because the company is failing. It's because... uhmmmm... AI! I have replaced them with AI! Please give us more money."
It's the next RTO.
The funny thing is that if AI coding were that good, we would already see widespread adoption in open source projects. But we haven't, because it sucks. Of course commercial software development companies are free to lie about how much they use AI, or get creative with their metrics so they can get their KPI bonuses. So we can't really believe anything they say. But we can believe in transparency.
As always, there are so many people selling snake oil by saying the word AI without actually telling you what they mean. Quite obviously there are a great many tools that one could call AI that can be and are and have been used to help do a ton of things, with many of those technologies going back decades. That's different from using ChatGPT to write your project. Whenever you hear someone write about AI and not give clear definitions, there's a good chance they're full of s***.
You can fucking swear on the internet
Ironically, processing large amounts of data and making soft decisions and planning based on such data makes AI ideal for replacing C-suite members.
Not to mention the cost savings difference. Developer salaries make a ChatGPT subscription look like a bargain. C-level salaries make racks of dedicated hardware to run local models look like one.
Let's make a community powered, open source project to do this and watch them squirm when investors demand that million dollar CEOs get replaced with AI for higher investor returns.
Pointing this out in company-wide meetings is a fun pastime.
AI-assisted coding […] means more ambitious, higher-quality products
I'm skeptical, based on my own (limited) experience, my use cases and projects, and the risks of using code that may include hallucinations.
there are roughly 29 million software developers worldwide serving over 5.4 billion internet users. That's one developer for every 186 users,
That's an interesting way to look at it, and that would be a far better ratio than I would have expected. Not every software developer serves internet users though.
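For what it's worth, the quoted figure does check out as a back-of-envelope calculation using only the two numbers from the article:

```python
# Quick sanity check of the article's figures.
developers = 29_000_000
internet_users = 5_400_000_000
print(internet_users / developers)  # ~186 users per developer
```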
it means more ambitious, higher-quality products
No ... the opposite actually.
That middle graph is absolute fucking bullshit. AI is not fucking ever going to replace 75% of developers, or I've been working way too fucking hard for way too little pay these past 30 years. It might let you cut staff 5-10% because it enables folks to accomplish certain things a bit faster.
Christ on a fucking crutch. Ask developers who are currently using AI (not the ones working for AI companies) how much time and effort it actually saves them. They will tell you.
What do you expect? Half of these decision makers are complete idiots that are just good at making money and think that means they are smarter than anyone who makes less than them. They then see some new hyped-up tech, they chat with ChatGPT, and they are dumb enough to be floored by its "intelligence", and now they think it can replace workers; but since it's still early, they assume it will quickly surpass the workers. So in their mind, firing ten programmers and saving like two million a year, while only spending maybe a few tens of thousands a year on AI, will be a crazy success that will show how smart they are. And as time goes on and the AI gets better, they will save even more money. So why spend more money to help the programmers improve, when you can just fire them and spend a fraction of it on AI?