This kind of logic never made sense to me. If an AI could build something like Netflix (even if it needed the assistance of a mid-level software engineer), then every indie dev would be able to build a Netflix competitor, driving the value of Netflix down. Open source tools would quickly surpass any closed source software, and would become very user-friendly without much effort.
We’d see LLMs being used to rapidly build and improve infrastructure that’s currently complex and slow, like compilers, IDEs, and build systems, and to rewrite slow software in faster languages. The many projects stalled today for lack of manpower would be flourishing, flooding us with new apps and features at an incredible pace.
I’ve yet to see it happen. And that’s because for LLMs to produce anything of sufficient quality, they need someone who understands what they’re outputting: someone who can add the necessary context to each prompt, test the result, and integrate it into the bigger picture without causing regressions. That’s no simple job, and it even requires understanding the LLMs’ own processing limitations.
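Roughly, the minimum loop I mean looks like this (a sketch only: `generate_patch` is a hypothetical stand-in for whatever model client you’d actually call, and `add` is just a toy target function, not anything real):

```python
# Sketch only: generate_patch() is a hypothetical stand-in for whatever
# LLM client you actually use. The human supplies the context in the
# prompt and owns the tests that gate whether the output gets merged.

def generate_patch(prompt: str) -> str:
    """Hypothetical LLM call returning Python source for one function."""
    raise NotImplementedError("swap in your real model client here")

def passes_human_tests(fn) -> bool:
    """The reviewer encodes the actual requirements as tests."""
    return fn(2, 3) == 5 and fn(-1, 1) == 0

def review_and_integrate(prompt: str) -> bool:
    source = generate_patch(prompt)
    namespace: dict = {}
    exec(source, namespace)               # run the generated code in isolation
    candidate = namespace["add"]          # assumes the prompt asked for add()
    return passes_human_tests(candidate)  # merge only if it passes
```

The model writes the code, but a human decides what “correct” means, and nothing ships until it passes those checks.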
LLMs, by definition, can't push the limit. They can only ever produce things that look like what they were trained on. That makes them great for rapidly producing mediocre material, but it also means they will never produce anything complex.