As someone who has worked in software for 30 years, deploying complicated software, I've concluded that shared libraries are a mistake. You think you get the benefits of smaller size and easy security upgrades, but thanks to deployment hell you end up using Docker, and now your deployment has effectively added a whole OS in size, and you need to do security upgrades for that OS instead of just your application. I use Rust for some software now, and I build it with musl, and I'm struck by how small things get relative to a regular deployment. It feels like magic that I no longer get glibc incompatibility issues.
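A minimal sketch of that kind of musl build, assuming an x86_64 Linux host and a standard rustup/cargo toolchain:

```
# Add the musl target once, then build a release binary against it.
rustup target add x86_64-unknown-linux-musl
cargo build --release --target x86_64-unknown-linux-musl
```

For pure-Rust dependency trees this typically yields a fully static binary with no glibc dependency; crates that wrap C libraries may additionally need a musl-capable C toolchain.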
Maybe tackle that deployment hell instead of band-aiding it with Docker?
He is. By using statically linked binaries.
Technically this conflates two things: bundling dependencies and static/dynamic linking. But since you have to bundle your dependencies to use static linking, and there's little point in dynamic linking if you bundle your dependencies... most of the time they're synonymous.
Exceptions are things like plugins, but that's pretty rare.
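As an illustration of the plugin case, here's a minimal sketch of loading a shared object at runtime with the libloading crate; the plugin path and symbol name are made up for the example:

```rust
use libloading::{Library, Symbol};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Loading and symbol lookup are unsafe: the library must actually
    // export a symbol with this name and signature.
    unsafe {
        // "plugin.so" and "plugin_entry" are hypothetical names.
        let lib = Library::new("./plugin.so")?;
        let entry: Symbol<unsafe extern "C" fn() -> u32> = lib.get(b"plugin_entry")?;
        println!("plugin returned {}", entry());
    }
    Ok(())
}
```

Here the host binary can still be statically linked; only the plugin boundary stays dynamic.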
Maybe for your use cases that's OK, but there are many situations where the size and ease of upgrading provided by shared libraries are worthwhile. For example, it would suck to have to push a 40+ GB binary to a fleet of systems with a poor or unreliable internet connection. You could try to mitigate this sort of thing by splitting the application into microservices, but that adds complexity, and it isn't always a viable tradeoff if maximizing compute efficiency is also a concern.
I'm not so sure that dynamic libraries always reduce the size, especially with libraries that are linked by a single binary.
With static libraries, you can conditionally compile only the features you're gonna use. With dynamic libraries, however, the whole library must be compiled.
EDIT: just to clarify, I'm not saying that static libraries always result in less size. I'm saying that it's not a black-and-white issue.
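To illustrate the feature-gating point, a minimal sketch using a hypothetical Cargo feature named `extras`:

```rust
// In the library's Cargo.toml:
//
// [features]
// default = []
// extras = []
//
// Code behind the flag is simply absent from the consumer's final
// statically linked binary unless the feature is enabled:
#[cfg(feature = "extras")]
pub fn rarely_used_helper() {
    // ...large, optional functionality...
}
```

A shared library, by contrast, generally has to ship every code path, because it can't know at build time which of its consumers will need what.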