nyan

joined 9 months ago
[–] nyan 3 points 6 months ago

I agree that Gentoo will probably work, as it still has functional i486 support. Be aware that you may be spending a lot of time compiling if you go that route and don't have a second, faster machine to use for distcc or the like.
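For reference, the distcc route on the Gentoo side is only a couple of config lines — a minimal sketch, assuming a faster helper box at 192.168.0.10 (hypothetical address, and the helper needs a matching compiler version):

```shell
# /etc/portage/make.conf — let Portage hand compile jobs to distcc
FEATURES="distcc"
# job count roughly = total cores across local + remote machines
MAKEOPTS="-j6"

# then, once as root, point distcc at the helper:
# distcc-config --set-hosts "192.168.0.10 localhost"
```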

As for the nvidia card, the proprietary driver won't work for something of that age. Check the supported cards in Nouveau (and maybe even the really old drivers for prehistoric cards). In a pinch, the vesa driver should work. Good luck.

[–] nyan 22 points 6 months ago

> Gnome and other desktops need to start working on integrating FOSS

In addition to everything everyone else has already said, why does this have anything to do with desktop environments at all? Remember, most open-source software comes from one or two individual programmers scratching a personal itch—not all of it is part of your DE, nor should it be. If someone writes an open-source LLM-driven program that does something useful to a significant segment of the Linux community, it will get packaged by at least some distros, accrete various front-ends in different toolkits, and so on.

However, I don't think that day is coming soon. Most of the things "Apple Intelligence" seems to be intended to fuel are either useless or downright off-putting to me, and I doubt I'm the only one—for instance, I don't talk to my computer unless I'm cussing it out, and I'd rather it not understand that. My guess is that the first desktop-directed offering we see in Linux is going to be an image generator frontend, which I don't need but can see use cases for even if usage of the generated images is restricted (see below).

Anyway, if this is your particular itch, you can scratch it—by paying someone to write the code for you (or starting a crowdfunding campaign for same), if you don't know how to do it yourself. If this isn't worth money or time to you, why should it be to anyone else? Linux isn't in competition with the proprietary OSs in the way you seem to think.

As for why LLMs are so heavily disliked in the open-source community? There are three reasons:

  1. The fact that they give inaccurate responses, which can be hilarious, dangerous, or tedious depending on the question asked. A lot of nontechnical people, including management at companies trying to incorporate "AI" into their products, don't realize just how inaccurate the answers can be.
  2. Disputes over the legality and morality of using scraped data in training sets.
  3. Disputes over who owns the copyright of LLM-generated code (and other materials, but especially code).

Item 1 can theoretically be solved by bigger and better AI models, but 2 and 3 can't be. They have to be decided by the courts, and at an international level, too. We might even be talking treaty negotiations. I'd be surprised if that takes less than ten years. In the meanwhile, for instance, it's very, very dangerous for any open-source project to accept a code patch written with the aid of an LLM—depending on the conclusion the courts come to, it might have to be torn out down the line, along with everything built on top of it. The inability to use LLM output for open source or commercial purposes without taking a big legal risk kneecaps the value of the applications. Unlike Apple or Microsoft, the Linux community can't bribe enough judges to make the problems disappear.

[–] nyan 2 points 6 months ago (2 children)

Which means that if you have a flatpak with an uncommon library and the dev stops issuing updated flatpaks because they get hit by a bus, you could be SOL with respect to that library. Distro libs are less likely to have this happen because very few distros have a bus factor of 1—there's usually someone who can take over.

[–] nyan 1 points 6 months ago* (last edited 6 months ago)

Not for everyone, no. For me, each supposed pro has a corresponding con or is just a no-op:

  1. Only one package for all distros: Despite what people think, this does not lower the amount of work for the program's creator, who was never required to create any sort of binary package at all. Furthermore, it means that fewer people are checking the package for faults—that's part of what distro maintainers do, y'know.

  2. No external dependencies: Not only does this cause disk bloat, but it means that if the flatpak is no longer updated, the dependencies packaged inside it may not be either . . . which is one of the issues that dynamic linking was supposed to avoid in the first place. Might as well just go old-school and statically link the binary.

  3. Installations at user rather than system level: Only of value if I don't have admin authority, and I don't have to deal with a single system where that's the case, so this is a no-op.

  4. Supposedly more rapid updates: I'm running Gentoo, not Debian ~~fossil~~ :cough: oldstable. If I really want to, I can have my package manager install direct pulls from source control for many packages. New changes every day—beat that, flatpak. Plus, unless there's been a substantial change to a package's build method, I can bump actual releases myself just by copying and renaming a small file, then running a couple of commands.

  5. Sandboxing: As far as I'm concerned, the amount of security added by sandboxing and the amount of security added by the additional scrutiny from the distro maintainers is probably about even (especially since the sandbox, as a non-trivial piece of software, will inevitably contain bugs). And I can throw firejail on top if I'm worried about something specific (or run it in a VM if I'm really nervous). I can understand why this might be attractive to some people, but for me the weight is very low.
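For the curious, the version-bump trick in item 4 looks roughly like this in a local overlay (package name and versions hypothetical, and it only works when the build process hasn't changed between releases):

```shell
# assuming app-misc/foo lives in a local overlay at /var/db/repos/localrepo
cd /var/db/repos/localrepo/app-misc/foo
cp foo-1.2.3.ebuild foo-1.2.4.ebuild   # the "small file" is the ebuild itself
ebuild foo-1.2.4.ebuild manifest       # fetch the new tarball, record checksums
emerge --ask app-misc/foo              # build the bumped version
```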


So I'm left with avoiding bloat and bugs in flatpak's system integration vs. a little bit of security gained by additional sandboxing (which I don't think I really need, because I'm only mid-level paranoid). Thus, I'm not interested in complexifying my update process by incorporating flatpak into my system. Others' needs may be different.

[–] nyan 5 points 6 months ago

Ships with Gentoo by default, since you actually need a nongraphical editor there and nano is easier to learn than vi or emacs.

[–] nyan 2 points 6 months ago

They're unlikely to do worse than my laptop from 2008, and it's perfectly usable under Linux (bit of a lag when starting up large programs, that's all). As has already been said, go for a lighter desktop environment (XFCE, LXQT, Mate, TDE) unless these machines were high-spec'd for their era. For the oldest machines, you might want to consider installing Puppy Linux rather than one of the more mainstream distributions, since Puppy specializes in old x86-family hardware.

[–] nyan 1 points 6 months ago

Some of them throw up their hands and reinstall at the first sign of a problem. The rest get someone else to do the "hard" part for them, in my experience. They hand it over to the Unofficial Repair Person Paid in Beer and Pizza, who does the command-line stuff, registry editing, etc. Or they get an official repair person. Less than 10% of the Windows-using population does their own fixes.

[–] nyan 2 points 6 months ago* (last edited 6 months ago)

Gentoo's benefits come from having software compiled specifically for your CPU, taking advantage of its quirks. Technically that's achievable with other distros as well; it's just a lot more work when it isn't built into your package manager. You can also eke out additional performance by building a custom kernel and removing various features that are meant to protect against bugs or security concerns, and while Gentoo doesn't push custom kernels as hard as it did twenty years ago, the capability is still readily accessible.
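The "compiled for your CPU" part is just a couple of lines of configuration — a sketch, with illustrative flags:

```shell
# /etc/portage/make.conf
COMMON_FLAGS="-march=native -O2 -pipe"   # tune codegen to this exact CPU
CFLAGS="${COMMON_FLAGS}"
CXXFLAGS="${COMMON_FLAGS}"
```

On the kernel side, booting with `mitigations=off` on the kernel command line is the blunt-instrument version of disabling the speculative-execution protections — a measurable speedup on some workloads, with the obvious security tradeoff.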

So: Gentoo makes it easier to access methods that can in theory be used to speed up any distro. The gains are either quite modest (for custom compilation) or not necessarily that good a tradeoff (disabling Spectre mitigations and other protections in the kernel). 🤷

(Yes, I wrote a serious response to a joke post. Bite me.)

[–] nyan 5 points 6 months ago

One thing no one seems to have touched on yet: distros have philosophies—guiding principles that affect what packages they have and how they're presented.

For instance, Debian is strongly open-source-centric. Closed-source software is not normally found in their main repository (even when it would be useful to put it there, like some drivers).

Gentoo, on the other hand, is all about user choice. You're expected to choose for yourself whether you want to use systemd or OpenRC, X or Wayland, which DE (if any) you want to use, and which features you do or don't want compiled into your software. However, Gentoo is quite happy to include closed-source software in the main package repository, because using it is also a choice that some people prefer to make.

Red Hat, Arch, and Slackware (to name the remaining major foundational distros) also have their own philosophies. Some descendant distros retain their parents' principles, while others discard them and develop a philosophy of their own (Ubuntu doesn't have Debian's Open Source Uber Alles tendencies, for instance).

[–] nyan 1 points 6 months ago

I prefer mplayer—novel-length man page and all—for video, but there's nothing innately wrong with VLC. I did try it, a very long time ago, but it felt too GUI-oriented for my taste back then.

(I can think of exactly two times mplayer has failed to play a file I presented it with, and in both cases it was my own fault for not compiling in support for that codec. However, the man page is justifiably frightening.)

[–] nyan 3 points 6 months ago

I have a laptop of that era (2008 HP Pavilion, Athlon64x2, 2GB RAM, 100GB HDD). It runs the Trinity Desktop Environment, which works just as well now as it did when that laptop was a flagship machine. (Updating a Gentoo system running on a machine that old is a bit time-consuming, mind you, but that isn't the DE's fault.)

I've tried several of the other lighter-weight DEs—XFCE, LXDE, Lumina, Gnome2 before it became MATE—but TDE does what I need it to do, and (just as importantly) the development team prefers to work on features and compatibility rather than tearing out things that still work or forcing new paradigms that don't really make sense for my use case onto me. It's there, it's solid, and I've already learned its quirks, so I can save my brain cells for learning useful features in other programs rather than having to figure out where the control for some bit of the GUI ran off to this time. Why would I use anything else? The thing I want most from my DE is for it to stay out of the way and not keep me from using other software.

(Plus, Konqueror may no longer be useful as a web browser, but it's still a better file manager than, say, Thunar, which I found to be a pain in the arse when I tried XFCE.)

[–] nyan 6 points 7 months ago

> Rolling-release Linux distros such as Arch, Gentoo, and OpenSUSE Tumbleweed constantly release the latest updates, but they’re not used in businesses.

So some businesses decided that monolithic releases were more important than being able to get the latest upstream vanilla kernel version, and somehow that's the fault of "all Linux kernel vendors" (including rolling-release distros, since there was no attempt to qualify "all") and not the businesses' decisions about tradeoffs?
