Actually I think this might be https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1023748 and you might be able to get around it by having an older Java installed or set as the main one when installing the CA certificates package.
Your root problem is:
dpkg: error processing package ca-certificates-java (--configure):
installed ca-certificates-java package post-installation script subprocess returned error exit status 1
It can't finish installing ca-certificates-java because the script that is supposed to set it up isn't working. So it also can't finish installing anything that depends on it, since it doesn't want to run their scripts until ca-certificates-java is installed properly.
Maybe uninstall ca-certificates-java and whatever depends on it, then reinstall it alone, and see if you can get a message about why exactly its script is failing?
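A rough sequence for that, assuming apt (apt will list which dependent packages it wants to remove along the way; which ones those are depends on your system):

```shell
# Remove the broken package; autoremove clears out things that depended on it.
sudo apt remove ca-certificates-java
sudo apt autoremove

# Reinstall it by itself and watch the output from its post-installation script.
sudo apt install ca-certificates-java

# If it still fails, re-running the configure step directly sometimes
# prints a more useful error than the buried apt output does.
sudo dpkg --configure ca-certificates-java
```

`dpkg --configure` just re-runs the failing postinst script, so the error it prints is the actual thing to search for.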
Yes?
You can also shout into the void yourself. Eventually somebody might notice and reply.
You can watch the local or federated timelines and if someone says something interesting you can follow them and reply. Or if you get linked from elsewhere to a good post you can follow the author. Or you can see a list of popular hashtags (or look up a hashtag) and see posts in them, or post to them.
What else is there supposed to be to do there?
You can use interest rates to convert between stocks and flows of money. If the prevailing interest rate is 5%, a thing will produce 5%, or 1/20th, of its actual value every year. So you can take the annual cost of something and multiply by 20 (and vigorously wave your hands at compounding) to get its actual value.
A $10/month subscription costs $120/year, or $2,400 over 20 years. So it's equivalent to a $2,400 purchase.
You can also think of it as, you need to set aside $2,400 in investments to pay for your subscription, e.g. in retirement. Or, if you ditched your subscription you could afford to borrow $2,400 more to e.g. buy a house. Or, you as a customer are the same value to the business as $2,400 in capital, minus whatever they have to spend to make the thing.
You should think a lot about a $2,400 purchase.
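The arithmetic above, as a quick shell sketch (the 20x multiplier is just 1 divided by the 5% rate, hand-waving compounding as noted):

```shell
monthly=10
annual=$(( monthly * 12 ))       # $120/year
capitalized=$(( annual * 20 ))   # multiply by 1 / 5% = 20
echo "\$${monthly}/month is like a \$${capitalized} purchase"
```

Swap in a different prevailing rate and the multiplier changes accordingly (e.g. 1 / 4% = 25).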
Games are a good example. One might want to publish a game and then work on the next game, not go back to the first game again and add dynamic permission prompts for the accelerometer or recompile with the new SDK or whatever. But someone also might want to play Space Grocer I before Space Grocer II-X to get the whole story.
The fewer breaking changes there are, the lower the burden of an app being "supported" is. Someone might be willing to recompile the app every couple years, or add a new required argument to a function call, but not really able to commit to re-architecting the program to deal with completely new and now-mandatory concepts.
Even on software I actively work on that is "supported" by me, I struggle with the frequency of e.g. angry messages demanding I upgrade to new and incompatible versions of Node modules. Time spent porting to new and incompatible versions of a framework is time not spent keeping the app worth using.
If you write a commercial program and sell it once, you are probably not going to be selling new copies in 10 years. If you keep getting paid you should indeed keep working. But if you stop working on it, it is better for the finished software to last longer.
Windows 11 has a "compatibility mode" that goes back to before XP. Android has a dialog that says that an old APK "needs to be updated", regardless of the continued existence of the original developer or whether the user is happy with the features and level of support.
It is this attitude of "we don't need to think about backward compatibility because we are powerful and all software has a developer on call to deal with our breaking changes" that causes software to go obsolete very quickly now. User needs also change over time, but not nearly as fast.
Not all of the same weaknesses. If it's just "let the judge move stuff around because they're a judge", then yeah. But if you implement any sort of security on it, you can say that the judge can only move stuff when also countersigned by the jury, who were demonstrably selected by a fair random draw, or something.
And even if you don't do that you still have a great record of which judge exactly is stealing everyone's stuff.
You can't just wave a blockchain wand and get a government that works, but you can just wave a blockchain wand and get an accountable record of things.
Friends at the Table is the best podcast.
It's an actual play podcast run in game systems designed mostly for story generation, operated by people who know there's no such thing as a monster, and I'd never seen anything like it.
They ran some seasons in a post-fantasy-apocalypse world, some in a Star-Wars-meets-Gundam science fantasy world, and one recently in a Western sort-of-horror setting. I started at the beginning, with Autumn in Hieron, featuring orc archivists who work magic using extremely specific shopping lists, undead pastry chef boyfriends, and an "evil" alignment of "destroy something rather than trying to understand it".
But for the impatient you can start with Marielda. Marielda is a series of heists by a crew of illegal knowledge dealers, in a fantasy city that sounds like New Orleans, patrolled by living statues and ruled by a god who forged the sun, whom our players proceed to fight.
The sci-fi side, which is running its fourth season now, starts with COUNTER/Weight, a game set in the aftermath of a mecha movie never made. It features a character who is "what if Han Solo used to be Beyoncé", psychic hackers, and mechs who might be gods.
Also there's no sponsors because the GM is too punk for that.
How come Yogthos is all over my Hot page with months-old posts though? Is tankie-ism the secret to blazing hot posts months in the future?
This is great!
You take a PeerTube channel name and treat it as if it were a community name on Lemmy (https://peertube.instance/c/channelname) and search it up in your instance and make it federate over.
ZFS RAID-Z is pretty good for this I think. You hook up the drives from one "pool" to a new machine, and ZFS can detect them and see that they constitute a pool and import them.
I think it still stores some internal references to which drives are in the pool, but if you add the drives from the by-ID directory when making the pool it ought to be using stable IDs at least across Linux machines.
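A sketch of that workflow, with a made-up pool name and device IDs (the raidz level and disk list are whatever your setup actually uses):

```shell
# Create the pool using stable /dev/disk/by-id paths, not /dev/sdX names
# (device IDs here are placeholders):
sudo zpool create tank raidz1 \
    /dev/disk/by-id/ata-DISK_A \
    /dev/disk/by-id/ata-DISK_B \
    /dev/disk/by-id/ata-DISK_C

# Later, on the new machine: scan attached disks for importable pools,
# then import the pool by name, searching the by-id directory.
sudo zpool import
sudo zpool import -d /dev/disk/by-id tank
```

`zpool import` with no arguments only lists what it finds, so it's safe to run first and check that the whole pool showed up before actually importing.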
There's also always Git Annex for managing redundancy at the file level instead of inside the filesystem.