If you're just worried about people you live with and passive scanning, I'd do a LUKS-encrypted flash drive and a txt file. If you're worried about more active stuff from three-letter agencies, then I still think digital is going to be the best bet, but you'd better use Qubes or even dedicate an air-gapped computer with an encrypted drive, and even that is iffy against a serious anti-government threat model.
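For the first option, the setup is only a few commands. A minimal sketch, assuming the flash drive shows up as /dev/sdb (verify with lsblk first, this erases the drive; the "secrets" mapper name and file name are just examples):

```shell
# WARNING: this wipes the flash drive. Confirm the device node with lsblk first.
sudo cryptsetup luksFormat /dev/sdb        # create the LUKS container (prompts for a passphrase)
sudo cryptsetup open /dev/sdb secrets      # unlock it as /dev/mapper/secrets
sudo mkfs.ext4 /dev/mapper/secrets         # put a filesystem inside the container
sudo mount /dev/mapper/secrets /mnt
sudo nano /mnt/passwords.txt               # edit your txt file
sudo umount /mnt
sudo cryptsetup close secrets              # lock it again before unplugging
```

After that, opening and mounting it again is just the `open`/`mount` pair, and anyone who grabs the drive without the passphrase sees only ciphertext.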
FMD Locator. It can auto-reply to a text with your location for whitelisted contacts.
Nice
My friend, who is better at identifying cars than I am, says it's an F-series BMW, probably an F10.
Looks like at least 6
My opinion generally aligns with those who are saying to talk with them so they have a better understanding and don't try to be overly strict with parental controls and such.
What I do want to add, and don't see in other comments, is that if you want tracking software, you can set up FMD Locator. It uses contact whitelisting: if the phone gets a specific text from a whitelisted contact, it will automatically text back its location. It isn't for the use case of constant tracking to see if they're sneaking out or whatever, but it works well if you want something more trust-based for location sharing.
I don't think so. I think they sort of have to branch off as lemmy gains users.
I use the GrapheneOS PIN + fingerprint combo, with a longer password as the fallback if that fails or the phone is BFU (before first unlock).
No. Lubuntu is designed to use very little in the way of resources, which makes it faster on slow hardware where the OS itself is a lot of the load. If you have fast hardware, regular Ubuntu might use (making these numbers up, but the point generally stands) 2% CPU and 3 GB of RAM while Lubuntu uses 1% CPU and 2 GB of RAM. That would be a much larger boost on a much weaker CPU with only 4 GB of RAM, but you likely wouldn't notice a difference on fast hardware.
Edit: spelling
Ollama can pull info from the web using multiple sites, but yes, local AIs are more prone to hallucination. Google did release Gemma 3, which has a 27B model that is probably the most cost-effective way to get into local models that rival ChatGPT (if you can call roughly $2k cost effective). That was also why I recommended duck.ai, as it has access to GPT and Llama 3.3 70B, which will do a lot better.
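If you do go the local route, trying that model with Ollama is just a pull and a run (the gemma3:27b tag is what Ollama's model library lists; the prompt is only an example):

```shell
# Download the 27B Gemma 3 weights, then chat with it interactively
ollama pull gemma3:27b
ollama run gemma3:27b

# Or hit the local HTTP API Ollama serves on port 11434
curl http://localhost:11434/api/generate \
  -d '{"model": "gemma3:27b", "prompt": "Why is the sky blue?", "stream": false}'
```

Smaller tags like gemma3:4b are worth trying first if your hardware is borderline.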
You could check out localllama on Lemmy to run FOSS AI models locally, or you could check out duck.ai as someone else mentioned. Your mental health should come first, so do what you can for privacy, but don't feel bad about making compromises.
Eh, I sometimes spin up a temporary docker container for some nonsense on a separate computer. I usually just go for it after checking no one is on and backing up necessary data.