this post was submitted on 26 Nov 2023
6 points (87.5% liked)

Self-Hosted Main

511 readers

A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.

For Example

We welcome posts that include suggestions for good self-hosted alternatives to popular online services, how they are better, or how they give back control of your data. Also include hints and tips for less technical readers.

founded 1 year ago

For now my server doesn't hold very important data; most of it is "Linux ISOs" I can just download again. I'm thinking of starting to move my files and photos to the server, but I'm afraid. What if I get ransomware I don't notice and all my backups get encrypted too? Or the backups are corrupted and my disks break? But I'm also afraid of the cloud, because I've seen posts about people getting their Google accounts closed without notice for breaking the TOS (maybe they did something wrong, maybe not).

[–] [email protected] 4 points 11 months ago (5 children)

It is impossible to fully eliminate the risk, but with a decent backup system in place it is quite unlikely that you will lose all of your data.

The 3-2-1 rule should be used as a baseline. Your local backup should be snapshotted and somewhat hardened against ransomware (pull backups instead of pushing them, and do not mount the backup volume on other machines). Cold backups also help.

Can I construct scenarios in which I lose all my stuff? Sure. But in those, we are either in deep shit anyway (a coronal mass ejection, some big asteroid) or it is pretty unlikely (targeted hacking).

[–] [email protected] 1 points 11 months ago

This. With a proper backup strategy, you are reducing the probability of a catastrophic sequence of events. It becomes P(some unlikely event) x P(some other unlikely event) x ... etc., for as many events as you can think of and/or can afford to mitigate.

As you say, the risk will never be zero. And even the best-laid plans can fail: the GitLab incident a few years back saw five layers of backups and disaster preparedness fail.

Really, all you can do is back up your data using standard methods, and TEST THE RESTORE before you need to rely on it!

[–] [email protected] 4 points 11 months ago

I'm more scared of online services being discontinued and/or getting vendor-locked and forced to pay a ransom on a regular basis. Therefore, I host and back up everything on my own.

[–] [email protected] 3 points 11 months ago (1 children)

Follow the 3-2-1 principle and there is less reason to be scared of losing data.

[–] [email protected] 3 points 11 months ago

A) Make backups. B) Take them offline.

[–] [email protected] 2 points 11 months ago (1 children)

You SHOULD be scared of losing your data. In fact, it's a very likely outcome if you're storing data and don't know what you're doing. This is true of every electronic storage format. If you're not ready to lose everything, you have to periodically practice recovery from backup.

Over the years, I've met tons of novice computer users who tell me "I'm worried my files will get hacked if I store them on Dropbox / the cloud". I always set them straight: the number one risk for you is losing your own data, not data theft, unless somehow your files contain industrial secrets worth hundreds of millions of USD.

I consider myself an experienced computer user and developer, having had various roles that border on sysadmin, and I don't trust myself to run my DIY NAS. For the stuff that matters, you should fear complexity, as it's a source of errors. You should doubt yourself at every step. Practice recovery. This is true for everything. I messed up my pfSense router config this weekend and it wouldn't boot. I took the opportunity to practice recovering from a backed-up config (which I should have done much earlier).

[–] [email protected] 2 points 11 months ago

ZFS mirror with two HDDs. If one HDD fails, replace it and let it rebuild. Use 3 mirrored HDDs if you really think you could get another failure while the array is rebuilding.

Also keep two external backups: one you update regularly at home, and another you keep off-site. When you visit that location (be it your parents', siblings', relatives', or friends' house), swap your home external backup with the off-site one to ensure it's kept up to date.

Make sure all disks are fully encrypted of course.
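A minimal sketch of this layout, assuming two spare disks; the device names, pool name, and dataset name are placeholders, and the commands need root:

```shell
# Two-disk mirror: either disk can fail without data loss.
zpool create tank mirror /dev/sda /dev/sdb

# Encrypted dataset, per the "fully encrypted" advice above.
zfs create -o encryption=on -o keyformat=passphrase tank/data

# After swapping out a failed disk, trigger the rebuild (resilver):
zpool replace tank /dev/sda /dev/sdc
zpool status tank    # watch resilver progress
```

With a three-way mirror (`zpool create tank mirror sda sdb sdc`), a second failure during resilvering still leaves one good copy.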

[–] sloppy_diffuser 2 points 11 months ago

I back up to Backblaze B2, encrypting everything myself using rclone. It costs me $1-2/mo for the roughly 100 GB I'm currently using.

The API key I use for automated backups is pretty much limited to write-only, and files are set to hidden when deleted, so if the key were stolen and my backups defaced it would be an annoyance, not much of a risk.

Once a year I might go delete some history to reduce my usage.

I lean towards scripts that automate setting up a system, so I don't do full system backups. Downloaded video I also mostly skip, relying on mirrored storage; in the event of a real disaster, it's an acceptable loss.
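A sketch of this kind of setup, with placeholder names: "b2crypt" stands for an rclone crypt remote layered over a B2 remote, created beforehand with `rclone config`, so data is encrypted client-side before upload.

```shell
# Sync local data to the encrypted B2 remote.
rclone sync /srv/data b2crypt:backup

# B2's default behaviour (--b2-hard-delete=false) only hides deleted
# files, so even if a stolen write-capable key defaces current files,
# previous versions stay recoverable from the bucket's version history.
```

The key property is that the encryption passphrase lives only in the local rclone config, never at Backblaze.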

[–] [email protected] 2 points 11 months ago

No... I have proper, tested backups.

[–] [email protected] 2 points 11 months ago

3-2-1 rule; anything super critical also gets off-sited to the cloud.

[–] [email protected] 2 points 11 months ago (1 children)

My backup strategy:

Data:
- Syncthing, with one copy on my clients and one copy on my server, accessible via Nextcloud
- Daily push backup of my Nextcloud data folder via Kopia to Backblaze
- Daily pull backup of my Nextcloud data folder to a QNAP NAS in the basement

VM:
- Daily backup of my VMs to a Proxmox Backup Server running on the QNAP NAS
- Daily backup of my VMs to Backblaze (encrypted beforehand)

Still, I'm not a fan of having just one cloud backup. So I think I will also get Hetzner cloud storage for Borg, in addition to Kopia.

Goal:
- Different Hardware (Server, QNAP, etc.)
- Different Backup software (Syncthing, Kopia, Borg)
- Different Backup technique (Push, Pull, Snapshots)
- Different Locations
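The "different software" layer above can be sketched as the same data directory backed up with two independent tools to two independent targets. Repo addresses are placeholders, and both repos are assumed to have been initialized/connected already:

```shell
# Kopia pushes to the cloud repo it was connected to beforehand
# (e.g. via `kopia repository connect b2 ...`).
kopia snapshot create /srv/nextcloud-data

# Borg writes a dated archive to a second, independent SSH target
# (placeholder host; repo created earlier with `borg init`).
borg create "ssh://u123@storage.example.com:23/./backups::{now}" \
    /srv/nextcloud-data
```

If one tool ever has a repo-corrupting bug, the other tool's repository format is unaffected.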

[–] [email protected] 1 points 11 months ago (1 children)

How do you prevent file corruption from being backed up along with everything else?

[–] [email protected] 2 points 11 months ago (1 children)

Aren't you scared about losing your data?

No. I still have files from 1991. I've got files that have migrated from floppy disk to hard drive to QIC-80 tape to PD (Phase Change) optical disk to CD-RW to DVD+RW and now back to hard drives.

What if I get ransomware I don't notice and all my backups get encrypted too?

Then you need to detect the ransomware before you back up. I use rsync --dry-run and look at what WOULD change before running it for real. If I saw thousands of file changes I didn't expect, I would not run the backup; I'd investigate what changed before running the rsync command for real.

Or if the backups are corrupted

I have 3 copies of my data. Local file server, local backup, remote file server.

I also run rsnapshot on /home every hour to another drive in the machine. I also run snapraid sync to dual parity drives in the system once a day.

I generate and compare stored file checksums twice a year across all 3 copies to detect any corruption. Across my 300 TB, I see about one failed checksum every 2 years.
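The checksum routine can be sketched as a sha256 manifest stored next to each copy and re-verified on each audit. The path below is a demo stand-in for one of the copies:

```shell
#!/bin/sh
set -eu
COPY="/tmp/ckdemo/copy1"    # demo stand-in for one of the 3 copies
mkdir -p "$COPY"
echo "photo bytes" > "$COPY/img.raw"

# Generate the manifest once (excluding the manifest itself)...
( cd "$COPY" && find . -type f ! -name SHA256SUMS -print0 \
    | xargs -0 sha256sum > SHA256SUMS )

# ...then verify it on each audit; a nonzero exit means corruption.
( cd "$COPY" && sha256sum -c --quiet SHA256SUMS )
```

Comparing the manifests of all three copies against each other then tells you which copy holds the good version of a corrupted file.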

and my disks break?

If one of my disks breaks I buy a new one and restore from backups.

But also I'm afraid about cloud

I don't use any cloud services because I don't trust them.

[–] [email protected] 2 points 11 months ago

Some data is backed up to a local NAS, and some of that data is backed up to the cloud (not Google or the other big ones).

Most of my data isn't important. My photo library is local, in the cloud, and mostly on offsite DVDs.

My ~45K lossless music files are local and in the cloud. Losing those would suck, but I could rip them again.

I've been considering tape backup again; it's been about 20 years since I used it at home.

[–] [email protected] 2 points 11 months ago

Paranoia is the reason I self host. Clouds can kick you out or lose your data at any time.

[–] [email protected] 1 points 11 months ago

Personally, my NAS isn't my main storage. I still use Google Photos and Google Drive for my important stuff; I just need to configure rclone to download it all onto the NAS.

The one thing I'm really only self-hosting is my music, and outside of a couple of CDs and iTunes downloads, I don't have a proper backup of it.

[–] [email protected] 1 points 11 months ago

I run Proxmox VE and Proxmox Backup Server on two machines at the same time. I pull the main backups from the main machine, where all the vdisks are, to the second one. So far it works like a charm. A third, off-site machine is in the making.

[–] [email protected] 1 points 11 months ago

I'm currently using just an external drive to back up to. I use cloud storage for all my personal files, but my systems (I run a lot of servers that would be a pain to rebuild and reconfigure) and all my Linux ISOs are backed up nightly to a large external hard drive. However, I appreciate that I'm not covered for the local-disaster scenario if my house were to catch fire, so my plan is to also implement Backblaze cloud backups of my server machine, so I'd at least have cloud backups of my backups.

[–] [email protected] 1 points 11 months ago

Your chances of being hit by lightning are probably higher than the chance of all 3 copies of your data being inaccessible at the same time, for whatever reason.

[–] [email protected] 1 points 11 months ago

You didn't say how you currently keep your data...

[–] [email protected] 1 points 11 months ago

I'd be more concerned to lose data that is stored in the cloud than on my private network.

The adage "there is no cloud, it's just someone else's computer" is still true.

If you are afraid to lose the data on your clients and servers in your private network, improve your backup strategy and make sure to have one backup off premise (in a safe deposit box if needs be).

It doesn't hurt to improve overall security on your private network, either. 😉


[–] [email protected] 1 points 11 months ago (1 children)

Immutability! For some reason nobody here has mentioned it! There's only one thing that can protect you against ransomware: backup storage with immutability. It can be S3 Object Lock, a custom script setting the immutable flag, read-only snapshots, and so on...

But... you need to make sure that your backup storage is properly tightened down, so that even you, as the owner, cannot change immutable data without physical access to the server.
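Two sketches of the immutability idea; bucket, key, and path names are placeholders, and both commands need appropriate privileges on the backup side (not on the machine being backed up):

```shell
# 1) Local immutable flag: set on the backup host after each run, so the
#    snapshot cannot be modified or deleted until the flag is cleared
#    there (which requires CAP_LINUX_IMMUTABLE, i.e. root on that host).
chattr -R +i /backups/server/2023-11-26

# 2) S3 Object Lock in compliance mode: the object cannot be overwritten
#    or deleted before the retention date, even by the bucket owner.
aws s3api put-object --bucket my-backups --key snap.tar.zst \
    --body snap.tar.zst \
    --object-lock-mode COMPLIANCE \
    --object-lock-retain-until-date 2024-11-26T00:00:00Z
```

Note that Object Lock must be enabled on the bucket at creation time for the second variant to work.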

[–] [email protected] 1 points 11 months ago

I didn't know this could be done. I will check it out for my photo backups, as they aren't going to change. Thanks!

[–] [email protected] 1 points 11 months ago

Not really. Anything important I keep in multiple locations. Any media I host I'm not worried about, because I can just re-download it.

Photos and video I have both in the cloud and on my server. Anyone getting their Google account closed is probably uploading things that aren't allowed.

In the future I want to build an additional, quiet server and set it up at a relative's house out of state, so that I have 3 copies of important data.

[–] [email protected] 1 points 11 months ago

I store my documents on a 3-disk RAID 5, which is backed up to a brand new Red NAS drive, which is backed up to an 8 TB external via Borg, and finally to my 1 TB OneDrive via rclone.

So at this point, no...

[–] [email protected] 1 points 11 months ago

Currently yes. But in the future, no.

[–] [email protected] 1 points 11 months ago

I have one server at home, one off-site, encrypted files in the cloud, and for the most important stuff I also have cold storage, which I checksum once in a while. If even this gets me screwed, I don't know what would help.

[–] [email protected] 1 points 11 months ago

Piggybacking: is anyone using LTO for backup/archival in their homelab?

[–] [email protected] 1 points 11 months ago (1 children)

No. I self-host 100%, so I have two separate storage stacks (TrueNAS) that are always replicating.

[–] [email protected] 1 points 11 months ago

Unless you keep enough snapshots, replication will replicate deletions or encrypted files too.
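With ZFS, "enough snapshots" means replicating dated snapshots incrementally and keeping history on the target, so a delete or encrypt event can be rolled back there. A sketch, with placeholder pool/dataset names and hostname:

```shell
# Take a dated snapshot of the live dataset.
zfs snapshot tank/data@2023-11-26

# Send only the delta since the previous snapshot to the backup host.
zfs send -i tank/data@2023-11-25 tank/data@2023-11-26 | \
    ssh backuphost zfs receive backup/data

# If ransomware damage gets replicated, the target can still roll back
# to a pre-incident snapshot:
#   zfs rollback -r backup/data@2023-11-25
```

The replicated snapshots on the target are read-only, which is exactly what makes the rollback possible.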

[–] [email protected] 1 points 11 months ago

Anything I'd truly hate to lose is in at least two locations; it's a few hundred gigs in total. 95% of the content I have I could reacquire in about a week.

[–] [email protected] 1 points 11 months ago

But also I'm afraid about cloud because I've seen some posts about people getting their google accounts closed without notice

That's why you don't back up important data with Google, Apple, Microsoft, etc. Even paid accounts aren't safe with these companies, and it's very difficult to reach a real person if you need help. Use a company whose sole purpose is backups. I use Backblaze.

[–] [email protected] 1 points 11 months ago

That's why I burn M-Discs: even if I put them in a system with malware, I can be sure it can't mess with the data. In the cloud I use object storage with object locks...

You can just set up a MinIO cluster (ideally separated in hardware and firewall) and put stuff there with Object Lock if you want to be sure.

I also removed write access from anything I don't want overwritten, combined with rclone's --immutable flag. (It's both push and pull: my laptop is not on all the time, so the laptop pushes when it's on, and the server pulls whenever it is running.)
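The --immutable behaviour mentioned above, as a one-line sketch (the source path and remote name are placeholders for a pre-configured rclone remote):

```shell
# New files are copied as usual; if an existing file in the backup no
# longer matches the source (size/modtime changed), rclone reports an
# error instead of overwriting it, so encrypted or clobbered source
# files can't replace good backup copies.
rclone copy /srv/photos remote:photos --immutable
```

This fits append-mostly data like photo archives well, since legitimate in-place edits are rare there.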

[–] [email protected] 1 points 11 months ago

My technique is to not do too much, so that when it breaks I can get it back. Or keep a script on GitHub that re-downloads everything.

[–] [email protected] 1 points 11 months ago

Incremental Borg backups, dumped encrypted into Google Drive.

[–] [email protected] 1 points 11 months ago

I have my important data stored on a RAID1 for some redundancy, but otherwise it's just snapshotted to an external drive plugged into my server. It's not ideal and I have two plans I really need to get around to implementing.

I have a detached garage, so I am going to set up another machine hidden in its roof space for "offsite" backups. It won't be found if my house is burgled (you need a ladder to reach it), and it's far enough away that it would hopefully survive if my house burned down. My garage already has an Ethernet run.

Another option would be storing some stuff at a friend's place if you have friends who self-host. I've also been meaning to set up a WireGuard tunnel to a friend's server so we can borrow a little bit of each other's space.
