Literally less than 24h ago: https://www.reddit.com/r/DataHoarder/comments/17y8omd/small_offsite_backup/
Wow, thanks. I wonder how I missed that.
I have a very similar setup: an old PC with lots of big disk drives at my mom's house.
However, instead of leaving it powered on 24/7, I have it configured to power itself on at a particular day and time every week. My backup script connects to it, mounts the disks, copies the data, and then sends a shutdown command.
This way, the computer isn't running 24/7 (and open to hacking attempts), and the disks are only spinning when they're in use. It also saves my mom some money on her power bill :)
Just to be safe, I still have the PC plugged into a UPS.
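A rough sketch of what the pull side of a setup like this could look like. The hostname and paths here are made up, and with DRY_RUN=1 (the default) it only prints the commands instead of running them, so it's safe to try:

```shell
#!/usr/bin/env bash
# Weekly pull backup from the remote PC, then power it down.
# "backup-box", /mnt/backup and /srv/data are hypothetical names.
# DRY_RUN=1 prints each command instead of executing it.
set -euo pipefail

REMOTE="backup-box"        # the PC at mom's house (VPN hostname or DynDNS)
DRY_RUN="${DRY_RUN:-1}"

run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "would run: $*"
    else
        "$@"
    fi
}

# By this point the BIOS RTC alarm should have powered the box on.
run ssh "$REMOTE" mount /mnt/backup          # mount the big disks
run rsync -az --delete -e ssh /srv/data/ "$REMOTE:/mnt/backup/data/"
run ssh "$REMOTE" umount /mnt/backup
run ssh "$REMOTE" sudo poweroff              # send the shutdown command
```

Run it from cron or a systemd timer a few minutes after the BIOS wake time, so the machine has had a chance to boot.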
Really interested in hearing how you would go about pulling this off. It sounds exactly like what I would want to do with certain shares on my Unraid server. Is it literally a script describing what folders to download through wireguard/tailscale/etc at a given moment? Or do you use something like syncthing, with the added instructions to shut the pc down when done?
More like the latter, except I use rsync (running over SSH) so as to minimize the amount of traffic. I understand syncthing works in a similar manner, but I haven't tried it out (I've heard good things about it though).
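The traffic saving comes from rsync's delta transfer: unchanged files aren't re-sent. A tiny local demo (temporary directories, no SSH involved; the real job would add `-e ssh` and a remote host, e.g. `rsync -az -e ssh /srv/data/ backup-box:/mnt/backup/data/`):

```shell
# Create a source and destination directory and sync them twice.
src=$(mktemp -d); dst=$(mktemp -d)
echo "v1" > "$src/file.txt"

rsync -a "$src/" "$dst/"        # first run copies everything
rsync -ai "$src/" "$dst/"       # second run: nothing changed, prints nothing
```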
I really thought about doing this. A Dell Optiplex has this feature in BIOS as far as I know.
So, it should be really simple from this point:
- Run a systemd timer at the time of night when the remote machine should be started
- Ping the machine on its wireguard interface until it answers
- SSH into the machine and do some basic checks
- Run backup script to do backup via borg to remote repository
- Shut down the remote machine
Optionally, keep a config file for the script on the local machine. For example, sometimes I don't want the script to shut the machine down after the backup, so that I can do some updates on it the next morning and shut it down by hand afterwards.
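Those steps could look roughly like this as a script, run from a systemd timer (e.g. `OnCalendar=*-*-* 03:00`). The wireguard address, repository path, and flag-file name are all invented, and DRY_RUN=1 (the default) just prints the commands:

```shell
#!/usr/bin/env bash
# Nightly wake-check-backup-shutdown job (all names hypothetical).
set -euo pipefail

HOST="10.0.0.2"                      # remote's wireguard address
SKIP_SHUTDOWN_FLAG="${HOME:-/tmp}/.no-shutdown-tonight"
DRY_RUN="${DRY_RUN:-1}"

run() { if [ "$DRY_RUN" = "1" ]; then echo "would run: $*"; else "$@"; fi; }

# 1. Wait for the BIOS RTC alarm to have powered the machine on
for _ in $(seq 1 60); do
    run ping -c1 -W2 "$HOST" && break
    sleep 10
done

# 2. Basic sanity checks over SSH
run ssh "$HOST" 'test -d /backup && df -h /backup'

# 3. Back up via borg to the remote repository
run borg create --stats "ssh://$HOST/backup/repo::{now}" /srv/data

# 4. Shut the machine down, unless a flag file says to leave it running
if [ -e "$SKIP_SHUTDOWN_FLAG" ]; then
    echo "flag file present, leaving $HOST running"
else
    run ssh "$HOST" sudo poweroff
fi
```

Touching the flag file the evening before gives you the "skip the shutdown this once" behaviour; delete it again when you're done with the updates.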
I do not see any issues with your configuration.
You can really use anything here: rclone with any cloud provider, Backblaze Personal, etc. You can also encrypt the data before uploading it to the cloud, so it won't be accessible to the cloud provider. As an alternative, StarWind VTL can be used to upload the data in parts.
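With rclone, the client-side encryption is done by wrapping a cloud remote in a "crypt" remote. A sketch of what the config could look like (remote names, bucket, and credentials are placeholders; in practice you'd create this interactively with `rclone config`, which also obscures the passwords):

```ini
# ~/.config/rclone/rclone.conf
[b2]
type = b2
account = YOUR_ACCOUNT_ID
key = YOUR_APPLICATION_KEY

[secret]
type = crypt
remote = b2:my-backup-bucket
password = OBSCURED_PASSWORD_FROM_RCLONE_CONFIG
```

Then `rclone sync /srv/data secret:` uploads everything encrypted, and the provider only ever sees ciphertext.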