This is an automated archive.
The original was posted on /r/datahoarder by /u/chemistocrat on 2024-01-22 19:20:23+00:00.
As part of my semi-regularly-scheduled "what can I improve/mess with" exercise, I'm looking at ways to improve my backup strategy, or at least confirm it's something resembling "good." Here's an overview of my current setup:
- DS920+ Production NAS, SHR-1, 4x 14TB HDD, ~37TB total formatted capacity
- DS220j backup NAS, JBOD, 2x 20TB HDD, ~36TB total formatted capacity
- PowerEdge R710 running bare metal TrueNAS, raidz2 6x 14TB HDD, ~51TB total formatted capacity
- 14TB WD EasyStore
The DS920+ runs Shared Folder Sync a couple of times a week to sync to the DS220j.
The R710 is powered on weekly by a plink script on another always-on machine, after which the DS220j uses it as a Hyper Backup target; the same machine powers the R710 off 12 hours later.
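For anyone curious about the plink piece, here's a minimal sketch of the idea, assuming the R710's iDRAC has SSH enabled and racadm available; the host, user, and scheduling details below are placeholders, not my actual config:

```shell
#!/bin/sh
# Sketch: power the R710 on/off via its iDRAC over SSH using plink.
# IDRAC_HOST and IDRAC_USER are placeholders; real use also needs
# credentials (plink -pw, or a cached key) on the always-on machine.
IDRAC_HOST="idrac.local"
IDRAC_USER="root"

build_cmd() {
  # $1 is the racadm action: powerup or powerdown
  printf 'plink -batch %s@%s racadm serveraction %s' \
    "$IDRAC_USER" "$IDRAC_HOST" "$1"
}

# The always-on machine's scheduler would run something like:
#   eval "$(build_cmd powerup)"      # weekly, before the Hyper Backup window
#   eval "$(build_cmd powerdown)"    # 12 hours after powerup
build_cmd powerup
```

The same shape works with any out-of-band controller; only the remote command changes.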
The DS220j has a USB Copy task that copies the entire contents of its volume to a 14TB WD EasyStore whenever the drive is connected. I keep that drive offsite and bring it home once a month.
Some improvements I'm considering:
- buying a second WD EasyStore so I can rotate between one at home and one offsite
- running both Shared Folder Sync and Hyper Backup directly from the DS920+ instead of chaining 920 > 220 > R710
- colocating the DS220j and running Tailscale from the 920+ to the 220j (not sure whether the 220j's CPU can handle WireGuard)
I'm not a data scientist; I don't even work in tech, so this is strictly a hobby for me. I'm sure I'm doing something dumb somewhere, so please keep that in mind while reading and weighing in. I'm just looking for ways to make this more robust and efficient.