I realize there are existing solutions, but I wanted my own for various reasons (a better fit for the peculiar way I store and back up).
It was straightforward to write a Python script that crawls a directory tree and adds files to an SQLite database. The script has a few commands:
- "check" computes checksums on files whose modification times have changed since last check, or on any file whose checksum is older than X days (find bitrot this way).
- "parity" Use par2 to compute parity files for all files in database. Store these in a ".par2" directory in the directory tree root so it doesn't clutter the directory tree.
I like this because I can compute checksums and parity files per directory tree (movies, music, photos, etc.) and per disk (no RAID here, just JBOD + mergerfs). Each disk corresponds exactly to a backup set kept in a Pelican case.
The SQLite database has the nice side effect that checksum/parity computation can run in the background and be interrupted at any time (it takes a loooooooong time). The commits are atomic, so if the machine crashes or has to shut down, it's easy to resume from where it left off.
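The parity pass follows the same pattern, which is where the interrupt-anywhere property falls out for free: one commit per file, so a crash or ctrl-C loses at most the file in flight. (Sketch assumes a par2 column added to the files table above; -r is par2cmdline's redundancy percentage.)

    import hashlib, os, subprocess

    def parity(conn, root, redundancy=10):
        par_dir = os.path.join(root, ".par2")
        os.makedirs(par_dir, exist_ok=True)
        cur = conn.cursor()
        # only rows without parity yet, so an interrupted run picks up where it stopped
        for (path,) in cur.execute("SELECT path FROM files WHERE par2 IS NULL").fetchall():
            # one .par2 set per file, named by a hash of the path to avoid collisions
            out = os.path.join(par_dir, hashlib.sha256(path.encode()).hexdigest() + ".par2")
            subprocess.run(["par2", "create", "-r" + str(redundancy), out, path], check=True)
            cur.execute("UPDATE files SET par2 = ? WHERE path = ?", (out, path))
            conn.commit()  # atomic per file, safe to kill at any point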
Surely.... SURELY... someone has already written this. But it only took a couple of afternoons to roll my own, and now I have parity and the ability to detect bitrot on all live disks and backup sets.
For the price of two 1TB microSD cards, you could get a 4TB SSD and an enclosure. That's more than enough room for your data plus enough parity info to repair whatever gets damaged. It's what I do. The SSD is reliable as long as it's powered on often enough, and for long enough, to do its internal refreshing. Still, keep a backup somewhere (cloud, disk, etc.), but the SSD is fine, imho, and vastly superior to microSD.