this post was submitted on 28 Jun 2023
12 points (100.0% liked)

datahoarder

Whenever I wipe my PC, I use tar to make an archive of the whole system. This works, but having to decompress the whole archive to pull files out is very annoying. Is there another archive format that:

  • Preserves permissions (i.e., is Unix-y)
  • Supports strong compression (I use either zstd or xz depending on how long I can be bothered to wait)
  • Supports pulling out individual files quickly
top 8 comments
[–] [email protected] 7 points 1 year ago

You don't need to extract the whole thing if you use plain tar. The only reason you have to here is that you're layering zstd/xz on top of it.

Use tar as is. It's what it's made for.
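
A minimal sketch of that approach, with placeholder paths; skipping the compression layer lets tar pull out a single member without decompressing the whole archive:

Create an uncompressed archive:

tar -cf backup.tar -C / etc home

List what's inside:

tar -tvf backup.tar

Extract one file (use -p to restore permissions):

tar -xpf backup.tar etc/fstab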

[–] [email protected] 6 points 1 year ago

Borg might be an option. I haven't backed up an entire system with it yet, though, only selected files.

  • In my case, file permissions have always been preserved correctly when restoring files.
  • The compression algorithm (LZ4, zlib, LZMA, or zstd) and the compression level can be chosen when creating a backup.
  • Backups can be mounted via FUSE, so you can restore individual files with a file manager or from a terminal, for example (rough sketch below).
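
Not the commenter's exact setup, just a rough sketch of those three points; the repo path, archive name, and source paths below are placeholders, so check the Borg docs for your version.

Initialize a repository:

borg init --encryption=repokey /path/to/repo

Create an archive with zstd at level 10:

borg create --compression zstd,10 /path/to/repo::system-2023-06-28 /etc /home

Mount it via FUSE, copy out individual files, then unmount:

borg mount /path/to/repo::system-2023-06-28 /mnt/borg
borg umount /mnt/borg
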
[–] [email protected] 5 points 1 year ago

At least on the Mac (bsdtar) you can extract single files out of a tar file.

E.g.,

Create the tar file:

tar cvzf pseudo.tgz pseudo/

Move to another directory:

cd /tmp/tt

Extract a single file:

tar -xf ../pseudo.tgz pseudo/10481_2017.1069.png

You say PC, so you might want to check which tar version you're using and whether it needs any extra parameters for single-file extraction.
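
For what it's worth, GNU tar on Linux behaves the same way, and recent versions can also handle zstd directly; a sketch with placeholder names (note that with a compressed archive tar still has to read through the stream up to the requested member, since there is no index):

Create a zstd-compressed archive:

tar --zstd -cf backup.tar.zst -C / etc home

Extract a single file from it:

tar --zstd -xf backup.tar.zst etc/fstab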

[–] [email protected] 3 points 1 year ago

Take a look at DAR: http://dar.linux.free.fr/

It has many advanced features and has been actively developed for over a decade.
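
Roughly, and hedging because the exact flags vary by version (the paths and archive name below are placeholders), a DAR round trip looks like this:

Create a compressed archive of the whole filesystem:

dar -c /backups/system -R / -z

List its contents:

dar -l /backups/system

Restore a single path into a scratch directory:

dar -x /backups/system -R /tmp/restore -g etc/fstab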

[–] [email protected] 3 points 1 year ago

Take a look at squashfs. This creates a compressed archive that can be mounted as a read-only filesystem to pull out individual files. It is very fast and likely already installed on your system.
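
A short sketch, assuming squashfs-tools built with zstd support and using placeholder paths:

Create the compressed image (run as root so it can read everything):

mksquashfs /home /backups/home.sqfs -comp zstd -Xcompression-level 19

Mount it read-only and copy out individual files:

sudo mount -t squashfs -o loop /backups/home.sqfs /mnt/backup

Or extract selected files without mounting:

unsquashfs -d /tmp/restore /backups/home.sqfs user/notes.txt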

[–] [email protected] 2 points 1 year ago

Borg or restic, since they do deduplication.

My biggest data regret is rsync-ing or tar-ing up my systems to my fileserver as a backup mechanism. So much wasted space. Extremely difficult to find anything. Impossible to properly organize. These backup solutions improve the situation tremendously.
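
Not claiming this is the commenter's setup, but a minimal restic sketch with placeholder paths shows the idea; repeated backup runs only store chunks that are not already in the repository:

Initialize the repository:

restic init --repo /srv/backups/repo

Back up (deduplicated against earlier snapshots):

restic --repo /srv/backups/repo backup /etc /home

Restore a single file from the latest snapshot:

restic --repo /srv/backups/repo restore latest --target /tmp/restore --include /home/user/notes.txt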

[–] [email protected] 1 point 1 year ago

I use bupstash.io

[–] [email protected] 1 point 1 year ago

When I wipe my PC I always use Clonezilla. I have a separate /home partition, and I usually copy /etc into my user's home directory just before cloning. I'd say you should give it a try.
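
The /etc copy from that workflow is a one-liner; a hedged example, assuming a user named user and a dated target directory:

sudo cp -a /etc "/home/user/etc-backup-$(date +%F)"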