101
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/MrRayBloodyPurchase on 2024-01-22 11:38:57+00:00.


Hi there,

I'm looking for a 2TB 2.5" drive to install in a Mac mini. I'm deciding between the Seagate BarraCuda (ST2000LM015) and the Toshiba L200; any thoughts on which one will last longer? I'm planning to use it for file storage.

102
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/Content-Apple-833 on 2024-01-22 10:04:52+00:00.


I'm trying to find an app (or whatever) that will let me do the following from my Android phone, with the data living on my TrueNAS server. If it's simpler, it can be accessible only from within my home network, but ideally I'd like it to work remotely. The three requirements are:

  • automatic backup of photos from the phone, plus Google Photos-style scrolling and browsing

  • on-demand upload and download of other documents, such as PDFs, from the phone

  • while at home, access to all the photos and documents from a PC over a network share in Windows Explorer

NextCloud seems to be the thing people recommend, but I haven't had much luck getting it to work the way I want so far. Between the changes across TrueNAS versions, the differences between SCALE and CORE, the official app versus the TrueCharts app, and the conflicting documentation and videos, it really seems more complicated than it needs to be. I got a simple container working, but then I couldn't figure out how to access the documents via SMB, either internally or through the External Storage app.

I've been playing with Proxmox on a different box, and its VM/CT support seems better than TrueNAS's, but at this point it's still a learning exercise, not something "home production". I.e., I'd have preferred to get something known to work on TrueNAS, since I'm more familiar with it, and then separately experiment with how to install, configure, and refine NextCloud on Proxmox.

I guess I could try a Proxmox solution, but for simplicity I was hoping the data would live on the same box as the NextCloud server. I'd also need a backup/recovery plan for the Proxmox box, which I don't have now since it's just a test system.

Thoughts? Advice?

103
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/thenormal on 2024-01-22 09:46:46+00:00.


Hello everyone. I run a website dedicated to a band where we upload all of the bootlegs we can hoard. I use file hosting services to store the releases, such as Google Drive and Mega.

I've recently begun considering using Archive.org's platform to back up those bootlegs, the main reason being that if one of the aforementioned file hosting services ceases to exist, Archive.org should offer longer retention for the uploaded content.

The issue is that I'm currently struggling with their platform. While it's easy to navigate and browse, their upload service is one of the worst I've ever seen. The main problem I've encountered is the upload speed: I've done some research, and apparently they cap uploads at about 2 Mbit/s if you're in the USA and 1 Mbit/s if you're from another country (the latter is my case).

While uploading smaller files works OK-ish, the bigger ones are a struggle. I've measured my upload speed to their servers and it stabilizes at around 700 kbit/s, which means it takes about 10 minutes to upload a 50 MB file.
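
(As a quick sanity check on those numbers: 50 MB is 400,000 kbit, and 400,000 kbit ÷ 700 kbit/s ≈ 570 seconds, roughly 9.5 minutes, so the ~10-minute figure is consistent.)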

I've also tried splitting the bigger files into smaller .zip archives, but apparently their system rejects them as spam. My idea was to start by uploading one or two zip files and then use the "my uploads" section to add more later on, a strategy that didn't work at all.

I'm curious about your experience. I think it's a real shame their platform is so lacking when it comes to uploading, considering they literally invite you to use it to help build the biggest, most reliable online library.

104
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/danielrosehill on 2024-01-22 11:46:31+00:00.


Hi folks,

As some of you may know (I feel like I've posted enough here!) I use optical media (specifically the M-Disc) to archive my personal data - mostly videos, audio files, photos and - occasionally - important documents.

My current system is burning every archive disc twice: one copy is kept onsite and the other is 'queued' for transfer to my offsite library. The offsite library is located at my in-laws' place in the US (I'm on a different continent ... and yes ... I guess I'd better stay married to this girl for the rest of my life if I want to keep my backup data 😂).

I do the 'offsiting' process myself, by which I mean I put a binder full of CDs into my checked luggage. But I've thought it might be worth trying to mail some of my CDs over instead. I have hopes that my father-in-law will join the 'datahoarding movement' (he seems interested in M-Disc), so perhaps he'd do me the favor of taking a UPS delivery from me a few times a year.

I'm wondering: has anybody done something like this? And if so, any thoughts on how best to package whatever physical media you're sending, whether optical media, HDDs, SSDs, LTO tapes, or really whatever?

TIA

105
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/Nokita_is_Back on 2024-01-22 10:44:28+00:00.


I bought an HP EliteDesk 800 G2 Tower with an i7-6700 and got two Optane drives from a friend who doesn't need them anymore. One is an H10 (32GB Optane plus a 512GB NVMe SSD); the other is a standalone 32GB Optane.

TrueNAS SCALE is installed on the EliteDesk, and I wanted to use the Optanes for cache (L2ARC and SLOG, i.e. read and write). I connected them to the motherboard via PCIe adapters (x16 and x4).

The standalone Optane is throwing a 313 SMART error, but the H10 (32GB Optane + 512GB SSD) is accepted by TrueNAS SCALE and the BIOS as an NVMe drive and is working.

Is the system ignoring the Optane part of the H10 and just using the NVMe SSD? Can I test this somehow? Is the SMART error more likely to be hardware related (the standalone 32GB Optane failing) or Optane compatibility related?

System: BIOS 2.6, EliteDesk 800 G2 Tower, TrueNAS SCALE (Debian-based)

106
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/goal_dante_or_vergil on 2024-01-22 10:42:43+00:00.


I have spent the last few hours browsing this sub, and I'm confused because I've received conflicting information. Some people say external hard drives are bad and that it's better to buy an internal hard drive and put it in an enclosure with a fan. Others say that for the average user a regular external will be sufficient.

I use my external hard drives to store the movies, TV shows, manga, anime, and video games that I download via torrent. Once they finish downloading and have seeded to a ratio of at least 2.0, I stop and transfer them to my external hard drives. So my external hard drives are not plugged in and running 24/7.

Only when there's a movie or TV show I want to watch do I connect a drive to my PC and play it off the external. In those instances the external is connected and running for several hours while I watch, but once I finish, I always disconnect it before shutting down my PC. So my externals are not plugged in and running overnight.

If there's a video game I want to play, I connect the external to my PC and install the game; once it's installed, there's no longer any need to keep the drive connected, so I always disconnect the external. The drives aren't running while I'm gaming all day.

So for my usage, is it better to get an external, or an internal in an enclosure with a fan?

107
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/Random7321 on 2024-01-22 10:08:07+00:00.


Does anybody know the difference between these two HDD models? (They're almost the same price here.)

HUH728080ALE601

HUH728080ALE604

And are they better or worse than this drive, the HUS728T8TALE6L4?

108
Refurbished HDDs
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/Loitering14 on 2024-01-22 09:57:00+00:00.


I'm expanding my NAS, and while looking for deals on eBay I found a seller offering four 2TB BarraCudas for less than €100. I contacted him: there are no bad sectors, and all of them have about 10,000 hours of power-on time. Is it a good deal, or would I end up with a pile of crap?
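
(For reference, four 2TB drives for just under €100 works out to roughly €12.50 per terabyte.)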

109
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/dingdongs44 on 2024-01-22 09:18:15+00:00.


What deal tracking sites or subreddits do you guys use to get better discounts on storage tech?

110
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/testicularbat on 2024-01-22 07:14:48+00:00.


So I am done with hard drive noise. I can't take it anymore. I live in a tiny apartment, and I can hear the damn thing purring as torrents write while I sleep.

All I want is a total of 15TB in whatever config... No need for backup (I'll use the HDDs for cold backup).

I could get four 4TB NVMe drives, but that's about $1,200, or one of those 16TB Intel SSDs for $500, but I've heard they go bad?

Speed is not important; it's for a home Plex server.

111
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/sebsnake on 2024-01-22 05:37:39+00:00.


I'm a small hoarder, currently sitting at 3TB used on a 16TB pool (mostly small files < 20MB), plus a larger pool with around 40TB used. I'm using TrueNAS; the larger pool is raidz2 with 8x12TB drives in one vdev, the smaller is 7x4TB drives in one vdev... I need to optimize this (mostly for power consumption and noise), and I need to move the data to my backup device first (which I haven't kept in sync for about a year... -.-).

So, hardware-wise, what is your way of moving large amounts of data to another device? I currently only have 1Gbit NICs, so I'm capped at about 100MB/s, although the drives could probably deliver at least a little more. Is there some reliable USB-to-USB solution, or are there good USB WiFi NICs anyone could recommend for wireless transfer? Or even 2.5Gbit USB NICs that won't break under this load? Are these reliable for large amounts of continuous transfer?
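
(For a sense of scale: moving ~40TB at 100MB/s is 40,000,000 MB ÷ 100 MB/s = 400,000 seconds, about 4.6 days of continuous transfer. A 2.5Gbit link at roughly 280MB/s would cut that to around 1.7 days, assuming the drives keep up.)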

Bonus question, halfway related: if I do set up TrueNAS anew, is there some way to keep a single pool offline while the system is running, with just regular scrubs and checks on it? Because then I would use it as a secondary backup, where I could internally move files back if something breaks in the future.

112
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/mxl555 on 2024-01-22 04:53:53+00:00.


The interface says SATA; does anyone know why it wouldn't work?

113
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/harritaco on 2024-01-22 02:06:54+00:00.

114
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/road_hazard on 2024-01-22 01:40:24+00:00.


I was looking at ways to quiet down my SC846 and stumbled across Mr. Gibson's page. It looks like exactly what I need. BUT, he says he's pausing orders until mid-December, and I can't find any way to contact him through his page.

Anyone know how to get in touch with him to see when we'll be able to order items?

115
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/HTWingNut on 2024-01-22 01:00:01+00:00.


I did a thing! I wrote my own PowerShell hashing script called... POWERHASH! (Cue the dramatic music: PowerShell + Hash = POWERHASH.) It does more than just hash your files, though.

Windows does a crap job of offering built-in tools to validate your data, so you have to resort to third-party tools. Unfortunately, the third-party options are limited, and many of the older ones are no longer maintained or supported, or cost money. So I decided to write my own. I am not a developer or programmer, so pardon the mess, but as far as I can tell the script is fully functional and efficient. I'm open to feedback from testing, however.

The biggest thing I wanted from a hashing program is the ability to update the hashes of changed or added files in a folder without having to rehash the entire folder. I did not find this feature in the handful of programs I evaluated, so I implemented it: you can hash some files, then run the UPDATE function to refresh hashes and remove the hashes of files deleted from the folder. It can do more than that as well; see below for the features.
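
For anyone curious how that kind of incremental update can work in principle, here's a minimal PowerShell sketch of the idea. To be clear, this is not the actual POWERHASH code (POWERHASH uses its own log format; this sketch assumes a simple CSV log with Path, Hash, Size, and ModifiedTicks columns):

# Minimal sketch of incremental hash updating, NOT the actual POWERHASH code.
$logFile = "hashlog.csv"
$root    = "D:\DATA"

# Load the previous log (if any) into a lookup table keyed by path.
$oldByPath = @{}
if (Test-Path $logFile) {
    foreach ($e in Import-Csv $logFile) { $oldByPath[$e.Path] = $e }
}

# Walk the folder: reuse stored hashes for unchanged files, rehash the rest.
$updated = foreach ($f in Get-ChildItem $root -File -Recurse) {
    $prev = $oldByPath[$f.FullName]
    if ($prev -and
        [long]$prev.Size -eq $f.Length -and
        [long]$prev.ModifiedTicks -eq $f.LastWriteTimeUtc.Ticks) {
        $prev    # same size and mtime: keep the stored hash
    } else {
        [pscustomobject]@{    # new or changed file: hash it
            Path          = $f.FullName
            Hash          = (Get-FileHash $f.FullName -Algorithm SHA256).Hash
            Size          = $f.Length
            ModifiedTicks = $f.LastWriteTimeUtc.Ticks
        }
    }
}

# Entries for files deleted from the folder simply drop out of the new log.
$updated | Export-Csv $logFile -NoTypeInformation -Force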

The project was first started using the stock PowerShell 5.1 that comes with Windows, because I like to use stock apps whenever I can. But I quickly realized that PS 5.1 is antiquated, while the latest PowerShell 7 (Core, currently version 7.4.1) is much better: it offers a ton of cmdlets that save you from loading or writing your own functions, and it's very efficient at file comparisons. It's free and easy to install, too.

****** FEATURES ******

So what does PowerHash do?

  • Generate SHA256 (or MD5) hashes of files in a folder recursively (makes use of the Get-FileHash cmdlet)
  • Omit folders and files from the checksum operation using separate exclusion keywords for folders and for files
  • Update an existing hash log with only the files that have changed or been deleted from a folder, so a full rehash of all files is not needed
  • Scrub a folder against an existing POWERHASH log file to check for discrepancies
  • Compare two log files for discrepancies (so you can hash two locations, then scrub them based on the log files)
  • Find duplicate files based on matching hashes
  • Run from an interactive menu or with command-line flags (so you can schedule updates and scrubs with Task Scheduler; see the example below)
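
For instance, a weekly scheduled update might look something like the following. This is a hypothetical sketch using Windows' built-in ScheduledTasks cmdlets; the paths, log name, and task name are placeholders, not anything POWERHASH ships with:

# Register a weekly POWERHASH update (run from an elevated prompt).
# All paths and names below are placeholders; adjust to your setup.
$action  = New-ScheduledTaskAction -Execute "pwsh.exe" `
    -Argument '-File C:\tools\powerhash.ps1 -update -log "C:\logs\SHA256_DATA.log" -path "D:\DATA"'
$trigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Sunday -At 3am
Register-ScheduledTask -TaskName "POWERHASH weekly update" -Action $action -Trigger $trigger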

****** REQUIREMENTS ******

PowerShell 7 (Core) is required to run this script, but it's free and easy to install. Instructions here:

Or just type in cmd prompt or powershell prompt: winget install --id Microsoft.Powershell --source winget

This will install the latest PowerShell version (currently 7.4.1 as of this writing, which is what this program was validated with).

****** DOWNLOAD ******

While I recommend you read my diatribe on how to use the program, you can download it from github:

It's currently considered BETA. You can grab powerhash.ps1 over on the right, under "Releases".

I have a video on how to use the program as well here:

****** INSTRUCTIONS ******

The program runs both from an interactive menu and in command-line mode, via a cmd or PowerShell prompt. It's best to CD to the folder where powerhash.ps1 resides and run it from there.

To run: pwsh .\powerhash.ps1

By default it will generate SHA256 hashes of files, but you can use the -MD5 flag to have it generate and manage MD5 hash checksums instead (i.e. pwsh .\powerhash.ps1 -md5)

This will take you to the interactive menu. I will first take you through the interactive menu, then show you how to run the same commands using command line mode.

It may be a good idea to have a Windows Explorer window open so you can drag folder and file names into the command window to save keystrokes. Just remember to click back into the command window after dragging names/paths over.

You can get command-line help at any time by typing pwsh .\powerhash.ps1 -help, or get a full readme with the -readme flag. You can also combine -readme with the specific function you want; the available readmes are listed in the -help output.

The script will generate a MAIN HASH LOG containing the SHA256 hash, file name, file size, and last-modified date of every file hashed. The MAIN HASH LOG is set to READ ONLY and uses the naming convention:

SHA256_[FOLDER NAME]_[DATE TIME STAMP].log
Example: User enters folder D:\DATA, the log file will be 'SHA256_DATA_20240117_190211.log'

Let's call this '[hashlog].log'

You can rename [hashlog].log to whatever you want after it's made, but the file will be set to READ ONLY. It should not be hand-edited, or it may stop working properly with the script.

All operations that touch the MAIN HASH LOG are summarized in the '[hashlog]_history.log' file. You can add your own notes to this file if desired, as it's for reference only.

There will be other supplementary log files generated depending on what function you use. The log files that can be generated are:

   HISTORY: ‘[hashlog]_history.log’     Maintains summary of all actions performed on [hashlog].log
   UPDATED: ‘[hashlog]_updated.log’     Details of file changes when using ‘UPDATE’ fn
   COMPARE: ‘[hashlog]_compare.log’     Details of results when comparing two logs ‘COMPARE’ fn
     SCRUB: ‘[hashlog]_scrub.log’       Details of results after running ‘SCRUB’ fn
DUPLICATES: ‘[hashlog]_duplicates.log’  Details of duplicate files list after running ‘DUPLICATES’ fn
EXCLUSIONS: ‘[hashlog]_excluded.log’    Lists files excluded by the exclusion keywords set by the user
  PREVIOUS: ‘[hashlog]_previous.log’    Copy of [hashlog].log as “undo” after running ‘UPDATE’ fn

****** MAIN MENU ******

The Main Menu will present you with multiple options:

=POWERHASH SHA256= by HTWingNut v2024.01.17
Type q from any menu to return here

Choose from the following:
 [G]enerate New SHA256 Hash Log
 [U]pdate Hash Log
 [C]ompare Hash Logs
 [S]crub Folder with Log
 [D]uplicate File Check
 [Q]uit

CHOICE:

****** GENERATE (OR CREATE) ******

BEFORE YOU CAN DO ANYTHING with any of the other functions, you must create a MAIN HASH LOG ([G] in the menu, or the -create flag on the command line). You point the program at the folder you want to generate hashes from.

You can then assign any folder and file exclusion parameters you want. These take the form of keywords or keyphrases and must be entered in a specific format. Folder and file exclusions are independent of each other: folder means the path without the file name, file means just the file name and extension.

The format to use is single quotes around each keyphrase, with multiple keyphrases separated by commas:

Enter FOLDER Exclusion: '\LinuxISO','Temp','\Recycle Bin'

No wildcards are allowed, and matching IS NOT case sensitive. This example would exclude any path starting with \LinuxISO, any folder path containing the word 'Temp', and any path starting with '\Recycle Bin'.

Enter FILE Exclusion: '.pdf','Thumbs.db'

This would exclude any file with a .pdf extension (or with '.pdf' anywhere in the file name) and anything named Thumbs.db.

Command Line

To create this log from command line use:

pwsh .\powerhash.ps1 -create -path "D:\Data"

if you want to add exclusions:

pwsh .\powerhash.ps1 -create -path "D:\Data" -excludefolders "'\LinuxISO','Temp','\Recycle Bin'" -excludefiles "'.pdf','Thumbs.db'"

Note the exclusion lists have to be wrapped completely in double quotes.

The results will be stored in '[hashlog].log'.

If exclusions were added it will generate a '[hashlog]_exclusion.log' file which will list the file names of all files excluded from hashing along with the rule that excluded it.

And of course everything will be summarized in the '[hashlog]_history.log' file.

****** UPDATE HASH LOG ******

This function allows you to update an existing hash log with only the files that have changed, are new to the folder, or have been deleted from the folder. It will also flag potentially renamed or moved files. You have to provide the log file to update and the folder path to scan for updated files.

Folder and file exclusions can also be added, modified, or removed at this time. The update will remove any existing log entries that match the exclusion profiles provided by the user.

Command Line

The command line version of this would look something like this:

pwsh .\powerhash.ps1 -update -log "SHA256_DATA_20240117_190211.log" -path "D:\DATA"

And like the create/generate hash prompt you can exclude files as well.

pwsh .\powerhash.ps1 -update -log "SHA256_DATA_20240117_190211.log" -path "D:\Data" -excludefolders "'\LinuxISO','Temp','\Recycle Bin'" -excludefiles "'.pdf','Thumbs.db'"

If you want to clear exclusions from command line you can use the '-excludeclear' command:

pwsh .\powerhash.ps1 -update -log "SHA256_DATA_20240117_190211.log" -path "D:\Data" -excludeclear

The main log file '[hashlog].log' will be updated with any changes detected in the folder.

Details of the update will be stored in '[hashlog]_updated.log'.

And of course everything will be summarized in the '[hashlog]_history.log' file.

****** SCRUB FOLDER ******

The...


Content cut off. Read original on https://www.reddit.com/r/DataHoarder/comments/19cj4bb/powerhash_a_powershell_sha256md5_hashing_script/

116
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/undeadartist1 on 2024-01-22 00:50:12+00:00.


I have a couple thousand saved posts/reels and would like to download the media, but the data export only put my saved items into a JSON/text file. Instead of logging in with a third-party app or downloading manually, is there a good way to export the images/videos/posts from the URLs using Python or some other method?
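
One approach is a small script that walks the export file and fetches each URL. Here's a rough PowerShell sketch of the idea; the file name and JSON property names ('saved_saved_media', 'url') are assumptions, so inspect your own export and adjust the property paths (and note that media URLs in older exports can expire):

# Rough sketch: pull media URLs out of an Instagram saved-posts JSON export
# and download them. The property names here are assumptions; check your file.
$export = Get-Content "saved_posts.json" -Raw | ConvertFrom-Json
$outDir = "InstaSaved"
New-Item -ItemType Directory -Path $outDir -Force | Out-Null

$i = 0
foreach ($item in $export.saved_saved_media) {
    $url = $item.url    # adjust to wherever the URL actually lives
    if (-not $url) { continue }
    $ext  = [System.IO.Path]::GetExtension(([uri]$url).AbsolutePath)
    $dest = Join-Path $outDir ("{0:d5}{1}" -f $i, $ext)
    try   { Invoke-WebRequest -Uri $url -OutFile $dest }
    catch { Write-Warning "Failed: $url" }
    $i++
    Start-Sleep -Milliseconds 500    # be gentle; avoid hammering the CDN
}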

117
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/Alarmed-Gazelle3369 on 2024-01-21 22:22:17+00:00.


118
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/IvanezerScrooge on 2024-01-21 22:12:24+00:00.


Without really needing to, I was looking at my future expansion possibilities. In my current chassis I have run out of physical space, aside from a 5.25" bay in the front which I have yet to populate. I was thinking that in the future I could use a 5.25" to 4x2.5" cage, and was looking at SSDs that could be put in there.

I keep seeing that the cheaper SSDs often have poor write endurance, such as 700TBW for the Crucial BX500 compared to 2,800TBW for an IronWolf 125. But do I even need to care about this? I do write to my pool every day, but they're not large writes. When I do do large writes, they're usually only a few gigs.

But I practically never delete anything, and I assume this applies to most of us hoarders. So I can't imagine I could ever come close to 700TB of writes on a 2TB drive. I suppose the games and whatnot I keep on there do get updates, but still, 700TB?
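
(To put that in perspective: even at a steady 50GB of writes every day, 700TBW works out to 700,000GB ÷ 50GB/day = 14,000 days, or roughly 38 years.)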

(This is for a TrueNAS box, so ZFS.) Am I missing something?

119
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/Known_Alternative565 on 2024-01-21 21:31:27+00:00.


I am not sure if this is the right sub-reddit to ask, so please tell me if this post is off topic.

Recently, all educational accounts' storage (Google, O365) got limited. Those accounts used to offer large amounts of cloud storage, but they're now capped at 100GB or 20GB.

As I'm moving all my stuff to an external HDD, I was wondering: what do you guys use for cloud storage? There must be some files that are convenient to keep in the cloud for paperwork.

Furthermore, I've had personal experience with an HDD just suddenly dying, so I prefer SSDs, but then again the cost difference is hard to miss. Are recent HDDs less fragile?

120
HDD cage
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/Silent_Bus_8510 on 2024-01-21 20:20:13+00:00.


I currently have 12 16TB HDDs and plan to buy another 6. I have them mounted in a simple plastic cage that I bought; could you recommend a proper case to mount them in?

121
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/ThreeLeggedChimp on 2024-01-20 15:11:56+00:00.

122
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/fumblesmcdrum on 2024-01-21 23:59:02+00:00.


I've been thinking about building a NAS and got the following for free:

  • no-name case (four 5.25" drive bays + three 3.5" internal bays), 80mm exhaust fan
  • Motherboard: Asus A88XM-Plus: micro-ATX, FM2+ socket w/ 8 SATA 6Gb/s ports, 64GB max RAM
  • RAM: 8GB DDR3 1333MHz (2x4GB, Kingston KVR)
  • CPU: AMD A8-5600K (3.6GHz, 4 cores / 4 threads)
  • CPU cooler: stock low-profile with 80mm fan
  • Boot drive: Crucial BX500 240GB
  • PSU: generic brand, swapped out for a Corsair RM650x

This feels pretty ancient, but the 8 onboard SATA ports made me think it could be a good platform for exploring a first NAS build. I'm planning to put TrueNAS SCALE on it.

Question: Is this a worthwhile project, or should I invest in more modern hardware?

I'm mostly looking to use this as a storage device. I might try putting a media server on it once things are set up properly, but I have no idea how capable the A8 CPU is at transcoding (though I'd only be sending things to a 4K TV). Also, I have no idea if there are better CPUs I could chuck in this thing. The 5600K's 100W TDP seems like a lot in this case, but... it was free. Can anyone recommend an upgrade path here?

Speaking of upgrades, here were some new purchases I was considering:

  • More RAM: I have 4 DIMM slots and a 64GB cap, so I'll probably look at picking up some 8GB 1600MHz sticks. Not sure if I can find unbuffered ECC 16GB sticks compatible with this motherboard (I can't seem to find any non-ECC DDR3 16GB sticks either).
  • Drives, obviously.
  • Hot-swap caddies for the 5.25" drive bays. Any recommendations?
  • Quieter exhaust fans. Any recommendations? I was looking at Noctua 80mm fans but realized I don't know anything about this.

Thanks!

123
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/lilfishgod on 2024-01-21 23:58:52+00:00.


So I'm a photographer for a large construction company. We need a free DAM (digital asset management) tool for tagging and keywording photos. I use a Mac and my co-worker uses Windows, so I need something that works on both. Any suggestions?

124
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/AlienBoy_tw on 2024-01-21 23:26:22+00:00.


I've been with CrashPlan for almost a decade, but my storage needs have far outgrown CrashPlan's ice-age speeds. What's your experience with Backblaze vs IDrive, specifically their backup and restore speeds in the US?

I currently have 4TB of data and growing. As a content creator I steadily produce sizable data that needs versioning and backup, and I have a budget of around $10/mo.

I'm specifically comparing the speed of these two services, not other options. But if you feel strongly about suggesting others, I won't stop you ;) Also, I'm mainly looking at their "standard" services rather than B2, e2, or other object-storage solutions, mainly because of the costs and, I admit, because I'm not very familiar with how that workflow and technology work.

Thanks in advance!

125
 
 
This is an automated archive.

The original was posted on /r/datahoarder by /u/EquivalentTip4103 on 2024-01-21 23:04:07+00:00.


Hi all.

I want to start archiving family photos and videos for long-term storage using 100GB M-Discs (1.5TB in total so far), and I've seen the Asus BW-16D1HT internal drive.
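
(For scale: 1.5TB at 100GB per disc is 15 discs, so a single 20-disc spindle would cover one full copy with a few spares.)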

Has anyone used it? Any issues?

I've also seen 20-disc spindles of 100GB discs on eBay from Japan for £95, yet others at the same price for only 5 discs in cases. Any reason why there's such a massive difference?

Thanks.
