WindowlessBasement

joined 1 year ago
[–] [email protected] 1 points 11 months ago

I have tried using the in-built Pagination API to retrieve all relevant domain entries by splitting them into blocks but, due to the way the filters are applied, this only tells me if the entry is in the current block and I have to search each one manually. I have basically no coding knowledge

Short answer: what you're asking for would take a program requesting data (the whole Internet Archive?) non-stop for a month or more. You're going to need to learn to code if you want to interact with that much data.

I definitely don't have the ability to automate the search process for the paginated data.

You're going to need to automate it. A rate-limiter is going to kick in very quickly if you are just spamming the API.

explain to me like I'm 5

If this is a project you're tackling, you'll need to learn it for yourself. You'll also need to familiarize yourself with the archive's terms of service, because most services would consider scraping every piece of data they have abusive and/or malicious behavior.
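If you do go down that road, the usual pattern is to pull one page at a time and sleep between requests so the rate-limiter never triggers. A minimal sketch in Python; the URL and parameter names are placeholders, not the archive's actual API:

```python
import time

import requests

BASE_URL = "https://example.org/api/entries"  # placeholder, not the real endpoint
DELAY = 1.0  # seconds between requests; be polite to the server


def fetch_all(query):
    """Walk every page of results for `query`, backing off when rate-limited."""
    results = []
    page = 1
    while True:
        resp = requests.get(BASE_URL, params={"q": query, "page": page}, timeout=30)
        if resp.status_code == 429:  # rate-limited: wait, then retry the same page
            time.sleep(30)
            continue
        resp.raise_for_status()
        batch = resp.json()
        if not batch:  # an empty page means we've reached the end
            return results
        results.extend(batch)
        page += 1
        time.sleep(DELAY)
```

Even at one request per second, millions of entries works out to weeks of wall-clock time, which is the point above.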

[–] [email protected] 1 points 11 months ago (1 children)

The error message tells you what to do.

If this is just a random error you're unconcerned by (recent power outage, flaky cables, etc.), you can clear the errors. If you believe the drive is failing, you can replace the drive. The array will remain in a degraded state until you make a decision.
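For reference, assuming a ZFS pool (the post doesn't say, and mdadm has its own equivalents), the check/clear/replace steps look roughly like this; the pool and disk names are placeholders:

```python
import subprocess

POOL = "tank"  # placeholder pool name; everything here assumes ZFS

# See which pools are unhealthy and what the errors actually were.
subprocess.run(["zpool", "status", "-x"], check=True)

# One-off glitch (power loss, loose cable)? Reset the error counters:
# subprocess.run(["zpool", "clear", POOL], check=True)

# Drive really failing? Replace it and let the pool resilver:
# subprocess.run(["zpool", "replace", POOL, "old-disk", "new-disk"], check=True)
```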

[–] [email protected] 1 points 11 months ago

Lossless compression doesn't exist for video. It's mathematically impossible.

[–] [email protected] 1 points 11 months ago

How long is a piece of string?

[–] [email protected] 2 points 11 months ago (1 children)

I have 7-8tb of vital info on that drive that I need to get off

If it's vital, it should already have a backup.

You don't always get warning signs, especially with a laptop or portable drive. They can fall off a table at any point and never get back up.

[–] [email protected] 1 points 11 months ago (1 children)

Usually, I'd dig deep and research more thoroughly, but time's running out with a website launch goal and the need to focus on customer service and daily business operations.

Like every other business: either make time or hire someone to do it.

Asking the community to basically write you an IT plan is ridiculous.

[–] [email protected] 1 points 11 months ago

Blu-rays are only region-locked in software. Ripping software isn't even going to check the region code.

Even in VLC, the region code is just a setting option.

[–] [email protected] 1 points 11 months ago

Terminal.

With that many files, you should be using CLI tools.
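As a taste of why: a few lines of script can do what would take hours of dragging things around in a GUI. A toy Python example (the directory is a placeholder) that files everything into folders by extension:

```python
from pathlib import Path

src = Path("/data/unsorted")  # placeholder directory

for f in list(src.iterdir()):  # snapshot the listing before moving anything
    if not f.is_file():
        continue
    ext = f.suffix.lstrip(".").lower() or "no_extension"
    dest = src / ext
    dest.mkdir(exist_ok=True)
    f.rename(dest / f.name)
```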

[–] [email protected] 1 points 11 months ago

Some of the responses here are making me reconsider that statement.

[–] [email protected] 1 points 11 months ago

How is that different from feeding one big MKV into HandBrake?

 

Over the last couple of weeks there have been a few threads of people insisting on using DVD Decrypter, and I was wondering: why are people still using it? Datahoarding tends to attract relatively technical people, so there must be some reason to keep using software that hasn't been updated since Windows XP was modern.

MakeMKV seems like the better option in every use case except full backups. However, a full DVD image can be made with any imaging software, or even just dd. Any player that can handle the DVD menus from an ISO is going to be able to decrypt during playback.
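For anyone curious, the dd approach is nothing more than a raw block copy of the disc device. A Python equivalent, with the device path and output name as assumptions for a typical Linux box:

```python
import shutil

# Raw block copy of the disc to an ISO -- the same thing as
# `dd if=/dev/sr0 of=movie.iso`. CSS stays intact on the image;
# the player decrypts at playback time, as noted above.
with open("/dev/sr0", "rb") as disc, open("movie.iso", "wb") as image:
    shutil.copyfileobj(disc, image, length=1024 * 1024)  # 1 MiB chunks
```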

[–] [email protected] 1 points 11 months ago

I have an array of Exos. During high usage, they can be heard in the next room.
