this post was submitted on 31 Mar 2024
282 points (99.3% liked)

Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ

54083 readers
410 users here now

⚓ Dedicated to the discussion of digital piracy, including ethical problems and legal advancements.

Rules • Full Version

1. Posts must be related to the discussion of digital piracy

2. Don't request invites, trade, sell, or self-promote

3. Don't request or link to specific pirated titles, including DMs

4. Don't submit low-quality posts, be entitled, or harass others



Loot, Pillage, & Plunder


top 50 comments
[–] [email protected] 65 points 6 months ago (2 children)

For anyone wanting to contribute but on a smaller and more feasible scale, you can help distribute their database using torrents.

https://annas-archive.org/torrents
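
For anyone who'd rather script it than run a GUI client, here's a minimal seeding sketch, assuming the python-libtorrent bindings are installed; the torrent filename and save path are placeholders, not real Anna's Archive file names:

```python
import time
import libtorrent as lt

ses = lt.session()

# Placeholder paths: point these at a .torrent picked from the list above
# and at a disk with enough free space for that slice of the collection.
info = lt.torrent_info("annas_archive_subset.torrent")
handle = ses.add_torrent({"ti": info, "save_path": "/data/annas-archive"})

# Download once, then keep the process alive so it seeds back to other peers.
while True:
    s = handle.status()
    print(f"{s.progress * 100:.1f}% done, "
          f"up {s.upload_rate / 1024:.0f} KiB/s, "
          f"down {s.download_rate / 1024:.0f} KiB/s, "
          f"{s.num_peers} peers")
    time.sleep(30)
```

Any ordinary client (qBittorrent, Transmission, etc.) does the same job; the point is just to keep seeding after the download finishes.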

[–] [email protected] 46 points 6 months ago* (last edited 6 months ago) (1 children)

I know the last time this came up there was a lot of user resistance to the torrent scheme. I'd be willing to seed 200-500 GB, but minimum torrent sizes of 1.5 TB and larger really limit the number of people willing to give up that storage, and they defeat a lot of the resiliency of torrents given how bloody long it takes to get a complete copy. I know that 1.5 TB takes a massive chunk out of my already pretty full NAS, and I passed on seeding the first time for that reason.

It feels like they didn't really subdivide the database as much as they should have...

[–] [email protected] 27 points 6 months ago (1 children)

There are plenty of small torrents. Use the torrent generator: tell the script how much space you have and it will give you the “best” (least-seeded) torrents whose total size fits within what you gave it. It doesn’t have to be much; even a few GB is enough for some of the smaller torrents.
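
If it helps to picture what the generator is doing, here's a rough sketch of that kind of selection in Python; the torrent names, sizes, and seeder counts are made-up placeholders, not actual Anna's Archive data:

```python
from dataclasses import dataclass

@dataclass
class Torrent:
    name: str
    size_gb: float
    seeders: int

def pick_torrents(catalog, budget_gb):
    """Greedily pick the least-seeded torrents that fit within budget_gb."""
    chosen, used = [], 0.0
    for t in sorted(catalog, key=lambda t: t.seeders):
        if used + t.size_gb <= budget_gb:
            chosen.append(t)
            used += t.size_gb
    return chosen, used

# Hypothetical catalog and a 300 GB budget.
catalog = [
    Torrent("libgen_rs_part_017", 280.0, 3),
    Torrent("zlib_part_042", 120.0, 11),
    Torrent("scimag_part_009", 150.0, 2),
]
picked, total = pick_torrents(catalog, budget_gb=300)
print([t.name for t in picked], f"~{total:.0f} GB used")
```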

[–] [email protected] 22 points 6 months ago* (last edited 6 months ago)

Almost all the small torrents that I see pop up are already seeded relatively well (~10 seeders) though, which reinforces that A. the torrents most desperately in need of seeders are the older, largest ones, and B. large torrents don't attract seeders because of unreasonable space requirements.

Admittedly, newer torrents seem to be split into pieces of 300 GB or less, which is good, but there are still a lot of monster torrents in that list.

[–] [email protected] 7 points 6 months ago (1 children)

Thx.

Do you know how useful it is to host such a torrent? Who is accessing the content via that torrent?

[–] [email protected] 7 points 6 months ago (1 children)

Anyone who wants to. I think a lot of LLM trainers access them.

[–] [email protected] 1 points 6 months ago

Doesn't sound like I should host some of it. I'd be more down to host it for end users.

[–] [email protected] 30 points 6 months ago (2 children)

how big is the database?

books can't be that big, but i'm guessing the selection is simply huge?

[–] [email protected] 50 points 6 months ago (2 children)

The selection is literally all books that can be found on the internet.

[–] [email protected] 13 points 6 months ago (2 children)
[–] [email protected] 33 points 6 months ago (3 children)

According to their stats, the total dataset size excluding duplicates is over 900 TB.

[–] [email protected] 17 points 6 months ago

Sure, that's a bit more than $65,000 per year with Backblaze.
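
A rough sanity check on that figure, assuming Backblaze B2's storage price of roughly $6 per TB per month and ignoring egress:

```python
total_tb = 913.1             # dataset size quoted elsewhere in the thread
usd_per_tb_month = 6.0       # approximate Backblaze B2 storage rate
print(f"${total_tb * usd_per_tb_month * 12:,.0f} per year")  # ~$65,743
```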

[–] [email protected] 12 points 6 months ago (5 children)

Shit, my synology has more than that… alas, it is full of movie “archives”

[–] [email protected] 18 points 6 months ago (1 children)

You run a petabyte Synology at home?

[–] [email protected] 7 points 6 months ago

Well, it’s not just a single synology, it’s got a bunch of expansion units, and there are multiple host machines.

[–] [email protected] 6 points 6 months ago (2 children)

I'm guessing you're talking GBs?

[–] [email protected] 8 points 6 months ago (1 children)
[–] [email protected] 2 points 6 months ago (1 children)

That's awesome. How many drives, and of what sizes, do you have? Also, why Synology instead of a more enterprise-grade solution at this point?

[–] [email protected] 4 points 6 months ago

Right now most of them are 20T each. I started smaller at first, but they’ve dropped so much in price. I usually wait until a sale and grab a bunch. There are… math… 62 drives?

When I first started, I only had the 6-bay… I chose Synology because I wanted something that was managed for me. I don’t want to have to focus on setting things up and possibly doing things wrong, and it comes with amazing tools. Also, the server buy-in was a lot less than the other “professional” rack-mounted solutions.

I had such a great experience that I just stuck with them. It is a pretty expensive hobby, but so is buying physical movies. And some things never get a physical release, so having them digitally protects me from when Netflix, or whoever, decides to drop something.

[–] FigMcLargeHuge 3 points 6 months ago* (last edited 6 months ago)

They put a link in with the total...

Total (excluding duplicates): 133,708,037 files, 913.1 TB

[–] [email protected] 5 points 6 months ago

Correct me if I'm wrong, but they only index shadow libraries and do not host any files themselves (unless you count the torrents). So, you don't need 900+ TB of storage to create a mirror.

[–] Cuntessera 4 points 6 months ago (1 children)

I imagine a couple of terabytes at the very least, though I could be underestimating how many books have been de-DRMed so far.

[–] [email protected] 4 points 6 months ago (1 children)
[–] Cuntessera 9 points 6 months ago (2 children)

Girl, what? No wonder they’re having trouble hosting their archive. Does Anna’s Archive host copyrighted content as well or is all that copyleft?

[–] [email protected] 13 points 6 months ago (2 children)

They host academic papers and books, most of which are copyrighted content. They recently got in trouble for scraping a book metadata service to generate a list of books that haven't been archived yet: https://torrentfreak.com/lawsuit-accuses-annas-archive-of-hacking-worldcat-stealing-2-2-tb-data-240207/

[–] [email protected] 2 points 6 months ago* (last edited 6 months ago)

They index, not host, no? (Unless you count the torrents, which are distributed)

[–] Cuntessera 2 points 6 months ago (2 children)

Is hosting all that stuff even legal? I mean, they’re not making any money off of it, but they’re still a “piracy” hub. How have they survived this long?

[–] [email protected] 4 points 6 months ago

It's very illegal. iirc it was created by a group called "Pirate Library Mirror" after the guy that runs z-library got arrested, so I assume they're taking anonymity seriously to avoid arrest.

[–] [email protected] 3 points 6 months ago

No, it's not.

They've survived by making themselves hard to identify and shut down. And as we can see here, by creating redundancies.

[–] [email protected] 4 points 6 months ago

The archive includes copyrighted works. Often multiple copies of each work, across different formats.

[–] [email protected] 2 points 6 months ago* (last edited 6 months ago)

Bigger than Z-Library or Project Gutenberg?

[–] [email protected] 15 points 6 months ago (1 children)

It is huge! They claimed to have preserved about 5% of the world’s books.

[–] [email protected] 2 points 6 months ago

Oh, I actually thought it was way more! There wasn't a single book I wanted (or even thought to look up) that I didn't find in there.

[–] [email protected] 26 points 6 months ago (2 children)

Could anyone broad-stroke the security requirements for something like this? Looks like they'll pay for hosting up to a certain amount, and between that and a pipeline to keep the mirror updated I'd think it wouldn't be tough to get one up and running.

Just looking for theory - what are the logistics behind keeping a mirror like this secure?

[–] [email protected] 22 points 6 months ago* (last edited 6 months ago) (3 children)

Could be worth asking on selfhosted (how do I link a community on Lemmy?). They probably have more relevant experience with this sort of thing.

Edit

Does this work?

https://lemmy.world/c/selfhosted

[–] can 20 points 6 months ago

[email protected] might work for more people.

[–] [email protected] 12 points 6 months ago* (last edited 6 months ago) (1 children)

[email protected]

Is probably more suitable. I'd be interested in the total size, though.

[–] [email protected] 3 points 6 months ago (1 children)

900 TB, according to other comments here.

[–] [email protected] 1 points 6 months ago (1 children)

Is it an all-or-nothing sort of deal?

[–] [email protected] 1 points 6 months ago

There are partial torrents, also according to the other comments.

[–] [email protected] 5 points 6 months ago

It does. 😉

[–] [email protected] 15 points 6 months ago (1 children)
[–] [email protected] 4 points 6 months ago

This is a fascinating read

[–] [email protected] 16 points 6 months ago (1 children)

Also link any ways to donate if they're accepting that.

[–] [email protected] 10 points 6 months ago (2 children)

I had no idea about this project. Is it like a better search engine for libgen etc?

[–] weirdo_from_space 20 points 6 months ago

It searches through the Libgen forks and Z-Library, and it has its own mirrors of the files they serve on top of that. I think it was created as a response to Z-Library's domains getting seized, but I could be wrong.

[–] [email protected] 14 points 6 months ago

It has way more content than Libgen
