This post was submitted on 23 Nov 2023

Data Hoarder

We are digital librarians. Among us are represented the various reasons to keep data -- legal requirements, competitive requirements, uncertainty of permanence of cloud services, distaste for transmitting your data externally (e.g. government or corporate espionage), cultural and familial archivists, internet collapse preppers, and people who do it themselves so they're sure it's done right. Everyone has their reasons for curating the data they have decided to keep (either forever or For A Damn Long Time (tm)). Along the way we have sought out like-minded individuals to exchange strategies, war stories, and cautionary tales of failures.

top 5 comments
[–] [email protected] 2 points 10 months ago

Wait until this thread gets closed or removed too, because Reddit doesn't like criticism of China.

[–] [email protected] 2 points 10 months ago (2 children)

How exactly do you back up an AI model anyway? How big is an AI model? Is it just a big zip file? ????

[–] [email protected] 2 points 10 months ago
  1. You simply download the file and keep it. We're talking about models you can run on your own computer, not hosted services like ChatGPT (see the sketch after this list).

  2. Highly variable; it depends on model size and format. For LLMs, the range runs from under 1 GB up to a current maximum of roughly 240 GB.

  3. It's a big file of numbers ("weights" or "parameters"), stored as integers or floating point depending on the format.
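For models published on the Hugging Face Hub, "download the file and keep it" really is the whole job. A minimal sketch in Python, assuming the huggingface_hub package is installed; the repo id below is just an example:

```python
# A sketch of point 1: pull every file in a model repo (weights,
# tokenizer, config) into a local folder that you can then archive
# like any other data. The repo id is an example, not an endorsement.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="mistralai/Mistral-7B-v0.1",  # example model repo
    local_dir="./Mistral-7B-v0.1",        # where the files land
)
```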

[–] [email protected] 1 points 10 months ago

Anywhere from 1 to several hundred GB. Quantized (compressed), the most popular models are 8-40 GB each. LoRAs are a lot smaller, but full models take up a lot of space.
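Those numbers follow from simple arithmetic: weight-file size is roughly parameter count times bits per weight, divided by 8. A quick sketch (approximate only; real files add metadata and vary by format):

```python
# Rough size estimate for a model's weight files; a sketch only,
# since on-disk formats add metadata and padding.
def model_size_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate weight size in gigabytes."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

print(model_size_gb(70, 16))  # 70B model in fp16: ~140 GB
print(model_size_gb(70, 4))   # same model, 4-bit quantized: ~35 GB
print(model_size_gb(7, 4))    # 7B model, 4-bit quantized: ~3.5 GB
```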

[–] [email protected] 1 points 10 months ago

Someone make a torrent and I'll archive them for sure.
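If someone does want to build that torrent, here's a minimal sketch using the python-libtorrent bindings; the folder name and tracker URL are placeholders, not anything from this thread:

```python
# A sketch of creating a .torrent for a downloaded model folder,
# assuming python-libtorrent is installed. The folder name and
# tracker URL are placeholder values.
import libtorrent as lt

fs = lt.file_storage()
lt.add_files(fs, "Mistral-7B-v0.1")   # folder holding the model files
t = lt.create_torrent(fs)
t.add_tracker("udp://tracker.example.org:6969/announce", 0)
lt.set_piece_hashes(t, ".")           # parent dir of the model folder
with open("Mistral-7B-v0.1.torrent", "wb") as f:
    f.write(lt.bencode(t.generate()))
```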