r/usenet - Usenet Discussion


We are a thriving community dedicated to helping users old and new understand and use usenet.

51
 
 
The original post: /r/usenet by /u/External_Bend4014 on 2025-01-30 13:28:03.

With the massive growth of the Usenet feed, it's understandable that Usenet servers are struggling to keep up with storing it. I'm curious: are there any tools or methods to reliably measure the actual number of Usenet posts available across different providers?

For example, if a server claims "4500 days of retention", how can we see how many posts are actually accessible over that period? Or better yet, is there a way to compare how many posts are available for varying retention periods across all providers?
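
One approach, sketched below: pull a sample of article message-IDs with known posting dates from an indexer's NZBs, then ask each provider whether it still holds those articles using the NNTP STAT command, which checks presence without downloading anything. This is a minimal illustration, not an established tool; the hostnames, credentials, and message-IDs are placeholders, and Python's nntplib is deprecated (removed in 3.13), so an older interpreter or a third-party NNTP client would be needed.

```python
# Sketch: probe several providers for the same sampled articles via NNTP STAT.
# Hostnames, credentials, and message-IDs are placeholders.
import nntplib

PROVIDERS = [
    ("news.provider-a.example", "user_a", "password_a"),
    ("news.provider-b.example", "user_b", "password_b"),
]

# Message-IDs sampled from NZBs with known posting dates (placeholders).
MESSAGE_IDS = ["<article-1@example>", "<article-2@example>"]

def availability(host, user, password, ids):
    """Return the fraction of sampled articles the server still carries."""
    found = 0
    with nntplib.NNTP_SSL(host, user=user, password=password) as server:
        for mid in ids:
            try:
                server.stat(mid)  # STAT: reports presence, transfers no body
                found += 1
            except nntplib.NNTPTemporaryError:
                pass  # 430 "no such article" = expired or removed
    return found / len(ids)

for host, user, password in PROVIDERS:
    print(f"{host}: {availability(host, user, password, MESSAGE_IDS):.1%}")
```

Bucketing the sampled message-IDs by posting age would turn the single percentage into a completion-vs-retention curve per provider.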

52
 
 
The original post: /r/usenet by /u/nzb-get on 2025-01-28 18:58:45.

For those of you using or interested in NZBGet, the dev team is now directly involved in moderating the r/nzbget subreddit. This will be the preferred space for NZBGet-related conversations, including feature discussions, questions, and release notes.

The goal is to centralize feedback and provide a place where users can engage directly with the development team. This will allow us to better understand user priorities and focus on improvements that matter most to the community.

If you have questions, ideas, or want to stay updated on NZBGet developments, r/nzbget is the best place to do so.

—u/nzb-get

53
 
 
The original post: /r/usenet by /u/usenet_information on 2025-01-28 11:39:50.

A couple of days ago I posted this regarding BlurayNZB:

https://www.reddit.com/r/usenet/comments/1i8b5u6/new_nzb_site/

As some of you already found out, there is another site called scenenzb.

The sites are related and are run by the same team.

BlurayNZB focuses on certain Linux ISOs related to the forum's name, while scenenzb takes care of the rest.

Currently, only manual download is possible, as both sites are forums/boards, not indexers.

They are working on RSS feed integration (SABnzbd / NZBget) and also on Radarr / Sonarr integration.

Registration is open for now.

scenenzb:

www[dot]scenenzb[dot]org/login[dot]php

BlurayNZB:

www[dot]bluraynzb[dot]org/login[dot]php

Discord URL for support requests:

https://discord[dot]gg/Q8m34RepBj

54
 
 
The original post: /r/usenet by /u/OkStyle965 on 2025-01-27 16:55:32.

Would like to hear all the ins and outs, all the tips and tricks, and anything that could enhance download speed. All options are on the table, efficient or inefficient. All insights are appreciated. I want to make sure I'm getting the most out of my internet speed.

55
 
 
The original post: /r/usenet by /u/NewJobTitle on 2025-01-27 06:01:00.

After mucking around with different torrenting strategies in 2024, I got to the point where I was tired of even dealing with private tracker requirements and babysitting ratios. I used Black Friday as an excuse to jump into Usenet for the first time ever. My use case is essentially the *arrs stack, FWIW. If any of this info is interesting, cool beans.

Prowlarr stats across indexers

Provider stats from SABnzbd

56
 
 
The original post: /r/usenet by /u/Starbuckwhatdoyahear on 2025-01-26 18:13:18.

I have had my media server stack up and running for a few weeks. I noticed today that I hit the limit of API calls with DS and have a large number with ninja. I checked the .xml files on the API calls for both, and they show shows and episodes I have never heard of. I renewed the API keys and changed my passwords on both, and the calls are continuing, albeit at a slower pace. That might just be because I am watching intently and checking more often now; I don't know how to check the exact time of each call in the .xml file.

Any idea how this can be happening? The stack is on a local mini PC behind passwords (not HTTPS) and I never use the services outside of my home. I am using a WireGuard VPN assigned in my ASUS router to the mini PC only.

57
Audiobooks
 
 
The original post: /r/usenet by /u/naughtyruprecht on 2025-01-26 12:57:57.

Hello, I'm looking for advice please. I'm currently with slug and ninja, and I'm struggling to find audiobooks that I previously downloaded many years ago and foolishly deleted... Can anybody recommend an indexer for me, please?

58
 
 
The original post: /r/usenet by /u/CcX1085 on 2025-01-25 22:59:25.

Hey everyone,

I'm looking to get back into Usenet and figuring out the best providers to go with.

I know some of the big players like Newshosting and UsenetServer use the Highwinds backbone, while others like Eweka and UsenetExpress are independent. Previously, I used Giganews, but I'm open to trying new, more affordable options.

For those of you who’ve been using Usenet for a while:

Which provider do you swear by, and why?

Do you prefer sticking to a big backbone (like Highwinds), or do you think independent providers are better for redundancy and article completion?

Any good combinations for a main provider + backup block account you’d recommend?

My primary use is for media and software, so retention and completion are key. Would love to hear about your setups and experiences!

Thanks in advance!

P.S. Newshosting is £35 for 15 months!

59
 
 
The original post: /r/usenet by /u/sinisterpisces on 2025-01-25 20:29:25.

Hello,

I used to be hugely active on Usenet in the early to late 1990s, in various discussion groups in the alt tree.

Binary downloads were a thing, but it wasn't the thing, especially on a 14.4 kbps modem.

A couple of questions as someone wanting to get back into it:

  1. Is there any data on how active actual discussion groups still are on Usenet?

  2. Are there providers around that focus on indexing/retaining conversation-heavy groups? A lot of the service providers now seem to focus on binary data transfer and retention for binary groups, to the point that they don't even really advertise the discussion groups.

60
 
 
The original post: /r/usenet by /u/Jimbuscus on 2025-01-25 13:20:46.

UPGRADE YOUR BLACK FRIDAY BLOCK

We hope you have been enjoying your 500GB Access from our NewsDemon Black Friday Sale! If you're enjoying our service, we would like to encourage you to upgrade to an annual unlimited account. This offer is ONLY for members who participated in the Black Friday Mystery deal, so it is exclusive to members like you.

Unlimited Access + VPN for only $14 for the first year!

That's substantial savings for uncensored, blazing-fast, unlimited access to all three of our NewsDemon server farms, plus access to our VPN product!

Then in 2026 it will renew at only $20 for the whole year!

You'll enjoy substantial savings with our premium plan at this rate, and starting in 2027, your account will renew at only $30 per year—nearly a 60% discount.

Act quickly! This exclusive offer is only available for a limited time and is specifically for members like you who opted for the Mystery Deal on Black Friday.


I posted a screenshot, but it appeared to be auto-deleted. I ended up getting a refund and not using my BF2024 block. Does anyone recommend NewsDemon?

I already have a Yearly with Frugal from BF2024.

61
 
 
The original post: /r/usenet by /u/eaglestarx on 2025-01-25 10:37:45.

Hey everyone,

I’ve been wondering about the difference between torrents and Usenet. Both seem like ways to download stuff, but I’ve noticed a lot of people go with Usenet even though you have to pay for it.

So what’s the deal? Is Usenet really that much better? Is it about faster speeds, better privacy, or something else?

I’ve given it a shot, but I just don’t get why it’s better than torrents. Honestly, it feels pretty expensive.

62
 
 
The original post: /r/usenet by /u/redditor100101011101 on 2025-01-24 16:45:10.

I currently have 1 provider and 2 indexers that I use. Still kinda new to this.

If I can't find something I'm looking for, how do I tell whether it's not on the provider's servers or the indexers I'm using just can't find it?

Not sure if I need more/better indexers or a better provider.

Thanks!

63
New NZB Site
 
 
The original post: /r/usenet by /u/usenet_information on 2025-01-23 19:06:13.

I just stumbled upon BlurayNZB.

This is what they state on their site:

Our platform is new! We started this comparison on 20/12/2024. Here, you'll see we are the fastest in posting across all Usenet indexers. Pure scene, the fastest releases! Pay Attention: Sections we do is BLURAY, BLURAY-UHD & TV-BLURAY only. If is scene release, we post!

It seems that they emphasize fast availability on Usenet.

I did not check against others yet.

Currently, only manual download is possible.

They are working on RSS feed integration (SABnzbd / NZBget) and also on Radarr / Sonarr integration.

If you want to check it out (I do not know if I am allowed to share the full URL):

www[dot]bluraynzb[dot]org/login[dot]php

Maybe one of the mods wants to add this to the indexer wiki?

64
 
 
The original post: /r/usenet by /u/einhuman198 on 2025-01-23 18:38:32.

So Highwinds just hit 6000 days of retention a few days ago. When I saw this, my curiosity was sparked again, as it has been several times before. Just how big is the amount of data Highwinds stores to offer 6000+ days of Usenet retention?

This time I got motivated enough to calculate it based on existing public data, and I want to share my calculations. As a side note: my last uni math lessons were a few years ago, and while I passed, I won't guarantee the accuracy of my calculations. Consider the numbers very rough approximations, since they don't account for data taken down, compression, deduplication, etc. If you spot errors in the math, please let me know and I'll correct this post!

As a reliable data source, we have the daily newsgroup feed size published by Newsdemon and u/greglyda.

Since Usenet backbones sync all incoming articles with each other via NNTP, this feed size will be roughly the same for Highwinds too.

OK, good. So with these values we can make a neat table and approximate a mathematical function via regression.

For consistency, I assumed each of the provided MM/YY dates falls on the first of the month. In my table, 2017-01-01 (all my dates are in YYYY-MM-DD) marks x value 0; it's the first date provided. The x-axis is the days passed, the y-axis the daily feed. I then calculated the days passed since 2017-01-01 with a timespan calculator. For example, Newsdemon states the daily feed in August 2023 was 220 TiB. The days passed between 2017-01-01 and 2023-08-01 number 2403, giving me the value pair (2403, 220). The result for all values looks like this:

The values from Newsdemon in a coordinate system

Then, via regression, I calculated the function closest to the values. It's an exponential function. I got this as a result:

y = 26.126047417171 · e^(0.0009176041129 · x)

with a coefficient of determination of 0.92.
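
For anyone who wants to reproduce this, an exponential regression of this kind is typically done as a linear least-squares fit on ln(y). Below is a minimal sketch; only the value pair quoted above is filled in, the placeholder intercept row is illustrative, and the remaining rows would come from the Newsdemon table:

```python
# Sketch: fit y = a * e^(k*x) by linear regression on ln(y).
# x = days since 2017-01-01, y = daily feed in TiB.
import numpy as np

data = np.array([
    (0.0,    26.0),   # 2017-01-01 (illustrative placeholder)
    (2403.0, 220.0),  # 2023-08-01, the example pair from this post
    # ... remaining (days, TiB) pairs from the Newsdemon figures ...
])

x, y = data[:, 0], data[:, 1]
k, ln_a = np.polyfit(x, np.log(y), 1)  # slope k, intercept ln(a)
a = np.exp(ln_a)

# Coefficient of determination of the log-linear fit.
residuals = np.log(y) - (k * x + ln_a)
r_squared = 1 - residuals.var() / np.log(y).var()

print(f"y = {a:.4f} * e^({k:.7f} * x), R^2 = {r_squared:.2f}")
```

With the full table, this should land near the stated fit, y ≈ 26.126 · e^(0.00092·x) with R² ≈ 0.92.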

https://preview.redd.it/719sxsea3see1.png?width=211&format=png&auto=webp&s=3dfdd1b23a369e8a8f1b229fc86cc821de3ea264

Not perfect, but pretty decent. In the graph you can see why it's "only" 0.92, not 1:

https://preview.redd.it/73gbnybb3see1.png?width=2125&format=png&auto=webp&s=ef437be446f33ec0bcc5b7e0ac4c1e812906c01c

The most recent values skyrocket beyond the "healthy" normal exponential growth seen from January 2017 until around March 2024. In the Reddit discussions regarding this phenomenon, there was speculation that some AI scraping companies abuse Usenet as cheap backup storage, and the graphs seem to back that up. I hope the providers will implement some protection against this, because this cannot be sustained.

Unrelated Meme

Aaanyway, back to topic:

The area under this graph in a given interval is equivalent to the total data stored. When we calculate the integral of the function, we get a number that roughly estimates the total storage size based on the data we have.

To integrate this function, we first need to determine the interval to integrate over.

So back to the timespan calculator. The current retention of Highwinds at the time of writing this post (2025-01-23) is 6002 days. According to the timespan calculator, this means Highwinds' data retention starts 2008-08-18. We set 2017-01-01 as our day 0 in the graph earlier, so we need to calculate our upper and lower interval limits with this in mind. 3058 days passed between 2008-08-18 and 2017-01-01, and 2944 days passed between 2017-01-01 and today, 2025-01-23. So our lower interval bound is -3058 and our upper bound is 2944. Now we can integrate our function as follows:

Integral Calculation
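
Reconstructed from the function and interval above, the integral works out as:

```latex
\int_{-3058}^{2944} 26.126 \, e^{0.0009176 x} \, dx
  = \left[ \frac{26.126}{0.0009176} \, e^{0.0009176 x} \right]_{-3058}^{2944}
  \approx 28472 \cdot (14.901 - 0.060)
  \approx 422540 \ \text{TiB}
```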

Therefore, the amount of data stored at Highwinds is roughly 422,540 TiB. This equals ≈464.6 petabytes. Mind you, this is just one copy of all the data, IF they stored all of the feed. They will have identical copies of all stored data between their US and EU datacenters, and more than one copy for redundancy reasons. This is just the accumulated amount of data over the last 6002 days.

Now with this info we can estimate some figures:

The estimated daily feed in August 2008, when Highwinds started expanding their retention, was 1.6 TiB. The latest figure we have from Newsdemon is 475 TiB daily, from November 2024. Broken down, the entire daily newsfeed of August 2008 is now transferred roughly every 5 minutes: at November 2024 rates, 1.6 TiB passes in 4.85 minutes.

With the growth rate of the calculated function, the stored data size will reach 1 million TiB by mid-August 2027. It'll likely be earlier if the growth rate keeps rising beyond its "normal" exponential rate, which the Usenet feed size maintained from 2008 to 2023 before the (AI?) abuse started.

10000 days of retention would be reached on 2035-12-31. At the growth rate of our calculated graph, the total data size of these 10000 days would be 16,627,717 TiB. This equals 18,282 petabytes, 39x the current amount. Gotta hope that HDD density growth comes back to exponential growth too, huh?
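
Both projections can be sanity-checked against the fitted function. Here is a sketch using the constants from this post; dates count days from 2017-01-01, and the retention start sits at x = -3058:

```python
# Sketch: reproduce the projections from y(x) = a * e^(k*x),
# x in days since 2017-01-01, y in TiB per day.
import math
from datetime import date, timedelta

a, k = 26.126047417171, 0.0009176041129
x0 = -3058  # retention start, 2008-08-18

def stored(x):
    """Integral of a*e^(k*t) from x0 to x: total TiB accumulated."""
    return a / k * (math.exp(k * x) - math.exp(k * x0))

origin = date(2017, 1, 1)

# Day on which total storage crosses 1,000,000 TiB.
x_million = math.log(1_000_000 * k / a + math.exp(k * x0)) / k
print(origin + timedelta(days=round(x_million)))  # ~2027-08, mid-August 2027

# Total at 10000 days of retention: x = -3058 + 10000 = 6942.
print(f"{stored(6942):,.0f} TiB")  # ~16.6 million TiB, as stated above
```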

Some personal thoughts at the end: one big bonus that Usenet offers is retention. If you go beyond just downloading the newest releases automated with *arr and all the fine tools we now have, Usenet always was and still is really reliable for finding old and/or exotic stuff. Up until around 2012, many posts used to be unobfuscated and still indexable via e.g. nzbking. You can find really exotic releases of all content types, no matter if movies, music, TV shows, or software. You name it. You can grab most of these releases and download them at full speed. Some random upload from 2009? Usually not an issue. Only when they are DMCA'd may it not be possible. With torrents, you often end up with dried-up content. 0 seeders, no chance. It does make sense; who seeds the entirety of exotic stuff ever shared for 15 years? Can't blame the people. I personally love the experience of picking the best-quality uploads of obscure media that someone posted to Usenet some 15 years ago. And more often than not, it's the only copy still available online. It's something special. And I fear that with the current development, at some point the business model "Usenet" will no longer be sustainable. Not just for Highwinds, but for every provider.

I feel like Usenet is the last living example of the saying that "the Internet doesn't forget". Because the Internet does forget, faster than ever. The Internet gets more centralized by the day. Usenet may be forced to further consolidate with the growing data feed. If the origin of the high feed figures is indeed AI scraping, we can just hope that the AI bubble bursts ASAP so they stop abusing Usenet, and that maybe the providers can filter out those articles without sacrificing retention, past or future, for all the other data people are willing to download. I hope we will continue to see growing Usenet retention, hopefully to 10000 days and beyond.

Thank you for reading till the end.

tl;dr: Calculated from the known daily Usenet feed sizes, Highwinds stores approximately 464.6 petabytes of data with its current 6002 days of retention at the time of writing. This figure is just one copy of the data.

65
 
 
The original post: /r/usenet by /u/jericko on 2025-01-23 01:39:32.

I cannot post anything to help anyone without getting my post removed. What is this forum for anymore? Every time I post to help someone, I get this:

"This has been removed.

"Posts about Usenet-related software (e.g., edited to not get removed) are prohibited. Support requests, troubleshooting, and detailed discussions are not allowed."

This does not align with the 6 Rules on the sidebar. I am legitimately asking why this sub is even around anymore. We cannot help anyone anymore. This is a sub for Usenet, but we cannot discuss it here. I get that we cannot talk about releases or point people to where to find things, but you all remove so many posts that do not break the 6 Rules. It's Usenet; we all know what we are using it for. This sub used to be a great place.

66
 
 
The original post: /r/usenet by /u/makesupwordsblomp on 2025-01-22 14:08:11.

I am a Usenet user, mostly for TV via NZBGeek. I torrented ages ago on What.cd etc., but have not torrented in a long time. There are just some records that I cannot find on streaming or elsewhere, and I want to be able to find them. I am happy to seed, though I acknowledge some trackers likely have prerequisites that I don't fulfill.

What options are even realistically available for me to join? Most folks on here say torrents are the way for music, but, like, which modern trackers are popular and strong, and which can I actually join?

67
 
 
The original post: /r/usenet by /u/CGM on 2025-01-22 00:05:39.

Newsgrouper, my web gateway to Usenet, now has an option to search old posts downloaded from the Internet Archive. These run from the "Great Renaming" in 1987 up to 2013. The period after that is covered by the facility I already had to search BlueWorldHosting, which covers from 2003 to the present.

I now have archive files for the whole of the "big 8" hierarchies: comp humanities misc news rec sci soc talk. For groups where the archive search option is available you can find it by selecting a group and then clicking "Find Articles". Newsgrouper is at https://newsgrouper.org.uk/ .

68
 
 
The original post: /r/usenet by /u/AffectionateYellow24 on 2025-01-21 17:29:18.

Now this is the second time this has happened to me. The first time was last year when I bought an account for a couple of months. I got their unlimited senior plan. My usage was pretty low most of the time and speeds were wonderful (500 Mb/s). Then the traffic spiked on my end and I had to download about 1.8 TB in a couple of days. Suddenly I noticed I was being throttled to 128 kb/s.

I opened a ticket complaining about speeds and asked if I had maybe tripped their "acceptable use policy" in some way. They said no and told me the problem must be on my end. I can certainly rule that out because:

- No ISP issues
- Tried connecting with and without VPN
- Tried with and without SSL
- Always tried the 1 GB test file
- Other providers (multiple) deliver top speeds
- Created a new News Agency trial account and guess what: no throttling and top speeds!

So my account ran out of time and the issue was not resolved. I thought that maybe I tripped their fair use policy and they simply do not tell me.

On Black Friday I got a new year-long unlimited senior plan with them. Same story. Good speeds for a couple of weeks. Then I needed a bit more traffic for a certain time (like 1.8 TB in 3 days) and I was throttled again.

I know they’re throttling because initially the download starts at >20 MB/s and then instantly drops to 128 kb/s.

Now I’ve got basically the same response from support but I’m now stuck with them for a year. A month has passed since the throttling and my speeds are still throttled.

I never shared my account nor exceeded max connections or anything.

Has anyone else had similar issues?

69
 
 
The original post: /r/usenet by /u/AtheistPi on 2025-01-20 13:51:38.

Hi Farmers,

It's Blue Monday and we want to make you feel a bit better by providing a better price for a couple of days.

So this means 25% off on all our packages until 2025-01-26 (UTC) with the coupon BM2025.

Thank you for your support and good luck this Monday!

Your friends at Usenet.Farm

70
 
 
The original post: /r/usenet by /u/jimit21 on 2025-01-20 11:37:56.

Last night it was OK; today it dropped to 2 MB/s while connected to the NL server. Anyone else experiencing issues?

71
 
 
The original post: /r/usenet by /u/TheBingage on 2025-01-20 06:10:48.

NZBGeek's API just randomly failed on me today. I just wanted to check whether that's a me thing or it's happening to multiple users.

72
 
 
The original post: /r/usenet by /u/zwambagger on 2025-01-19 16:27:55.

Anyone getting renewal charge attempts at the moment for NewsDemon? I haven't used them in ages, and suddenly they're trying to charge my prepaid credit card, 3 times in the past hour. It's not working, since I haven't topped it up. Anyone know what's up with this?

73
 
 
The original post: /r/usenet by /u/DoktorXNetWork on 2025-01-18 09:23:58.

Title says it all: is Newshosting having some sort of limited-time plan to celebrate this milestone, with a good recurring offer like the old $20/year deal?

74
 
 
The original post: /r/usenet by /u/eddi1984 on 2025-01-17 04:42:43.

I have had Tweaknews since 2019 and was charged the same amount every year since 2020 (30 EUR/year). I got charged again a few days ago, but this time it's almost double! Obviously, I don't want to keep them. Just wondering if somebody else has experienced this! I signed up for Ultimate + VPN - 12 Month in 2019 on a promotion. It was supposed to be 30 EUR every year.

75
 
 
The original post: /r/usenet by /u/little_elephant1 on 2025-01-16 19:25:30.

I have a fair few indexers (plz don't judge, I went into Pokémon mode to catch them all):

Lifetime: Geek, Miatrix, Althub, Usenet Crawler

Free: NzbPlanet, Dognzb, tabula rasa

I also have Newshosting as my main provider, plus a couple other block accounts on different backbones.

I'm struggling to grab quite a few things so I'm thinking maybe I need another indexer to pick up those few stragglers.

I know a lot of people suggest Slug, but that's invite-only, so are there any non-invite indexers you guys would suggest trying?
