This post was submitted on 18 Nov 2023

Data Hoarder


We are digital librarians. Among us are represented the various reasons to keep data -- legal requirements, competitive requirements, uncertainty of permanence of cloud services, distaste for transmitting your data externally (e.g. government or corporate espionage), cultural and familial archivists, internet collapse preppers, and people who do it themselves so they're sure it's done right. Everyone has their reasons for curating the data they have decided to keep (either forever or For A Damn Long Time (tm) ). Along the way we have sought out like-minded individuals to exchange strategies, war stories, and cautionary tales of failures.


Hi.

I'm trying to find a way to prevent my server from crashing every time I do a live stream.

Basically, I have a popular movie website, and every time I release an episode I get a minimum of 30k live viewers. Whenever I pass 8k live viewers, my 20 Gbit of bandwidth becomes useless. I occasionally tried putting a 5 Mbps bandwidth limit in place to prevent the crash, but it didn't do much, and I don't want to rent 100 Gbit of network bandwidth every time I release an episode. So my question is: is there a way to handle 30k to 60k live viewers with only a 20 Gbit network, or do I just need to rent a 100 Gbit network occasionally?
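For reference, my rough back-of-envelope (assuming every viewer pulls the full 5 Mbps cap, which is an upper bound rather than a measured bitrate):

```python
# Rough capacity math: how many full-rate viewers a given uplink can carry.
# 5 Mbps per viewer is my cap, an upper bound, not a measured average bitrate.
MBIT_PER_GBIT = 1_000

per_viewer_mbps = 5
uplink_gbit = 20

max_viewers = uplink_gbit * MBIT_PER_GBIT / per_viewer_mbps
print(f"20 Gbit at 5 Mbps/viewer: ~{max_viewers:,.0f} viewers")  # ~4,000

for viewers in (8_000, 30_000, 60_000):
    needed_gbit = viewers * per_viewer_mbps / MBIT_PER_GBIT
    print(f"{viewers:>6,} viewers need ~{needed_gbit:,.0f} Gbit/s")
# 8,000  -> ~40 Gbit/s  (so 20 Gbit saturates well before 8k full-rate viewers)
# 30,000 -> ~150 Gbit/s
# 60,000 -> ~300 Gbit/s
```

So even at half that bitrate, 30k to 60k viewers is far beyond what a 20 Gbit line can serve directly.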

Thank you!

[email protected] 11 months ago

Live video delivery at this scale is definitely above my expertise, though it sounds like a fun problem to have. Suffering from success and all that.
The only bit I can contribute: automated deployment is your friend. Many providers offer very easy scripted deployment. I don't know what kind of compute power you need to push, but a VPS with a (theoretical) 1 Gb pipe is very cheap, and if you need more, a lot of providers offer hourly billing on bigger VPSes, some even on bare metal if you need serious compute with it. Just note that some providers will not take kindly to you pounding a shared compute instance at 100% CPU.

For the deployment itself, maybe k8s, maybe something else. Before you start a stream that might draw that many viewers, you just deploy something like 40 instances. With most providers you don't even have to worry about bandwidth caps, since you're only running for a short time. When your stream is over, all of the instances are destroyed and billing stops, rounded to the nearest hour.
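A minimal sketch of that spin-up/tear-down pattern (the `create_server`/`destroy_server` functions below are placeholders for whatever your provider's SDK or CLI actually exposes, not a real API):

```python
# Sketch of "deploy N instances before the stream, destroy them all after".
# create_server/destroy_server are stand-ins for your provider's real API.

N_INSTANCES = 40  # enough cheap 1 Gb/s boxes to cover the expected peak

def create_server(name: str) -> str:
    """Placeholder: call your provider's API here and return an instance ID."""
    print(f"creating {name} ...")
    return name

def destroy_server(instance_id: str) -> None:
    """Placeholder: call your provider's API here. Billing stops once
    the instance is gone (rounded up to the hour on most providers)."""
    print(f"destroying {instance_id} ...")

def deploy_fleet() -> list[str]:
    return [create_server(f"edge-{i:02d}") for i in range(N_INSTANCES)]

def teardown_fleet(fleet: list[str]) -> None:
    for instance_id in fleet:
        destroy_server(instance_id)

if __name__ == "__main__":
    fleet = deploy_fleet()
    input("Fleet is up. Press Enter when the stream ends to tear it down...")
    teardown_fleet(fleet)
```

The point is that the whole fleet lives and dies with one script, so running it per release is painless.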

You can seriously do what you need for a couple of hours for a few bucks if you don't need much compute, more if you do, but still very reasonable. Non-compute-heavy instances with 1 Gb/s connections go for less than a cent an hour, and you can distribute them across multiple providers for more geographic coverage.
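As a rough sanity check on the cost (the per-hour price here is an illustrative assumption for that cheap-VPS class, not a quote from any provider):

```python
import math

# Rough cost check. All inputs are illustrative assumptions.
aggregate_gbps_needed = 40   # e.g. 8k viewers at a full 5 Mbps each
per_instance_gbps = 1        # theoretical 1 Gb/s VPS uplink
price_per_hour = 0.01        # USD, the "under a cent an hour" class

instances = math.ceil(aggregate_gbps_needed / per_instance_gbps)  # 40
for hours in (2, 4):
    cost = instances * price_per_hour * hours
    print(f"{instances} instances for {hours} h: ~${cost:.2f}")
# 40 instances for 2 h: ~$0.80
# 40 instances for 4 h: ~$1.60
```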
The one caution I'd give with this method: before you invest time in making a platform work for your setup, grab an instance manually and benchmark the connection. Some are 1 Gb/s on paper but throttle to less during peak times, so test over time too. Know what you're dealing with and how many instances you need.
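For the "test over time" part, a minimal sketch using iperf3 (this assumes iperf3 is installed on the instance and that you control an iperf3 server, started with `iperf3 -s`, to test against; the host name is a placeholder):

```python
import json
import subprocess
import time

# Repeatedly measure throughput with iperf3 and log it, so peak-hour
# throttling shows up. "iperf.example.com" is a placeholder for a box
# you control that is running an iperf3 server (iperf3 -s).
TARGET = "iperf.example.com"
INTERVAL_S = 15 * 60  # one sample every 15 minutes

for _ in range(96):  # 96 samples = a full day
    result = subprocess.run(
        ["iperf3", "-c", TARGET, "-t", "10", "-J"],  # 10 s test, JSON output
        capture_output=True, text=True,
    )
    if result.returncode == 0:
        report = json.loads(result.stdout)
        bps = report["end"]["sum_received"]["bits_per_second"]
        print(f"{time.strftime('%F %T')}  {bps / 1e6:.0f} Mbit/s")
    else:
        print(f"{time.strftime('%F %T')}  iperf3 failed: {result.stderr.strip()}")
    time.sleep(INTERVAL_S)
```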