[–] [email protected] 5 points 1 day ago (1 children)

Hosting a model of that size requires ~800GB of VRAM. Even if they released their models, it wouldn't make them obsolete, since most people and many companies couldn't host them anyway.
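
For a rough sense of where a number like ~800GB comes from, here is a back-of-the-envelope sketch; the parameter count, precision, and overhead factor are illustrative assumptions, not the actual model's specs.

```python
# Rough VRAM estimate for serving a large LLM.
# All numbers below are assumptions for illustration, not real model specs.

def vram_estimate_gb(params_billion, bytes_per_param, overhead=1.2):
    """Weights plus ~20% headroom for KV cache and activations (assumed)."""
    weights_gb = params_billion * bytes_per_param  # 1B params * 1 byte ~ 1 GB
    return weights_gb * overhead

# e.g. a ~670B-parameter model served at 1 byte/param (FP8, assumed):
print(vram_estimate_gb(670, 1))  # ~804 GB
# the same model at 2 bytes/param (FP16) needs roughly double:
print(vram_estimate_gb(670, 2))  # ~1608 GB
```

Serving at lower precision roughly halves the requirement, which is why quantization matters so much for anyone trying to self-host.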

[–] [email protected] 2 points 1 day ago* (last edited 1 day ago) (1 children)

Anyone can now provide that service. Why pay OpenAI when you can pay a different provider that is cheaper, or one more aligned with your needs, ethics, or legal requirements?

[–] [email protected] 1 points 16 hours ago

Anyone who has $300,000 per instance, the know-how to set it up, the means to support it, and can outbid OpenAI, yes.
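
As a sanity check on that ballpark, here is a minimal sketch of how a per-instance figure around $300,000 could be reached; the GPU memory size and unit price are assumptions for illustration, not quoted prices.

```python
# Back-of-the-envelope hardware cost for one ~800GB-VRAM instance.
# GPU memory size and unit price are assumptions, not quoted prices.

required_vram_gb = 800
vram_per_gpu_gb = 80          # e.g. an 80GB datacenter GPU (assumed)
gpu_unit_price_usd = 30_000   # very rough street price (assumed)

num_gpus = -(-required_vram_gb // vram_per_gpu_gb)  # ceiling division -> 10
hardware_cost = num_gpus * gpu_unit_price_usd       # ~$300,000 for GPUs alone,
                                                    # before servers, networking,
                                                    # power, and staff
print(num_gpus, hardware_cost)  # 10 300000
```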

I don't see that happening on a large scale, just like I don't see tons of DeepSeek instances being hosted more cheaply than the original any time soon.

If they really are afraid of that, they can always license it in a way that forbids reselling.