this post was submitted on 14 Feb 2024
LocalLLaMA
Community to discuss LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
It's not truly open source, but at least you can build it yourself and control what happens with all the data, if I understand this correctly?
It means they want people to consult the code as a reference for how best to use their hardware acceleration.
If all software uses their cards to best effect, that makes their cards more useful and thus more valuable, which makes them money. If only their own frontend can do that, they lose out on most of that value, while also having to spend money to make sure the rest of the software, like the UI, stays competitive.
Ah, that makes sense. Thank you!