this post was submitted on 13 Aug 2023
LocalLLaMA
Community to discuss LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
It's a standardization on a universal GGML format, which would mean no more breaking changes going forward when new formats are worked on. It also brings the same llama.cpp functionality to all GGML model types (Falcon, MPT, StarCoder, etc.).
I definitely wouldn't count on that.
But it does make some changes much easier, such as adding or changing model-specific fields, which previously would have required a format change. That said, things like changing or dropping support for existing quantizations would still break compatibility, independent of the file format itself.
Oh yeah, I definitely didn't mean "no more breaking changes", just that we've had several breaks from GGML file format changes, and THAT portion of the breakage is going away.