this post was submitted on 21 Aug 2023

LocalLLaMA


Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

They still consider it a beta but there we go! It's happening :D

[–] Kerfuffle 3 points 1 year ago

Is there any reason why support for loading both formats cannot be included within GGML/llama.cpp directly?

It could be (and I bet koboldcpp and maybe other projects will take that route). There absolutely is a disadvantage to dragging around a lot of legacy stuff for compatibility. llama.cpp/ggml's approach has pretty much always been to favor rapid development over compatibility.
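Supporting both formats side by side mostly comes down to sniffing the file's magic bytes before dispatching to a loader. A minimal sketch of that idea, assuming the magic values llama.cpp has used (GGUF files begin with ASCII "GGUF"; the legacy formats used the `ggml`/`ggmf`/`ggjt` magics, which appear byte-reversed on disk because they were written as little-endian integers) — the function name here is hypothetical, not an actual llama.cpp API:

```python
def sniff_model_format(path):
    """Guess whether a model file is GGUF or a legacy GGML-family format
    by inspecting its first four bytes (a hypothetical helper, not part
    of llama.cpp itself)."""
    with open(path, "rb") as f:
        magic = f.read(4)
    if magic == b"GGUF":
        return "gguf"
    # Legacy magics 'ggml'/'ggmf'/'ggjt' written as little-endian uint32
    # appear reversed as raw bytes on disk.
    if magic in (b"lmgg", b"fmgg", b"tjgg"):
        return "legacy-ggml"
    return "unknown"
```

A project like koboldcpp that wants backward compatibility would branch on this result and keep the old loader code around; llama.cpp's choice is to drop that branch entirely.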

As I understand it, the new format is basically the same as the old format

I'm not sure that's really accurate. There are significant differences in how the model vocabulary is handled, for instance.

Even if that were true for the very first version of GGUF that gets merged, it would become less true over time as GGUF evolves and the features it enables see real use. Having to maintain compatibility with the old GGML formats would make iterating on GGUF and adding new features more difficult.
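The reason GGUF can evolve without constant breakage is its metadata model: instead of a fixed sequence of positional header fields, it stores typed, namespaced key-value pairs (keys like `general.architecture` and `llama.context_length` are real GGUF keys), so a reader can look up the keys it knows and ignore the rest. A toy illustration of that property (plain Python dicts standing in for the actual binary layout):

```python
# Illustrative sketch only -- not the real GGUF binary encoding.
# GGUF metadata is a set of namespaced key-value pairs; readers look
# keys up by name and skip anything they don't recognize.
metadata_v1 = {
    "general.architecture": "llama",
    "llama.context_length": 4096,
}

# A later writer can add new keys without disturbing existing ones.
metadata_v2 = dict(metadata_v1, **{"general.quantization_version": 2})

def read_context_length(md):
    # An "old" reader only touches the keys it knows about, so it
    # works unchanged on files written by newer tools.
    return md.get("llama.context_length")
```

Contrast that with the legacy GGML formats, where fields lived at fixed positions, so any change to the layout forced a new magic/version and new loader code.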