
LocalLLaMA


Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

For example, does a 13B-parameter model at Q2_K quantization perform worse than a 7B-parameter model at 8-bit or 16-bit?
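
One way to frame the question is by file size: heavier quantization of a bigger model can still produce a smaller artifact than light quantization of a smaller one. Here is a minimal back-of-the-envelope sketch; the effective bits-per-weight figures (including per-block scale overhead in llama.cpp-style formats) are assumptions for illustration, not exact format specs:

```python
# Rough weight-only footprint comparison across model sizes and quant levels.

PARAMS = {"7B": 7e9, "13B": 13e9}

# Assumed effective bits per weight (illustrative approximations).
BITS_PER_WEIGHT = {
    "Q2_K (~2-bit)": 2.6,   # assumed: ~2 bits plus block-scale overhead
    "Q8_0 (8-bit)": 8.5,    # assumed: 8 bits plus block-scale overhead
    "FP16 (16-bit)": 16.0,
}

def approx_gib(n_params: float, bpw: float) -> float:
    """Approximate size in GiB: parameters * bits per weight / 8 bytes / 2^30."""
    return n_params * bpw / 8 / 2**30

for model, n in PARAMS.items():
    for quant, bpw in BITS_PER_WEIGHT.items():
        print(f"{model:>4} @ {quant:<15} ~ {approx_gib(n, bpw):5.1f} GiB")
```

By this estimate a 13B model at roughly 2 bits per weight (~4 GiB) is actually smaller than a 7B model at 8-bit (~7 GiB), so the real question is whether the quality lost to aggressive quantization outweighs the quality gained from the extra parameters.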
