Posted to LocalLLaMA on 05 Mar 2025
insane, absolutely insane
Why insane? For quality, speed, size? I find the Coder 1.5B and 3B models light and good.
It matches R1 in the given benchmarks. R1 has 671B params (37B activated per token), while this one has only 32B.
GGUF quants are already out: https://huggingface.co/bartowski/Qwen_QwQ-32B-GGUF
Yay! Let's try it:
ollama run hf.co/bartowski/Qwen_QwQ-32B-GGUF:Q4_K_M
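# once the model loads into the interactive session, raise the context window: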
/set parameter num_ctx 32768
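If you'd rather not set that every session, a Modelfile should let you bake the context size into a local tag (qwq-32k is just a name I picked for this sketch):

# Modelfile
FROM hf.co/bartowski/Qwen_QwQ-32B-GGUF:Q4_K_M
PARAMETER num_ctx 32768

ollama create qwq-32k -f Modelfile
ollama run qwq-32k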