The closest thing I know of is distillation; you can google it to find a few resources (e.g. https://huggingface.co/papers/2306.08543). I don't know if that is what you are looking for.
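For what it's worth, the basic idea is to train a small student model to match the output distribution of a larger, frozen teacher. A minimal sketch in PyTorch, assuming `student_model` and `teacher_model` are placeholder causal LMs that return `.logits` (the linked paper uses a more involved objective, so treat this only as the vanilla version):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between temperature-softened teacher and student distributions."""
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * temperature ** 2

def train_step(student_model, teacher_model, batch, optimizer):
    # Teacher is frozen; only the student is updated.
    with torch.no_grad():
        teacher_logits = teacher_model(batch["input_ids"]).logits
    student_logits = student_model(batch["input_ids"]).logits
    loss = distillation_loss(student_logits, teacher_logits)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```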
I don't know about that, but you could try GGML (llama.cpp). It supports quantization down to 2 bits, so that might be small enough.
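If you go that route, here's a minimal sketch of loading a 2-bit model through the llama-cpp-python bindings, assuming the weights have already been converted and quantized to Q2_K with llama.cpp's tools (the path and prompt are placeholders):

```python
from llama_cpp import Llama

# Placeholder path to a model already quantized to Q2_K with llama.cpp.
llm = Llama(model_path="./models/7B/ggml-model-q2_k.bin")

# Simple completion call; the parameters here are just illustrative defaults.
output = llm("Q: What is a llama? A:", max_tokens=64, stop=["Q:"])
print(output["choices"][0]["text"])
```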