28 points
submitted 9 months ago by [email protected] to c/localllama
top 3 comments
[-] noneabove1182 3 points 9 months ago

Woah, this is pretty interesting stuff. I wonder how practical it is to do; I don't see a repo offering a script or anything, so it may be quite involved, but it looks promising. Anything that reduces size while maintaining performance is huge right now.

[-] [email protected] 2 points 9 months ago
[-] noneabove1182 1 points 9 months ago

Somehow this is even more confusing, because that code hasn't been touched in 3 months; maybe it just took them that long to validate? I'll have to read through it, thanks!

this post was submitted on 22 Sep 2023