this post was submitted on 13 Sep 2023
25 points (100.0% liked)

LocalLLaMA

2269 readers
3 users here now

Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

founded 1 year ago

Linked is the new repo; it's still in relatively early stages, but it does work.

I'm using it in the oobabooga text-generation-webui with the old GPTQ format (so not even the new quant format), and on my 3060 I see a genuine >200% increase in speed:

Exllama v1

Output generated in 21.84 seconds (9.16 tokens/s, 200 tokens, context 135, seed 1891621432)

Exllama v2

Output generated in 6.23 seconds (32.10 tokens/s, 200 tokens, context 135, seed 313599079)

Absolutely crazy, and all settings are the same. It's not just an initial burst either, the speed holds up over longer generations:

Output generated in 22.40 seconds (31.92 tokens/s, 715 tokens, context 135, seed 717231733)
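A quick sanity check on the numbers above (values copied straight from the logs) confirms the reported tokens/s and shows the speedup is actually around 3.5x, comfortably over the 200% claimed:

```python
# Values from the benchmark logs above.
v1_time, v2_time, tokens = 21.84, 6.23, 200

v1_tps = tokens / v1_time   # matches the reported 9.16 tokens/s
v2_tps = tokens / v2_time   # matches the reported 32.10 tokens/s
speedup = v2_tps / v1_tps   # roughly 3.5x, i.e. about a 250% increase

print(round(v1_tps, 2), round(v2_tps, 2), round(speedup, 2))
```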

And this is using the old format. ExLlama v2 also includes a new way to quantize, allowing for much more granular bitrates.

Turbo went with a really cool approach here: you set a target bits per weight, say 3.5, and it automatically assigns quantization levels so that quality is preserved where it counts, spending more bits on important weights and sacrificing precision on less important ones. Very cool stuff!
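To illustrate the general idea (this is a hypothetical sketch, not ExLlamaV2's actual algorithm; the layer names and sensitivity scores are made up), here's a greedy allocator that picks a per-layer bit width from a discrete set so the parameter-weighted average stays under the target, upgrading the most sensitive layers first:

```python
def allocate_bits(layers, target_bpw, levels=(2, 3, 4, 5, 6, 8)):
    """Assign each layer a bit width so the parameter-weighted average
    stays at or below target_bpw, favoring more sensitive layers.

    layers: list of (name, num_params, sensitivity) tuples.
    """
    # Start every layer at the lowest level, then greedily upgrade the
    # most sensitive layers while the average stays under the target.
    assign = {name: levels[0] for name, _, _ in layers}
    total_params = sum(p for _, p, _ in layers)

    def avg_bpw():
        return sum(assign[n] * p for n, p, _ in layers) / total_params

    for name, _, _ in sorted(layers, key=lambda l: -l[2]):
        for lv in levels:
            if lv > assign[name]:
                old = assign[name]
                assign[name] = lv
                if avg_bpw() > target_bpw:
                    assign[name] = old  # would bust the budget, roll back
                    break
    return assign, avg_bpw()

# Hypothetical layers: (name, parameter count, sensitivity score)
layers = [
    ("attn.q_proj", 4_000_000, 0.9),
    ("attn.k_proj", 4_000_000, 0.4),
    ("mlp.up_proj", 11_000_000, 0.2),
]
assign, bpw = allocate_bits(layers, target_bpw=3.5)
print(assign, round(bpw, 2))
```

The real quantizer is of course far more sophisticated (it measures error against calibration data rather than using fixed scores), but the budget-splitting intuition is the same.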

Get your latest oobabooga webui and start playing!

https://github.com/oobabooga/text-generation-webui

https://github.com/noneabove1182/text-generation-webui-docker

Some models in the new format from turbo: https://huggingface.co/turboderp

top 3 comments
[–] [email protected] 3 points 1 year ago* (last edited 1 year ago)

~~I'm really interested in this. Is Exllama2 a separately trained variant of Llama2? The use restrictions of Llama2 have always irked me and a similarly performing open variant of that architecture is very intriguing.~~

Nevermind. This is a processor that runs the model, not the model itself. My bad.

[–] [email protected] 3 points 1 year ago (1 children)

Why'd you create your own dockerfile repo vs just improving/changing the one in the main ooba repo?

[–] noneabove1182 2 points 1 year ago

Good question. At the time I made it there wasn't a good option, and the one in the main repo is very comprehensive and overwhelming. I wanted to make one that was straightforward and easier to digest, so you can see what's actually happening.