this post was submitted on 14 Jun 2023

LocalLLaMA


Community to discuss about LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.


Promising stuff from their repo, which claims "exceptional performance, achieving a [HumanEval] pass@1 score of 57.3, surpassing the open-source SOTA by approximately 20 points."

https://github.com/nlpxucan/WizardLM
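For context on the quoted number: pass@1 is the HumanEval metric where a model gets one chance per problem and the score is the fraction of problems whose generated solution passes the unit tests. When more than one sample is drawn per problem, the standard unbiased estimator from the HumanEval paper is used. A minimal sketch (the function name and usage are illustrative, not from the repo):

```python
import math

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator: probability that at least one of k
    samples drawn (without replacement) from n total samples, of which
    c are correct, passes the tests.

    Computes 1 - C(n-c, k) / C(n, k) as a stable running product.
    """
    if n - c < k:
        # Fewer incorrect samples than k: some correct sample is guaranteed.
        return 1.0
    return 1.0 - math.prod((n - c - i) / (n - i) for i in range(k))

# With a single sample per problem (n=1, k=1), pass@1 is simply the
# fraction of problems solved, averaged over the benchmark.
```

For example, if 5 of 10 samples for a problem are correct, `pass_at_k(10, 5, 1)` gives 0.5.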

noneabove1182 · 1 point · 1 year ago

Oh wait, does ooba support this? Nvm then, I'm enjoying using that; I'm just a little lost sometimes haha

Kerfuffle · 2 points · 1 year ago

I don't know if it does or doesn't; I was just saying those two projects seemed similar: both present a frontend for running inference on models, so the user doesn't necessarily have to know or care which backend is used.

noneabove1182 · 2 points · 1 year ago

Gotcha, koboldcpp seems to be able to run it; all of it is only a tiny bit confusing :D